Blog Status

When I got back on the Blog wagon, I intended to write more posts, and even came up with a list of topics that I wanted to write about.  So far I am not doing so well on that.  The last couple of posts that I wrote had to do with SharePoint, which wasn’t even on my list.  There has been a little bit of work on the topics I intended, but I haven’t done as much as I would like.  And I have definitely not done as much blogging as I intended (hoped) by this point.

A status update on my post count…  I am up to 159 (160 when this posts).  I get a few hits here and there on my random PowerShell contributions.  I still get a lot of hits on my DPM posts.  I don’t actually use DPM anymore (we use a third party hosted solution, not because there was any problem with DPM).  I get hits on some of my Hyper-V posts and a few other storage posts. 

My list included improving my management skills.  Maybe I should blog a little more about my experiences in that regard.

Security Token Service

Had a bit of a scare during a maintenance window.  Ran some updates on our SharePoint farm and after that one of the sites wasn’t coming up.  Kept getting a 503 error.  When I checked the event log, I found this error message:

An exception occurred when trying to issue security token: The HTTP service located at http://localhost:32843/SecurityTokenServiceApplication/securitytoken.svc/actas is unavailable.  This could be because the service is too busy or because no endpoint was found listening at the specified address. Please ensure that the address is correct and try accessing the service again later..

A quick search led to this article, and when I checked the AppPools, they were all stopped.

https://social.technet.microsoft.com/Forums/sharepoint/en-US/1bb454e8-d395-4059-8bc8-ccc74f999659/the-security-token-service-is-not-issuing-tokens-the-service-could-be-malfunctioning-or-in-a-bad?forum=sharepointgeneralprevious

Starting the AppPools fixed the problem.
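For future reference, a quick way to check and start them from PowerShell is something like this (a rough sketch, assuming the WebAdministration module is available on the SharePoint server; at the time I just started them through IIS Manager):

Import-Module WebAdministration

# List every IIS application pool and start any that the updates left stopped
Get-ChildItem IIS:\AppPools | ForEach-Object {
    $state = (Get-WebAppPoolState -Name $_.Name).Value
    Write-Host "$($_.Name) : $state"
    If ($state -eq "Stopped") {
        Start-WebAppPool -Name $_.Name
        }
    }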

SharePoint Documents are all checked out

I wanted to write a PowerShell script to check all the documents back in that had been checked out.  Turns out that is not as straightforward as I had hoped.

First search came up with this: Office Discarding Check Out Using PowerShell

Then when I tried to run the command (from the server console, because it doesn’t appear that you can just install the SP PowerShell Module on your desktop), I received the error below:

Get-SPWeb : Cannot access the local farm. Verify that the local farm is properly configured, currently available, and that you have the appropriate permissions to access the database before trying again.

That led me to this site: http://www.sharepointassist.com/2010/01/29/the-local-farm-is-not-accessible-cmdlets-with-featuredependencyid-are-not-registered/comment-page-1/#comment-1566

and this answer:

run sharepoint management shell with the service account
$db = Get-SPDatabase | Where {$_.Name -eq "SharePoint_ConfigDB"}
Add-SPShellAdmin "domain\user_to_add" -database $db
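To confirm the permission actually took, you can list the shell admins on that database afterwards (same $db variable as in the snippet above):

# Should now include the account you just added
Get-SPShellAdmin -database $db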

So I figured out how to accomplish that and moved on to writing the script for checking in files.  I ran out of time to work on this.  It will accomplish the task, but it isn’t as clean and efficient as I would like.

# Put your site root here
$TargetSite = "https://your.site.name/blah/blah/blah"
# The root folder that you want to start with
$TargetLibrary = "Shared Documents"

# Load the SharePoint snap-in in case this is not the SharePoint Management Shell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

function LookForWorkFiles ($workFolder){
    Write-Host $workFolder
    # Get the folder (passed in as a URL) and work through it
    $wf = $Site.GetFolder($workFolder)
    # Get the files
    $FilesToProcess = $wf.Files

    Write-Host "How many files you ask?"
    Write-Host $FilesToProcess.Count
    If ($FilesToProcess.Count -le 0) {
        # If there aren't any files, move on to the SubFolders
        Write-Host "No Files in $workFolder, checking for subfolders"
        }
    Else
        {
        # Discard the checkout on any file that is still checked out;
        # files that are already checked in are skipped
        foreach ($file in $FilesToProcess) {
            Write-Host $file.Name
            If ($file.CheckOutType -ne "None") {
                $file.UndoCheckOut()
                $file.Update()
                }
            }
        }
    Write-Host "Looking for Subfolders"
    Write-Host $wf.SubFolders
    foreach ($subFolder in $wf.SubFolders){
        Write-Host "SubFolders?"
        # Recurse into the subfolder using its URL
        LookForWorkFiles $subFolder.Url
        }
    }


Write-Host "Beginning Script"
Write-Host $TargetSite
Write-Host $TargetLibrary

# Connect to the site
$Site = Get-SPWeb $TargetSite
# Get the Document Folder
$Folder = $Site.GetFolder($TargetLibrary)

foreach ($SPF in $Folder.SubFolders ){
    If ($SPF.Name -eq "Forms"){
        # The Forms directory is for SharePoint, not for file management
        Write-Host "Skipping the Forms Directory"
        }
    Else
        {
        Write-Host $SPF.Name
        LookForWorkFiles $SPF.Url
        }
    }

Once the script has checked everything back in, it is easier to use file management tools to move the files out of SharePoint. 

SharePoint Recycle Bin

I mentioned earlier (here) that we are clearing some old data out of a SharePoint site.  Aside from the various client side issues with trying to do that, you also have to figure out how to clean up SharePoint.  SharePoint has a multi-tier Recycle Bin.  There is a pretty good article about how it works in SharePoint 2010 (How the Recycle Bin Works in SharePoint) and the concept is pretty much the same for SharePoint 2013.

If you go to Site Settings you see something similar to this:

[Image: the Site Settings page, showing the Site Collection Administration links]

If you go down to the “Site Collection Administration” and click on “Recycle Bin” it takes you to the “End user Recycle Bin items”. 

You can change the view to “Deleted from end user Recycle Bin” and see the second level Recycle Bin contents:

[Image: the second-level Recycle Bin contents, with the view set to "Deleted from end user Recycle Bin"]

Once it is gone from there, you can think about shrinking the database.

Here is where I learned what I just said:

http://sharepoint.stackexchange.com/questions/34980/claim-sql-server-space-after-content-deleted-from-sharepoint

I was asked to reduce a content DB recently, here's what I found. Whilst the customer was great in deleting content from their sites, SharePoint has a 2-stage recycle bin, so whilst the items were deleted from the site level recycle bin there was still 30 days left on the site collection recycle bin. So please ensure you check that and flush it.

Yes you’ll find that the DB will still be large at this stage. So as described above, use SQL Management studio and locate your content database in question, it won’t be the SharePoint configuration database as the example shows above but rather WSS_Content_something in all likelihood.

After that you will find you are able to shrink the database file (mdf). BUT! Beware, chances are your database is set to full recovery mode and when the shrink operation takes place all it will do is blow the transaction log out as many GB as you have removed. So the choice here is you might want to switch the DB into simple mode and then perform the shrink operation. I was lucky in that I had sufficient disk space when the shrink operation was taking place and removing 40GB blew the logs out 40GB, once that was complete the mdf was then shrunk 40GB. After this I switched the DB to simple mode and shrank the log file, then changed it back to full.
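For my own notes, here is a rough PowerShell sketch of that last part (switch to simple recovery, shrink, switch back).  This assumes the SqlServer (or older SQLPS) module is available and that the logical data file name matches the database name, which is usually the case for SharePoint content databases; check with sp_helpfile first.  The server and database names are placeholders:

$sqlInstance = "SQLSERVER\SHAREPOINT"    # placeholder
$contentDb   = "WSS_Content_Something"   # placeholder

Import-Module SqlServer    # SQLPS on older SQL installs
# Simple recovery keeps the shrink from blowing out the transaction log
Invoke-Sqlcmd -ServerInstance $sqlInstance -Query "ALTER DATABASE [$contentDb] SET RECOVERY SIMPLE;"
# Shrink the data file (logical file name assumed to match the database name)
Invoke-Sqlcmd -ServerInstance $sqlInstance -Database $contentDb -Query "DBCC SHRINKFILE ([$contentDb]);"
# Put it back to full recovery when you are done
Invoke-Sqlcmd -ServerInstance $sqlInstance -Query "ALTER DATABASE [$contentDb] SET RECOVERY FULL;"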

One word of warning.  When you go to delete that data, if it is a lot of data, SharePoint becomes a bit less responsive.  You may want to do that outside of normal working hours.
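And if you would rather flush the second-stage Recycle Bin from PowerShell instead of clicking through the UI, a sketch like this should work (run from the SharePoint Management Shell; the site collection URL is a placeholder, and this permanently deletes everything in the Recycle Bin):

$site = Get-SPSite "https://your.site.name"
# Permanently delete everything in the site collection Recycle Bin
$site.RecycleBin.DeleteAll()
$site.Dispose()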

Moving data out of SharePoint

We have a SharePoint site that we used to store some data.  It was used at remote locations to be able to “map a drive” and interact with data.  There is some old data in the site now, and we needed to move it off.  I hadn’t messed with it in a while and forgot several of the little issues that come up when you use this solution.

First, SharePoint has a file size limit and an item count limit.  I believe the file size limit is 50MB by default, but I don’t remember for sure (SharePoint 2010 and 2013).  Also, from the client side there is a registry entry that you have to change in order to download files larger than 50 MB.  I found that answer here:

 http://answers.microsoft.com/en-us/ie/forum/ie8-windows_xp/error-0x800700df-the-file-size-exceeds-the-limit/d208bba6-920c-4639-bd45-f345f462934f

FileSizeLimitInBytes is set to 5000000 which limits your download so just set it to maximum! (this is client side btw on windows 7)

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\WebClient\Parameters

  1. Right click on the FileSizeLimitInBytes and click Modify
  2. Click on Decimal
  3. In the Value data box, type 4294967295, and then click OK. Note this sets the maximum you can download from the Webdav to 4 gig at one time, I haven’t figured out how to make it unlimited so if you want to download more you need to split it up.

hopefully this helps for you guys!
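If you would rather make that change from PowerShell than regedit, something like this on the client should do it (run elevated; the WebClient service should be restarted for the new value to take effect):

# Raise the WebDAV download limit to the 4 GB maximum
Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\WebClient\Parameters" `
    -Name FileSizeLimitInBytes -Value 4294967295
Restart-Service WebClient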

The other issue is that PDFs tend to get checked out and stay checked out.  So when you are trying to move the files, you have to make sure they are all checked back in first.

New Favicon for my site

So in a previous life, I spent time in the Navy.  I was a Machinist Mate, in the Nuclear Power Program.  That history, my tendency to (over) analyze things and the fact that my name is Michael all led to the name of this site.  (Nuke it Mike). 

This site is not (currently, or planned to be) used for commercial/profit purposes.  I get a very small amount of traffic these days (I got more traffic when I was blogging about a particular set of problems with early versions of DPM). 

I have no reason to have a personal logo, but I have been thinking for years that I wanted one.  I finally got around to making one.  This is a Machinist Mate Rating badge with the colors of the Radiation Warning symbol.  What do you think?

[Image: NIM logo, a Machinist Mate rating badge in the colors of the Radiation Warning symbol]

2015 Interests Update

I said here that I wanted to get to 175 (from 148) and I have made it to 154 (if you count this post).  I am sure that is right on track.

The next post that I wrote (after my post count) was a list of things I wanted to work on (or that I was interested in working on) this year.

From that list, I have worked on a few items (and a bunch that weren’t on the list).  I created a PowerShell script to capture Management Pack Dependencies.  Upgrading the OS for the OpsMgr server probably wasn’t on my list, but I did that, too.

I am also taking an Azure class.  So far, most of it is stuff I am already familiar with, or at least comfortable with.  I have picked up a few concepts and found a few resources.  The “Cloud Design Patterns” guide looks interesting.

The rest of the items on my list are a bit more difficult to tie together.  I have been working on some documentation/automation scripts that should be useful.  I will post those when (if) I get them cleaned up enough for posting.

Operations Manager 2012 R2 Management Pack Dependencies

I am not proficient at managing Ops Manager.  I am at best a competent tinkerer.  I have been needing to do some cleanup on our Ops Manager installation and clear out some management packs that are either not used or only generate noise that we subsequently ignore.  That is a bit of an annoying task, since you have to trace down all the dependencies.

To make it easier, I looked for a script to log the management pack dependencies, versions, and IDs.  I found this post from 2009, but it wasn’t very effective with the current version of Ops Manager: http://www.systemcentercentral.com/list-management-packs-dependencies-powershell-script/

So I decided to write a new one.  Here is what I came up with:

# Load the OpsMgr module if you are not running from the Operations Manager Shell
Import-Module OperationsManager
New-SCOMManagementGroupConnection -ComputerName <Your Ops Mgr Server Name>

# Create a new Excel object using COM
$ExcelApp = New-Object -ComObject Excel.Application
$ExcelApp.Visible = $True
$Workbook = $ExcelApp.Workbooks.Add()
$Sheet = $Workbook.Worksheets.Item(1)

# Counter variable for rows
$intRow = 2

# Header row
$Sheet.Cells.Item(1,1) = "Parent"
$Sheet.Cells.Item(1,2) = "MP Name"
$Sheet.Cells.Item(1,3) = "Version"
$Sheet.Cells.Item(1,4) = "ID"

$Sheet.Cells.Item(1,1).Font.Bold = $True
$Sheet.Cells.Item(1,2).Font.Bold = $True
$Sheet.Cells.Item(1,3).Font.Bold = $True
$Sheet.Cells.Item(1,4).Font.Bold = $True

$MPCollection = Get-SCManagementPack
foreach ($mp in $MPCollection) {
    $intRow++
    $MPParent = $mp.Name
    $Sheet.Cells.Item($intRow,1) = $MPParent
    $Sheet.Cells.Item($intRow,2) = "*************"
    # Pull the management pack plus everything it depends on
    $MPChecking = Get-SCManagementPack -Name $MPParent -Recurse
    foreach ($dep in $MPChecking){
        $intRow++
        Write-Host $intRow
        $MPName = $dep.Name
        $MPVersion = $dep.Version
        $MPID = $dep.Id
        $Sheet.Cells.Item($intRow,2) = $MPName
        $Sheet.Cells.Item($intRow,3) = $MPVersion
        $Sheet.Cells.Item($intRow,4) = "$MPID"
        }
    }
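If you want the workbook saved and Excel closed automatically instead of left open on screen, a couple of lines like this at the end should do it (the path is just a placeholder):

$Workbook.SaveAs("C:\Temp\MPDependencies.xlsx")
$ExcelApp.Quit()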

Upgrade of System Center Operations Manager Server OS

Our Operations Manager installation is an upgrade from 2007 through (currently) 2012 R2.  The operating system was Server 2008 R2.  I like to stay current, so I was interested in figuring out how to upgrade to Server 2012 R2.  When I did a search on it, I found recommendations to go through some fairly complicated steps to perform the upgrade. 

And then I read a few comments where people had just updated the OS.  That seemed like the easier path so I did that.  It worked great, after I ran all the system updates following the upgrade.

The Good and the Bad of Microsoft Ignite (IMHO)

I have been attending a Microsoft conference each year since 2003.  (I think I missed one year because of a child being born close to the same time as the conference.)  Until last year, the conference that I went to was the Microsoft Management Summit (MMS).  I am an infrastructure guy so that conference seemed to be pretty relevant.  TechEd would also have been relevant, but I didn’t attend TechEd until last year when they combined MMS and TechEd. 

This year TechEd and several other conferences were all combined into one giant conference, Microsoft Ignite.  This is the first year for this conference.  As a first run, with a conference that has 23,000+ attendees, it hasn’t been too bad.  I am somewhat disappointed in the logistics.  Chicago is a great city and the dedicated bus route between the hotels and the conference center is a great idea.  Not running any buses for half the day isn’t a great idea. 

With a conference that lasts a full week and covers such a broad range of topics, you can become mentally exhausted.  Being able to go back to your hotel room, take a nap, make phone calls, answer nature’s call, etc. is a key factor in surviving the conference.  With a conference of this size, in a convention center that is the largest in North America, getting to sessions on different topics can be a challenge. 

There are a lot of talented people at Microsoft.  I enjoy going to the conferences and hearing what they are working on, what the plans are, what I should be focusing on to be ready for the next thing they release.  I enjoy hearing about the successes, and how the non-successes are handled.  I manage infrastructure.  My interests span a wide range of topics covered at this conference. 

The good:

  • It is Microsoft
  • The presenters are very talented and knowledgeable people
  • There is a lot of information
  • They are getting better at working with customers every year
  • I like the direction they are going with their products
  • Closing party – I think for the number of people, and how it was laid out, the closing party was actually pretty good. 
  • Second screen – I did spend one afternoon, and Friday morning, watching and listening to sessions from my hotel room via the second screen feature. 

The bad:

  • Selling – I am at a Microsoft conference, to hear about technologies that I already believe in.  Can we cover the technical parts a little more, and the sales part a little less?  If there is a track for a technology, can we not cover the same sales information in every session, for 30 minutes (at least) of the 75, and focus more on the technology?
  • Session length – Most of the sessions I went to could have been handled in about half the time if the sales part had been left out.
  • Announcements – Most Microsoft conferences have announcements.  That is part of the fun.  This time, the announcements seemed a little less spectacular, and a little less fun.  It seemed that the announcements weren’t given the spotlight as much as in previous conferences.
  • Food – I don’t eat much conference food.  This time I didn’t eat any, so this next statement is strictly hearsay.  I spoke to a person who claimed to eat pretty much anything, and he said the conference food was very disappointing.  Even for conference food. 
  • Crowded sessions – almost all the sessions I went to were very crowded.  I went to a variety of topics, so it wasn’t just a particular topic.  With 23,000 people, maybe there needed to be more sessions.  Then again, we brought 5 people, and there were at least 2, maybe as many as 6, other people that “should” have come.  For us that is a significant number of people.

All in all, I hope that some of the issues I have with this year are resolved for next year.  I will come back next year, but I won’t be as excited as I was in years past.