Archive for the 'Microsoft' Category

StorSimple Virtual Array

I really like Microsoft.  As a company, they don’t always do things the way I would like them to, but overall, they make products that meet a need.

We operate in a (mostly) centralized infrastructure.  Our file servers are (mostly) in our main office.  I have a few virtualized StorSimple appliances that I use in a couple of the remote offices. 

Today, I learned that these appliances may occasionally throttle themselves if they are having trouble keeping up with the churn rate.  I probably should have expected this, but I hadn’t really put much thought into it until today. 

I also learned that this can impede a user’s ability to make changes to, or save new files on, that share.   This is somewhat frustrating because an error caused the throttling today, not (apparently) the churn rate.  And my first alert came from the end users.    Bummer…

Even better was the support engineer saying, “we’ve never seen that one before.”

Migrating to Azure

So, at the time of this writing, my blog is down.  If you are reading this, then I must have succeeded in migrating it to Azure for hosting.  If I didn’t succeed, maybe I will be the only one who reads this.  I am an “on again/ off again” blogger, so anything is possible.

Until a few months ago, I hosted this blog on a server running on infrastructure at my employer.  I did that because it was a good price (free), and because I mostly talk to myself in this blog anyway. 

There is a fairly simple “how to” on creating a WordPress site in Azure.  I am following this post to create a temporary site to test with.  If all goes well, I will do a test migration to this and then do it again to a “production” instance.

So far so good:


Express Route Provisioning Error


We have recently decided to invest in an Express Route circuit for Azure.  It is supposed to be helpful with Azure and Office 365.  There are two ways to provision the ExpressRoute circuit.  Both require PowerShell.

There is the classic model, and there is the Resource Manager model.  Here is the note from the documentation about those options:

Resource Manager: This is the newest deployment model for Azure resources. Most newer resources already support this deployment model and eventually all resources will.

Classic: This model is supported by most existing Azure resources today. New resources added to Azure will not support this model.

This seems to indicate that using Resource Manager is the right way to go long term.

The problem (for me, currently) is that the documentation isn’t quite where I think it should be.  If you try to run the commands to set up ExpressRoute and ask for detailed help, you get little, if any, helpful information. 

One item kind of bothers me.  When you request the service provider information using the “Get-AzureRmExpressRouteServiceProvider” command, the results are not as informative as they need to be.  I say this because they look like this:

Name              : Verizon
Id                : /subscriptions//resourceGroups//providers/Microsoft.Network/expressRouteServiceProviders/
ProvisioningState :
Type              : Microsoft.Network/expressRouteServiceProviders
PeeringLocations  : null
BandwidthsOffered : null

Name              : Vodafone
Id                : /subscriptions//resourceGroups//providers/Microsoft.Network/expressRouteServiceProviders/
ProvisioningState :
Type              : Microsoft.Network/expressRouteServiceProviders
PeeringLocations  : null
BandwidthsOffered : null

Name              : Zayo Group
Id                : /subscriptions//resourceGroups//providers/Microsoft.Network/expressRouteServiceProviders/
ProvisioningState :
Type              : Microsoft.Network/expressRouteServiceProviders
PeeringLocations  : null
BandwidthsOffered : null

From that you are supposed to then run a command (per the documentation) that looks like this:

New-AzureRmExpressRouteCircuit -Name "ExpressRouteARMCircuit" -ResourceGroupName "ExpressRouteResourceGroup" -Location "West US" -SkuTier Standard -SkuFamily MeteredData -ServiceProviderName "Equinix" -PeeringLocation "Silicon Valley" -BandwidthInMbps 200

The problem is that the previous results don’t give you the PeeringLocation; all of them come back as “null”.  I looked at the sample output from the “Classic” process and picked the location that seemed to make the most sense.  The command finished without error, so I assume it worked correctly. 
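Since I’m left assuming, a sanity check after the fact seemed wise.  Here is a sketch of what that could look like, using the documented Get-AzureRmExpressRouteCircuit cmdlet; the circuit and resource group names match the example command above and are placeholders:

```powershell
# Sketch: pull the circuit back and inspect its state after creation.
# Names are the placeholders from the example command above.
$ckt = Get-AzureRmExpressRouteCircuit -Name "ExpressRouteARMCircuit" `
        -ResourceGroupName "ExpressRouteResourceGroup"

# CircuitProvisioningState reflects the Azure side; the
# ServiceProviderProvisioningState stays "NotProvisioned" until the
# carrier completes their side of the provisioning.
$ckt.CircuitProvisioningState
$ckt.ServiceProviderProvisioningState
$ckt.ServiceKey
```

The ServiceKey is what you hand to the provider, so if that comes back populated, the circuit creation at least succeeded on the Azure side.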

StorSimple Virtual Appliance

I have been interested in StorSimple for some time, but haven’t actually used it before.  Recently, Microsoft announced a virtual appliance preview.  It looks pretty interesting and I have a test scenario where I want to use it.  If my testing goes well, I may expand my use of it. 

The documentation on it is pretty intimidating, since there are 15 PDFs to help you get started.

It is supported on VMware and Hyper-V, and the use cases are:

  • File Server – User file shares or Department file shares
  • iSCSI Server – Small SQL databases or User home folders

My initial test will be as a small departmental file share.

Shrinking volumes

Sometimes, I find it useful to shrink volumes.  This happens about once every 2 or 3 years.  Since I do it so infrequently, I have to look it up every time.

In my experience, while you can do it from the GUI, it isn’t always successful.  Also, I have never seen it work to shrink a volume by the complete amount that is reported as available.  My experience is with very large (larger than 1 TB) volumes.

To shrink via the command line, at an elevated prompt, do the following:

diskpart – this is the disk management CLI

list volume – this is the diskpart command that gives you a list of volumes (not to be confused with the list of disks)

select volume <#> – this is how you select the volume that you want to work on, e.g. “select volume 2”

shrink querymax – this tells you how much space can be trimmed off the volume.  There are several factors that affect this, but the primary ones are how big the volume is to begin with, and where on that volume the un-movable system files are located.

shrink desired=<size in MB> – this tells the volume to shrink by the amount of space given in MB, e.g. “shrink desired=102400” will shrink the volume by 100 GB.

shrink minimum=<size in MB> – this tells the volume to shrink by the amount specified, but only if it can shrink by at least that amount

You can use the minimum and desired together if you want.  You can also add a “NOWAIT” so that the prompt returns and you don’t have to wait to see the results.
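Since I have to look this up every time, the whole sequence can also go in a script file and be fed to diskpart in one shot.  A sketch, where the volume number and sizes are just examples:

```
rem shrink.txt - run with: diskpart /s shrink.txt
select volume 2
shrink querymax
rem shrink by 100 GB, but only if at least 50 GB can be reclaimed;
rem nowait returns the prompt without waiting for the result
shrink desired=102400 minimum=51200 nowait
```

Check the querymax output first on a new machine; feeding a desired size larger than what is available will just fail.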

Don’t install build 10547

I have been very happy with the Windows 10 builds.  Most of them have at least been “no visible change” for me.  Build 10547 however, was not like that.

I had heard of some issues with that build from one of my co-workers, but nothing specific enough to make me think it was anything other than an anomaly.  However, when I installed the build, I was very unhappy. 

First, I rarely shut my machine down, but I had to do a hard shutdown 3 times after installing build 10547, because the screen was black and only the mouse pointer was visible.   Once I was able to get into the machine, the network adapter wasn’t working.  It had worked fine with Windows 7, 8, 8.1, and every other build of Windows 10.  I also kept getting activation errors for Office programs.  After 2 hours, I finally had to revert to the previous build. 

Build 10550 is supposed to have addressed some of these issues.  Cross your fingers.

Security Token Service

Had a bit of a scare during a maintenance window.  I ran some updates on our SharePoint farm, and after that, one of the sites wasn’t coming up; I kept getting a 503 error.  When I checked the event log, I found this error message:

An exception occurred when trying to issue security token: The HTTP service located at http://localhost:32843/SecurityTokenServiceApplication/securitytoken.svc/actas is unavailable.  This could be because the service is too busy or because no endpoint was found listening at the specified address. Please ensure that the address is correct and try accessing the service again later..

A quick search led to this article, and when I checked the AppPools, they were all stopped.

Starting the AppPools fixed the problem.
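For next time, the same check and fix can be done from PowerShell on the SharePoint server.  A sketch, using the standard WebAdministration module that ships with IIS (run it elevated):

```powershell
# Sketch: find any stopped IIS app pools and start them.
Import-Module WebAdministration

Get-ChildItem IIS:\AppPools |
    Where-Object { $_.State -eq "Stopped" } |
    ForEach-Object {
        Write-Host "Starting app pool:" $_.Name
        Start-WebAppPool -Name $_.Name
    }
```

This blindly starts every stopped pool, so on a box where some pools are stopped on purpose, filter by name instead.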

SharePoint Documents are all checked out

I wanted to write a PowerShell script to check back in all the documents that had been checked out.  Turns out that is not as straightforward as I had hoped.

First search came up with this: Office Discarding Check Out Using PowerShell

Then when I tried to run the command (from the server console, because it doesn’t appear that you can just install the SP PowerShell Module on your desktop), I received the error below:

Get-SPWeb : Cannot access the local farm. Verify that the local farm is properly configured, currently available, and that you have the appropriate permissions to access the database before trying again.

That led me to this site:

and this answer:

run the SharePoint Management Shell with the service account
$db = Get-SPDatabase | Where {$_.Name -eq "SharePoint_ConfigDB"}
Add-SPShellAdmin "domain\user_to_add" -database $db

So I figured out how to accomplish that and moved on to writing the script for checking in files.  I ran out of time to polish it; it will accomplish the task, but it isn’t as clean and efficient as I would like.

# Put your site root here
$TargetSite = ""
# The root folder that you want to start with
$TargetLibrary = "Shared Documents"

function LookForWorkFiles ($workFolder) {
    Write-Host $workFolder.Url
    # Get the files in this folder
    $FilesToProcess = $workFolder.Files
    Write-Host "How many files you ask?"
    Write-Host $FilesToProcess.Count
    If ($FilesToProcess.Count -le 0) {
        # If there aren't any files, move on to the subfolders
        Write-Host "No files in $($workFolder.Url), checking for subfolders"
    }
    Else {
        # Check in all the files - NOTE this will cause an error for any
        # file that isn't checked out.
        foreach ($file in $FilesToProcess) {
            Write-Host $file.Name
            $file.CheckIn("Checked in by script")
        }
    }
    # Recurse into the subfolders
    Write-Host "Looking for subfolders"
    foreach ($subFolder in $workFolder.SubFolders) {
        LookForWorkFiles $subFolder
    }
}

Write-Host "Beginning Script"
Write-Host $TargetSite
Write-Host $TargetLibrary

# Connect to the site
$Site = Get-SPWeb $TargetSite
# Get the document folder
$Folder = $Site.GetFolder($TargetLibrary)

foreach ($SPF in $Folder.SubFolders) {
    If ($SPF.Name -eq "Forms") {
        # The Forms directory is for SharePoint, not for file management
        Write-Host "Skipping the Forms directory"
    }
    Else {
        Write-Host $SPF.Name
        LookForWorkFiles $SPF
    }
}

With this script, it is easier to use file management tools to move the files out of SharePoint. 

SharePoint Recycle Bin

I mentioned earlier (here) that we are clearing some old data out of a SharePoint site.  Aside from the various client side issues with trying to do that, you also have to figure out how to clean up SharePoint.  SharePoint has a multi-tier Recycle Bin.  There is a pretty good article about how it works in SharePoint 2010 (How the Recycle Bin Works in SharePoint) and the concept is pretty much the same for SharePoint 2013.

If you go to Site Settings you see something similar to this:


If you go down to the “Site Collection Administration” and click on “Recycle Bin” it takes you to the “End user Recycle Bin items”. 

You can change the view to “Deleted from end user Recycle Bin” and see the second level Recycle Bin contents:


Once it is gone from there, you can think about shrinking the database.

Here is where I learned what I just said:

I was asked to reduce a content DB recently; here’s what I found. Whilst the customer was great in deleting content from their sites, SharePoint has a 2-stage recycle bin, so whilst the items were deleted from the site-level recycle bin, there were still 30 days left on the site collection recycle bin. So please ensure you check that and flush it.

Yes, you’ll find that the DB will still be large at this stage. So as described above, use SQL Management Studio and locate the content database in question; it won’t be the SharePoint configuration database as the example shows above, but rather WSS_Content_something in all likelihood.

After that, you will find you are able to shrink the database file (mdf). BUT beware: chances are your database is set to full recovery mode, and when the shrink operation takes place, all it will do is blow the transaction log out by as many GB as you have removed. So you might want to switch the DB into simple mode and then perform the shrink operation. I was lucky in that I had sufficient disk space while the shrink operation was taking place: removing 40GB blew the logs out 40GB, and once that was complete, the mdf was then shrunk 40GB. After this, I switched the DB to simple mode and shrank the log file, then changed it back to full.
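For my own notes, the SQL side of that advice looks roughly like this.  This is a sketch, not something I have run: the server name, database name, and logical file name are placeholders, and it assumes the SqlServer/SQLPS PowerShell module is available.

```powershell
# Sketch: flip the content DB to simple recovery, shrink the mdf, then
# flip it back, per the quoted advice. All names are placeholders.
Invoke-Sqlcmd -ServerInstance "SQLSERVER" -Query @"
ALTER DATABASE [WSS_Content_Something] SET RECOVERY SIMPLE;
USE [WSS_Content_Something];
DBCC SHRINKFILE (N'WSS_Content_Something', 0);
ALTER DATABASE [WSS_Content_Something] SET RECOVERY FULL;
"@
```

Switching to simple recovery breaks the transaction log backup chain, so take a full backup afterward.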

One word of warning: when you go to delete that data, if it is a lot of data, SharePoint becomes a bit less responsive.  You may want to do that outside of normal working hours.

Moving data out of SharePoint

We have a SharePoint site that we used to store some data.  It was used at remote locations to be able to “map a drive” and interact with data.  There is some old data in the site now, and we needed to move it off.  I hadn’t messed with it in a while and forgot several of the little issues that come up when you use this solution.

First, SharePoint has a file size limit and an item count limit.  I believe the file size limit is 50 MB by default, but I don’t remember for sure (SharePoint 2010 and 2013).  Also, on the client side, there is a registry entry that you have to change in order to download files larger than 50 MB.  I found that answer here:

FileSizeLimitInBytes is set to 5000000 which limits your download so just set it to maximum! (this is client side btw on windows 7)


  1. Right click on the FileSizeLimitInBytes and click Modify
  2. Click on Decimal
  3. In the Value data box, type 4294967295, and then click OK. Note this sets the maximum you can download from the Webdav to 4 gig at one time, I haven’t figured out how to make it unlimited so if you want to download more you need to split it up.

hopefully this helps for you guys!
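The steps above can also be done from an elevated PowerShell prompt.  A sketch, assuming the entry lives under the WebClient service’s Parameters key (the WebDAV client is what enforces this limit):

```powershell
# Sketch: set the WebDAV client's download limit to the 4 GB maximum
# from the quote, then restart the service so it takes effect.
# Note: ([int32]-1) has the same bit pattern as 0xFFFFFFFF (4294967295);
# this sidesteps PowerShell's refusal to write an unsigned value larger
# than 2 GB into a DWord.
Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\WebClient\Parameters" `
    -Name "FileSizeLimitInBytes" -Value ([int32]-1)
Restart-Service -Name WebClient
```

As the quote says, this is client-side, so it has to be done on every machine doing the downloading.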

The other issue is that PDFs tend to get checked out and stay checked out.  So when you are trying to move the files, you have to make sure they are all checked back in first.