Archive for the 'Scripting' Category

Pull DHCP info using PowerShell

This short script will get all the DHCP servers authorized in your Windows domain, then pull all of their scopes and leases. It exports these to two separate CSV files.

$DHCPServers = Get-DhcpServerInDC
If (Test-Path -Path $env:TEMP\Scopes.csv) {Remove-Item $env:TEMP\Scopes.csv}
If (Test-Path -Path $env:TEMP\Leases.csv) {Remove-Item $env:TEMP\Leases.csv}
If (Test-Path -Path $env:TEMP\DNSExport.csv) {Remove-Item $env:TEMP\DNSExport.csv}
foreach ($DHCPServer in $DHCPServers) {
    # Export every scope on this server
    Get-DhcpServerv4Scope -ComputerName $DHCPServer.DnsName | Select-Object -Property ScopeId,SubnetMask,Name,State,StartRange,EndRange,LeaseDuration,Description,Type | Export-Csv $env:TEMP\Scopes.csv -Append -NoTypeInformation
    # Export every lease in every scope on this server
    Get-DhcpServerv4Scope -ComputerName $DHCPServer.DnsName | Get-DhcpServerv4Lease -ComputerName $DHCPServer.DnsName -AllLeases | Select-Object -Property IPAddress,ScopeId,ClientId,HostName,AddressState,LeaseExpiryTime,ClientType,Description,DnsRegistration,DnsRR,ServerIP | Export-Csv $env:TEMP\Leases.csv -Append -NoTypeInformation
}

The formatting for the script is a little off.  I need to get a code display plugin, but I am too lazy.

DNS PowerShell one liner

This is a “one liner” that pulls all the DNS Entries for a particular zone, including the IPv4 and IPv6.  If you don’t care about the IPv6, you can remove that segment of the code.

Get-DnsServerResourceRecord -ComputerName <DNSServerName> -ZoneName <YourZoneName> | Select-Object DistinguishedName,HostName,@{Name='IPv4Address';Expression={$_.RecordData.IPv4Address.IPAddressToString}},@{Name='IPv6Address';Expression={$_.RecordData.IPv6Address.IPAddressToString}},RecordType,Timestamp,TimeToLive | Export-Csv $env:TEMP\DNSExport.csv -NoTypeInformation

It is amazing how much you can do with just one line (even if the line is obnoxiously long.)

Express Route Provisioning Error


We have recently decided to invest in an ExpressRoute circuit, which is supposed to be helpful for both Azure and Office 365 connectivity.  There are two ways to provision the ExpressRoute circuit, and both require PowerShell.

There is the classic deployment model, and there is the Resource Manager deployment model.  Here is the note about those options:

Resource Manager: This is the newest deployment model for Azure resources. Most newer resources already support this deployment model and eventually all resources will.

Classic: This model is supported by most existing Azure resources today. New resources added to Azure will not support this model.

This seems to indicate that using Resource Manager is the right way to go long term.

The problem (for me, currently) is that the documentation isn't quite where I think it should be.  If you try to run the commands to set up ExpressRoute and ask for detailed help, you get little, if any, helpful information.

One item that kind of bothers me: when you request the service provider information using the "Get-AzureRmExpressRouteServiceProvider" command, the results are not as informative as they need to be.  I say this because the results look like this:

Name              : Verizon
Id                : /subscriptions//resourceGroups//providers/Microsoft.Network/expressRouteServiceProviders/
ProvisioningState :
Type              : Microsoft.Network/expressRouteServiceProviders
PeeringLocations  : null
BandwidthsOffered : null

Name              : Vodafone
Id                : /subscriptions//resourceGroups//providers/Microsoft.Network/expressRouteServiceProviders/
ProvisioningState :
Type              : Microsoft.Network/expressRouteServiceProviders
PeeringLocations  : null
BandwidthsOffered : null

Name              : Zayo Group
Id                : /subscriptions//resourceGroups//providers/Microsoft.Network/expressRouteServiceProviders/
ProvisioningState :
Type              : Microsoft.Network/expressRouteServiceProviders
PeeringLocations  : null
BandwidthsOffered : null

From that you are supposed to then run a command (per the documentation) that looks like this:

New-AzureRmExpressRouteCircuit -Name "ExpressRouteARMCircuit" -ResourceGroupName "ExpressRouteResourceGroup" -Location "West US" -SkuTier Standard -SkuFamily MeteredData -ServiceProviderName "Equinix" -PeeringLocation "Silicon Valley" -BandwidthInMbps 200

The problem is the previous results don’t give you the PeeringLocation.  All of them come back as “null”.  I looked at the sample output from the “Classic” process and picked the location that seemed to make the most sense.  The command finished so I assume that it worked correctly. 
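For reference, this is a quick way to flatten that provider output into one line per provider (a minimal sketch on my part, using only the properties shown above; in our subscription the PeeringLocations still came back empty, so it does not solve the problem by itself):

# List each provider with its peering locations joined onto a single line
Get-AzureRmExpressRouteServiceProvider |
    Select-Object Name, @{Name='PeeringLocations';Expression={$_.PeeringLocations -join ', '}} |
    Format-Table -AutoSize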

SharePoint Documents are all checked out

I wanted to write a PowerShell script to check back in all the documents that had been checked out.  It turns out that is not as straightforward as I had hoped.

First search came up with this: Office Discarding Check Out Using PowerShell

Then when I tried to run the command (from the server console, because it doesn’t appear that you can just install the SP PowerShell Module on your desktop), I received the error below:

Get-SPWeb : Cannot access the local farm. Verify that the local farm is properly configured, currently available, and that you have the appropriate permissions to access the database before trying again.

That led me to this site, and this answer:

run sharepoint management shell with the service account
$db = Get-SPDatabase | Where {$_.Name -eq "SharePoint_ConfigDB"}
Add-SPShellAdmin "domain\user_to_add" -database $db

So I figured out how to accomplish that and moved on to writing the script for checking in files.  I ran out of time to work on this, so while it will accomplish the task, it isn't as clean and efficient as I would like.

# Put your site root here
$TargetSite = ""
# The root folder that you want to start with
$TargetLibrary = "Shared Documents"

function LookForWorkFiles ($workFolderUrl){
    Write-Host $workFolderUrl
    # Get the folder and work through it
    $wf = $Site.GetFolder($workFolderUrl)
    # Get the files
    $FilesToProcess = $wf.Files
    Write-Host "How many files you ask?"
    Write-Host $FilesToProcess.Count
    If ($FilesToProcess.Count -le 0) {
        # If there aren't any files, move on to the SubFolders
        Write-Host "No Files in $workFolderUrl, checking for subfolders"
    }
    Else {
        # Check in all the files - NOTE this will cause an error for any
        # file that isn't checked out.
        foreach ($File in $FilesToProcess) {
            Write-Host $File.Name
            $File.CheckIn("Checked in by script")
        }
    }
    Write-Host "Looking for Subfolders"
    Write-Host $wf.SubFolders
    foreach ($SubFolder in $wf.SubFolders){
        Write-Host "SubFolders?"
        LookForWorkFiles $SubFolder.Url
    }
}

Write-Host "Beginning Script"
Write-Host $TargetSite
Write-Host $TargetLibrary

# Connect to the site
$Site = Get-SPWeb $TargetSite
# Get the Document Folder
$Folder = $Site.GetFolder($TargetLibrary)

foreach ($SPF in $Folder.SubFolders){
    If ($SPF.Name -eq "Forms"){
        # The Forms directory is for SharePoint, not for file management
        Write-Host "Skipping the Forms Directory"
    }
    Else {
        Write-Host $SPF.Name
        LookForWorkFiles $SPF.Url
    }
}

With this script, it is easier to use file management tools to move the files out of SharePoint. 

Operations Manager 2012 R2 Management Pack Dependencies

I am not proficient at managing Ops Manager; I am at best a competent tinkerer.  I have needed to do some cleanup on our Ops Manager installation and clear out some management packs that are either not used or only generate noise that we subsequently ignore.  That is a bit of an annoying task, because you have to trace down all the dependencies.

To make it easier, I looked for a script to log the management pack dependencies, versions, and IDs.  I found this post from 2009, but it wasn't very effective in the current version of Ops Manager.

So I decided to write a new one.  Here is what I came up with:

New-SCOMManagementGroupConnection -ComputerName <Your Ops Mgr Server Name>

# Create a new Excel object using COM
$Excel = New-Object -ComObject Excel.Application
$Excel.Visible = $True
$Workbook = $Excel.Workbooks.Add()
$Sheet = $Workbook.Worksheets.Item(1)

# Counter variable for rows
$intRow = 2

# Build the header row
$Sheet.Cells.Item(1,1) = "Parent"
$Sheet.Cells.Item(1,2) = "MP Name"
$Sheet.Cells.Item(1,3) = "Version"
$Sheet.Cells.Item(1,4) = "ID"

$Sheet.Cells.Item(1,1).Font.Bold = $True
$Sheet.Cells.Item(1,2).Font.Bold = $True
$Sheet.Cells.Item(1,3).Font.Bold = $True
$Sheet.Cells.Item(1,4).Font.Bold = $True

$MPCollection = Get-SCManagementPack
foreach ($MP in $MPCollection) {
    $MPParent = $MP.Name
    # Write a marker row for the parent management pack
    $Sheet.Cells.Item($intRow,1) = $MPParent
    $Sheet.Cells.Item($intRow,2) = "*************"
    $intRow++
    # Pull the management pack plus everything it references
    $MPChecking = Get-SCManagementPack -Name $MPParent -Recurse
    foreach ($Dependency in $MPChecking){
        Write-Host $intRow
        $MPName = $Dependency.Name
        $MPVersion = $Dependency.Version
        $MPID = $Dependency.Id
        $Sheet.Cells.Item($intRow,2) = $MPName
        $Sheet.Cells.Item($intRow,3) = $MPVersion
        $Sheet.Cells.Item($intRow,4) = "$MPID"
        $intRow++
    }
}

Script to fix “unknown” power state in Xen Desktop


After an unpretty Hyper-V cluster failover, several machines in our Xen Desktop deployment were showing an “unknown” power state.  After a call to Citrix, they gave my coworker a few commands to use to fix it.

This has to be done from the Xen Desktop controller:

Load the Citrix PSSnapIn:

Add-PSSnapIn Citrix.*

This gets information from VMM about all of the VMs it manages:

Cd XDHyp:\
Get-ChildItem -Recurse | Out-File -FilePath c:\xdhyp.txt

This command gets all of the machines that have a PowerState of Unknown in Xen Desktop:

Get-BrokerMachine -PowerState Unknown

The problem is that the “Id” from the first command doesn’t match the “HostedMachineId” from the second command.  To fix this, you run this command with the correct domain and machine name from the second command and the  “Id” from the first command:

Set-BrokerMachine -MachineName <MyDomain\MyMachine> -HostedMachineId <Id>

If you have a lot of machines with this problem, it could take a while to go through and match these up.  To save some time with the 75 or so we had to fix, I created this script to do it:

#Add-PSSnapIn Citrix.*

$x = 0
$UnknownList = Get-BrokerMachine -PowerState Unknown
# HostedMachineId          : 51c7f7a2-64bf-481a-86fd-49b9a3fbf993
# MachineName              : Domain\MachineName
foreach ($UnknownMachine in $UnknownList) {
    Write-Host $UnknownMachine.MachineName
    $UnknownMachineName = $UnknownMachine.MachineName
    # Strip the domain prefix so we can search on the machine name alone
    $SearchName = $UnknownMachineName.Split('\')[1]
    Write-Host "Search Name is $SearchName"
    $Group = "XDHyp:\Connections\<VMMSERVER>\<Vmmhostgroupname>.hostgroup\<clustername>.cluster"
    $GroupList = Get-ChildItem $Group | Where-Object {$_.Name -match $SearchName}
    # Name    : MachineName
    # Id      : 8d9d4e54-d374-406b-b4e3-7dcd2f47e7a9
    foreach ($HypVM in $GroupList) {
        $x++
        Write-Host $HypVM.Name
        $HostedMachineId = $HypVM.Id
        Write-Host $HostedMachineId
    }
    Write-Host $x
    Set-BrokerMachine -MachineName $UnknownMachineName -HostedMachineId $HostedMachineId
}

Import .msg files into Outlook using Powershell

We have some old email database backup files that we extracted messages from.  The purpose of this was to be able to expire the backups and do away with them, while keeping the messages in our Journal for e-discovery purposes.  There are better ways to do this than the way we did it, but it has been a learning process, and one of the things I was able to learn is how to import .msg files into Outlook.

You have to have a machine that has Outlook installed.  Outlook 2007 is the version I used. This would work with Outlook 2010, but you will get a popup about allowing scripting access to Outlook.

First, create the connection to Outlook:

$outlook = New-Object -comobject outlook.application
$namespace = $outlook.GetNamespace("MAPI")

Then connect to the folder, such as the Inbox:

$objInbox  = $outlook.Session.GetDefaultFolder(6)

Other examples:

$olAppointmentItem = 1
$olFolderDeletedItems = 3
$olFolderOutbox = 4
$olFolderSentMail = 5
$olFolderInbox = 6
$olFolderCalendar = 9
$olFolderContacts = 10
$olFolderJournal = 11
$olFolderNotes = 12
$olFolderTasks = 13
$olFolderDrafts = 16

$objDraftFolder = $outlook.Session.GetDefaultFolder($olFolderDrafts)
$objDeletedFolder = $outlook.Session.GetDefaultFolder($olFolderDeletedItems)

I like to know how many messages are in the folder before I begin the import:

$colItems = $objDraftFolder.Items  #this gets the items in the folder
$FolderItemCount = $colItems.Count #this counts them
Write-Host $FolderItemCount

Now you have to open the item and then move it to the folder you want to save it in:

$olMailItem = $NameSpace.OpenSharedItem($MailItem)
$olMailItem.Move( $objDraftFolder )   

If you put the above lines in, you will get a lot of data on the screen about the email.  To prevent that while still accomplishing the goal of moving the message to Outlook, simply put [void] in front like this:

[void]$olMailItem.Move( $objDraftFolder )
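Piping to Out-Null accomplishes the same thing, if you prefer that style (just a stylistic alternative; the result is identical):

$olMailItem.Move( $objDraftFolder ) | Out-Null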

I am working with around a million files, so this was a rather involved script to create.  Here is the script I used:


$olMailItemPath = "F:\Sorted\MoveToOutlook\ByThousands\*"
$AfterTime = [datetime]"12/21/2007"
$olAppointmentItem = 1
$olFolderDeletedItems = 3
$olFolderOutbox = 4
$olFolderSentMail = 5
$olFolderInbox = 6
$olFolderCalendar = 9
$olFolderContacts = 10
$olFolderJournal = 11
$olFolderNotes = 12
$olFolderTasks = 13
$olFolderDrafts = 16

Write-Host $olMailItemPath
$SourceFolders = Get-Item $olMailItemPath
Write-Host $SourceFolders.Count
$outlook = New-Object -ComObject outlook.application
$namespace = $outlook.GetNamespace("MAPI")
$x = 0

foreach ($SourceFolder in $SourceFolders) {
    Write-Host "SourceFolder is $SourceFolder"
    $SourceFiles = Get-ChildItem -Path $SourceFolder -Recurse -Include *.msg
    $SFCount = $SourceFiles.Count
    Write-Host "Source File Count is $SFCount"
    $objDraftFolder = $outlook.Session.GetDefaultFolder($olFolderDrafts)
    $objDeletedFolder = $outlook.Session.GetDefaultFolder($olFolderDeletedItems)
    $colItems = $objDraftFolder.Items
    $FolderItemCount = $colItems.Count
    # Pause if the Drafts folder is getting full so Outlook can catch up
    If ($FolderItemCount -ge 10000) {
        Write-Host "Draft Folder Item Count is $FolderItemCount"
        Write-Host "Sleeping..."
        Start-Sleep -Seconds 300
    }
    foreach ($MailItem in $SourceFiles) {
        $x++
        # Write-Host "Mail Item is $MailItem"
        $olMailItem = $namespace.OpenSharedItem($MailItem.FullName)
        $DateReceived = $olMailItem.ReceivedTime
        If ($DateReceived -le $AfterTime) {
            # Anything received on or before the cutoff date goes to Deleted Items
            [void]$olMailItem.Move( $objDeletedFolder )
        }
        Else {
            # Everything else goes to the Drafts folder
            [void]$olMailItem.Move( $objDraftFolder )
        }
        # Remove the source .msg file once it has been imported
        Remove-Item $MailItem.FullName
    }
}

Launch a PowerShell script minimized

We use Citrix for a lot of applications, and I have a need to launch Outlook, then an application, and then close Outlook when that application is closed by the user.  This seems like a pretty simple thing to do (and I suppose it is, sort of) but it took me a while to figure it out. 

One piece of the puzzle is that PowerShell remains open if you do it the way I have it set up right now.  If the user closes that PowerShell window, then the monitor process will not close Outlook when the user exits the LOB app.  To mitigate this issue somewhat, I wanted to start PowerShell minimized.  The way to do this is:

powershell -WindowStyle Minimized .\ScriptToRun.ps1
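For what it's worth, here is a rough sketch of the kind of wrapper script that gets launched that way (the executable path and process name below are placeholders, not our real application):

# Launch Outlook first
Start-Process outlook.exe

# Launch the line-of-business application and wait for the user to close it
$LOBApp = Start-Process "C:\Program Files\LOBApp\LOBApp.exe" -PassThru
$LOBApp.WaitForExit()

# Once the application is closed, close Outlook as well
Get-Process outlook -ErrorAction SilentlyContinue | Stop-Process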

Using Powershell to get logon script path from Active Directory

If you want to know what logon script users are getting, this is an easy way to get that information:

Import-Module -Name ActiveDirectory

Get-ADUser -Filter * -SearchBase "OU=YourOUName,DC=YourDomain,DC=COM" -properties ScriptPath | Export-Csv "c:\script\ADUser.csv"

Note: In order for this to work, the ActiveDirectory module has to be available on the machine where you run it.
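As a follow-up, if you only want the users that have no logon script set, you can filter before exporting (a quick sketch using the same hypothetical OU and output path):

Get-ADUser -Filter * -SearchBase "OU=YourOUName,DC=YourDomain,DC=COM" -Properties ScriptPath |
    Where-Object { -not $_.ScriptPath } |
    Select-Object Name, SamAccountName |
    Export-Csv "c:\script\ADUserNoScript.csv" -NoTypeInformation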

Not recognized as a cmdlet…

I have been working on a simple little script to copy a file and then launch a program.  I am sure that there are a lot of ways to do it, but I decided to use PowerShell, and this is what I came up with:

$CheckForFile = "H:\custom.ini"
$FileToCopy = "c:\IT\custom.ini"
$CopyFileTo = "H:\"

# Test-Path returns a boolean, so compare against $false rather than the string "false"
$PathTest = Test-Path $CheckForFile
If ($PathTest -eq $false) {
    Copy-Item $FileToCopy $CopyFileTo
}

# Uses the Invoke-Item command to launch the application
Invoke-Item "C:\Program Files\executable to launch.exe"

This is for use in a Citrix/Terminal Server environment, so I want to be able to call this script like this: PowerShell copythenlaunch.ps1

When I tested that, I got this:

C:\IT>powershell copythenlaunch.ps1
The term ‘copythenlaunch.ps1’ is not recognized as a cmdlet, function, operable
program, or script file. Verify the term and try again.
At line:1 char:18
+ copythenlaunch.ps1 <<<<

I kept thinking there was some problem with the install of PowerShell (I am running this particular script on a Windows 2003 Server) or that I had some illegal character in the name (it had a number in it originally) or some other simple problem.  Finally I did a search and came across this little bit of conversation:

re: Power and Pith

I just started with PowerShell.

Wanted to run some test scripts from you download.

When I tpye in Beep.ps1 I get "The term ‘Beep.1’ is not recognized….."

What Am I doing wrong?

Friday, December 29, 2006 3:17 PM by MikeL

# re: Power and Pith

> When I tpye in Beep.ps1 I get "The term ‘Beep.1’ is not recognized….."

> What Am I doing wrong?

You are relying upon a traditional bad shell behaviour that has been a security nightmere for decades.

In PowerShell, you have to be explicit if you want to run a command in the current directory.  Type ".\beep.ps1"

Jeffrey Snover [MSFT]

Windows PowerShell/MMC Architect

Visit the Windows PowerShell Team blog at:

Visit the Windows PowerShell ScriptCenter at:

Friday, December 29, 2006 5:19 PM by PowerShellTeam

# re: Power and Pith

Thank You for supplying the ".\*" information.  I have been racking my brain for almost two days wondering what I was doing wrong.  And to think it was as simple as using the PROPER .\yourscripthere.ps1 format.

Thank you very very much

Ditto on the thanks…

Windows PowerShell Blog : Power and Pith
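So the fix in my case was simply to be explicit about the path when calling the script:

powershell .\copythenlaunch.ps1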