Thursday, 8 December 2011

Manage services on SharePoint 2010 servers using PowerShell

As you may already know, you can use the “Manage Services on Server” option from Central Administration to centrally start and stop service instances across all SharePoint servers in your farm. An example of this administration page is shown below:

image

The issue with this page is that it can be cumbersome to use if there are more than a few servers in your farm, as you have to keep clicking the “Server” drop-down to switch between them. It is also not repeatable, and there are situations where you might want to automate the process – for example, stopping the User Profile Synchronization Service before a backup or application update and restarting it once complete.

You can get a similar list to the one shown in Central Admin using PowerShell by typing the following command:

Get-SPServiceInstance -Server PAC-SP2010 | sort TypeName | Format-Table -AutoSize

This will not only provide the service name and status, but also the service instance GUID:

image
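To see the same information for every server in the farm at once, you can loop through the output of Get-SPServer. This is a quick sketch, assuming the SharePoint snap-in is loaded; servers whose Role is “Invalid” are database-only servers and are skipped:

```powershell
# List service instances on every SharePoint server in the farm
Get-SPServer | Where-Object { $_.Role -ne "Invalid" } | ForEach-Object {
    Write-Host "Service instances on" $_.Name
    Get-SPServiceInstance -Server $_.Name | Sort-Object TypeName |
        Format-Table TypeName, Status, Id -AutoSize
}
```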

There are a couple of ways to stop a service instance. The first, and most user-friendly, method is to filter on the service display name and pipe the result to the Stop-SPServiceInstance cmdlet:

Get-SPServiceInstance -Server PAC-SP2010  | where-object {$_.TypeName -eq "Managed Metadata Web Service" } | Stop-SPServiceInstance -Confirm:$false

There are a couple of things to note here:

  • The -Server switch is not required if you are configuring service instances on the server you are currently logged into
  • The -Confirm:$false switch is also optional. Setting it to $false suppresses the confirmation prompt that appears when attempting to stop the service instance through PowerShell
  • You can also use the Stop-SPServiceInstance cmdlet directly by specifying the service instance GUID, as shown below for the Managed Metadata Web Service on this farm:

    Stop-SPServiceInstance -Identity 3f36d0f4-ce39-48aa-a776-6300f1e2b58f -Confirm:$false

    image

    Restarting the service is simply a case of typing either of the same commands as before, but this time replacing Stop-SPServiceInstance with Start-SPServiceInstance.

    First example:

    Get-SPServiceInstance -Server PAC-SP2010  | where-object {$_.TypeName -eq "Managed Metadata Web Service" } | Start-SPServiceInstance -Confirm:$false

    Second example:

    Start-SPServiceInstance -Identity 3f36d0f4-ce39-48aa-a776-6300f1e2b58f -Confirm:$false

    That’s great, but you might also want to check that a service instance has a particular status before attempting to stop or start it. In the example below, the Managed Metadata Web Service instance is assigned to the $serviceinstance variable and its Status is checked to make sure it is “Online” before attempting to stop the service instance:

    $serviceinstance = Get-SPServiceInstance -Server PAC-SP2010  | where-object {$_.TypeName -eq "Managed Metadata Web Service" }
    if ($serviceinstance.Status -eq "Online")
    {
        Write-Host "Stopping Managed Metadata Web Service on PAC-SP2010"
        try
        {
            #Use -ErrorAction Stop so that errors are terminating and reach the catch block
            $serviceinstance | Stop-SPServiceInstance -Confirm:$false -ErrorAction Stop
        }
        catch { Write-Host "There was an error:" $_ }
    }
    else
    {
        Write-Host "Managed Metadata Web Service on PAC-SP2010 has already been stopped"
    }
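One thing to bear in mind is that stopping or starting a service instance is asynchronous – the cmdlet can return while the instance is still unprovisioning or provisioning. If you need to block until the operation completes (before kicking off a backup, say), something along the lines of the hypothetical helper below could poll the status; the function name and timeout value are my own:

```powershell
# Hypothetical helper: poll a service instance until it reaches the
# requested status or the timeout expires
function Wait-SPServiceInstanceStatus ($Instance, $Status, $TimeoutSeconds = 300)
{
    $elapsed = 0
    while ($elapsed -lt $TimeoutSeconds)
    {
        #Re-read the instance to pick up status changes
        $current = Get-SPServiceInstance -Identity $Instance.Id
        if ($current.Status -eq $Status) { return $true }
        Start-Sleep -Seconds 5
        $elapsed += 5
    }
    return $false
}

#Example: wait for the stopped instance to report Disabled
Wait-SPServiceInstanceStatus -Instance $serviceinstance -Status "Disabled"
```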

    Wednesday, 2 November 2011

    Use PowerShell to check for illegal characters before uploading multiple files into SharePoint

    If you have done any sort of bulk file uploading into SharePoint, you will be aware of issues with file names containing illegal characters. These files can disrupt the uploading process, potentially causing many hours of frustrating, time-consuming work examining and repairing file names.

    Files and folders are blocked by SharePoint during the uploading process for the following reasons:

    • They contain the following characters: & { } ~ # % (there are other illegal characters too, but as they are also blocked from use in Windows Explorer, it is assumed you will not have files named with these characters in your file system – if you do, you can adapt the script accordingly)
    • They are 128 characters in length or over
    • They start with a period character
    • They end with a period character
    • They contain consecutive period characters

    There is further information available on these criteria here: http://www.thesug.org/mossasaurus/Wiki%20Pages/SharePoint%20Invalid%20Characters.aspx.

    The PowerShell script in this article allows you to scan an entire folder structure, including subfolders, and report on all files and folders containing one or more of the conditions listed above. There are also options within the script to automatically rename illegal characters in file names with something acceptable to SharePoint – for example, renaming the & symbol with the word ‘and’.

    To use the script, first load the following function in a PowerShell console. Note that loading the function will not actually do anything until you call it later from the command line:

    function Check-IllegalCharacters ($Path, [switch]$Fix, [switch]$Verbose)
    {
        Write-Host "Checking files in $Path, please wait..."
        #Get all files and folders under the path specified
        $items = Get-ChildItem -Path $Path -Recurse
        foreach ($item in $items)
        {
            #Check if the item is a file or a folder
            if ($item.PSIsContainer) { $type = "Folder" }
            else { $type = "File" }
           
            #Report item has been found if verbose mode is selected
            if ($Verbose) { Write-Host "Found a $type called $($item.FullName)" }
           
            #Check if item name is 128 characters or more in length
            if ($item.Name.Length -gt 127)
            {
                Write-Host $type $item.Name is 128 characters or over and will need to be truncated -ForegroundColor Red
            }
            else
            {
                #Got this technique from http://powershell.com/cs/blogs/tips/archive/2011/05/20/finding-multiple-regex-matches.aspx
                $illegalChars = '[&{}~#%]'
                filter Matches($illegalChars)
                {
                    $item.Name | Select-String -AllMatches $illegalChars |
                    Select-Object -ExpandProperty Matches |
                    Select-Object -ExpandProperty Value
                }
               
                #Replace illegal characters with legal characters where found
                $newFileName = $item.Name
                Matches $illegalChars | ForEach-Object {
                    Write-Host $type $item.FullName has the illegal character $_.Value -ForegroundColor Red
                    #These characters may be used on the file system but not SharePoint
                    if ($_.Value -match "&") { $newFileName = ($newFileName -replace "&", "and") }
                    if ($_.Value -match "{") { $newFileName = ($newFileName -replace "{", "(") }
                    if ($_.Value -match "}") { $newFileName = ($newFileName -replace "}", ")") }
                    if ($_.Value -match "~") { $newFileName = ($newFileName -replace "~", "-") }
                    if ($_.Value -match "#") { $newFileName = ($newFileName -replace "#", "") }
                    if ($_.Value -match "%") { $newFileName = ($newFileName -replace "%", "") }
                }
               
                #Check for start, end and double periods
                if ($newFileName.StartsWith(".")) { Write-Host $type $item.FullName starts with a period -ForegroundColor red }
                while ($newFileName.StartsWith(".")) { $newFileName = $newFileName.TrimStart(".") }
                if ($newFileName.EndsWith(".")) { Write-Host $type $item.FullName ends with a period -ForegroundColor Red }
                while ($newFileName.EndsWith("."))   { $newFileName = $newFileName.TrimEnd(".") }
                if ($newFileName.Contains("..")) { Write-Host $type $item.FullName contains double periods -ForegroundColor red }
                while ($newFileName.Contains(".."))  { $newFileName = $newFileName.Replace("..", ".") }
               
                #Fix file and folder names if found and the Fix switch is specified
                if (($newFileName -ne $item.Name) -and ($Fix))
                {
                    Rename-Item $item.FullName -NewName ($newFileName)
                    Write-Host $type $item.Name has been changed to $newFileName -ForegroundColor Blue
                }
            }
        }
    }

    As commented in the script, I have used a code snippet from the PowerShell.com blog to find multiple regular expression matches in the file and folder names.
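The matching technique can be tried on its own, outside the function, to see what it returns for a sample name:

```powershell
# Standalone illustration of the multiple-match technique
$illegalChars = '[&{}~#%]'
"Sales & Marketing #2.docx" | Select-String -AllMatches $illegalChars |
    Select-Object -ExpandProperty Matches |
    Select-Object -ExpandProperty Value
# Returns one object per match - here the characters & and #
```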

    Once loaded, you can call the script using the following commands as examples:

    Check-IllegalCharacters -Path C:\Files

    The command above will check the folder path specified but will only report file and folder names detected with illegal characters or length.

    Check-IllegalCharacters -Path C:\Files -Verbose

    This command also reports only file and folder names detected with illegal characters or length, but this time it tells you the names of all the files and folders it has checked along the way. This can be used to make sure the script is checking all the locations you are expecting it to.
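Note that because the report is produced with Write-Host, it cannot be captured with standard output redirection. One way to keep a record of a large scan (using a log file path of my own choosing) is to wrap the call in a transcript:

```powershell
# Capture the console output of a large scan to a transcript file
Start-Transcript -Path "C:\Temp\IllegalCharsReport.txt"
Check-IllegalCharacters -Path C:\Files -Verbose
Stop-Transcript
```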

    Check-IllegalCharacters -Path C:\Files -Fix

    The command here will not only check file and folder names for illegal characters, but will also fix them using the rules specified in the script. You can customise these rules as you see fit, but I have gone with the following criteria:

    • Do not change files and folders with names of 128 characters or over (i.e., manually truncate them)
    • Replace two or more consecutive periods in a file or folder name with a single period
    • If the file or folder name either starts or finishes with a period, remove it
    • File or folder names containing illegal characters are processed as follows:
      • Replace ‘&’ with ‘and’
      • Replace ‘{’ with ‘(’
      • Replace ‘}’ with ‘)’
      • Replace ‘~’ with ‘-’
      • Remove the ‘#’ character
      • Remove the ‘%’ character

    An example running the script on some files and folders containing deliberately illegal characters is shown below:

    The Illegal Files

    The following screenshot shows the output from running the script:

    image

    And evidence that the files were renamed successfully…

    The Proof

    Thursday, 27 October 2011

    Copy SharePoint lists and document libraries from one site to another using PowerShell

    This article is the first in a series on exporting and importing SharePoint objects using the Content Deployment API and PowerShell. I’m not going to cover the API itself, as there are lots of articles doing this already, notably a series from Stefan Goßner and MSDN, which I would urge you to check out for more details.

    You can achieve some of this functionality with the Export-SPWeb and Import-SPWeb cmdlets provided with SharePoint 2010, but (a) these are not available in SharePoint 2007, and (b) using the Content Deployment API directly provides access to extra capabilities. I have written these scripts to work on SharePoint 2007 and 2010 (WSS and Foundation included), although to date I have only tested them on SharePoint Server 2010.

    This first article provides a couple of functions for exporting a list or document library and all items contained within it from one site and importing it into another site. The destination site can be in a different site collection, web application and even farm. I have provided command parameters for what I think will be the most popular configuration options, but as we have full access to the API, the scripts can be adapted to suit requirements.

    Exporting a list or document library

    The first function exports a list or document library from a SharePoint site into either a folder or compressed .cmp file on a network or local drive. Before you can run the command, save the following function as a .ps1 file:

    function Export-List
    {
        Param (
               [parameter(Mandatory=$true)][string]$WebUrl,
               [parameter(Mandatory=$true)][string]$ListName,
               [parameter(Mandatory=$true)][string]$Path,
               [parameter(Mandatory=$false)][switch]$ExcludeDependencies,
               [parameter(Mandatory=$false)][switch]$HaltOnWarning,
               [parameter(Mandatory=$false)][switch]$HaltOnNonfatalError,
               [parameter(Mandatory=$false)][switch]$AutoGenerateDataFileName,
               [parameter(Mandatory=$false)][switch]$TestRun,
               [parameter(Mandatory=$false)][string]$IncludeSecurity,
               [parameter(Mandatory=$false)][string]$IncludeVersions,
               [parameter(Mandatory=$false)][int]$FileMaxSize,
               [parameter(Mandatory=$false)][switch]$Overwrite,
               [parameter(Mandatory=$false)][switch]$SuppressLog
               )
       
        #Load SharePoint 2010 cmdlets
        $ver = $host | select version
        if ($ver.Version.Major -gt 1)  {$Host.Runspace.ThreadOptions = "ReuseThread"}
        Add-PsSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
        #Load assemblies (needed for SharePoint Server 2007)
        [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
       
        #Check parameters have the correct values
        if (!$IncludeSecurity)
        {
            $IncludeSecurity = "None"
        }
        else
        {
            if (($IncludeSecurity -ne "All") `
            -and ($IncludeSecurity -ne "WssOnly") `
            -and ($IncludeSecurity -ne "None"))
            {
                Throw "The IncludeSecurity parameter must be set to All, WssOnly or None"
            }
        }
       
        if (!$IncludeVersions)
        {
            $IncludeVersions = "LastMajor"
        }
        else
        {
            if (($IncludeVersions -ne "All") `
            -and ($IncludeVersions -ne "CurrentVersion") `
            -and ($IncludeVersions -ne "LastMajor") `
            -and ($IncludeVersions -ne "LastMajorAndMinor"))
            {
                Throw "The IncludeVersions parameter must be set to All, CurrentVersion, LastMajorAndMinor or LastMajor"
            }
        }
       
        if (!$FileMaxSize)
        {
            $FileMaxSize = 0
        }
           
        $site = New-Object Microsoft.SharePoint.SPSite($WebUrl)
        $web = $site.OpenWeb()
        $list = $web.Lists[$ListName]
        [bool]$FileCompression = $false
       
        #Set file paths for the export file and logs
        [string]$exportPath = $Path.TrimEnd("\")
        if ($exportPath.EndsWith(".cmp"))
        {
            $FileCompression = $true
            $exportFile = $Path.Replace($Path.Remove($Path.LastIndexOf("\")+1),"")
            $exportPath = $Path.Remove($Path.LastIndexOf("\"))
        }
       
        $logFilePath = $exportPath + "\exportlog.txt"
        New-Item -Path $logFilePath -Type File -Force | Out-Null
        Write-Host "Export log file created at" $logFilePath
       
        $exportObject = New-Object Microsoft.SharePoint.Deployment.SPExportObject
        $exportObject.Id = $list.ID
        $exportObject.Type = [Microsoft.SharePoint.Deployment.SPDeploymentObjectType]::List
       
        #Create the export settings from the parameters specified
        $exportSettings = New-Object Microsoft.SharePoint.Deployment.SPExportSettings
        $exportSettings.SiteUrl = $site.Url
        $exportSettings.ExportMethod = [Microsoft.SharePoint.Deployment.SPExportMethodType]::ExportAll
        $exportSettings.FileLocation = $exportPath
        $exportSettings.FileCompression = $FileCompression
        if ($FileCompression) { $exportSettings.BaseFileName = $exportFile }
        $exportSettings.ExcludeDependencies = $ExcludeDependencies
        $exportSettings.OverwriteExistingDataFile = $Overwrite
        $exportSettings.IncludeSecurity = $IncludeSecurity
        $exportSettings.IncludeVersions = $IncludeVersions
        $exportSettings.LogFilePath = $logFilePath
        $exportSettings.HaltOnWarning = $HaltOnWarning
        $exportSettings.HaltOnNonfatalError = $HaltOnNonfatalError
        $exportSettings.AutoGenerateDataFileName = $AutoGenerateDataFileName
        $exportSettings.TestRun = $TestRun
        $exportSettings.FileMaxSize = $FileMaxSize
        $exportSettings.ExportObjects.Add($exportObject)

        #Write the export settings to a log file   
        Out-File -FilePath $logFilePath -InputObject $exportSettings -Append -Encoding utf8
       
        #Run the export procedure
        $export = New-Object Microsoft.SharePoint.Deployment.SPExport($exportSettings)
        $export.Run()
       
        #Load notepad showing the log file
        if (!$SuppressLog) { notepad $logFilePath }
       
        #Dispose of the web and site objects after use
        $web.Dispose()
        $site.Dispose()
    }

    To load the function in a PowerShell console, type . 'C:\YourPath\YourFilename.ps1' (note the leading period and space, which dot-sources the file). You should now be able to use the Export-List command to export your list. The available parameters are listed below. Much of this text is a straight copy from the MSDN article referenced at the start of this article.

    • -WebUrl “http://portal/” (Mandatory): Absolute URL of the site containing the list being exported

    • -ListName “Team Contacts” (Mandatory): Display name of the list to be exported

    • -Path “C:\Export” or -Path “C:\Export\List.cmp” (Mandatory): Location on the file system where the exported files will be copied. You can also specify a .cmp file if you want the export to be compressed

    • -ExcludeDependencies (Optional): Excludes dependencies from the export package. Generally, you should always include export dependencies to avoid breaking objects in the export target

    • -HaltOnWarning (Optional): Stops the export operation if a warning occurs

    • -HaltOnNonfatalError (Optional): Stops the export operation on a non-fatal error

    • -AutoGenerateDataFileName (Optional): Automatically generates the file name for the content migration package. When the export generates multiple .cmp files, the file names are appended numerically. For example, where the file name is "MyList" and the export operation produces multiple .cmp files, the migration packages are named "MyList1.cmp", "MyList2.cmp", and so forth

    • -TestRun (Optional): Completes a test run to examine the export process and log any warnings or errors

    • -IncludeSecurity All (Optional): Exports site security groups and group membership information. The enumeration provides three values:

      • All: Exports user memberships and role assignments, such as out-of-the-box roles like Web Designer, plus any custom roles that extend from them. The ACL for each object is exported to the migration package, as well as user information defined in the DAP or LDAP servers

      • None: No user role or security group information is exported. This is the default when the parameter is not specified

      • WssOnly: As with All, exports user memberships, role assignments and the ACL for each object; however, user information defined in the DAP or LDAP servers is not exported

      Note: Be careful with this parameter. When exporting objects smaller than a web (for example, a list or list item), you should set IncludeSecurity to None; otherwise, security group and membership information for the entire web is exported

    • -IncludeVersions (Optional): Determines what content is selected for export based on version information. There are four enumeration values:

      • All: Exports all existing versions of selected files

      • CurrentVersion: Exports only the most recent version of selected files

      • LastMajor: Exports only the last major version of selected files. This is the default when the parameter is not specified

      • LastMajorAndMinor: Exports the last major version and its minor versions

    • -FileMaxSize (Optional): Maximum size for a content migration package (.cmp) file output by the export operation. By default, .cmp files are limited to 24 MB in size; if set to zero, the value resets to the default. When site data exceeds the specified limit, it is separated into two or more migration files. However, if a single site data file exceeds the maximum file size, the operation does not split the source file, but instead resizes the .cmp file to accommodate it. You can have any number of .cmp files. The allowable range is 1 MB to 2 GB; if you specify a value outside this range, the export operation reverts to the default of 24 MB

    • -Overwrite (Optional): Overwrites an existing content migration package file when running the export. If this parameter is not specified, an exception is thrown if the specified data file already exists

    • -SuppressLog (Optional): By default, the command opens Notepad at the end of the export routine, showing an activity list and any warnings and errors reported. This switch prevents the log file from opening in Notepad, which is useful when performing bulk operations

    Here are a few examples of how the Export-List function can be used. First, this command will export a “Team Contacts” list and all list items from the site http://portal/sites/sales to the folder C:\Export\TeamContacts on the server, overwriting any existing export in that folder:

    Export-List -WebUrl "http://portal/sites/sales" -ListName "Team Contacts" -Path "C:\Export\TeamContacts" -Overwrite

    This one will export a “Team Documents” library and all documents contained within it from the site http://portal/sites/sales to the file C:\Export\TeamDocuments.cmp, overwriting any existing .cmp file with the same name. This time, we are also going to export permissions set on the library by using the IncludeSecurity parameter:

    Export-List -WebUrl "http://portal/sites/sales" -ListName "Team Documents" -Path "C:\Export\TeamDocuments.cmp" -Overwrite -IncludeSecurity All

    image
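Because Export-List is an ordinary function, it can also be called in a loop. As a sketch (SharePoint 2010 only, as it uses Get-SPWeb, and assuming the function above is already loaded), this exports every list in a site to its own .cmp file:

```powershell
# Export every list in a site to an individual .cmp file
$web = Get-SPWeb "http://portal/sites/sales"
foreach ($list in $web.Lists)
{
    #Strip characters that are invalid in Windows file names from the list title
    $fileName = ($list.Title -replace '[\\/:*?"<>|]', '') + ".cmp"
    Export-List -WebUrl $web.Url -ListName $list.Title -Path ("C:\Export\" + $fileName) -Overwrite -SuppressLog
}
$web.Dispose()
```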

    Importing a list or document library

    Adding your exported list or document library to another SharePoint site can be achieved using the Import-List function below. As with the export function, add the import function to the same or different .ps1 file and load it into the PowerShell console in the same way:

    function Import-List
    {
        Param (
               [parameter(Mandatory=$true)][string]$WebUrl,
               [parameter(Mandatory=$true)][string]$Path,
               [parameter(Mandatory=$false)][switch]$HaltOnWarning,
               [parameter(Mandatory=$false)][switch]$HaltOnNonfatalError,
               [parameter(Mandatory=$false)][switch]$RetainObjectIdentity,
               [parameter(Mandatory=$false)][string]$IncludeSecurity,
               [parameter(Mandatory=$false)][string]$UpdateVersions,
               [parameter(Mandatory=$false)][switch]$SuppressLog
               )
       
        #Load SharePoint 2010 cmdlets
        Add-PSSnapin Microsoft.SharePoint.PowerShell -erroraction SilentlyContinue
        #Load assemblies (needed for SharePoint Server 2007)
        [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
           
        #Check parameters have the correct values
        if (!$IncludeSecurity)
        {
            $IncludeSecurity = "None"
        }
        else
        {
            if (($IncludeSecurity -ne "All") `
            -and ($IncludeSecurity -ne "WssOnly") `
            -and ($IncludeSecurity -ne "None"))
            {
                Throw "The IncludeSecurity parameter must be set to All, WssOnly or None"
            }
        }
       
        if (!$UpdateVersions)
        {
            $UpdateVersions = "Overwrite"
        }
        else
        {
            if (($UpdateVersions -ne "Overwrite") `
            -and ($UpdateVersions -ne "Append") `
            -and ($UpdateVersions -ne "Ignore"))
            {
                Throw "The UpdateVersions parameter must be set to Overwrite, Append or Ignore"
            }
        }
       
        $site = New-Object Microsoft.SharePoint.SPSite($WebUrl)
        $web = $site.OpenWeb()
       
        $importSettings = New-Object Microsoft.SharePoint.Deployment.SPImportSettings
       
        #Set file paths for the import file and logs
        $fileName = ""
        if ($Path.EndsWith(".cmp"))
        {
            $fileName = $Path.Replace($Path.Remove($Path.LastIndexOf("\")+1),"")
            $importPath = $Path.Remove($Path.LastIndexOf("\"))
            $importSettings.FileCompression = $true
            $importSettings.BaseFileName = $fileName
        }
        else
        {
            $importPath = $Path.TrimEnd("\")
            $importSettings.FileCompression = $false
        }
       
        $logFilePath = $importPath + "\importlog.txt"
        New-Item -Path $logFilePath -Type File -Force | Out-Null
        Write-Host "Import log file created at" $logFilePath
       
        #Create the import settings from the parameters specified
        Write-Host "Configuring import settings"
        $importSettings.SiteUrl = $site.Url
        $importSettings.WebUrl = $web.Url
        $importSettings.FileLocation = $importPath
        $importSettings.IncludeSecurity = $IncludeSecurity
        $importSettings.UpdateVersions = $UpdateVersions
        $importSettings.LogFilePath = $logFilePath
        $importSettings.HaltOnWarning = $HaltOnWarning
        $importSettings.HaltOnNonfatalError = $HaltOnNonfatalError
        $importSettings.RetainObjectIdentity = $RetainObjectIdentity
        $importSettings.UserInfoDateTime = "ImportAll"
        if (!$SuppressLog) { $importSettings }
       
        #Write the import settings to a log file
        Out-File -FilePath $logFilePath -InputObject $importSettings -Append -Encoding utf8
       
        #Run the import procedure
        Write-Host "Import running, please wait..."
        $import = New-Object Microsoft.SharePoint.Deployment.SPImport($importSettings)
        $import.Run()
        Write-Host "Import from" $Path "complete"
       
        #Load notepad showing the log file
        if (!$SuppressLog) { notepad $logFilePath }
       
        #Dispose of the web and site objects after use
        $web.Dispose()
        $site.Dispose()
    }

    The parameters available for importing lists and document libraries are listed below:

    • -WebUrl “http://portal/” (Mandatory): Absolute URL of the site into which the list will be imported

    • -Path “C:\Export” or -Path “C:\Export\List.cmp” (Mandatory): Folder location and file name (if applicable) on the file system where the export files have been copied. If only a folder is specified, the script assumes that compression was not used during the export process; if a file with the .cmp extension is specified, compression is assumed and applied to the import process

    • -HaltOnWarning (Optional): Stops the import operation if a warning occurs

    • -HaltOnNonfatalError (Optional): Stops the import operation on a non-fatal error

    • -RetainObjectIdentity (Optional): Ensures the same list GUID and item IDs are used when migrating a list from one environment to another. It is most commonly used when you have two Web sites that are mirror images of each other, or in a publishing scenario to indicate that a list in one database is the same list in another database. An exception is raised if you try to use RetainObjectIdentity to import a list that already exists within the same SharePoint Web application. Note: Because databases do not permit duplicate GUIDs, be careful when using this property; it is only useful in selective migration scenarios

    • -IncludeSecurity All (Optional): Imports site security groups and group membership information. The enumeration provides three values:

      • All: Imports user memberships and role assignments, such as out-of-the-box roles like Web Designer, plus any custom roles that extend from them. The ACL for each object is imported from the migration package, as well as user information defined in the DAP or LDAP servers

      • None: No user role or security group information is imported. This is the default when the parameter is not specified

      • WssOnly: As with All, imports user memberships, role assignments and the ACL for each object; however, user information defined in the DAP or LDAP servers is not imported

    • -UpdateVersions (Optional): Indicates how versioning is managed in the destination location. There are three allowable values:

      • Append: Appends versioned items on import

      • Overwrite: Deletes the existing item on import, then reinserts the imported version. This is the default when the parameter is not specified

      • Ignore: Ignores versioning

    • -SuppressLog (Optional): By default, the command opens Notepad at the end of the import routine, showing an activity list and any warnings and errors reported. This switch prevents the log file from opening in Notepad, which is useful when performing bulk operations

    Here are a few examples of how the Import-List function can be used. First, this command will import the “Team Contacts” list and all list items from the exported files in the folder C:\Export\TeamContacts to the site http://intranet/sites/sales:

    Import-List -WebUrl "http://intranet/sites/sales" -Path "C:\Export\TeamContacts"

    This one will import the “Team Documents” library and all documents contained within it from the file C:\Export\TeamDocuments.cmp to the site http://intranet/sites/sales. This time, we are also going to import permissions set on the library by using the IncludeSecurity parameter:

    Import-List -WebUrl "http://intranet/sites/sales" -Path "C:\Export\TeamDocuments.cmp" -IncludeSecurity All

    image

    Note, by comparing the screenshot above with the one earlier, that the last modified date, time and user of each document have been migrated across with the list; this is controlled by the $importSettings.UserInfoDateTime = "ImportAll" line in the script.

    The Power in PowerShell

    In addition to having full access to the Content Deployment API in SharePoint, one of the other advantages of using PowerShell for these tasks is the ability to reuse previously created scripts or perform operations in bulk to handle repetitive tasks.

    For example, once we have exported our list or document library, we can use these few lines of PowerShell script (SharePoint 2010 only) to walk through every site in a site collection and import the list into each of those sites:

    $site = Get-SPSite "http://intranet/sites/sales"
    $site | Get-SPWeb -Limit All | ForEach-Object {
        Write-Host Importing list to site $_.Url
        Import-List -WebUrl $_.Url -Path "C:\Export\TeamContacts" -IncludeSecurity All -SuppressLog
    }

    Using the Content Management API in this way can be very useful for deploying or migrating lists and libraries that contain a pre-configured standard set of items, columns, permissions and other settings across one or more existing sites in different site collections or farms.

    As always though, make sure you test these scripts in a development environment first before running them in your production farm!

    Tuesday, 18 October 2011

    Pre-create SharePoint 2010 databases in SQL Server 2008 using PowerShell

    There are some scenarios where it is a good idea to create databases in SQL Server prior to installing SharePoint. Typically, this is where a DBA may want control over the database naming convention and configuration in advance of letting you loose on the install DVD!

    There are other benefits to doing this, too. For example, you can set the database and log file locations on the file system, configure the recovery model, pre-grow databases, set autogrowth and maximum database sizes in advance, and apply the other Microsoft recommendations listed in the Storage and SQL Server capacity planning and configuration article on TechNet.

    The example here lists all the databases and their configuration settings in a CSV file. We then use a PowerShell script to import this data and use the Invoke-Sqlcmd cmdlet to run a .sql file that creates the databases, configures the properties set in the CSV file, and assigns the SharePoint install account database owner permissions.

    Before doing this, I recommend that you look at a couple of articles on TechNet in addition to the one listed above. First, the Deploy by using DBA-created databases article, which covers the process of installing SharePoint 2010 on pre-created databases (note that you cannot pre-create the Search service application databases using the method shown in this article). Second, the “Adding the SQL Server Snap-ins to Windows PowerShell” section of the Running SQL Server PowerShell article, which goes through how to add the SQL Server Provider to the regular PowerShell or PowerShell ISE consoles – you will need to do this before running the PowerShell script shown in this article. Don’t forget that you will also need to log in as an account with sufficient permissions in SQL Server to create the databases and set the permissions required.
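    As a quick reference from that TechNet article, the SQL Server snap-ins can be added to a standard PowerShell console with commands along these lines (the snap-in names shown are the ones registered by SQL Server 2008/2008 R2):

    ```powershell
    #Register the SQL Server 2008 provider and cmdlet snap-ins
    #(the cmdlet snap-in is the one that provides Invoke-Sqlcmd)
    Add-PSSnapin SqlServerProviderSnapin100
    Add-PSSnapin SqlServerCmdletSnapin100
    ```

    These commands only work on a machine where the SQL Server client tools are installed, so they are shown here as a reminder rather than something to run as-is.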

    CSV File

    I am showing an example CSV file below that includes all columns needed for the script used in this article, but of course, you can add or remove any columns required for your deployment:

    DBNAME,SETUPUSER,DATAFILEPATH,DATASIZE,DATAMAXSIZE,DATAFILEGROWTH,LOGFILEPATH,LOGSIZE,LOGFILEGROWTH,RECOVERYMODE
    SP_Test1,DOMAIN\SP_Admin,C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLINSTANCE\MSSQL\DATA,1024000KB,1024000KB,102400KB,C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLINSTANCE\MSSQL\LOGS,1024KB,10%,SIMPLE
    SP_Test2,DOMAIN\SP_Admin,C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLINSTANCE\MSSQL\DATA,512000KB,UNLIMITED,20%,C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQLINSTANCE\MSSQL\LOGS,1024KB,10%,FULL

    Admittedly, it is not the easiest thing to read here, but if you copy the contents to a CSV file using Notepad it will look a lot clearer. Note that I am not specifying SharePoint database names in the CSV file – I’ll leave that one to you, considering your naming conventions and database configuration will vary anyway.
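    If you want to check how PowerShell will parse the file before running anything against SQL Server, Import-Csv gives a quick preview (this assumes the CSV has been saved to C:\Scripts\SQL\Databases.csv, the path used by the script later in this article):

    ```powershell
    #Preview the parsed CSV to confirm the headers and values line up as expected
    Import-Csv -Path C:\Scripts\SQL\Databases.csv |
        Select-Object DBNAME, RECOVERYMODE, DATASIZE, LOGSIZE |
        Format-Table -AutoSize
    ```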

    SQL File

    Next, we need a .sql file that will be used by the PowerShell script to create the databases. This file was created using the “Script Action” option in the SQL Server Management Studio console after creating a single database and assigning dbo permissions to the SharePoint install account.

    Script Action

    I then saved the contents of the script as a “Create SharePoint Database.sql” file and replaced the values configured in the console with variables – e.g., replacing the database name with $(DBNAME). The PowerShell script shown later will replace these variables with values from the CSV file:

    /* Create SharePoint database */
    CREATE DATABASE [$(DBNAME)] ON  PRIMARY
    ( NAME = N'$(DBNAME)', FILENAME = N'$(DATAFILEPATH)\$(DBNAME).mdf' , SIZE = $(DATASIZE) , MAXSIZE = $(DATAMAXSIZE) , FILEGROWTH = $(DATAFILEGROWTH) )
    LOG ON
    ( NAME = N'$(DBNAME)_log', FILENAME = N'$(LOGFILEPATH)\$(DBNAME)_log.ldf' , SIZE = $(LOGSIZE) , FILEGROWTH = $(LOGFILEGROWTH))
    COLLATE Latin1_General_CI_AS_KS_WS
    GO
    ALTER DATABASE [$(DBNAME)] SET COMPATIBILITY_LEVEL = 100
    GO
    ALTER DATABASE [$(DBNAME)] SET ANSI_NULL_DEFAULT OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET ANSI_NULLS OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET ANSI_PADDING OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET ANSI_WARNINGS OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET ARITHABORT OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET AUTO_CLOSE OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET AUTO_CREATE_STATISTICS ON
    GO
    ALTER DATABASE [$(DBNAME)] SET AUTO_SHRINK OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET AUTO_UPDATE_STATISTICS ON
    GO
    ALTER DATABASE [$(DBNAME)] SET CURSOR_CLOSE_ON_COMMIT OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET CURSOR_DEFAULT  GLOBAL
    GO
    ALTER DATABASE [$(DBNAME)] SET CONCAT_NULL_YIELDS_NULL OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET NUMERIC_ROUNDABORT OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET QUOTED_IDENTIFIER OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET RECURSIVE_TRIGGERS OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET  DISABLE_BROKER
    GO
    ALTER DATABASE [$(DBNAME)] SET AUTO_UPDATE_STATISTICS_ASYNC OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET DATE_CORRELATION_OPTIMIZATION OFF
    GO
    ALTER DATABASE [$(DBNAME)] SET PARAMETERIZATION SIMPLE
    GO
    ALTER DATABASE [$(DBNAME)] SET  READ_WRITE
    GO
    ALTER DATABASE [$(DBNAME)] SET RECOVERY $(RECOVERYMODE)
    GO
    ALTER DATABASE [$(DBNAME)] SET  MULTI_USER
    GO
    ALTER DATABASE [$(DBNAME)] SET PAGE_VERIFY CHECKSUM 
    GO
    USE [$(DBNAME)]
    GO
    IF NOT EXISTS (SELECT name FROM sys.filegroups WHERE is_default=1 AND name = N'PRIMARY') ALTER DATABASE [$(DBNAME)] MODIFY FILEGROUP [PRIMARY] DEFAULT
    GO

    /* Add SharePoint account as the database owner */
    USE [$(DBNAME)]
    GO
    CREATE USER [$(SETUPUSER)] FOR LOGIN [$(SETUPUSER)]
    GO
    USE [$(DBNAME)]
    GO
    EXEC sp_addrolemember N'db_owner', N'$(SETUPUSER)'
    GO

    PowerShell Script

    Finally, we need the PowerShell script. This walks through each line in the CSV file, reads the columns, and uses the Invoke-Sqlcmd cmdlet to run the .sql file - replacing the variables with the CSV column data:

    #Import the CSV file - each row describes one database to create
    $csvData = @(Import-Csv -Path C:\Scripts\SQL\Databases.csv)
    foreach ($line in $csvData)
    {
        #Build a "COLUMNNAME=value" array from every column in the row
        $parameters = @()
        $line | Get-Member -MemberType NoteProperty | ForEach-Object {
            $parameters += ($_.Name + "=" + $line.($_.Name))
        }
        #Run the .sql file, substituting each $(COLUMNNAME) variable with the CSV value
        Invoke-Sqlcmd -InputFile "C:\Scripts\Create SharePoint Database.sql" -ServerInstance "SQLINSTANCE" -Variable $parameters
    }

    If you are using this method to pre-create SharePoint databases, don’t forget to apply the other recommendations listed in the Deploy by using DBA-created databases article.

    You can also use the method of replacing variables shown in the PowerShell script above to manage any repeatable action in SQL – not just database creation. For example, you could keep a .sql file handy that adds an AD account as a SQL login and then use the Invoke-Sqlcmd command to add the SharePoint install account to SQL before creating any databases:

    SQL File:

    USE [master]
    GO
    CREATE LOGIN [$(USER)] FROM WINDOWS WITH DEFAULT_DATABASE=[master], DEFAULT_LANGUAGE=[British]
    GO
    EXEC master..sp_addsrvrolemember @loginame = N'$(USER)', @rolename = N'dbcreator'
    GO
    EXEC master..sp_addsrvrolemember @loginame = N'$(USER)', @rolename = N'securityadmin'
    GO

    PowerShell script:

    Invoke-Sqlcmd -InputFile "C:\Scripts\Add Account to Logins.sql" -ServerInstance "SQLINSTANCE" -Variable "USER=DOMAIN\SP_Admin"

    Monday, 10 October 2011

    Windows PowerShell Command Builder for SharePoint 2010 and Office 365

    A very handy Silverlight application has been released on Microsoft TechNet to provide a visual interface for assembling PowerShell commands to be used in SharePoint 2010 products and Office 365.

    It uses a drag-and-drop interface to add the verbs and nouns of each cmdlet, and will automatically filter available options depending on which are selected. Once the cmdlet has been chosen, it will display the required and optional parameters associated with it so that you can complete the command.  The example below shows how it can be used to design a Move-SPSite command.

    image

    Using the drop-down options available at the top left corner of the interface, cmdlets can be filtered by product (SharePoint Foundation 2010, SharePoint Server 2010 or Office 365), and there are also a few scenarios that can be chosen in the “I need to” drop-down – e.g., “Add a user to a domain group”.

    You can access the command builder here: http://www.microsoft.com/resources/TechNet/en-us/Office/media/WindowsPowerShell/WindowsPowerShellCommandBuilder.html. For more information from Microsoft on PowerShell and SharePoint in general, check out the Resource Center here: http://technet.microsoft.com/en-us/sharepoint/ff603532.aspx.

    Tuesday, 30 August 2011

    Creating bulk Active Directory user and group accounts in PowerShell using Server 2008 R2 cmdlets

    This one is a departure from my normal SharePoint-related articles. Back in the day, I maintained a few batch and VBS scripts that created multiple Active Directory users and groups from CSV files. These had obvious uses for creating test accounts on a development environment, but I also used them from time to time on customer production environments when it came to provisioning new user accounts en masse.

    One of the new features introduced in Windows Server 2008 R2 was the inclusion of 76 cmdlets delivering extensive Active Directory management capabilities using PowerShell. The challenge was to upgrade my old batch files to PowerShell versions using these new cmdlets, building in the following features:

  • Create bulk user accounts from a CSV file, including specifying attributes and account options (e.g., change password at next logon)
  • Create bulk group accounts from a CSV file, specifying a description and group type (e.g., global, security)
  • Assign multiple users to groups as members using a CSV file
  • Exception handling so that the script would check first to see if the account or group member existed before attempting to make the change

    To achieve this, I have created three PowerShell functions – one to create users, one to create groups, and a final one to read the CSV file, kick off one of the other two functions depending on the type of account to be created, and finally add users as group members. Once these functions are loaded in a PowerShell console, it only requires a single command to create either users or groups – along with an appropriate CSV file, of course.

    Rather than go through each stage of these functions in this article, I have instead annotated the script at various points so that you can see what is happening. To get the functions ready for use, save them into a PS1 file using Notepad or your favourite script editor and load it from a PowerShell console on a Windows Server 2008 R2 domain controller.
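    Assuming the functions below were saved as C:\Scripts\CreateADAccounts.ps1 (an illustrative path and file name), they can be loaded into the current session by dot-sourcing the file:

    ```powershell
    #Dot-source the script so its functions become available in this session
    . C:\Scripts\CreateADAccounts.ps1
    ```

    Note the space after the leading dot – running the script without it executes the functions in a child scope and they will not be available afterwards.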

    #Import the PowerShell module containing AD cmdlets
    Import-Module ActiveDirectory
       
    #Read a CSV file with the user or group details and create an account in AD for each entry
    Function Create-ADAccountsFromCSV {
        Param (
            [parameter(Mandatory=$true)][string]$CSVPath,
            [parameter(Mandatory=$true)][string]$Type,
            [parameter(Mandatory=$true)][string]$OrgUnit
            )
     
        if (($Type -ne "Group") -and ($Type -ne "User"))
        {
            Throw New-Object System.ArgumentException("Type parameter must be specified as either 'User' or 'Group'.")
        }
     
        #Read the CSV file
        $csvData = Import-CSV $CSVPath
        foreach ($line in $csvData) {
           
            #Create a hash table of the account details
            $accountTable = @{
                'givenName'=$line.FirstName
                'sn'= $line.LastName
                'displayName'= $line.DisplayName
                'sAMAccountName'= $line.sAMAccountName
                'password' = $line.Password
                'description' = $line.Description
                'ou' = $OrgUnit 
            }
                   
            if ($Type -eq "User")
            {
                #Call the function to create a user account
                CreateUser -AccountInfo $accountTable
            }
     
            if ($Type -eq "Group")
            {
                #Call the function to create a group account
                CreateGroup -AccountInfo $accountTable
               
                #Get new group
                $groupFilterString = "samAccountName -like `"" + $line.sAMAccountName + "`""
                $group = Get-ADGroup -Filter $groupFilterString
               
                #Walk through each member column associated with this group
                $memberColumnNumber = 1
                $memberColumn = "Member" + $memberColumnNumber
               
                #While a member column still exists, add the value to a group
                while ($line.$memberColumn)
                {
                    #Check if user is already a member of the group
                    $member = Get-ADGroupMember $group | where { $_.samAccountName -eq $line.$memberColumn }
                   
                    #If not already a member, add user to the group
                    if ($member -eq $null)
                    {
                        write-host "Adding" $line.$memberColumn "as a member to group" $group.Name
                        try
                        {
                            $userFilterString = "samAccountName -like `"" + $line.$memberColumn + "`""
                            $user = Get-ADUser -Filter $userFilterString
                            Add-ADGroupMember -Identity $group -Members $user
                        }
                        catch
                        {
                            write-host "There was a problem adding" $line.$memberColumn "as a member to group" $group.Name "-" $_ -ForegroundColor red
                        }
                    }
                    else
                    {
                        write-host "User" $line.$memberColumn "not added to group" $group.Name "as it is already a member" -ForegroundColor blue
                    }
                   
                    $memberColumnNumber = $memberColumnNumber + 1
                    $memberColumn = "Member" + $memberColumnNumber
                }
            }
        }
    }        

    #Create an Active Directory user
    Function CreateUser {
      Param($AccountInfo)
     
        try
        {
            #Check to see if the user already exists
            $userFilterString = "samAccountName -like `"" + $AccountInfo['sAMAccountName'] + "`""
            $user = Get-ADUser -Filter $userFilterString
           
            #If user not already created, create them
            if ($user -eq $null)
            {
                write-host "Creating user account:" $AccountInfo['sAMAccountName']
               
                #Create the user account object
                New-ADUser -SamAccountName $AccountInfo['sAMAccountName'] `
                           -Name $AccountInfo['displayName'] `
                           -DisplayName $AccountInfo['displayName'] `
                           -GivenName $AccountInfo['givenName'] `
                           -Surname $AccountInfo['sn'] `
                           -Path $AccountInfo['ou'] `
                           -ChangePasswordAtLogon $true `
                           -AccountPassword (ConvertTo-SecureString $AccountInfo['password'] -AsPlainText -Force) `
                           -Description $AccountInfo['description'] `
                           -Enabled $false
           
                #Set 'User must change password at next logon' to true after user has been created
                #For some reason, the option wasn't set during New-ADUser - could be a bug?
                $user = Get-ADUser -Filter $userFilterString
                Set-ADUser $user -ChangePasswordAtLogon $true          
            }
            else
            {
                write-host "User" $AccountInfo['sAMAccountName'] "not created as it already exists" -ForegroundColor blue
            }
        }
        catch
        {
            write-host "There was a problem creating the user" $AccountInfo['sAMAccountName'] "-" $_ -ForegroundColor red
        }
    }

    #Create an Active Directory group
    Function CreateGroup {
        Param($AccountInfo)
     
        try
        {
            #Check to see if the group already exists
            $groupFilterString = "samAccountName -like `"" + $AccountInfo['sAMAccountName'] + "`""
            $group = Get-ADGroup -Filter $groupFilterString
           
            if ($group -eq $null)
            {  
                write-host "Creating group account:" $AccountInfo['sAMAccountName']
               
                #Create the group account object
                New-ADGroup -SamAccountName $AccountInfo['sAMAccountName'] `
                            -Name $AccountInfo['sAMAccountName'] `
                            -Path $AccountInfo['ou'] `
                            -GroupScope Global `
                            -GroupCategory Security
            }
            else
            {
                write-host "Group" $AccountInfo['sAMAccountName'] "not created as it already exists" -ForegroundColor blue
            }
        }
        catch
        {
            write-host "There was a problem creating the group" $AccountInfo['sAMAccountName'] "-" $_ -ForegroundColor red
        }  
    }

    Creating users

    Once the functions have been loaded in a PowerShell console, you are ready to start creating user accounts from a CSV file. The header of the CSV file for this example should be as follows:

    sAMAccountName,FirstName,LastName,DisplayName,Description,Password

    Something to note is that I have used a very limited number of attributes when creating a user account – for example, I haven’t specified an Office, Department, Phone Number, Address, etc. You can extend the script to include these by adding extra columns to the CSV file and adapting the CreateUser function in the script, using the appropriate parameters for the New-ADUser cmdlet, as specified on TechNet here.
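    As a sketch of what that could look like, if you added Department and Office columns to the CSV file (and matching 'department' and 'office' entries to the hash table built in Create-ADAccountsFromCSV), the New-ADUser call might be extended along these lines:

    ```powershell
    #Sketch: passing extra attributes through to New-ADUser
    #Assumes 'department' and 'office' keys were added to the $AccountInfo hash table
    New-ADUser -SamAccountName $AccountInfo['sAMAccountName'] `
               -Name $AccountInfo['displayName'] `
               -DisplayName $AccountInfo['displayName'] `
               -GivenName $AccountInfo['givenName'] `
               -Surname $AccountInfo['sn'] `
               -Path $AccountInfo['ou'] `
               -Department $AccountInfo['department'] `
               -Office $AccountInfo['office'] `
               -AccountPassword (ConvertTo-SecureString $AccountInfo['password'] -AsPlainText -Force) `
               -Description $AccountInfo['description'] `
               -Enabled $false
    ```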

    Once you have these columns populated with user accounts, save the CSV file to a local or network path. For this example, I am going to copy the file into C:\Scripts and call it UserAccounts.csv.

    You now need to decide which Organisational Unit (OU) in Active Directory to store the users in. I could have added this information to an extra column in the CSV file and made a slight modification to the script, but I decided to specify a single OU in the PowerShell command instead. You can always move them to a different OU afterwards using the Active Directory Users and Computers console, if required.
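    If you would rather do that move from PowerShell later, the Move-ADObject cmdlet from the same module will handle it; a quick example (the account name and target OU are illustrative):

    ```powershell
    #Move an existing user account into a different OU
    Get-ADUser -Filter 'samAccountName -eq "jsmith"' |
        Move-ADObject -TargetPath "OU=Contractors,DC=acme,DC=local"
    ```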

    To create my users in the OU “Staff” for the domain “acme.local”, I would need to use the following command:

    Create-ADAccountsFromCSV -CSVPath "C:\Scripts\UserAccounts.csv" -OrgUnit "OU=Staff,DC=acme,DC=local" -Type "User"

    The script will output operation progress to the console, including a warning when it discovers that an account already exists and will not attempt to create it.

    image

    Creating groups and assigning members

    The CSV file for adding groups should have the following header:

    sAMAccountName,Member1,Member2,Member3,Member4,Member5

    The first column should contain the group name, followed by a column for each group member. Note that although I have only included Member 1 to 5 in the example above, you can have as many members as you like by creating extra columns in the header for Member6, Member7, etc.

    As with adding users, there are also extra parameters available for the New-ADGroup cmdlet, as specified on TechNet here.

    Once you have the CSV file formatted with the correct groups and their members, save it to a local or network path for use with the script. For this example, I am going to copy the file into C:\Scripts and call it GroupAccounts.csv. To create my groups in the OU “Groups” for the domain “acme.local”, I would need to use the following command:

    Create-ADAccountsFromCSV -CSVPath "C:\Scripts\GroupAccounts.csv" -OrgUnit "OU=Groups,DC=acme,DC=local" -Type "Group"

    As with creating users, the script will output progress to the PowerShell console, including warnings when either a group already exists or a user is already a member of the group - in both cases it will ignore these operations and move on without making a change.

    image

    Wednesday, 24 August 2011

    Creating a retention policy to start a workflow in SharePoint 2010 using PowerShell

    The script in this article was inspired by this one from Yaroslav Pentsarskyy, where he uses PowerShell to create a retention policy in SharePoint 2010. This works well for the built-in expiration actions (see http://msdn.microsoft.com/en-us/library/microsoft.office.recordsmanagement.policyfeatures.expiration_members.aspx), but I thought it was worth publishing a revised script for creating a retention policy to start a workflow as it requires a few extra properties in the policy XML.

    The scenario for this script was that I wanted to create a retention policy on a specific content type associated with the Pages library on a number of sites. The workflow itself was associated directly to the Pages library and not to a site content type. The retention policy must start the workflow for each page, 20 days after it is created in the Pages library.

    The workflow was created in SharePoint Designer and called “Review Page” - all it does is set a Status column to Expired. I then assigned the workflow to the “Content Page” content type on the Pages library of a site (see this article for details on how to do this in PowerShell) with the association name “Review Page Test”, as shown below:

    Workflow to content type

    We can now use PowerShell to create a retention policy to start this workflow for any page associated with the “Content Page” content type 20 days after it is created in the Pages library. Before the script can be run, you will need to load the following function in a PowerShell console with the SharePoint add-ins loaded:

    function Update-IMPStartWorkflow($siteURL, $wfAssociationName, $listName, $contentType, $WfStartColumn)
    {

        #Get web object
        $web = Get-SPWeb $siteURL
        write-host "Examining site:"$web.Title"at"$web.ServerRelativeUrl -ForegroundColor green
           
        try
        {
            #Do the following if a list exists with the name specified by the user - e.g., Pages
            if ($web.Lists[$ListName]) {

                write-host $web.Title"has a list called"$ListName -ForegroundColor green
               
                #Get the list
                $list = $web.Lists[$ListName]
               
                #Create list policy if one does not exist already
                #$policy = [Microsoft.Office.RecordsManagement.InformationPolicy.ListPolicySettings]($list)
                #if (!$policy.ListHasPolicy)
                #{
                #    $policy.UseListPolicy = $true
                #    $policy.Update()
                #}
               
                #Get the content type
                $ct = $list.ContentTypes[$contentType]
               
                #Get the Workflow template (parent association) ID from the association name
                $wfAssociationId = $ct.WorkflowAssociations.GetAssociationByName($wfAssociationName, 1033).ParentAssociationId.ToString()
                #Get the start column internal name and Id
                $fieldId = $list.Fields[$WfStartColumn].Id.ToString()
                $fieldName = $list.Fields[$WfStartColumn].InternalName
               
                #Create a new policy
                [Microsoft.Office.RecordsManagement.InformationPolicy.Policy]::CreatePolicy($ct, $null)
                $newPolicy = [Microsoft.Office.RecordsManagement.InformationPolicy.Policy]::GetPolicy($ct)    
               
                #Generate policy XML using the values required,
                #injecting column and workflow details from the variables assigned above
                $newPolicyFeatureXml = "<Schedules nextStageId=`"2`">"+
                "<Schedule type=`"Default`">"+
                "<stages>"+
                "<data stageId=`"1`">"+
                "<formula id=`"Microsoft.Office.RecordsManagement.PolicyFeatures.Expiration.Formula.BuiltIn`">"+
                "<number>20</number>"+
                "<property>"+$fieldName+"</property>"+
                "<propertyId>"+$fieldId+"</propertyId>"+
                "<period>days</period>"+
                "</formula>"+
                "<action type=`"workflow`" id=`""+$wfAssociationId+"`" />"+
                "</data>"+
                "</stages>"+
                "</Schedule></Schedules>"
               
                #Add retention policy      
                $newPolicy.Items.Add("Microsoft.Office.RecordsManagement.PolicyFeatures.Expiration", $newPolicyFeatureXml)
               
                write-host "Added retention policy to start workflow"$wfAssociationName"on content type"$ct.Name"for list"$list.Title"in site"$web.Title
            }
            else
            {
                #Report if the site does not have the list specified by the user
                write-host $web.Title"does not have a list called"$listName -ForegroundColor red
            }
        }
        catch
        {
            write-host "There has been an error:"$_ -ForegroundColor red
        }
        finally
        {
            #Dispose of the Web object
            $web.Dispose()
        }
    }
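    If you are loading this function in a plain PowerShell console rather than the SharePoint 2010 Management Shell, the SharePoint snap-in can be registered first:

    ```powershell
    #Register the SharePoint 2010 cmdlets if they are not already loaded
    if ((Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null)
    {
        Add-PSSnapin Microsoft.SharePoint.PowerShell
    }
    ```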

    Note that the important part of this script is where you generate the XML for the policy and assign it to the $newPolicyFeatureXml variable. This XML defines the properties required for setting the event, action and recurrence details.

    If you need to set a recurrence on the policy, then you can add some extra properties to the "<data stageId=`"1`">"+ line. For example, to set a recurrence period of every 1 day, replace this line with the following XML:

    "<data stageId=`"1`" recur=`"true`" offset=`"1`" unit=`"days`">"+

    Once the function has been loaded, it can be called with the following command to create a retention policy for the “Content Page” content type, on the Pages library of the site http://portal, to start the workflow with the association name “Review Page Test”:

    Update-IMPStartWorkflow -siteURL http://portal -wfAssociationName "Review Page Test" -listName "Pages" -contentType "Content Page" -WfStartColumn "Created"

    The Information Management Policies administration page on the Pages library for the “Content Page” content type should now look as follows:

    Retention Policy

    Another thing to note is how the workflow association ID is injected into the XML. An early draft of my script used the following line to get the workflow association ID:

    $wfAssociationId = $ct.WorkflowAssociations.GetAssociationByName($wfAssociationName, 1033).Id.ToString()

    However, running the script produced an “Invalid retention stage defined” error message in the Information Management Policies administration page against the retention policy:

    clip_image002

    To resolve this I needed to use the Parent Association ID (the workflow template) rather than the Association ID (the workflow instance), as follows:

    $wfAssociationId = $ct.WorkflowAssociations.GetAssociationByName($wfAssociationName, 1033).ParentAssociationId.ToString()

    I have added this information to the article in case you run into the same error message either in PowerShell or C# code and are looking for a solution…

    Saturday, 13 August 2011

    SharePoint PowerShell Timer Jobs solution by Ingo Karstein

    I was just thinking wouldn’t it be great if someone developed a way of running PowerShell scripts from timer jobs in SharePoint 2010 when a quick search revealed that someone already has! That person is Ingo Karstein, who has published his work on Codeplex for everyone to use.

    You can get full details from his blog at http://ikarstein.wordpress.com/2011/08/04/sharepoint-powershell-timer-jobs-run-powershell-scripts-in-sharepoint-timer-service-context/ and check out the solution itself from http://sppowershelltimerjob.codeplex.com/.

    It’s only in beta at the moment, but it already looks like a very professional piece of work, with script editing in the browser, history list, run enable/disable, option to execute on a specific server, and full integration into the Central Administration UI.

    Nice work Ingo!

    Friday, 12 August 2011

    Diagnose MissingWebPart and MissingAssembly issues from the SharePoint Health Analyzer using PowerShell

    After posting articles on diagnosing MissingFeature and MissingSetupFile errors in the SharePoint 2010 Health Analyzer and Test-SPContentDatabase operation, I have become slightly obsessed with doing the same for other errors, too. I have also had my fair share of them on migration projects recently, so I can’t be the only one experiencing them at the moment.

    In this article I am going to focus on MissingWebPart and MissingAssembly errors. As stated in previous articles, there is no silver bullet to solving these errors in all cases, but the scripts offered here will allow you to troubleshoot the errors further and find exactly where they occur in the content database. Once you know this, you have a fighting chance of being able to solve the problem.

    MissingWebPart Error

    In this example, I have received the following error whilst running a Test-SPContentDatabase operation after a content database migration from SharePoint 2007 to 2010. It also appears in the SharePoint Health Analyzer under the “Configuration” category with the title “Missing server side dependencies”:

    Category        : MissingWebPart
    Error           : True
    UpgradeBlocking : False
    Message         : WebPart class [4575ceaf-0d5e-4174-a3a1-1a623faa919a] is referenced [2] times in the database [SP2010_Content], but is not installed on the current farm. Please install any feature/solution which contains this web part.
    Remedy          : One or more web parts are referenced in the database [SP2010_Content], but are not installed on the current farm. Please install any feature or solution which contains these web  parts.

    As you can see, the error gives you a “WebPart class” GUID, the name of the content database, and how many times it is referenced in the database, but little else. What we need to find out here is either the name of the web part or on which pages it is referenced in the database.

    For this I am going to reuse the Run-SQLQuery PowerShell script that I introduced in my article on MissingSetupFile errors:

    #Run a SQL query against a database using integrated security and return the first result table
    function Run-SQLQuery ($SqlServer, $SqlDatabase, $SqlQuery)
    {
        $SqlConnection = New-Object System.Data.SqlClient.SqlConnection
        $SqlConnection.ConnectionString = "Server =" + $SqlServer + "; Database =" + $SqlDatabase + "; Integrated Security = True"
        $SqlCmd = New-Object System.Data.SqlClient.SqlCommand
        $SqlCmd.CommandText = $SqlQuery
        $SqlCmd.Connection = $SqlConnection
        $SqlAdapter = New-Object System.Data.SqlClient.SqlDataAdapter
        $SqlAdapter.SelectCommand = $SqlCmd
        $DataSet = New-Object System.Data.DataSet
        #Cast to void so the row count returned by Fill is not written to the pipeline
        [void]$SqlAdapter.Fill($DataSet)
        $SqlConnection.Close()
        $DataSet.Tables[0]
    }

    Once you have loaded the function in a PowerShell console, you can run it by using the Run-SQLQuery command with the options relevant to your deployment. For MissingWebPart errors, you need to run a SQL SELECT query on the “AllDocs” table in the content database exhibiting the problem, joining to the “AllWebParts” table in order to find details about the missing web part. For example, you would type the following command to find details of the web part with the class ID “4575ceaf-0d5e-4174-a3a1-1a623faa919a”, as reported in the error above:

    Run-SQLQuery -SqlServer "SQLSERVER" -SqlDatabase "SP2010_Content" -SqlQuery "SELECT * from AllDocs inner join AllWebParts on AllDocs.Id = AllWebParts.tp_PageUrlID where AllWebParts.tp_WebPartTypeID = '4575ceaf-0d5e-4174-a3a1-1a623faa919a'" | select Id, SiteId, DirName, LeafName, WebId, ListId, tp_ZoneID, tp_DisplayName | Format-List

    Yes, it is a pretty long command, but it will produce a very useful output, as shown in this example:

    Id             : 6ab5e70b-60d8-4ddf-93cb-6a93fbc410be
    SiteId         : 337c5721-5050-46ce-b112-083ac52f7f26
    DirName        : News/Pages
    LeafName       : ArticleList.aspx
    WebId          : dcc93f3e-437a-4fae-acea-bb15d5c4ea7d
    ListId         : 7e13fe6c-3670-4d46-9601-832e3eb6a1e4
    tp_ZoneID      : Body
    tp_DisplayName :

    Id             : b3fcfcd2-2f02-4fe9-93e4-9c9b5ecddf5b
    SiteId         : 337c5721-5050-46ce-b112-083ac52f7f26
    DirName        : Pages
    LeafName       : Welcome.aspx
    WebId          : 2ae0de59-a008-4244-aa66-d8f76c79f1ad
    ListId         : d8f083f0-16b9-43d0-9aaf-4e9fffecd6cc
    tp_ZoneID      : RightColumnZone
    tp_DisplayName :

    This tells us that the web part has been found on two pages (the references mentioned in the MissingWebPart error). SiteId tells us the site collection and WebId the site where the pages are located. We also have a DirName showing the relative path and the page name itself against the LeafName property. If you’re lucky, you might get the display name of the web part against the tp_DisplayName property, but if not, you should at least be able to tell which zone the web part has been added to by looking at the tp_ZoneID property.

    By far the best way to resolve these issues is to do as the error suggests and install the missing feature or solution containing the web part. If that is not possible or feasible in your scenario, you can discover the site collection URL from the GUIDs using PowerShell and then remove the offending web parts from the pages identified.

    To find the site collection URL using the information output from the query, type the following command:

    $site = Get-SPSite -Limit all | where {$_.Id -eq "337c5721-5050-46ce-b112-083ac52f7f26"}
    $site.Url

    Once you have the site collection URL, you can use the relative path specified by the DirName property to find the location of the file. To remove the web part from the page, type the page URL in the browser and add ?contents=1 to the end of it. For example, to open the web part maintenance page for the ArticleList.aspx page specified in the output, type the following URL in the browser:

    http://portal/news/pages/articlelist.aspx?contents=1
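    If you prefer, you can assemble the maintenance page URL in PowerShell instead of by hand. This is a sketch using the SiteId, DirName and LeafName values from the example query output above; substitute your own values:

    # Resolve the site collection from the SiteId reported by the query
    $site = Get-SPSite -Limit All | Where-Object {$_.Id -eq "337c5721-5050-46ce-b112-083ac52f7f26"}

    # DirName and LeafName from the query output (example values)
    $dirName = "News/Pages"
    $leafName = "ArticleList.aspx"

    # Build the web part maintenance page URL
    "$($site.Url)/$dirName/$leafName?contents=1"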

    You can then highlight the offending web part (normally called ErrorWebPart for MissingWebPart errors) by ticking the box and clicking Delete. The screenshot below shows a web part maintenance page to give you an idea of the UI, but not an example of an ErrorWebPart as I had already removed them!

    image

    Note: If you remove an ErrorWebPart from a publishing page with versioning switched on, you may have to delete all earlier versions of the page before the error disappears from the SharePoint Health Analyzer or Test-SPContentDatabase report. This is because the web part will still be referenced from these versions, even though you removed it from the currently published page.
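    If you need to clear out those earlier versions, a sketch like the following can help. It uses the SPFile.Versions collection, whose DeleteAll method removes all previous versions while leaving the current one intact; the web URL and page path shown are examples from this article, so adjust them for your environment:

    # Open the web containing the page (example URL from this article)
    $web = Get-SPWeb "http://portal/news"
    $file = $web.GetFile("Pages/ArticleList.aspx")

    # Delete all previous versions; the currently published version remains
    $file.Versions.DeleteAll()
    $web.Dispose()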

    MissingAssembly Error

    MissingAssembly errors look similar to this one:

    Category        : MissingAssembly
    Error           : True
    UpgradeBlocking : False
    Message         : Assembly [PAC.SharePoint.Tagging, Version=1.0.0.0, Culture=neutral, PublicKeyToken=b504d4b6c1e1a6e5] is referenced in the database [SP2010_Content], but is not installed on the current farm. Please install any feature/solution which contains this assembly.
    Remedy          : One or more assemblies are referenced in the database [SP2010_Content], but are not installed on the current farm. Please install any feature or solution which contains these assemblies.

    I normally find that MissingAssembly errors are the result of an event receiver that is still registered on a list or library but belongs to a feature or solution no longer present on the farm.

    In most cases, you may be able to look at the assembly name reported in this error and know what the problem is straight away. As before, the best way of resolving this is to reinstall the missing solution file. However, if you are not able to install the solution (e.g., maybe it only works in SharePoint 2007 and not 2010), then you may want to find the lists where the event receiver is installed and either remove the event receiver from the lists or delete the lists themselves.

    To troubleshoot this issue we can re-use the Run-SQLQuery function used to help find missing web parts above. The table we need to look at this time though is called “EventReceivers”. For example, you would type the following command to find details of the assembly called “PAC.SharePoint.Tagging, Version=1.0.0.0, Culture=neutral, PublicKeyToken=b504d4b6c1e1a6e5”, as reported in the error above:

    Run-SQLQuery -SqlServer "SQLSERVER" -SqlDatabase "SP2010_Content" -SqlQuery "SELECT * from EventReceivers where Assembly = 'PAC.SharePoint.Tagging, Version=1.0.0.0, Culture=neutral, PublicKeyToken=b504d4b6c1e1a6e5'" | select Id, Name, SiteId, WebId, HostId, HostType | Format-List

    This will produce an output similar to the following:

    Id       : 657a472f-e51d-428c-ab98-502358d87612
    Name     :
    SiteId   : 337c5721-5050-46ce-b112-083ac52f7f26
    WebId    : 2ae0de59-a008-4244-aa66-d8f76c79f1ad
    HostId   : 09308020-45a8-41e4-bbc0-7c8d8cd54132
    HostType : 2

    Id       : 0f660612-6be0-401e-aa1d-0ede7a9af8da
    Name     :
    SiteId   : 337c5721-5050-46ce-b112-083ac52f7f26
    WebId    : 2ae0de59-a008-4244-aa66-d8f76c79f1ad
    HostId   : 09308020-45a8-41e4-bbc0-7c8d8cd54132
    HostType : 2

    As with the MissingWebPart error before, we can use these GUIDs to get the site collection and site hosting the list with the missing event receiver, as follows:

    $site = Get-SPSite -Limit all | where {$_.Id -eq "337c5721-5050-46ce-b112-083ac52f7f26"}
    $web = $site | Get-SPWeb -Limit all | where {$_.Id -eq "2ae0de59-a008-4244-aa66-d8f76c79f1ad"}
    $web.Url

    The HostId property is the GUID of the object containing the event receiver. The HostType is the object type – in this case, HostType “2” means the event receiver host is a list. You can look at the other host types by checking this article on MSDN: http://msdn.microsoft.com/en-us/library/ee394866(v=prot.13).aspx.
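    As a quick reference, the commonly documented HostType values can be kept in a lookup table like this (values as per the protocol documentation linked above; verify against that reference for your version):

    # HostType values from the EventReceivers table documentation
    $hostTypes = @{ 0 = "Site collection"; 1 = "Web"; 2 = "List"; 3 = "List item" }

    # The query output above returned HostType 2
    $hostTypes[2]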

    Now that we know the GUID refers to a list, we can retrieve it using PowerShell with this command:

    $list = $web.Lists | where {$_.Id -eq "09308020-45a8-41e4-bbc0-7c8d8cd54132"}

    To remove the list completely, type the following command:

    $list.Delete()

    To keep the list intact and just remove the offending event receiver, copy the Id property from the Run-SQLQuery output into this command:

    $er = $list.EventReceivers | where {$_.Id -eq "657a472f-e51d-428c-ab98-502358d87612"}
    $er.Delete()

    If you do decide to delete the list completely, remember to also remove it from the site Recycle Bin and the site collection Recycle Bin so that it is fully removed from the content database. Otherwise, the error may not disappear from the Health Analyzer or Test-SPContentDatabase report.
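    You can purge both recycle bins from PowerShell as well. Note that DeleteAll empties the entire recycle bin, not just the deleted list, so use this sketch with care; it assumes $site and $web are still set from the earlier commands:

    # Purge the first-stage (site) recycle bin
    $web.RecycleBin.DeleteAll()

    # Purge the second-stage (site collection) recycle bin
    $site.RecycleBin.DeleteAll()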

    If you found this article looking for information on how to diagnose MissingFeature issues in the SharePoint Health Analyzer, rather than MissingWebPart or MissingAssembly issues, then have a look at this article for help. For information on troubleshooting MissingSetupFile errors, take a look at this article.