Another Powershell script that helps with a migration. In this instance the requirement was for a way of understanding the accounts that had previously logged onto desktop systems and the identity of the last logged-on user, for either all desktop systems in a Domain\OU structure or all desktop systems listed in a text file. Additionally, in this instance, there was a requirement to add a delay between each iteration of the data gathering to minimise the impact on the system running the script and on the network.
The challenge was that this information is stored differently in the registry on XP and on Vista\Win7, and the Profile information is accessible on Vista\Win7 systems using Get-Profiles, but needs some registry scripting jiggery-pokery for XP\Win2k3 systems. Additionally, I could not find any script out there that did both these things, so I decided to use functions from other people's work and stitch them together to get what is attached.
To understand what can be done with the script, read the header. If the script is run without parameters it will just get Profile information for all desktop systems in the currently logged-on domain. The output is stored in 3 csv files, suffixed with Date\Time stamps, in the same location that the script is run from.
The syntax for running the script is as follows; none of the parameters are required.
.\Get-PCUserProfilePath.ps1 -srcDomainPath <SearchPathdnOrDNSName> -IterationDelay <TimeInSeconds> -ComputerType <ServerOrDesktop> -ComputerListFile <sAMAccountNameTextList>
I have not included a listing of the script here because it is too large, but the script can be downloaded at the bottom of this posting.
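To give a flavour of the registry differences described above, here is a minimal, hypothetical sketch (not part of the attached script; the function name is made up for illustration). It assumes the Remote Registry service is running on the target and that you have administrative rights there.

```powershell
# Hypothetical helper showing where the last logged-on user is recorded.
# Assumes Remote Registry is available on the target computer.
function Get-LastLoggedOnUser ($Computer) {
    $hklm = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey(
        [Microsoft.Win32.RegistryHive]::LocalMachine, $Computer)
    # Vista\Win7 record the last interactive logon under LogonUI
    $logonUI = $hklm.OpenSubKey('SOFTWARE\Microsoft\Windows\CurrentVersion\Authentication\LogonUI')
    if ($logonUI) {
        $user = $logonUI.GetValue('LastLoggedOnUser')
        if ($user) { return $user }
    }
    # XP\Win2k3 fall back to the Winlogon DefaultUserName value
    $winlogon = $hklm.OpenSubKey('SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon')
    if ($winlogon) {
        return ($winlogon.GetValue('DefaultDomainName') + '\' + $winlogon.GetValue('DefaultUserName'))
    }
}
```

The full script in the download does rather more than this (profile enumeration, OU searching, the iteration delay), but this is the core of the XP versus Vista\Win7 registry difference.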
HTH
Who can guess I am working on an AD migration project at the moment? And it presents an opportunity to cut my teeth further on Powershell scripting. Yes the AD Powershell scripts I have produced so far could have been done much easier if I had used the AD module provided with AD in Windows Server 2008 R2, but unfortunately not everyone is in a situation where they can take advantage of that module or even the Quest Modules that are also available. Hence some of the long winded scripts I have been producing.
Back to this problemette I was presented with. Following a user migration process\procedure, the AD support team had discovered that a number of users were not able to access some resources and, as one of the checks during remediation, wanted a quick way of comparing a user's Group Membership with that of the source account. The following script is what I came up with for them. Basically, it gets all the direct Group memberships for the source and destination objects (as specified at the command line) and runs a compare operation. The on-screen output is a table that displays the Group Memberships that differ between the objects, with an arrow indicator showing which side of the evaluation the Group Membership exists on. So, for example, if you ran the following command:
.\Compare-UserGroups.ps1 -srcDomain domain1.local -destDomain woodgrovebank.com -srcsAMAccountName carlh1 -destsAMAccountName carlh2
You would get 2 files that list each object's Group Membership and something like the table below, which shows that the entry for destsAMAccountName (carlh2) is a member of a group named "Password Policy Group", of which the entry for srcsAMAccountName (carlh1) is not a member, and vice versa for the other groups.
InputObject           SideIndicator
-----------           -------------
Password Policy Group =>
EventLogAccess        <=
DnsAdmins             <=
Backup Operators      <=
It is true this is just a "like for like" sAMAccountName comparison and not a true SID to SID\SIDHistory comparison, and it only compares direct membership, but the requirement was for a quick check to ensure nothing was awry in odd troubleshooting instances, and it suits needs where Groups have been migrated wholesale. If you need anything more funky that handles Group Nesting then I advise you pop over to http://www.rlmueller.net/freecode1.htm . Richard's scripts are awesome.
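The arrow semantics come straight from Compare-Object. This tiny standalone snippet (the group names are made up for illustration) reproduces the table above: => marks entries found only in the second (destination) list, <= entries found only in the first (source) list.

```powershell
# Two sample membership lists; names are illustrative only.
$srcGroups  = 'EventLogAccess', 'DnsAdmins', 'Backup Operators'
$destGroups = 'Password Policy Group'

# Entries common to both lists are suppressed; only differences are shown.
Compare-Object $srcGroups $destGroups -SyncWindow 100
```

A large -SyncWindow is used (as in the script) so that matching entries are found even when the two lists are in completely different orders.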
<##################################################################### Compare-UserGroups.ps1
Syntax: Compare-UserGroups.ps1 -srcDomain <SourceDomain> -destDomain <DestinationDomain> -srcsAMAccountName <UserNetbiosName> -destsAMAccountName <UserNetbiosName>
Example: Compare-UserGroups.ps1 -srcDomain domain1.local -destDomain woodgrovebank.com -srcsAMAccountName carlh -destsAMAccountName carlh
Purpose: Compares the DIRECT Group Membership of 2 user accounts. Be aware that it compares the Netbios names (sAMAccountName) of the groups and is only useful either within a domain\forest or after a migration of a user account between domains where the group names have not been changed as a consequence of the migration.
Params: As shown in syntax above or by typing the script name at the command prompt
Req: Windows 2003 SP2 or above, Powershell V2. run "set-executionpolicy remotesigned" in Powershell
http://blogs.technet.com/b/carlh
Author: Carl Harrison
This script is provided "AS IS" with no warranties, confers no rights and is not supported by the authors or authors employer. Use of this script sample is subject to the terms specified at http://www.microsoft.com/info/copyright.mspx.
Version: 1.0 - First cut
#####################################################################>
Param (
    [Parameter()][string]$srcDomain='',
    [Parameter()][String]$destDomain='',
    [Parameter()][String]$srcsAMAccountName='',
    [Parameter()][String]$destsAMAccountName=''
)
Function Compare-GroupsHelp () {
$helptext=@"
NAME: Compare-UserGroups.ps1

Compares the DIRECT Group Membership of 2 user accounts.
Be aware that it compares the Netbios names (sAMAccountName) of the groups
and is only useful either within a domain\forest or after a migration
of a user account between domains where the group names have not been changed
as a consequence of the migration.

PARAMETERS:
-srcDomain           Source Domain (Required)
-destDomain          Destination Domain (Required)
-srcsAMAccountName   Netbios name of the user account in the source domain (Required)
-destsAMAccountName  Netbios name of the user account in the destination domain (Required)

SYNTAX:
Compare-UserGroups.ps1 -srcDomain domain1.local -destDomain woodgrovebank.com -srcsAMAccountName carlh -destsAMAccountName carlh2

This compares the group memberships that carlh from domain1.local has in domain1.local
with the group memberships that carlh2 from woodgrovebank.com has in woodgrovebank.com.
"@
$helptext
exit
}
Function Get-LDAPUser ($UserName, $SourceDomain) {
    $domain1 = new-object DirectoryServices.DirectoryEntry("LDAP://$SourceDomain")
    $searcher = new-object DirectoryServices.DirectorySearcher($domain1)
    $searcher.filter = "(&(objectClass=user)(sAMAccountName=$UserName))"
    $searcher.findone().getDirectoryEntry()
    $domain1 = ""
}
if(!($srcDomain)) {"Source Domain Required";Compare-GroupsHelp}
if(!($destDomain)) {"Destination Domain Required";Compare-GroupsHelp}
if(!($srcsAMAccountName)) {"Netbios Name of Source Account Required";Compare-GroupsHelp}
if(!($destsAMAccountName)) {"Netbios Name of Destination Account Required";Compare-GroupsHelp}
$srcUserGroupsFile = '.\srcUserGroupsFile.txt'
$destUserGroupsFile = '.\destUserGroupsFile.txt'
Write-Host
$srcUser = get-ldapuser $srcsAMAccountName $srcDomain
Write-Host $srcUser.displayName "is a member of" $srcUser.memberOf.Count "groups in domain $srcDomain. The groups are:"
$srcUser.memberOf | ft
Write-Host
$destUser = get-ldapuser $destsAMAccountName $destDomain
Write-Host $destUser.displayName "is a member of" $destUser.memberOf.Count "groups in domain $destDomain. The groups are:"
$destUser.memberOf | ft
Write-Host
$srcUserGroups = @()
$srcGroupsDN = @()
$destUserGroups = @()
$destGroupsDN = @()
Foreach($Group in $srcUser.memberOf) {
    $GroupsAMAccountName = ([ADSI]"LDAP://$Group").sAMAccountName.value
    $srcUserGroups += "$GroupsAMAccountName"
    $srcGroupsDN += $Group.tostring()
}
Foreach($Group in $destUser.memberOf) {
    $GroupsAMAccountName = ([ADSI]"LDAP://$Group").sAMAccountName.value
    $destUserGroups += "$GroupsAMAccountName"
    $destGroupsDN += $Group.tostring()
}
$srcGroupsDN | Out-File $srcUserGroupsFile
$destGroupsDN | Out-File $destUserGroupsFile
Compare-Object $srcUserGroups $destUserGroups -SyncWindow 100
$destUser = ""
$srcUser = ""
Here's a weird one: you are migrating users from multiple domains to a single domain, and your migration process\procedure has determined that the primary domain of a user was not what it actually is, so the settings (Profile Path, Home Drive\Directory, and Script Path) for the new user are those of the wrong account. For various reasons you cannot, or do not want to, merge the user object's attributes. This was one of the queries posed to me recently by a customer.
The script below remedies this by taking as command line parameters the name and domain of one domain user and copying that user's profile settings (the Profile Path, Home Drive & Directory, and Script Path) to the relevant attributes of another user (in another domain), also specified at the command line. I even included the ability to prefix the ScriptPath attribute (in case you use a different folder tree in the new domain).
Before using the script, read the header to give you a clue as to the syntax. I have attached a copy of the code in a text file at the bottom of this post.
<#####################################################################
 Copy-UserProfile.ps1
Syntax: Copy-UserProfile.ps1 -srcDomain <SourceDomain> -destDomain <DestinationDomain> -scriptPathPrefix <ScriptPrefix> -srcsAMAccountName <UserNetbiosName> -destsAMAccountName <UserNetbiosName>
Example: Copy-UserProfile.ps1 -srcDomain domain1.local -destDomain woodgrovebank.com -scriptPathPrefix domain1\ -srcsAMAccountName carlh -destsAMAccountName carlh2
Purpose: This sets the ProfilePath, ScriptPath, HomeDrive and HomeDirectory of the user carlh2 in the woodgrovebank.com domain to the same settings as those for carlh in domain1.local. Additionally, the scriptPath attribute content is prefixed with domain1\
This script is provided "AS IS" with no warranties, confers no rights and is not supported by the authors or Microsoft Corporation. Use of this script sample is subject to the terms specified at http://www.microsoft.com/info/copyright.mspx.
#####################################################################>
Param (
    [Parameter()][string]$srcDomain='',
    [Parameter()][String]$destDomain='',
    [Parameter()][String]$scriptPathPrefix='',
    [Parameter()][String]$srcsAMAccountName='',
    [Parameter()][String]$destsAMAccountName=''
)
Function GetSetUserHelp () {
$helptext=@"
NAME: Copy-UserProfile.ps1

Used to copy the User Profile, Logon Script and Home details of a user
from one domain to another.

PARAMETERS:
-srcDomain           Source Domain (Required)
-destDomain          Destination Domain (Required)
-scriptPathPrefix    Prefix to add to Script Path attribute (include any back slashes or forward slashes as required)
-srcsAMAccountName   Netbios name of the user account in the source domain (Required)
-destsAMAccountName  Netbios name of the user account in the destination domain (Required)

SYNTAX:
Copy-UserProfile.ps1 -srcDomain domain1.local -destDomain woodgrovebank.com -scriptPathPrefix domain1\ -srcsAMAccountName carlh -destsAMAccountName carlh2

This sets the ProfilePath, ScriptPath, HomeDrive and HomeDirectory of the user carlh2 in the
woodgrovebank.com domain to the same settings as those for carlh in domain1.local. Additionally,
the scriptPath attribute content is prefixed with domain1\
"@
$helptext
exit
}
Function Get-LDAPUser ($UserName, $SourceDomain) {
    $domain1 = new-object DirectoryServices.DirectoryEntry("LDAP://$SourceDomain")
    $searcher = new-object DirectoryServices.DirectorySearcher($domain1)
    $searcher.filter = "(&(objectClass=user)(sAMAccountName=$UserName))"
    $searcher.findone().getDirectoryEntry()
    $domain1 = ""
}

Function Set-LDAPUser ($UserName2, $DestinationDomain) {
    $domain2 = new-object DirectoryServices.DirectoryEntry("LDAP://$DestinationDomain")
    $searcher = new-object DirectoryServices.DirectorySearcher($domain2)
    $searcher.filter = "(&(objectClass=user)(sAMAccountName=$UserName2))"
    $destUser = $searcher.findone().getDirectoryEntry()
    $destUser.scriptPath = "$Global:ScriptPathPrefix" + $Global:srcUser.scriptPath
    $destUser.profilePath = $Global:srcUser.profilePath
    $destUser.homeDrive = $Global:srcUser.homeDrive
    $destUser.homeDirectory = $Global:srcUser.homeDirectory
    $destUser.setinfo()
    $domain2 = ""
}
if(!($srcDomain)) {"Source Domain Required";GetSetUserHelp}
if(!($destDomain)) {"Destination Domain Required";GetSetUserHelp}
if(!($srcsAMAccountName)) {"Netbios Name of Source Account Required";GetSetUserHelp}
if(!($destsAMAccountName)) {"Netbios Name of Destination Account Required";GetSetUserHelp}
$Global:ScriptPathPrefix = $ScriptPathPrefix
$Global:srcUser = get-ldapuser $srcsAMAccountName $srcDomain
Write-Host $Global:srcUser.displayName "in domain $srcDomain settings are:"
$Global:srcUser.scriptPath
$Global:srcUser.profilePath
$Global:srcUser.homeDrive
$Global:srcUser.homeDirectory
set-ldapuser $destsAMAccountName $destDomain
$Global:destUser = get-ldapuser $destsAMAccountName $destDomain
Write-Host ""
Write-Host $Global:destUser.displayName "in domain $destDomain settings are now:"
$Global:destUser.scriptPath
$Global:destUser.profilePath
$Global:destUser.homeDrive
$Global:destUser.homeDirectory
$Global:destUser = ""
$Global:srcUser = ""
$ScriptPathPrefix = ""
So here's another script that I quickly knocked up. Basically, I was asked whether there was a way of finding out whether the Domain Admins group was a member of the local Administrators group on a list of computers. Powershell to the rescue; I'm really getting into this Powershell malarkey.
It is rather rudimentary and could be made a bit more usable by getting it to search AD, and even specific OU structures in AD. Examples of this follow later in my TechNet blog.
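As a flavour of that extension, here is a minimal sketch (the OU distinguished name is made up for illustration) of how the $Computers list could be built from a specific OU in AD instead of from the text file:

```powershell
# Hypothetical: bind to a specific OU rather than reading computers.txt.
# Replace the DN below with an OU from your own domain.
$ou = [ADSI]"LDAP://OU=Workstations,DC=domain1,DC=local"
$searcher = New-Object DirectoryServices.DirectorySearcher($ou)
$searcher.Filter = "(objectCategory=computer)"
$searcher.PageSize = 1000          # page results so large OUs are returned in full
$Computers = $searcher.FindAll() | ForEach-Object { $_.Properties.name }
```

Swapping this in for the Get-Content line in the script below would make the computer list come from AD with no other changes.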
Script attached to blog below (requires removal of txt extension to work).
<######################################################################
SCRIPT IsADMemberOfLocalAdmins.ps1
SYNTAX
.\IsADMemberOfLocalAdmins.ps1 -InputFile <.\ComputerList.txt> -OutputFile <.\OutPutFile.txt>
-InputFile   Text file containing list of Computers to query
-OutputFile  Text file containing results from the script
SYNOPSIS
Queries the Local Administrators group on the computers listed in the
text file provided as a parameter, to determine if Domain Admins is
listed as a member.
NOTE
The script's parameters are optional; defaults are used if none are supplied.
I recommend you have the relevant permissions in the domain and on the
computers being queried for optimal results.
This script is provided "AS IS" with no warranties, confers no rights and is not supported by the authors or employer.
AUTHOR Carl Harrison
VERSION: 1.0 - First cut
######################################################################>
# Change these two to suit your needs
Param (
    [Parameter()][string]$InputFile='.\computers.txt',
    [Parameter()][String]$OutputFile='.\IsDAMemberOfAdminsOutput.txt'
)
$ChildGroups = "Domain Admins"
$LocalGroup = "Administrators"
$MemberNames = @()
$OutputResults = @()
$Computers = Get-Content $InputFile
foreach ( $Computer in $Computers ) {
    $Group = [ADSI]"WinNT://$Computer/$LocalGroup,group"
    $Members = @($Group.psbase.Invoke("Members"))
    $Members | ForEach-Object {
        $MemberNames += $_.GetType().InvokeMember("Name", 'GetProperty', $null, $_, $null)
    }
    $ChildGroups | ForEach-Object {
        $output = "" | Select-Object Computer, Group, InLocalAdmin
        $output.Computer = $Computer
        $output.Group = $_
        $output.InLocalAdmin = $MemberNames -contains $_
        Write-Output $output
        $OutputResults += $output
    }
    $MemberNames = @()
}
$OutputResults | Export-CSV -NoTypeInformation $OutputFile
Once again, another script to help one of my colleagues, this time in need of a method of bulk deleting objects in AD taken from a list in a CSV file. In this instance he needed it for deleting groups that had been determined as no longer useful. Bizarrely, this type of script did not exist when he searched for it (I would have thought someone would have written something like this previously). Actually, I had written some of this code over 8 years ago and decided to repurpose it for my colleague.
Below is a listing of the VBScript. It reads in a file named Groups.csv that contains a list of all groups (sAMAccountNames) to be deleted (the original CSV file also had a second column containing the group type integer, but the script strips this). The script works in the domain of the currently logged-on credentials, so you need the necessary permissions in AD for it to work.
Normally, I comment my scripts a lot more, but this was a rush order :-) and I haven't had the time to revisit it (and I am trying to move away from VBScript now).
As my colleague has proven, the script is easily altered to delete any type of object, and these scripts have been posted to Microsoft Script Center.
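For anyone heading the same way, here is a rough, untested PowerShell sketch of the same CSV-driven delete (the file name and column layout follow the VBScript's assumptions; treat it as an illustration rather than a replacement):

```powershell
# Hypothetical PowerShell equivalent of the VBScript listed below.
# Assumes groups.csv has the group sAMAccountName in its first column.
Import-Csv .\groups.csv -Header sAMAccountName, GroupType | ForEach-Object {
    $searcher = New-Object DirectoryServices.DirectorySearcher
    $searcher.Filter = "(&(objectCategory=group)(sAMAccountName=$($_.sAMAccountName)))"
    $result = $searcher.FindOne()
    if ($result) {
        $group = $result.GetDirectoryEntry()
        # Delete the group from its parent container, mirroring objOU.Delete
        $parent = [ADSI]$group.Parent
        $parent.Delete('group', "CN=$($group.cn)")
    }
}
```

As with the VBScript, changing the objectCategory in the filter and the class name passed to Delete would let it remove other object types.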
'Script deletes security groups from a csv file.
'csv format is strsAMGroupName,Whatever
'This script is offered with no warranty
'On Error Resume Next 'used in case group not found
Option Explicit
Const ForReading = 1
Dim strL, spl1, strOU, strGroupCN, strGroupName
Dim objFSO, objInputFile
Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objInputFile = objFSO.OpenTextFile(".\groups.csv", ForReading) 'your csv file
wscript.echo "script started"
'extract from csv file
Do Until objInputFile.AtEndOfStream
    strL = objInputFile.ReadLine
    spl1 = Split(strL, ",")
    strGroupName = spl1(0)
    If GroupExists(strGroupName) = True Then
        'WScript.Echo strGroupName & " exists."
        DelGroup
    End If
Loop
Set objFSO = Nothing
Set objInputFile = Nothing
wscript.echo "script finished"
'group exist check
Function GroupExists(strsAMGroupName)
Dim strDNSDomain, strFilter, strQuery
Dim objConnection, objCommand, objRootLDAP, objLDAPGroup, objRecordSet
GroupExists = False
Set objConnection = CreateObject("ADODB.Connection")
Set objCommand = CreateObject("ADODB.Command")
Set objRootLDAP = GetObject("LDAP://RootDSE")
objConnection.Provider = "ADsDSOObject"
objConnection.Open "Active Directory Provider"
Set objCommand.ActiveConnection = objConnection
objCommand.Properties("Page Size") = 1000
'objCommand.Properties("Searchscope") = ADS_SCOPE_SUBTREE
strDNSDomain = objRootLDAP.Get("DefaultNamingContext")
strFilter = "(&(objectCategory=group)(sAMAccountName=" & strsAMGroupName & "))"
strQuery = "<LDAP://" & strDNSDomain & ">;" & strFilter & ";sAMAccountName,adspath,CN;subTree"
objCommand.CommandText = strQuery
'WScript.Echo strFilter
'WScript.Echo strQuery
Set objRecordSet = objCommand.Execute
If objRecordSet.RecordCount = 1 Then
    objRecordSet.MoveFirst
    'WScript.Echo "We got here " & strsAMGroupName
    'WScript.Echo objRecordSet.Fields("sAMAccountname").Value
    'WScript.Echo objRecordSet.Fields("adspath").Value
    If objRecordSet.Fields("sAMAccountname").Value = strsAMGroupName Then
        GroupExists = True
        Set objLDAPGroup = GetObject(objRecordSet.Fields("adspath").Value)
        strOU = objLDAPGroup.Parent
        strGroupCN = objRecordSet.Fields("CN").Value
    End If
Else
    WScript.Echo strsAMGroupName & " Group doesn't exist or Duplicate sAMAccountName"
    GroupExists = False
    strGroupCN = ""
    strOU = ""
End If
objRecordSet.Close
Set objConnection = Nothing
Set objCommand = Nothing
Set objRootLDAP = Nothing
Set objLDAPGroup = Nothing
Set objRecordSet = Nothing
End Function
Sub DelGroup
Dim objOU
'WScript.Echo strOU
'WScript.Echo strGroupCN
Set objOU = GetObject(strOU)
objOU.Delete "Group", "cn=" & strGroupCN
WScript.Echo strGroupName & " (CN=" & strGroupCN & ") has been deleted."
Set objOU = Nothing
strGroupCN = ""
End Sub
I have been scripting using VBScript for some time now, and only on occasion have I had the motivation and\or reason to write a script using Powershell, but in the last 9 months I have decided to make a concerted effort to utilise the power of Powershell as my main scripting technology. In this period I have produced a number of scripts, and I will be adding them to my blog in case anybody finds them useful.
So my first offering is something I wrote for a colleague of mine that queries all the Windows Servers in your domain and gathers their shares, the permissions set on the shares, and the permissions set on the NTFS file system at the share level (not lower). All this is output to 2 CSV files in the same folder that the script is run from. There are no parameters for the script, and I hadn't written it to search specific OU structures or to read from a text-file-based list of computers. I have since written another script, which I will be posting here, that has this functionality, so if you need it, it can be done.
The script requires that you have the necessary Rights on all the Servers you will be querying to get back all the information.
Below is a listing of the script, but I have attached the code in a text file as well (obviously remove the txt extension to use it).
<######################################################################
SCRIPT GetPermissions.ps1

SYNOPSIS
Gets all Servers from the domain from which the script is run (i.e. the
domain you are logged in to). Then uses the server list to query each
server's Share permissions and the NTFS permissions of each share root
folder. The script creates 2 files in the same location the script is
run from (sharereport.csv and ntfsreport.csv).

NOTE
Script requires no parameters or arguments.
I recommend you have the relevant permissions in the domain and on the
servers being queried for optimal results.
Myself or my employer do not warrant this script in any way and use of
it is entirely at the user's own risk.
The working functions of this script are taken from examples found at
the Microsoft Script Center web site and are written by Boe Prox.

AUTHOR Carl Harrison
The script utilises 2 functions written by Boe Prox and publicly
available at Microsoft Script Center. They have been altered slightly
to work with this script.

VERSION: 1.0 - First cut
         1.1 - Sorted FQDN Ping instead of netbios ping
######################################################################>
function Get-SharePermissions {
<#
.SYNOPSIS
    Retrieves share permissions. This function was extracted from a
    Powershell Module written by Boe Prox named FileSharePermissions.psm1.
.DESCRIPTION
    Retrieves share permissions.
.PARAMETER computer
    Name of server to test.
.EXAMPLE
    Get-SharePermissions -computer Test
#>
[cmdletbinding(
    DefaultParameterSetName = 'computer',
    ConfirmImpact = 'low')]
Param(
    [Parameter(
        Mandatory = $True,
        Position = 0,
        ParameterSetName = 'computer',
        ValueFromPipeline = $True)]
    [array]$computer
)
Begin {
    #Prepare Share report
    $sharereport = @()
}
Process {
    #Iterate through computers
    ForEach ($c in $computer) {
        Try {
            #Write-Verbose "Computer: $($c)"
            #Retrieve share information from computer
            $ShareSec = Get-WmiObject -Class Win32_LogicalShareSecuritySetting -ComputerName $c -ea stop
            ForEach ($Shares in $sharesec) {
                #Write-Verbose "Share: $($Shares.name)"
                #Try to get the security descriptor
                $SecurityDescriptor = $Shares.GetSecurityDescriptor()
                #Iterate through each descriptor
                ForEach ($DACL in $SecurityDescriptor.Descriptor.DACL) {
                    $arrshare = New-Object PSObject
                    $arrshare | Add-Member NoteProperty Computer $c
                    $arrshare | Add-Member NoteProperty Name $Shares.Name
                    $arrshare | Add-Member NoteProperty ID $DACL.Trustee.Name
                    #Convert the access mask into something more readable
                    Switch ($DACL.AccessMask) {
                        2032127     {$AccessMask = "FullControl"}
                        1179785     {$AccessMask = "Read"}
                        1180063     {$AccessMask = "Read, Write"}
                        1179817     {$AccessMask = "ReadAndExecute"}
                        -1610612736 {$AccessMask = "ReadAndExecuteExtended"}
                        1245631     {$AccessMask = "ReadAndExecute, Modify, Write"}
                        1180095     {$AccessMask = "ReadAndExecute, Write"}
                        268435456   {$AccessMask = "FullControl (Sub Only)"}
                        default     {$AccessMask = $DACL.AccessMask}
                    }
                    $arrshare | Add-Member NoteProperty AccessMask $AccessMask
                    #Convert the ACE type into something more readable
                    Switch ($DACL.AceType) {
                        0 {$AceType = "Allow"}
                        1 {$AceType = "Deny"}
                        2 {$AceType = "Audit"}
                    }
                    $arrshare | Add-Member NoteProperty AceType $AceType
                    #Add to existing array
                    $sharereport += $arrshare
                }
            }
        }
        #Catch any errors
        Catch {
            $arrshare = New-Object PSObject
            $arrshare | Add-Member NoteProperty Computer $c
            $arrshare | Add-Member NoteProperty Name "NA"
            $arrshare | Add-Member NoteProperty ID "NA"
            $arrshare | Add-Member NoteProperty AccessMask "NA"
            #Add to existing array
            $sharereport += $arrshare
        }
        Finally {
            #Do Nothing Currently
        }
    }
}
End {
    #Return report
    #$ShareReport | Export-Csv -notypeinformation "sharereport.csv"
    return $sharereport
}
}
function Get-ShareNTFSPermissions {
<#
.SYNOPSIS
    Retrieves NTFS permissions on a share. This function was extracted from
    a Powershell Module written by Boe Prox named FileSharePermissions.psm1.
.DESCRIPTION
    Retrieves NTFS permissions on a share.
.PARAMETER computer
    Name of server to test.
.EXAMPLE
    Get-ShareNTFSPermissions -computer Test
#>
[cmdletbinding(
    DefaultParameterSetName = 'computer',
    ConfirmImpact = 'low')]
Param(
    [Parameter(
        Mandatory = $True,
        Position = 0,
        ParameterSetName = 'computer',
        ValueFromPipeline = $True)]
    [array]$computer
)
Begin {
    #Prepare NTFS Share report
    $ntfsreport = @()
}
Process {
    #Iterate through each computer
    ForEach ($c in $computer) {
        Try {
            Write-Verbose "Computer: $($c)"
            #Gather share information, excluding the default admin shares
            $shares = Gwmi -comp $c Win32_Share -ea stop | ? {$_.Name -ne 'ADMIN$' -AND $_.Name -ne 'C$' -AND $_.Name -ne 'IPC$'} | Select Name,Path
            ForEach ($share in $shares) {
                #Iterate through shares
                Write-Verbose "Share: $($share.name)"
                If ($share.path -ne "") {
                    #Retrieve ACL information from the share's root folder
                    $remoteshare = $share.path -replace ":","$"
                    Try {
                        #Gather NTFS security information from each share
                        $acls = Get-ACL "\\$c\$remoteshare"
                        #Iterate through each ACL
                        ForEach ($acl in $acls.access) {
                            If ($acl.FileSystemRights -match "\d") {
                                Switch ($acl.FileSystemRights) {
                                    2032127     {$AccessMask = "FullControl"}
                                    1179785     {$AccessMask = "Read"}
                                    1180063     {$AccessMask = "Read, Write"}
                                    1179817     {$AccessMask = "ReadAndExecute"}
                                    -1610612736 {$AccessMask = "ReadAndExecuteExtended"}
                                    1245631     {$AccessMask = "ReadAndExecute, Modify, Write"}
                                    1180095     {$AccessMask = "ReadAndExecute, Write"}
                                    268435456   {$AccessMask = "FullControl (Sub Only)"}
                                    default     {$AccessMask = "Unknown"}
                                }
                            }
                            Else {
                                $AccessMask = $acl.FileSystemRights
                            }
                            $arrntfs = New-Object PSObject
                            #Build NTFS Report entry
                            $arrntfs | Add-Member NoteProperty Computer $c
                            $arrntfs | Add-Member NoteProperty ShareName $Share.name
                            $arrntfs | Add-Member NoteProperty Path $share.path
                            $arrntfs | Add-Member NoteProperty NTFS_User $acl.IdentityReference
                            $arrntfs | Add-Member NoteProperty NTFS_Rights $AccessMask
                            $ntfsreport += $arrntfs
                        }
                    }
                    Catch {
                        $arrntfs = New-Object PSObject
                        #Build NTFS Report entry
                        $arrntfs | Add-Member NoteProperty Computer $c
                        $arrntfs | Add-Member NoteProperty ShareName "NA"
                        $arrntfs | Add-Member NoteProperty Path "NA"
                        $arrntfs | Add-Member NoteProperty NTFS_User "NA"
                        $arrntfs | Add-Member NoteProperty NTFS_Rights "NA"
                        #Add to existing array
                        $ntfsreport += $arrntfs
                    }
                    Finally {
                        #Do nothing currently
                    }
                }
            }
        }
        Catch {
            $arrntfs = New-Object PSObject
            $arrntfs | Add-Member NoteProperty Computer $c
            $arrntfs | Add-Member NoteProperty ShareName "NA"
            $arrntfs | Add-Member NoteProperty Path "NA"
            $arrntfs | Add-Member NoteProperty NTFS_User "NA"
            $arrntfs | Add-Member NoteProperty NTFS_Rights "NA"
            #Add to existing array
            $ntfsreport += $arrntfs
        }
        Finally {
            #Do Nothing Currently
        }
    }
}
End {
    #Return report
    #$ntfsreport | Export-Csv -notypeinformation "ntfsreport.csv"
    return $ntfsreport
}
}
#THIS IS THE START OF THE SCRIPT
#Define Some variables and Arrays
$strOperatingSystem = "*Server*"
$Servers = @()
$ServersShareReport = @()
$ServersNTFSReport = @()
#Set up the Directory Search parameters
$objDomain = New-Object System.DirectoryServices.DirectoryEntry
$objSearcher = New-Object System.DirectoryServices.DirectorySearcher
$objSearcher.SearchRoot = $objDomain
$objSearcher.Filter = "(operatingSystem=$strOperatingSystem)"
$objSearcher.PageSize = 1000
#Define the Attributes we need from the search
$colProplist = "name","dnshostname","operatingsystemversion"
foreach ($i in $colPropList){$objSearcher.PropertiesToLoad.Add($i)}
#Do the Search
$colResults = $objSearcher.FindAll()
$Ping = New-Object System.Net.NetworkInformation.Ping
ForEach ($objResult in $colResults) {
    $Server = $objResult.Properties.name
    $ServerDNS = $objResult.Properties.dnshostname
    Write-Host "Pinging $ServerDNS"
    $errorActionPreference = "SilentlyContinue"
    $Pingy = $ping.send($ServerDNS,5000)
    $ServerShareOutput = $Null
    $ServerNTFSOutput = $Null
    Try {
        #This is where we go get the information
        If ($Pingy.status.tostring() -eq "Success") {
            #Cool, we can get to the server, now let's gather some information
            Write-Host "$Server Available"
            $errorActionPreference = "Continue"
            $ServerShareOutput = Get-SharePermissions -computer $ServerDNS
            $ServerNTFSOutput = Get-ShareNTFSPermissions -computer $ServerDNS
            $ServersShareReport += $ServerShareOutput
            $ServersNTFSReport += $ServerNTFSOutput
        }
        else {
            #Uh-oh, either WMI is restricted or we cannot get to the current server
            Write-Host "$Server Not Available"
            $FailedShareServer = New-Object PSObject
            $FailedShareServer | Add-Member NoteProperty Computer "$Server"
            $FailedShareServer | Add-Member NoteProperty Name "Server Not Available"
            $ServersShareReport += $FailedShareServer
            $FailedNTFSServer = New-Object PSObject
            $FailedNTFSServer | Add-Member NoteProperty Computer "$Server"
            $FailedNTFSServer | Add-Member NoteProperty ShareName "Server Not Available"
            $ServersNTFSReport += $FailedNTFSServer
        }
    }
    Catch {
        #And this just catches any weird error events
        Write-Host "$Server Not Available"
        $FailedShareServer = New-Object PSObject
        $FailedShareServer | Add-Member NoteProperty Computer "$Server"
        $FailedShareServer | Add-Member NoteProperty Name "Server Not Available"
        $ServersShareReport += $FailedShareServer
        $FailedNTFSServer = New-Object PSObject
        $FailedNTFSServer | Add-Member NoteProperty Computer "$Server"
        $FailedNTFSServer | Add-Member NoteProperty ShareName "Server Not Available"
        $ServersNTFSReport += $FailedNTFSServer
    }
    Finally {
        $errorActionPreference = "Continue"
        $pingy = $null
        $Server = $Null
    }
}
#Now we have the data, let's save it to file - in the same location the script is run from
$ServersShareReport | Export-Csv -notypeinformation ".\sharereport.csv"
$ServersShareReport = @()
$ServersNTFSReport | Export-Csv -notypeinformation ".\ntfsreport.csv"
$ServersNTFSReport = @()
Recently, a colleague of mine had a customer with a Hyper-V server whose operating system had failed. They did not have any backups of the operating system and had not taken any backups of the Hyper-V guests using Export\Import (I won't bang on about Best Practices here; I think you can probably guess what I would say if I did though :-)). They had access to the disks on which the Hyper-V guest VHDs and configuration xml files were located and intended reinstalling the OS and Hyper-V Role. They were asking whether there is a method of recovering the Hyper-V guests to the new server. Fortunately, another one of my colleagues suggested the method detailed in this great blog http://blogs.msdn.com/b/robertvi/archive/2008/12/19/howto-manually-add-a-vm-configuration-to-hyper-v.aspx which I am led to believe worked.
Funnily enough, I had to use parts of this same method to recover a couple of Hyper-V guests on my PC after I had done some disk volume management and moved the VHDs and config files around a bit, which obviously caused permissions changes; honestly guv'nor, I had forgotten I had these guests running from the said volume (I know, Best Practice and all that ;-) , but it is just my laptop and not a production service-providing environment).
My colleague then commented that it would be cool if such a process were scripted. Tada ...... "his wish was my command". To be honest, I had already put this on my list of things to do, to extend my scripting skills from VBScript to Powershell, and having to actually go through the process myself made me agree with my colleague that a script would be much easier and useful to anybody else having to do it. There may actually be another script out there that does this, but at least I actually got to do some Powershell scripting :-) . It may not be the most correctly written script, but it is my first crack of the whip (as the saying goes). There is currently no error trapping and\or exception handling, but I have provided some command line markers that should make it easier to determine at which points it fails (if it does). I intend adding error handling later, but if I didn't share the script now it would forever sit on my laptop as a work in progress without being used (and tested).
What I must say is that, after recovering a Hyper-V guest using this script, the guest is in an unsupported state, and it is highly recommended that before starting the guest OS you use Hyper-V Manager to Export the guest, remove it from Hyper-V Manager and then Import it again into Hyper-V Manager. This will set all the permissions on all files as they should be.
This script definitely works with Windows Server 2008 R2 Hyper-V and I suspect it also works on Windows Server 2008 (I don't have an environment to check this though). The script utilises icacls and mklink (both provided as command line applications with the OS), because I was unable to use inbuilt Powershell functionality without some deep coding knowledge (not quite there yet) requiring that I write some additional libraries that someone has actually already made available here http://pscx.codeplex.com/ . Basically, the Powershell cmdlet Set-ACL is not able to change the owner when setting permissions on certain types of files and folders; this requires some special jiggery-pokery that I have not yet mastered. You may ask why I didn't decide to use the PSCX myself; well, I thought it best to produce a script that is standalone and does not require anything that is not already available through Powershell.
Additionally, the script will only work correctly if the volume paths (i.e. drive letters) are the same as from before the issue that caused the Hyper-V guests to be disconnected from the Hyper-V Host Management Services.
The only other thing you will need to do when using this script is to open Powershell with elevated privileges and to run "set-executionpolicy remotesigned".
It may be worth checking the formatting of the code if you copy and paste it from this page as some of the lines have wrapped to the next line on this page. Alternatively, just download the txt file at the bottom of this post.
Enjoy. Oh and please provide feedback if it works or not for you.
####################################################################
#
# Recover-VM.ps1
#
# Syntax:  Recover-VM.ps1 "<drive:\path\VM_Config_File_as_GUID.xml>"
#          (Use quotes in the argument)
#
# Example: Recover-VM.ps1 "v:\VirtualMachine1\Virtual Machines\7660AA46-BA41-4171-8820-CDD7C71050A0.xml"
#
# Purpose: Creates Symbolic Links and sets the required permissions
#          to make a Hyper-V Virtual Machine available in Hyper-V
#          Management Console. This script is intended for use in a
#          recovery scenario where a Virtual Guest's files have been
#          manually moved to another Virtual Server, but the volume
#          drive mappings are the same, only.
#          The methodology that this script employs is as discussed
#          in the blog referenced below.
#          It should be noted that once this script has been used
#          the Virtual Guest is in an unsupported state and should
#          be exported and re-imported in Hyper-V Manager to
#          correctly set all the Virtual Machine's permissions.
#
# Params:  The drive, folder and filename of the Virtual Machine
#          xml configuration file.
#
# Req:     Windows 2008 or above, Powershell V2, mklink, icacls.
#          Run "set-executionpolicy remotesigned" in Powershell.
#
# Ref:     http://blogs.msdn.com/b/robertvi/archive/2008/12/19/howto-manually-add-a-vm-configuration-to-hyper-v.aspx
#          http://blogs.technet.com/b/carlh
#
# Author:  CarlH
#
# Neither I nor my employer warrant this script in any way and use
# of it is entirely at the user's own risk. The script is intended to
# recover Virtual Machines only in a recovery scenario, and such VMs
# should be exported and imported onto the Hyper-V server to ensure
# a supported state.
#
# Version: 1.0 - First cut
#
####################################################################
Param($VMConfigFile)
Write-Host ""
Write-Host "---------------Recover-VM Script Start"
$SystemDrive = (Get-Item env:systemdrive).Value
$VMFolder = (Get-Item $VMConfigFile).directoryname
$VMFolder = (Get-Item $VMFolder).parent
$VMFolderName = $VMFolder.fullname
Write-Host ""
#Read from the VM Configuration file. VM GUID for Service account. VM Name.
Write-Host "---------------Reading VM Configuration File"
[xml]$VMXML = Get-Content $VMConfigFile
$VMSvcAccount = $VMXML.configuration.properties.global_id."#text"
$VMName = $VMXML.configuration.properties.name."#text"
Write-Host "---------------Completed reading VM Configuration File"
Write-Host ""
#This function takes a file name including the full path as a ThisItem string parameter and gets the current ACL.
#It adds the VM Service account to the current ACL and then sets the new ACL.
function SetACL
{
    Param($ThisItem)
    Write-Host ""
    Write-Host "---------------Setting Permissions for $ThisItem"
    Write-Host ""
    $Acl = Get-Acl $ThisItem
    $AccessRule = New-Object system.security.accesscontrol.filesystemaccessrule("NT VIRTUAL MACHINE\$VMSvcAccount","FullControl","Allow")
    $Acl.SetAccessRule($AccessRule)
    Set-Acl $ThisItem $Acl
    Write-Host ""
    Write-Host "---------------Finished Permissions for $ThisItem"
}
#This Function uses the cmd line utility mklink to create a symbolic link of the physical file passed to it in the HardFile parameter as a string#The LinkLocation string parameter defines what folder the link will be created in under systemdrive\programdata\Microsoft\Windows\Hyper-V\
function MkSymLink
{
    Param($HardFile, $LinkLocation)
    Write-Host ""
    Write-Host "---------------Creating Symbolic Link for $HardFile"
    Write-Host ""
    $LinkFile = (Get-Item $HardFile).name
    #Write-Host "%systemdrive%\programdata\Microsoft\Windows\Hyper-V\$LinkLocation\$LinkFile" $HardFile
    cmd /c mklink "%systemdrive%\programdata\Microsoft\Windows\Hyper-V\$LinkLocation\$LinkFile" $HardFile
    Write-Host ""
    Write-Host "---------------Finished Symbolic Link for $HardFile"
}
#Get the VM config xml Fully Qualified Name and pass it to the MkSymLink function with the location for storing the Symbolic Link
$VMFileString = (Get-Item $VMConfigFile).name
MkSymLink -HardFile $VMConfigFile -LinkLocation "Virtual Machines"
#Call the SetACL function and pass it the FQN of the Symbolic Link on which we need to set the ACL
$ConfigFileSymLink = "$SystemDrive\programdata\Microsoft\Windows\Hyper-V\Virtual Machines\$VMFileString"
SetACL -ThisItem $ConfigFileSymLink
#Get the FQN of the boot VHD from the VM Config xml file passed into the script. Then pass this to the SetACL function
$VMVHDBootFile = $VMXML.configuration.'_83f8638b-8dca-4152-9eda-2ca8b33039b4_'.controller0.drive0.pathname."#text"
SetACL -ThisItem $VMVHDBootFile
#Build a string that defines the VM Service Account and use it with the cmd line utility icacls to set the permissions for all folders and files of the VM
Write-Host ""
Write-Host "---------------Setting Permissions for Folder $VMFolderName"
Write-Host ""
$VMSvcAccountFQN = 'NT VIRTUAL MACHINE\'+$VMSvcAccount
cmd /c icacls ""$VMFolderName"" /T /grant ""$VMSvcAccountFQN""':(F)'
Write-Host ""
Write-Host "---------------Finished Permissions for Folder $VMFolderName"
#Discover if there are any Snapshots. If so, create a Symbolic Link for each one by passing the FQN of the snapshot and the snapshot link location.
#Also, call SetACL to set permissions for the VM Service account on the Symbolic Link.
foreach ($Thing in (Get-ChildItem $VMFolderName))
{
    If ($Thing.name -eq "Snapshots")
    {
        $SnapshotsFolder = $Thing.Fullname
        foreach ($Thing2 in (Get-ChildItem $SnapshotsFolder))
        {
            If ($Thing2.extension -eq ".xml")
            {
                $VMSnapshotFileString = $Thing2.name
                MkSymLink -HardFile $Thing2.fullname -LinkLocation "Snapshots"
                $SnapshotFileSymLink = "$SystemDrive\programdata\Microsoft\Windows\Hyper-V\Snapshots\$VMSnapshotFileString"
                SetACL -ThisItem $SnapshotFileSymLink
            }
        }
    }
}
Write-Host ""
Write-Host "----------------End Script Recover-VM.ps1"
I recently had to assist someone who had virtualised a single Windows 2003 Domain Controller from physical hardware to a virtualised Virtual Guest file using a third party (Non Microsoft) utility (not that I am saying this issue would be any different using a Microsoft utility). This is not an advised\recommended practice for Domain Controllers, but in this instance they had no choice and the DC concerned was in fact a single DC (once again; Bad Practice, I know, but we work with what we have sometimes) in a test environment. Ignoring the list of bad practices, the resulting DC was producing Event ID 13559 errors in the NTFRS log (see description below)
The File Replication Service has detected that the replica root path has changed from "x:\windows\sysvol\domain" to "x:\windows\sysvol\domain". If this is an intentional move then a file with the name NTFRS_CMD_FILE_MOVE_ROOT needs to be created under the new root path. This was detected for the following replica set: "DOMAIN SYSTEM VOLUME (SYSVOL SHARE)"
Changing the replica root path is a two step process which is triggered by the creation of the NTFRS_CMD_FILE_MOVE_ROOT file. [1] At the first poll which will occur in 5 minutes this computer will be deleted from the replica set. [2] At the poll following the deletion this computer will be re-added to the replica set with the new root path. This re-addition will trigger a full tree sync for the replica set. At the end of the sync all the files will be at the new location. The files may or may not be deleted from the old location depending on whether they are needed or not.
Indeed the description is also discussed in this KB article http://support.microsoft.com/kb/887440 and it would appear that the disk signatures for the volumes that host the SYSVOL folder structure are changed during the conversion of the system from physical to virtual. Following the procedures detailed in the error event and in the article did in fact remedy the situation, and the NTFRS event logs showed this fact by logging 13560, 13553 and 13516 (which effectively proves the DC is functioning correctly from an FRS\SYSVOL perspective) over a period of time.
This issue has also been seen during the Restore of a Domain Controller from backup or during a failed attempt to move SYSVOL to another volume as per this article http://support.microsoft.com/kb/842162
I was asked why we recommend not using backups older than the lesser of TSL and Deleted Lifetime and below is my interpretation of the answer I got from a reputable source.
There are 2 main reasons we advise not using backups older than the lesser of Deleted Lifetime and Tombstone Lifetime.
The first reason is very much the same as it has always been: we don't want the possibility of introducing Lingering Objects. Basically, if you do a restore older than TSL, this will introduce what the replication engine perceives as Lingering Objects, and Strict Replication Consistency will kick in and stop replication. Although the objects should disappear and not cause an issue in most circumstances, you still have failed replication and will have to use non-recommended methods of getting it going again (i.e. the "Allow Replication with Divergent or Corrupt Partner" registry hack).
The second reason is more to do with the way group memberships are stored in the Link Table for Deleted objects in Windows 2008 R2 with the Recycle Bin feature enabled. Basically, when an object is removed from a group, the Link Table (in the database) is updated with a flag and date stamp marking the object that has been removed as "De-activated". When an object is deleted, the groups to which it was a member retain the object as a member, but another flag, "Deleted", is added with the time stamp. Both the "Deleted" and "De-activated" flags effectively make the objects not visible as members of the group. In other words, the Link Table still has some knowledge that an object was a member of a group (remember, group membership is stored by the group, not by the object that is a member of the group). This enables the ability to ensure that when an object is undeleted it gets added back to any groups it used to be a member of. The "De-activated" and "Deleted" flags are removed from the Link Table after Deleted Lifetime (i.e. as an object becomes a Recycled object; Tombstoned in previous versions). If a backup older than Deleted Lifetime is used then, at best, we will not get the group memberships of objects back as expected; at worst, we could have inconsistent link tables across DCs (although this is a slim chance and needs certain circumstances to occur, which I am not sure I totally understand).
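To illustrate the second point, here is a small toy model (hypothetical Python, not real AD code or the actual database schema) of how a flagged link-table row ages out, and why a backup older than Deleted Lifetime can no longer bring group memberships back:

```python
from datetime import datetime, timedelta

# Illustrative value only; Deleted Lifetime is configurable in AD.
DELETED_LIFETIME = timedelta(days=180)

class LinkEntry:
    """One row in a (greatly simplified) link table: group -> member."""
    def __init__(self, group, member):
        self.group, self.member = group, member
        self.deleted_at = None  # set when the member object is deleted

    def mark_deleted(self, when):
        # The membership is hidden but remembered, so an undelete
        # within Deleted Lifetime can restore it.
        self.deleted_at = when

    def recoverable(self, now):
        # Once Deleted Lifetime has passed, the flagged row is physically
        # removed and the membership can no longer be recovered.
        return self.deleted_at is None or now - self.deleted_at < DELETED_LIFETIME

entry = LinkEntry("Domain Admins", "jsmith")
entry.mark_deleted(datetime(2011, 1, 1))
print(entry.recoverable(datetime(2011, 3, 1)))   # within Deleted Lifetime -> True
print(entry.recoverable(datetime(2012, 1, 1)))   # past Deleted Lifetime -> False
```

A restore from a backup taken before `deleted_at` plus the lifetime window behaves like the second call: the row it depends on is simply gone.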
How many times have you had to acquire a kernel memory dump, but you or your customer (quite rightly) refuses to have the target system attached to the internet (which is usually needed to download the symbol files)? Well, I have had the dubious pleasure 3 times in the past 3 months. So, to remind me of the process, I decided to write it down for future reference. If you know this already, sorry to waste your time, but for everyone else it's one for your cerebral index.
Firstly you need to get the correct symbol files for the kernel memory dump, and just downloading the ones from WHDC or MSDN for the OS version and Service Pack version is not quite good enough, because the symbols change for each version of the kernel files that is released. There may be kernel files that have changed with Updates since the Service Pack was released.
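To see why the Service Pack level alone is not enough: the symbol server indexes each binary by identifiers taken from the file itself (for an executable, the PE header's TimeDateStamp and SizeOfImage rendered in hex), so two kernels at the same Service Pack level but different update levels resolve to different store paths. Here is a simplified sketch of that convention (a hypothetical helper, and the timestamp/size values below are made up for illustration):

```python
def symbol_store_path(filename, timestamp, size_of_image):
    """Build the symbol-store sub-path for a PE binary.

    Executables live under <name>/<TimeDateStamp><SizeOfImage>/<name>,
    with both numbers in hex, so any rebuild of the kernel yields a
    different path. (Simplified sketch of the SymSrv layout.)
    """
    return f"{filename}/{timestamp:08X}{size_of_image:x}/{filename}"

# Two hypothetical builds of the same file at the same SP level:
print(symbol_store_path("ntoskrnl.exe", 0x4A5BC007, 0x415000))
print(symbol_store_path("ntoskrnl.exe", 0x4CE78A09, 0x418000))
```

The two paths differ, which is exactly why symchk is pointed at the copied binaries rather than at a generic OS/SP download.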
1. Install the Windows Debugging Tools http://www.microsoft.com/whdc/DevTools/Debugging/default.mspx on the computer from which you require a kernel memory dump.
2. Download LiveKd from http://www.microsoft.com/technet/sysinternals/SystemInformation/LiveKd.mspx and on the computer that you require a Kernel Memory Dump from, uncompress it to the location that the Windows Debug Tools have been installed to (the default is "C:\Program Files\Debugging Tools for Windows (X86)\").
3. Copy the ntoskrnl.exe, ntkrnlpa.exe, kernel32.dll and ntdll.dll files (I am not sure if all of these are required every time, but I do it just in case) from the System32 folder of the computer to be debugged (without internet access) to a folder (e.g. c:\debugsymbols\system32) on a computer with internet access. (If the internet facing computer has the same versions of these files as the non internet facing one then there is no need for this part, but the symbol cache will be enormous and all we want is the symbol files for the kernel).
4. Install the Debugging tools on the internet facing system (or copy the installed folder from the server you installed them on previously; the default is “C:\Program Files\Debugging Tools for Windows”).
5. From a command prompt on the Internet facing server, run this command:
C:\<path to debugging tools>\Symchk.exe /if <path to copied file>\*.* /s srv*<path to folder to store symbols locally>*http://msdl.microsoft.com/download/symbols
e.g. C:\Program Files\Debugging Tools for Windows (x86)>Symchk.exe /if c:\debugsymbols\system32\*.* /s srv*c:\debugsymbols*http://msdl.microsoft.com/download/symbols
6. Copy the cached symbols (i.e. the c:\debugsymbols folder and its new contents, minus the \system32 folder with the 4 exe\dll’s in it) from the internet facing server to the original server.
7. On the computer that you require a Kernel Memory Dump from, open WinDbg, click on the "File" menu, and choose "Symbol Search Path". Enter the path SRV*c:\debugsymbols and click Ok. Close Windbg.exe.
8. With Administrator privileges, open a Command prompt and navigate to the WinDbg installation folder (e.g. C:\Program Files\Debugging Tools for Windows (X86)\) if this is where you expanded LiveKd to and run livekd.exe.
9. Type “.dump -f c:\memory.dmp” (without the quotes) to generate the Complete Memory Dump on the C: drive (you will need to make sure there is enough space on this drive).
10. Type q to quit LiveKD.
11. You should find memory.dmp in the root of c:\
Your work is now complete and you can compress and deliver the kernel dump for analysis.
When Windows 7 Beta came out, a colleague and I decided that being able to dual boot from a VHD file would be useful for demos and to test Windows 7 and Windows Server 2008 R2, due to the ability to replace and service VHD files pretty much at will (VHD files can be mounted as volumes in Vista http://blogs.msdn.com/cschotte/archive/2008/03/26/how-to-mount-a-vhd-quickly-under-vista-using-your-mouse.aspx and it is even easier in Windows 7 using the task Attach VHD in Disk Management). So I pulled together some information from various sources and wrote a script (BootFromVHD.vbs) to come up with a process that quickly enables booting from a VHD file.
This method does not provide any way of single booting from a VHD and assumes that either Windows 7 or Windows Server 2008 R2 is already installed and running.
To start with, a VHD is required, and this will only work with a Windows 7 or Windows Server 2008 R2 VHD. I personally found it easier to create a vanilla Windows Server 2008 R2 (in my case; it works equally well with Windows 7) virtual guest in Windows Server 2008 Hyper-V with a fixed disk size. Then, while the guest virtual computer was still running on the Hyper-V server, I ran the following command from an elevated (administrator) command prompt within the guest: "c:\windows\system32\sysprep\sysprep /generalize /oobe /shutdown". This basically re-bases the image and removes anything specific to the hardware it was built on (in this case a virtual environment).
Alternatively, you could use this method to create a VHD from a WIM image.
1. Download Win7 WAIK - http://www.microsoft.com/downloads/details.aspx?FamilyID=696dd665-9f76-4177-a811-39c26d3b3b34&displaylang=en
2. Download WIM2VHD - http://code.msdn.microsoft.com/wim2vhd
3. Windows 7 or Windows 2008 R2 installation media required.
4. Install WAIK on a Windows 7 client
5. Run WIM2VHD using this command line (this can be modified for Win7 or other version of 2008 R2; see the examples on the WIM2VHD page):
cscript wim2vhd.wsf /wim:path /sku:serverstandard /vhd:path /size:15360 /disktype:fixed
Once you have a VHD file it needs to be copied to the hardware from which it will boot.
Very Important Notes:
This can be done manually by using an elevated command prompt in the original operating system on the computer using the Windows 7 version of BCDEdit.exe, or by script (VBScript or PowerShell). I did try to write a VBScript that uses just WMI to carry out this task, but unfortunately I couldn't get the BCDEdit WMI provider to recognise that I had created a new entry for the rest of the script to edit. So, I went with the method of calling BCDEdit.exe from within the VBScript.
I won't explain the method of doing this manually, but will provide the script below. Please be aware that neither I nor my employer hold any responsibility for your use of this script and the outcomes thereof, and I provide the code without guarantees or warranties (basically, test it prior to using it and ensure you are happy with what it is doing; myself and many of my colleagues have used this and have only seen one failure, which was not terminal). The script does create a backup of your existing BCD and will roll back any changes if any errors are encountered. If for some reason you want to roll back the BCD changes, all you need to do is run "bcdedit /import c:\bcdbackup" from an elevated command prompt.
'*************************************************************************************************************
'* bcdbootfromvhd.vbs
'*
'* Purpose: Creates a new BCD Boot Loader Entry by copying an
'* existing entry and making changes
'* Parms: None.
'* Requires: To be run on Vista\Windows 7 or Windows Server 2008 (R2) to access BcdStore using
'* BCDEDIT and WMI.
'* References: 1) Vista Software Development Kit.
'* 2) MSDN Library, "Boot Configuration Data (BCD)" at
'* http://msdn2.microsoft.com/en-us/library/aa362692.aspx
'* 3) "Boot Configuration Data in Windows Vista" at
'* http://download.microsoft.com/download/a/f/7/af7777e5-7dcd-4800-8a0a-b18336565f5b/BCD.doc
'* Note : This function should be invoked as a cscript.
'* Issue the command "cscript //h:cscript" to set the
'* default scripting as cscript before issuing this
'* command.
'* Author : Carl Harrison
Option Explicit
On Error Resume Next
Dim strComputer
Dim strNewLoaderGUID
Dim strVHDPath, strBootEntryName
Dim objStoreClass, objStore, objDefault, objElement
Dim objShell, objBCDEditCopyCmd
Dim strCommandOutputReadLine
Dim varDefaultLoader
Const BcdLibraryString_Description = &h12000004
Const Current = "{fa926493-6f1c-4193-a414-58f0b2456d1e}"
Const WindowsImages = &h10200003
strComputer = "."
'Connect to the BCD store with WMI
Set objStoreClass = GetObject("winmgmts:{(Backup,Restore)}\\" & strComputer & "\root\wmi:BcdStore")
if not objStoreClass.OpenStore("", objStore) then
WScript.Echo "Couldn't open the system store!"
WScript.Quit
end if
'Get some info about the current booted OS from the BCD
'We are going to use it later
objStoreClass.OpenStore "", objStore
objStore.OpenObject Current, objDefault
objDefault.GetElement BcdLibraryString_Description, objElement
Set objShell = CreateObject("Wscript.Shell")
'Get a backup of the existing BCD
Set objBCDEditCopyCmd = objShell.Exec("bcdedit /export c:\bcdbackup")
If Instr(objBCDEditCopyCmd.StdOut.ReadAll,"The operation completed successfully.") <> 0 then
WScript.Echo "BCD Backup was created successfully"
Else
WScript.Echo "Failed to create BCD Backup"
WScript.Echo "Script is Exiting"
WScript.Quit
End If
'Popup dialog to ask for the BCD entry human friendly name
'We don’t accept blanks
strBootEntryName = InputBox ("Name of Boot loader entry at Start Up e.g. Windows 7")
If strBootEntryName = "" then
WScript.Echo "Blank Loader Entry Name is not accepted."
WScript.Echo "ROLLING BACK CHANGES"
Set objBCDEditCopyCmd = objShell.Exec("bcdedit /import C:\bcdbackup")
WScript.Quit
End If
'Now copy the existing current BCD entry and determine the new GUID of the new BCD entry
Set objBCDEditCopyCmd = objShell.Exec("bcdedit /copy " & objElement.ObjectID & " /d " & """" & strBootEntryName & """")
Do While Not objBCDEditCopyCmd.StdOut.AtEndOfStream
strCommandOutputReadLine = objBCDEditCopyCmd.StdOut.ReadLine()
If Instr(strCommandOutputReadLine, "The entry was successfully copied to") <> 0 Then
strNewLoaderGUID = Left(Right(strCommandOutputReadLine,39),38)
End If
Loop
WScript.Echo "New Loader GUID = " & strNewLoaderGUID
WScript.Echo
If strNewLoaderGUID = "" Then
WScript.Echo strComputer & vbTAB & "Could not Copy OS Loader."
WScript.Quit
End If
'Popup dialog to get the location and name of the vhd
'The BCD does not need a drive specification
'We do not accept blank entries
strVHDPath = InputBox ("Please provide the path, excluding drive letter and colon and file name of the VHD to boot from (use quotes if there are spaces). So if the path to the vhd is d:\win7.vhd just type \win7.vhd")
If strVHDPath = "" then
WScript.Echo "Blank path is not accepted for " & strNewLoaderGUID
WScript.Echo "ROLLING BACK CHANGES"
Set objBCDEditCopyCmd = objShell.Exec("bcdedit /import C:\bcdbackup")
WScript.Quit
End If
'Ok now call the sub setBCDEdit and pass it the necessary parameters to make changes to the newly copied BCD entry
Call setBCDEdit ("/set " & strNewLoaderGUID & " device vhd=[locate]" & strVHDPath, "device partition", strVHDPath)
Call setBCDEdit ("/set " & strNewLoaderGUID & " osdevice vhd=[locate]" & strVHDPath, "osdevice partition", strVHDPath)
Call setBCDEdit ("/set " & strNewLoaderGUID & " resumeobject " & strNewLoaderGUID, "resumeObject", strNewLoaderGUID)
Call setBCDEdit ("/set " & strNewLoaderGUID & " detecthal on", "detecthal value", "on")
Call setBCDEdit ("/set " & strNewLoaderGUID & " nx OptIn", "nx value", "optin")
varDefaultLoader = MsgBox ("Do you want this to be the default booting OS?", vbYesNo)
If varDefaultLoader = vbYes Then
Call setBCDEdit ("/default " & strNewLoaderGUID, "boot loader", "Default")
End If
Wscript.Echo
'This Sub accepts parameters from the calling line and inserts them into the subsequent bcdedit command line before running them
Sub setBCDEdit (strCmdSwitch, strSwitchDescription, strSwitchSetting)
Set objBCDEditCopyCmd = objShell.Exec("bcdedit " & strCmdSwitch)
If Instr(objBCDEditCopyCmd.StdOut.ReadAll, "The operation completed successfully.") <> 0 Then
WScript.Echo "The " & strSwitchDescription & " for the new Boot Loader " & strNewLoaderGUID & " was successfully set to " & strSwitchSetting
Else
WScript.Echo "Failed to set the " & strSwitchDescription & " for the new Boot Loader to " & strSwitchSetting
End If
End Sub
Some time ago I built my laptop with dual boot and had a few issues along the way. I thought they might be worth sharing, but have only just got round to doing so. This information works equally well with Windows 7 and Windows Server 2008 R2.
ISSUE 1 (Bitlocker PIN enabling for Dual boot partitions that may not be on a domain)
I wanted to dual boot with Vista x64 and Windows 2008 (actually triple boot, but this information is useful either way). And I wanted all partitions (with the exception of the boot partition) to be protected with Bitlocker. I couldn't seem to get Bitlocker to use a PIN for all boot partitions (the other 2 partitions were not going to live on any domain; well, not immediately anyway).
Unfortunately, I had blindly enabled Bitlocker before ensuring the Advanced Settings were enabled (this can be set to be enabled by default on a domain), which allow the setting of a PIN or USB key. Further, it was looking increasingly like I would have to decrypt the partition, enable the advanced settings and then re-encrypt (while at the same time setting a PIN). I did a little bit of searching around and came up with the following: in essence, you can enable Advanced Bitlocker settings post encryption (the easy bit) and then create a PIN afterwards (even easier, but not well known).
Enable Advanced Bitlocker Settings, as per points 1-6 in the section "To turn on BitLocker Drive Encryption with a TPM plus a PIN or with a TPM plus a startup key on a USB flash drive" in this article http://technet.microsoft.com/en-us/library/cc766295.aspx (extract below)
1. Click Start, type gpedit.msc in the Start Search box, and then press ENTER.
2. If the User Account Control dialog box appears, verify that the proposed action is what you requested, and then click Continue. For more information, see Additional Resources later in this document.
3. In the Group Policy Object Editor console tree, click Local Computer Policy, click Administrative Templates, click Windows Components, and then double-click BitLocker Drive Encryption.
4. Double-click the setting Control Panel Setup: Enable Advanced Startup Options. The Control Panel Setup: Enable Advanced Startup Options dialog box appears.
5. Select the Enabled option. For TPM plus a PIN or startup key configurations, you do not need to change any further settings, but you can choose to require or disallow users to create a startup key or PIN. Click OK.
6. Click Start, type gpupdate.exe /force in the Search box, and then press ENTER. Wait for the process to finish.
This doesn’t give you the Option\Dialog to set\create a PIN when the volume is already encrypted, which is a bit of a pain.
Create a PIN for the partition
1. Open a Command prompt as Administrator
2. Use the following command "cscript %systemroot%\system32\manage-bde.wsf -protectors -add %systemdrive% -tpmandpin <4-20 digit numeric PIN>" (without the quotes)
3. Now, to be really sure the PIN is what you want it to be, and you don't trust what you typed in replacement for <4-20 digit numeric PIN> above, open Control Panel | Security | Bitlocker Drive Protection and choose Manage Bitlocker Keys for the current volume, and there should now be a Reset PIN option.
4. Job done. Told you it was even easier (easier being fewer clicks of the mouse).
ISSUE 2 (well not really an issue, but something that might help in sorting out the list of OS’s of the bootloader)
So, I installed Vista x64 then Windows 2008 and another Vista (x86 this time); don’t ask me why I just did, and it suits my needs. The bootloader initial screen showed the really Useful choices of
Microsoft Windows Vista
Microsoft Windows Server 2008
Which was which Vista? Well, in time I got used to it, and the domain connected one was the default, but due to my installation order it was the third in the list; not very intuitive, methinks.
What I really wanted was
Microsoft Windows Vista x64
Microsoft Windows Vista x86
Firstly the following can be used to reorder the display list
Open a Command prompt as Administrator
Before you do anything, back up your BCD by using bcdedit /export "c:\bcdbackup1" (where C: is the volume you're working on). If you make any mistakes in the next bits you can restore it by using bcdedit /import "c:\bcdbackup1"
This simply changes the order in which the bootloader displays the OSs: bcdedit /displayorder {ID1} {ID2} {ID3}, where {ID1}, {ID2} and {ID3} can be determined by typing just bcdedit at the command prompt and noting the entries adjacent to displayorder. Just put them in the order that you want them when creating the command line above. For example, my display order under Windows BootMgr looked like this (where {current} is the primary x64 install of Vista; it will show as {default} if bcdedit is run in any of the other OSs, and {current} will obviously be the current OS you are working in. Be aware of this and use the exact GUID or words that are listed opposite displayorder, otherwise there will be tears)
But I wanted it to look like this
And this is the command line I used bcdedit /displayorder {current} {3bfc9072-594d-11dd-8d96-c955ae3305ea} {cf632714-6411-11dd-95e7-d088af2f2b01}
Secondly, let's rename the entries in the list (it would be useful if the x64 installation displayed as "Microsoft Windows Vista x64" and the x86 version likewise)
Before you do anything, back up your BCD by using bcdedit /export "c:\bcdbackup2" (where C: is the volume you're working on). If you make any mistakes in the next bits you can restore it by using bcdedit /import "c:\bcdbackup2"
To change the display name in the bootloader list, type the following at the command prompt: bcdedit /set {current} DESCRIPTION "Microsoft Windows Vista x64" (the quotes are needed, and {current} is the entry I wanted to change; if the identifier is a GUID or {default}, use that). And this is what you see when you run bcdedit again.
What is a dynamic object? A dynamic object is an auxiliary class introduced in Windows Server 2003 that can be linked to most other object classes in Active Directory (as a class extension). Basically, by specifying that an object is of objectClass type dynamicObject (together with its normal class), it will get a number of extra attributes that effectively affect the characteristics of the object. This is detailed in RFC 2589 http://www.faqs.org/rfcs/rfc2589.html . It is useful to note that existing objects cannot be converted to Dynamic Objects and the specification of the dynamicObject extension class should be made at object creation time. The use of dynamicObjects was primarily introduced for use in application development and to be used in Application Partitions (Naming Contexts).
When an application creates a dynamicObject, a time-to-live (TTL) value is attached to the object and it is the responsibility of the application (client side or server side) to refresh the TTL if the object is to remain in the partition. Once the TTL decrements to 0 (zero), the object will be removed, without being Tombstoned. Each domain controller is therefore responsible for deleting local dynamic objects when the TTL expires. If the object is deleted before the TTL reaches 0 (zero), the object is Tombstoned (but the object's TTL is retained and continues to decrement), and it is treated as any other Tombstone (i.e. it can be updated\changed) until the TTL reaches 0 (zero), when it will be removed from the partition (after a short delay).
The TTL is specified in the constructed attribute entryTTL of a dynamicObject, and the value (in seconds) can range from 15 minutes (the default minimum in AD; CN=DynamicObjectMinTTL,CN=Directory Service,CN=Windows NT,CN=Services,CN=Configuration,DC=<ForestName>) to 365 days and 8 hours. If entryTTL is not specified for a newly created dynamicObject then it is assigned a default value of 24 hours (CN=DynamicObjectDefaultTTL,CN=Directory Service,CN=Windows NT,CN=Services,CN=Configuration,DC=<ForestName>). If an object is specified as dynamic at creation time and the entryTTL is less than that specified in Active Directory by DynamicObjectMinTTL, then the object will have an entryTTL as specified by DynamicObjectMinTTL, i.e. the lower value will be ignored. The actual value of entryTTL is calculated from the value stored in msDS-Entry-Time-To-Die, which contains the date and time that the object will be deleted.
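The clamping behaviour described above boils down to simple arithmetic. Here is a hypothetical helper (not an AD API) using the out-of-the-box default and minimum values from the text; treating the 365 days and 8 hours figure as an upper clamp is my assumption:

```python
# Out-of-the-box values of DynamicObjectDefaultTTL / DynamicObjectMinTTL
# described above, expressed in seconds.
DEFAULT_TTL = 24 * 60 * 60                   # 86400 s = 24 hours
MIN_TTL = 15 * 60                            # 900 s = 15 minutes
MAX_TTL = 365 * 24 * 60 * 60 + 8 * 60 * 60   # 365 days and 8 hours

def effective_entry_ttl(requested=None):
    """Return the entryTTL a newly created dynamicObject would get."""
    if requested is None:
        return DEFAULT_TTL  # no entryTTL supplied at creation time
    # Values below DynamicObjectMinTTL are ignored in favour of the minimum;
    # clamping at the stated maximum is assumed here.
    return max(MIN_TTL, min(requested, MAX_TTL))

print(effective_entry_ttl())      # 86400 (default 24 hours)
print(effective_entry_ttl(60))    # 900   (below minimum, raised to MinTTL)
print(effective_entry_ttl(3600))  # 3600  (within range, used as-is)
```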
It is worth noting that the dynamicObject auxiliary class, although seemingly perfect for objects stored in the DNS Application Partitions, is not actually used there.
I carried out some rudimentary testing of the characteristics of dynamicObjects (containers and leaf objects) and it was interesting to see that a dynamic container whose entryTTL is lower than that of its descendants (i.e. the child objects have higher entryTTL values) will decrement to 0 (zero), but will then assume the entryTTL of the child object with the highest calculated entryTTL. In other words, a dynamic container will not be automatically deleted before its child dynamic objects.
You can do some rudimentary testing of dynamicObjects (containers or objects) by just creating an object and specifying the objectClass of dynamicObject at creation time. You can then monitor what happens to the object (and entryTTL) in certain scenarios using LDP. I found the easiest way to create dynamic container Objects for testing was to use LDIFDE and supplying it with an ldf file named createdynamicobjectcontainer.ldf with the following entries:-
dn: cn=test,dc=domain,dc=local
changetype: add
objectClass: container
objectClass: dynamicObject
entryTTL: 900
The command line to create the container is “ldifde -v -i -f createdynamicobjectcontainer.ldf /j c:\”
You can then create a dynamic object under the dynamic container by using an ldf file named createdynamicobject.ldf with the following entries:-
dn: cn=jsmith,cn=test,dc=domain,dc=local
changetype: add
objectClass: user
objectClass: dynamicObject
sAMAccountName: jsmith
entryTTL: 900
The command line to create the object is “ldifde -v -i -f createdynamicobject.ldf /j c:\”
This information was harvested from these links and references:
http://www.faqs.org/rfcs/rfc2589.html
http://msdn.microsoft.com/en-us/library/cc223463(PROT.10).aspx
http://msdn.microsoft.com/en-us/library/cc223446(PROT.10).aspx
http://msdn.microsoft.com/en-us/library/cc200600(PROT.10).aspx
http://msdn.microsoft.com/en-us/library/cc201014.aspx
http://my.safaribooksonline.com/0596004648/activedckbk-CHP-4-SECT-14#X2ludGVybmFsX1NlY3Rpb25Db250ZW50P3htbGlkPTA1OTYwMDQ2NDgvYWN0aXZlZGNrYmstQ0hQLTQtU0VDVC0xNg==
Inside Active Directory (Second Edition) – A System Administrator's Guide - ISBN-10: 0321228480
If you read nothing else in this blog post, take a look at this http://msexchangeteam.com/archive/2006/06/15/427966.aspx .
Recently I was faced with a situation where a DC was logging Event ID 623 events in the Directory Services event log. I had to do quite a bit of searching for information on these events and came to the conclusion that AD guys don't seem to have as many issues with the AD database as the Exchange guys. The reason I came to this conclusion is that the majority of the information relating to the JET database (or at least to the Version Store) that services both AD and Exchange is written in an Exchange context. Basically, the event relates to Version Store exhaustion.
So before we go any further I thought I’d better refresh on what the Version Store is and found a couple of explanations. The official explanation is here http://technet.microsoft.com/en-us/library/cc772829.aspx , but I prefer simpler wording (being a simple guy) and think this article helps my understanding better “What is Version Store?” - http://msexchangeteam.com/archive/2006/04/19/425722.aspx ; yes I know it’s by Exchange guys, but that’s how this research generally panned out and it is still relevant. This is an extract and there is also a pretty good explanation of the event ID 623 in the rest of the article.
The Version Store keeps an in-memory list of modifications made to the database. This list has several uses:
1. Rollback - If a transaction needs to rollback it looks in the Version Store to get the list of operations it performed. By performing the inverse of all the operations the transaction can be rolled-back.
2. Write-conflict detection - If two different sessions try to modify the same record the Version Store will notice and reject the second modification.
3. Repeatable reads - When a session begins a transaction it always sees the same view of the database, even if other sessions modify the records it is looking at. When a session reads a record the Version Store is consulted to determine what version of the record the session should see.
4. Deferred before-image logging - A complicated optimization that lets us log less data than "other" database engines.
In simple terms, the Version Store is where transactions are held in memory until they can be written to disk. If something is preventing us from completing a transaction or writing to disk we will consume this cache and the store will stop responding to requests until there is room in the cache again.
By default the maximum amount of memory that can be allocated to the Version Store is the lesser of one quarter of physical memory or 100 MB. This can be increased via the REG_DWORD value HKLM\SYSTEM\CurrentControlSet\Services\NTDS\Parameters\”EDB max ver pages (increment over the minimum)”, but it is advised that this only be a temporary measure to ease a specific issue. The value represents 16-kbyte pages, e.g. a value of 1000 would be 1000 x 16 kbytes.
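As a quick sanity check, the sizing rules above can be sketched as follows (purely illustrative arithmetic based on the defaults just described; the function names are mine, not any real API):

```python
def version_store_max_bytes(physical_ram_bytes):
    """Default Version Store cap: the lesser of 1/4 of physical RAM or 100 MB."""
    return min(physical_ram_bytes // 4, 100 * 1024 * 1024)

def increment_bytes(edb_max_ver_pages):
    """'EDB max ver pages (increment over the minimum)' counts 16-kbyte pages."""
    return edb_max_ver_pages * 16 * 1024

# A DC with 4 GB of RAM hits the 100 MB ceiling (1 GB would otherwise apply),
# while a registry value of 1000 adds 1000 x 16 kbytes over the minimum.
```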
Anyway, back to the reason for this article. I had my suspicions what was causing the 623 events (combination of an automated provisioning solution and\or Garbage Collection of ridiculous amounts of tombstones (another story) and underspec hardware; it was old tin). So, I decided that firstly, we needed to get some performance data, but the performance counters for monitoring the Version Store are not readily available and I needed to do some searching. Once again the majority of the information I found came from Exchange resources, but I found this http://technet.microsoft.com/en-us/library/cc961947.aspx which relates to Windows 2000, but is still valid. The results of the findings I will have to leave for another article.
To be able to view the allocated (16kb) pages of the version store you will need to carry out the following process:-
1. Copy the performance DLL (Esentprf.dll) located in %SystemRoot%\System32 to any directory (for example, C:\Perf).
2. Run Regedit.exe and make sure that the following registry subkeys exist: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\ESENT and HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\ESENT\Performance. If these subkeys do not exist, you need to create them. For more information about creating registry subkeys, see Windows 2000 Server Help.
3. Make sure that, under the Performance subkey, the registry values that have the following settings exist:
a. Open : data type REG_SZ : OpenPerformanceData
b. Collect : data type REG_SZ : CollectPerformanceData
c. Close : data type REG_SZ : ClosePerformanceData
d. Library : data type REG_SZ : c:\perf\esentprf.dll
e. Show Advanced Counters : data type REG_DWORD : 1 (This value used to be called “Squeaky Lobster” and is pretty well known in Exchange circles; there is an interesting blog on why it was called this from Brett Shirley here http://msexchangeteam.com/archive/2006/06/15/427966.aspx . It still works if you use “Squeaky Lobster” instead of “Show Advanced Counters”.)
4. Open a command prompt and change directory to %SystemRoot%\System32, or to another folder that contains the files Esentprf.ini and Esentprf.hxx generated when Esentprf.dll was compiled.
5. (Optional) To verify that previous counter information is not present in the registry, at the command prompt type unlodctr.exe ESENT.
6. To load the counter information into the registry, at the command prompt run Lodctr.exe Esentprf.ini.
To view the counters for the Database object, restart Performance Monitor. The counters are under Database and the one you need in order to monitor the Version Store size is “Version Buckets Allocated”. This shows the number of 16-kbyte pages allocated in the Version Store.
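Steps 2 and 3 above can also be applied in one go with a .reg file. A sketch (it assumes the DLL was copied to C:\Perf as in step 1; adjust the Library path if you used a different folder):

```reg
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\ESENT]

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\ESENT\Performance]
"Open"="OpenPerformanceData"
"Collect"="CollectPerformanceData"
"Close"="ClosePerformanceData"
"Library"="c:\\perf\\esentprf.dll"
"Show Advanced Counters"=dword:00000001
```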
Did you know that the objectClass attribute in an Active Directory database is not indexed in pre-Windows 2008 Active Directory? This really isn't an issue with an efficiently formed LDAP query filter such as (&(objectCategory=person)(objectClass=user)), which takes advantage of indexed attributes, but if you were to use just (objectClass=user) then your query would have to examine every object with the objectClass attribute populated to see if there was a match (how many LDAP queries have you seen use this filter?). So, with a database of 100,000 objects, if all you were looking for were the user account objects (say 10,000) you would be parsing all 100,000 objects for a result set; not the most efficient search. So why wasn't objectClass indexed? These are the reasons that a little research (and trawling through forums and blogs) has provided.
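To make the cost difference concrete, here is a toy simulation of the 100,000-object example above (purely illustrative; Python dictionaries standing in for directory objects, not how the database engine actually evaluates queries):

```python
from collections import defaultdict

# Toy directory: 100,000 objects, 10,000 of which are users (as in the example above).
directory = [
    {"objectCategory": "person", "objectClass": ["top", "person", "user"]}
    for _ in range(10_000)
] + [
    {"objectCategory": "computer", "objectClass": ["top", "computer"]}
    for _ in range(90_000)
]

# A hypothetical index on objectCategory (an indexed attribute in AD).
index = defaultdict(list)
for obj in directory:
    index[obj["objectCategory"]].append(obj)

# (objectClass=user) with no index on objectClass: every object must be examined.
scanned_unindexed = len(directory)

# (&(objectCategory=person)(objectClass=user)): the index narrows the candidate
# set first, and only those candidates are then tested against objectClass.
candidates = index["person"]
scanned_indexed = len(candidates)
matches = [o for o in candidates if "user" in o["objectClass"]]
```

The unindexed filter touches all 100,000 objects; the indexed one touches only the 10,000 candidates.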
It is worth being aware that indexing attributes can make queries slower. Consider the query (&(objectClass=user)(samAccountName=[uniquevalue])), which will execute faster if objectClass is not indexed. In both cases the query processor will end up choosing the index on samAccountName to do the enumeration, but in the case where objectClass is indexed it will waste time evaluating how tightly the objectClass index encloses the result set (answer: not very). Doing that evaluation costs time and I/O.
The above information was harvested from the following sources:
http://blog.joeware.net/2007/03/24/831/
http://blog.joeware.net/2005/12/08/147/
http://www.activedir.org/ListArchives/tabid/55/forumid/1/tpage/1/view/Topic/postid/31737/Default.aspx
http://www.frickelsoft.net/blog/?p=147
I had an interesting issue recently and I thought it would be useful to share the information. I was working in a test environment, going through some DR scenarios, and noticed that the DCs were taking a long time to boot into normal mode after booting into DSRM. This was true even if I did nothing in DSRM other than log on and restart.
DC’s were Windows 2003 SP2 x64
Originally 2 DCs in the environment, but the issue was more pronounced with more DCs (3 onwards, i.e. more replication partners)
IPSec was utilized between DCs (configured via Group Policy; using certificates, but the same issue was seen using Kerberos)
After rebooting from DSRM the Domain Controller takes an unusual length of time (6-15 minutes from “Applying Network Settings” to the actual logon prompt, dependent on the number of DCs in the environment) to enable logon at the console. “Applying Network Settings” takes approximately 5-6 minutes and when the logon dialog box appears, it usually does not display the logon domain for about another 5-10 minutes, by which time a user can log on. When carrying out the same procedure on x86 DCs the timings were considerably reduced; I haven't had time for further investigation.
Testing, log analysis and research found that this is expected behaviour. This is the high-level theory of what is occurring.
During testing I tried a number of things to resolve the issue, 3 of which worked (listed below), but only item 2 was a satisfactory fix for most production environments where you seriously need IPSec between DCs.