Archiving File Shares with Windows PowerShell

Summary: Learn how to use Windows PowerShell to archive file shares.

 

Hey, Scripting Guy! I am interested in using Windows PowerShell to archive file shares. Is this something that can be done?

-- ST

 

Hello ST, Microsoft Scripting Guy Ed Wilson here. Welcome to guest blogger week. We welcome back Tibor Soos, one of our MVPs in Hungary. To learn more about Tibor and to see his first guest blog post, please follow this link. Without further ado, here is Tibor.

I frequently give on-site presentations for System Administrators and for their managers so that they can discover the great potential of Windows PowerShell and how Windows PowerShell scripts can help them save lots of money and/or time to reach their operational goals. In this article, I selected one of my favorite examples.

The task is archiving data on file shares. File shares are still heavily used, and the amount of data on them grows continuously. Backing up this data, and making sure that old, “historical” data is not deleted or modified unintentionally, is always challenging. Therefore, I created a script that moves files that have not been modified within the last 30 days (by default) from the “live” data folder or share to a read-only “archive” folder or share. At the same time, it leaves a shortcut at the original location that points to the archived copy, so that users can still access these files easily. Because users cannot modify the content of the read-only archive, it only has to be backed up once after you run this script. This saves backup media and management costs.

Here is the whole script:

Backup-FileShares.ps1

param(

    [Parameter(

        Mandatory = $true,

        Position = 0,

        HelpMessage = "Root of the folders or share to archive"

    )]

    [String] $source,

 

    [Parameter(

        Mandatory = $true,

        Position = 1,

        HelpMessage = "Path of the folder or share of archive"

    )]

    [string] $target,

 

    [Parameter(

        Mandatory = $false,

        Position = 3

    )]

    [int] $days = 30

)

 

# Object created for shortcut creation

$wsh = new-object -comobject wscript.shell 

 

# Get all files under the source path that are not shortcuts and are older than the given number of days

Get-ChildItem $source -Recurse |  

        Where-Object {!$_.psiscontainer -and ((get-date) - $_.lastwritetime).totaldays -gt $days -and $_.extension -ne ".lnk"} |

            ForEach-Object {

# For each file build the destination path 

                $dest = $_.fullname -replace ([regex]::escape($source)), $target

 

# Check if the destination file path has the parent directory, if not create it

                $parent = split-path $dest 

                if(!(test-path $parent)){

                    [void] (New-Item -Path $parent -ItemType Directory)

                }

 

# Save the modification date and the ACL of the file for later use

                $date = $_.lastwritetime

                $acl = $_ | Get-Acl

# Try to move the file into the destination

                Move-Item -Path $_.fullname -Destination $dest -ErrorAction silentlycontinue

 

# If successful create shortcut

                if($?){

                    $shortCut = $wsh.CreateShortCut("$($_.fullname).lnk")    

                    $shortCut.TargetPath = $dest 

                    $shortCut.Save() 

 

# Set the "date modified" property of the shortcut same as date modified property of the original file

                    (Get-Item "$($_.fullname).lnk").lastwritetime = $date

 

# Replace the access control entries on the shortcut, so that users have read only access to it               

                    $acl.SetAccessRuleProtection($true,$true)

                    $acl | Set-Acl -Path "$($_.fullname).lnk"

                    $acl = Get-Item "$($_.fullname).lnk" | Get-Acl

                    $acl.Access | where-object {"BUILTIN\Administrators" -ne $_.identityreference -and "NT AUTHORITY\SYSTEM" -ne $_.identityreference} |

                        ForEach-Object {

                            $identity = $_.identityreference

                            [void] $acl.RemoveAccessRule($_)

                            $restrictedACE = New-Object System.Security.AccessControl.FileSystemAccessRule($identity,"ReadAndExecute","Allow")

                            $acl.AddAccessRule($restrictedACE)

                        }

                    $acl | Set-Acl

                }

# Else write error message

                else { Write-Host "Error moving $_" -ForegroundColor Red }

            }

 

If you save this file as, for example, Backup-FileShares.ps1, you can call it from Windows PowerShell as follows:

c:\your-path-to-the-scriptfile\Backup-FileShares.ps1 -source \\server1\path-to-archive -target \\server2\path-of-archive -days 60

 

Of course, you need the relevant admin rights on both the source server and the target server. You can also use local folder paths instead of UNC share names. The source code is commented, but let's walk through it so that I can make some additional remarks.

The first section is the parameter definition part. The source and the target are mandatory. If you omit the days parameter, the default value is 30 days.

Next, I create the wscript.shell COM object, which helps me create the shortcuts later. Then I list all file objects under the source path that have not been modified within the given number of days and that are not shortcut files.
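As a quick aside, here is the WScript.Shell shortcut API in isolation. This is a minimal sketch with illustrative paths (COM is Windows-only, so run it on a Windows box):

```powershell
# Create a .lnk shortcut with the WScript.Shell COM object (Windows only).
# Both paths below are illustrative.
$wsh = New-Object -ComObject WScript.Shell
$shortcut = $wsh.CreateShortcut("C:\Data\report.docx.lnk")  # the .lnk file to create
$shortcut.TargetPath = "\\server2\archive\report.docx"      # where the shortcut points
$shortcut.Save()
```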

With these files, I run a ForEach-Object loop, in which the first task is to compute the new path of the file in the archive:

$dest = $_.fullname -replace ([regex]::escape($source)), $target

 

I just replace the source part of the file's full path with the destination path. Because paths generally include characters that are reserved in regular expressions, I escape them by using the static Escape method of the .NET Framework Regex class. This trick is rarely mentioned in Windows PowerShell books and articles, but I have found it very useful. So when you do not know exactly what you want to find or replace with a regex, I recommend always escaping the search string. Here is an example of how it works:

PS C:\> [regex]::Escape("\\server\share\somefile.ext")

\\\\server\\share\\somefile\.ext

 

You can see that all problematic characters are escaped automatically.

Next, because I want to place the files into the same folder structure in the archive as the original one, I have to build this structure there. So I use the Split-Path cmdlet to store the path of the parent folder of the archive-file-to-be in the $parent variable. Then I check whether this folder already exists; if not, I create it with the New-Item cmdlet. Let's stop here for a moment! What happens if the file to be moved to the archive is located in a deep folder hierarchy such as this?

\\server1\share\mainfolder\subfolder\subsubfolder\thefile.docx

 

If the target is \\server2\share, then the parent folder of the archived file is as follows:

\\server2\share\mainfolder\subfolder\subsubfolder

 

But the parent folder of this folder may not exist yet either! Fortunately, the New-Item cmdlet is clever enough to resolve this problem by creating all the necessary path elements, including every nonexistent folder in the path. However, this is true only for directories, that is, only when you use -ItemType Directory; it does not work for files. If I tried to create a file in a nonexistent folder, New-Item would give an error. But for now I do not have to create new files; I'm just moving existing ones.
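This behavior is easy to verify with a quick sketch (the folder names are illustrative; -Force just makes the line safe to re-run):

```powershell
# New-Item -ItemType Directory creates every missing level of the path in one call.
$base = [System.IO.Path]::GetTempPath()
$deep = Join-Path $base "archive-demo/mainfolder/subfolder/subsubfolder"
[void] (New-Item -Path $deep -ItemType Directory -Force)
Test-Path $deep   # True: the intermediate folders were created as well
```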

If New-Item can create the folder successfully, it also gives an output which I’m not interested in, so I suppress it by converting the result to [void].

Remark: There are at least three ways of suppressing output: casting the output to [void], redirecting it to $null, and piping it to the Out-Null cmdlet. I did an experiment to determine which is the most efficient. I measured the three options with the Measure-Command cmdlet; here are the results:

PS C:\> measure-command {1..1000 | %{$_ > $null}} | Select-Object -ExpandProperty milliseconds

238

PS C:\> measure-command {1..1000 | %{$_ | out-null }} | Select-Object -ExpandProperty milliseconds

405

PS C:\> measure-command {1..1000 | %{[void] $_}} | Select-Object -ExpandProperty milliseconds

180

As you can see, the type conversion method is the fastest, so I selected this option in my script (not because I really needed the performance benefit here, but to satisfy my engineer self).

So we are at the point where the file's parent folder exists in the archive. I then save the modification date of the file to be moved into the variable $date and its access control list into $acl for later use, and I try to move the file into the archive. The file may be locked, in which case it cannot be moved. I do not want to show the error message; I'd rather handle this situation myself, so I used the SilentlyContinue error action. I then check the $? automatic variable, which shows whether the last command succeeded. If it is $true, I know that the move was successful. In this case, I first create the shortcut with the wscript.shell object that was stored in the variable $wsh before. The name of this shortcut is the same as the file name plus an additional .lnk extension, and the shortcut points to the destination path.
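You can see the $? automatic variable in isolation with a small sketch, assuming a file name that does not exist in the current directory:

```powershell
# $? reflects whether the most recent command succeeded.
Get-Item "no-such-file-here.txt" -ErrorAction SilentlyContinue
$afterFailure = $?    # $false: Get-Item reported a (suppressed) error
Get-Date | Out-Null
$afterSuccess = $?    # $true: Get-Date ran without error
```

Note that -ErrorAction SilentlyContinue hides the error message but still records the failure in $?, which is exactly why the script can branch on it after Move-Item.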

Because I want the shortcut to resemble the original file as much as possible, I set the modification date of the shortcut to the modification date of the original file. This time stamp was stored earlier in the variable $date.

I also want to map the access control of the original file onto the shortcut. I want to grant read-only access to the shortcut to everyone who had any access to the original file, but without changing the access of the System or Administrators accounts. This is the relevant code from the script:

$acl.SetAccessRuleProtection($true,$true)
$acl | Set-Acl -Path "$($_.fullname).lnk"
$acl = Get-Item "$($_.fullname).lnk" | Get-Acl
$acl.Access | Where-Object {"BUILTIN\Administrators" -ne $_.identityreference -and "NT AUTHORITY\SYSTEM" -ne $_.identityreference} |
    ForEach-Object {
        $identity = $_.identityreference
        [void] $acl.RemoveAccessRule($_)
        $restrictedACE = New-Object System.Security.AccessControl.FileSystemAccessRule($identity,"ReadAndExecute","Allow")
        $acl.AddAccessRule($restrictedACE)
    }
$acl | Set-Acl

 

First, I remove the inheritance flag by calling the SetAccessRuleProtection method. With the two $true parameters, I copy the inherited access control entries as explicit entries. Then I commit these changes by applying this DACL to the shortcut. I need this step because, at this point, the access control entries within this DACL still have the IsInherited flag set to $true. Unfortunately, the RemoveAccessRule method cannot remove these kinds of ACEs, and the IsInherited property is read-only, so this is the easiest way of resetting these ACEs.

Remark: The RemoveAccessRule method does not give any error message when removing ACEs with the IsInherited flag on, it even outputs $true, indicating that the removal was successful. But when you check the ACEs you will discover that these entries are still there. This strange behavior caused me some headaches.
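To see the IsInherited flags for yourself, you can dump the ACEs of any file; this sketch uses a throwaway temporary file, but any readable path works (Get-Acl requires Windows):

```powershell
# List each ACE of a file's DACL together with its IsInherited flag (Windows only).
$file = New-TemporaryFile                       # any file you can read works here
(Get-Acl $file.FullName).Access |
    Select-Object IdentityReference, FileSystemRights, IsInherited
```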

After setting the ACL, I read it back from the shortcut. Now it contains only explicit ACEs, all with the correct IsInherited flags. Then I filter the ACEs with Where-Object, because I do not want to change the System and Administrators access. For each of the remaining ACEs, I save the user or group in $identity, remove the ACE, and put back a replacement ReadAndExecute entry for the same identity. Finally, I apply the new ACL to the shortcut again.

Actually, that is it. If the file move was unsuccessful, I write a message to the console.

The following figures detail the user experience. These are the files here before archiving:

 

And this figure shows the directory after archiving:

 

As you can see, the files that were older than 30 days are replaced by shortcuts. The Date modified column contains the same dates for the shortcuts as the original files. The icons are almost the same.

Let’s analyze the archive location now as seen in the following figure:

 

It has all the archived files and the same folder structure as the original data share.

This figure gives us a view of the file permissions on the shortcuts:

 

As you can see, the permissions include explicit permissions only, and even ModifyGroup, which had modify permissions on the original file, now has only read & execute permissions.

Of course, the script can be developed further to make adjustments to the access control list of the archive, but it is already useful in this form.

If you liked this post and need some help creating scripts, you can contact me by email:

Tibor Soós (soost@IQJB.hu)

Windows PowerShell MVP

www.IQJB.hu

  

ST, that is all there is to using Windows PowerShell to archive file shares. Thank you, Tibor, for your time and for sharing your knowledge with us. Guest blogger week will continue tomorrow when Trevor Sullivan will talk about dynamic method invocation.

I invite you to follow me on Twitter or Facebook. If you have any questions, send email to me at scripter@microsoft.com or post them on the Official Scripting Guys Forum. See you tomorrow. Until then, peace.

 

Ed Wilson, Microsoft Scripting Guy 

Comments
  • The script "works" for the most part. There is a link created, and the file is archived. However, I get the following error on every single file:

    Set-Acl : The security identifier is not allowed to be the owner of this object.

    At C:\Documents and Settings\johndoe\My Documents\Backup-FileShares.ps1:56 char:35

    +                     $acl | Set-Acl <<<<  -Path "$($_.fullname).lnk"

       + CategoryInfo          : InvalidOperation: (F:\Shared_Files... 1 clip.mov.lnk:String

      ) [Set-Acl], InvalidOperationException

       + FullyQualifiedErrorId : System.InvalidOperationException,Microsoft.PowerShell.Comma

      nds.SetAclCommand

  • Get the same thing - any reason why?  - Is this because it's a Windows 2003 source with a Windows 2008 R2 destination?  Is there an issue with archiving across the network?  (other than speed).  Would it be faster if it were on the same box - would it work without the error NC discussed?

  • Hi Scripting guy, great post! Exactly what I was looking for and works a charm :) Any hints how to retain the archived folders' Created date value or Date modified value? The files keep the original date stamps but since folders are created when the script is run they're assigned fresh values.

    I'd need to be able to track down the archived items by date, and the folder dates play a crucial role in this operation.

    Thanks in advance,

    Mikko

  • @Mikko Koskenkorva What you will want to do is to modify the code a bit to capture the original timestamps, and then write them to the copied files. Take a look at this Hey Scripting Guy blog article I wrote last week blogs.technet.com/.../weekend-scripter-use-powershell-to-set-word-document-time-stamps.aspx

  • Great script, thanks!

  • hi scripting guy,

    i use this script to archive

    ----

    param(

       [Parameter(

           Mandatory = $true,

           Position = 0,

           HelpMessage = "Root of the folders or share to archive"

       )]

       [String] $source,

       [Parameter(

           Mandatory = $true,

           Position = 1,

           HelpMessage = "Path of the folder or share of archive"

       )]

       [string] $target,

       [Parameter(

           Mandatory = $false,

           Position = 3

       )]

       [int] $days = 30

    )

    # Object created for shortcut creation

    $wsh = new-object -comobject wscript.shell  

    Get-ChildItem $source -Recurse |  

           Where-Object {!$_.psiscontainer -and $_.extension -eq ".xla" -and ((get-date) - $_.lastwritetime).totaldays -gt $days -and $_.extension -ne ".lnk"} |

               ForEach-Object {

    # For each file build the destination path  

                   $dest = $_.fullname -replace ([regex]::escape($source)), $target

    # Check if the destination file path has the parent directory, if not create it

                   $parent = split-path $dest  

                   if(!(test-path $parent)){

                       [void] (new-item -Path (split-path $parent) -Name (split-path $parent -leaf) -ItemType directory)

                   }

    #

                   $date = $_.lastwritetime

                   $acl = $_ | Get-Acl

    # Try to move the file into the destination

    ls $_.fullname >> c:\logs\test1.txt

                   Move-Item -Path $_.fullname -Destination $dest -ErrorAction silentlycontinue

               }

    ----

    then I start it with the rolle.pa1 file, like: .\Backup-FileSharesm.ps1 -source C:\logs -target C:\archive -days 30

    How can I say: when the script finds a folder, if rolle.ps2 exists, read the rolle from rolle.ps2 for archiving; otherwise use the default from rolle.ps1?

  • @NC and @Any Reason Yet,

    Try using the UNC path to your source and destination. This worked for me.

  • Hi, this is a great script. How can I redirect the output to a log file? Also, how can I exclude certain extensions from moving?

  • Am getting this when I execute the script any ideas?

    Missing ] at end of type token.

    At C:\Dell\Backup-FileShares.ps1:5 char:15

    +    [Parameter( <<<<

       + CategoryInfo          : ParserError: (Parameter(:String) [], ParseException

       + FullyQualifiedErrorId : EndSquareBracketExpectedAtEndOfType

    Have taken the script as is with the following:

    Backup-FileShares.ps1

    param(

     [Parameter(

          Mandatory = $true,

          Position = 0,

          HelpMessage = "Root of the folders or share to archive"

      )]

      [String] $source,

      [Parameter(

          Mandatory = $true,

          Position = 1,

          HelpMessage = "Path of the folder or share of archive"

      )]

      [string] $target,

      [Parameter(

          Mandatory = $false,

          Position = 3

      )]

      [int] $days = 30

    )

    # Object created for shortcut creation

    $wsh = new-object -comobject wscript.shell

    # Get all the files from the source path, that are not shortcuts and older than the days set

    Get-ChildItem $source -Recurse |  

           Where-Object {!$_.psiscontainer -and ((get-date) - $_.lastwritetime).totaldays -gt $days -and $_.extension -ne ".lnk"} |

               ForEach-Object {

    # For each file build the destination path

                   $dest = $_.fullname -replace ([regex]::escape($source)), $target

    # Check if the destination file path has the parent directory, if not create it

                   $parent = split-path $dest

                   if(!(test-path $parent)){

                       [void] (new-item -Path (split-path $parent) -Name (split-path $parent -leaf) -ItemType directory)

                   }

    # Save the modification date and the ACL of the file for later use

                   $date = $_.lastwritetime

                   $acl = $_ | Get-Acl

    # Try to move the file into the destination

                   Move-Item -Path $_.fullname -Destination $dest -ErrorAction silentlycontinue

    # If successful create shortcut

                   if($?){

                       $shortCut = $wsh.CreateShortCut("$($_.fullname).lnk")    

                       $shortCut.TargetPath = $dest

                       $shortCut.Save()

    # Set the "date modified" property of the shortcut same as date modified property of the original file

                       (Get-Item "$($_.fullname).lnk").lastwritetime = $date

  • It works great in my test lab...now for production..

    How can I output to a log the files that have moved and the files that had errors?

  • This is such a great post! I'm wondering what the best way to send the logs to a file and then possibly email the logs?

  • Awesome script, thanks a lot for sharing!