Hey, Scripting Guy! How Do I Back Up Files That Have Been Modified in the Last Day?


Hey, Scripting Guy! Question

Hey, Scripting Guy! I have a USB drive that I use for backup purposes on my computer. Because it appears that Windows Vista does not have a backup program that I can use, I thought it would be useful to write a script using Windows PowerShell to back up the computer. To make things simple, I have all my data in a folder called "data." I only want to copy files that have changed within the last day to a folder on the USB drive that is automatically created based upon the date and time. Can this be done easily?

- MS

Hey, Scripting Guy! Answer

Hi MS,

First of all, Windows Vista does have a backup and restore program. It is actually pretty cool and does a very good job; here is an article that talks about this feature. What the redesigned program does not do is selective backup and restore. As a computer professional, I find this missing feature somewhat annoying. Then again, most desktop and laptop users have never run a backup program in their lives and are not fully in tune with the backup process. So while the ability to do a full system backup and restore is useful, it does not meet our needs here. Luckily, we can easily solve this problem with Windows PowerShell.

For information about working with files and folders in Windows PowerShell, see this article. You may also wish to see the section of "What Can I Do with Windows PowerShell" that deals specifically with files and folders. The Windows PowerShell scripting hub is a good place to get started with Windows PowerShell, and it includes download links and other items of use. The Sesame Script archive has several VBScript articles dealing with files and folders; they are worth a look for VBScript examples. The community script repository has a good selection of files and folders scripts in VBScript. You also may want to check the Hey, Scripting Guy! archive for a number of VBScript examples of working with files and folders. This script deletes files older than a certain number of hours, which illustrates part of what we are doing here.

In today’s script, we create two functions. The first function creates the backup folder based upon the date and time the script is run. The second function backs up the files that have been modified in the last day. You can of course change the destination, the source, and the interval. The script is seen here:

Function New-BackUpFolder($destinationFolder)
{
 $dte = get-date
 $dte = $dte.tostring() -replace "[:\s/]", "."
 $backUpPath = "$destinationFolder" + $dte
 $null = New-Item -path $backUpPath -itemType directory
 New-Backup $dataFolder $backUpPath $backUpInterval
} #end New-BackUpFolder

Function New-Backup($dataFolder,$backUpPath,$backUpInterval)
{
 "backing up $dataFolder... check $backUpPath for your files"
 Get-Childitem -path $dataFolder -recurse |
 Where-Object { $_.LastWriteTime -ge (get-date).addDays(-$backUpInterval) } |
 Foreach-Object { copy-item -path $_.FullName -destination $backUpPath -force }
} #end New-BackUp

# *** entry point to script ***

$backUpInterval = 1
$dataFolder = "C:\fso"
$destinationFolder = "C:\BU\"
New-BackupFolder $destinationFolder

The essential functionality of this script is contained in two functions. The advantage of this approach to scripting is that it makes the script easier to read, easier to test, easier to modify, and easier to reuse if you should desire to do so later. A good function should encapsulate a single thought, idea, or piece of functionality. In addition, it is a best practice to follow the verb/noun naming convention set up by the Windows PowerShell team. This will become much more important in Windows PowerShell 2.0.
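To illustrate the verb/noun convention, here is a minimal sketch of a well-named helper. The function name Get-RecentFile is our own invention for this example (it is not part of the backup script), but it follows the same pattern the script's functions use: an approved verb, a hyphen, and a singular noun.

```powershell
# A minimal sketch of the verb/noun naming convention. Get-RecentFile is a
# hypothetical helper, not part of the article's script: it returns the
# items under a folder whose last write time falls within the given number
# of days.
Function Get-RecentFile($folder, $days)
{
 Get-ChildItem -Path $folder -Recurse |
 Where-Object { $_.LastWriteTime -ge (Get-Date).AddDays(-$days) }
} #end Get-RecentFile

# Usage: list items under C:\fso changed in the last day
# Get-RecentFile "C:\fso" 1
```

Because the function encapsulates a single idea, it can be tested on its own and reused later in other scripts.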

The objective of our script is to back up files that have been modified in the last day. Here is the target folder we will use:

Image of the target folder

 

The first function begins with the function keyword, and names the new function New-BackUpFolder. It takes a single input, which is the path to the destination folder. This is seen here:

Function New-BackUpFolder($destinationFolder)

Next we use the get-date cmdlet to retrieve the current date and time. We store this in a variable named $dte as seen here:

$dte = get-date

Now we want to turn the datetime object returned by the get-date cmdlet into a string. To do this, we use the tostring() method. The resulting string looks like this:

PS C:\> (get-date).tostring()
12/12/2008 1:00:59 PM

The problem is that we are not allowed to have some of those characters in a folder name. The answer is to use a regular expression pattern to fix the string so that we can use it for a folder name. The -replace operator allows us to use regular expressions. Regular expressions are discussed in the Microsoft Press book, Windows PowerShell Scripting Guide; they are also documented on MSDN.

The first thing the pattern does is replace the colon with a period, which is the same as:

PS C:\> (get-date).tostring() -replace ":", "."
12/12/2008 1.09.53 PM

The next thing the pattern does is use \s to replace all the spaces with a period. This is seen here:

PS C:\> (get-date).tostring() -replace "\s", "."
12/12/2008.1:11:07.PM

The last thing the pattern does is replace the forward slash with a period. This is seen here:

PS C:\> (get-date).tostring() -replace "/", "."
12.12.2008 1:11:56 PM

When you put them all together, you get the pattern seen here. The square brackets define a character class: the pattern matches any single character in the set, so one -replace operation handles the colon, the spaces, and the forward slash all at once:

$dte = $dte.tostring() -replace "[:\s/]", "."
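As an aside (this check is our own addition, not part of the original script), the .NET method [System.IO.Path]::GetInvalidFileNameChars() returns the characters Windows forbids in file and folder names, so we can confirm that the sanitized stamp is safe to use:

```powershell
# Sanity check (our own aside, not part of the article's script): verify
# that the sanitized date stamp contains none of the characters Windows
# forbids in file and folder names.
$dte = (Get-Date).ToString() -replace "[:\s/]", "."
$invalid = [System.IO.Path]::GetInvalidFileNameChars()
$bad = $dte.ToCharArray() | Where-Object { $invalid -contains $_ }
if ($bad) { "Still invalid: $bad" } else { "OK: $dte is a valid folder name" }
```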

The complete path to the backup location is a combination of $destinationFolder and the date/time stamp that we created in the previous step. This is stored in the variable $backUpPath as seen here:

$backUpPath = "$destinationFolder" + $dte

Now we want to create the backup folder. To do this, we use the New-Item cmdlet. We give this cmdlet the path we stored in the $backUpPath variable, and tell it we are creating an itemtype that is a directory. This is seen here:

$null = New-Item -path $backUpPath -itemType directory

The result of this section of the script is a folder, which is seen in the image that follows this paragraph. As the image shows, a new subfolder is created each time the script is run. Because the folder name includes the time down to the second, you can run the script once a second and never receive an error; run it more than once in the same second, however, and you will get a duplicate folder error message:
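If you expect the script to run more often than once a second, one defensive variation (our own sketch, not part of the original script) is to test for the folder before creating it:

```powershell
# A defensive variation (our own sketch, not in the original script): only
# create the backup folder when it does not already exist, so that running
# the script twice in the same second cannot raise a duplicate-item error.
$backUpPath = Join-Path $env:TEMP "BU.demo"   # demo path; the real script builds this from the date
if (-not (Test-Path -Path $backUpPath))
{
 $null = New-Item -Path $backUpPath -ItemType directory
}
```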

Image of each new subfolder created every time the script is run

 

Now we need to call the New-Backup function. The New-Backup function takes three parameters: $dataFolder, $BackUpPath, and $backUpInterval. This is seen here:

New-Backup $dataFolder $backUpPath $backUpInterval

In the code we use to create the New-Backup function, we define the three input parameters. Note that if one of these values is missing when the function is called, the corresponding parameter is simply $null, and the commands inside the function will fail with an error:

Function New-Backup($dataFolder,$backUpPath,$backUpInterval)
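If you would rather have the function fail fast with a clear message when an argument is omitted, one hedged alternative (our own variation, with a stub body and our own error wording) is to give each parameter a throw expression as its default value; the default only evaluates when the caller leaves that parameter out:

```powershell
# A sketch of defensive parameter handling (our own variation, not the
# article's function): each parameter defaults to a throw expression, so
# calling the function without an argument fails immediately with a
# readable message instead of running with $null values.
Function New-Backup(
 $dataFolder = $(throw "dataFolder is required"),
 $backUpPath = $(throw "backUpPath is required"),
 $backUpInterval = $(throw "backUpInterval is required"))
{
 "backing up $dataFolder... check $backUpPath for your files"
} #end New-Backup
```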

We print out a message that lets the user know that the script is backing up the folder, and we also print the destination that is being used. This is seen here:

"backing up $dataFolder... check $backUpPath for your files"

Get-ChildItem is used to retrieve a listing of all the files in the folder. The collection of files is sent across the pipeline to the Where-Object cmdlet, which filters the files:

Get-Childitem -path $dataFolder -recurse |

We use Where-Object to filter on the LastWriteTime attribute of the files. If a file's last write time is on or after the date obtained by subtracting $backUpInterval days from the current date, we copy the file. To compute that cutoff date, we use the addDays method of the datetime object returned by the Get-Date cmdlet. The cool thing is that addDays accepts a negative number, which means you do not need a subtractDays method. This is seen here:

Where-Object { $_.LastWriteTime -ge (get-date).addDays(-$backUpInterval) } |
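As a quick aside (our own demonstration, not part of the script), you can confirm that addDays happily takes a negative value and produces a cutoff one day in the past:

```powershell
# Demonstrate that AddDays accepts a negative number: the cutoff is simply
# one day before the current moment, so no "SubtractDays" method is needed.
$cutoff = (Get-Date).AddDays(-1)
$span = (Get-Date) - $cutoff
"Cutoff is $($span.Days) day(s) (plus a few milliseconds) in the past"
```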

To work with each individual file, we use the Foreach-Object cmdlet. Each file that makes it through the Where-Object filter is copied by the Copy-Item cmdlet, which reads the source path from the fullname property of the current item on the pipeline. The destination is the folder whose path is stored in $backUpPath. The -force parameter is not strictly needed, but it ensures that an existing file is overwritten rather than raising an error:

Foreach-Object { copy-item -path $_.FullName -destination $backUpPath -force }

The entry point to the script defines three variables and calls the New-backUpFolder function, as seen here:

$backUpInterval = 1
$dataFolder = "C:\fso"
$destinationFolder = "C:\BU\"
New-BackupFolder $destinationFolder

When we look at the results of the script, we see that only a couple of the files were actually copied. These are the recently changed ones, which was our objective:

Image of the backup folder showing that only the recently changed files were copied

 

MS, it looks as if we wound up with a pretty cool script. It is something that should prove useful to you. See you tomorrow—the last day of 2008.

Ed Wilson and Craig Liebendorfer, Scripting Guys

Comments
  • Script Dudes

    I am trying this from your post:

    get-childitem -Path d:\scripts -recurse |

    where-object {$_.lastwritetime -gt (get-date).addDays(-1)} |

    Foreach-Object { $_.FullName }

    I get:

    D:\scripts\Data_Files

    D:\scripts\Power_Shell

    D:\scripts\Data_Files\BackUp_Test.txt

    D:\scripts\Power_Shell\test.ps1

    Note the first two lines are folders, how can I skip folders and list just files?

  • This is a really great script, but how can I modify it so the files stay within their folders in the backup? Basically, I want the file structure to be maintained.

    Thanks

  • I use this for the date:

    $dte = [Datetime]::Now.ToString("MMddyyyy")

    I think it's easier to read and shorter too.

    Very nice backup script.