nice. thanks
nice. thanks
Thank you for the tip. I will for sure try that on my new Sharepoint Server.
Thank you!!!
Greetings all; someone posted a script above which, I just found out, has a bug in it. Here is the corrected script:

# Location of spbrtoc.xml
$spbrtoc = "\\znj1pdbapp01\E$\SQLDumps\Sharepointbackup\spbrtoc.xml"

# Days of backups that will remain after cleanup.
$days = 7

# Import the SharePoint backup report XML file
[xml]$sp = gc $spbrtoc

# Find the old backups in spbrtoc.xml
$old = $sp.SPBackupRestoreHistory.SPHistoryObject |
    ? { [DateTime]::Parse($_.SPStartTime) -lt ((Get-Date).AddDays(-$days)) }

if ($null -eq $old) {
    Write-Host "No reports of backups older than $days days found in spbrtoc.xml.`nspbrtoc.xml isn't changed and no files are removed.`n"
    break
}

# Delete the old backups from the SharePoint backup report XML file
$old | % { $sp.SPBackupRestoreHistory.RemoveChild($_) }

# Delete the physical folders in which the old backups were located
$old | % { Remove-Item $_.SPBackupDirectory -Recurse }

# Save the new SharePoint backup report XML file
$sp.Save($spbrtoc)
Write-Host "Backup entries older than $days days have been removed from spbrtoc.xml and the hard disk."

The original script subtracted $days from the system date to determine whether a backup was older than $days days. However, I discovered it was doing a string compare, so when the calendar crossed over to a new year, the current backup was being deleted along with the oldest. Converting SPStartTime to a DateTime before comparing resolves this bug. HNY; Kurt Zimmerman, Sr. DBA, Lefrak Organization
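Kurt's year-boundary bug is easy to reproduce. A minimal sketch (Python here purely for illustration; the timestamps and the US-style date format are assumed examples of what SPStartTime can look like, not values from an actual spbrtoc.xml):

```python
from datetime import datetime

# Two backup start times straddling a year boundary, in a
# locale-style month-first format (assumed for illustration).
older = "12/31/2014 23:00:00"   # last backup of the old year
newer = "01/02/2015 01:00:00"   # first backup of the new year

# Lexicographic (string) comparison: "01..." sorts before "12...",
# so the NEWER backup looks older and would be deleted.
print(newer < older)  # → True (the wrong answer)

# Parsing into real datetimes gives the correct ordering.
fmt = "%m/%d/%Y %H:%M:%S"
print(datetime.strptime(newer, fmt) < datetime.strptime(older, fmt))  # → False
```

This is exactly why the corrected script calls [DateTime]::Parse on SPStartTime before comparing against (Get-Date).AddDays(-$days).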
Great post. The only thing I'd add is that I needed to change the task's properties to run whether or not the farmadmin account was logged in, and to run with highest privileges (I had to supply the password). Other than that, this works great! Thanks!
Awesome post thanks for your time
Great script - thanks. Our SharePoint databases are in full recovery mode and the transaction logs have been growing over the months (understandably!). Is there a best practice for doing a transaction log backup in conjunction with this script or should I just do this within SQL?
Hi Sean,
Thanks, the cleanup script for the backup destination works perfectly, but it leaves behind the old spbrxxx folders. Each old spbrxxx folder contains the same text file, named spbackup. Do you have an idea how to delete all those old folders?
Great post ..
Thanks a lot ..
You saved me time ..
thanks, I found it very useful and precise.
Thank you gentlemen, you have saved me much time, may you be blessed. Please keep on providing this kind of directly useful information.
Great post! Since I'm new to SP this was perfect.
3-line backup script: that's great.
The backup seems to be uncompressed: compressing a 1.49 GB full backup with 7-Zip brought it down to just 219 MB.
Compression should be built in (but it's still PoorShell instead of PowerShell - compared to #!/bin/sh)
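Until compression is built in, it can be bolted on as a post-backup step. A minimal sketch (Python here for illustration; the spbr-style folder name and dummy payload are placeholders, not part of the original script):

```python
import os
import shutil
import tempfile

def compress_backup(backup_dir: str) -> str:
    """Zip a finished backup folder and return the archive path."""
    return shutil.make_archive(backup_dir, "zip", backup_dir)

# Demo with a throwaway folder standing in for a spbrXXXX backup directory.
demo_dir = tempfile.mkdtemp(prefix="spbr")
with open(os.path.join(demo_dir, "spbackup.log"), "w") as f:
    f.write("dummy backup payload\n" * 100)

archive_path = compress_backup(demo_dir)
print(os.path.exists(archive_path))  # → True
```

In practice you would run a step like this against each new backup folder after the farm backup completes, then delete the uncompressed folder once the archive is verified.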
Thank you Yousef