Thoughts from the EPS Windows Server Performance Team
From time to time, customers will call in to report "performance problems" that they are having when copying large files from one location to another. By "performance problems", they mean that the file isn't copying as fast as they expect. The most common scenario is copying large SQL databases from server to server, but this could just as easily occur with other file types. More often than not, the customer has tried different methods of copying the file including Windows Explorer, Copy, XCopy & Robocopy - with the same results. So ... what's going on here?
Assuming that you aren't experiencing network issues (and for the purposes of this article, we'll assume a healthy network), the problem lies in the way in which the copy is performed - specifically Buffered vs. Unbuffered Input/Output (I/O). So let's quickly define these terms. Buffered I/O describes the process by which the file system buffers reads and writes to and from the disk in the file system cache. Buffered I/O is intended to speed up future reads and writes to the same file, but it has an associated overhead cost. It is effective for speeding up access to files that may change periodically or get accessed frequently. Two buffered I/O functions are commonly used by Windows applications such as Explorer, Copy, Robocopy and XCopy: CopyFile() and CopyFileEx().
So looking at the definition of buffered I/O above, we can see where the perceived performance problems lie - in the file system cache overhead. Unbuffered I/O (or a raw file copy) is preferred when attempting to copy a large file from one location to another when we do not intend to access the source file after the copy is complete. This will avoid the file system cache overhead and prevent the file system cache from being effectively flushed by the large file data. Many applications accomplish this by calling CreateFile() to create an empty destination file, then using the ReadFile() and WriteFile() functions to transfer the data.
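The CreateFile()/ReadFile()/WriteFile() loop described above can be sketched in portable C. This is a minimal illustration rather than the Win32 implementation: POSIX open()/read()/write() stand in for the Win32 calls, and a true unbuffered copy on Windows would additionally open both handles with FILE_FLAG_NO_BUFFERING, which requires sector-aligned buffers and transfer sizes.

```c
/* Minimal sketch of a raw read/write copy loop. On Windows the same
 * pattern uses CreateFile()/ReadFile()/WriteFile(); here POSIX
 * open()/read()/write() stand in so the loop is portable.
 * Error handling is deliberately minimal. */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

#define CHUNK (1 << 20)  /* transfer 1 MB at a time */

int raw_copy(const char *src, const char *dst)
{
    int in = open(src, O_RDONLY);
    if (in < 0) return -1;
    int out = open(dst, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (out < 0) { close(in); return -1; }

    char *buf = malloc(CHUNK);
    ssize_t n = -1;
    if (buf != NULL) {
        /* Read a chunk from the source and write it straight to the
         * destination until we hit end-of-file or an error. */
        while ((n = read(in, buf, CHUNK)) > 0) {
            if (write(out, buf, (size_t)n) != n) { n = -1; break; }
        }
    }
    free(buf);
    close(in);
    close(out);
    return n < 0 ? -1 : 0;
}
```

Large sequential chunks keep the transfer disk-bound rather than call-bound; the 1 MB chunk size is an arbitrary illustration, not a tuned value.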
For copying very large files around the network, my copy utility of choice is ESEUTIL, which is one of the database utilities provided with Exchange. To get ESEUTIL working on a non-Exchange server, you just need to copy ESEUTIL.EXE and ESE.DLL from your Exchange server to a folder on your client machine. It's that easy. There are x86 & x64 versions of ESEUTIL, so make sure you use the right version for your operating system. The syntax for ESEUTIL is very simple: eseutil /y <srcfile> /d <destfile>. Of course, since we're using command-line syntax, we can use ESEUTIL in batch files or scripts. ESEUTIL is dependent on the Visual C++ Runtime Library, which is available as a redistributable package.
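Since it is command-line driven, eseutil drops straight into a batch file. A minimal sketch - the paths and share names here are hypothetical placeholders, and ESEUTIL.EXE and ESE.DLL must sit together in a folder on the machine running the script:

```bat
@echo off
rem Raw (unbuffered) copy of a large backup to a file server.
rem Paths below are placeholders - adjust for your environment.
eseutil /y "E:\Backups\BigDB.bak" /d "\\FileServer\Backups\BigDB.bak"
if errorlevel 1 echo Copy failed with error %errorlevel%
```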
Addendum: The XCOPY /J switch was added in Win7/2008R2.
Copies files without buffering. Recommended for very large files. This parameter was introduced in Windows Server® 2008 R2.
- Aaron Maxwell
Any answer from Microsoft on this issue? It's the same in both Vista x64 and Vista x86 with large files - I tried 8-20GB files. The copy is fast but consumes all the memory on the sending machine.
I found this solution to SLOW file copy/delete/move:
Open Programs & Features
Select Turn Windows Features On or Off
De-select Remote Differential Compression
It will take several minutes for this to be deactivated.
Reboot to be sure it takes.
Now these operations fly!
I am copying forty 2GB files every night and all other attempts have failed; so far this seems to be working great. Who says the internet is full of rubbish?
The bottom line for Windows though is that you shouldn't have to muck about like this, it should just work. I am copying between 2 Windows Server 2003 R2 machines, this is basic stuff.
Even with this command though I am only getting 20% throughput on a 1Gb network, no other devices connected.
I have now found TeraCopy. Free, faster than eseutil and simple:
I had just the same issues. Every "big data" copy was very slow. I've tried a lot of different tools and settled on Secure Copy. http://www.scriptlogic.com/products/securecopy/ Ultra-fast copying thanks to multithreading, stable operation, a friendly GUI, and retaining shares and permissions during copying - those are only a few advantages of this solution. Hope it helps.
Are there any guidelines for how much free space you need on the system drive to copy a 100GB file from E: to F:? When I try on a system with 35GB free space on the system drive, it fails even though the copy is not to the C: drive but from E: to F:. Anyone?
Yeah, well, I beat my head against the wall trying to copy large files (>33GB) and directory permissions... essentially trying to migrate a basic volume to another, larger drive. I tried Robocopy but the files corrupted. XXCopy succeeded in copying a single large file but failed on the directory permissions. A swing volume using MSBackup stalled... anybody have any additional freeware utils?
Recently I tried to burn 506MB of different files to DVD... same thing. It took me about 20 minutes, and (IIRC) the same operation takes 3-5 minutes on XP. Please hurry with the SP1 :D
I'm really pleased I found this little nugget. I was banging my head up against the disk buffer while trying to move a 400GB SQLServer mdf file from one disk shelf to another.
I never waited for the robocopy approach to complete, but just from the back-of-the-napkin calculations I'm seeing so far, it looks like ESEUtil is about 3x the speed.
I have tested copying a 4 gig file using ESEUTIL, RoboCopy, command-line copy and Windows Explorer Cut and Paste. Windows Explorer Cut and Paste was fastest.
Windows 2003 EE SP2
4 GB is really not that large a file. If your machine has 2GB or more of memory, a buffered I/O scenario should probably work fine. However, try copying 5 GB files at a time while doing other operations on the computer that take up page-locked memory.
I just ran into a situation with Eseutil where I am trying to copy a 100gb file from a 3 disk RAID 5 array to an external eSata hard drive. The file copy gets to about 70% and then the RAID controller goes all wonky and basically disconnects from the system until the next reboot. We then need to add another drive to the array, rebuild the drive and then re-add the supposed bad hard drive back into the array. In other words, the drives are fine; it's the controller that gives out.
I am concluding that the RAID controller that I am trying to use is buggy. I have tried another controller from the same manufacturer with the same results.
Eseutil is obviously copying the file so fast that it is triggering a bug in the controller. Eseutil has helped me identify this bug. Jury is still out as to whether this will happen when I use Robocopy or another file copy program.
The controller in question is a:
Highpoint RocketRaid 1640
Hopefully somebody finds this information useful.
I need the ESEUTIL copy to overwrite the existing file.
Using robocopy (XP010) for a dozen MS SQL2000 DB backup files (and logs) . . . ~50GB-150GB (depending on what time of the month it is . . . ;0) in a batch file, mapping drive letters to remote shared folders.
During the copy of the largest single file I encountered the error "ERROR 1130 (0x0000046A) Copying File <insert file name here> Not enough server storage is available to process this command. Waiting 30 seconds... Retrying..."
Tried the same copy with windows explorer . . . One file at a time would copy OK . . . but the "not enough server storage" error would occur if I tried copying ALL the files at once.
We fixed the issue with simple configuration / tuning changes . . . here's the list we used . . .
On the receiving (Windows 2003) server . . .
1.) latest greatest service pack / updates
2.) Local area connection properties -> File and Printer sharing for Microsoft networks (properties) -> changed "Server Optimization" to "Maximize data throughput for file sharing" (was set to Net Apps)
3.) System Properties -> Advanced tab -> Performance Settings -> Advanced tab -> Processor scheduling -> Adjust for best performance of: changed to "Background services" (was set to "Programs")
4.) Set the swap file to the recommended parameters / size
5.) Reboot . . .
6.) Test, Test, Test, Test . . .
Hopefully someone finds this information useful . . .
This is one of the best places to get answers.
I found useful information in all of the comments.