Apparently, software updates are getting so big these days that simply downloading them from a server is becoming prohibitively time consuming, especially when the same updates need to be applied to many different machines. A Dutch university has some 6,500 desktop PCs in ten locations, which on occasion need to download 3.5GB worth of different types of updates. That's a handsome 22.2TB in total. In a traditional client-server world, that's some serious heavy lifting.
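The back-of-the-envelope math checks out; a quick sketch of the arithmetic (the figures come from the article, the conversion assumes 1 TB = 1024 GB):

```python
# Sanity check on the article's numbers (illustrative arithmetic only).
num_pcs = 6_500     # desktop PCs across ten locations
update_gb = 3.5     # GB of updates per machine

total_gb = num_pcs * update_gb   # 22,750 GB pushed over the wire
total_tb = total_gb / 1024       # ~22.2 TB, matching the article's figure

print(f"{total_gb:,.0f} GB total, or about {total_tb:.1f} TB")
```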
In fact, INHOLLAND University's IT department used to have almost two dozen servers distributed over the university's locations to serve up these downloads. The school was able to retire 20 of them after adopting a new way to distribute updates: BitTorrent.
Of course, Microsoft researchers are also working on the "perfect worm" - a piece of software that can distribute patches without the need for centralized servers while minimizing bandwidth.
I don't understand. Did they have to reapply every patch to every system? That's what these numbers seem to imply.
Every MS patch for XP post SP2 totals around 800MB.
It's cool that they leveraged a peer-to-peer solution to distribute their patches, but I question the scope and magnitude of what they did.