Tuesday, December 11, 2007

Virtual PC and Virtual Server: Doing Differencing Disks

[Updated Feb 26, 2008]
I work at a contracting/consulting firm. I kid you not when I say that what I'm working on sometimes changes every week for months at a time. I mean it. Two weeks ago was InfoPath. Before that was Reporting Services. Last week was SSIS. This week is nHibernate. Next week I'm porting an ASP.NET to SharePoint. You get the picture.


I cannot begin to convey how dirty my dev environment gets. Take a bucketful of open source offerings, throw in a lot of MS tech stack CTPs and Beta 2s, and then stir in a service pack or two and you get a really dirty development environment. Typically I'll scrub my box at least twice a year, but as I'm getting older I find I'm becoming more and more impatient when it comes to sitting through installs.

A couple of weeks ago I stumbled upon this wonderful article from Andrew Connell. Essentially it's a HOW TO for leveraging Virtual PC/Virtual Server images to help your development world. I won't go into the details that are so elegantly laid out in the article but let me assure you it's a dynamite read. It was so good that I mustered up the courage to give it a try.

The Pain

I have to tell you, creating all the base images (I decided to create 4 of them) was PAINFUL. It took what seemed like forever to install, defragment, precompact, and compress 4 machine bases with different assortments of WinXP, Win2k3, MOSS, VS2005 SP1 and SQL Server 2005 SP2. Essentially I threw away an entire weekend getting things up and running.

The biggest problem I had was my poor notebook disk. When you start creating 12 GB (or bigger) files your disk gets ridiculously fragmented. Factor in a 5400 RPM notebook drive (I decided to do this on my laptop for some reason) and what you get is a SLOW SLOW disk. Eventually the installs and service packs grind to a halt if you don't stay on top of the defragmenting.

I only wish I had known of Contig, a utility hosted on TechNet that lets you defragment individual files/directories without needing to defrag the entire disk! Let me tell you, this little utility is a godsend for this type of task. If you have any intention of going through with this (and it's worth considering) be sure to download this utility.
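If you want to script that cleanup instead of running it by hand, a sketch like the one below works. It's a minimal example, assuming contig.exe is on your PATH and that your images live in a folder like D:\VirtualMachines (both of those are my assumptions, not anything from the article):

```python
import subprocess
from pathlib import Path

def contig_command(vhd_path):
    """Build the Contig command line for one file (-v prints a verbose report)."""
    return ["contig", "-v", str(vhd_path)]

def defrag_vhds(vm_dir):
    """Run Contig against every .vhd under vm_dir, one file at a time."""
    for vhd in sorted(Path(vm_dir).rglob("*.vhd")):
        subprocess.run(contig_command(vhd), check=True)

if __name__ == "__main__":
    vm_dir = Path(r"D:\VirtualMachines")  # example path; adjust to taste
    if vm_dir.exists():
        defrag_vhds(vm_dir)
```

Running this after each big install or service pack keeps the individual VHD files contiguous without waiting on a whole-disk defrag.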

The Pain Summary is as follows:
-Long installs: you'll lose some of your life that you'll never get back, though ideally you'll come out ahead in the long run.
-My Virtual PC images still don't pick up my second display at work. You can still have multiple displays, just not multiple displays INSIDE the virtual machine. This is irritating to me because I've grown very accustomed to having SQL Server Management Studio on one monitor and Visual Studio on the other.
-You'll probably have to buy an external drive, especially if you're doing this on a laptop. All that disk contention on one 5400 RPM drive is just a bad recipe for grouchiness. I'd recommend a FireWire one if you can get one; it's less CPU intensive to access.
-It's also probably worth your time/money to get at least 3 GB of memory if you're going to run very memory heavy virtual machines. But hey, memory's cheaper than bad humor, consider yourself lucky.

The Gain

I can now provision a machine in no time at all! And I don't just mean a new development machine, either: the same images you use with Virtual PC also work with Virtual Server (you will need to run NewSID every time you copy a machine and want to run multiple copies at the same time). You can even run either the base images or differencing disks on Virtual Server. Developing off these base images we can now:
-Get a fresh development environment in a matter of minutes
-Continue working on another machine if my machine dies (as long as the external HD which holds the virtual machines is good)
-Make a copy of any development environment and give it to another developer
-Provision test/QA environments whenever we need a virgin machine
-Roll back any work we've done on the computer within a session (Undo Disks)
-Save the state of the machine at any time and continue later (you could be in the midst of debugging an app, save the state, put the machine down and carry on next week like nothing happened)
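The provisioning step itself is little more than a file copy. Here's a minimal sketch of how I'd script it; the folder layout and function names are my own invention, and the reminder about NewSID applies once you boot the copy:

```python
import shutil
from pathlib import Path

def provision(base_vhd, dest_dir, name):
    """Copy a pristine base VHD into a per-project folder.

    Returns the path of the new VHD. Remember to boot the copy and run
    NewSID inside the guest before running it alongside its siblings.
    """
    dest = Path(dest_dir) / name
    dest.mkdir(parents=True, exist_ok=True)
    new_vhd = dest / Path(base_vhd).name
    shutil.copy2(base_vhd, new_vhd)  # copy2 preserves file timestamps
    return new_vhd
```

Point a new Virtual PC/Virtual Server machine at the copied VHD (or create a differencing disk against the untouched base through the New Virtual Hard Disk wizard) and you've got a fresh environment in minutes.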

Trust me, this makes for a FEARLESS development environment; no longer do I grimace before applying some patch or installing some 3rd party utility. Given that Virtual PC 2007 and Virtual Server 2005 R2 are both free, if your firm has some good hardware lying around you should definitely give this some thought if you want your hardware to stretch a little further. It could really benefit both you and your team. Also, only one person really needs to set up the base images. After that you can copy them around, run NewSID on them, and you're off and running!

Last but not least, I recently acquired both an external USB 2.0 drive and a FireWire 400 drive. I thought it might be educational to post the results of some HD Tach runs here. So here's my version of the USB 2.0 vs. FireWire 400 test. Both drives actually had the exact same data on them, so it's a fairly fair comparison.

[HD Tach screenshot] USB 2.0 (Western Digital 250 GB 5400 RPM drive)

[HD Tach screenshot] FireWire 400 (Verbatim 250 GB SmartDisk)

[HD Tach screenshot] Internal MacBook Pro drive (120 GB 4200 RPM)

[HD Tach screenshot] External eSATA (WD Raptor 120 GB 10,000 RPM)
As you can see from the screenshots, the USB 2.0 drive has better burst speeds, and the two boast about the same average read times, but the FireWire drive is a LOT less CPU intensive. My advice: go with the FireWire unless you've got a bunch of cores in your machine that you don't want to have idle :-).

[Updated Feb 26, 2008] A friend sent HD Tach scores for an external WD Raptor connected via eSATA; as you can see, it blows the other stats out of the water, though it's a little less portable than its slower-spinning brethren.

OK, that's it folks, this one got a little long-winded.

Happy dev-ing,
Tyler
