Building new whitebox servers for VMware home lab
I have needed to add more capacity to the home lab for a while now, but have taken my time. In the past I have been gathering up enterprise servers that are a couple of generations old. These have always done me well, but they hold a limited amount of memory, upgrading them is pretty expensive, and they are very loud. So I decided to go another direction and build a couple of whitebox servers based on common desktop parts. I've been watching for sales and collecting the parts to build them, and after finding a couple of good deals lately I finally had all the parts needed to build two hosts.
Another decision I had to make was whether I needed a server-class motherboard or whether a desktop one would work. After thinking about it, I concluded that a desktop motherboard would work just fine and probably save me a few dollars in the build cost. At this point I almost never use the out-of-band management access on the enterprise servers that I have, and since they are just down in the basement, I can easily run down and access them if needed.
I also did not need the ability to use VT-d, so a server board was even less important. I simply needed hosts with decent CPU power and more RAM. It really comes down to memory for me: I needed the ability to run more VMs so that I don't have to keep turning things on and off.
This type of lab is important to me for personal learning and for testing out configurations for the customer designs I work on during the day. I have access to a sweet lab at work, but it's just better to have your own lab where you are free to do what you want, and my poor bandwidth at the house makes remote access painful.
I want the ability to run a View environment, the vCloud Suite, and my various other tools all at once. With these new hosts I will be able to dedicate one of my older servers as the management host and a pair of the older servers as hosts for VMware View. This will leave the two new hosts to run the vCloud Suite and other tools.
I have set the hosts up to boot from USB sticks and plan to use part of the 60GB SSD drives for host cache. The remaining disk space will be used for VMs. Each host will have 32GB of RAM, which is the max the motherboard supports with its 4 slots. There is an onboard gigabit network connection, a Realtek 8111E according to the specs, and I can report that after loading vSphere 5.1 the network card was recognized and worked without issue. I also had a couple of gigabit network cards laying around that I installed as a second connection in each of the hosts.
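The memory math is simple, but it is the whole point of the build. A quick sketch (figures taken straight from the build above: 4 DIMM slots, 8GB sticks, two hosts) shows what these boxes add to the lab:

```python
# Memory capacity added by the two new whitebox hosts.
# All figures come from the build described above.
slots_per_board = 4    # DIMM slots on the ASRock board
gb_per_dimm = 8        # 8GB sticks
hosts_built = 2        # two identical hosts

ram_per_host = slots_per_board * gb_per_dimm   # 32GB, the board's max
total_added = ram_per_host * hosts_built       # 64GB of new lab capacity

print(f"{ram_per_host}GB per host, {total_added}GB added to the lab")
```

Nothing fancy, but it shows why desktop boards were acceptable here: the 32GB-per-board ceiling was the real constraint, not CPU features.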
The case came with a fan included, but I added another for better cooling and airflow. Even with multiple fans running, the hosts are very quiet since there are no spinning disks in them, and they put out very little heat. I could probably have reduced the noise and heat a bit more by choosing a fanless power supply, but those run over $100 and it was not a priority for me.
Here is a list of the parts each server was built with. I was able to build these systems for under $500 each and add a good amount of capacity to my lab. I always keep a close eye on SlickDeals for the parts I need and was able to score some good deals from Newegg, TigerDirect and Amazon.
Motherboard: ASRock Z77 Pro4, LGA1155 $109
CPU: Intel Core i5-3570K, 3.4GHz, 6MB cache $179
Memory: 32GB G.SKILL F3-10666CL9D-16GBXL (two 2 x 8GB kits, 4 x 8GB total) $57 + $57 = $114
SSD: OCZ Agility 60GB Free w/ motherboard
Case: NZXT Source 210 $20 after rebate
USB Stick: Kingston 16GB $9.99
Power Supply: Corsair 430W $29 after rebate
Fan: Cooler Master SickleFlow 120 $9
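Tallying the list confirms the under-$500 claim. A quick sketch, using exactly the prices listed above for one host:

```python
# Per-host build cost, using the prices from the parts list above.
parts = {
    "ASRock Z77 Pro4 motherboard": 109.00,
    "Intel Core i5-3570K CPU": 179.00,
    "32GB G.SKILL memory (two kits)": 114.00,
    "OCZ Agility 60GB SSD": 0.00,        # free with the motherboard
    "NZXT Source 210 case": 20.00,       # after rebate
    "Kingston 16GB USB stick": 9.99,
    "Corsair 430W power supply": 29.00,  # after rebate
    "Cooler Master SickleFlow 120 fan": 9.00,
}

total = sum(parts.values())
print(f"Per-host total: ${total:.2f}")  # $469.99, under the $500 target
```

That leaves about $30 of headroom per host, which is roughly what the rebates on the case and power supply saved.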
For the short term I will be relying on the local SSD drives in the hosts and my old Iomega IX2. The Iomega is serving up iSCSI and NFS shares to all my hosts. It has 1TB of capacity, but performance gets pretty slow once you have more than a couple of VMs running on it, and cloning anything takes a while.
I also ordered a pair of Samsung 256GB SSD drives from Amazon. I found them on sale for $154 with free shipping through my Prime membership. These are good drives and are supposed to be fast, so I am exploring options for how to use them. I am considering several things, like experimenting with the vSphere VSA, the Nexenta community storage VSA, or just using them as local datastores. Whichever way I go, they will provide a much needed bump in performance.
Long term I need to invest in a better performing shared storage solution. Something like a higher-end Iomega or Synology device would be ideal, but for now those are pretty expensive and I will need to save up for one, unless there is a friendly vendor that would like to sponsor me or donate something.
Anyway, I hope these details can help others who are looking to build a home lab. I will try to get around to doing some performance testing on these hosts and post the results, but that may take a bit with my current schedule. If you have any questions, drop me a note in the comments or send me a message.
Other Whitebox builds:
The vHydra: a single server with the ability to go over 32GB of memory
I also ran the VMware SiteSurvey tool on one of the hosts just to verify full FT compliance.
About Brian Suhr
Brian is a VCDX5-DCV, a Solutions Architect for a VMware partner, and the owner of this website. He is active in the VMware community and helps lead the Chicago VMUG group, specializing in VDI and cloud project designs. Awarded VMware vExpert status for 2014, 2013, 2012 & 2011. Certifications: VCP3, VCP5, VCP5-IaaS, VCP-Cloud, VCAP-DTD, VCAP5-DCD, VCAP5-DCA, VCA-DT, VCP5-DT, Cisco UCS Design.