Building new whitebox servers for VMware home lab

Posted by Brian Suhr on October 22, 2012 in Home Lab, Labs, VMware | 22 comments

I have needed to add some more capacity to the home lab for a while now, but I have taken my time. In the past I have been gathering up enterprise servers that are a couple of generations old. These have always done me well, but they hold a limited amount of memory, upgrading them is pretty expensive, and they are very loud. So I decided to go another direction and build a couple of whitebox servers based on common desktop parts. I’ve been watching for sales and collecting the parts to build them, and after finding a couple of good deals lately I finally had all the parts needed to build two hosts.

Another decision I had to make was whether I needed a server-class motherboard or whether a desktop one would work. After thinking about it I decided that a desktop motherboard would work just fine and probably save me a few dollars in the build cost. At this point I almost never use the out-of-band management access on the enterprise servers I have, and since they are just down in the basement, I can easily run down and access them if needed.

I also did not need the ability to use VT-d, so a server board was even less important. I simply needed hosts with good power and more RAM. It really comes down to memory for me: I needed the ability to run more VMs so that I don’t have to keep turning things on and off.

The Why:

This type of lab is important to me for personal learning and for testing out configurations for the customer designs that I work on during the day. I have access to a sweet lab at work, but it’s just better to have your own lab where you are free to do what you want, and my poor bandwidth at the house makes remote access kind of rough.

I want the ability to run a View environment, the vCloud Suite and my various other tools all at once. With these new hosts I will be able to dedicate one of my older servers as the management host and a pair of the older servers as hosts for VMware View. This will leave the two new hosts to run the vCloud Suite and other tools.

The How:

I have set the hosts up to boot from USB sticks and plan to use part of the 60GB SSD drives for host cache. The remaining disk space will be used for VMs. Each host will have 32GB of RAM, which is the maximum the motherboard supports with its four slots. There is an onboard 1Gb network connection, a Realtek 8111E according to the specs, and I can report that after loading vSphere 5.1 the card was recognized and worked without issue. I had a couple of gigabit network cards lying around that I installed as a second connection in each of the hosts.
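If you want to double-check what the installer picked up without poking around on the console, a short script against the API can list the physical NICs and their drivers, which is an easy way to confirm the Realtek actually got a driver. The sketch below uses pyVmomi, and the host name and credentials are placeholders you would swap for your own.

    # Minimal sketch (pyVmomi): list the physical NICs that ESXi detected.
    # The host name and credentials below are placeholders for your own lab.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    si = SmartConnect(host='esxi01.lab.local', user='root', pwd='password',
                      sslContext=ssl._create_unverified_context())
    try:
        content = si.RetrieveContent()
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.HostSystem], True)
        for host in view.view:
            print(host.name)
            # pnic holds every physical adapter the hypervisor has a driver for
            for pnic in host.config.network.pnic:
                speed = pnic.linkSpeed.speedMb if pnic.linkSpeed else 0
                print('  %s  driver=%s  mac=%s  link=%sMb'
                      % (pnic.device, pnic.driver, pnic.mac, speed))
    finally:
        Disconnect(si)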

The case came with a fan included, but I added another for better cooling and airflow. Even with multiple fans running the hosts are very quiet, since there are no spinning disks in them, and they put out very little heat. I could probably have reduced the noise and heat a bit more by choosing a fanless power supply, but those run over $100 and it was not a priority for me.

Hardware List:

Here is a list of the parts each server was built with. I was able to build these systems for under $500 each and add a good amount of capacity to my lab. I always keep a close eye on SlickDeals for the parts that I need, and was able to score some good deals from Newegg, TigerDirect and Amazon.

Motherboard: ASRock Z77 Pro4 (LGA1155)  $109

CPU: Intel Core i5-3570K, 3.4GHz, 6MB cache  $179

Memory: G.Skill F3-10666CL9D-16GBXL, two 2 x 8GB kits (4 x 8GB sticks, 32GB total)  $57 + $57 = $114

SSD: OCZ Agility 60GB  Free with motherboard

Case: NZXT Source 210 case $20 after rebate

USB Stick: Kingston 16GB $9.99

Power Supply: Corsair 430W  $29 after rebate

Fan: Cooler Master SickleFlow 120 $9

Shared Storage:

For the time being I will be relying on the local SSD drives in the hosts and my old Iomega IX2. The Iomega is serving up an iSCSI and an NFS share to all my hosts. It has 1TB of capacity, but performance is pretty slow once you get more than a couple of VMs running on it, and if you clone something it takes a while.
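If you ever need to rebuild hosts in a setup like this, attaching the same NFS export to each one is easy to script rather than clicking through the client. Below is a minimal pyVmomi sketch; the vCenter address, credentials, the IX2’s IP, the export path and the datastore name are all placeholders, not my actual lab values.

    # Minimal sketch (pyVmomi): mount the same NFS export on every host.
    # The connection details, NAS address, export path and datastore name
    # below are placeholders.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    si = SmartConnect(host='vcenter.lab.local', user='administrator', pwd='password',
                      sslContext=ssl._create_unverified_context())
    try:
        content = si.RetrieveContent()
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.HostSystem], True)
        spec = vim.host.NasVolume.Specification(
            remoteHost='192.168.1.50',   # the Iomega IX2
            remotePath='/nfs/lab',       # exported folder on the NAS
            localPath='ix2-nfs',         # datastore name as seen by the hosts
            accessMode='readWrite',
            type='NFS')
        for host in view.view:
            # skip hosts that already have the datastore mounted
            if any(ds.name == spec.localPath for ds in host.datastore):
                continue
            host.configManager.datastoreSystem.CreateNasDatastore(spec)
    finally:
        Disconnect(si)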

I also ordered a pair of Samsung 256GB SSD drives from Amazon. I found them on sale for $154 with free shipping through my Prime membership. These are good drives and are supposed to be fast, so I am exploring options for how I will use them. There are several things I am considering, like experimenting with the vSphere VSA, the Nexenta community storage VSA, or just using them as local datastores. Whichever way I go, they will provide a much needed bump in performance.

Long term I need to invest in a better-performing shared storage solution. Something like a better Iomega or a Synology device would be ideal, but for now those are pretty expensive and I will need to save up funds for them, unless there is a friendly vendor that would like to sponsor me or donate something.

Anyways, I hope that these details can help others who are looking to build a home lab. I will try to get around to doing some performance testing on these hosts and post the results, but that may take a bit with my current schedule. If you have any questions, drop me a note in the comments or send me a message.

Other Whitebox builds:

The vHydra, a single server with the ability to go over 32GB of memory

The Baby Dragon II build, which has a few revisions – rootWyrm or Chris Wahl

Update 10-23-12:

I ran VMware SiteSurvey on one of the hosts just to verify full FT compliance.
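If you would rather not install SiteSurvey, a quick script can at least confirm whether a host advertises Fault Tolerance support through the API. This is a rough pyVmomi sketch with placeholder connection details; keep in mind SiteSurvey validates quite a bit more than this single flag.

    # Minimal sketch (pyVmomi): report whether each host advertises FT support.
    # Connection details are placeholders; this checks only the capability flag.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVmomi import vim

    si = SmartConnect(host='vcenter.lab.local', user='administrator', pwd='password',
                      sslContext=ssl._create_unverified_context())
    try:
        content = si.RetrieveContent()
        view = content.viewManager.CreateContainerView(
            content.rootFolder, [vim.HostSystem], True)
        for host in view.view:
            print('%s: FT supported = %s' % (host.name, host.capability.ftSupported))
    finally:
        Disconnect(si)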


About Brian Suhr

Brian is a VCDX5-DCV, a Solutions Architect for a VMware partner and the owner of this website. He is active in the VMware community and helps lead the Chicago VMUG group, specializing in VDI and cloud project designs. Awarded VMware vExpert status for 2014, 2013, 2012 & 2011. VCP3, VCP5, VCP5-IaaS, VCP-Cloud, VCAP-DTD, VCAP5-DCD, VCAP5-DCA, VCA-DT, VCP5-DT, Cisco UCS Design.

22 Comments

  1. Hi Brian

    Is your build compatible with version 5.1?

    Thanks

    TD

    • Yes, I am running ESXi 5.1 on them now.

      • Awesome. Thanks for the build.
        TD

    • Brian –
      Thanks for the info and the post details in this vetted build. I’m in need of a system and have been looking around the net for some more info – post-build write-ups to know how the systems actually ended up – and yours sounds like something I’d like to ‘build off of’. I purchased a Synology 1812+ a couple of months ago and am planning on using it for my storage needs. What I’m hoping to do is recreate, on a smaller scale, some of my production environment so that I can test a ‘DR’ solution, and I think these components might work. With work possibly helping out with the costs, I feel fortunate. I don’t have a lot of room for 2 ‘normal’ towers and wonder if you think these parts would fit into a micro-ATX or small form factor case like a Lian Li or something equivalent? If not, do you have any suggestions for similar equipment in a smaller form factor? And, thanks again…

      mark

      • Mark,
        I built the same setup as Brian but used the Pro4-M version, which is a micro-ATX form factor of the same board. Works great!

        • Hi.
          I’ve been following this thread and like all the parts, but was wishing for a smaller form factor. Did you change any other parts from Brian’s, other than the board, to fit into a micro-ATX case – and can you suggest the case that you bought?
          thanks,
          mark

      • As long as the case fits an ATX motherboard you will be good. If you want something even smaller you might be able to find a mini or micro version of this board with the same chipset.

  2. Brian, thanks for the tips on this… It’s always helpful when someone has vetted a build. This one turned out great. It was cheap yet powerful enough to be usable, and best of all it’s smaller and quieter than my previous AMD builds. Hats off to you!

    • Glad the post helped in your build. The boxes are working out well for me; I would like to build a couple more in the future to keep up with the expanding VMware product list.

  3. Great article Brian! I am going to try to build my own mini lab based on this. Quick question: I don’t have a basement to store my servers in. How quiet are your servers? And how much heat do they put out? I live in a small apartment, so I don’t have much space and have to keep an eye on cooling. I was thinking of just getting a Mac mini or two and using them for my lab, but they are very costly. Thoughts?

    • They are pretty quiet and don’t put out a ton of heat. I don’t use any spinning disks or optical drives, so that keeps the noise down also. It’s really just the noise of the power supply, which is low.

      You won’t get much lower noise unless you spend 2x or 3x the money on high-end, low-noise gear.

  4. What did you do about NICs? Just the one on the board?

    • Currently I am using the onboard NIC, plus some gigabit cards that I removed from some ProLiant servers I had around for a second port.

      • Intel Ether Express Gigabit card. $31 (cdn) EXPI9301CTBLK

  5. Hi Brian,
    Will the Intel E3-1230 v2 processor support VMware advanced features like Directed I/O (VT-d), vMotion, DRS, HA & FT?

    Thanks in advance for your reply.

    Bhagat

    • Hello, the only feature that needs a specific motherboard/CPU combination to work is Directed I/O (VT-d). It is also the one that is rarely used, so I would not worry about it for lab use. But to verify what your configuration will support, you can look up the parts on the VMware HCL.

  6. Hi, is your setup still running stable?

    I am looking to purchase the same CPU/mobo/memory combo that you have, but want to know if you have come across any issues with it.

    Thanks

    Mario

    • Hello,

      Yes, both of the whiteboxes that I built are still going strong. The only trouble I had was that the motherboard in one of the boxes died as a result of a storm that caused power issues for me. Lesson learned: I need to invest in a UPS for the lab. It was replaced under warranty and has been going strong since.

  7. Hi, great post!
    I am looking to replace the mobo, CPU and memory in my whitebox.
    Just curious, why did you choose relatively slow memory (1333) and not a faster kit?
    Kind regards,
    Martijn

    • I was not terribly worried about memory speed when purchasing. Figured anything was going to be a vast improvement over the 5+ year old gear I was using. This type of memory was also very cost effective at the time when I built these boxes.

  8. Thanks for the specs. Do you use your SSD drive only for caching or also for your datastore? I also have an Iomega IX2, but I want some “fast” local storage for when I test things. I am considering an SSD for local storage or the SATA Western Digital WD5000HHTZ 500GB hard disk (SATA 600, VelociRaptor). Richard

    • I currently use the SSDs as local datastores. In the near future I will likely use them for vFlash and VSAN. I will also be working towards building a third whitebox for this new setup.

Trackbacks/Pingbacks

  1. Welcome to vSphere-land! » Home Lab Links - [...] (Virtualize Planet) Running nested hypervisors on the Ultimate Portable Lab (Virtualize Planet) Building new whitebox servers for VMware home …
  2. Lab Day! (aka ESX 5.1 Whitebox server build) | Infrastructure, technology and virtualization. - [...] dual sockets but the price point was closer to $2K which is clearly way beyond budget. Then I found …
  3. Baby Dragon Triplets – VSAN Home Lab | vcdx133.com - […] Dragon home labs: Erik Bussink, Eric Shanks, Chris Wahl, Brian Suhr, Phillip Jaenke, William Lam, Derek […]
