
Dad's homelab, getting everything you want in a small package

When I was a teenager in the '90s, my homelab looked a lot like the one you see here. I wish I still had a photo of it, Linux penguins and all, with 5-10 old computers under the desk running everything from Windows NT and Red Hat to BSD UNIX. This was when virtualization lived only on mainframes, so everything I had was connected to analog KVM switches, whose cables I only recently threw out. They had been in that box of cables we all have, the one you know you'll need at some point but probably never do.

Photo credit: https://www.reddit.com/r/homelab/comments/3u5w1l/old_school_cool_my_homelab_15_years_ago/ 

Through the 2000s my homelab grew to include enterprise servers from eBay or ones my job threw out, Cisco switches and routers for CCNA/CCNP labs, and a firewall/DSL router that ran on a 386 desktop with no hard drive, booted Coyote Linux from a floppy disk, and ran everything in its few megabytes of memory. The lab hosted everything from web servers, PC game servers, and Active Directory domains to, at one point, a Beowulf cluster, because why not?

All of this was LOUD, took up a lot of space, and drew a lot of power. I'm a dad now, and that setup can get off my lawn! Not to mention the likely fire hazard of daisy-chained surge protectors and power cables. Around this time, virtualization technologies from VMware and Citrix were becoming genuinely usable as processing power continued to grow rapidly. In the early 2010s VMware released ESXi 5, the most stable and easiest-to-configure virtualization server to date. This is also when I became a dad.

I wanted a nice homelab, but I didn't want it to take up a lot of space, heat up the house, be loud, or cost a lot in power. Did I mention I also wanted a single physical computer to run both my virtualized servers and my gaming PC? Yes, a nice small footprint; all this would make me a happy dad.

While the setup in this photo is very nice, it's not what I wanted. It would sound like a jumbo jet, heat the whole house, and cost a fortune in power.

In 2014 I set out on this mission and purchased a humble quad-core 3.3 GHz i5, a motherboard that supported virtualization, a newer AMD video card for gaming (nothing bleeding edge for dad), a few hard drives, and 32GB of RAM. I stuffed it all into a discreet plain black tower case and installed VMware ESXi 5.5, which booted and ran from a USB thumb drive. Next I created virtual machines for every service and operating system I wanted to run and play with: Plex for organizing and playing music and movies, OpenVPN to access the home network remotely, and of course a VM for what would be my main desktop/gaming computer. Through a great ESXi feature called GPU passthrough, the video card I installed could be assigned directly to my Windows gaming VM. The same works for USB ports, giving the assigned VM direct access to that hardware. This means that when my Windows VM turns on, it outputs its display to the assigned video card; mouse, keyboard, and other USB devices work the same way through the assigned USB ports.

This configuration worked perfectly and let me do everything I intended for my homelab, surprisingly without compromise. For a while I was also running Splunk on top of this, plus 5 to 6 other VMs, collecting threat intel data from a bunch of honeypots I had set up in the cloud.

You may be wondering how I turn my primary workstation on if ESXi is the primary operating system. This can be done a few ways: VMware has a couple of mobile apps that work well, and I also use a lightweight Android app called Raspberry SSH.

This app is meant for Raspberry Pi devices and lets you assign commands to buttons; when a button is pressed, the app logs in to the system via SSH and executes the configured command. The command shown lists all VMs on the ESXi server. So just create buttons to power on and shut down some of the VMs and we're all set!
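Under the hood, those buttons just run ESXi's built-in `vim-cmd` tool over SSH. A minimal sketch of the idea; the VM names and the `vm_id_for` helper are my own illustration, not from the original setup:

```shell
# Commands a button would run on the ESXi host (vim-cmd ships with ESXi):
#   vim-cmd vmsvc/getallvms               # list all VMs with their numeric vmids
#   vim-cmd vmsvc/power.on <vmid>         # power a VM on
#   vim-cmd vmsvc/power.shutdown <vmid>   # graceful guest shutdown (needs VMware Tools)

# Hypothetical helper: pull a VM's vmid out of `getallvms` output by name.
# $1 = getallvms output, $2 = VM name; vmid is column 1, name is column 2.
vm_id_for() {
  printf '%s\n' "$1" | awk -v name="$2" '$2 == name { print $1 }'
}

# A power-on button could then be configured with something like:
#   vms=$(ssh root@esxi-host "vim-cmd vmsvc/getallvms")
#   ssh root@esxi-host "vim-cmd vmsvc/power.on $(vm_id_for "$vms" Win10-Kid)"
```

Assigning each of those one-liners to its own button gives you power-on and shutdown controls for each VM from the phone.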

This setup ran unchanged for over 5 years with various VMs, until December of 2019, when my 7-year-old daughter started to become interested in computer games, primarily Minecraft. Dad says: perfect time to upgrade! Not only that, but we would add a second video card (GPU) and run both my primary desktop and hers as virtual machines with GPU passthrough in a single desktop computer, along with all the virtual machines I could possibly need.

I would buy a new processor, heatsink, motherboard, 64GB of RAM, power supply, a couple of SSDs for our workstation VMs, a couple of USB 3.0 hubs, and a new case. I would also get a new GPU and hand the previous one down to my daughter, which would work perfectly for any games she wanted to play. By a stroke of luck I found a deal on a 12-core (24-thread) AMD Threadripper 1920X for less than what I had paid for my old quad-core i5, and that set the rest of the hardware in place. This was a great experience for my daughter as well, because she got to help assemble the computer and learn what all the components are and what they do.

All components installed, including a 4-port network card to physically separate groups of VMs depending on their purpose.

After some testing, ESXi 6.5 worked best with my hardware and GPU passthrough. I migrated my VMs from the old server, added the new GPU to my Windows 10 gaming VM, and spun it up with everything working seamlessly. Next I set up a new Windows 10 VM for my daughter's gaming workstation, configured GPU passthrough on the second video card along with passthrough for a USB hub (mouse, keyboard, headset), spun it up, and everything worked great. To turn on her computer, I have an old phone set up on the desk with the Raspberry SSH app and a button that powers on her VM.

ESXi 6.5 up and running

GPU and USB passthrough config - GPU 1

GPU and USB passthrough config - GPU 2

VM configuration with GPU and USB added as PCI devices - GPU 1

VM configuration with GPU and USB added as PCI devices - GPU 2
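Behind those configuration screens, each passed-through device ends up as a few lines in the VM's .vmx file. A rough sketch of what that looks like; the PCI addresses below are placeholders for whatever your own hardware reports:

```
# GPU passthrough entries as they appear in a VM's .vmx file
# (addresses are illustrative placeholders)
pciPassthru0.present = "TRUE"
pciPassthru0.id = "0000:0b:00.0"
# the GPU's audio function, a separate PCI device
pciPassthru1.present = "TRUE"
pciPassthru1.id = "0000:0b:00.1"
# a USB controller for the mouse/keyboard/headset hub
pciPassthru2.present = "TRUE"
pciPassthru2.id = "0000:0d:00.3"
```

You normally never edit this by hand; adding the PCI devices through the ESXi web UI, as in the screenshots, writes these entries for you.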

I found that passthrough of the GPUs' onboard audio sometimes caused issues, so I purchased a couple of small USB audio adapters with mic and speaker ports. Other than that, the two gaming workstations run flawlessly at high loads, with close to 10 other VMs running on the same system at the same time, including Plex, OpenVPN, Pi-hole, our family Minecraft server, Nextcloud, SELKS and Splunk (IDS and threat hunting), and multiple other workstations. This dad is happy: just about an entire homelab in one box. The rest of the homelab includes a small Netgate SG-1100 pfSense firewall, an access point, a small 2-bay NAS, and a Netgear GSS108E 8-port switch. I use this switch specifically because it can mirror ports and is low cost. The firewall connects to this switch, and I mirror that port so all traffic is duplicated to another port, which my SELKS IDS VM is connected to. I plan to detail that design and configuration in a future post.

Port mirroring configuration

The rest of Dad's homelab, Linux penguin and all. The Raspberry Pi Zero on top of the cable modem is connected to a UPS by USB and is used to monitor for power outages, send alerts, and gracefully shut everything down.
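A sketch of what the Pi's monitoring logic could look like, assuming Network UPS Tools is doing the USB monitoring; the UPS name, threshold, and the `ups_value` helper are hypothetical, not from the original setup:

```shell
# Hypothetical parser for `upsc` output, which is "key: value" lines, e.g.
#   battery.charge: 42
#   ups.status: OB DISCHRG
# $1 = upsc output, $2 = key to look up.
ups_value() {
  printf '%s\n' "$1" | awk -F': ' -v k="$2" '$1 == k { print $2 }'
}

# The Pi could poll the UPS and, when on battery with a low charge,
# alert and shut the ESXi host down gracefully, e.g.:
#   status=$(upsc myups@localhost)
#   if [ "$(ups_value "$status" ups.status)" != "OL" ] &&
#      [ "$(ups_value "$status" battery.charge)" -lt 20 ]; then
#     ssh root@esxi-host "vim-cmd vmsvc/power.shutdown <vmid>; poweroff"
#   fi
```

In practice NUT's own upsmon daemon handles the shutdown trigger for you; the snippet just shows the shape of the decision.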