UnRAID and Virtualize All The Things

I have been out of the consumer hardware market for a long time. Despite building a ‘gaming’ PC (that doesn’t see much gaming) back in 2018, I’ve been out of the game for the past decade or so. In the early-to-mid-2000s my bedroom was a sea of tech; at one point I was running my own mail server (over a tiny ADSL connection), had two case-modded machines and a wall of monitors (CRT, natch) that would have made The Architect jealous.

Since then, I’ve been “on the road”. When I first moved into a shared house, there wasn’t room for a desk, so I had to shrink my tech down to something mobile – a Dell XPS 13. They continue to be an excellent laptop line all these years later, though sadly mine destroyed itself by boiling its integrated graphics card.

It got replaced with a MacBook Pro, thus beginning an 11+ year love of all things Apple. Windows 8 was released a few years later, cementing my stay in Apple-land. When I did get a desk back, it was an iMac I bought, not a Windows tower. That, once more, had to make way for mobile computing when I found myself on the road again, moving between flats and houses and jobs.

Another big part of the Wintel apathy was down to my job(s). Back in the day when I had an array of machines and the digital world at my fingertips, IT as a job was something I aspired to. Once I got into it, I found that spending 8 hours a day working on computers is a really great way to sap your enthusiasm for doing more of it once you get home.

When I finally ‘settled’ in New York (for two years) I finally got around to getting a Windows machine again, ostensibly for gaming. My girlfriend is an avid gamer, and I wanted us to share that hobby a little. Well, she probably did more than me – like I said, after 8 hours at a desk fixing problems, the last thing I want to do is come home and sit at a desk for another 4.
I quickly realized that I was well out of touch with modern components, what manufacturers were best, and so on. I think in the end I just found a few system guides that roughly correlated to my budget, and whipped the thing together.

All in all, it’s been a very solid, reliable machine. An Intel Core i7-8900K with 32 (now 64) GB of RAM, an NVMe boot drive (that was a nice change from my last Windows machine!) and some storage, and things were great.

I was also running with a Synology NAS for my Plex media library. It was a little 2-disk 4TB mirrored job, which worked ‘just fine’, but I found that I had to offload the actual media transcoding (for remote streams) to my PC. Once that filled up, rather than getting a new NAS I decided just to move all the disks and data to my main PC and have that do everything.
A new case was sourced (a Lian-Li PC-A75), the hardware transplanted, and I also took the opportunity to Frankenstein some other hardware together in my old case to replace my girlfriend’s PC, which was on its last legs. The only thing that came over from the old machine was the graphics card, which is now slowly dying – more on that anon.

This has all been working fine, but it was a pretty quick-and-dirty switchover that I hadn’t spent much time thinking about or planning.

Fast forward a few months, and I hired a young guy at work who was also a Plex aficionado and was in the middle of rebuilding his own – rack mounted – Plex server, along with a host of other additions. My interest was piqued, and he told me more about his system. He had rebuilt it using Docker containers on Linux, and had a fully interactive web frontend to access media, as well as requesting it, auto-downloading and categorizing it, and more besides.

This was something I hadn’t even considered – again, my day job kept me plenty busy and I wasn’t exactly looking for more things to keep me in front of a screen. However, this information had awoken something in me – an urge to tinker with personal tech that had been lurking at the back of my mind, long dormant.

Around this time, the graphics card on my girlfriend’s PC started doing more weird stuff. Photoshop would randomly complain that it couldn’t initialize the graphics driver and deny access to a bunch of features. One screen would stop drawing certain items in Chrome. It had been on its way out for a while, but I decided it was time to do something about it, so started looking for a good replacement.

Somehow, Googling for a graphics card replacement turned into a whole system replacement. That then became researching individual components, and suddenly I was figuring out how to split my systems in two.

With no end to the Covid pandemic in sight, the winter drawing closer, along with its cousins Seasonal Depression and It’s Too F-ing Cold To Go Anywhere, I decided now was a good time to rebuild my tech and get stuck into a project or three.

Plan A was similar to my 2018 build – sticking with my conventional, out-of-the-game-for-years knowledge. I would build a small form factor machine into a small case that could fit a bunch of disks. Something small but good-looking, maybe with a novel cooling solution inside, to house my media and what would become my projects for the season – Docker containers for the likes of Radarr, Sonarr, Emby, and so on.

The other machine would be a rehoming of my existing machine, again with some interesting new cooling – perhaps a trip back to Water-cooling Town, a place I had visited first at University many moons ago. It was a mostly painful journey that ended up with a destroyed graphics card, but things had moved on in the last ~16 years.

That was when I stumbled upon UnRAID, VFIO, IOMMU groups, and other wondrous things.
I discovered that people were now running some form of Linux on bare metal, virtualizing a bunch of machines inside it, and passing pieces of the hardware directly through to those operating systems.
People are now literally building multiple gaming PCs inside of one physical box.

The concept of virtualizing my primary OS on a device had never even occurred to me. I’m familiar with virtualization in the workplace – most of my core infrastructure runs on Hyper-V, and without VMware vSphere the development work we do would incur crippling hardware costs.
Heck, I have a virtual copy of macOS running behind this browser window right now.

But virtualizing the primary OS? With graphics and input/output that operate like a bare metal install on consumer hardware? It’s something I’d never considered, but now seems obvious. Hardware has come such a long way since I last built PCs with any regularity, and for the last 10 years my “IT brain” has been firmly locked into the corporate world.

In the case of UnRAID, the software is actually running from a USB device (in my case it will be a 16GB SanDisk Ultra Fit), with the application loaded into memory. This leaves all of the bare metal hardware free to be assigned at will (well, mostly …) to whatever you want to run on it.
Within UnRAID you can run virtual machines and Docker containers, whilst UnRAID itself manages your storage pools for you.

The whole thing sounds like a really fun project. As I mentioned, I’ve ‘done virtualizing’ for years now, but in the corporate world. Slapping Hyper-V on a rack mounted server and installing a ton of single-role server operating systems isn’t fun, it’s procedure.

So what’s intended for this Project Box (which I’m casually calling the God Box), to while away the winter hours? There were a few requirements I needed the build to cover:

  • A ‘daily driver’ machine for standard web browsing, life admin, occasional photo editing, that sort of thing.
  • A gaming machine, for the odd occasion I feel like doing that. I’ve just been gifted Red Dead Redemption, so that’s something to get my teeth into, and I love playing Civ as well.
  • A Plex server for media.

So, we’ll be running UnRAID as the ‘core OS’ of the device, from USB.
The machine will have 32GB of RAM (to start with), stolen from my existing machine. There will be a 1TB NVMe SSD to host ‘key’ virtual machines – in this case Windows 10 and Ubuntu Linux.
A second 500GB NVMe SSD will act as a cache drive for the system’s main storage.

Two 1TB Western Digital Blue SSDs will provide a striped array which will form the data partition for the Windows 10 machine. That’s based on current usage – basically it’s a partition for my documents and a local copy of my iTunes library (the idea is if the internet ever goes out for a long time, at least I can still play music).

The Ubuntu Linux installation will run Plex. Whilst I could run Plex in a native Docker container, I believe the performance is better this way. I can pass the Intel integrated GPU through to the Linux VM and perform hardware transcoding on it for pretty good performance. Originally I was intending to purchase a second graphics card for this, but I don’t think it’s necessary anymore.
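As a rough sanity check once the VM is up: hardware transcoding via the iGPU needs the kernel to expose a DRM render node under /dev/dri inside the guest. A minimal sketch of that check (the helper name and print wording are my own, not anything Plex- or UnRAID-specific):

```python
import os

def render_node_present(dri_dir="/dev/dri"):
    """True if the kernel exposes a DRM render node (e.g. renderD128),
    a prerequisite for VAAPI/Quick Sync hardware transcoding in the VM."""
    if not os.path.isdir(dri_dir):
        return False
    return any(name.startswith("renderD") for name in os.listdir(dri_dir))

if __name__ == "__main__":
    print("Render node visible to this OS:", render_node_present())
```

If this comes back False inside the VM, the passthrough (or the guest’s i915 driver) needs looking at before blaming Plex.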

For storage of the media, I’m adding an 8TB disk for parity. Data will be stored on the existing 4TB disks from my current machine, with the parity drive maintaining data integrity as well as allowing future expansion with larger disks.
The parity drive must be equal to or larger than the largest disk in the array, and anything bigger than 8TB gets expensive; as the media grows I can slowly replace the 4TB disks with 8TB ones and still have a ton of headroom before needing to consider anything else.
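For the curious, UnRAID’s single parity is conceptually just XOR across the corresponding bits of every data disk – which is both why one parity drive can rebuild any single failed disk, and why it must be at least as large as the biggest data disk (every position on every disk needs a parity bit). A toy sketch, with obviously made-up two-byte “disks”:

```python
from functools import reduce

# Three tiny "data disks"; the parity byte at each position is the XOR
# of that position across all disks (real arrays do this bit-by-bit).
disks = [b"\x12\x34", b"\xab\xcd", b"\x0f\xf0"]
parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*disks))

# If one disk dies, XOR-ing the survivors with parity rebuilds it.
survivors = disks[:1] + disks[2:]          # pretend disk 1 has failed
rebuilt = bytes(reduce(lambda a, b: a ^ b, col)
                for col in zip(parity, *survivors))
assert rebuilt == disks[1]                 # the lost data comes back
```

The same property means parity can be recalculated incrementally as disks are added, which is what makes the gradual migration plan below workable.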

I’ve already begun prepping the migration, removing one 4TB disk from the Windows Storage Spaces array on my machine. That will form the beginning of the array on the God Box, with parity being calculated as data is migrated onto it, which is more efficient (and safer) than migrating all the data first and then building parity. I have about 6TB of media, so I can migrate most of it to the new machine, pull another 4TB disk out of the old array and add it to the new one, finish the transfer, then move the remaining disks over.

All of this will be housed in a new enclosure – a Dark Base 700 from be quiet!, using fans and cooler from the same company. Other than Noctua these are some of the quietest fans on the market, and they look pretty cool too.

Having shamelessly stolen the idea from my colleague, I’m going with a white-on-black theme, with white components inside the case (where I can get them) and white cabling. be quiet! supply white versions of their fans, and a white cooling block for the CPU which should finish the look off quite nicely.

Lastly, horsepower. I was very tempted to go with an AMD Ryzen build, as they seem to provide more bang-for-buck. However I am wedded to Plex as a media player, and AMD support is poor with that product, so I’m sticking to Intel. A Core i9-10850K gives me 10 cores to play with, and an ASRock Z490 Taichi motherboard seemed to be a good pick to tie it all together.

be quiet! offer a ‘build your character’ page for their cases, so this is roughly how it should come out looking (albeit with more white highlights).

The components start arriving this weekend! The initial build will take a while (getting the cable management right is going to be a big consideration), and I think fettling UnRAID to get the basics working will take some time too.

However, once the base machine is up and running I look forward to migrating the data, then getting started on expanding the setup! There will be a lot to come, and I’m looking forward to sharing this project as it progresses.

2 Replies to “UnRAID and Virtualize All The Things”

  1. Hey Ed, glad you’re coming over to the Unraid family! I run a fairly large Unraid box (13TB currently over 6 disks, plus parity and cache) and it’s been great. I do run Plex as a docker and for my usage (1 stream, 1080p) it’s been absolutely solid. There is a project to get nVidia GPUs directly passed into the docker but I have an older nVidia card in there (for troubleshooting) which doesn’t support the API so I haven’t bothered. Moving from a dedicated install of Plex to the docker was a big hassle so I’m glad I, hopefully, never have to do that again and can just move the docker around to wherever I want it.

    I’m very happy with it – I run a Windows VM for the bits of software that need to be on a full OS (damned iCloud for example) and for when I want to jump into desktop Chrome from my phone and it’s been mostly fine. Stability in Win10 drops when you squeeze the RAM too much so I’ve had to give it a big chunk unfortunately. Fortunately RAM is cheap right now, although when I eventually need to upgrade the 4x8GBs it’s going to hurt.

    I have watched many Linus Tech Tips videos on running a gaming PC via Unraid but I haven’t gone down that route yet – I want my server far away from my office and don’t fancy running cables down half the house. Maybe I’ll run a second Unraid box one day but I don’t need it just yet.

    re disks, the cheapest way most people in the Unraid community get them is by “shucking” from external drives – especially Western Digital ones. In UK prices the 14TB units can be half or less than the OEM drives during sales – and 2/3rds even at normal pricing. With Black Friday coming up you may be able to get a 14TB for something like ~$190. Best Buy has one coming soon: https://www.reddit.com/r/DataHoarder/comments/jn73un/189_14tb_will_be_back_at_best_buy_starting_1122/ Just be aware sometimes you have to make a very minor mod to the power connector to get them to spin up outside the enclosure (the mod literally uses electrical tape, nothing fancy). Lots of detail on that sub-reddit.

    I’m already on my second server for Unraid, I did a trial on a cheap Optiplex workstation I got on eBay before building a proper box, and can’t see myself going back. Good luck with your adventures!

    1. Thanks Peter! I meant to reply to this sooner but the last week has been chaotic for a number of reasons, as well as getting UnRAID set up in the first place! 🙂
      I’ve read a bit about shucking and do have my eye on that 14TB deal, though now I’m out of the Windows Storage Spaces situation I may not need it, as I think I’ll have about 16TB or so free once all the disks are moved which should do me well for a while yet (famous last words).
