I do a little research at work using deep learning and also work with virtual/augmented/mixed reality. Typically I'm using Quadros for the DL work and 1070s or 1080s for VR/AR, but the Quadro GP/GV100s are cost-prohibitive for personal use, so 1080Ti it is.

There's not much bling in this build, but that's the look I wanted: unassuming yet able to stomp. The Lian Li PC-9NB ATX Mid Tower is huge inside and offers eight expansion slots out the rear (enough for four double-wide GPUs). It is about as low-profile as you can get with eight expansion slots, at just over 18 inches tall. It also allowed me to route the Corsair H60 radiator to the case exterior, which helps reduce the thermal load inside the case. And... it doesn't look like a spaceship.

The GA-X99P-SLI motherboard is a generation behind socket-wise (Broadwell vs Skylake) but supports four double-wide PCIe cards, which means I can stack up to four GPUs. To support that I went with the EVGA 1600 Gold PSU, as I have seen the 1080Ti spike at 350W, well above the advertised 250W power draw. Imagine four of them maxing out simultaneously: ~1400W! As a bonus, I did not have to do a BIOS upgrade to get Broadwell support on the GA-X99P-SLI.
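The PSU arithmetic is simple enough to sanity-check. A quick sketch (the 350W spike is my own measurement; the 250W figure is NVIDIA's advertised TDP):

```python
ADVERTISED_TDP_W = 250   # NVIDIA's spec for the 1080Ti
OBSERVED_SPIKE_W = 350   # transient draw I've measured under DL load
NUM_GPUS = 4

advertised = ADVERTISED_TDP_W * NUM_GPUS   # what the spec sheet suggests
worst_case = OBSERVED_SPIKE_W * NUM_GPUS   # what the PSU actually has to ride out

print(advertised)   # 1000
print(worst_case)   # 1400 -- hence a 1600W unit, not a 1000-1200W one
```

Sizing off the spec-sheet TDP alone would have suggested a much smaller supply; sizing off observed transients is why the 1600W unit made the cut.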

My NVIDIA SLI bridge shipped just the other evening; I'll use it when gaming. I originally bought a 2-slot bridge, but per Gigabyte's recommendation to place GPUs in PCIe slots 1 and 3 for a two-GPU configuration, the bridge needs to be the 4-slot variety. Should be here this week.

For the longest time the NVIDIA website was showing the 1080Ti as out of stock, but recently they've made some available. These consumer-grade cards are inexpensive compared to a commercial-grade GP100 and perform similarly from an images/sec perspective. However, they aren't qualified for the lengthy, sustained workloads that DL creates, so they may not last as long when used for that purpose. Neither the 1080Ti nor the GP100 gets near the images/sec performance of a GV100 with its tensor cores, but who can afford one of those for personal use?

32GB of RAM should suffice. DL doesn't require much system RAM, but typically you want about twice the GPU RAM: 22GB (2 x the 1080Ti's 11GB) plus ~10GB of overhead, so 32GB will do. I'll need to add two more 16GB sticks for four 1080Tis.
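One way to read that rule of thumb that matches both numbers in this build (32GB for two cards, 64GB for four) is "total GPU memory in the box, plus ~10GB of overhead per pair of cards." This is my own fit of the rule, not a hard requirement:

```python
GPU_RAM_GB = 11       # GTX 1080Ti
OVERHEAD_GB = 10      # OS + framework working set, rough estimate

def system_ram_gb(num_gpus):
    """Rule-of-thumb sizing: total GPU memory plus overhead per pair of cards."""
    return num_gpus * GPU_RAM_GB + OVERHEAD_GB * (num_gpus // 2)

print(system_ram_gb(2))   # 32 -> the 32GB installed now
print(system_ram_gb(4))   # 64 -> 32GB plus two more 16GB sticks
```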

The 12 hardware threads of the 6-core 6850K will be enough to keep the GPUs fed. Not a 22-core Broadwell Xeon, but good enough. Minimally you want two cores per GPU to keep them going. TensorFlow is the only DL framework I have seen that will utilize as many cores as you have in a system; Caffe, MXNet, etc. just use a couple of cores to feed images and sync weights.
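The reason cores matter at all is the input pipeline: CPU workers decode and augment images fast enough that each GPU's input queue never runs dry. A minimal producer/consumer sketch of that pattern using only the standard library (the names and the `b * 2` "augmentation" are illustrative, not any framework's real API):

```python
import queue
import threading

def cpu_worker(batches, out_q):
    """Stand-in for a CPU feeder thread: 'decode/augment' each batch, enqueue it."""
    for b in batches:
        out_q.put(b * 2)   # placeholder for JPEG decode + augmentation work

def feed_gpu(num_workers=2, num_batches=8):
    """Two CPU workers per GPU is the floor suggested above; the bounded
    queue models the GPU-side prefetch buffer."""
    q = queue.Queue(maxsize=4)
    # Split the batch indices across workers round-robin.
    work = [list(range(i, num_batches, num_workers)) for i in range(num_workers)]
    threads = [threading.Thread(target=cpu_worker, args=(w, q)) for w in work]
    for t in threads:
        t.start()
    consumed = [q.get() for _ in range(num_batches)]  # the "GPU" draining its queue
    for t in threads:
        t.join()
    return sorted(consumed)

print(feed_gpu())  # [0, 2, 4, 6, 8, 10, 12, 14]
```

If the workers can't keep the queue full, the GPU stalls; that's why frameworks that only use a couple of cores for feeding leave a 22-core Xeon mostly idle anyway.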

I've added a 512GB Samsung 960 EVO NVMe drive for boot and fast working space, and a 4TB HGST NAS drive for slower, longer-term storage, e.g. images/videos. Want to pick a mechanical drive that lasts? Have a look at the Backblaze data set, which is often used by machine-learning enthusiasts and researchers to predict hard-drive failures. They provide brand information, so you'll know which drives the large data centers use and how those drives perform.
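The headline number Backblaze reports per drive model is the annualized failure rate, computed as failures per drive-day scaled to a year. A quick sketch of that formula (the counts below are made-up placeholders, not real Backblaze figures):

```python
def annualized_failure_rate(failures, drive_days):
    """Backblaze-style AFR: failures per drive-day, scaled to a year, in percent."""
    return failures / drive_days * 365 * 100

# Hypothetical model: 60 failures observed over 4,000,000 drive-days.
print(round(annualized_failure_rate(60, 4_000_000), 2))  # 0.55
```

Comparing that rate across models (and eyeballing the drive-day counts so small samples don't fool you) is how you pick a mechanical drive with a track record.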

Liquid cooling on the CPU because I can, and I may look to liquid-cool the GTXs in the future, depending on whether or not I OC... and yes, it will see its fair share of games this holiday season.
