Description

It's a 3990X with 1080 Ti SLI. It does it all! Overall the build went smoothly, other than taking a couple of attempts at memory initialization for the first POST.

Some problems with the build:

- The S340 Elite, while a fantastic case, is just too small for this amount of heat. A 280mm cooler is technically enough for the CPU, but not with anything else in the case competing for that air.
- E-ATX: it REQUIRES you to remove the cable management shroud to fit the motherboard in. Even with as much room as I can get out of the case, it's still a very tight fit.
- Water cooling the CPU: other than an Enermax option, nothing really fits among the CLC options, and I don't want Enermax. Go check out Gamers Nexus' review (the second one) if you need any more information. It's gross.
- Cable management: while this case does have some space for it, a 24-pin, two 8-pin CPU, one 6-pin PCIe for the motherboard, four 8-pin PCIe for the GPUs, front panel, fans, SATA, etc. take up a lot of space, and it's easy to run out of room to shove stuff.

Parts on the way right now: an EK 360 custom soft-tubing loop for the CPU, 2x 2080 Ti, and a Lian Li O11 XL case. Plans for the future: two Corsair PCIe 4.0 SSDs, a custom loop for the graphics cards, and selling my second kidney.

Thanks for checking it out!


Comments

  • 1 month ago
  • 6 points

Is this insured?

  • 1 month ago
  • 5 points

OMG. The first 3990X build on the site. Congratulations and thanks for posting!

64 cores, Stellar MB, 64GB ram quad-channel, 2 GTX 1080 Ti GPUs, Platinum PS. What more could you ask for?

PCIe 4.0 storage prices should drop through 2020, so you might want to add one of those if you need more/faster storage.

Please describe what you are using this king of the castle beast for. Thumbs up from me, wishing I had 64. Enjoy!

  • 1 month ago
  • 2 points

Corsair PCIe 4.0 SSDs are in the next round for this (maybe a RAID 0 just 'cause). I have kind of an oddball use case: I basically wanted one computer that could handle 4 or 5 full-featured remote workstation/gaming machines for my GF and her coworkers, act as a storage server for their Apple stuff, run a home server with some fairly basic thin clients (for Windows 10 VMs through the house), AND handle VR gaming done directly from this machine in its hole in the basement. At the same time. Resource management for the workstations was becoming a problem with my older 1950X (never thought I'd say that), so this has allowed me to go completely off the deep end, with no real-world way to max out CPU resources. Case and graphics are in the works, since the S340 is severely limiting my cooling at this point. It's no joke: I can't keep this CPU busy with ANY of my real-world workloads, yet I'm having a hard time cooling it, so it works out. It's just insane. EDIT: Also Plex and its encoding. It really does everything I can think of throwing at it.

  • 21 days ago
  • 1 point

Speaking of Plex, how does it scale on the latest-gen Threadrippers? I have the first-gen 1950X. Due to the design flaw, Plex could never use more than 50% of the CPU.

  • 1 month ago
  • 4 points

What is "all" in your context?

I.e., what do you use it for?

  • 1 month ago
  • 2 points

It's mostly a home server, rendering VMs, gaming VMs (depending on what the rest of the house wants to do), VR, etc. It does a lot of different things at the same time. The limits have really shown up in graphics and memory capacity when the resources are spread out a lot, but nothing really pushes this CPU. The case is definitely becoming a problem, though, since I'm out of cooling options for Threadripper with the lower airflow. A case and custom CPU loop are more than likely coming next.

  • 1 month ago
  • 1 point

I would recommend the Corsair 1000D case, going with EK for cooling options. Plenty of room to work with and for water cooling.

  • 1 month ago
  • 1 point

I went with EK for my loops (I think I got the last 2 new reference PCB nickel/plex 2080 ti blocks), but I planned around a Lian Li O11 XL. The build is in process and should be up by tomorrow if everything goes as planned. Actually it's basically going to be a white 2 loop version of what Linus built at the Denver Microcenter.

  • 1 month ago
  • 3 points

9 pictures of that Threadripper? Pffft, it should be at least 18.

  • 1 month ago
  • 1 point

I appreciate you taking a look (or 9) at it! It's been fantastic so far. Definitely fits my oddball use case. Next will be 15 pictures of the IO panel on a bigger case. 🤣

  • 1 month ago
  • 1 point

Are you gonna keep the Corsair cooler, or are you going to change it? Because it looks like it's going to be an oven in that case. It looks very good, by the way.

  • 1 month ago
  • 1 point

I've got an EK custom loop and a Lian Li O11 Dynamic XL coming for it. I'll update when it's rebuilt. It is a bit of an oven right now.

  • 1 month ago
  • 2 points

When you can, try to find a HB SLI bridge as you're actually bottlenecked in SLI performance by using "non-qualified" regular SLI bridges.

  • 1 month ago
  • 1 point

I've seen that in benchmarks but haven't noticed anything in actual games. Plus, most of the time it's also running VMs that get the graphics cards split up between them. Graphics are actually the next thing to change on this one, but it's a 3990X in an S340 Elite; I've discovered I'm out of cooling, so the case needs to change first.

  • 1 month ago
  • 2 points

I'd recommend the O11 Dynamic XL. Ample room for lots of fans.

  • 1 month ago
  • 1 point

I've got one coming! Thanks for checking it out!

  • 1 month ago
  • 2 points

BUT CAN IT RUN CRYSIS?

  • 1 month ago
  • 3 points

It runs in compatibility mode, 😒 but yes!

  • 1 month ago
  • 1 point

yeah sure... if you use the threadripper with no graphics card, yes, it runs in compatibility mode.

  • 1 month ago
  • 2 points

I haven't gotten around to it yet, but I do want to try Crysis on CPU rendering only. It should be interesting. Probably not great, but interesting.

  • 1 month ago
  • 2 points

Only a 280mm radiator for a 3990X? You should at least get a 360mm radiator.

  • 1 month ago
  • 2 points

I should have this all under a custom loop with a thick 360 in the next couple weeks (hopefully). EK has a sweet looking block for it!

  • 1 month ago
  • 2 points

Congrats on being the first with Threadripper 3990X!

About the planned custom loop - you should do a dual loop, separate the CPU and GPU cooling, or at least have a radiator between the two flows, for best results. Use a clear liquid to avoid clogging in the tubes and blocks.

Cheers!

  • 1 month ago
  • 2 points

Thanks for checking it out! It will eventually end up with a dual-loop after the next graphics cards show up. I've planned it to be able to be fairly easy to work on without taking the loops apart in the case. That CPU justifies its own loop. It's thirsty!

  • 1 month ago
  • 2 points

I am really curious: what are the temps for this 64-core beast?

Do you think a huge air cooler like the Cooler Master Wraith Ripper would be able to keep this beast cool?

Or do I really have to go AIO?

I have a bad experience with an AIO that fried my PC because the pump failed.

  • 1 month ago
  • 1 point

The CPU gets to around 70C under full load, but not much gets it there. The main problem with an AIO is that the cold plate doesn't cover the whole heat spreader. I've been having more heat issues when it boosts one or two CCXs to 4.2 GHz and they aren't completely covered by the AIO; I've seen jumps to 95C in lightly threaded workloads. It gets pretty loud running 8 cores at 4.2 GHz. Combined with the 1080 Tis, it can't keep up unless everything is going full blast, and then it sounds like a server. For air coolers, the Wraith Ripper and the Noctua U14S TR4-SP3 (or the U12S TR4 if the 140mm doesn't fit) do look really good! Thanks to the complete IHS coverage, they've been performing as well as or better than the AIO. Check out comparisons on YouTube that fit your use case. I don't blame you for not wanting to go AIO; one nearly killed my last Threadripper build because Enermax couldn't keep it from growing stuff on the inside.

  • 1 month ago
  • 1 point

Thanks for the info.

Will you be going air cooler as well?

There is one possible cooler I might consider: the IceGiant ProSiphon Elite.

It's a huge tower cooler that is currently TR4-socket only but can cool up to 400W TDP.

A 7-year warranty is definitely hard to beat.

Nvm, you are going custom water loop.

  • 1 month ago
  • 1 point

Here is the product link for the thermosiphon:

https://www.icegiantcooling.com/prosiphonelite

  • 1 month ago
  • 1 point

I saw a video about that last month, and it looks like a good option if there's room for it and you aren't restricted in mounting direction or location. But since it has to be mounted vertically and takes up a lot of room, I decided to go with two custom loops for the CPU and GPUs. To make the thermosiphon most efficient, I'd have to have the case lying on its side. Plus, with a thermosiphon there is a minimum temperature it can possibly run at: it requires the liquid to boil to cool effectively, which means it cannot cool below the boiling point of the liquid it's running. If I've planned my loops right, I should be able to keep this much quieter than it is right now.
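To put a rough number on that floor: the coolant can't get colder than its boiling point at the loop's fill pressure. Here's a toy calculation using water's Antoine constants purely as an illustration; water is just a stand-in, since real thermosiphons use working fluids and fill pressures tuned for a much lower boiling point.

```python
import math

# Antoine constants for water, valid roughly 1-100 C
# (log10 of vapor pressure in mmHg, temperature in C).
A, B, C = 8.07131, 1730.63, 233.426

def boiling_point_c(pressure_mmhg):
    """Temperature at which water boils at the given pressure."""
    return B / (A - math.log10(pressure_mmhg)) - C

# At atmospheric pressure the floor would be ~100 C...
print(round(boiling_point_c(760), 1))  # ~100.0
# ...which is why a sealed thermosiphon runs at reduced pressure
# (or uses a lower-boiling fluid) to pull the floor down.
print(round(boiling_point_c(100), 1))  # ~51.6
```

The point is just that lowering the fill pressure (or picking a different fluid) is what sets how cold the loop can possibly run.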

  • 1 month ago
  • 2 points

The fact that you used two 1080tis instead of that trash 2080ti earns you all of the street cred!

  • 1 month ago
  • 1 point

10 grand for a pc? How are you this rich?

  • 1 month ago
  • 1 point

The 1080 Ti SLI might be a bottleneck for the 3990X.

  • 1 month ago
  • 1 point

Great build! I'm actually planning to do this exact same thing for my house. Me, my girlfriend, and my roommate all have computers, and I think it would be better, electricity-wise, to just have a server and consolidate all of the computers into virtual machines. So I have a few questions about this build:

1- Why use Windows 10 Pro and not Windows Enterprise? I've been reading about the new processor, and Windows 10 Pro actually can't make use of all of the threads it has. Windows Enterprise is able to handle all 128 threads, so I'm curious what you did to circumvent this problem, or if it is even a problem.

2- How is gaming performance through the virtual machines? A lot of the people I've asked about this have said there would be a ton of lag and it wouldn't be nearly as good as gaming on a dedicated, physical device.

3- How did you split the GPUs up among the virtual machines you're using? Is that a specific setting in Hyper-V, or are you using a different virtualization platform?

Thank you very much for doing something like this and posting the completed build.

  • 1 month ago
  • 2 points

I've been getting around the thread limit for now by disabling SMT. It's not ideal, but it will do until I get Enterprise, or until I finally assimilate and learn Linux, and stop Windows from lumping the threads into 64-thread processor groups.
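For anyone curious about the grouping: Windows caps each processor group at 64 logical CPUs, and a process is stuck in one group unless the app is group-aware, which is why 128 threads get split and SMT off brings it back to a single group. A quick Python sketch to check the layout on your own box (the group query is the documented Win32 call and only exists on Windows, so it falls back to one group elsewhere):

```python
import os
import sys

def logical_cpu_layout():
    """Return (logical CPU count, Windows processor-group count).

    Windows caps each processor group at 64 logical CPUs, so a
    3990X with SMT on (128 threads) lands in two groups and
    non-group-aware apps only see half the chip.
    """
    cpus = os.cpu_count() or 1
    groups = 1
    if sys.platform == "win32":
        import ctypes
        # Documented Win32 API: number of active processor groups.
        groups = int(ctypes.windll.kernel32.GetActiveProcessorGroupCount())
    return cpus, groups

cpus, groups = logical_cpu_layout()
print(f"{cpus} logical CPUs across {groups} processor group(s)")
```

On the 3990X with SMT enabled this should report two groups; with SMT disabled, one.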

Gaming through the VMs isn't terrible as long as it stays on the local network. Honestly, Steam In-Home Streaming works just as well without running through a VM. It's just easier to get emulators running on a VM to play some older console games on the TV without having a dedicated HTPC; just a NUC with decent networking stuck to the back of the TV is fine.

I haven't played around with splitting the GPUs too much yet, since they're changing to 2080 Tis and I'm not sure how much that will break everything. As of right now, any VM has full access to both GPUs and I've been letting scheduling figure it out. It's not ideal, but it's good enough for 3 people if they aren't all hammering it simultaneously; the GPU workloads don't usually come more than one job at a time. I'm very new to getting this much stuff split up and still usable, so I've been trying to figure out how to do it efficiently. I am curious about Nvidia's vGPU and carving the cards up into more than 2 units, but I haven't tried it out.

I'm going to be putting up a new build with the new hardware, case, and cooling once it's leak tested and finished, and I hope to have time to play with the virtualization then!

  • 1 month ago
  • 1 point

Thanks for replying! I figured it would be something around disabling SMT in Windows.

I've also read about Nvidia's vGPU, but apparently it's only supported on server graphics cards like the Teslas, though you could probably do some jank configuring to get it to work. I'm no expert on the Nvidia side, but it seems really cool and would really help out with a project like this.

I've been looking into thin clients that could be useful, and NUCs and Wyse terminals are a lot more expensive than I was giving them credit for. I'm probably going to end up using Raspberry Pis with some sort of terminal OS on them, like WTWare, for the clients to connect to the server. On the server I'm also probably going to use some sort of Linux server distro and run KVM for the virtual machines.

That way I can trick the VMs into thinking they're running on bare metal and not inside a hypervisor. I'm kind of just floating all of the ideas around and haven't started purchasing any hardware yet. It's really cool to see someone else have my idea and execute on it, though! I'm also really excited to see the new build you're doing.

  • 1 month ago
  • 2 points

That sounds like exactly where I'm at. Even 3 years ago, I hadn't imagined having 64 cores and 128 threads to play around with on the HEDT platform. It's insane. Coming from a 1950X (which really seems poorly optimized now next to Zen 2), I can crank away on the 2080 Tis since that kind of workload is very mature, but the CPU is tricky to push within a Windows environment. I expected to have to expand my knowledge of virtualization and load balancing in this environment, but the scaling, coming from a 16-core/32-thread CPU, is interesting with this one.

  • 1 month ago
  • 1 point

Holy...

  • 1 month ago
  • 1 point

Amazing build. Any reason you didn't go for a Titan RTX instead of two 1080 Tis or 2080 Tis?

  • 1 month ago
  • 1 point

I found 2 2080 Tis for $1200 and couldn't turn that down. The 1080 Tis in this build were something similar for the pair. The Titans are double that for one card.

  • 1 month ago
  • 1 point

$1200 each or $1200 for both of them?

  • 1 month ago
  • 1 point

$1200 for both. It was less than I paid for the 1080 Tis.

  • 1 month ago
  • 1 point

Oh wow that makes a lot more sense. Thanks!

  • 8 days ago
  • 1 point

What is this for?

  • 1 month ago
  • 0 points

I actually have a 3900X as well; it's rocking a 6-monitor setup LOL, and I find it amazing.