AMD's consumer CPUs support ECC memory, whereas Intel's don't, which you might want for scientific computing, especially if you're doing something mission-critical. Apart from that, AMD and Intel consumer CPUs support the same RAM (DDR4 unbuffered), so it should be equally easy to find RAM for them.
"Scientific computing" is a vague term but most likely it means you're doing a lot of parallel processing. Currently AMD still has better general multithreaded performance at a given price; for example, the 2700 is cheaper than the 9600k but has 8 cores/16 threads vs. 6 cores/threads (Intel removed hyperthreading from the 9-series). It does depend on what exactly you're running, though; a software package that's been specifically optimized for Intel-specific instructions might still perform better on the Intel.
The 2990WX and 2950X are both NUMA. The difference in per-thread speed is largely due to the clock speed difference (presumably for thermal reasons); the funky memory topology of the 2990WX is only part of it.
Depending on your workload you might consider Epyc, which is more expensive than Threadripper but has larger caches and fully-enabled dies with memory channels on every die. That could help if your performance is constrained by memory bandwidth.
Why is a MicroSD plugged into your graphics card though?
If you aren't overclocking, it sounds like you have a bad CPU. It's hard to find information about this exact message, but it sounds like one of the CPU caches' ECC detected and corrected an error.
More info on the fields: https://msdn.microsoft.com/en-us/office/ff559554(v=vs.90)
If you get the machine check exception error code, you can try decoding it to see exactly what component is giving it: http://www.mcelog.org/
There have been Thunderbolt monitors on the market for years. https://www.amazon.com/slp/thunderbolt-monitors/m3bmocaa4gt54jk
More expandability: 64 PCIe lanes, quad-channel RAM, and higher maximum RAM capacity (most boards have 8 slots).
Random chance? 25% of copies will be at the 25th percentile or worse, by definition.
Then I got bored and started memorizing something else.
Some of these are kind of useful, especially if you're a programmer.
Not true anymore. The Intel Optane 900P/905P is significantly faster for everything except sequential writes (where it roughly matches the best Samsungs). Of course, it's much more expensive, and is generally overkill, so it makes sense to get only if you specifically need its performance or have cash to burn.
AMD has information on BIOS updates for previous-generation chipsets: https://support.amd.com/en-us/kb-articles/Pages/2Gen-Ryzen-AM4-System-Bootup.aspx
That sounds reasonable. A related option to consider is ZFS with parity, which provides several other benefits, like end-to-end checksums and ZFS send for backups or migrations. Debian doesn't ship ZFS by default, but it's probably not hard to set up. It's supposed to have a reasonably user-friendly command-line UI, and if you use the additional features (e.g. snapshots) regularly you probably won't forget how to use it.
Double parity (RAID-6 or RAID-Z2) is recommended nowadays due to the higher chance of a second drive failure while rebuilding the array when you have large drives.
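For reference, creating a double-parity (RAID-Z2) pool is basically a one-liner. A sketch with made-up device and pool names — on a real system, use stable /dev/disk/by-id paths:

```shell
# Six-disk RAID-Z2 pool: any two drives can fail without data loss.
# Pool name "tank" and all device names below are placeholders.
zpool create tank raidz2 \
    /dev/disk/by-id/ata-DISK1 /dev/disk/by-id/ata-DISK2 \
    /dev/disk/by-id/ata-DISK3 /dev/disk/by-id/ata-DISK4 \
    /dev/disk/by-id/ata-DISK5 /dev/disk/by-id/ata-DISK6
zfs set compression=lz4 tank   # cheap and generally worth enabling
zpool status tank              # verify the layout
```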
I have this board. The X399 Taichi has two USB 3.0 headers. See page 5 in the user manual.
They used to be the future of storage. They offered excellent long-term data retention for backups and archival storage. The next-generation holographic discs under development promised a then-enormous capacity of 1 TB.
Then everyone seemingly stopped developing new optical disc technology. The largest commercial optical format (Blu-ray) is now too small for backups, since hard drives and even SSDs come in multi-terabyte capacities. Optical discs are still used for read-only distribution of mastered media, but are otherwise pretty much obsolete.
Also, tape drives were once a common backup medium, but for consumer use they are dead. They're still used in large enterprise installations; otherwise the fixed cost of a tape drive is prohibitive. (Tapes are only somewhat cheaper than hard drives per byte, and the drives cost thousands of dollars, so hard disks are much cheaper unless you're storing an enormous amount of data.)
You can actually do this.
x86 processor that goes into a PCIe slot.
At first I thought this would be a question about SATA cable routing.
Vanguard Total World Stock Market Index.
One other option if you happen to own an enterprise server: these may (usually?) include this feature as part of their remote management system. I know Dell supports this as part of their iDRAC.
The best >=1TB SSD on the market?
Intel Optane DC P4800X 1.5 TB
The drive is available in two form factors: a PCIe add-in card or a 2.5" U.2 drive. Presumably you'd want the PCIe card if you have a free slot, since most motherboards don't have a U.2 connector.
These drives have vastly better read latency and write endurance than any flash SSD, and roughly match the throughput of the best flash SSDs. They are rated for an uncorrectable error rate two orders of magnitude lower than a typical consumer drive, and include some enterprise reliability features like power-loss protection.
On one site offering it for sale, it costs $6,848.99.
If you didn't actually want the best SSD on the market, you could take a look at the consumer Optane drives (900P or 905P), which sell for less-absurd but still very high prices. Even those drives are probably overkill for almost all users. Evaluate how much performance you really need.
Maybe not, but boot times hardly matter. The benefit of faster storage is system responsiveness once the system is booted, especially if you have little RAM (for read cache) or a write-heavy workload.
A small case doesn't necessarily preclude it. There should be small cases with a single 5.25" bay. If you buy thin 2.5" drives, you could get a 6x2.5" or 8x2.5" adapter in the bay.
Windows XP can't even use more than 4 GB of RAM. Unless you're talking about the "x64 Edition", which is a rebranded Server 2003 rather than real XP.
Also consider 2x4GB RAM to get dual-channel.
Interestingly, Intel does something similar by including workable, if mediocre, integrated graphics in most (all?) of their consumer CPUs. AMD instead bundles better (decently good) stock coolers, and includes integrated graphics only with some of their CPUs.
I don't get it. While your computer is shut down for an offline hardware clone of a disk, its performance drops to zero. Why is the tiny amount of CPU usage from cloning the disk while the system is running a concern?
You're probably right; my concern was more that it's not the typical use case for enterprise hot-swap (replacing failed drives) so I wasn't sure whether the drive-side connector is robust enough.
Online software cloning is obviously going to be slightly slower, but the performance doesn't matter as much since you can keep working while it happens.
I'm not sure it actually gains you any real-world speed unless you have a high-end drive with a lot of DRAM and capacitors, because the OS already caches in main memory (assuming you have more free main memory than the drive has cache).
Basically I suspect the cache is mainly to make the drive look better in benchmarks.
This may not be true if you're running out of main memory. But in that case, it would be more effective to upgrade the memory rather than the drive.
That might not be a good idea. Internal connectors are not designed for frequent insertion and removal and may wear out. From what I've read, SATA connectors are only required to be rated for 50 mating cycles. The connector in a good-quality hot-swap bay should be more robust, but you may still wear out the connector on the drive.
I'd recommend looking into backing up using online snapshots. It's much more convenient; you won't even have to reboot. On Linux you can do this with LVM, Btrfs, or ZFS; I imagine recent Windows versions have something similar.
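On LVM, for example, a backup from an online snapshot looks roughly like this (volume group, LV names, paths, and sizes are all placeholder assumptions — adjust for your setup):

```shell
# Take a point-in-time snapshot of vg0/root (all names are placeholders).
# The -L size only needs to hold changes written while the snapshot exists.
lvcreate --snapshot --name root-snap -L 10G vg0/root
mount -o ro /dev/vg0/root-snap /mnt/snap
rsync -a /mnt/snap/ /backup/root/   # back up the frozen view
umount /mnt/snap
lvremove -y vg0/root-snap           # discard the snapshot when done
```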
With the computer opened up, we get to see the plumbing. If it crashes, make sure to flush the caches and take a core dump.
Since all other components depend on it, a power supply is the number two most important part to get for your computer. That power supply looks kind of crappy. Since it is located above the cooler, when it craps out, the **** will really hit the fan.
(to be clear, I don't actually see anything wrong with the PSU. Last paragraph is just for pun.)
I have the same issue with the ridiculously bright case LEDs. I was thinking of wiring a resistor into the circuit to reduce the brightness.
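Sizing the resistor is just Ohm's law: R = (V_supply − V_LED) / I_target. A quick sketch — the supply voltage, LED forward drop, and target current below are assumptions, so measure your actual circuit (which may already include a series resistor) before soldering anything:

```shell
# Ohm's law: R = (V_supply - V_led) / I_target
# All values are assumed examples -- check your actual LED circuit.
V_SUPPLY=5      # volts: assumed LED header voltage
V_LED=3.2       # volts: assumed forward drop of the LED
I_TARGET=0.005  # amps (5 mA) for a dimmer glow
awk -v vs="$V_SUPPLY" -v vl="$V_LED" -v i="$I_TARGET" \
    'BEGIN { printf "%.0f ohms\n", (vs - vl) / i }'
```

With these example numbers it works out to 360 ohms; a lower target current means a larger resistor and a dimmer LED.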
Are you using Numpy on Windows? Apparently Windows's scheduler sucks for NUMA systems like Threadripper. Since Numpy is memory-bandwidth-intensive, it would perform poorly on Windows, but on Linux it should be fine.
Here's mine (semi-WIP): https://pcpartpicker.com/b/LW6scf
Currently it only has a wimpy GPU I got for free, but I'm planning on upgrading it later. The ASRock Taichi can support 4 graphics cards in x16/x8/x16/x8, or in principle even more with adapter cables since it supports PCIe bifurcation.
Did all the cable ties and stuff come with the case?
I can confirm. Adding unicorn barf makes fans spin faster as they try to flee, screaming in terror.
To elaborate, Ryzen supports ECC RAM and has more cores.
If you get a case with 5.25" external bays, you can buy hot-swap adapters for multiple 3.5" or 2.5" drives. They'll be accessible from the front, without having to open the chassis. I see you have one inside the case somehow; how did you mount it?
That said, what are you using hot-swap for if you have only 1 drive?
Almost certainly not. Unless you're already maxed out, or have very specific computational requirements, spending the money on more RAM capacity will improve performance more.
True, the average gamer won't get tripped up. In my case, I was looking for ECC RAM for a Ryzen workstation build, and it was the opposite.
ECC UDIMMs are somewhat hard to find. Presumably due to the limited selection at retail, RDIMMs are cheaper than UDIMMs and are available in higher speeds (I see 2933 RDIMMs available, whereas nobody sells UDIMMs faster than 2666).
I would have been tripped up myself if I hadn't read about the differences.
OP plans to get 2080 Ti in a month or two.
Threadripper is quad-channel; you should get 4 RAM sticks (unless planning on upgrading later).
I get your point, and you're right about this kit, but DDR4 is not all the same. Stick a DDR4 RDIMM or LRDIMM on a Z370 motherboard and it won't work.
Unfortunately there's no standard interface to communicate this information to the computer. Some of Corsair's higher-end units include a proprietary "Corsair Link" interface.
To get this information otherwise (or if you don't trust the Corsair PSU's self-reported values), you need expensive equipment. The PSU review site JonnyGuru has details: http://www.jonnyguru.com/modules.php?name=Testing_Methodology
Specific example products to illustrate what I mean:
I plugged it in and it worked?
More info: https://pcpartpicker.com/forums/topic/294234-optane-acceleration#cx2981462
If you want to set up caching, you'll have to use separate software. Options include lvmcache and bcache (on Linux), and AMD's StoreMI (on Windows).
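With lvmcache, for instance, attaching an SSD as a cache to an existing logical volume looks roughly like this on newer LVM versions (older ones use --cachepool instead of --cachevol); all VG/LV/device names here are placeholders:

```shell
# Assumes an existing volume group "vg0" containing the slow LV "data",
# with the fast SSD at /dev/nvme0n1. All names are placeholder assumptions.
vgextend vg0 /dev/nvme0n1                       # add the SSD to the VG
lvcreate -n cache0 -L 100G vg0 /dev/nvme0n1     # carve out a cache volume on it
lvconvert --type cache --cachevol vg0/cache0 vg0/data
```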
FYI, Optane drives are standard SSDs and they work fine with AMD processors. I'm using two of them on my Threadripper build. The compatibility limitations are restrictions of Intel's caching software, and not a hardware incompatibility.
PCIe passthrough reportedly had problems on Threadripper in the recent past. I'm not sure whether they're fully resolved; you may have to upgrade to a newer kernel.
Just a measly $20,000?
Even if you use less than 8 GB, the remaining RAM will speed up your computer by being used as a read cache for your disks.
Some good suggestions. I'd add more RAM though. 3D animation should be able to gobble up RAM, so you could go whole hog and get 128 GB.
Why 2 SSDs? At this price point, just get the Optane 905p.
I'd also consider adding storage redundancy (e.g. 6x HDDs in a double-parity array). Or go all-SSD if you want to blow through the budget.
Be aware that unless your workload has specific requirements and can make use of large amounts of expensive hardware, you're hitting diminishing returns with a $9000 computer. For example, machine learning or crypto mining can use 10x high-end GPUs, but if you're only gaming the extra 9 GPUs will do little more than increase your computer's effectiveness as a space heater. I guess you can spend practically unlimited amounts of money on switching all the storage to Optanes.