
CPU Advice Needed

Carrier_Taihou
  • 3 months ago

Pretty simple question this time around.

Boxing Day saw the Ryzen 7 3800X at $20 more than the 3700X. Due to an unforeseen bill, however, I wasn't able to purchase the CPU, and it has since gone back up to well over $500. With a number of retailers still listing it at around $460 CAD, I should be able to get my local shop to price match.

On the off chance that they refuse to price match, is it worthwhile to just spend the same money on a 3700X, or should I hold off until the 3800X or better goes on sale?

Comments

  • 3 months ago
  • 2 points

The 3800X isn't worth the extra. Even on one of the early BIOS versions from before AMD increased voltage to the maximum safe level across the board, there was very little difference.

After that voltage change, they're the same within margin of error.

  • 3 months ago
  • 1 point

Pretty safe to default to the 3700X then. Alternatively, would it be worth the $120 price difference to jump to a 3900X, which is on sale now?

  • 3 months ago
  • 2 points

The 3900X might be worth it depending on your workloads: you double the number of preferred cores, so Windows treats it like a 4c/24t part (each CCX gets a preferred-core pairing with Windows) instead of the 2c/16t of the 3700X/3800X.

If your uses don't really scale past the two-preferred-core scenario (primarily gaming), the uplift is marginal.
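
If you want to sanity-check which cores get favoured on your own chip, here's a rough Python sketch (it assumes the third-party psutil package, and per-core clock reporting varies by OS/driver, so treat the output as indicative only):

```python
# Rough sketch: print per-core clocks a few times to spot the preferred cores.
# Assumes the third-party psutil package (pip install psutil); per-core
# frequency reporting varies by OS/driver, so treat the output as indicative.
import time

import psutil

for _ in range(10):
    freqs = psutil.cpu_freq(percpu=True) or []
    print("  ".join(f"c{i}: {f.current:4.0f} MHz" for i, f in enumerate(freqs)))
    time.sleep(1.0)
```

Run a lightly threaded load alongside it and the same handful of cores should keep clocking ahead of the rest.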

  • 3 months ago
  • 1 point

The upgrade itself was prompted by my 6700K really struggling during streaming, especially with AAA titles. If I'd see a marked improvement by switching to the 3900X over the 3700X I could swing it, but if it's just going to be bigger numbers for the sake of bigger numbers I'll probably go with the cheaper option.

  • 3 months ago
  • 2 points

If you're primarily doing the upgrade for streaming, the 3700X/3800X are not an upgrade, and are often a downgrade, from a 2700X because of core priority.

The older model does have to deal with cross-CCX communication latency, but it can actively use all of its cores instead of having workloads pushed onto selected cores.
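
If it comes to it, you can also partially work around core priority by pinning the encoder yourself; a rough sketch below (assumes the third-party psutil package; the process name and core list are placeholders for your own setup):

```python
# Hedged sketch: pin a streaming encoder to a fixed set of cores so it
# isn't fighting the game for the scheduler's preferred cores.
# Requires the third-party psutil package; "obs64.exe" and the core list
# are placeholders - adjust for your own process and CPU topology.
import psutil

ENCODER_NAME = "obs64.exe"      # assumed process name
ENCODER_CORES = [4, 5, 6, 7]    # assumed: one CCX's worth of cores

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == ENCODER_NAME:
        proc.cpu_affinity(ENCODER_CORES)  # restrict to the chosen cores
        print(f"pinned PID {proc.pid} to cores {ENCODER_CORES}")
```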

3900X becomes a little more debatable.

https://ca.pcpartpicker.com/products/compare/bddxFT,qryV3C,QKJtt6/

Right there you can save $150 to put into better RAM, a graphics card, or more SSD storage.

  • 3 months ago
  • 1 point

Is the 2700X capable of running AAA titles at 1080p 144Hz?

The biggest reason I didn't pick one up through a friend in the US was that most of the reading I've done points to them not being able to sustain higher FPS in demanding titles.

Thankfully there's not really much else I need to upgrade. The rest of the system has 32GB of 3200MHz RAM, a 2080 Super, and 3.5TB of SSD storage.

  • 3 months ago
  • 1 point

Not worth any extra money really.

https://cpu.userbenchmark.com/Compare/AMD-Ryzen-7-3800X-vs-AMD-Ryzen-7-3700X/4047vs4043

Overclock the 3700X to 3800X clocks and they're the same performance.

  • 3 months ago
  • 1 point

I agree with Gilroar + Zerk... The 3800X is pointless (not sure why AMD added this one to an already spot-on 3rd-gen Ryzen stack). I guess for $10-$20 on top, a small increase in performance would be nice, but anything above that sort of premium leans more towards enthusiasm than spend-worthy performance returns.

Potential board-maker and AMD driver or firmware I/O updates did initially open up some optimism about getting more out of the higher-binned 3800X, but 5-6 months on there's nothing worthy of note to report, other than a select number of buyers reporting some favour with a 2-3% performance margin (with expensive coolers/boards or a silicon-lottery win). Keep in mind the 3800X's default TDP is rated at 105w, whereas the more efficient 65w 3700X delivers pretty much on-par performance - "efficiency" alone is an excellent selling point for the 3700X as opposed to a higher-binned, power-munching, and poorly scoring 3800X.

  • 2 months ago
  • 2 points

Jesus Christ, idk why I just spent like 30 minutes reading all of this. Longest debate on PCPP? +1

  • 3 months ago
  • 0 points

Keep in mind the 3800X's default TDP is rated at 105w, whereas the more efficient 65w 3700X delivers pretty much on-par performance - "efficiency" alone is an excellent selling point for the 3700X as opposed to a higher-binned, power-munching, and poorly scoring 3800X.

TDP isn't power efficiency or power usage; it's Thermal Design Power, the thermal output at AMD's specifications for that model.

And that number isn't fixed any longer and varies depending on the model, the platform, and what AMD chooses for it to reference, so it can be anything from flat stock-clock thermal output, to full boost-clock thermal output, or be completely unrelated to the CPU and only a reference to the cooler's thermal capacity.

Let alone that the figure was set before AMD revamped the voltage profiles for the entire line to better fit the listed boost and stock clocks; now that they are all running 1.3v-1.4v at stock clocks, the thermal output and power draw across the line have increased.
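
For reference, AMD has described desktop TDP as a thermal calculation, roughly TDP = (tCase - tAmbient) / θca, where θca is the thermal resistance of the assumed cooler. A quick back-of-envelope in Python (the temperature and resistance figures below are illustrative, not official spec-sheet values):

```python
# Hedged sketch of the thermally derived TDP formula AMD has described:
#   TDP (W) = (tCase - tAmbient) / theta_ca
# theta_ca is the assumed cooler's thermal resistance in degrees C per watt.
# The numbers below are illustrative, not official spec-sheet values.

def tdp_watts(t_case_c: float, t_ambient_c: float, theta_ca: float) -> float:
    return (t_case_c - t_ambient_c) / theta_ca

# Same die temperatures, different assumed cooler resistance:
print(f"65w-class:  {tdp_watts(61.8, 42.0, 0.303):.0f} W")   # ~65
print(f"105w-class: {tdp_watts(61.8, 42.0, 0.189):.0f} W")   # ~105
```

Note that θca is a property of the assumed cooler, not the silicon, which is exactly why the same die can carry a 65w or 105w label depending on what gets plugged into the formula.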

  • 2 months ago
  • 2 points

TDP isn't power efficiency

No one is suggesting TDP is a measure of "power efficiency"

TDP isn't ...... usage

No one is suggesting TDP is a measure of power "usage"

And that number isn't fixed any longer and varies depending on.......

The "rated" TDP was never fixed in the first place nor intended to fit a predetermined constant. It just doesn't work that way. Thermal throttled limitations with a number of variable factors can achieve the result closer to home but that's not the intended goal with a dense and split DIE double threaded CPU. What we do know is the 3700X comfortably hits it's rated base-clock frequency with a 65w TDP cooler.

Let alone that the figure was set before AMD revamped the voltage profiles for the entire line to better fit the listed boost and stock clocks; now that they are all running 1.3v-1.4v at stock clocks, the thermal output and power draw across the line have increased.

The 3700X was already boosting up with a 1.3v-1.4v power drive. The vehemently "opportunistic" voltage profile was always apparent from the get-go. Later optimisations at the code level (firmware/BIOS) maintain pretty much the same opportunistic approach with a fine-tuned 100-200MHz improvement. As we know very well, the "AUTO" CPU_CORE voltage preset has always been set high throughout the 1st/2nd and now 3rd-gen platforms - something we can easily offset/fine-tune to reduce the thermal output. This can be achieved more effectively with the 3700X whilst maintaining stability and stock performance under load.

Bottom line: at the sweeping consumer level, we don't need to introduce the complex design framework of TDP and everything it defines. Where it can help the ordinary consumer is in forming "some idea" of which CPU runs hotter, or consumes more power to achieve its rated stock/boost clocks, which is where TDP can be used as an "indicator" rather than a definitive source. When comparing the 3700X and the higher-binned 3800X, initial power-consumption testing with the stock cooler saw the 3800X demanding 12% more power, which is where "efficiency" comes into the equation. The 3800X demands a cooler with a higher TDP rating to comfortably manage its 3.9GHz base clock, and where that's achievable on the bundled stock cooler, the temps will be the least favourable.
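
To put the efficiency point into plain numbers, a quick sketch (the 12% power delta is the figure above; the near-identical performance is the assumption):

```python
# Toy perf-per-watt comparison. Performance parity and the baseline wattage
# are assumptions; the +12% power delta is the figure quoted above.
perf_3700x, perf_3800x = 100.0, 101.0   # assumed: within margin of error
power_3700x = 90.0                      # illustrative package watts
power_3800x = power_3700x * 1.12        # +12%, per the stock-cooler testing

for name, perf, watts in (("3700X", perf_3700x, power_3700x),
                          ("3800X", perf_3800x, power_3800x)):
    print(f"{name}: {perf / watts:.2f} perf/W")
```

Roughly 10% worse performance-per-watt for, at best, a point of extra performance.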

  • 2 months ago
  • 0 points

No one is suggesting TDP is a measure of "power efficiency"

You did above as referenced.

What we do know is the 3700X comfortably hits its rated base-clock frequency with a 65w TDP cooler.

3700X comes with the same 135w Wraith Prism as the 3800X.

The 3700X was already boosting up with a 1.3v-1.4v power drive.

No, that was already proven to be observer effect: software can only poll the CPU every 15-100ms while the voltage is adjusted every 1-5ms, so software readings will never be correct on Ryzen for that reason.
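
As a toy illustration of the sampling problem (a pure simulation with made-up voltages, not real telemetry): if every poll wakes the core and holds it boosted for a while, the poller only ever samples boosted values even though the true average sits far lower.

```python
import random

# Toy simulation of the observer effect described above: firmware adjusts
# voltage every few ms, but each software poll wakes the core and holds it
# boosted briefly, so the poller only ever samples boosted values.
IDLE_V, BOOST_V = 1.0, 1.45   # illustrative voltages, not measurements
BOOST_HOLD_MS = 20            # assumed: how long a poll keeps the core boosted

samples, true_trace = [], []
boosted_until, next_poll = -1, 0

for t_ms in range(10_000):
    if t_ms == next_poll:
        boosted_until = t_ms + BOOST_HOLD_MS        # the poll itself boosts
        samples.append(BOOST_V)                     # ...and reads that boost
        next_poll = t_ms + random.randint(15, 100)  # next poll in 15-100 ms
    true_trace.append(BOOST_V if t_ms < boosted_until else IDLE_V)

print(f"true average:   {sum(true_trace) / len(true_trace):.2f} V")
print(f"polled average: {sum(samples) / len(samples):.2f} V")
```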

The original voltage profile ran at a 1.1v average and has since been increased to a 1.3-1.35v average depending on the model, according to AMD, to solve models not hitting their boost and stock clock speeds even with extreme cooling.

Now you're still using "TDP" as a measurable selling point even after pointing out that it doesn't apply the way you're stating it does.

  • 2 months ago
  • 1 point

Sounds like we're hitting a tit-for-tat curveball with some elements of selective reading.

The efficiency reference is purely indicative, not a definition - oddly enough, explained previously, but..... oh well!

The 65w reference is purely based on base-clock thermal "ability", not on limiting or sacrificing the Wraith Prism's potential for a higher scaled thermal output. Not sure why the 135w reference was made unless you're speculating within selective means.

The original voltage profile ran at a 1.1v average and has since been increased to a 1.3-1.35v average depending on the model, according to AMD, to solve models not hitting their boost and stock clock speeds even with extreme cooling.

Can you provide a legitimate source which establishes that 1.1v average across the board prior to the later AGESA/BIOS patches? As far as I'm concerned, the earlier adoption had me tinkering with voltage offsets to slash that volatile 1.4v+ brawl. I hope the reference provided is "real-world" materialised outcomes as opposed to poorly addressed firmware/software "would have beens".

  • 2 months ago
  • 1 point

As far as I'm concerned, the earlier adoption had me tinkering with voltage offsets to slash that volatile 1.4v+ brawl. I hope the reference provided is "real-world" materialised outcomes as opposed to poorly addressed firmware/software "would have beens".

That's the observer effect causing the wrong readings on Ryzen, and, in a nutshell, why those readings aren't valid with how boost clocks now work.

We have determined that many popular monitoring tools are quite aggressive in how they monitor the behavior of a core. Some of them wake every core in the system for 20ms, and do this as often as every 200ms. From the perspective of the processor firmware, this is interpreted as a workload that's asking for sustained performance from the core(s). The firmware is designed to respond to such a pattern by boosting: higher clocks, higher voltages.

https://www.reddit.com/r/Amd/comments/cbls9g/the_final_word_on_idle_voltages_for_3rd_gen_ryzen/

When the software polls for the current voltage, it wakes all the cores and ramps them up, causing the very high voltages it then reads.

Current desktop load, without running anything major, should sit in the 1.2v range with normal background tasking after the change.

Example test cases might include: video playback, game launchers, monitoring utilities, and peripheral utilities. These cases tend to make regular requests for a higher boost state, but their intermittent nature would fall below the threshold of the activity filter.

Net-net, we expect you’ll see lower desktop voltages, around 1.2V, for the core(s) actively handling such tasks. We believe this solution will be even more effective than the July changes for an even wider range of applications.

https://community.amd.com/community/gaming/blog/2019/09/10/ryzen-community-update-bios-updates-for-boost-and-idle-plus-a-new-sdk

These are all changes made after the release benchmarks, rendering the older power/thermal numbers void.

The same effect and issue shows up on any Intel CPU that uses dynamic boost clocks as well, so it isn't an AMD-only issue.

