
NVIDIA Ampere Speculations

XXX_BillTube_XXX
  • 1 month ago

From what is known, NVIDIA Ampere will have six GPUs: GA100, GA102, GA103, GA104, GA106, and GA107. Obviously, the GA100, equipped with HBM2 memory, will be the high-end Ampere GPU, probably used for a Titan and a high-end Quadro. The GA102 to GA107 GPUs, equipped with GDDR6 memory, will be the mainstream Ampere GPUs, probably used for the GeForce family.

NVIDIA Ampere will likely be a beast at compute, as its GPUs pack large CUDA core counts. The high-end GA100 GPU is expected to feature 8192 CUDA cores, 2048 Tensor cores, 128 RT cores, and a 6144-bit memory bus. However, the mainstream chips are also impressive: the mid-range GA103 contains 3840 CUDA cores, the same count as a Quadro P6000.

GA100: 8192 CUDA cores, 2048 Tensor cores, 128 RT cores, and a 6144-bit memory bus

GA102: 5376 CUDA cores, 1344 Tensor cores, 84 RT cores, and a 384-bit memory bus

GA102 (cut down): 4928 CUDA cores, 1232 Tensor cores, 77 RT cores, and a 352-bit memory bus

GA103: 3840 CUDA cores, 960 Tensor cores, 60 RT cores, and a 320-bit memory bus

GA104: 3072 CUDA cores, 768 Tensor cores, 48 RT cores, and a 256-bit memory bus

GA104 (cut down): 2304 CUDA cores, 576 Tensor cores, 36 RT cores, and a 256-bit memory bus

GA106: 1920 CUDA cores, 480 Tensor cores, 30 RT cores, and a 192-bit memory bus

GA107: 1280 CUDA cores, 320 Tensor cores, 20 RT cores, and a 128-bit memory bus

With these specifications and past NVIDIA tendencies, much can be inferred about these cards. The GA103 and GA104 are reportedly home to the GeForce GTX 3080 and the GeForce GTX 3070 respectively, and past NVIDIA trends suggest the GeForce GTX 3060 will also use the cut-down GA104 GPU. Possible VRAM amounts can be estimated, too.

Quadro GA100: GA100 w/ 96 GB of HBM2 VRAM

Titan A: GA100 w/ 48 GB of HBM2 VRAM

Titan XA: GA102 w/ 24 GB of GDDR6 VRAM

GeForce GTX 3080 Ti: GA102 (cut down) w/ 11 GB of GDDR6 VRAM

GeForce GTX 3080: GA103 w/ 10 GB of GDDR6 VRAM

GeForce GTX 3070: GA104 w/ 8 GB of GDDR6 VRAM

GeForce GTX 3060: GA104 (cut down) w/ 8 GB of GDDR6 VRAM

GeForce GTX 3050: GA106 w/ 6 GB of GDDR6 VRAM

GeForce GT 3030: GA107 w/ 4 GB of GDDR6 VRAM
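The GDDR6 VRAM figures above follow directly from the rumored bus widths: each GDDR6 chip has a 32-bit interface, so the chip count is the bus width divided by 32, and capacity is that count times the per-chip density. A quick sketch of that arithmetic (the 1 GB per-chip density is my assumption; 2 GB chips would double every figure, which is how a Titan XA gets 24 GB on the same 384-bit bus):

```python
# Sketch: derive plausible VRAM sizes from rumored Ampere bus widths.
# Assumption (mine, not from the rumors): 32-bit GDDR6 chips at 1 GB each.

BITS_PER_CHIP = 32  # each GDDR6 device has a 32-bit interface

rumored_buses = {
    "GA102": 384,
    "GA102 (cut down)": 352,
    "GA103": 320,
    "GA104": 256,
    "GA106": 192,
    "GA107": 128,
}

for gpu, bus_bits in rumored_buses.items():
    chips = bus_bits // BITS_PER_CHIP
    vram_gb = chips * 1  # 1 GB per chip assumed
    print(f"{gpu}: {bus_bits}-bit bus -> {chips} chips -> {vram_gb} GB")
```

Run against the list above, this reproduces every GDDR6 figure: 352-bit gives 11 GB for the cut-down GA102, 320-bit gives 10 GB for the GA103, and so on down to 4 GB on the 128-bit GA107.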

Comments

  • 1 month ago
  • 2 points

Those are very unlikely figures.

NVidia is more likely to start the 3070 on a full GA106 core or a cut-back GA104, and the 3060 on a cut-back GA106, until AMD manages a comeback; then they'd move both up to the cut-down GA104 and full-ish GA106, like they did with Turing.

That saves them money, and they have no competition now that AMD will no longer have the process advantage and still requires more cores to get equivalent performance.

Without either AMD or Intel bringing competition above the mainstream market you'll never see massive improvements like that in the upper models.

  • 1 month ago
  • 2 points

I was going off of past trends.

  • 1 month ago
  • 1 point

Since there hasn't been an xx100 or xx103 core in generations, it isn't based on a trend.

What is a trend is NVidia moving models down the line using more failed cores so they can re-release updated Ti/Super models if AMD manages to compete.

  • 1 month ago
  • 1 point

Based on the rumors I have heard, the GeForce GTX 3080 will use a full GA103 GPU. Since, according to you, the GeForce GTX 3070 would probably be based on a cut-down GA104 GPU, would a GeForce GTX 3070 Ti take the place of the full GA104?

  • 1 month ago
  • 1 point

Why would NVidia launch a 3070ti?

They only launched a 1070ti because AMD managed to compete at that level.

Same with the Super series; it only launched when AMD was prepping Navi.

AMD has only managed to trade blows with the fully enabled TU106 using more cores and a process advantage to do so.

A refresh of the same architecture isn't going to magically manage a complete reversal of what Navi is and isn't.

When they do launch, it will likely be the same all over again: cut-down models across the board, so if they do need more competitive products, they can just relaunch the same old cores with less locked down.

  • 1 month ago
  • 1 point

I am just wondering why NVIDIA would not do anything with the full GA104 GPU. Would it be a Quadro, then?

  • 1 month ago
  • 1 point

the "3080Ti" is also supposed to have 12 GB of VRAM

  • 1 month ago
  • 1 point

How do you know? Do you have proof?

  • 1 month ago
  • 1 point

none of us have proof, obviously, but Gamers Nexus and Gamer Meld both seem to agree

  • 1 month ago
  • 1 point

How can that be done if it has a 352-bit memory bus?

  • 1 month ago
  • 1 point

do you have proof?!?

  • 1 month ago
  • 1 point

I have past trends on my side. Both the GeForce GTX 1080 Ti and the GeForce RTX 2080 Ti used a 352-bit memory bus.

  • 1 month ago
  • 1 point

this is a completely new thing though, and you're the only person I've seen suggesting 11 gigs

  • 1 month ago
  • 1 point

Are you saying that the GeForce GTX 3080 Ti will have a 384-bit memory bus?

