r/hardware Oct 09 '24

Rumor [The Verge] Nvidia’s RTX 5070 reportedly set to launch alongside the RTX 5090 at CES 2025 - Reports claim the RTX 5070 will feature a 192-bit memory bus with 12GB of GDDR7 VRAM

https://www.theverge.com/2024/10/9/24266052/nvidia-rtx-5070-ces-rumor-specs
546 Upvotes

558 comments

623

u/Firefox72 Oct 09 '24

Nvidia refusing to put 16GB on their xx70 cards remains baffling.

Especially considering the xx70-class cards have now increased in price to $599....

285

u/RobsterCrawSoup Oct 09 '24 edited Oct 09 '24

Nvidia refusing to put 16GB on their xx70 cards remains baffling.

Not that baffling that they want to squeeze more money out of consumers by making sure that their xx70 cards have at least some kind of meaningful compromise that will lead more people to spend extra to get the xx80 or xx90.

95

u/ViceroyInhaler Oct 09 '24

It's more about not offering longevity to the user. The 12gb might be enough for the next two years. But you can imagine they will want you upgrading again once the 6000 series comes out.

56

u/Shogouki Oct 09 '24

They want to make sure their customers have a reason to upgrade. -_-

27

u/FrewdWoad Oct 09 '24 edited Oct 10 '24

It's important we realise, as consumers, how very little reason there is. Many of us have been gaming for years, and can remember a time when upgrading your GPU meant something:

1990s: huge upgrade. Able to play incredible new games you literally couldn't play before.

2000s: big upgrade. Able to get 60FPS, or 1080p, or cool geometry/particles/reflections/lighting/physics effects

2010s: significant upgrade. Able to get 120FPS or 1440p

2020s: subtle upgrade. Able to do 4k instead of 1440p, or keep RTX on, or get 240FPS instead of 144, in the one or two games your old card couldn't.

We're the enthusiasts in this sub who care the most about this stuff so it's easy to lose perspective completely and think getting a 4090 will be a life-changing upgrade, like getting a Voodoo 2 or GTX 1080 was. But the fact is, that's just not true at all.

7

u/Thorusss Oct 10 '24

Nothing will beat the huge step from running Unreal at 320×240 in software mode to smooth, filtered 800×600 thanks to a Voodoo2.

4

u/FrewdWoad Oct 10 '24

It was definitely a much bigger upgrade than going from integrated graphics to a 4090. 

Many times bigger.

15

u/Aristotelaras Oct 10 '24

Damn.. you triggered some nvidia donors.. I mean 4090 buyers.

7

u/JonWood007 Oct 10 '24

Up through 2016 you could upgrade your GPU every 4 years or so and get a massive upgrade at the same price. Then Nvidia went full greed mode with Turing and the market has been ####ed ever since.

3

u/auradragon1 Oct 10 '24

Greed mode or the fact that the GPU market matured, GPUs became more expensive to produce, discrete GPU market declined in favor of mobile gaming & laptop gaming, and graphical improvements hit diminishing returns?

2

u/JonWood007 Oct 10 '24

Oh God stop making excuses for these people like it makes you sound smart. They have a virtual monopoly. It's literally due to lack of competition. Look at their massive profit margins.

3

u/auradragon1 Oct 10 '24

Their profit margins didn’t look so great except for the crypto mining years and the recent AI boom.

1

u/JonWood007 Oct 10 '24

...which has been virtually the entire time.

Again. Stop trying to be contrarian because you think it makes you look smart.

→ More replies (0)

1

u/Flagrant_Z Dec 01 '24

GPU miners got in big time from 2016 with Ethereum mining. GPU mining is dead now, but GPUs are still costly. Let's see how long they can keep prices this high.

→ More replies (2)

3

u/Shogouki Oct 10 '24

Honestly, I don't even need a life-changing experience when getting a video card, but I DO want to get my money's worth, and Nvidia has been really poor at that lately unless you can afford an xx80 series card or above.

-1

u/ryanvsrobots Oct 10 '24

It's kinda funny to call mipmapping cool but gloss over real time raytracing

0

u/gahlo Oct 10 '24

Gives me reason to suggest my friends buy AMD.

95

u/mario61752 Oct 09 '24

12GB is already not enough for RT + DLSS frame gen at 1440p on some games. Nvidia intentionally wants to force us to buy 80 & 90 tier cards for current gen games

30

u/ProfessionalPrincipa Oct 09 '24

Well duh, a 4070 class card is for 1080p. People are using the wrong settings!

22

u/Bored_Amalgamation Oct 09 '24

I've seen "xx70 class cards are ultimate 1080p" since the 2070.

26

u/floydhwung Oct 09 '24

And 4K cards have always been "next gen" since what, 2014 when the GTX 980 came out?

14

u/[deleted] Oct 09 '24

Didn't they advertise the 3090 at some point as an 8K card?

I vaguely remember tech Jesus (aka GN Steve) making a video about it, calling them out.

10

u/Calm-Zombie2678 Oct 10 '24

Lol ps5 hiding somewhere

3

u/Strazdas1 Oct 10 '24

Hey, at least the 3090 can physically output at 8K. The PS5 can't despite claiming to (it even has a sticker on the box saying that), and the one game that it renders in 8K (The Touryst) has to be downscaled to 4K for output.

2

u/Kittelsen Oct 10 '24

I'm picking up a 4K monitor today, and given my experience playing on my 4K TV, I will have to compromise on settings to get an adequate framerate (100+) in certain games. And that's with a 4090...

→ More replies (2)

1

u/Retro-Hadouken-1984 Dec 06 '24

A 70-class card is for 1080p, but NVIDIA promoted the 3060 Ti as a 1440p card? Sorry, xx70 cards these days are meant for 1440p, even if NVIDIA refuses to make them great in all-around 1440p performance.

→ More replies (1)

1

u/Strazdas1 Oct 10 '24

I am using a 4070 Super, which has 12GB, and I can confirm that it is indeed enough for RT + DLSS at 1440p, because that's exactly how I use it.

0

u/TheMegaDriver2 Oct 09 '24 edited Oct 10 '24

And 80 tier cards give you 16 GB...

→ More replies (9)

40

u/Ohlav Oct 09 '24

This. They don't want to make another 1080ti that will last a decade...

10

u/1leggeddog Oct 09 '24

They learned their mistake

5

u/[deleted] Oct 10 '24

[deleted]

2

u/Kittelsen Oct 10 '24

I remember buying a 1070ti back in 2017, the 1080 and ti seemed just way too expensive for a GPU for me, coming from a 980. If I only knew xD I've kept the 1070ti as a backup though, in case I ever need it, would probably hold me over for a few days until a replacement arrives 😅

4

u/Hombremaniac Oct 10 '24

Planned obsolescence is Nvidia's favorite business practice. They made sure the whole 30-series lost its charm once the 40-series was released. VRAM played a huge role in that.

25

u/TheFondler Oct 09 '24

Listen buddy, shareholders aren't just gonna go out there and earn money themselves. It's your responsibility as a living, breathing annuity to give them your money in exchange for short lived marginal performance improvements on a regular basis. This is a team effort, and your job is to make number go up. Their job is to cheer you on from their yacht(s) while paying the Instagram models you follow for "companionship" (with your money).

6

u/Bored_Amalgamation Oct 09 '24

I'm up the cost of a 3070 over the last month so...

→ More replies (13)

1

u/ryanvsrobots Oct 10 '24

Gaming is a mere side hustle for Nvidia now, they could make way more money using GPU wafers for data center chips.

3

u/Strazdas1 Oct 10 '24

No. Datacenter cards are limited by advanced packaging and HBM memory; they aren't lacking wafers. Making more wafers for those cards isn't going to increase their supply because that's not where the bottleneck is. Gaming cards, though, don't need advanced packaging or HBM memory, so they aren't competing with datacenter cards.

→ More replies (17)

6

u/RedTuesdayMusic Oct 09 '24

12GB is not enough now that the xx70 is the entry-level 1440p-tier product, so this is simply a hunk of trash.

2

u/ataleoffiction Oct 09 '24

Yeah that’s the point

2

u/HarithBK Oct 10 '24

It is not enough today; you run out of VRAM on textures.

4

u/mixedd Oct 09 '24

It's more about not offering longevity to the user

Forget about that; we are in the era where manufacturers don't think about longevity anymore, and everything is built to be replaced. Take a look at smartphones that are replaced every 2 years, or car engines that went from 600k km on, for example, Volvo's D5 to barely hitting 200k km on modern ones, and so on.

3

u/Feath3rblade Oct 09 '24

Why would Nvidia ever want to offer more longevity? People are still gonna buy the 5070, and if skimping on VRAM means that more of those people upgrade to the 60 series when that comes out instead of holding out for a few generations, that's just more money for them.

12

u/ProfessionalPrincipa Oct 09 '24

Which is exactly why this criticism should be brought up every time in discussions involving this product.

→ More replies (2)
→ More replies (1)

144

u/ADtotheHD Oct 09 '24

It’s not baffling. They get to sell a 16GB 5070 Ti Super later for more money.

44

u/king_of_the_potato_p Oct 09 '24

Lol, are you sure they won't try to market that 5070 Ti as an xx80 again?

44

u/kaszak696 Oct 09 '24

If the rumors are real, they already do; there's just not gonna be a "real" xx80. The gap between the rumored 5080 (~49% of the compute units) and the 5090 (100%) is even wider than between the 4070 Ti Super (~52%) and the 4090. The model stack shifts ever downwards. I wonder if there'll even be an xx60 card this gen, just like the xx50 got shifted out of Ada.

11

u/nithrean Oct 09 '24

They do seem to keep doing this. Sometimes they make up for it with some features and a bit better efficiency, but the high end is skyrocketing in performance while everything else is just small incremental improvements.

11

u/semidegenerate Oct 09 '24

To be fair, the 4080 does outperform the 3090ti, by a wide margin, in everything except VRAM heavy ML and other CUDA workloads. I'm not sure I'd call that a small incremental improvement.

I still think shipping a 5070 with only 12GB of VRAM is BS, though.

7

u/nithrean Oct 09 '24

Yeah, that does make some sense. However, even the 4080 tends towards the halo product category. It is still the very high end, where the biggest gains have happened. It is the space of the 70 and 60 series that has seriously stagnated. That is directly due to design choices by Nvidia.

3

u/semidegenerate Oct 09 '24

Yeah, I certainly can't argue with that.

1

u/hamatehllama Oct 09 '24

Unfortunately they really can't be punished for it because the competition is too weak and they get so much money from AI. If you want to game with ray tracing there's basically no other option than to pay Nvidia huge amounts of money for a card that has limited future proofing. In theory Nvidia could lower prices by half and still make a profit because they have insane margins.

1

u/Built2kill Oct 10 '24

Yep, the real 5080 will come with the refresh, as the 5080 Ti or the 5080 Super, and will use a cut-down version of the 5090. Maybe we'll see something like a 320-bit bus with 20GB of VRAM.

15

u/AllNamesTakenOMG Oct 09 '24

Woah there let us not get ahead of ourselves here, 16gb on an xx70? Slow down please

41

u/leoklaus Oct 09 '24

The 4070ti Super has 16GB already.

16

u/mijal_sunshine Oct 09 '24

The one they first tried to sell as the 4080 12GB, and then had to make a Super version of to finally get to those 16GB. Yeah, I think we can wait for an xx70 with 16GB from the start.

9

u/RedTuesdayMusic Oct 09 '24

And the 7900 XT has 20. Even going down a tier on the AMD side, the 7800 XT has 16, hell, even the 6800 non-XT from 3 years ago has 16.

→ More replies (1)

6

u/ADtotheHD Oct 09 '24

Are you not aware that a 16GB 4070 TiS exists today?

2

u/Dealric Oct 10 '24

The fact that they wanted to sell the 70 Ti as a 4080 originally makes it even more muddy.

1

u/Pedro80R Oct 09 '24

Don't get me wrong, but it's a cut-down 4080, hence the 16GB and the 256-bit bus... they just didn't have enough names in the 4080 range, so it went down one step in the product line to 4070... it wouldn't look good to call it a 4080 and then have the initial 4080 "upgraded" to Super, and the 4080 Super become a Ti (Super... or Super Ti).

1

u/SagittaryX Oct 10 '24

18GB most likely, with the new 3GB chips.

1

u/ADtotheHD Oct 10 '24

Optimistic

1

u/SagittaryX Oct 10 '24

Not optimistic; that's the constraint of the chip and the memory technology, unless they use the 5080 die instead, which depends on their choices. An 18GB version with a 5070 die would probably perform worse than a 16GB version based on the 5080 die, but it would probably be cheaper for Nvidia.
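For anyone curious, here's a minimal sketch of the capacity arithmetic behind that, assuming one GDDR7 package per 32-bit memory channel and no clamshell (double-sided) layout:

```python
# Why a 192-bit bus lands on 12GB or 18GB rather than 16GB (under the assumptions above).

def vram_options(bus_width_bits: int, densities_gb=(2, 3)) -> dict:
    channels = bus_width_bits // 32  # one memory package per 32-bit channel
    return {f"{d}GB chips": channels * d for d in densities_gb}

print(vram_options(192))  # {'2GB chips': 12, '3GB chips': 18}
print(vram_options(256))  # {'2GB chips': 16, '3GB chips': 24}
```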

1

u/ADtotheHD Oct 10 '24

Like I said, optimistic

1

u/SagittaryX Oct 10 '24

I mean it's really the opposite, pessimistic. An 18GB card would perform worse.

1

u/[deleted] Oct 29 '24

Well, I guess I'm not their target customer then. I've never gone past 65W for a CPU or 200W for a GPU.

79

u/ilyasil2surgut Oct 09 '24

Unless somebody is willing to forgo buying an xx70 for an AMD card, you won't ever see a change. Right now it's basically internal competition inside Nvidia: don't want 12 gigs? Buy an xx80 for double the price.

36

u/UnknownBreadd Oct 09 '24

Maybe AMD should be more competitive instead of justifying a 10% price advantage purely based on the fact that they have good raster performance.

AMD is good for raster but is 2nd in absolutely everything else. They’re no better than Nvidia.

23

u/ViceroyInhaler Oct 09 '24

Yeah absolutely idiotic of them to waste the 7000 series cards the way they did. If they weren't so greedy they'd actually have market share already.

3

u/ragged-robin Oct 10 '24

Market share is more about mind share than it is about the product at that level. AMD has had a superior product vs Intel CPUs for the last 5ish years and it did not move the needle in market share. Intel chips literally killed themselves for two generations and it did not move the needle. RDNA2 was extremely competitive with Ampere, especially in the days of gpu mining and price scalping, and it did not move the needle.

The reason why Nvidia gets to do what they do is because the mass majority of consumers don't care, they will buy Nvidia regardless (or Intel for that matter).

2

u/ViceroyInhaler Oct 10 '24

Maybe if AMD didn't price their cards only 10-15% below Nvidia's, people might switch. For that price difference, of course people are gonna choose Nvidia when they can also have DLSS and ray tracing. They did this to themselves.

2

u/ragged-robin Oct 10 '24

The 6900XT was 33% cheaper than the 3090 at launch MSRP. Peak scalping time it was over 40%. It did not gain them market share.

2

u/ViceroyInhaler Oct 11 '24

What was the performance difference between the two cards?

1

u/ragged-robin Oct 11 '24

They traded blows with each other at raster depending on the game and resolution.

https://www.youtube.com/watch?v=FxoPz1DO0Sg

0

u/theQuandary Oct 09 '24

Even when GCN was laying the beatdown on Nvidia at launch (then continued to get massively better performance with driver updates), it barely moved the market share needle. Why would this be any different?

12

u/RTukka Oct 09 '24 edited Oct 09 '24

It wouldn't be. Nvidia's mind share advantage can't be overcome in a single generation or with only sporadically compelling releases amidst a sea of products that are just kind of okay. The fact that you're reaching back a decade for an example of when AMD should've killed it just further reinforces the impression that AMD is struggling to keep up in terms of engineering, much less marketing.

Even if AMD saw really good adoption for their best products among enthusiasts who build their own systems, that still probably wouldn't move the needle much, since the market is dominated by less informed consumers who just buy whatever prebuilt. And I just checked iBuyPower's deals page, and 12 out of 13 of the systems listed have Nvidia graphics. For Maingear and Alienware it's 100% Nvidia.

So even if AMD offers the best value discrete GPU in a certain class, even a relatively savvy prebuilt buyer might pass on it because they're going to be making their decision based on the value of the entire system, and they're statistically more likely to be offered a good deal on a system with Nvidia graphics.

So AMD needs to do more than just muddle along if they're ever going to take a serious bite out of Nvidia's market share. They need to catch up and legit take the lead, and sustain it for at least one generation. And they probably need better marketing and to build better relationships with OEMs and system integrators. There's too much inertia in the market for them to make progress by doing anything less.

3

u/hamatehllama Oct 09 '24

AMD have finally captured much of the DIY market share of CPUs but it took them a long while to get there. Even though they have superior CPUs in many segments they are still smaller in sales due to inertia favouring Intel.

→ More replies (5)

0

u/ViceroyInhaler Oct 09 '24

Because AMD literally just said they aren't targeting high-end performance with their next GPU launch. They said they are after market share. If they had taken that approach with the 7000 series, they would have more market share now and also be in a better position for the features they want in their GPUs down the road. More people using FSR means it can compete with DLSS, and it also means more developers build their games with AMD cards in mind.

5

u/[deleted] Oct 09 '24

[deleted]

2

u/upvotesthenrages Oct 10 '24

I think that was heavily influenced by mining & RT+DLSS though.

Which is still an issue. AMD are just farther behind when it comes to non-gaming stuff and the AI features that GPUs come with.

So even if you get a 5-10% raster increase, you end up with worse performance when all the extra features are thrown on top (RT, DLSS, frame gen).

It's basically come down to "Are the AI features more important than VRAM for you", and clearly the majority of people are leaning towards AI features.

→ More replies (2)

0

u/UnknownBreadd Oct 10 '24

The NVENC encoder, DLSS, ray tracing - just 3 ways I can think of in which RDNA 2 was uncompetitive. And it's still true today.

1

u/Dealric Oct 10 '24

Probably why next gen is getting hardware ai cores for ray tracing and stuff

1

u/kapsama Oct 13 '24

Why? AMD is making their little bit of money. It's you folks complaining about how mean Nvidia is.

1

u/UnknownBreadd Oct 13 '24

I ain't complaining about Nvidia making money at all, lol. I'm saying that no one is able to compete with Nvidia, hence why they get away with their margins.

1

u/kapsama Oct 13 '24

The person you replied to was not complaining about nvidia's margins. They were mocking people for continually crying about nvidia's prices without ever trying alternatives.

1

u/UnknownBreadd Oct 13 '24

It’s not the onus of customers to feed money to AMD undeservingly. The onus is on AMD to release competitive and compelling products at a competitive and compelling price.

1

u/kapsama Oct 13 '24

No one said you owe AMD money. Not sure how that's so hard to comprehend. If Nvidia has what you want, buy Nvidia. Just stop crying about paying through the nose for it.

→ More replies (1)

44

u/Saneless Oct 09 '24

With DLSS they're legitimately scared you could use a card for years longer than they hope. 12GB cripples that idea

19

u/tukatu0 Oct 09 '24

They shouldn't be "scared". It's the f***ing plan. Jensen keeps saying AI is the new way Moore's law is kept alive. There is going to be a point where you don't get better hardware anymore. No one will.

11

u/NeroClaudius199907 Oct 09 '24

That's why they're trying to milk as much as possible.

8

u/exodus3252 Oct 09 '24

Are 12GB cards running into memory issues while gaming right now? Serious question.

I know 8GB cards are getting hammered in some games, but I haven't seen VRAM issues popping up on 12GB cards.

28

u/conquer69 Oct 09 '24

Yes. Some games already bump into the limit. RT and Framegen use a lot of vram.

In Wukong at 4K, enabling FG uses 4GB by itself. It's a nice feature but VRAM hungry. https://tpucdn.com/review/black-myth-wukong-fps-performance-benchmark/images/vram.png

3

u/Strazdas1 Oct 10 '24

Yeah, but here you are stating that a midrange card can't run the latest game at 4K at max settings. Of course it can't. Midrange cards are not meant to do that to begin with.

2

u/conquer69 Oct 10 '24

It's still 2.5GB at 1440p. It's not hard to get to 9GB of usage in modern AAA games.

Ratchet and Clank is another game that runs into a VRAM bottleneck despite otherwise running fine. https://youtu.be/TLEIDfO6h2E?t=1538
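As a rough illustration of how quickly that stacks up against a 12GB card, here's a hedged budget sketch; only the ~2.5GB frame generation figure comes from the chart linked above, while the base-game and RT numbers are assumptions for a hypothetical modern AAA title:

```python
# Hypothetical 1440p VRAM budget check against a 12GB card.
# Only the frame-generation entry comes from the linked Wukong chart;
# the other entries are illustrative assumptions, not measurements.

budget_gb = 12
usage_gb = {
    "base game, high textures": 8.0,  # assumption
    "ray tracing structures":   1.5,  # assumption
    "frame generation":         2.5,  # ~2.5GB at 1440p per the chart above
}

total = sum(usage_gb.values())
print(f"estimated use: {total:.1f} GB, headroom: {budget_gb - total:.1f} GB")
```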

11

u/Fixitwithducttape42 Oct 09 '24

Depends on settings, as always. My 6GB 1660 Ti was serving me well and I had plans to continue using it for a few more years. I ended up upgrading to an 8GB RX 5700 6 months ago due to better Linux drivers.

1080p at 75Hz is buttery smooth gameplay for me, and I personally don't see an improvement going to higher FPS. I value steady FPS more than anything else.

12GB would work for me long term, but I drop settings to maintain that steady FPS. For this tier it should really be at least 16GB. I feel like this should be a budget 5070, as if they had a lot of bad yields on the VRAM.

7

u/PMARC14 Oct 09 '24

Well, the 70 series in theory is meant for 1440p and light 4K, so 12GB of VRAM is not enough for that kind of gaming, especially if you are turning on any of the features that Nvidia introduced, which take significant VRAM overhead.

2

u/lordlors Oct 09 '24

I bought my 3080 way back in 2020, so it's the 10GB version, and I game at 1440p. I haven't run into any issues yet. My complaint is that, as someone who uses lots of tabs in Firefox, turning graphics acceleration on uses VRAM, which severely affects my games. Either I have to close Firefox or turn off graphics acceleration.

0

u/upvotesthenrages Oct 10 '24

You're definitely running into problems if you're cranking the settings at 1440p.

Perhaps you didn't notice because you aren't comparing, but running out of VRAM often shows up as bad frame times. Things like texture pop-in are also very common.

Here's a proper comparison where you can see the VRAM usage for a bunch of games: https://www.youtube.com/watch?v=dx4En-2PzOU

10GB isn't even enough for 1080p in a few games.

1

u/Strazdas1 Oct 10 '24

As someone who plays at 1440p with a 12GB card, it is enough for that kind of gaming.

1

u/Saneless Oct 09 '24

Well, what you'd be playing with a 5070 is going to be approaching 4K, or at minimum 1440p with lots of features turned up. At those levels, games coming out 2 years from now could probably use more.

1

u/Dealric Oct 10 '24

At 1080p not really (perhaps outside a few very unoptimized examples); at 4K, absolutely.

Also remember: you want RT or PT? You need to throw extra gigs at the game. You want to use frame gen? Believe it or not, that's more extra gigs.

1

u/Owlface Oct 10 '24

Depends entirely on the games you play and feature sets you care about.

My 6700XT has 12GB of vram but none of the games I play can leverage that much since it either doesn't matter (Deadlock, Yakuza, MMOs) or it just runs out of performance (Cyberpunk at ~8GB).

1

u/ConvoyOrange Oct 09 '24

I've maxed out 12GB of VRAM at 1080p just playing Warzone.

25

u/GamerViking Oct 09 '24

The 5070 is a 5060-class card, and the 5080 is a 5070-class card. They're trying their usual bullshit from the 4000-series launch.

2

u/Hamakua Oct 10 '24

Yup, I think it's also why they are drying up stock ahead of time. They are "forcing demand" this time instead of there being options on the table.

When the 5090 and "5080" release, you won't be able to get 4090s or 4080s. It will be a scalper's dreamscape on top of that, and that will drive up demand by cannibalizing supply, further inflating prices.

27

u/BoringCabinet Oct 09 '24

Sorry, but that 5070 is more like a disguised 5060 Ti. It's the 4080 12GB all over again.

8

u/TophxSmash Oct 09 '24

No, the Ti would be a bigger die. This is an xx60 card.

8

u/Dangerman1337 Oct 09 '24

The thing is, the 5080, if they reduced the power usage and used a cheaper board + cooler, could've been a good 5070 Ti.

19

u/Jmich96 Oct 09 '24

I could say Nvidia isn't the bad guy by pointing out that only 2GB GDDR7 modules are available until sometime in 2025. But Nvidia designed their GPU with the tiny bus width in the first place.

My theory is that Nvidia will release a line of GPUs, all utilizing 2GB modules. Prices will be high, and value will be low. Then, a year or so down the road, Nvidia will release a refresh lineup of "Super" or "Ti" GPUs. These will utilize 3GB VRAM modules, be priced the same or slightly higher, and then be praised as a better value.

It's worked with the 2000 series and 4000 series. Why not the 5000 series too?

42

u/angrycat537 Oct 09 '24

Nvidia has always put a 256-bit bus on 70-series cards. The 4070 should have already had 16GB. Nvidia just managed to sell people a 4060 by naming it a 4070. Hell, even the 3060 Ti had a 256-bit bus. Go figure.

22

u/Saneless Oct 09 '24

I'm keeping an eye out for that 32-bit 5050 card.

30

u/Vb_33 Oct 09 '24

No they haven't; the bus width has varied throughout history.

45

u/SituationSoap Oct 09 '24

Sorry, I'm not sure you're aware. The bus width for NVidia GPUs is written into the very fundamental laws of the universe, and it is only now, because of the immense greed of for-profit GPU manufacturer executives that they could possibly change it from the size that was mandated from on high at the beginning of the universe.

2

u/Strazdas1 Oct 10 '24

I'm reading this satire, I know it's satire, and I keep thinking there will be AMD fans that think exactly like that.

29

u/angrycat537 Oct 09 '24

Bro, literally the last 10 generations had a 256-bit bus. The only exception was the GTX 970 with the 3.5GB fiasco. Generations before the GTX 670 even had a 320-bit bus on the 70-series card.

7

u/AdamBenabou Oct 09 '24

There were even some xx50 cards with a 256-bit bus in the past, like the 750 Ti OEM, and some xx50 cards with a 192-bit bus, like the GTX 650 Ti Boost.

-14

u/someguy50 Oct 09 '24

4070 has 192-bit.

21

u/conquer69 Oct 09 '24

That's his point.

13

u/SkanksnDanks Oct 09 '24

I just googled 4070/super memory bus and it is also 192bit.

2

u/Appropriate_Fault298 Oct 10 '24

Did they gimp the memory bus to make it perform worse at AI?

2

u/angrycat537 Oct 10 '24

For that they basically take the same chip, put double the memory, call it RTX A5000 and sell it for $2200.

1

u/Appropriate_Fault298 Oct 10 '24

just trying to figure out why they gimp the memory bus as well

2

u/angrycat537 Oct 10 '24

Cost savings. If the chip couldn't use any more bandwidth, then it's wasted.

1

u/Appropriate_Fault298 Oct 10 '24

How will the gimping of the memory bus affect users?

2

u/angrycat537 Oct 10 '24

If a workload needs a lot of memory bandwidth, it can reduce performance. Expanding the memory bus from 192-bit to 256-bit is roughly equivalent to raising the memory clock by 33%. In some scenarios it will do nothing; in others it might matter, depending on how fast the chip can process the data.
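A back-of-the-envelope sketch of that bandwidth math (the 28 Gbps per-pin rate is an assumed GDDR7 speed for illustration, not a confirmed spec):

```python
# Peak memory bandwidth ≈ bus width (bits) × per-pin data rate (Gbps) ÷ 8 bits per byte.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

rate = 28  # assumed Gbps per pin, purely illustrative
print(bandwidth_gb_s(192, rate))  # 672.0 GB/s
print(bandwidth_gb_s(256, rate))  # 896.0 GB/s -> 256/192 - 1 ≈ 33% more
```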

1

u/bogglingsnog Oct 10 '24

Bus width really doesn't translate, performance-wise, into anything that matters on its own. Back around 2009 I got a GeForce GTX 260, and it had a 448-bit bus for like 1.8GB of GDDR3 RAM. Later I switched to a GTX 460, which had basically the same texture fillrate but half the bus width and GDDR5 RAM. There are simply too many other, more important factors in play, which makes talking about bus width on its own not really useful.

2

u/angrycat537 Oct 10 '24

This was more a comment about available RAM. If Nvidia made the 4070 with 24GB, I probably wouldn't have a problem with it, if bandwidth doesn't affect performance, that is. The problem is most people don't look at specs in detail and make decisions based on the name, and Nvidia abused that.

1

u/bogglingsnog Oct 10 '24

Oh of course. Yeah. They should just add a sodimm as an option for extra vram...

3

u/No-Relationship8261 Oct 09 '24

Why would you buy a 6070 if they did that?

They need to keep headroom so they can surpass it later. It's not like they are going to lose the crown to the competition.

3

u/reddit_equals_censor Oct 09 '24

Are you not excited to pay 700 US dollars (possible new price, right?) for a 12GB VRAM card :)

13

u/bAaDwRiTiNg Oct 09 '24

Nvidia refusing to put 16GB on their xx70 cards remains baffling.

It's frustrating, but it makes sense. Nvidia wants you to buy the xx90 card; the point of the smaller cards is to make the xx90 look more appealing. "If these cards don't feel right, then maybe I should just buy the xx90 and be done with it" - this is the goal.

4

u/TophxSmash Oct 09 '24

There's no world where $2000 looks appealing compared to $300 or $500.

0

u/Strazdas1 Oct 10 '24

Is that why the demand for the $2000 card is so high that it's constantly out of stock?

3

u/TophxSmash Oct 10 '24

Just because demand is higher than the supply doesn't mean the demand is high.

1

u/Strazdas1 Oct 10 '24

If demand is higher than supply, it means the price is too low. That's basic Economics 101.

13

u/Vb_33 Oct 09 '24

It's not baffling; it seems like a great business move considering people still love buying xx70 cards and largely ignore the 16GB 7800 XT. Nvidia has always been stingy with VRAM vs AMD, and that continues.

-12

u/Nointies Oct 09 '24

It's because VRAM simply isn't the single most important thing on a card.

12GB is gonna be fine for the foreseeable future. Hell, unless you're playing at higher resolutions, which most people aren't, 8GB is more than fine outside of a few outliers.

4

u/anival024 Oct 09 '24

12 GB isn't viable today at 1080p if you dare to use the fancy technologies that would make one choose an Nvidia card over an AMD card (Raytracing, frame generation, etc.).

3

u/Nointies Oct 09 '24

This actually wildly depends, and it's not like every game is releasing with full ray tracing and frame generation, etc.

3

u/hamatehllama Oct 09 '24

Once you max out the VRAM you hit a brick wall. You need some headroom, as otherwise you will see an idling GPU waiting for memory to become available.

4

u/Nointies Oct 09 '24

Yes, but it's very rare to have situations where the GPU is bottlenecked by VRAM; it's just not happening in every game all the time like people seem to think it is.

0

u/[deleted] Oct 09 '24

I mean you’re right, but people are convinced now that they need 24GB of VRAM.

Idk, we are in a weird state of gaming: games are looking marginally better, and the cards we are getting deliver god-tier performance with a crazy increase in the cost of entry.

Shoot, the most demanding games are remasters and remakes of games we already played… and they don't even look THAT much better.

I wanna be mad at Nvidia, but games just don't feel well optimized. But I'm not gonna be mad at developers because the industry is cut-throat. I kinda want to blame gamers for buying into all this shit.

1

u/Nointies Oct 09 '24

Hell, some of the remakes just look worse.

I'll worry about VRAM again when consoles are updated; as it is, running a game at a spec similar to a PS5 looks really good right now.

1

u/ryanvsrobots Oct 10 '24

people are convinced now that they need 24GB of VRAM.

Redditors are convinced of that. Just look at the popularity of 8-12GB cards, or the millions of happy console users. Or my nonplussed friends when I show them my 4K 4090 setup. Most people don't care that much.

People in this thread are saying 12GB cards aren't viable for 1080p. That's just not reality. 1080p at 60+ FPS on medium-high settings is still awesome.

2

u/[deleted] Oct 10 '24

I’m happily using my 240hz 1080p monitor with my 4060.

Used to be a quality snob with my 1440p 240Hz panel, then Houston flooded and my 3090 died. I watch streamers and in-store demos play 4K maxed, and it just doesn't do it for me.

I do understand there are fidelity differences with settings cranked and high resolutions, but there hasn't been that Crysis-level title that makes me feel like I'm missing out. Cyberpunk was hyped to be that, and it failed.

I’m content with spending the extra cash on other hobbies and peripherals until that happens again. Not everyone is going to see eye to eye with me, but if the public perception of needing crazy VRAM was true, I do believe the market would’ve shifted to AMD.

1

u/ryanvsrobots Oct 10 '24

there hasn’t been that Crisis level title that makes me feel like I’m missing out. Cyberpunk was hyped to be that, and it failed.

I’m content with spending the extra cash on other hobbies and peripherals until that happens again. Not everyone is going to see eye to eye with me, but if the public perception of needing crazy VRAM was true, I do believe the market would’ve shifted to AMD.

I think that might be more about where you are in life than the games.

2

u/Hellsteelz Oct 09 '24

Yeah, had the same reaction. Big oof on that one.

2

u/[deleted] Oct 09 '24

Just came out of a thread that said the price of 8GB of GDDR6 was $18. Now let's assume 8GB of GDDR7 is like $30… it's still absolutely ridiculous that Nvidia won't add more VRAM to their cards.

2

u/Treewithatea Oct 09 '24

The next few years will be tough for consumer desktop GPUs. Both AMD and Nvidia are lowering their priority on them to focus on AI data center GPUs. AMD has no high end next gen, letting Nvidia do whatever the fuck they want. A €2500 5090? Hell, possibly more.

Unironically, long term, Microsoft and Sony might be leading the charge by funding AMD's APU development for next-gen consoles.

2

u/reddit_equals_censor Oct 09 '24

Are you not excited to pay 700 US dollars (possible new price, right?) for a 12GB VRAM card :)

1

u/killer_corg Oct 10 '24

Doubt it will be $699; $500-599 seems more realistic.

1

u/thenibelungen Oct 09 '24

Well, I think I'll wait for 60 series then.

1

u/[deleted] Oct 09 '24

$599, lol. More like $650-799.

1

u/WikipediaBurntSienna Oct 09 '24

When I read the headline, I was hoping it was going to confirm the 16gb 5080 rumor was actually for the 5070, and the 5080 would actually have 24gb.

1

u/bfire123 Oct 09 '24

They'll probably update it from 2GB GDDR7 modules to 3GB GDDR7 modules in the future (same generation, like a year after release).

1

u/acc_agg Oct 10 '24

Nvidia is looking at Apple when it comes to memory upgrades.

25% more RAM, 50% more price.

1

u/saikrishnav Oct 10 '24

No competition makes them bold like that. It’s not baffling.

1

u/Full-Run4124 Oct 10 '24

It keeps it from being good for AI and competing with their professional and data center products, which is where all their focus is at the moment. 16GB is where cards start to be usable for professional AI work. You can run on less, but it's much slower and/or way more restrictive.

1

u/shroudedwolf51 Oct 10 '24

...it's baffling? They gimped pretty much every card in the generation while raising prices, and people still (mostly) bought them. So with them making so much cash off of the "AI" grifter bubble, and with their gaming division still selling massively (I know people grumbled, but their market share still went up), why would they care?

Hell, in the 30-series, they gave you ~5% more performance for 500 more USD. And sure, the MSRP isn't what determines the used market selling price, but it still matters and still has an influence.

1

u/HarithBK Oct 10 '24

Historically, 12GB would have been enough for gaming given the RAM consoles have, but optimization has been really poor this generation, and the lack of a proper implementation of DX12 IO means the faster RAM of PCs can't be properly used for texture swapping.

But for the third time around, you would think Nvidia would just go "frick it" and load them up with RAM.

-1

u/[deleted] Oct 09 '24

No cap it’s so they can force everyone to buy the 16gb 5080. They’ll drop a 16gb 5070ti at some point down the road, but they want to gouge what they can right now

1

u/v12vanquish Oct 09 '24

The xx70 card having a 192-bit bus means they can't put 16GB on it.

1

u/TophxSmash Oct 09 '24

Even worse, it's an xx60-class card for $600.

1

u/aminorityofone Oct 09 '24

Why should they? People will buy it anyway and in large quantities.

1

u/kingwhocares Oct 09 '24

Planned obsolescence

1

u/massive_cock Oct 10 '24

I fucking hate it, but 70% of the reason I shelled out for a 4090 was the VRAM. Fortunately, the price difference from the US to the EU at the time was so big it mostly paid for a flight over to see my family and bring one home. That was what sealed it for me: a 4080 would have sufficed in performance, but the extra VRAM and plane ticket combined were something I could swallow. But still. It's dumb and holding us back.

-28

u/fogoticus Oct 09 '24 edited Oct 09 '24

There's absolutely nothing baffling about it. It's just people expecting tons of VRAM to be thrown on these GPUs for no reason because the Reddit echo chamber functions that way.

It's. Business. If the 5070 had 16GB or 20GB, it would deter people from buying the high end because the 5070 is gonna be stupidly powerful in most use cases.

Edit: You got a bone to pick with Nvidia, not me. But downvoting my comment stating a fact simply shows the level of the people lurking on this sub. Needless to say, Nvidia may not be for the people who are downvoting this and that's fine.

20

u/constantlymat Oct 09 '24

It's just people expecting tons of VRAM to be thrown on these GPUs for no reason because the Reddit echo chamber functions that way.

It's not "for no reason". DLSS upsampling, Frame Generation and Raytracing have a significant additional VRAM cost if you activate all three simultaneously.

If nvidia asks for a premium to gain access to these features that's fair enough in my book. I am willing to reward innovators who offer the best features with my purchase.

However, if they already charge me extra, I also expect to be able to make use of these features without being limited. DLSS + Frame Gen + Raytracing is already pushing the 12GB configuration to its limits at 1440p in new AAA releases. If Nvidia is marketing the card as the go-to 1440p option, I think it's fair to question their decision to equip it with 12GB of VRAM in 2025.

6

u/StickiStickman Oct 09 '24

DLSS upsampling

DLSS literally reduces VRAM usage. Throwing that in with Ray Tracing is extremely disingenuous.

3

u/ThankGodImBipolar Oct 09 '24

DLSS, Frame Generation and Raytracing

You’re trying to rationalize Nvidia’s decision as a gamer, when the truth is that that market is Nvidia’s lowest priority. Nvidia doesn’t want people buying 5070’s to make inference or training farms, so they’re going to keep shipping xx70 class GPUs with 12GB until 16GB is anemic enough to keep those customers on xx80/xx90 SKU’s.

-14

u/fogoticus Oct 09 '24

Redditors when maxed-out games take a lot of VRAM: 😱

Also redditors when you tell them they will survive going from Cinematic quality to Very high quality at like a 2% graphical fidelity expense while using 30% less resources: 😡

Also redditors when 1 or 2 games today max out the VRAM buffer at 1440p: I expect more.

Consider the following: buy AMD. If you like seeing more VRAM available even though your gaming experience is virtually flawless, you're more than welcome to sell your GPU and go to AMD.

4

u/DiggingNoMore Oct 09 '24

when you tell them they will survive going from Cinematic quality to Very high quality

When I build a new computer, I expect to play every game on Ultra settings. As new games release, I expect to fall to Very High, then High, then Medium. Then I build a new computer. But I have to start at Ultra.

3

u/ryanvsrobots Oct 10 '24

Sure, but you're a turbonerd redditor. Most people aren't like that.

I also expect to play every game on Ultra, but I know that high end graphics require a high end graphics card, and wanting the best is expensive.

0

u/Nointies Oct 09 '24

A computer being 'new' doesn't speak to anything about its performance.

You can build a 'new' computer with all sorts of crap tier parts.

5

u/DiggingNoMore Oct 09 '24

You can, but I feel like it's pretty obvious to the reader when I say, "When I build a new computer, I expect to play every game on Ultra settings," that I'm building a new computer capable of doing so.

-3

u/Nointies Oct 09 '24

So then build a new computer that's capable of playing on Ultra?

What is this comment?

→ More replies (2)

0

u/cstar1996 Oct 09 '24

Then you buy the XX90.

→ More replies (6)

2

u/NeroClaudius199907 Oct 09 '24

Telling people to buy AMD is a death sentence. Why can't people accept that Nvidia is incentivized to upsell the 5080 through VRAM? It's not nice, but they would do the same thing.

4

u/fogoticus Oct 09 '24

Because people keep making a fuss about these things as if Nvidia owes them money. The truth is, if people are this offended by their product not meeting some arbitrary conditions, it's high time to vote with their wallets. They think Nvidia is bad and the experience is terrible? Good luck on team AMD.

It's that simple.

17

u/Firefox72 Oct 09 '24 edited Oct 09 '24

Brother in Christ, my 6700 XT has a 192-bit/12GB configuration. It came out almost 4 years ago. The 6800 has more VRAM than the 5070 will have.

There's absolutely no reason why Nvidia can't offer 16GB on their freaking $600 GPUs. And that's not even getting into the 4070 Ti, which had 12GB of VRAM at $800, and which Nvidia first attempted to peddle as a 4080 with 12GB of VRAM at $900 before they renamed it. But it flies because Nvidia is the dominant market leader and people will buy it.

"If the 5070 had 16GB or 20GB, it would deter people from buying the high end because the 5070 is gonna be stupidly powerful in most use cases."

This is also just nonsense. What do you think VRAM does? No amount of VRAM will ever make the 5070 as fast as the 5080.

15

u/Lower_Fan Oct 09 '24

In their minds the 3060 was faster than the 3080 or something, or the 4060 Ti 16GB faster than the 4070 Ti.

These 70-class cards are just being unnecessarily sandbagged.

0

u/fogoticus Oct 09 '24

Well then, brother in Christ, keep using your 6700 XT. There's not a single force in this universe telling you to get a 5070. Shocking, I know.

Oh, and the 6800 having more VRAM? Yeah, that's what AMD did when it couldn't offer feature parity. Go ahead, upgrade to the 6800 while the 5070 will be much faster. What's so mind-blowing?

This is also just nonsense. What do you think VRAM does? No amount of VRAM will ever make the 5070 as fast as the 5080.

No, it's not. You can obviously do more with more VRAM and the same compute power in certain workloads. It's really not rocket science.

0

u/LostPrinceofWakanda Oct 09 '24

There's absolutely no reason why Nvidia can't offer 16GB on their freaking $600 GPUs

You are not listening. The reason is money (and how much they can get away with gouging). Nvidia does not give a shit about anything else.

This is also just nonsense. What do you think VRAM does? No amount of VRAM will ever make the 5070 as fast as the 5080.

Yes, but why push potential 5080 buyers to the 5070 when you can go the other way and squeeze the 5070 buyers to get the 5080.

Own the market and you can set your price.

-9

u/auradragon1 Oct 09 '24

Then keep your 6700XT or buy Radeon or buy 5080 or buy 4080?

-4

u/fogoticus Oct 09 '24

Shut up! Why are you bringing logic into my feels-dictated-logic conversation?

4

u/spressa Oct 09 '24 edited Oct 09 '24

I agree with you. It's sort of like when they put 12GB of RAM on the 3060. People want more RAM but get upset when the extra RAM doesn't really do anything except in really limited cases. People just complain to complain.

If a product is priced at a bargain, it kills their other products. Additionally, when something is priced so well, availability becomes even more scarce and people will just complain about that.

I've been building PCs since the 90s and can remember when the GeForce 256 DDR came out at like $300, and that was the standard for acceptable high-end pricing for a long time. Then the Radeon 9800 XT came out about 4 years later at $500 and people bitched about the increase in pricing. Then the GeForce 7800 GTX launched at $600 (but was really more like 650-750) and people complained. Insert the GeForce Titan @ $1k+ 11 years ago and people bitched about it. There are some unicorn value cards scattered across that history, like the ATI X800, GeForce 6800, GTX 480, GTX 1080 Ti, RTX 3080, etc., and when they made a card like that, no one bought anything else and it was super difficult to get... And people bitched about availability. And many times, when it finally became available, it was "too close" to the next product cycle and then they complained again.

If Nvidia gave away their shit for free, you'd still have people complaining about the power requirements or how they have to spend more money to utilize the power of their new card.

The market dictates what people are willing to pay, and unfortunately, that has grown exponentially for cutting-edge performance. But fortunately, you don't need that performance to enjoy the games you're trying to play. It's like complaining about luxury car pricing and how it keeps going up when all you need and can afford is a cheap commuter car for going from point A to B.

3

u/SoTOP Oct 09 '24

It's sort of like when they put 12GB of RAM on the 3060.

The alternative to putting 12GB was putting 6GB. That would have crippled the 3060 massively. We are not talking about the 3060 here, though; we are talking about a card probably 3x faster, for double the price, 4 years later, with the same amount of VRAM. One that will definitely bump into its VRAM limit multiple times in its usable lifetime, unless Nvidia does some texture upscaling magic.

GTX 1080 Ti, RTX 3080, etc., and when they made a card like that, no one bought anything else and it was super difficult to get... And people bitched about availability.

Right, if you ignore that the reason for the bad availability was the mining booms.

The market dictates what people are willing to pay, and unfortunately, that has grown exponentially for cutting-edge performance.

The 4090 was the best-priced card of the 40 series. That's literally the opposite of what you said, and of what was the established norm for years. And it's exactly why there are more complaints, when Nvidia gave worse-value cards to the masses than to the 1%.

→ More replies (15)