The tweet was specifically talking about their $299 card, which also has a 16GB version. OP is shitstirring for clicks.
This is one of the most uninformed comments I've read so far.
I shared this video to spread awareness that Frank Azor, AMD's Chief Architect of Gaming Solutions and Gaming Marketing, made that needless bad-faith comment, as it holds back the advancement of gaming.
AMD's RX 9060 XT 16GB released recently, but we've yet to see whether they can actually supply it to consumers at their own stated MSRP of $350 USD; something they failed to do in their previous launch.
For PC building, you walk around this show, Computex, and talk to case manufacturers and cooler manufacturers, and they'll sync up their launches to Nvidia GPU launches because they don't sell things in between at the same velocity. And so if Nvidia launches a GPU and the interest falls off a cliff, because people just feel like they either can't get a card or they get screwed if they get a card, I think it actively damages the hobby.
I remember even when the RTX 3070 came out and I gave it a positive review, I said it was a great-value product, because by all the metrics and measurements we had at the time it was a good product. We had very few examples we could point to where 8 gigabytes wasn't enough. Of course, we knew the upcoming competing card had a 16GB VRAM buffer, but that alone doesn't necessarily make it a valid criticism. It's like if you had a 32GB buffer on that product: you'd just say "Well, it's got enough VRAM" and think it's probably nothing.
...
But we did see it: even when you were looking at dedicated used VRAM, a lot of games were at 7, 7 and a half GB [of VRAM usage, and increasing]. So you could see it creeping up over the years, from around 4, to 5, 6, 7; you could see where it was trending, right? Which is why I always find it funny when people say, "Why are we using more than 8 now? 8 should be enough."
- The first quoted paragraph is from Steve Burke of Gamers Nexus; the second and third are from Steve Walton of Hardware Unboxed
- Is Nvidia Damaging PC Gaming? feat. Gamers Nexus
- If you replace Nvidia's name in the quotes above with AMD's, the argument holds with the same force: AMD would be ruining the gaming landscape for its own benefit at the cost of literally everyone else.
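As an aside, for anyone wondering what "looking at dedicated used VRAM" actually involves, here is a minimal sketch of logging it over time on an NVIDIA card via `nvidia-smi` (fitting the RTX 3070 example in the quote). The file name, polling interval, and the choice of `nvidia-smi` are my own assumptions for illustration, not any outlet's published methodology; an AMD card would need a different tool.

```python
# Minimal sketch: poll dedicated VRAM usage while a game runs and append it to
# a CSV, so you can later plot the kind of "creeping up" trend described above.
# Assumes an NVIDIA GPU with nvidia-smi on PATH.
import csv
import subprocess
import time

def read_vram_mib() -> tuple[int, int]:
    """Return (used, total) dedicated VRAM in MiB as reported by nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    used, total = (int(x) for x in out.splitlines()[0].split(","))
    return used, total

def log_vram(path: str = "vram_log.csv", interval_s: float = 5.0) -> None:
    """Append a (timestamp, used MiB, total MiB) row every interval_s seconds."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        while True:
            used, total = read_vram_mib()
            writer.writerow([time.time(), used, total])
            f.flush()
            time.sleep(interval_s)

if __name__ == "__main__":
    log_vram()
```

Run this in the background across a few different titles and years of releases, and the resulting CSV shows essentially the trend Walton is describing.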
Clicks literally have no value to me; what I care about most is trying to inform gamers so that people aren't exploited by bad-faith actors, especially hardware manufacturers, since they dictate the limitations that game developers must work within. I'm not paid by or affiliated with any of the hardware manufacturers.
shitstirring for clicks
"Shitstirring for clicks" literally does nothing for me. It would actually be detrimental for my reputation if I were to do so.
No one should get a free pass if behaving badly. If Nvidia acts poorly they should be called out. Same for AMD, same for Intel. 0 exceptions.
Sorry, but I don't understand why this is a controversy. They have a 16GB model; if you need more than 8GB, then go and buy that. They aren't forcing your hand or limiting your options.
The problem is that games, especially the big new titles, are quite badly optimised and quickly burn through your VRAM. So even if you could, in principle, play a game with 8GB, you can't, because the game is unoptimised.
Perhaps AMD could convince game developers/publishers to spend some time learning optimisation. Many of the popular games I've seen in the past 5 years have been embarrassingly bad at working with RAM and storage specs that were considered massive until relatively recently. The techniques for doing this well have existed for generations.
I was about to comment on this. The problem isn't a lack of beefier components but a lack of interest in making efficient use of available resources.
Could you please help me understand the reason behind this?
Here's a comment I made elsewhere in this post that provides some details on what's happening:
https://lemmy.ca/comment/16934365
AMD doesn't know about Unreal slop
He said most people are playing at 1080p, and last month's Steam survey had 55% of users with that as their primary display resolution, so he's right about that. Setting aside what's needed for the 4K monitors that only 4.5% of users have as their primary display: is 8GB of VRAM really a problem at 1080p?
Absolutely. Why pay more if less is good enough?
They are open about it, and give the option to get more RAM if you want it. Fine by me.
No one with a 4K monitor will buy them anyway.
Absolutely. Why pay more if less is good enough?
Different problem, IMO.
8GB of VRAM is definitely a problem even at 1080p. There are already benchmarks from various outlets showing this. Not in every game, of course, and it definitely hurts AAA titles more than others, but it will get worse with time, and people buy GPUs to last several years, so a card shouldn't already have a major issue on the day you buy it!
Good thing the AAA games are all trash anyway, and the gems are indie games that would run on GLaDOS.
Not all indie games are low graphics requirements.
What's a trend without outliers?
- Why would they be buying a new card to play how they're already playing?
- What does the long term trend line look like?
You can confidently say that this is fine for most consumers today. There really isn't a great argument that this will serve most consumers well for the next 3 to 5 years.
It's OK if well-informed consumers are fine with a compromise for their use case.
Misrepresenting the product category and misleading less-informed consumers into believing it's not a second-rate product in the current generation is deeply anti-consumer.
You can confidently say that this is fine for most consumers today. There really isn't a great argument that this will serve most consumers well for the next 3 to 5 years.
People have been saying that for years; my 8GB card is chugging along just fine. The race to higher VRAM that people were expecting just hasn't happened. There's little reason to move on from 1080p, and the 75+ million PS5s aren't going anywhere anytime soon.
There's no NEED to move on from 1080p, but there are a lot of reasons.
I wouldn't even object to his position on 1080p if he said "the RX 5700 is fine for most users, don't waste your money and reduce e-waste".
He's telling consumers that they should expect to pay a premium price for a slightly-better-than-2019 experience until 2028 or 2030. There are gullible people who won't understand that he's selling snake oil.
He's telling consumers that they should expect to pay a premium price
What exactly do you expect him to say? Get the same GPU, but with more VRAM, at no extra cost?
If you want more performance, you pay more; that's the way it's always worked, and he's correct that 8GB is good enough for most people.
No, I expect him to gaslight naive consumers. Which is what he did. I just don't get why others are defending it.
In this case, at 1080p it's barely more performance for a lot more money. And if we're falling back on "good enough for most people" then an RX 5700 or 6600 is also "good enough for most people".
It's a free market, he can sell a low value product to suckers. That's his right. You're free to defend him and think it's not scummy. But it's scummy, and hopefully most people who know better are going to call it out.
This really feels like AMD's "don't you guys have phones" moment.
Did you read the tweet? It said most people don't need more, but that they offer a 16GB version if you do. That's completely reasonable.
At least we know Nvidia aren't the only ones being shitty; they just lead the market in it.
I've been playing Jedi Survivor off Game Pass, and the 12GB I have is getting maxed out. Wishing I'd had the extra money to get one of the 16GB GPUs.
I'll just keep using my 7900 XTX for 240Hz 4K then.
A used 7800 XT for 500€ for >60Hz 4K then.
(>60Hz as in theoretically more than 60Hz, but my monitor only supports 60, lol)
I can agree that the tweet was completely unnecessary, and the naming is extremely unfair given both variants have the exact same brand name. Even their direct predecessor did not do this.
The statement that AMD could easily sell the 16 GiB variant for 50 dollars less, and that $300 gives "plenty of room", is wildly misleading, and from that I can tell they've not factored in the BOM (bill of materials) at all.
They flatly state that GDDR6 is cheap, and I'm not sure how they figure that.
They've had the same SKU with different VRAM for a while; the RX 480 came in 4GB and 8GB.
As long as the core counts and such are the same, it's fine.
If VRAM isn't a bottleneck, performance should be equal.
As some commenters have mentioned, that was mostly fine at the time of Ellesmere (2016-ish?), when games wouldn't so frequently shoot past that limit. In today's environment, a much higher proportion of games want more than 8 GiB of VRAM, even at lower resolutions.
Notably, the most recent predecessor in this sort of segment (RX 7600 series) used the XT suffix to denote a different SKU to customers, though it's worth mentioning that the XT was introduced quite a bit later in the RDNA3 product cycle.
I agree, if that means those options are cheaper. You can be a very active gamer and not need more, so why pay more?