this post was submitted on 08 Sep 2023
312 points (100.0% liked)

Technology

37747 readers
255 users here now

A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago

Sorry if I'm not the first to bring this up. It seems like a simple enough solution.

[–] beefcat@beehaw.org 115 points 1 year ago

People did stop buying them. Their consumer GPU shipments are the lowest they've been in over a decade.

But consumer habits aren't the reason for the high prices. It's the exploding AI market. Nvidia makes even higher margins on chips they allocate to parts for machine learning in data centers. As it is, they can't make enough chips to fill the demand for AI.

[–] mojo@lemm.ee 82 points 1 year ago (8 children)

Just like Chrome will stop being anti-consumer when people stop using it. Or Blizzard will stop being terrible if people stop buying their games. People are not very good at this whole "voting with your wallet" thing.

[–] Zellith@kbin.social 39 points 1 year ago (3 children)

It's almost as if people are fucking idiots.

[–] AmosBurton_ThatGuy@lemmy.ca 11 points 1 year ago (1 children)

The unfortunate truth, ain't it.

[–] Bizarroland@kbin.social 15 points 1 year ago* (last edited 1 year ago)

What's funny is that I vote with my wallet, and when I tell my friends about it they think I'm the weird one for not having a Facebook account, Insta, or Twitter, and for not shopping at Amazon, Walmart, or Chick-fil-A.

Then I explain it and they say, "that makes sense," and not 30 minutes later they're telling me how I should look up somebody on TikTok, which I don't have, or asking about Windows 11, which I don't use, or telling me I should buy a Tesla, which I don't want, and it's all for the same reasons I keep explaining to them.

You vote with your wallet. My vote goes for people over countries and corporations.

As a side effect, countries and corporations have ensured that anyone who doesn't comply gets ostracized.

[–] TwilightVulpine@kbin.social 7 points 1 year ago

Almost like people with more money than sense can outvote everyone else.

How do you even count "people who didn't buy product X"? There could be millions more, either out of revolt or sheer disinterest, but that just doesn't matter for the companies selling a product. The only votes that end up counting are the ones from people buying.

People really need to drop that saying, because the market was never a democracy and never will be. Hell, companies can even ignore their paying customers to do something else entirely, because the ones with the most money are the investors.

[–] metaStatic@kbin.social 23 points 1 year ago

The good thing about voting with your wallet is that people with more money get more votes, the way god intended.

[–] agent_flounder@lemmy.one 10 points 1 year ago (1 children)

Well... I bought an AMD card, I have been using Firefox for a few years now, and I'm not buying anything from Blizzard. There are literally dozens like me... Unfortunately, only a small number of people know these things and have these views and care enough to boycott. These companies will continue to do what they do until there is sufficient pushback (if ever) to make it less profitable than alternatives.

[–] windlas@lemmy.ca 4 points 1 year ago

Dozens + 1, friend!

[–] greenskye@beehaw.org 10 points 1 year ago (2 children)

Almost like voting with your wallet doesn't actually work. Or it only works the same way 'communism' and 'well regulated free market capitalism' work... in theory only.

[–] agent_flounder@lemmy.one 8 points 1 year ago (1 children)

It works a lot better when there are many choices, fair competition in the market, and the traits being voted on are painfully obvious.

[–] ThePenitentOne@discuss.online 4 points 1 year ago

Because the free market is bullshit. It always results in a few major companies hand-shaking and fucking over consumers. Smaller businesses almost never have a chance and are just as easily bought out. To win in this capitalist iteration of society, you have to be the worst and greediest you can be. Add in the fact most people prefer to remain ignorant or are just generally apathetic from years of conditioning, and 'voting with your wallet' rarely really works. You should still do it though of course.

[–] averyminya@beehaw.org 6 points 1 year ago* (last edited 1 year ago)

It's a struggle. Boycotts are historically very hard to be effective and I feel that the Internet has made it even more difficult. Protests need their own marketing and companies at an international scale feel almost immune from any public movements.

That said, voting with your wallet, like boycotts, do work. They just need people to be consistent and informed. But it does work.

Look at Star Wars Battlefront 2 (the 2017 one). Prerelease, the backlash over its loot boxes earned EA's defense the most downvoted comment in Reddit history, at over 600k downvotes, and it hurt the game's prospects badly enough that they reworked the entire progression system. If gamers had just bought the game and played anyway, EA wouldn't have needed to rework anything. But they were so worried about how the game would perform that they actually made a change.

Same for Sonic the Hedgehog. He looked so, so terrible that the fear of losing money made him get fixed.

Granted, these are both examples of something being changed before full release, but the effect is the same in spirit: a corporation scared of losing money makes changes to keep making money. Voting with your wallet does work; it just needs to be marketed right. Edit: and I completely forgot the context here, which is that for something like tech, while consumers have a choice, corporations do too. That's where the struggle comes in.

[–] dumdum666@kbin.social 9 points 1 year ago (1 children)

They are also not very good at voting for politicians who actually act in their interest. It baffles me every day... what do you guys think is the reason for this?

[–] theangriestbird@beehaw.org 17 points 1 year ago (1 children)

Undereducation. The missing skill here is critical thinking, and critical thinking is something you don't usually get much practice with until college. The conservative strategy of raising the price of college, refusing to spend money on student aid, and demonizing college professors as liberal brainwashers has been quite effective at keeping their constituents away from higher education.

[–] lemann@lemmy.one 7 points 1 year ago (1 children)

I think quality of education is a big one too, but as long as teachers are underpaid, schools underfunded, understaffed and stretched as far as they can go, things can't improve ☹️

[–] theangriestbird@beehaw.org 4 points 1 year ago

I agree completely! Underfunding of public schools is all part of the plan. Congressional Republicans get to send their kids to private school while their impoverished constituents are forced to send their kids to public schools that are literally falling apart. Most of those kids learn to hate school, so they don't go to college. The cycle repeats.

[–] ripcord@kbin.social 7 points 1 year ago

The browser one is especially bad, since there are plenty of good options and they all cost nothing except the most minimal amount of time to switch to.

[–] Turun@feddit.de 6 points 1 year ago (2 children)

No, actually I don't need to buy the worse product. Privacy considerations are part of the package, just like price and performance are.

I use Firefox, because in the performance-privacy-price consideration it beats Chrome.

I have an Nvidia graphics card, because being able to run CUDA applications at home beats AMD.

[–] OrangeJoe@lemm.ee 4 points 1 year ago* (last edited 1 year ago)

Or maybe they are and you just don't like how they are voting.

I'm not saying that's actually the case, but that point of view always seems to be absent from these types of discussions.

[–] Comment105@lemm.ee 37 points 1 year ago (6 children)

All of Lemmy and all of Reddit could comply with this without it making a difference.

And the last card I bought was a 1060, a lot of us are already basically doing this.

You have not successfully unionized gaming hardware customers with this post.

[–] BlinkerFluid@lemmy.one 7 points 1 year ago* (last edited 1 year ago)

One less sale is victory enough. It's one more than before the post.

[–] JackGreenEarth@lemm.ee 27 points 1 year ago* (last edited 1 year ago) (5 children)

What other company besides AMD makes GPUs, and what other company makes GPUs that are supported by machine learning programs?

[–] meteokr@community.adiquaints.moe 24 points 1 year ago (1 children)

Exactly, Nvidia doesn't have real competition. In gaming, sure, but no one is actually competing with CUDA.

[–] jon@lemmy.tf 9 points 1 year ago (2 children)

AMD has ROCm which tries to get close. I've been able to get some CUDA applications running on a 6700xt, although they are noticeably slower than running on a comparable NVidia card. Maybe we'll see more projects adding native ROCm support now that AMD is trying to cater to the enterprise market.
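For anyone trying this: a commonly cited community workaround for running ROCm on consumer RDNA2 cards like the 6700 XT is overriding the GFX version the runtime sees. This is a hedged sketch, not official AMD guidance; the exact value depends on your card and ROCm release, so check your card's ISA name first.

```python
# Common community workaround (assumption: consumer RDNA2 card on a ROCm
# release that doesn't officially support it). The 6700 XT reports
# gfx1031; overriding to gfx1030 (i.e. "10.3.0") before the GPU runtime
# loads often lets HIP-backed apps run. Set this before importing any
# GPU framework (e.g. a ROCm build of PyTorch).
import os

os.environ.setdefault("HSA_OVERRIDE_GFX_VERSION", "10.3.0")

# ...import your ROCm-backed framework here, after the variable is set.
print(os.environ["HSA_OVERRIDE_GFX_VERSION"])
```

The key detail is ordering: the variable must be in the environment before the ROCm runtime initializes, which is why it is set at the very top of the script.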

[–] Turun@feddit.de 5 points 1 year ago

They kinda have that, yes. But it wasn't supported on Windows until this year, and in general it isn't officially supported on consumer graphics cards.

Still hoping it will improve, because AMD ships more VRAM at the same price point, but ROCm feels kinda half-assed when you look at AMD's official support investment.

[–] dudewitbow@lemmy.ml 10 points 1 year ago

AMD supports ML; it's just that a lot of smaller projects are built with CUDA backends and don't have developers available to port from CUDA to OpenCL or similar.

Some of the major ML libraries that used to be built around CUDA, like TensorFlow, already have non-CUDA branches, but that's only because TensorFlow is open source, ubiquitous in the scene, and literally has Google behind it.

ML for more niche uses is basically in a chicken-and-egg situation. People won't use other GPUs for ML because there are no devs working on non-CUDA backends. No one's working on non-CUDA backends because the devs end up buying Nvidia, which is exactly what Nvidia wants.

There are plenty of followers but a lack of leaders to move things toward a more open compute environment.
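The backend problem described above is essentially a dispatch question: a vendor-neutral project probes for each compute backend in preference order and falls back gracefully. A minimal sketch of that pattern (all names here are hypothetical illustrations, not any real library's API; real frameworks do this inside their device layer):

```python
# Hypothetical vendor-neutral backend selection: try CUDA first, then
# ROCm, then fall back to CPU. The probe functions are stand-ins; a real
# implementation would query the driver or attempt a runtime import.
from typing import Callable, Dict

def _cuda_available() -> bool:
    return False  # stand-in probe; no GPU assumed on this machine

def _rocm_available() -> bool:
    return False  # stand-in probe

# Ordered preference list: the first available backend wins.
BACKENDS: Dict[str, Callable[[], bool]] = {
    "cuda": _cuda_available,
    "rocm": _rocm_available,
    "cpu": lambda: True,  # always available as a last resort
}

def pick_backend() -> str:
    for name, available in BACKENDS.items():
        if available():
            return name
    raise RuntimeError("no compute backend found")

print(pick_backend())  # falls through to "cpu" with both probes stubbed out
```

The chicken-and-egg point is visible in the sketch: unless someone writes and maintains the non-CUDA probe and its kernels, the fallback chain effectively has one real entry.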

[–] coffeetest@kbin.social 10 points 1 year ago

My Intel Arc 750 works quite well at 1080p and is perfectly sufficient for me. If people need hyper refresh rates and resolutions and all the bells and whistles, well then, have fun paying for it. But if you need functional, competent gaming, at US$200 the Arc is nice.

[–] MiddledAgedGuy@beehaw.org 26 points 1 year ago (8 children)

Funnily enough, about an hour before reading this post I bought an AMD card. And I've been using Nvidia since the early '00s.

For me it's the good Linux support. Tired of dealing with their drivers.

Will losing me as a customer make a difference to NVIDIA? Nope. Do I feel good about ditching a company that doesn't treat me well as a consumer? Absolutely!

[–] BlinkerFluid@lemmy.one 6 points 1 year ago* (last edited 1 year ago) (5 children)

Suddenly your video card is as mundane and trivial a solved problem as your keyboard or mouse.

It just works and you never have to even think about it.

To even consider that a reality as someone who's used Linux since Ubuntu 8.10.... I feel spoiled.

[–] sandriver@beehaw.org 5 points 1 year ago (1 children)

Absolutely indeed! I'll never buy an Nvidia card because of how anti-customer they are. It started with them locking out PCI passthrough when I was building a gaming Linux machine like ten years ago.

I wonder if moving people towards the idea of just following the companies that don't treat them with contempt is an angle that will work. I know Steph Sterling's video on "consumer" vs "customer" helped crystallize that attitude in me.

[–] 520@kbin.social 26 points 1 year ago (1 children)

The problem is, the ML people will continue to buy Nvidia. They don't have a choice.

[–] StereoTypo@beehaw.org 18 points 1 year ago

Even a lot of CG professional users are locked into NVIDIA due to software using CUDA. It sucks.

[–] geosoco@kbin.social 19 points 1 year ago (2 children)

In this particular case, it's a bit more complicated.

I suspect the majority of 30x0 & 40x0 card sales continue to be for non-gaming or hybrid uses. I suspect that if pure gamers stopped buying them today for months, it wouldn't make much of a difference to their bottom line.

Until there's reasonable competition for training AI models at reasonable prices, people are going to continue buying their cards because it's the most cost-effective thing -- even at the outrageous prices.
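"Most cost-effective even at outrageous prices" usually comes down to price per unit of training throughput rather than sticker price. A back-of-envelope sketch of that comparison (the prices and TFLOPS figures below are made-up placeholders, not real benchmarks; substitute measured numbers for the cards you're comparing):

```python
# Back-of-envelope price-per-throughput comparison for ML buyers.
# All figures are hypothetical placeholders for illustration only.
cards = {
    "vendor_a_flagship": {"price_usd": 1600, "fp16_tflops": 300},
    "vendor_b_flagship": {"price_usd": 1000, "fp16_tflops": 120},
}

for name, c in cards.items():
    usd_per_tflop = c["price_usd"] / c["fp16_tflops"]
    print(f"{name}: ${usd_per_tflop:.2f} per FP16 TFLOP")
```

With these placeholder numbers the pricier card still wins on dollars per TFLOP, which is exactly the point above: until a competitor beats that ratio for model training, raw price alone won't move buyers.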

[–] JokeDeity@lemm.ee 14 points 1 year ago (4 children)

I mean, you could also say they'll stop price gouging when competitors can meet their quality and support level. What's the alternative?

[–] potustheplant@feddit.nl 6 points 1 year ago (1 children)

The reason Nvidia has the R&D budget it has is that you buy their cards. AMD is only now matching them on that; they used to have about half the resources.

[–] JokeDeity@lemm.ee 7 points 1 year ago (1 children)

I mean, I buy my cards second hand because I'm dirt poor, but I still want the best option for my money. It's a hard sell to convince people to buy inferior products to expand the market.

[–] potustheplant@feddit.nl 9 points 1 year ago (3 children)

Except that most of Nvidia's features are borderline gimmicks or won't have a long lifespan. DLSS will eventually die and we'll all use FSR, just like what happened with G-Sync vs FreeSync. Raytracing is still too taxing on all cards, from all brands, to be worth it. RTX Voice (just like AMD's noise suppression) isn't that useful (I don't even want it). What else do you have that could make an Nvidia card "superior"? The only thing I can think of is that the most recent Nvidia cards are way more energy efficient than AMD's.
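For context on what DLSS/FSR-style upscalers actually do: each quality mode renders internally at a fraction of the output resolution and upscales the result. A small sketch using FSR 2's published per-axis scale factors (verify against AMD's documentation for your FSR version; DLSS uses similar ratios):

```python
# Internal render resolution for upscaler quality modes. Scale factors
# are FSR 2's documented per-axis ratios (e.g. "quality" renders each
# axis at 1/1.5 of output size).
SCALE = {
    "quality": 1.5,
    "balanced": 1.7,
    "performance": 2.0,
    "ultra_performance": 3.0,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Return the internal (width, height) the GPU actually renders."""
    s = SCALE[mode]
    return round(out_w / s), round(out_h / s)

print(render_resolution(3840, 2160, "quality"))      # (2560, 1440)
print(render_resolution(2560, 1440, "performance"))  # (1280, 720)
```

This is why the DLSS-vs-FSR fight matters less than it looks: both sides are competing on how well they hide the same underlying resolution cut.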

[–] BobKerman3999@feddit.it 11 points 1 year ago (4 children)

They have enough datacenters buying their products; they don't need consumers, so they can wait longer than the average user...

I personally am done with gaming, because if I must spend a thousand euros, I'll spend it on bicycle parts and accessories, not on a single piece of hardware for a decent PC.

[–] bleistift2@feddit.de 10 points 1 year ago (2 children)

There are also games that don’t render a square mile of a city in photorealistic quality.

[–] Valdair@kbin.social 5 points 1 year ago (2 children)

Graphical fidelity has not materially improved since the days of Crysis 1, 16 years ago. The only two meaningful changes in how demanding games should be to run in that time are that 1440p and 2160p have become more common, and raytracing. But consoles being content to run at dynamic resolutions and 30fps, combined with tools developed to make raytracing palatable (DLSS), have made developers complacent about their games running like absolute garbage even on mid-spec hardware that should have no trouble with 1080p/60fps.

Destiny 2 was famously well optimized at launch. I was running an easy 1440p/120fps in pretty much all scenarios maxed out on a 1080 Ti. The more new zones come out, the worse performance seems to be in each, even though I now have a 3090.

I am loving BG3 but the entire city in act 3 can barely run 40fps on a 3090, and it is not an especially gorgeous looking game. The only thing I can really imagine is that maxed out the character models and armor models do look quite nice. But a lot of environment art is extremely low-poly. I should not have to turn on DLSS to get playable framerates in a game like this with a Titan class card.

Nvidia and AMD just keep cranking the power on the cards, they're now 3+ slot behemoths to deal with all the heat, which also means cranking the price. They also seem to think 30fps is acceptable, which it just... is not. Especially not in first person games.
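The console trick mentioned above, dynamic resolution, is at heart a feedback loop on frame time: drop the render scale when a frame blows its budget, raise it when there's headroom. A toy sketch of the idea (the thresholds and step size are arbitrary choices for illustration, not any engine's actual tuning):

```python
# Toy dynamic-resolution controller: nudge render scale down when a
# frame misses its budget, back up when there's comfortable headroom.
TARGET_MS = 1000 / 60   # 60 fps frame budget (~16.7 ms)
STEP = 0.05             # arbitrary adjustment step
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def adjust_scale(scale: float, frame_ms: float) -> float:
    if frame_ms > TARGET_MS:          # over budget: render fewer pixels
        scale -= STEP
    elif frame_ms < TARGET_MS * 0.9:  # clear headroom: sharpen back up
        scale += STEP
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in [20.0, 19.0, 18.0, 14.0, 13.0]:  # simulated frame times
    scale = adjust_scale(scale, frame_ms)
print(f"{scale:.2f}")
```

The complacency complaint follows directly: with a loop like this absorbing performance problems, a game can ship heavy and simply render blurrier instead of being optimized.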

[–] peter@feddit.uk 9 points 1 year ago

You know that, I know that. Most people browsing here know that. Anyone you speak to who buys GPUs would probably also agree. Still not gonna change.

[–] gnuplusmatt@startrek.website 9 points 1 year ago

Similarly, if Nvidia wanted me to buy their cards they'd get their drivers sorted out. Plugging in an AMD card just works with literally no setup from me, so Nvidia will get no money from me.

[–] S13Ni@lemmy.studio 9 points 1 year ago

I went with the best AMD card I could get at the time, a Sapphire Nitro 7900 XTX. For gaming it's already really good: I can use raytracing, although not on the best settings in some games, but in most cases I can just max out the settings.

Main complaint atm is that self-hosting AI is much more annoying than it would be on Nvidia. I usually get everything to work eventually, but it always needs that extra level of technical fuckery.

[–] mateomaui@reddthat.com 7 points 1 year ago (6 children)

Well, that won't happen, because they're still the best option for compatibility unless you're using Linux.

[–] Psythik@lemm.ee 6 points 1 year ago* (last edited 1 year ago)

Well, when AMD finally catches up with Nvidia and offers a high-end GPU with frame generation and decent raytracing, I'll gladly switch. I'd love nothing more than an all-AMD PC.

[–] excel@lemmy.megumin.org 4 points 1 year ago

And as soon as they have any competitors, we might consider it.
