this post was submitted on 27 Jan 2025
883 points (98.1% liked)
Technology
It requires only about 5% of the hardware OpenAI needs to do the same thing. That can mean fewer top-end cards, and it can also run on less powerful cards (not top of the line).
Should their models become the standard or see wider use, Nvidia's sales will drop.
Doesn’t this just mean that now we can make models 20x more complex using the same hardware? There are many more problems that advanced deep learning models could potentially solve that are far more interesting and useful than a chatbot.
I don’t see how this ends up bad for Nvidia in the long run.
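The "5% of the hardware" and "20x more complex" figures in the comments above are really the same back-of-envelope arithmetic. A minimal sketch, using the thread's claimed numbers as illustrative assumptions rather than measurements:

```python
# Back-of-envelope check: the "5% of the hardware" claim and the
# "20x more complex models" claim are two sides of the same number.
# All figures here are illustrative assumptions from the thread.

openai_compute = 1.0        # normalized compute budget for a given model
deepseek_fraction = 0.05    # claimed fraction of that hardware DeepSeek needs

# Same budget, divided by the cheaper per-model cost -> capacity headroom.
headroom = openai_compute / deepseek_fraction
print(f"~{headroom:.0f}x the model capacity on the same hardware")
```

This only shows that the two claims are arithmetically consistent (1 / 0.05 = 20); whether model "complexity" actually scales linearly with available compute is a separate question.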
Honestly none of this means anything at the moment. This might be some sort of calculated trickery from China to give Nvidia the finger, or Biden the finger, or a finger to Trump's AI infrastructure announcement a few days ago, or some other motive.
Maybe this "selloff" is masterminded by the big wall street players (who work hand-in-hand with investor friendly media) to panic retail investors so they can snatch up shares at a discount.
What I do know is that "AI" is a very fast moving tech and shit that was true a few months ago might not be true tomorrow - no one has a crystal ball so we all just gotta wait and see.
There could be some trickery on the training side, i.e. maybe they spent way more than $6M to train it.
But it is clear that they did it without access to the infra that big tech has.
And on the inference side, we can all verify how well it runs, and people are also running it locally without internet access. There is no trickery there.
They are 20x cheaper than OpenAI if you run it on their servers, and if you run it yourself, you only need a small investment in relatively affordable hardware.
Put that statement in front of investors who aren't especially tech-savvy, and it could spook them into a sell-off.
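The "20x cheaper" claim can be made concrete with a per-token cost comparison. A minimal sketch, where the prices and token volume are hypothetical placeholders (not actual published rates) and only the 20x ratio comes from the thread:

```python
# Rough API cost comparison per million tokens.
# Prices below are placeholder assumptions, not real quotes;
# only the 20x ratio is taken from the discussion.

openai_price = 60.00                  # hypothetical $/1M tokens
deepseek_price = openai_price / 20    # the claimed "20x cheaper"

tokens_millions = 50                  # hypothetical monthly usage
openai_bill = openai_price * tokens_millions
deepseek_bill = deepseek_price * tokens_millions

print(f"OpenAI-style bill:   ${openai_bill:.2f}")    # $3000.00
print(f"DeepSeek-style bill: ${deepseek_bill:.2f}")  # $150.00
```

At any realistic usage volume the absolute gap is large, which is exactly the kind of headline number that could rattle non-technical investors.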