this post was submitted on 13 Nov 2024
669 points (94.9% liked)

Technology

59972 readers
2560 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 2 years ago
MODERATORS
[–] Ultraviolet@lemmy.world 3 points 1 month ago (6 children)

Because novelty is all it has. As soon as it stops improving in a way that makes people say "oh, that's neat," it has to stand on the practical merits of its capabilities, which are, well, not much.

[–] theherk@lemmy.world 7 points 1 month ago (5 children)

I’m so baffled by this take. “Create a terraform module that implements two S3 buckets with cross-region bidirectional replication. Include standard module files like linting rules and enable precommit.” Could I write that? Yes. But does this provide an outstanding stub to start from? Also yes.
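For context, the stub such a prompt describes might look something like this. This is a hypothetical sketch, not output from any model: the bucket names and regions are placeholders, and the IAM replication role and the second half of the mirrored configuration are elided for brevity.

```hcl
# Sketch: two S3 buckets in different regions, set up for
# bidirectional replication. Names and regions are placeholders.
provider "aws" {
  alias  = "east"
  region = "us-east-1"
}

provider "aws" {
  alias  = "west"
  region = "eu-west-1"
}

resource "aws_s3_bucket" "east" {
  provider = aws.east
  bucket   = "example-replication-east"
}

resource "aws_s3_bucket" "west" {
  provider = aws.west
  bucket   = "example-replication-west"
}

# Versioning must be enabled on both buckets for replication to work.
resource "aws_s3_bucket_versioning" "east" {
  provider = aws.east
  bucket   = aws_s3_bucket.east.id
  versioning_configuration {
    status = "Enabled"
  }
}

# ...mirror the versioning block for the west bucket, then declare one
# aws_s3_bucket_replication_configuration in each direction, each
# referencing an IAM role that grants s3:ReplicateObject and related
# permissions on the destination bucket.
```

Whether or not the generated version is production-ready, a skeleton like this is the kind of starting point the comment is describing.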

And beyond programming, it is having a positive impact on science and medicine too. I mean, anybody who doesn’t see any merit has their head in the sand. That of course must be balanced against not falling for the hype, but the merits are very real.

[–] Eccitaze@yiffit.net 5 points 1 month ago (2 children)

There's a pretty big difference between chatGPT and the science/medicine AIs.

And keep in mind that for LLMs and other chatbots, it's not that they aren't useful at all but that they aren't useful enough to justify their costs. Microsoft is struggling to get significant uptake for Copilot addons in Microsoft 365, and this is when AI companies are still in their "sell below cost and light VC money on fire to survive long enough to gain market share" phase. What happens when the VC money dries up and AI companies have to double their prices (or more) in order to make enough revenue to cover their costs?

[–] obbeel 1 point 1 month ago

I understand that it makes less sense to spend on model size if it isn't giving back performance, but then why is so much money being spent on larger LLMs?
