this post was submitted on 21 Oct 2024
932 points (97.3% liked)

Technology

[–] GeneralInterest@lemmy.world 45 points 1 month ago (2 children)

Maybe it's like the dotcom bubble: there is genuinely useful tech that has recently emerged, but too many companies are trying to jump on the bandwagon.

LLMs do seem genuinely useful to me, but of course they have limitations.

[–] linearchaos@lemmy.world 13 points 1 month ago (2 children)

We need to stop viewing it as artificial intelligence. The parts that are worth money are just more advanced versions of machine learning.

Being able to assimilate a few dozen textbooks and pass a bar exam is a neat parlor trick, but it is still just a parlor trick.

Unfortunately, the biggest thing to come out of it will probably be the marketing aspect. If companies spend enough money training small models on our wants and likes, it will give them tremendous returns.

The key to using it in a financially successful manner is finding problems that fit the bill. Training costs are fairly high, and quality content generation is also rather expensive. There are sticky problems around training it on non-free data. Whatever you're going to use it for needs a significant enough advantage to make the cost of training and data worth it.

I still think we're eventually going to see its use in education rise. The existing tools for small content generation, like Adobe's use of it to fill in small areas, are leaps and bounds better than the old content-aware patches. We've been using it for ages for speech recognition and speech generation. From there, it's relatively good at helper roles: minor application development, copy editing, maybe some VFX generation eventually. Things where you still need a talented individual to oversee it, but it can lessen the workload.

There are lots of places where it's being used where I think it's a particularly poor fit: AI help desk chatbots and IVR scenarios. It's as brain-dead as the original phone trees and flow charts that we've been following for decades.

[–] SparrowRanjitScaur@lemmy.world 4 points 1 month ago

Machine learning is AI. I think the term you're looking for is artificial general intelligence, and no one is claiming LLMs fall under that label.

[–] datelmd5sum@lemmy.world 10 points 1 month ago (3 children)

We're hitting logarithmic scaling with model training. GPT-5 is going to cost 10x more than GPT-4 to train, but are people going to pay $200/month for a GPT-5 subscription?
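To put the "10x training cost, $200/month" question in perspective, here's a back-of-envelope sketch. Every number in it is a hypothetical illustration (the GPT-4 cost is a rough public estimate, not a disclosed figure), just to show how many subscribers that price would need to cover a 10x cost jump:

```python
# Back-of-envelope: if each GPT generation costs ~10x more to train,
# how many subscribers does a $200/month plan need to recoup it in a year?
# All figures below are illustrative assumptions, not real OpenAI numbers.

gpt4_training_cost = 100e6          # assumed ~$100M for GPT-4 (rough estimate)
scale_factor = 10                   # the "10x per generation" claim above
gpt5_training_cost = gpt4_training_cost * scale_factor

monthly_price = 200                 # the $200/month figure from the comment
months = 12

subscribers_needed = gpt5_training_cost / (monthly_price * months)
print(f"GPT-5 training cost (assumed): ${gpt5_training_cost:,.0f}")
print(f"Subscribers needed to break even in a year: {subscribers_needed:,.0f}")
```

Under these made-up numbers it comes out to a few hundred thousand subscribers just to cover training, before any inference, staffing, or data costs.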

[–] madis@lemm.ee 4 points 1 month ago (1 children)

But it would use less energy afterwards, wouldn't it? At least that was the claim for the 4o model, for example.

[–] fuck_u_spez_in_particular@lemmy.world 3 points 1 month ago (1 children)

4o is also not really much better than 4; they likely optimized it, among other things, by reducing the model size. In my experience, the "intelligence" has somewhat degraded over time. A bigger model (which in the past was the deciding factor for better intelligence) also needs more energy, and GPT-5 will likely be much bigger than 4 unless they somehow make a breakthrough in training/optimization of the model...
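The "bigger model needs more energy" point follows from a common rule of thumb: a dense transformer does roughly 2 * N FLOPs per generated token, where N is the parameter count. The sketch below uses purely hypothetical parameter counts (neither model's size is public) to show how shrinking a model cuts per-token compute, and therefore energy:

```python
# Rule of thumb: ~2 * N FLOPs per generated token for a dense transformer,
# where N is the parameter count. Halving model size roughly halves
# inference compute (and energy) per token.
# Parameter counts below are hypothetical, not disclosed figures.

def inference_flops_per_token(n_params: float) -> float:
    """Approximate forward-pass FLOPs per generated token for a dense model."""
    return 2 * n_params

big_model = 1.0e12      # hypothetical 1T-parameter model
small_model = 2.0e11    # hypothetical 200B-parameter optimized model

ratio = inference_flops_per_token(big_model) / inference_flops_per_token(small_model)
print(f"Compute per token, big vs small: {ratio:.0f}x")
```

This is why a smaller "o"-style model can be cheaper to serve even if it wasn't cheaper to train; sparse/mixture-of-experts designs complicate the rule of thumb, but the direction holds.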

[–] hglman@lemmy.ml 2 points 1 month ago

4o is an optimization of the model evaluation phase. The loss of intelligence is due to the addition of more and more safeguards and constraints, via adjunct models doing fine-tuning, or just rules that limit whole classes of responses.

[–] Skates@feddit.nl 4 points 1 month ago

Is it necessary to pay more, or is it enough to just pay for more time? If the product is good, it will be used.

[–] GeneralInterest@lemmy.world -2 points 1 month ago

Businesses might pay big money for LLMs to do specific tasks. And if chip makers invest more in NPUs, then maybe LLMs will become cheaper to train. But I'm just speculating; I don't have any special knowledge of this area whatsoever.