this post was submitted on 21 May 2024
120 points (86.6% liked)
Short term yes; long term probably not. All the dipshit c-suites pushing the “AI” worker replacement initiatives are going to destroy their workforces and then realize that LLMs can’t actually reliably replace any of the workers they fired. And I love that for management.
It can potentially allow one worker to do the job of ten. From the perspective of the other nine, they've been replaced. I don't think they'll care much about the nuance that they technically weren't replaced by AI, but by one co-worker using AI to be more efficient.
That doesn't necessarily mean that we won't have enough jobs any more, because when in human history have we ever become more efficient and said "ok, good enough, let's just coast now"? We will just increase the ambition and scope of what we will build, which will require more workers working more efficiently.
But that still really sucks, because it's not going to be the exact same jobs and it will require retraining. These disruptions are becoming more frequent in human history, and it's exhausting.
We still need to spread these gains so we can all do less, and also help those whose lives have been disrupted. Unfortunately that doesn't come for free: when workers won the 40-hour work week, they had to take it by force.
My colleagues are starting to use AI, and it just makes their code worse and harder to review. I honestly can't imagine that changing; AI doesn't actually understand anything.
This comment has similar vibes to a boomer in the 80s saying that the Internet is useless and full of nothing but nerds arguing on forums, and he doesn't see that changing.
Probably. I'm just not seeing it actually do any logic or problem solving; it's a pattern-matching machine today. A new technology that changes that could certainly come along.
Do you know what pattern matching is great for? Finding commonly cited patterns in long debug logs. LLMs are great for brainstorming while problem solving. They're basically word-granularity search engines, so they're great for looking up more niche knowledge that document search engines fail on. If the thing you're trying to look up doesn't exist, they will make shit up, so you need to cross-reference everything, but it's still incredibly helpful. Pattern matching is also great for boilerplate. I use the codium extension, and it comes up with autocomplete suggestions that don't have much logic but save a good amount of keystrokes.
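To make the boilerplate point concrete, here's a rough sketch (a hypothetical example, not output from any particular tool) of the kind of low-logic code where autocomplete suggestions save keystrokes: once the first field or method is typed, the rest follows a pattern a completion model can fill in.

```python
from dataclasses import dataclass


@dataclass
class User:
    id: int
    name: str
    email: str

    def to_dict(self) -> dict:
        # Mechanical, pattern-shaped code like this is what completion models
        # tend to fill in reliably: no real logic, just repeated structure.
        return {"id": self.id, "name": self.name, "email": self.email}

    @classmethod
    def from_dict(cls, data: dict) -> "User":
        return cls(id=data["id"], name=data["name"], email=data["email"])
```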
I don't think the foundational tech of LLMs is going to get substantially better, but we will develop programming patterns that make them more robust and reliable.
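One such pattern might look like this (a minimal sketch; `call_llm` is a hypothetical stand-in for whatever model API is in use, not a real library call): never trust raw model output directly, but ask for a structured response, validate it mechanically, and retry or fail loudly instead of letting made-up output flow downstream.

```python
import json


def call_llm(prompt: str) -> str:
    """Placeholder for whatever model API you're actually using (hypothetical)."""
    raise NotImplementedError


def extract_fields(prompt: str, required_keys: list[str], max_attempts: int = 3) -> dict:
    """Ask the model for JSON, validate it, and retry rather than trusting the first answer."""
    for _ in range(max_attempts):
        raw = call_llm(prompt + "\nRespond with JSON only.")
        try:
            data = json.loads(raw)
        except json.JSONDecodeError:
            continue  # Malformed output: try again instead of crashing downstream code.
        if all(key in data for key in required_keys):
            return data  # Output passed the checks we can verify mechanically.
    raise ValueError(f"No valid response after {max_attempts} attempts")
```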