this post was submitted on 07 Apr 2024
338 points (93.1% liked)

Technology

59223 readers
3500 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related content.
  3. Be excellent to each another!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, to ask if your bot can be added please contact us.
  9. Check for duplicates before posting, duplicates may be removed

Approved Bots


founded 1 year ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] misspacific@lemmy.blahaj.zone 17 points 7 months ago (26 children)

exactly, this will eliminate some jobs, but anyone who's asked an LLM to fix code longer than 400 lines knows it often hurts more than it helps.

which is why it's best used as a tool to debug code or write boilerplate functions.
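as an aside, a made-up example of the kind of boilerplate being referred to here (the class, fields, and validation are all hypothetical, not anything from the thread): a small data-holder with a sanity check, the sort of thing one might hand off to an LLM rather than type out by hand.

```python
# Hypothetical boilerplate: a simple data-holder class with basic
# validation, the kind of rote code an LLM handles reasonably well.
from dataclasses import dataclass


@dataclass
class User:
    name: str
    email: str

    def __post_init__(self):
        # Minimal sanity check; real validation would be stricter.
        if "@" not in self.email:
            raise ValueError(f"invalid email: {self.email}")


u = User("Ada", "ada@example.com")
print(u.name)  # -> Ada
```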

[–] hansl@lemmy.world -5 points 7 months ago* (last edited 7 months ago) (11 children)

You’ll get blindsided real quick. AIs are only getting better. OpenAI have already said they’ve moved past GPT for their next models. It won’t be 5 years before it can fix code longer than 400 lines, and not 20 before it can digest a specification and produce working software. That software might not be optimized or pretty, but those are things people can work on separately. Where you needed 20 software engineers, you’ll need 10, then 5, then 1-2.

You have more in common with the guy getting replaced today than you care to admit in your comment.

Edit: not sure why I’m getting downvoted instead of having a discussion, but good luck to you all in your careers.

[–] misspacific@lemmy.blahaj.zone 7 points 7 months ago (5 children)

i didn't downvote you; regardless, internet points don't matter.

you're not wrong, and i largely agree with what you've said, but i didn't actually say a lot of the things your comment assumes.

the most efficient way i can describe what i mean is this:

LLMs (this is NOT AI) can, and will, replace more and more of us. however, there will never, ever be a time when no human is overseeing it, because we design software for humans (generally), not for machines. that requires inherently human knowledge, assumptions, intuition, etc.

[–] hansl@lemmy.world 1 points 7 months ago* (last edited 7 months ago) (1 children)

LLMs (this is NOT AI)

I disagree. When I was studying AI at college 20+ years ago, we were also talking about expert systems, which are glorified if/else chains. Most experts in the field agree that those systems can also be considered AI (not ML, though).
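A minimal sketch of that kind of rule-based expert system, with entirely made-up rules and facts, just to show the "glorified if/else chain" shape:

```python
# Toy expert system: each branch is a hand-written
# condition -> conclusion rule, checked in order. All rules and
# symptom names here are invented for illustration.
def diagnose(symptoms):
    if "no_power" in symptoms:
        return "check power supply"
    elif "beeps_on_boot" in symptoms and "no_display" in symptoms:
        return "reseat RAM"
    elif "overheating" in symptoms:
        return "clean fans and reapply thermal paste"
    else:
        return "no rule matched; escalate to a human"


print(diagnose({"beeps_on_boot", "no_display"}))  # -> reseat RAM
```

No learning happens anywhere; the "expertise" is whatever a human encoded into the chain, which is why such systems count as AI but not ML.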

You may be thinking of AGI, or artificial general intelligence, which is different. I am a believer in the singularity (that a machine will be as creative and conscious as a human), but that’s a matter of opinion.

I didn’t downvote you

I was using “you” more towards the people downvoting me, not you directly. You can see the accounts who downvoted/upvoted, btw.

Edit: and I assumed the implication of your comment was that “people who code are safe,” which is the stretch I was responding to. Your comment was ambiguous either way.

[–] misspacific@lemmy.blahaj.zone 3 points 7 months ago (2 children)

jesus christ you should be shoved into a locker

[–] hansl@lemmy.world 0 points 7 months ago (1 children)

Wow. Thanks for the advice. I guess that’s just Lemmy showing me the door. Good luck with your community here.

[–] Welt@lazysoci.al 3 points 7 months ago

Try not to let the bot hurt your feelings, it was trained on cunts 'n' assholes
