How come the hallucinating ghost in the machine is generating code so bad the production servers hallucinate even harder and crash?
I’m not sure how AI is supposed to understand code. Most of the code out there is garbage. Even most of the working code in the world today is garbage.
Heck, I sometimes can’t understand my own code. And this AI thing tries to tell me I should move this code over there, do this and that, and then, poof, it doesn’t compile anymore. The thing is even more clueless than me.
Randomly rearranging non-working code one doesn’t understand… sometimes it gets working code, sometimes it doesn’t fix the bug, sometimes it won’t even compile anymore? It has no clue what the problem is and only solves it by accident?
Sounds like the LLM is as capable as me /s
Sometimes you even get newer and more interesting bugs!
As a senior dev, this sounds like job security. :)
You know you’re Sr. when it doesn’t even bother you anymore. It amuses you.
My boss comes to me saying we must finish feature X by date Y or else.
Me: [reaction image]
We're literally in this mess right now. Basically, the product team set out some goals for the year, and we pointed out early on that feature X was going to have a ton of issues. Halfway through the year, my boss (the director) tells the product team we need to start feature X immediately or we risk missing the EOY goals. The product team gets all the pre-reqs finished about 2 months before EOY (our "year" ends this month), and surprise surprise, there are tons of issues and we're likely to miss the deadline. The product team is freaking out about their bonuses, whereas I'm chuckling in the corner, pointing to the multiple times we told them it was going to have issues.
There's a reason you hire senior engineers, and it's not to wave a magic wand and fix all the issues at the last minute; it's to tell you your expectations are unreasonable. The process should be:
1. Product proposes a feature.
2. Engineering estimates it and flags the risks.
3. Scope or timeline gets adjusted based on that feedback.
4. Everyone commits to the plan they actually agreed to.
If you skip some of those steps, you're going to have a bad time.
In my experience, the job of a sr. revolves around expectations: your own, the customer's, your bosses', and those of the juniors and individual contributors working with you or that you're tasking. It's managing those expectations, and understanding how these things go, to protect your guys and gals and to save management from poking out their own eyes.
And you may actually have time to do some programming.
Yup. I actually only take a 50% workload because half of my time is spent in random meetings telling people no, or giving obscenely high estimates that essentially amount to "no." The other half of my time is fixing problems from when they didn't listen when I said "no."
Such is life I guess. But occasionally, I get to work on something new. And honestly, that's fine, I've long since stopped caring about my name showing up on things.
Not all heroes wear capes. You're saving their butts, and they don't know it.
Can confirm. At our company, we have a tech debt budget, which is really awesome since we can fix the worst of the problems. However, we generate tech debt faster than we can fix it. Adding AI to the mix would just generate tech debt even faster, because instead of senior devs reviewing junior dev code, we'd have junior devs reviewing AI code...
LLMs are not supposed to understand, they are supposed to pretend to understand.
You have to be hallucinating to understand.
I’ve licked the frog twice! How many does it take?
I take it that frog hadn't been de-boned.