this post was submitted on 17 Mar 2025
1348 points (99.7% liked)

Programmer Humor

Post funny things about programming here! (Or just rant about your favourite programming language.)

[–] thickertoofan@lemm.ee 8 points 1 day ago

taste of his own medicine

[–] can@sh.itjust.works 58 points 2 days ago* (last edited 1 day ago)
[–] mindbleach@sh.itjust.works 45 points 2 days ago (9 children)

An otherwise meh article concluded with "It is in everyone’s interest to gradually adjust to the notion that technology can now perform tasks once thought to require years of specialized education and experience."

Much as we want to point and laugh - this is not some loon's fantasy. This is happening. Some dingus told spicy autocomplete 'make me a database!' and it did. It's surely as exploit-hardened as a wet paper towel, but it functions. Largely as a demonstration of Kernighan's law: debugging is twice as hard as writing the code in the first place.

This tech is borderline miraculous, even if it's primarily celebrated by the dumbest motherfuckers alive. The generation and the debugging will inevitably improve to where the machine is only as bad at this as we are. We will be left with the hard problem of deciding what the software is supposed to do.

[–] HiddenLayer555@lemmy.ml 6 points 1 day ago* (last edited 1 day ago) (1 children)

It is in everyone’s interest to gradually adjust to the notion that technology can now perform tasks once thought to require years of specialized education and experience.

The years of specialized education and experience are not for writing code in and of itself. Anyone with an internet connection can learn to do that fairly quickly. What takes years to perfect is writing reliable, optimized, secure code; communicating and working efficiently with others; writing code that can be maintained by others long after you leave; knowing the theory behind why code written one way works better than code written another way; and knowing the qualitative and quantitative measures that let you even assess whether one piece of code is "better" than another. Source: I self-learned programming, started building stuff on my own, and then went through an actual computer science program. You miss so much nuance and underlying theory when you self-learn, which translates directly into bad code that's a nightmare to maintain.

Finally, the most important thing about a person with years of specialized education and experience is that you can actually have a conversation with them about their code: ask them to explain in detail how it works and the process they used to write it, then ask follow-up questions and request further clarification. Trying to get AI to explain itself is a complete shitshow, and while humans do have a propensity to make shit up to cover their own/their coworkers' asses, AI does that even when it makes no sense not to tell the truth, because it doesn't really know what "the truth" is or why other people would want it.

Will AI eventually catch up? Almost certainly, but we're nowhere close to that right now. Currently it's less like an actual professional developer and more like someone who knows just enough to copy-paste snippets from Stack Overflow and hack them together into a program that manages to compile.

I think the biggest takeaway with AI programming is not that it can suddenly do just as well as someone with years of specialized education and experience, but that we're going to get a lot more shitty software that looks professional on the surface but is a dumpster fire inside.

[–] Takumidesh@lemmy.world 33 points 1 day ago (18 children)

This is satire / trolling for sure.

LLMs aren't really at the point where they can spit out an entire program, including handling deployment, environments, etc. without human intervention.

If this person is 'not technical' they wouldn't have been able to successfully deploy and interconnect all of the pieces needed.

The AI may have been able to spit out snippets, and those snippets may be very useful, but as it stands, it's just not going to be able to write the software, stand up the DB, and deploy all of the services needed with no human supervision/overrides. With human guidance, sure, but without someone holding the AI's hand it just won't happen (remember, this person is 'not technical').

[–] allo@sh.itjust.works 28 points 1 day ago (1 children)

Idk, I've seen some crazy complicated stuff woven together by people who can't code. I've got a friend who has no job and is trying to make a living off coding while having been totally unable to learn coding for 15+ years. Some of the things they make are surprisingly complex. Though also (and the person mentioned here may do similarly) they don't ONLY use AI. They use GitHub a lot too. They make nearly nothing themselves, but go through GitHub and basically combine large chunks of code others have made with AI-generated code. Somehow they do it well enough to have done things with servers, cryptocurrency, etc... all the while not knowing any coding language.

[–] Maggoty@lemmy.world 25 points 1 day ago

That reminds me of this comic strip....

Claude Code can make something that works, but it's kinda over-engineered and really struggles to make an elegant solution that maximises code reuse - it's the opposite of DRY.

I'm doing a personal project at the moment and used it for a few days, made good progress but it got to the point where it was just a spaghetti mess of jumbled code, and I deleted it and went back to implementing each component one at a time and then wiring them together manually.

My current workflow is basically: never let them work on more than one file at a time, and build the app one component at a time, starting at the ground level and working up. For example:

  1. Create base classes that things will extend, then create an example data model class and iterate on that architecture A LOT until it's really elegant.

  2. Then I've been getting it to write me a generator for the models - not the actual model code itself.

  3. Then (level 3) we start with the UI layer, so now we make a UI kit the app will use and reuse for different components.

  4. Then we make a UI component that will be used in a screen. I'm using Flutter as an example, so it would be a stateless widget.

  5. We now write tests for the component (steps 4 and 5 are sketched just below this list).

  6. Now we do a screen, and I import each of the components.
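
As a rough illustration of steps 4 and 5 - the widget, file, and package names here are hypothetical, not from any real project - a reusable stateless widget and its widget test might look something like this:

```dart
// lib/ui_kit/primary_button.dart
import 'package:flutter/material.dart';

/// Step 4: a small, reusable UI-kit component built before any screen uses it.
class PrimaryButton extends StatelessWidget {
  const PrimaryButton({super.key, required this.label, required this.onPressed});

  final String label;
  final VoidCallback onPressed;

  @override
  Widget build(BuildContext context) {
    return ElevatedButton(
      onPressed: onPressed,
      child: Text(label),
    );
  }
}
```

```dart
// test/primary_button_test.dart
import 'package:flutter/material.dart';
import 'package:flutter_test/flutter_test.dart';
import 'package:my_app/ui_kit/primary_button.dart'; // hypothetical package name

/// Step 5: a widget test for the component, written before it goes into a screen.
void main() {
  testWidgets('shows its label and fires onPressed when tapped', (tester) async {
    var tapped = false;

    await tester.pumpWidget(
      MaterialApp(
        home: Scaffold(
          body: PrimaryButton(label: 'Save', onPressed: () => tapped = true),
        ),
      ),
    );

    expect(find.text('Save'), findsOneWidget);

    await tester.tap(find.byType(PrimaryButton));
    expect(tapped, isTrue);
  });
}
```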

It's still very manual, but it's getting better. You are still going to need a human coder, I think forever, but there are two big problems that aren't being addressed, because people are just putting their heads in the sand and saying "nah, can't do it", or being like the clown OP in the post who thinks they can do it.

  1. Because dogs be clownin, the public perception of programming as a career will be devalued: "I'll just make it myself!" Or, like my rich engineer uncle said to me when I was doing websites professionally: "a 13-year-old can just make a website, why would I pay you so much to do it?" THAT FUCKING SUCKS. But a similar attitude has long existed in the form of "I'll just hire Indians". This is bullshit, but perception is important, and it's going to mean justifying yourself for a lot more of your work.

  2. And this is the flip side, the good news: the skills you have developed are going to be SO MUCH FUCKING HARDER TO LEARN. When you can just say "hey, generate me an app that manages customers and follow-ups" and something gets spat out, you aren't going to go through the grind required to work out basic shit. People will simply not get to the same level they are at now.

That logic about how to scaffold and architect an app in a sensible way - USING AI TOOLS - is actually the new skillset. You need to know how to build the app, and then how to efficiently and effectively use the new tools to actually construct it. Then you need to be able to do code review for each change.

[–] electric@lemmy.world 59 points 2 days ago (13 children)

Is the implication that he made a super insecure program and left the token for his AI thing in the code as well? Or is he actually being hacked because others are coping?

[–] jewbacca117@lemmy.world 27 points 2 days ago (1 children)

AI writes shitty code that's full of security holes, and Leo here has probably taken zero steps to further secure his code. He broadcasts his AI-written software and it's open season for hackers.
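
Purely to illustrate the kind of hole being described (a generic Dart sketch, not anything from Leo's actual code): the classic mistake is shipping an API token inside the code itself, when the secret should stay on infrastructure you control and be read from the environment at runtime.

```dart
import 'dart:io';

// The classic mistake: an API token baked into code that ships to users.
// Anyone can pull it out of the bundle and run up your bill (or worse).
const String apiToken = 'sk-PLACEHOLDER'; // never do this

// Safer: keep the secret server-side and load it at runtime from the
// environment, so it never ships with the client at all.
final String? apiTokenFromEnv = Platform.environment['API_TOKEN'];

void main() {
  print(apiTokenFromEnv == null
      ? 'API_TOKEN not set - refusing to start'
      : 'Token loaded from environment');
}
```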

[–] T156@lemmy.world 11 points 1 day ago

Not just that, but he literally advertised himself as not being technical. That seems to be just asking for open season.

[–] formulaBonk@lemm.ee 46 points 2 days ago (14 children)

Reminds me of the days before AI assistants, when people copy-pasted code from forums and you'd get questions like "I found this code and I know what every line does except this 'for( int i = 0; i < 10; i ++)' part. Is this someone using an unsupported expression?"

[–] hperrin@lemmy.ca 24 points 2 days ago

“Come try my software! I’m an idiot, so I didn’t write it and have no idea how it works, but you can pay for it.”

to

“🎵How could this happen to meeeeee🎵”

[–] ikidd@lemmy.world 21 points 2 days ago* (last edited 2 days ago)

ITT: "Haha, yah AI makes shitty insecure code!"

[–] bratorange@feddit.org 14 points 2 days ago* (last edited 1 day ago) (1 children)

I'm gonna print this and hang it in the office.

[–] HStone32@lemmy.world 15 points 2 days ago

Managers hoping genAI will cause the skill requirements (and paycheck demand) of developers to plummet:

Also managers when their workforce is filled with buffoons:
