this post was submitted on 08 Nov 2023
129 points (95.7% liked)

Technology


This is the official technology community of Lemmy.ml for all news related to creation and use of technology, and to facilitate civil, meaningful discussion around it.


... (www.phind.com)
submitted 1 year ago* (last edited 8 months ago) by CoderSupreme@programming.dev to c/technology@lemmy.ml
top 17 comments
[–] theluddite@lemmy.ml 53 points 1 year ago (6 children)

I do software consulting for a living. A lot of my practice is small organizations hiring me because their entire tech stack is a bunch of shortcuts taped together into one giant teetering monument to moving as fast as possible, and they managed to do all of that while still having to write every line of code.

In 3-4 years, I'm going to be hearing from clients about how they hired an undergrad who was really into AI to do the core of their codebase and everyone is afraid to even log into the server because the slightest breeze might collapse the entire thing.

LLM coding is going to be like every other industrial automation process in our society. We can now make a shittier thing way faster, without thinking of the consequences.

[–] nix@merv.news 7 points 1 year ago

This is definitely an issue with AI and is why it shouldn’t be used to replace people.

The real value of AI, in my opinion, is letting a non-coder like myself quickly create a Blender addon in Python to do small things: delete keyframes on every other frame, or add a button to the UI that automatically adds a shrinkwrap modifier to the object I have selected. Small things that are really convenient to have but not worth the effort of learning Python and the Blender codebase to do.
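The keyframe-thinning half of an addon like that can be sketched in plain Python. This is only the selection logic; the actual Blender `bpy` calls are left as a comment so the sketch stays self-contained, and the frame numbers are made up for illustration:

```python
def keyframes_to_delete(frames):
    """Given a sorted list of keyframe frame numbers, return every
    other one (the 2nd, 4th, ...) to delete, halving the keyframe
    density of the animation."""
    return [f for i, f in enumerate(frames) if i % 2 == 1]

# In a real Blender addon, each returned frame would then be removed
# inside an Operator, e.g. via fcurve.keyframe_points.remove(...).
print(keyframes_to_delete([1, 3, 5, 7, 9]))  # → [3, 7]
```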

[–] sheogorath@lemmy.world 6 points 1 year ago (1 children)

The thing with LLMs is that they learn from existing tech and use cases. My work is niche enough that LLMs are unable to make meaningful contributions on the code side. However your assessment is true where an overzealous junior dev can make a lot of shit stuff faster because they don't really understand the code the LLM spits out.

As for myself, I mostly use it to help break down requirements so I can better work through them, to think up edge cases for tests, and to be an interactive rubber duck when debugging an issue.

[–] theluddite@lemmy.ml 3 points 1 year ago

Yeah, I totally see that. I want to clarify: It's not that I don't think it's useful at all. It's that our industry has fully internalized venture capital's value system and they're going to use this new tool to slam on the gas as hard as they can, because that's all we ever do. Every single software ecosystem is built around as fast as possible, everything else be damned.

[–] Lemongrab@lemmy.one 5 points 1 year ago (2 children)

I have seen how useful it can be to people who don't know how to code. I think it would help more if you already know how. Maybe for generating scaffolding for functionality to be built on.

[–] theluddite@lemmy.ml 4 points 1 year ago* (last edited 1 year ago) (1 children)

Yeah, I think helping people who don't know how to code and letting them dabble is a great use case. I fully encourage that.

I don't think it's actually good for generating scaffolding in terms of helping people write quality software, but I do agree with you that that's how people are going to use it, and then the expectation is going to become that you have to do things that fast. It's kind of mind-boggling to me that anyone would look at the software industry and decide that our problem is that we don't move fast enough. Moving too fast for speed's own sake is already the cause of so many of our problems.

[–] Lemongrab@lemmy.one 2 points 1 year ago

True. I haven't yet used any of these services, but from how I see things, LLMs should be used to help with research and as a primary source about a topic.

[–] nathris@lemmy.ca 2 points 1 year ago

I pay $10/month for copilot because it saves me a lot more than $10 in time not spent typing out boilerplate or searching through garbage documentation.

It frees up my mind to focus on the actual software architecture instead of the quirks of the language.

[–] llothar@lemmy.ml 4 points 1 year ago

I think the biggest benefit is for people who cannot code or are just learning. Before, getting a Python script to do X or Y was a real problem. Now it is easy.

Plus it may help with Linux adoption: an LLM can easily describe a few terminal commands plus some text config, but will struggle with Windows-style graphical configuration.
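The "terminal commands plus text config" point can be illustrated with the kind of answer an LLM might hand a new Linux user. This is a hypothetical sketch: `app.conf` and the `dark_mode` setting are invented names, and `sed -i` here is the GNU form.

```shell
# Hypothetical task: flip one setting in a plain-text config file.
printf 'dark_mode=off\ntheme=default\n' > app.conf  # stand-in config file
cp app.conf app.conf.bak                            # keep a backup first
sed -i 's/^dark_mode=off$/dark_mode=on/' app.conf   # flip the setting
grep '^dark_mode=' app.conf                         # confirm the change
```

Each step is a short, explainable line of text, which is exactly the kind of output LLMs are good at producing, unlike a sequence of GUI clicks.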

[–] gazter@aussie.zone 2 points 1 year ago (1 children)

I think about this in my workplace. I'm not on the IT side of things, but I do have more of an interest than most. And wow, it seems a mess.

I think the problem lies with all these nifty solutions being implemented, and then suddenly it's someone's job to tie them all together, which they get halfway through doing before they are called off to do some other task... There doesn't seem to be an overall architecture, or a coherent model of how information should flow around the business. I'm guessing you come across this a lot? How does that get solved?

[–] theluddite@lemmy.ml 1 points 1 year ago

I agree. I've actually written about this.

It gets solved by planning. Actual long term planning that includes the relevant stakeholders. Currently everything is run by and for VCs who only plan in terms of funding rounds and exits.

[–] artaxadepressedhorse@lemmyngs.social 2 points 1 year ago (1 children)

How do you keep your sanity? Do you just have to compartmentalize everything as extremely abstract and comical when your task is detangling someone's 25,000 line spaghetti God classes and obscure async timing bugs?

[–] theluddite@lemmy.ml 8 points 1 year ago

Honestly I almost never have to deal with any of those things, because there's always a more fundamental problem. Engineering as a discipline exists to solve problems, but most of these companies have no mechanism to sit down and articulate what problems they are trying to solve at a fundamental level, and then really break them down and talk about them. The vast majority of architecture decisions in software get made by someone thinking something like "I want to use this new ops tool" or "well, everyone uses React, so that's what I'll use."

My running joke is that every client has figured out a new, computationally expensive way to generate a series of forms. Most of my job is just stripping everything out. I've replaced so many extremely complex, multi-service deploy pipelines with 18 lines of bash, and reduced AWS budgets by one, sometimes two, orders of magnitude. I've had clients go from spending $1,500/month on AWS with serverless and Lambda and whatever other alphabet soup of bullshit services that make no sense to 20 fucking dollars.

It's just mind-blowing how stupid our industry is. Everyone always thinks I'm some sort of genius performance engineer for knowing bash and replacing their entire front-end React framework repo that builds to several GB with server-side templating from 2011 that loads a 45kb page. Suddenly people on mobile can actually use the site! Incredible! Turns out your series of forms doesn't need several million lines of JavaScript.
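The "server-side templating from 2011" approach being described can be sketched with nothing but the Python standard library; the page contents and route names here are invented for illustration:

```python
from string import Template

# A whole "form page" rendered server-side: one template, one substitution,
# zero client-side JavaScript.
PAGE = Template("""<html><body>
<form action="$action" method="post">
  <label>Name: <input name="name" value="$name"></label>
  <button>Save</button>
</form>
</body></html>""")

def render_form(action, name=""):
    # safe_substitute fills in the $placeholders; the result is a few
    # hundred bytes of HTML rather than a multi-megabyte JS bundle.
    return PAGE.safe_substitute(action=action, name=name)

html = render_form("/save", name="Ada")
print(len(html.encode()))  # a tiny payload, served as-is
```

The design choice is the point: when the product really is a series of forms, the server can render the finished HTML directly, and the browser has nothing left to compute.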

I don't do this kind of work as much anymore, but up until about a year ago, it was my bread and butter.

[–] GlitzyArmrest@lemmy.world 4 points 1 year ago (1 children)

Just used it to generate some HA automations that GPT-4 struggled with, and it was definitely faster.

[–] fred@lemmy.ml 2 points 1 year ago (2 children)
[–] GlitzyArmrest@lemmy.world 1 points 1 year ago

Home Assistant, in this case; YAML.
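For context, a Home Assistant automation is a short YAML document with a trigger and an action; this is a minimal hypothetical example (the entity ID is made up):

```yaml
# Hypothetical automation: turn on a porch light at sunset.
automation:
  - alias: "Porch light at sunset"
    trigger:
      - platform: sun
        event: sunset
    action:
      - service: light.turn_on
        target:
          entity_id: light.porch
```

Small, declarative snippets like this are a good fit for LLM generation, since the whole config fits in a few lines of text.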

[–] Tum@lemmy.world -1 points 1 year ago

High Availability.