this post was submitted on 04 Jan 2024
24 points (100.0% liked)

Programming

13376 readers
1 user here now

All things programming and coding related. Subcommunity of Technology.


This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 1 year ago

cross-posted from: https://programming.dev/post/8121843

~n (@nblr@chaos.social) writes:

This is fine...

"We observed that participants who had access to the AI assistant were more likely to introduce security vulnerabilities for the majority of programming tasks, yet were also more likely to rate their insecure answers as secure compared to those in our control group."

[Do Users Write More Insecure Code with AI Assistants?](https://arxiv.org/abs/2211.03622)

[–] TheFriendlyArtificer@beehaw.org 14 points 10 months ago (1 children)

My argument is thus:

LLMs are decent at boilerplate. They're good at rephrasing things so that they're easier to understand. I had a student who struggled for months to wrap her head around how pointers work; two hours with GPT and the ability to ask clarifying questions, and now she's rockin'.

I like being able to plop in a chunk of Python and say, "type annotate this for me and none of your sarcasm this time!"
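For concreteness, a minimal sketch of the kind of request described above: a small untyped helper, and the annotated version an assistant might hand back. (The function itself is a made-up example, not something from this thread.)

```python
# Untyped version, as you might paste it in:
def word_lengths(words):
    return {w: len(w) for w in words}

# Type-annotated version, as an assistant might return it
# (using the built-in generics available since Python 3.9):
def word_lengths_typed(words: list[str]) -> dict[str, int]:
    return {w: len(w) for w in words}
```

The annotations change nothing at runtime; they're there so tools like mypy and your editor can check the call sites for you.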

But if you're using an LLM as a problem solver and not as an accelerator, you're going to lack some of the deep understanding of what happens when your code runs.

[–] jherazob@beehaw.org 7 points 10 months ago

The thing is that this is NOT what the marketers are selling. They're not pitching it as "buy access to our service so that your products will be higher quality"; they're pitching it as "this will replace many of your employees." Which it can't; it's very clear by now that it just can't.