this post was submitted on 04 Mar 2025
11 points (82.4% liked)

Comradeship // Freechat


I've tried using DeepSeek (the first time I've ever used an LLM, so maybe I'm being dumb) to help me a little with designing a circuit, because my reference book was leaving out a LOT of crucial information.

The results have been... subpar. The model seems to be making quite elementary mistakes, like leaving components floating with missing connections.

I'm honestly kinda disappointed. Maybe this is a weak area for it. I've probably had to tell DeepSeek more about designing the circuit in question than it has told me.

Edit: I realised I was just being dumb, since LLMs aren't designed for this kind of task.

[–] davel@lemmygrad.ml 12 points 1 month ago (1 children)

I wouldn’t have thought an LLM to be of use for circuit design in the first place, so I wouldn’t have been disappointed.

[–] Sodium_nitride@lemmygrad.ml 1 point 1 month ago (2 children)

I mean, people use it for making code.

[–] lorty@lemmygrad.ml 8 points 1 month ago

Maybe I'm wrong, but the amount of code available online to feed a model is probably a few orders of magnitude larger than the amount of circuit designs.

[–] KrasnaiaZvezda@lemmygrad.ml 7 points 1 month ago

LLMs are often trained on up to some 80% code, depending on the intended use (usually it's probably lower), as that has been shown to improve their logical/thinking skills.

Basically, if a task can be done with only words and there is a lot of data for it, present-day LLMs can probably get really good at it when properly trained. But for things like circuits, where a lot of the data is likely to be graphical, or there might just not be much of it, LLMs aren't yet as good.