there are language models that are quite feasible to run locally for easier tasks like this. "local" rules out both ChatGPT and Copilot, since those models are enormous. AI generally means machine-learned neural networks these days, even if a pile of if-else statements used to pass for it in the past.
not sure how they're going to handle low-resource machines, but as far as AI integrations go, this one is rather tame
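For a sense of scale, here is a minimal sketch of a small, distilled language model running entirely on a local CPU with the Hugging Face transformers library. The model choice and the summarization task are illustrative assumptions, not what Firefox actually ships:

```python
# Minimal sketch: a small language model running entirely on the local CPU.
# The model and task are examples, not the models Firefox bundles.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="sshleifer/distilbart-cnn-6-6",  # a distilled model a few hundred MB in size
    device=-1,  # -1 = CPU, no GPU required
)

article = (
    "Firefox is adding optional AI features that run locally, so no text "
    "leaves the machine and no cloud-sized model is required."
)
print(summarizer(article, max_length=40, min_length=10)[0]["summary_text"])
```

Nothing here needs a data-center GPU; a model in that size class loads and runs on an ordinary laptop CPU, which is the whole point of "local".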
Right, but a neural network traditionally rules out using a single local machine. Hell, we have entire chip architectures that revolve around neural net optimization. I can't imagine needing that kind of configuration for my internet browser.
One of the perks of Firefox is its relative thinness. Chrome was a shameless resource hog even in its best days, and IE wasn't any better. Do I really want Firefox chewing through hundreds of MB of memory so it can... what? Simulate a 600-processor cluster doing weird finger art?
i mean, i've worked in neural networks for embedded systems, and it's definitely possible. i share your skepticism about overhead, but i'll eat my shoes if it isn't opt-in
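To make that concrete with a purely illustrative example: dynamic int8 quantization in PyTorch is one standard trick for shrinking small networks so they fit on modest hardware. The toy model and layer sizes below are made up for the sketch:

```python
# Sketch: shrinking a small neural net with dynamic int8 quantization,
# the kind of technique that makes on-device inference feasible.
import io

import torch
import torch.nn as nn

# Toy classifier standing in for an "easier task" model (sizes are illustrative).
model = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 2))

# Replace fp32 Linear weights with packed int8 weights; activations stay fp32.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

def serialized_kib(m: nn.Module) -> float:
    """Size of the model's state_dict when serialized, in KiB."""
    buf = io.BytesIO()
    torch.save(m.state_dict(), buf)
    return buf.tell() / 1024

print(f"fp32 model:      ~{serialized_kib(model):.0f} KiB")
print(f"quantized model: ~{serialized_kib(quantized):.0f} KiB")  # roughly 4x smaller
```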
I don't doubt it's possible. I'm just not sure how it would be useful.
I use my local machine for neural networks just fine