this post was submitted on 27 Nov 2024
160 points (93.5% liked)

Firefox

17952 readers

A place to discuss the news and latest developments on the open-source browser Firefox

founded 4 years ago

The AI chatbot sidebar supports Claude, ChatGPT, Gemini, HuggingChat, and Mistral.

[–] JokeDeity@lemm.ee 4 points 55 minutes ago (1 children)

Unpopular opinion, I think they're doing it right as well as it can be at least. It's completely optional and doesn't seem to be intrusive.

[–] potentiallynotfelix@lemmy.fish 2 points 50 minutes ago (1 children)

yeah it's not Google Chrome level, which I'm thankful for.

[–] JokeDeity@lemm.ee 3 points 48 minutes ago (1 children)

I'm way more pissed about restarting my PC after an update and having Copilot installed without my permission.

[–] Scrollone@feddit.it 4 points 53 minutes ago

Wow, great job Firefox. Thanks.

If I wanted unreliable bullshit like AI, I'd use Chrome.

[–] fibojoly@sh.itjust.works 15 points 7 hours ago

Didn't want it in Opera, don't want it in Firefox. I mean they can keep trying and I'll just keep on ignoring this shit :/

[–] ilinamorato@lemmy.world 4 points 7 hours ago (2 children)

This happened ages ago, didn't it? Am I missing something new?

I only saw it now, maybe it happened before on a different version.

[–] 2kool4idkwhat@lemdro.id 3 points 1 hour ago (1 children)

Yeah, it did. That feature has been there at least since Mozilla enabled the "Firefox Labs" section in settings by default a few months ago, and maybe even earlier than that.

[–] victorz@lemmy.world 3 points 1 hour ago (1 children)
[–] ilinamorato@lemmy.world 2 points 1 hour ago

Well, this month in particular....

[–] celeste@lemmy.blahaj.zone 3 points 7 hours ago (3 children)

If they do it in a privacy-preserving way, this could help them win back market share, which would generally benefit an open internet.

[–] victorz@lemmy.world 2 points 1 hour ago

I really wish there was another way.

But it's gonna be very difficult when you've got Google and OpenAI up there.

[–] Scrollone@feddit.it 0 points 58 minutes ago

Why would anybody want to have AI in their browser? It's a fucking browser.

[–] Melatonin@lemmy.dbzer0.com 4 points 8 hours ago

oh good. hurray.

[–] nu11@sh.itjust.works 19 points 13 hours ago (1 children)

I don't understand the hate. It's just a sidebar for the supported LLMs. Maybe I'm misunderstanding?

Yes, I would prefer Mozilla focus on the browser, but to me, this seems like it was done in an afternoon.

[–] Scrollone@feddit.it 2 points 57 minutes ago

I want my browser to be a browser. I don't want Pocket, I don't want AI, I don't want bullshit. There are plugins for that.

[–] fruitycoder@sh.itjust.works 3 points 7 hours ago* (last edited 3 hours ago)

I will say, the Le Chat provider is pretty decent. You really can use it with natural language: "Rewrite it with a better rhyme scheme", "remove the last line", and it just got it.

Why no local option though? Why no anonymising option?

Edit: There is a right click option which does make this officially actually useful for me now (summarize this!).

Other models do have RAG options, and Mistral supports making agents with specified documentation too, to at least fine-tune (not as good as full grounding though, IMHO)

[–] fmstrat@lemmy.nowsci.com 9 points 12 hours ago (2 children)

I mean, if you're going to do it, where's the Ollama love?

[–] fruitycoder@sh.itjust.works 3 points 7 hours ago

I was disappointed there was no local option...

[–] tetris11@lemmy.ml 1 points 12 hours ago (2 children)

I don't get it, ollama is a provider no?

[–] fmstrat@lemmy.nowsci.com 1 points 13 minutes ago

A provider that can be run locally.

[–] pineapplelover@lemm.ee 3 points 11 hours ago (1 children)

I think the point is it's open source

[–] tetris11@lemmy.ml 4 points 11 hours ago

and so is firefox, so why use another model provider

[–] Zementid@feddit.nl 8 points 13 hours ago

Now add support for GPT4All and everyone is happy again.

[–] eleitl@lemm.ee 16 points 16 hours ago (1 children)

Thanks for nothing, Mozilla.

[–] Rozauhtuno@lemmy.blahaj.zone 20 points 15 hours ago (1 children)

They should raise the ceo's pay some more to celebrate.

[–] piracysails@lemm.ee 7 points 15 hours ago* (last edited 15 hours ago)

And fire a few employees just cause.

[–] ocassionallyaduck@lemmy.world 28 points 18 hours ago (9 children)

Thing is, for your average user with no GPU who never thinks about RAM, running a local LLM is intimidating. But it shouldn't be. Any system with an integrated GPU, and the more RAM the better, can run simple models locally.

The not-so-dirty secret is that ChatGPT 3 vs 4 isn't that big a difference, and neither is leaps and bounds ahead of the publicly available models for about 99% of tasks. For that 1%, people will ooh and aah over it, but 99% of use cases are only seeing marginal gains on 4o.

And the simplified models that run "only" 95% as well? They can use 90% fewer resources and give pretty much identical answers outside of hyperspecific use cases.

Running a "smol" model, as some are called, gets you all the bang for none of the buck, and your data stays on your system and never leaves.

I've been yelling from the rooftops to some stupid corporate types that once the model is trained, it's trained. Unless you are training models yourself, there is no need for the massive AI clusters, just for the model. Run it locally on your hardware at a fraction of the cost.
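As a rough sanity check on the "any system with an integrated GPU can run a small model" claim: here's a back-of-the-envelope RAM estimate. The formula and the ~20% runtime-overhead factor are my own crude assumptions, not anything from the comment above:

```python
def model_ram_gb(params_billions, bits_per_weight, overhead=1.2):
    """Rough RAM needed to hold a model's weights, padded ~20%
    for KV cache and runtime buffers (a crude rule of thumb)."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# A 7B model in fp16 vs. a 1B "smol" model quantized to 4 bits:
print(round(model_ram_gb(7, 16), 1))  # ~16.8 GB: needs a serious GPU
print(round(model_ram_gb(1, 4), 1))   # ~0.6 GB: fits on almost anything
```

Which is the whole point: a quantized 1B model fits comfortably in the shared memory of a laptop with integrated graphics.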

[–] LWD@lemm.ee 22 points 16 hours ago (1 children)

There's the tragedy with this new feature: they fast-tracked this past more popular requests, sticking it into Release Firefox.

But they only rushed the part that connects to third parties. There was also a "localhost" option which was originally alongside the Big Five corporate offerings, but Mozilla ultimately decided to bury that one inside of the about:config settings.

[–] MrOtherGuy@lemmy.world 10 points 15 hours ago

I'm guessing that the reason (and a good one at that) is that simply having an option to connect to a local chatbot just leads to confused users, because they also need the actual chatbot running on their system. If you can set that up, then you can certainly toggle a simple switch in about:config to show the option.
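For anyone hunting for that switch, this is roughly what the buried localhost option looked like. Pref names and the default port are from memory of the Firefox Labs rollout, so treat this as a sketch and verify in about:config on your own version:

```
// in about:config (names/values from memory; double-check on your Firefox version)
browser.ml.chat.hideLocalhost = false              // reveal the "localhost" provider in the sidebar dropdown
browser.ml.chat.provider = http://localhost:8080   // point it at a locally running chatbot, e.g. a llamafile server
```

You still need the local chatbot server itself running, which is exactly the confusion being described above.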

[–] davel@lemmy.ml 92 points 21 hours ago
[–] onlooker@lemmy.ml 8 points 16 hours ago

For a second I thought it said "experimental failure". Would be more accurate, I think.

[–] marcie@lemmy.ml 11 points 18 hours ago* (last edited 17 hours ago) (1 children)

why a fucking chatbot? translate a page better for me you fucking losers, all the translation options suck for privacy outside of specifically trained local AIs. this is the BEST use case for a small local LLM, yet Mozilla with all its brains and resources couldn't rub two neurons together for this.

or they could do character prediction on your typing to make typing faster. just some legit examples. why waste resources building a chat AI into my browser when i can just open a website???

[–] Midnitte@beehaw.org 4 points 16 hours ago

Perhaps Mozilla’s biggest "failure" is just communication...

Firefox actually has this now.

[–] Eiri@lemmy.ca 19 points 21 hours ago (5 children)

I wish I had telemetry on such features.

I really doubt a significant number of people use AI chatbots often enough that having it in a dedicated sidebar is worth it.

[–] Scrollone@feddit.it 1 points 56 minutes ago

I think nobody uses AI chatbots unless they're forced to. They're utter shit.

[–] treadful@lemmy.zip 1 points 8 hours ago

I've never had the urge to use a chat bot personally, but I'm pretty sure I'm in the minority. Lots of people use these things all the time for so much stuff we probably wouldn't even consider.

I've worked with a few people that all but rely on these things to produce any creative work they have to do.

Maybe we run in different circles but I think a lot of people don't even talk about how they're using it.

[–] that_leaflet@lemmy.world 48 points 1 day ago

That was there before 133, don’t remember the exact release that added it.
