this post was submitted on 15 Apr 2025
410 points (97.7% liked)

Privacy


A chart titled "What Kind of Data Do AI Chatbots Collect?" lists and compares seven AI chatbots—Gemini, Claude, CoPilot, Deepseek, ChatGPT, Perplexity, and Grok—based on the types and number of data points they collect as of February 2025. The categories of data include: Contact Info, Location, Contacts, User Content, History, Identifiers, Diagnostics, Usage Data, Purchases, Other Data.

  • Gemini: Collects all 10 data types; highest total at 22 data points
  • Claude: Collects 7 types; 13 data points
  • CoPilot: Collects 7 types; 12 data points
  • Deepseek: Collects 6 types; 11 data points
  • ChatGPT: Collects 6 types; 10 data points
  • Perplexity: Collects 6 types; 10 data points
  • Grok: Collects 4 types; 7 data points
[–] Empricorn@feddit.nl 10 points 6 days ago* (last edited 6 days ago)

Gemini: "Other Data"

Like, what's fucking left!?

[–] krnl386@lemmy.ca 7 points 6 days ago (1 children)

Wow, it’s a whole new level of f*cked up when Zuck collects more data than Winnie the Pooh (DeepSeek). 😳

[–] Octagon9561@lemmy.ml 11 points 6 days ago (1 children)

The idea that US apps are somehow better than Chinese apps when it comes to collecting and selling user data is complete and utter propaganda.

[–] Duamerthrax@lemmy.world 0 points 6 days ago* (last edited 6 days ago)

Don't use either. Until Trump, I considered CCP spyware more dangerous, because it would be collecting info that could be used to blackmail US politicians and businesses. Now it's a coin flip. In either case, use EU or FOSS apps whenever possible.

[–] arakhis_@feddit.org 5 points 6 days ago (1 children)

anyone who's competent in the matter: what about the French competition, chat.mistral.ai?

[–] TangledHyphae@lemmy.world 5 points 6 days ago

+1 for Mistral, they were the first (or one of the first) Apache-licensed open source models. I run Mistral-7B and variant fine-tunes locally, and they've always been really high quality overall. Mistral-Medium packed a punch (mid-size, obviously), but it definitely competes with the big ones, at least.

[–] pennomi@lemmy.world 139 points 1 week ago (2 children)
[–] exothermic@lemmy.world 17 points 1 week ago (8 children)

Are there tutorials on how to do this? Should it be set up on a server on my local network??? How hard is it to set up? I have so many questions.

[–] TangledHyphae@lemmy.world 5 points 6 days ago

https://ollama.ai/ is what I've been using for over a year now; new models come out regularly, and you just "ollama pull <model>" and then it's available to run locally. Then you can use Docker to run https://www.openwebui.com/ locally, giving it a ChatGPT-style interface (but even better: more configurable, and you can run prompts against any number of models you select at once).

All free and available to everyone.

[–] Kiuyn@lemmy.ml 23 points 1 week ago* (last edited 1 week ago) (1 children)

I recommend GPT4All if you want to run locally on your PC. It is super easy.

If you want to run it on a separate server, Ollama plus some kind of web UI is the best option.

Ollama can also be run locally, but IMO it takes more learning than a GUI app like GPT4All.

[–] codexarcanum@lemmy.dbzer0.com 11 points 1 week ago (2 children)

If by more learning you mean learning

ollama run deepseek-r1:7b

Then yeah, it's a pretty steep curve!

If you're a developer, then you can also search "$MyFavDevEnv use local ai ollama" to find guides on setting up. I'm using the Continue extension for VS Codium (or Code), but there are easy-to-use modules for Vim and Emacs and probably everything else as well.

The main problem is leveling your expectations. The full Deepseek is a 671b model (that's billions of parameters), and the model weights (the thing you download when you pull an AI) are 404GB in size. You need an enormous amount of RAM available to run one of those.

They make distilled models though, which are much smaller but still useful. The 14b is 9GB and runs fine with only 16GB of RAM. They obviously aren't as impressive as the cloud-hosted big versions though.
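As a back-of-the-envelope sanity check (my own sketch, not from the thread, and the bits-per-parameter figures are rough assumptions about typical quantization): download size is roughly parameter count times bits per parameter.

```python
def model_size_gb(params_billion: float, bits_per_param: float) -> float:
    """Rough download/RAM estimate: parameters x bits per parameter, in GB."""
    # params_billion * 1e9 params * bits / 8 = bytes; divide by 1e9 for GB
    return params_billion * bits_per_param / 8

# Full DeepSeek-R1 at 671b and ~4.8 bits/param lands near the 404GB figure
print(round(model_size_gb(671, 4.8), 1))  # ~402.6
# The 14b distill at ~5 bits/param comes out near the 9GB mentioned above
print(round(model_size_gb(14, 5.1), 1))   # ~8.9
```

Handy for guessing whether a given model will fit in your RAM before you pull it.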

[–] Kiuyn@lemmy.ml 7 points 1 week ago* (last edited 1 week ago) (2 children)

My assumption is always that the person I am talking to is a normal Windows user who doesn't know what a terminal is. Most of them even freak out when they see "the black box with text on it". I guess on Lemmy the situation is better; it's just my bad habit.

[–] pennomi@lemmy.world 11 points 1 week ago

Check out Ollama, it’s probably the easiest way to get started these days. It provides tooling and an API that different chat frontends can connect to.
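For the curious, here's a minimal sketch of hitting that API from Python with just the standard library (untested against your setup; the model name is a placeholder, and port 11434 is Ollama's usual default):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local port

def build_generate_request(model: str, prompt: str) -> bytes:
    """JSON body for Ollama's /api/generate endpoint, streaming disabled."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(model: str, prompt: str) -> str:
    """Send one prompt to a locally running Ollama server and return the reply."""
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=build_generate_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# e.g. ask("llama3", "Why is the sky blue?") once the server is running
```

The chat frontends mentioned above are basically doing this under the hood.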

[–] TuxEnthusiast@sopuli.xyz 13 points 1 week ago* (last edited 1 week ago) (2 children)

If only my hardware could support it...

[–] skarn@discuss.tchncs.de 7 points 1 week ago

I can actually run some smaller models locally on my 2017 laptop (though I have increased the RAM to 16 GB).

You'd be surprised how much can be done with how little.

[–] Cris16228@lemmy.today 107 points 1 week ago (1 children)

Me when Gemini (aka Google) collects more data than anyone else:

Not really shocked, we all know that Google sucks

[–] Telorand@reddthat.com 33 points 1 week ago (2 children)

I would hazard a guess that the only reason those others aren't as high is because they don't have the same access to data. It's not that they don't want to, they simply can't (yet).

[–] WreckingBANG@lemmy.ml 58 points 1 week ago (1 children)

Who would have guessed that the advertising company collects a lot of data

[–] will_a113@lemmy.ml 29 points 1 week ago (2 children)

And I can't possibly imagine that Grok actually collects less than ChatGPT.

[–] HiddenLayer555@lemmy.ml 2 points 6 days ago* (last edited 6 days ago)

Skill issue probably. They want to collect more but Musk's shitty hires can't figure it out. /s

[–] Chronographs@lemmy.zip 30 points 1 week ago (1 children)

Data from Surfshark, aka NordVPN, lol. Take it with a few chunks of salt

[–] Ziglin@lemmy.world 1 points 6 days ago

Yeah, I feel like there's a "supposedly" missing somewhere. We don't know what happens on their servers, so at the very least "user content" is based on trust.

[–] zr0@lemmy.dbzer0.com 28 points 1 week ago (1 children)

And what about goddamn Mistral?

[–] Sunny@slrpnk.net 7 points 1 week ago (7 children)

It's French as far as I know, so at least it abides by the GDPR by default.

[–] kirk781@discuss.tchncs.de 28 points 1 week ago* (last edited 1 week ago)

Back in the day, malware makers could only dream of collecting as much data as Gemini does.

[–] abdominable@lemm.ee 25 points 1 week ago (2 children)

I have a bridge to sell you if you think Grok is collecting the least amount of info.

[–] demunted@lemmy.ml 23 points 1 week ago (2 children)
[–] standarduser@lemm.ee 1 points 6 days ago (1 children)

Most of my workforce, strangely enough. They claim it’s the best for them in terms of mathematics, but I can’t find that to be a good reason.

[–] pineapplelover@lemm.ee 4 points 6 days ago (2 children)

Isn't deepseek better for that?

[–] TangledHyphae@lemmy.world 3 points 6 days ago

In my experience it depends on the math. Every model seems to have different strengths across a wide range of prompts and information.

[–] standarduser@lemm.ee 1 points 6 days ago

That’s what I’m saying!

[–] MoonRaven@feddit.nl 11 points 1 week ago

Fascists. Why?

[–] knighthawk0811@lemmy.ml 22 points 1 week ago

I'm interested in seeing how this changes when using DuckDuckGo's front end at duck.ai

there's no login, and history is stored locally (probably remotely too)

[–] morrowind@lemmy.ml 19 points 1 week ago (2 children)

Note this is if you use their apps. Not the API, not through another app.

[–] sonalder@lemmy.ml 17 points 1 week ago* (last edited 1 week ago)

Does anyone have this data for Mistral, HuggingChat, and Meta AI? Would be nice to add them too

Edit: Leo from Brave would be great to compare too

[–] OprahsedCreature@lemmy.ml 9 points 1 week ago

Or you could use Deepseek's workaround and run it locally. You know, open source and all.

[–] jagged_circle@feddit.nl 9 points 1 week ago (1 children)

Almost none of this data is possible to collect when using Tor Browser

[–] serenissi@lemmy.world 6 points 6 days ago (1 children)

Nope, these services almost always require a user login, often tied to a cell number (i.e. non-disposable), and they associate user content and other data points with the account. Regardless, user prompts are always collected. How they're used is a good question.

[–] jagged_circle@feddit.nl 4 points 6 days ago (1 children)

Use a third party API. Pay with monero.

[–] serenissi@lemmy.world 1 points 5 days ago (1 children)

Yes, it is possible to create disposable-esque API keys for different uses. The monetary cost is the price of privacy and of not having hardware to run things locally.

If you have suggestions for reliable, privacy-friendly API vendors, do share. While I don't need such services now, it could be a good future reference.

[–] jagged_circle@feddit.nl 1 points 5 days ago

I think I only used ChatGPT once to play around, and it was one of those. I don't remember the name, sorry.
