nikqwxq550

joined 22 hours ago
[–] nikqwxq550@futurology.today 1 points 4 hours ago* (last edited 4 hours ago)

Ah my mistake, yes a social media post or blog post from them would have been nice

[–] nikqwxq550@futurology.today 2 points 4 hours ago (1 children)

Sorry, I just read the GrapheneOS thread on the F-Droid signature pinning issue (the same issue I linked in the last paragraph of my first comment, in fact), and I wanted to add some comments. While I agree with most of the discussion there, the problem is that the alternatives are worse. Obtainium just pulls binaries directly from GitHub, where developer accounts have been compromised before. The Play Store has tons of malicious apps.

One of the main benefits of F-Droid is that they have standards. If you get an app from the default F-Droid repo, you can be reasonably certain that it is open-source and private. There are many apps like Bitwarden that couldn't get included, and when you read the F-Droid Gitlab discussions on why, there are always good reasons. F-Droid will also warn you about telemetry and tracking, even if the app makes it into the default repo. These are things that Obtainium or the Play Store simply don't provide.

The official GrapheneOS account wrote:

F-Droid automatically downloads and builds code, so despite their false marketing it does not protect users from the developers in any real way

Yes, this does protect users. As I've mentioned before, it's all too common for developers to sell their projects to malicious third parties (this often happens with browser extensions), or for developer accounts to be compromised (this often happens with software packages, like on NPM or PyPI). In these cases, the attackers will almost always change the pre-compiled binaries without updating the published source code. The only way to defend against this is via reproducible builds. F-Droid has been pushing for this, and the number of apps supporting reproducible builds has been growing year by year. Still, even without reproducible builds, I would rather trust F-Droid to protect their signing keys and accounts than trust every app developer to do the same. After all, it only takes one compromised developer to compromise your phone.
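To spell out what the reproducible-build defense actually checks: you rebuild the app from the published source and byte-compare the result against the distributed binary. A minimal sketch of that comparison (the "artifact" bytes below are stand-ins, not a real build):

```python
import hashlib

def artifacts_match(built: bytes, published: bytes) -> bool:
    """A build is reproducible if an independent rebuild from the
    published source yields a byte-identical artifact."""
    return hashlib.sha256(built).hexdigest() == hashlib.sha256(published).hexdigest()

# Stand-in artifacts: in practice these would be the APK you built
# yourself and the APK the developer (or F-Droid) distributes.
my_build = b"compiled-from-source"
published_ok = b"compiled-from-source"
published_tampered = b"compiled-from-source+malware"

print(artifacts_match(my_build, published_ok))        # True: binary matches the source
print(artifacts_match(my_build, published_tampered))  # False: something was injected
```

This is exactly the attack the comparison catches: a compromised account can swap out the binary, but it can't make a tampered binary hash-match an honest rebuild.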

Lastly, in the same comment by the GrapheneOS account, they said

If you continue to misrepresent, downplay and deny the many real issues about F-Droid, you'll no longer be participating in our community.

This is very worrying to me, and makes me wary of participating in their community in the first place. As I just explained above, I don't agree with their logic, and now I see that this person is flaunting the fact that they can ban people for whatever they consider "misrepresentation"? I hope that the GrapheneOS community will recognize the dangers of centralizing all moderation power in somebody who seems so self-righteous.

Anyways I just wanted to share my thoughts on the thread, but thanks for the discussion as well, I bookmarked a lot of the links you shared and will be sharing them in the future!

[–] nikqwxq550@futurology.today 3 points 5 hours ago (3 children)

secureblue includes modified images of CoreOS called securecore. While this doesn’t fix the issue you described, it is worth mentioning as a (technically) more secure option than both Debian and CoreOS.

Honestly, I would not recommend securecore or secureblue for security. Small team, no track record, very little funding. I doubt their patches are audited by third parties, and their userbase is probably so small that bugs are not found quickly. I'm sure you've already seen this PrivacyGuides thread on secureblue, but the project is still very unstable. Their ideas may sound nice in theory, but patches can end up introducing more vulnerabilities than they fix. There are going to be breakages, changes in recommendations, bugs, and regressions, and all of these impact security. I would not recommend it until their userbase is larger. You might ask how their userbase could ever get larger by my logic, which is why I'll say that I'd only recommend it for users who care about contributing to and supporting the project, and improving the security of the future, even if it means sacrificing a bit of their own security at present.

From my experience, having a large userbase and strong track record are the most reliable indicators for good security. You can always find articles criticizing old projects for security issues, but that's simply because new projects aren't under the same scrutiny (GrapheneOS is a rare exception). This is why I recommend Fedora Workstation/Silverblue over secureblue, Debian over CoreOS and securecore, and F-Droid over Accrescent. Though if you want to fight for a better future and test drive the hot new stuff, all the power to you.

[–] nikqwxq550@futurology.today 2 points 5 hours ago

I get what you're saying, but at the same time if every developer released software as pre-compiled binaries on their website, installing stuff on Linux would become such a PITA. (This is different from how Windows works because apps for Windows are distributed using installers like xxx.msi, and Linux does not have a unified installation system across distros)

[–] nikqwxq550@futurology.today 2 points 5 hours ago (1 children)

I've had the opposite experience, and started using Flatpaks after running into dependency conflicts once or twice when updating my system. Though I admit I've run into bugs with Flatpaks as well, just nothing as painful as a dependency conflict.

[–] nikqwxq550@futurology.today 2 points 6 hours ago

It's impossible to know for sure whether you are tracked or not, but even the most basic fingerprinting mechanisms check browser version, and Reddit has advanced fingerprinting mechanisms to detect ban evasion. Couple that with the fact that 90% of my searches led me to Reddit, and it's easy to conclude that Reddit correlated all my visits using my fingerprint, and thus has a history of all the things I have searched and been interested in for the past year, and sold that to Google. And Google has enough data on me from back when I used to use Google services, that they were probably able to link that activity to my real identity.

[–] nikqwxq550@futurology.today 1 points 6 hours ago

It sounds as though you were aware of this bug already. How did you find out? Did you notice it yourself or was there a notification somewhere?

[–] nikqwxq550@futurology.today 1 points 6 hours ago

You are right I should have linked directly to the workaround, sorry. Glad you got it sorted out though.

[–] nikqwxq550@futurology.today 2 points 6 hours ago

the point of tor is not to avoid fingerprinting, it’s to blend in

Fingerprinting and blending in are the same thing. You can't blend in if you have a unique fingerprint. The Tor Project goes to great lengths to mitigate fingerprinting using their custom browser; it's one of their main goals. It's pointless to use Tor with a regular browser that doesn't have those protections, because websites can just identify you by your fingerprint even when you are obfuscating your IP using Tor.

You are no more tracked by Reddit than you would be with up to date tor

Browser version is a major part of your fingerprint. It's in your user agent, but since that can be faked, there are additional mechanisms that check which JavaScript features your browser supports to get a more reliable read of your browser version. Try https://coveryourtracks.eff.org/ to learn more.
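To illustrate the feature-detection idea: a script probes which APIs exist and infers the minimum version that could support them all. The feature names and version cutoffs below are made up for the sketch, not taken from any real fingerprinting library:

```python
# Hypothetical map of JS features to the browser version they shipped in.
FEATURE_SHIPPED_IN = {
    "array_group_by": 115,
    "css_nesting_api": 117,
    "webgpu_api": 121,
}

def infer_min_version(supported: set[str]) -> int:
    """The browser must be at least as new as the newest feature it supports."""
    return max(FEATURE_SHIPPED_IN[f] for f in supported if f in FEATURE_SHIPPED_IN)

# A browser whose user agent claims to be v128 but only supports features
# shipped by v117 is probably lying about, or stuck at, its real version.
print(infer_min_version({"array_group_by", "css_nesting_api"}))  # 117
```

That's why a faked user agent doesn't help: the set of working features gives the real version away.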

And fingerprinting is not a hack or exploit. It's something that websites use for tracking, just like cookies. And I'm almost certain that Reddit fingerprints users to detect ban evasions.

[–] nikqwxq550@futurology.today 1 points 6 hours ago

Technically rollbacks are possible using regular packages, but in practice multiple packages will share dependencies and prevent you from downgrading just one of them. This is why it's important that Flatpaks isolate dependencies between apps.
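For reference, rolling a single Flatpak app back works by pinning it to an earlier commit. Here's a sketch that just assembles the commands (the app ID and remote name are examples, and I'm going from memory on the flags, so double-check against `flatpak --help` before relying on it):

```python
# Build the flatpak commands for rolling one app back to an older commit.
# App ID and remote name are examples; verify flags against your flatpak version.

def list_commits_cmd(remote: str, app_id: str) -> list[str]:
    # Shows the commit log for an app, so you can pick an older commit hash.
    return ["flatpak", "remote-info", "--log", remote, app_id]

def rollback_cmd(app_id: str, commit: str) -> list[str]:
    # Downgrades just this one app to the given commit, leaving every
    # other app's runtimes and dependencies untouched.
    return ["flatpak", "update", f"--commit={commit}", app_id]

print(" ".join(list_commits_cmd("flathub", "org.torproject.torbrowser-launcher")))
print(" ".join(rollback_cmd("org.torproject.torbrowser-launcher", "<commit-hash>")))
```

The key point is the second command targets one app ID only, which is exactly what shared system-package dependencies tend to make impossible.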

[–] nikqwxq550@futurology.today 2 points 6 hours ago (2 children)

Are you saying that this bug would have been reported there? I don't think I ever saw it, and I honestly doubt it was ever posted there. Unless you're talking about the browser update announcements, but I would still need to check the Help > About page of my browser to notice that it didn't match the latest version. As mentioned in my post, the Flatpak was updating like usual; the updates just weren't affecting the browser.

Really, the main reason I made the post was to see if anybody else was affected, and see how other people avoided the bug. And aside from one other user, it really seems like nobody else was affected, which is surprising to me. The only reasons I can come up with are:

  1. nobody installs Tor Browser using the Flatpak
  2. everybody manually checks their browser versions
  3. everybody installed or re-installed Tor Browser within the last year

Based on the comments, I suspect #1 is the main cause, which makes me lose quite a bit of trust in Flatpaks. After all, if nobody is using them, maintainers have less incentive to maintain them, and they only get worse.

[–] nikqwxq550@futurology.today 3 points 6 hours ago (1 children)

Wow, nice. Still not really friendly to beginners, since it's something they would have to dig into the documentation to find, but it's good to know.

 

cross-posted from: https://futurology.today/post/4000823

And by burned, I mean "realize they have been burning for over a year". I'm referring to a bug in the Tor Browser flatpak that prevented the launcher from updating the actual browser, despite the launcher itself updating every week or so. The fix requires manual intervention, and this was never communicated to users. The browser itself also doesn't alert the user that it is outdated. The only reason I found out today was because the NoScript extension broke due to the browser being so old.

To make matters worse, the outdated version of the browser that I had differs from the outdated version reported in the GitHub thread. In other words, if you were hoping that at least everybody affected by the bug would be stuck at the same version (and thus have the same fingerprint), that doesn't seem to be the case.

This is an extreme fingerprinting vulnerability. In fact, I checked my fingerprint on multiple websites, and I had a unique fingerprint even with JavaScript disabled. So in other words, despite following the best privacy and security advice of:

  1. using Tor Browser
  2. disabling JavaScript
  3. keeping software updated

My online habits have been tracked for over a year. Even if DuckDuckGo or Startpage doesn't fingerprint users, Reddit sure does (to detect ban evasion, etc.), and we all know 90% of searches lead to Reddit, and that Reddit sells data to Google. So I have been browsing the web for over a year with a false sense of security, all the while most of my browsing was linked to a single identity, and that much data is more than enough to link it to my real identity.

How was I supposed to catch this? Manually check the About page of my browser to make sure the number keeps incrementing? Browse the GitHub issue tracker before bed? Is all this privacy and security advice actually good, or does it just give people a false sense of security, when in reality the software isn't maintained enough for those recommendations to make a difference? Sorry for the rant, it's just all so tiring.
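If I ever do automate that About-page check, it would boil down to something like this: compare the version the browser reports against the latest release and alert on a mismatch. The version strings here are examples, and getting the real "latest" value would still mean scraping Tor Project's release announcements or a feed:

```python
def is_outdated(installed: str, latest: str) -> bool:
    """Compare dotted version strings numerically, not lexically.
    Assumes plain numeric versions like '13.5.1' (no alpha suffixes)."""
    def parse(v: str) -> list[int]:
        return [int(x) for x in v.split(".")]
    return parse(installed) < parse(latest)

# Example values only: the stale version a broken launcher might leave behind
# versus the current release.
print(is_outdated("12.0.7", "13.5.1"))  # True: browser silently fell behind
print(is_outdated("13.5.1", "13.5.1"))  # False: up to date
```

Not that users should need to write this themselves, which is rather the point of the rant.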

Edit: I want to clarify that this is not an attack on the lone dev maintaining the Tor Browser flatpak. They mention in the issue that they were fairly busy last year. I just wanted to know how other people handled this issue.

