this post was submitted on 29 Nov 2023
124 points (97.7% liked)


Another, much better article, which explains in more detail that Olvid was audited on an older version and was chosen because it is French and because they applied for it (in French): https://www.numerama.com/tech/1575168-pourquoi-les-ministres-vont-devoir-renoncer-a-whatsapp-signal-et-telegram.html

Google Translate link to the original post: https://www-lepoint-fr.translate.goog/high-tech-internet/les-ministres-francais-invites-a-desinstaller-whatsapp-signal-et-telegram-29-11-2023-2545099_47.php?_x_tr_sl=fr&_x_tr_tl=en&_x_tr_hl=fr&_x_tr_pto=wapp

The translation has some mistakes but is good enough to understand the context.

Here is a short summary:

Olvid passed a 35-day intrusion test by ANSSI (the French state cybersecurity agency) experts or designated experts, including a code examination, without any security breach being found. That is not the case for the other three messaging apps (either because they never underwent such a test, or because they didn't pass).

This makes WhatsApp, Signal and Telegram unreliable for state security.

So government members and ministerial offices will have to use Olvid or Tchap (the French state's in-house messaging app).

More detail in the article.

top 38 comments
[–] spiderkle@lemmy.ca 50 points 11 months ago* (last edited 11 months ago) (2 children)

Well, that was the dumbest explanation ever; it's basically just a political pretext to give the government contract to some French company. Potentially there has been some lobbying going on.

Signal doesn't store its encryption/decryption keys in the cloud, so you would need the devices, and even then you would still have to decrypt the content if the user doesn't give you access manually.

To crack a 128-bit AES key would take about a billion billion years with a current supercomputer. A 256-bit AES key requires about 2^255 guesses on average, which is astronomically longer still.
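
A quick back-of-envelope sketch of where numbers like that come from (the guess rate is an assumption for illustration, not a measurement of any real supercomputer):

```python
# Back-of-envelope only: expected brute-force time for AES keys.
# The guess rate below is an assumed figure, not a benchmark of any real machine.

SECONDS_PER_YEAR = 3.15e7
GUESSES_PER_SECOND = 1e12  # assumed rate for a hypothetical supercomputer

def average_crack_years(key_bits: int) -> float:
    """Expected years to find a key by exhaustive search (half the keyspace on average)."""
    average_tries = 2 ** (key_bits - 1)
    return average_tries / GUESSES_PER_SECOND / SECONDS_PER_YEAR

print(f"AES-128: ~{average_crack_years(128):.1e} years")  # ~5e18 years, i.e. billions of billions
print(f"AES-256: ~{average_crack_years(256):.1e} years")  # ~2e57 years
```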

So until some amazing quantum computer comes along, this is pretty safe. Fuck Olvid.

[–] dotMonkey@lemmy.world 20 points 11 months ago (1 children)

I'm sure there are more attack vectors than that though

[–] themusicman@lemmy.world 16 points 11 months ago

Exactly. "Security assuming nobody fucked up" isn't enough

[–] jet@hackertalks.com 14 points 11 months ago* (last edited 11 months ago) (1 children)

Signal does store the decryption keys in the cloud, using their SGX enclaves, which have their own issues. Signal SVR, I believe they call it.

You can turn off Signal PINs, which still stores the decryption keys in the cloud, but then they're protected with a very long auto-generated PIN, which is good enough.

From a government perspective, Signal is a no-go; the SGX enclaves are completely exploitable at the state-actor level. You just have to look at all of the security vulnerabilities to date for SGX enclaves.

[–] stimut@aussie.zone 2 points 11 months ago (1 children)

Do you have a reference for Signal using SGX for keys?

Everything I could find was about metadata and private data, e.g. contact lists (which is what the SVR thing that you mention is), but nothing about keys.

[–] jet@hackertalks.com 7 points 11 months ago* (last edited 11 months ago) (1 children)

https://signal.miraheze.org/wiki/Secure_Value_Recovery

https://github.com/signalapp/SecureValueRecovery2

If you want to do an empirical test, create a Signal account and set a PIN. Send a message to someone. Then delete Signal. Recreate the account using the same phone number, recover using the PIN, and send a message. The receiver of that message will not get a warning that the signing key has changed.

The only way that's possible is if the key, or a derived key, is recoverable from the network. That is de facto proof that the keys, or a key-generation mechanism, are in the cloud, which is probably fine for personal communication.

But if I'm a nation state, this represents a significant security risk, especially when you're worried about another nation state peeking at your communication. E.g. France is buddy-buddy with the US, but they probably don't want the US reading all of their internal communication.

SGX https://en.m.wikipedia.org/wiki/Software_Guard_Extensions

https://sslab-gatech.github.io/sgx101/pages/attestation.html

SGX is an on-chip secure enclave created by Intel, a company headquartered in the United States, and it relies on key management and signing keys from Intel. Intel could be compelled by domestic intelligence to provide its keys, or to sign certain enclave software. I'm not saying it's happened, but I am saying this is part of the risk assessment a nation state would use to evaluate a messaging platform.

So a nation-state attack might look something like this: Intel is compelled to release its signing keys, the Signal user enclave is copied, and using the combination of the two a special SGX environment is set up to brute-force the PINs with the guess limit removed. Brute-forcing a six-digit PIN is trivial if you're not rate-limited. This is just one possibility; SGX has at least nine known side-channel attacks at the moment, and I'm sure there are more that haven't been published.
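
To illustrate that last point, a toy sketch of the keyspace involved (the guess rate is an assumption; the real cost also depends on how much key stretching is applied to the PIN):

```python
# Illustrative only: why an unthrottled six-digit PIN falls almost instantly.
# The guess rate is an assumed figure for offline guessing inside a cloned enclave.

PIN_DIGITS = 6
GUESSES_PER_SECOND = 1_000_000  # assumed offline guess rate with rate limiting removed

keyspace = 10 ** PIN_DIGITS                      # 1,000,000 possible PINs
worst_case_seconds = keyspace / GUESSES_PER_SECOND

print(f"{keyspace:,} possible PINs")
print(f"worst case ~{worst_case_seconds:.0f} s to try every one")  # about a second
```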

[–] stimut@aussie.zone 2 points 11 months ago (1 children)

Interesting, thanks for that.

The first link you posted states that the master key is stored. It also states that the information on the page doesn't match the official blog from Signal, but that they've gathered their information from the source code, so I assume it's correct. It does make me wonder why Signal doesn't say that they store the master key though 🤔

[–] jet@hackertalks.com 1 points 11 months ago

You don't have to trust blogs; do the experiment yourself. Make a new Signal account, send a message, set a PIN, delete the app, reinstall, recover from the PIN, and send a message again. The signing key doesn't change. That is proof the key is in the cloud.

Signal DOES say it's in the cloud, but they use the corporate partial truth: SVR is for "personal data"... which the key is. They don't emphasise it, because it's such a bad idea; when they implemented this there was a big online security outcry, which seems to have died down since.

Signal is a good enough protocol for daily use, but not good enough for nation states or the truly security-conscious. Signal is a step on the path to federated, democratic, private communication, but not the destination.

[–] luthis@lemmy.nz 36 points 11 months ago (1 children)

I don't know much, but what I do know is that when a government endorses a secure messaging service, it's definitely not secure.

[–] bamboo@lemm.ee 4 points 11 months ago

They’re using it themselves, not forcing citizens to use it. It’s when they force citizens to use an app they claim is secure that I am distrustful. I would assume their intentions are more pure when it’s their own state security rather than their citizens’ privacy.

[–] sudoshakes@reddthat.com 22 points 11 months ago (1 children)

Not being able to inspect their code vs. not passing are different things.

[–] lemmyvore@feddit.nl 8 points 11 months ago* (last edited 11 months ago) (1 children)

Are they? If you want to know if something is secure enough to use then not being able to examine the code should obviously disqualify it.

[–] sudoshakes@reddthat.com -1 points 11 months ago (2 children)

Sure it does, but that doesn’t make it bad.

Open source code is not the only solution to secure communication.

You can be extremely secure on closed source tools as well.

If they had found specific issues with Signal aside from not being allowed to freely inspect its code base, I suspect we would be hearing about it. Instead I don't see specific security failings, just that it didn't meet the bar for their security software audit.

As an example of something that is closed source and trusted:

The software used to load data and debug the F-35 fighter jet.

Pretty big problem for 16 countries if that isn't secure... and it's closed source. So much so that you can't even run live tests against the device used for loading data onto the jet. It's a problem to sort out, but it's an example of highly important communication protocols that are not open source yet are trusted by the governments of many countries.

If their particular standard here were open source, OK, but they didn't do anything to ensure the version they inspected would be the only version used. In fact, every release from that basement pair of programmers could inadvertently contain a flaw, and this committee would not be reviewing it in the code base used by its members of parliament.

[–] lemmyvore@feddit.nl 9 points 11 months ago

Lol at military stuff being secure. Most often it's not, it's just hidden. There was an Ars Technica article about the "secure" devices used at military bases being full of holes for example: https://arstechnica.com/security/2023/08/next-gen-osdp-was-supposed-to-make-it-harder-to-break-in-to-secure-facilities-it-failed/

When code is hidden all you know for sure is that you don't know anything about it. You certainly can't say it's secure.

If a piece of code or a system is really secure, then it does not matter whether the code is open, because the security lies in the quality of its algorithms and the strength of the keys.

[–] Tibert@jlai.lu 2 points 11 months ago* (last edited 11 months ago) (1 children)

Well, let me give some counter-examples for the software I mentioned:

  • WhatsApp (closed): owned by Facebook. Facebook has had multiple data leaks and privacy violations, and nothing substantial was done about it. Definitely not trustable (also, zero-days for hacking WhatsApp are being sold on the black market: https://techcrunch.com/2023/10/05/zero-days-for-hacking-whatsapp-are-now-worth-millions-of-dollars/).

  • Telegram (closed): not end-to-end encrypted by default. A Russian-founded app. Not trustable.

  • Signal (open): well, this one is end-to-end encrypted and open source, so maybe it could be trusted. It seems to have passed some security audits (https://community.signalusers.org/t/overview-of-third-party-security-audits/13243), though it's based in the US and relies on central servers; maybe the US has supercomputers capable of decrypting such communications, although now that Signal has moved to quantum-resistant encryption it may be too hard even for a state actor. However, they also "debunked"/ignored zero-day reports that weren't submitted through their own tool, partly by asking the US government for confirmation, and I'm not sure the US can be trusted to confirm the existence of vulnerabilities it is very likely to use itself (https://thehackernews.com/2023/10/signal-debunks-zero-day-vulnerability.html?m=1).

  • Olvid, open (servers closed): it's French, end-to-end encrypted, and backed by an encryption PhD. And why not use a local messaging app, which is also very secure and open source?

Notice how closed source is untrusted here. The economic model behind the tool changes how trustable it is. Military equipment has a huge and strict budget; it has to be secure.

Communication apps are consumer products first, so they do whatever they can get away with, and that is very true for Facebook.

[–] jet@hackertalks.com 3 points 11 months ago* (last edited 11 months ago)

I had no idea Olvid was open source; since you mentioned it I googled and found their repo. It's not mentioned on the English web page.

https://github.com/olvid-io/olvid-android

AGPLv3 .. nice

Client-only source, just like Telegram; no server-side source.

[–] kalistia@sh.itjust.works 9 points 11 months ago* (last edited 11 months ago)

They already had Tchap, which may not be perfect but is open source (based on Matrix/Element), hosted in France, and already used by 400,000 people in the public services... Why pay for a new app? Don't get it...

[–] Tibert@jlai.lu 8 points 11 months ago (1 children)

Another, much better article, which explains in more detail that Olvid was audited on an older version and was chosen because it is French and because they applied for it (in French): https://www.numerama.com/tech/1575168-pourquoi-les-ministres-vont-devoir-renoncer-a-whatsapp-signal-et-telegram.html

[–] jet@hackertalks.com 11 points 11 months ago

Honestly, at this security level, critical infrastructure, which messaging is, is something every country should operate independently. So it makes complete sense for the French government to set up its critical messaging infrastructure inside France with a French company that cannot be compelled by external intelligence agencies.

[–] merde@sh.itjust.works 8 points 11 months ago (1 children)

on one hand they are trying to outlaw encrypted messaging, on the other they're saying it's not secure? 😅

[–] merde@sh.itjust.works 6 points 11 months ago

it's not surprising to see this reaction after learning that their allies were spying on them through the NSA

[–] LWD@lemm.ee 6 points 11 months ago* (last edited 10 months ago) (1 children)
[–] topinambour_rex@lemmy.world 2 points 11 months ago

Tchap is the messaging app made for the French gov.

[–] suckmyspez@lemmy.world 5 points 11 months ago (2 children)

It’s not open source…big no no for me 🤷‍♂️

[–] synapse1278@lemmy.world 6 points 11 months ago (1 children)
[–] suckmyspez@lemmy.world 4 points 11 months ago (2 children)

Yup, I did see they were on GitHub, but when I looked, the iOS repository was months (and several releases) out of date.

I’d expect an open-source project to work in public, not in private, updating its public repositories later down the line.

[–] hedgehog@ttrpg.network 2 points 11 months ago (1 children)

Signal isn’t much better in this regard. They certainly don’t work directly in the public repos - they have internal repos that they work from and they push updates from them to the public repos after the fact.

I’m not sure about the current state but when I looked into it a couple years ago, their client side repos were around a year behind. I recall reading some issues stating that the client was so far behind that the server was refusing to communicate with builds of it.

[–] bamboo@lemm.ee 1 points 11 months ago (1 children)

Signal’s official policy is that third-party clients aren’t permitted, and they lack reproducible builds for their Android client. Even if the open-source code were up to date, using it without patching it to use a custom server would be a TOS violation.

[–] hedgehog@ttrpg.network 2 points 11 months ago

One of the ways Signal doesn’t really feel FOSS that I read about was related to third party clients and the official server. Projects wanted to use forks of their client with the official servers. In one case this was just so they could remove nonfree software. In another they were adding minor features (that Signal would have been free to take back into the main build, since they were under the same license). But Moxie said they couldn’t use their servers, period.

[–] Rokk@feddit.uk 2 points 11 months ago

I feel like for internal government communications you might not want it to be open source.

Doesn't mean everyone else should want to use it.

[–] Kusimulkku@lemm.ee 5 points 11 months ago

Olvid 26, rue Vignon 75009 Paris France

Lmao big surprise.

[–] merde@sh.itjust.works 2 points 11 months ago

looks like you have to buy the paid version to make calls 🤷