this post was submitted on 21 Jul 2023
916 points (100.0% liked)

Technology


A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.


The much-maligned "Trusted Computing" idea requires that the party you are supposed to trust actually deserves that trust, and Google is DEFINITELY NOT worthy of it. This is a naked power grab to destroy the open web for Google's ad profits, no matter the consequences: it would put heavy surveillance in Google's hands, eliminate ad blocking, break any and all accessibility features, and obliterate any competing platform. It is very much opposed to what the web is.

[–] Whirlybird@aussie.zone 5 points 1 year ago* (last edited 1 year ago) (3 children)

Why do people have a problem with this? It explicitly says browser extensions, like ad blockers, will still work. It says cross site tracking won’t be allowed. It all sounds pretty good.

It sounds like most are not liking it because of some potential future abuses rather than what it actually is?

[–] jarfil@beehaw.org 23 points 1 year ago* (last edited 1 year ago) (2 children)

This is part of a broader plan:

  1. Get hardware attestation, aka secure boot (DONE)
  2. Get software attestation, via app stores (DONE)
  3. Get web app attestation (this proposal)
  4. Compile all web apps to webassembly (upcoming)
  5. Create a provider-controlled environment on user-supplied devices (partially there)

Only basic extensions and ad blockers will keep working with compiled apps (Manifest V3 is part of that plan). Accessibility features will be about as good as those of Flash.

What most are not liking is the change in the power dynamic on the WWW:

  • Before: "you give me some data and I'll decide what to do with it"

  • Upcoming: "we'll give you some data and you will do exactly as we tell you with it"

The time might be coming to create a "libre WWW", parallel to the "corporate WWW".

[–] Zeth0s@reddthat.com 9 points 1 year ago

Even more "we'll decide if you are worthy to get my data"

[–] ilmagico@beehaw.org 3 points 1 year ago (1 children)

I'm mostly in agreement, but... what's wrong with webassembly? It's just another target to compile web apps, or parts of them, into, other than JavaScript. What am I missing?

[–] jarfil@beehaw.org 7 points 1 year ago (1 children)

"Compiled" is the key: a non-reversible operation that implies loss of syntactical and grammatical content. Meaning, it's harder to analyze, reason about, or modify. As the "assembly" part indicates, it's intended to be as hard to analyze, reason about, or modify, as possible.

First there was Java, then there was Flash, now there is Webassembly... all compiled to bytecode, all running in their own VM, all intent on converting all apps everywhere, and on locking "proprietary" elements away from the prying eyes and hands of content blockers and analyzers, or, even worse, away from control by end users.

Webassembly and attestation just go hand in hand to create a remote-controlled enclave on a user-owned device that will make it as hard as possible for the user to control.

Some may see it as an inherent exploitation of the user's resources (already used for cryptominer exploits), others as an attack vector that will be difficult to mitigate by design, still others as an unnecessary duplication of the JVM.

[–] ilmagico@beehaw.org 4 points 1 year ago (1 children)

Look, I hate this proposal from Google as much as anyone else here, but let's stick to the facts.

As the "assembly" part indicates, it's intended to be as hard to analyze

The "assembly" is just a reference to machine instructions, a.k.a "assembly language".

Minified javascript, on the other hand, is made with the express purpose of obfuscation, as well as minimizing load times, but mainly obfuscation in practice.

That's to say, you don't need webassembly to make code hard to reverse engineer. At least webassembly is a standard.
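For comparison, here is what minification does to JavaScript: identical behavior, with names and structure stripped. A hypothetical example:

```javascript
// What the author writes:
function totalPrice(items, taxRate) {
  const subtotal = items.reduce((sum, item) => sum + item.price, 0);
  return subtotal * (1 + taxRate);
}

// What a minifier ships -- same behavior, intent-revealing names gone:
function t(a,b){return a.reduce((c,d)=>c+d.price,0)*(1+b)}

const cart = [{ price: 10 }, { price: 20 }];
console.log(totalPrice(cart, 0.5)); // 45
console.log(t(cart, 0.5));          // 45
```

Both functions are trivially runnable, but only the first tells you what the code is *for*, which is exactly the information reverse engineers have to reconstruct.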

First there was Java, then there was Flash, now there is Webassembly

First there were machine instructions, then people invented handy mnemonics for them and called the result "assembly language". Then there was C, then C++ (let's skip BASIC, Pascal, etc.), and those weren't meant to be hard to analyze; they were, and still are, meant to be close to the machine, to be fast. Webassembly has similar goals. They can be decompiled relatively easily, just as much as webassembly I'm sure, unless they are purposely obfuscated.

Just like native machine code and javascript, it can be decompiled/reverse engineered, and also obfuscated, but that's not its goal, neither as stated nor in practice.

[–] jarfil@beehaw.org 6 points 1 year ago* (last edited 1 year ago)

You went a bit too far back; I was talking about compiled languages intended for the web.

There is nothing easy about decompiling native code, even before we get into jumps into the middle of instructions and polymorphic code. Reverse engineering obfuscated JavaScript is orders of magnitude easier than that, and most minified JavaScript isn't even obfuscated.

The only saving grace of Webassembly is that it requires keeping stuff in blocks, with if, then, else, etc. reasonably delimited, and (I think) it doesn't allow too many shenanigans with indirect calls. But stuff like br_table doesn't make me particularly eager to tackle decoding what someone meant.

[–] ilmagico@beehaw.org 14 points 1 year ago* (last edited 1 year ago) (1 children)

It sounds like most are not liking it because of some potential future abuses rather than what it actually is?

If I, potentially, wanted to abuse a system, I'd probably come up with a way to modify that system such that I can abuse it, but with a plausible explanation as to why I'm not actually going to do that, so that others will agree to it.

But let's assume, for the sake of the argument, that Google and/or the people who wrote this are actually acting in good faith. That still won't stop other large companies like Microsoft, Apple, etc. or even future Google employees from abusing the system later on.

Yes, the potential for abuse is the big deal here. And you know humans, if it can be abused, someone will try.

[–] Whirlybird@aussie.zone 1 points 1 year ago (1 children)

Sure, but this is also a solution for the existing abuse that runs rampant. Which abuse is better?

I’m sure these same arguments were made against anti-virus software back in the beginning. “They’re only doing this so in the future they can flag all their competitors’ programs as viruses” and “they’re only doing this so they can choose who can use what”. The parallels are strong.

[–] ilmagico@beehaw.org 6 points 1 year ago

Is there a way to stop the existing abuse without introducing a different kind of abuse? Ideally, that's what we should aim for, if possible at all.

If that's not possible, restricting people's freedoms in the digital world (or the real world, for that matter) to prevent some from abusing those freedoms doesn't sound like such a great proposition. As for "which abuse is better": if I have to be abused one way or another, I'd prefer to be free and in control, so I have a chance to stop it myself ;)

(What freedoms, you might ask? The freedom to run my own choice of operating system, my choice of browser, etc., on a computer that I own, maybe even built myself, and not be prevented from accessing the internet at large.)

I’m sure these same arguments against this were made for anti-virus software back in the beginning

And I'm sure some of those companies, or some of those companies' employees, wrote some viruses themselves ;) But really, we can only speculate. Most are definitely legit and helpful.

The key here is who is in control: the user of the software, or the company that made it? I'd say even for antiviruses the user is in control, and can choose a different antivirus or no antivirus at all (like me). Under this Google proposal, it seems Google and other big corporations will be in control, not the user. That's why it's bad. If I have to be abused, at least I'd like to be in control so I can (try to) prevent it.

[–] LiveLM@lemmy.zip 7 points 1 year ago

Ah yes, Google pinky promises it won't use this to screw us over, we're good to go!