this post was submitted on 14 May 2024
114 points (100.0% liked)

chapotraphouse

[–] Lerios@hexbear.net 29 points 5 months ago (3 children)

why? genuinely who does this help and how does it make google money? it seems like they're paying for the energy for ai content in exchange for absolutely nothing

[–] Awoo@hexbear.net 30 points 5 months ago (1 children)

The people internally at Google are techbro true believers. If it's new technology it is inherently good and an improvement.

[–] TheDoctor@hexbear.net 20 points 5 months ago (2 children)

God, it’s sad but you’re probably right. We had to implement something AI-related at work because the board all had massive hard-ons for the buzzwords. They literally could not have given less of a shit what we used it for. We had full autonomy as long as ChatGPT ended up in our dependency tree somewhere.

[–] homhom9000@hexbear.net 10 points 5 months ago

Same here. Every all-hands at work emphasizes the need to use AI. Except they have no clue what to do with it yet beyond chatbots, but we need to use it right now or else.

[–] Lerios@hexbear.net 4 points 5 months ago (1 children)

yeah same. we have an AI assistant now and every meeting has a 'gentle reminder' that the sales people and devs and tech support etc etc should be using it. they're never specific about what we should be using it for, and the one time i touched it, it didn't seem like it even had access to our documentation.

is it really that simple? this is a massive capitalist company, surely they have to understand that they should be acting to improve their material conditions? random libs not understanding shit is fine, but i thought the actual capitalists themselves understood capitalism. exchanging material wealth for like cyberpunk vibes or whatever is genuinely insane.

[–] TheDoctor@hexbear.net 4 points 5 months ago

> surely they have to understand that they should be acting to improve their material conditions?

In my experience with execs, they find a guiding principle from a book or a conference speaker and treat it like a personal religion. Everything outside of that is very much vibes based. There are a lot of conference talks that try to summarize new tech stuff for execs but it’s very much a short overview followed by practical applications. They don’t understand the stuff experientially unless they happen to do a deep dive on their own.

[–] alexandra_kollontai@hexbear.net 3 points 5 months ago* (last edited 5 months ago)

The theory is that people don't want to click through blue links trying to find a source (or sources) they can trust; they'd rather have an instant summarised answer to any question. Google already does instant summarised answers for things like "when is the next public holiday" - generative AI content would expand these instant answers to any question, at the cost of accuracy. Google thinks ChatGPT is taking their market share (which it kinda is, and kinda was a year ago when they started developing this). The big idea of this new feature from Google is to retain market share, which is a prerequisite to making money.

[–] blobjim@hexbear.net 2 points 5 months ago

I thought Google was already incorporating some machine learning stuff into the core search algorithm anyways, which would be a much better use than directly making up sentences.