this post was submitted on 23 May 2024
154 points (78.9% liked)

[–] EatATaco@lemm.ee 35 points 5 months ago* (last edited 5 months ago) (11 children)

But if OpenAI didn’t do anything wrong, why would it take down the voice?

This almost made me stop reading. What a garbage point: if someone is offended by something I did, even if I did nothing wrong, I don't do it to them again, because I'm not an asshole. He's clearly an asshole who tried to use her voice anyway, but this line of questioning is garbage. Decent people apologize all the time when they've done nothing wrong, and then stop doing the offending thing, without admitting guilt.

But the design choice is worrying on an ethical level. Researchers say it reinforces sexist stereotypes of women as servile beings who exist only to do someone else’s bidding — to help them, comfort them, and plump up their ego.

And this is where I stopped. If they had used a male voice, the author could have argued that they were excluding women. But they did a study and picked the voice people would respond to best. An objective choice. The author set out to find sexism, and by golly, they found it. Amazing.

[–] Drewelite@lemmynsfw.com 4 points 5 months ago

Yeah, this is a real "let me find a problem and not let them apologize." You don't like OpenAI. OK. I'm sure you have a good reason. So focus on that and stop contriving controversy. You're not changing any minds like that; that only gets kudos from people who already agree with you.

[–] A7thStone@lemmy.world 4 points 5 months ago (1 children)

What a garbage point, if someone is offended by something I did, even if I did nothing wrong, I don't do it to them again because I'm not an asshole.

Putting aside the jury still being out on the last part of that statement, Sam Altman has shown himself to be not only an asshole, but an asshole who will do anything he thinks he can get away with. So the statement you took issue with, "but if OpenAI didn't do anything wrong, why would it take down the voice," is accurate. Considering the pattern of behavior from Altman and OpenAI, that action is a rather implicit admission of guilt.

[–] EatATaco@lemm.ee 4 points 5 months ago

They did something wrong. We both agree.

But the suggestion that doing something to correct the offense counts as an admission of guilt is garbage logic. This is exactly why people are so hesitant to apologize or to correct perceived wrongs: because people treat doing so as an admission that you did something wrong.

[–] CopernicusQwark@lemmy.world 1 points 5 months ago (3 children)

Why not just have both a male and female voice, and ideally one that's as neutral as possible?

[–] EatATaco@lemm.ee 7 points 5 months ago

They do have two male voices. The article is complaining about the choice of Sky for the demo.

[–] Jarix@lemmy.world 5 points 5 months ago

Neutral is boring? Flaws add as much character to a thing as beauty

[–] toofpic@lemmy.world 2 points 5 months ago

There is a choice of different voices; it wasn't and isn't a problem. But Sky was the best in my opinion, so even though I support Ms. Johansson's right not to hear her voice coming out of every device, it's personally kind of sad that they're removing it.

load more comments (8 replies)
[–] Buffalox@lemmy.world 23 points 5 months ago* (last edited 5 months ago) (2 children)

IMO it sounded fake, not fake as in artificial or not real, but fake as in not honest or genuine. Like a bit-too-much, over-attached girlfriend.
Don't get me wrong, it was very impressive, but IMO they should tone down the fake enthusiasm.

[–] somethingp@lemmy.world 15 points 5 months ago (2 children)

GPT is just trying to get a good tip.

[–] assassin_aragorn@lemmy.world 8 points 5 months ago

Don't stick your dick in that

[–] j4yt33@feddit.de 3 points 5 months ago

[–] helenslunch@feddit.nl 2 points 4 months ago (4 children)

Some people sound fake also. They're very popular.

load more comments (4 replies)
[–] Buddahriffic@lemmy.world 16 points 5 months ago* (last edited 5 months ago) (2 children)

Even if they hired an actress with a similar voice to train the AI to sound like Johansson, celebrity impersonators have been doing that for (I'd guess) longer than recorded voice media has even existed. I'm having a hard time seeing why one is fine but the other isn't.

Edit: corrected bad spelling of her name.

[–] GamingChairModel@lemmy.world 21 points 5 months ago (1 children)

I'm having a hard time seeing why one is fine but the other isn't.

I think the law says that neither is fine in the context here. The law allows celebrity impersonators to engage in parody and commentary, but not to use their impersonation skills to endorse products, engage in fraud, or pretend to actually be the person being impersonated.

[–] Buddahriffic@lemmy.world 7 points 5 months ago (4 children)

But this is just using a voice. It might even be the actress's natural voice. I don't think there's fraud, because it wasn't presented as Scarlett's voice. If it hadn't been made clear that it wasn't her voice, then maybe those other two would apply, though is allowing a service to use your voice the same as endorsement? Is merely sounding like someone enough to be considered impersonating them?

This situation lands in a grey area where I can't endorse or condemn it. I mean, it would have been smarter to just use a different voice. Find a celebrity that would sign on or just use an unrecognisable voice. Ethical or not, and legal or not, it was stupid.

[–] zik@lemmy.world 9 points 5 months ago

It was explicitly represented as her voice when he tweeted "Her" in relation to the product, referencing a movie which she voiced. It's not a legal grey area in the US. He sank his own ship here.

[–] GamingChairModel@lemmy.world 9 points 5 months ago

I'm mostly going off of this article and a few others I've read. This article notes:

Celebrities have previously won cases over similar-sounding voices in commercials. In 1988, Bette Midler sued Ford for hiring one of her backup singers for an ad and instructing the singer to “sound as much as possible like the Bette Midler record.” Midler had refused to be in the commercial. That same year, Tom Waits sued Frito-Lay for voice misappropriation after the company’s ad agency got someone to imitate Waits for a parody of his song in a Doritos commercial. Both cases, filed in California courts, were decided in the celebrities’ favor. The wins by Midler and Waits “have clear implications for AI voice clones,” says Christian Mammen, a partner at Womble Bond Dickinson who specializes in intellectual property law.

There's some more in there:

To win in these cases, celebrities generally have to prove that their voice or other identifying features are unregistered trademarks and that, by imitating them, consumers could connect them to the product being sold, even if they’re not involved. That means identifying what is “distinctive” about her voice — something that may be easier for a celebrity who played an AI assistant in an Oscar-winning movie.

I think, taken together with the fact that the CEO made a direct reference to the movie in which she voiced an AI assistant when announcing the product, that's enough that a normal person would "connect them to the product being sold."

[–] Paragone@lemmy.world 5 points 5 months ago (1 children)

I read that Scarlett's family & friends couldn't tell it apart from her actual voice.

I'd say that "Open AI" or whatever they're called, trained it specifically on only her voice.

The seems-narcissistic-machiavellian-sociopath CEO, whats-his-face, tried to get her to agree to this,

she wouldn't agree,

he tweeted "her" when releasing the update (after Scarlett's movie),

she lawyered up,

he backed down.


I'd say it's a clear case of identity-theft-for-profit of a celebrity, by a consistently narcissistic-machiavellian-sociopath who's kinda leaving lots of corpses of "integrity" all over the place.

There's some law which protects celebrities from use of their likeness, and rightly:

it's their "coin" that their career is made-of, right?

_ /\ _

load more comments (1 replies)
[–] mojofrododojo@lemmy.world 2 points 5 months ago

if they didn't need to license it, why did they repeatedly try?

[–] stellargmite@lemmy.world 8 points 5 months ago (1 children)

Legally maybe it's fine, I'm not sure. But they tried to license or get permission and official involvement from her, and she declined; then they asked again, she declined again, and two days later they released it with (possibly) her voice anyway. At best this shows them to be bad-faith plundering abusers, including of individuals' likenesses. We in this type of forum are not surprised, of course; it's par for the course with these tech bros who've made a business out of other people's content, largely without consent. Respect to Johansson for making this known publicly, though. But it's even weirder that they then took it down when they saw the reaction, highlighting themselves as sociopaths. Plenty of those around, but with this much power and access to data? Creepy.

[–] Buddahriffic@lemmy.world 3 points 5 months ago (1 children)

Yeah, it is kinda sketchy, though they might have backed down because they realized there was no winning this in the court of public opinion, regardless of whether they were trying to act in good faith prior to the controversy coming out.

IMO Johansson making it public was an obvious strategic move: it gave her a strong position because of how unpopular AI is these days. She might otherwise have just paid some lawyers a lot of money to accomplish nothing, if it was legally fine and she was adamant about them not using a voice that sounded like hers (guessing the best she would have gotten without going public is them paying her some money to continue using that similar voice, or maybe a bit more money to use her actual voice; either way they would have gotten what they wanted).

[–] stellargmite@lemmy.world 3 points 5 months ago

Yeah, she effectively chose an ethical position with no downside I can think of, unless they made her sign an NDA/MOU, which they clearly didn't. Their sketchiness is only enhanced, if anything. Makes me wonder if they made some low-level threat in that last-minute approach, e.g. "we are using your voice anyway; now's your chance to get on board the gravy train or look bad." Just speculation of course; she wasn't aware, apparently. Also, the fact that they want to mimic the "Her" AI is just weird. They are worse than the cautionary fiction.

[–] JackbyDev@programming.dev 8 points 5 months ago (1 children)

Isn't calling the voice "flirty" sexist? It didn't seem flirty in any of the clips I heard of it.

[–] toofpic@lemmy.world 2 points 5 months ago

This was my voice of choice because it sounded like the "person on the other side" is engaged, like when you talk to a friend or teacher who's interested in the topic.
If somebody thinks it's somehow related to sexuality, gender questions, etc., they need to check themselves.

[–] SeattleRain@lemmy.world 8 points 5 months ago* (last edited 5 months ago) (1 children)

I hate to defend Sam, but Scarlett does not have a patent on a bubbly Midwestern accent.

[–] zik@lemmy.world 12 points 5 months ago* (last edited 5 months ago) (5 children)

He tweeted "Her", which explicitly tells us it's a deliberate imitation of Scarlett's voice in that movie. And he tried to negotiate licensing her famous voice, which she rejected.

So it's more than just a coincidence; it's deliberate bad-faith behaviour. Legally, you can't misrepresent a product as being from a famous person when it isn't, and he very much did that. I guess he was hoping she'd give in and accept the licensing agreement post facto. But instead it looks like he's in legal deep water now.

load more comments (5 replies)

What, does it talk like that grey bluetooth speaker often featured on Dankpods?
