OpenAI reveals details about ChatGPT voices, with special reference to Sky

In their blog post, OpenAI announced that they are pausing the use of Sky in ChatGPT.

Here’s how OpenAI chose the voices for ChatGPT:

https://openai.com/index/how-the-voices-for-chatgpt-were-chosen/

6 Likes

I wonder if Scarlett Johansson's agents sent them notice, or if this is a preemptive move by OpenAI to quell any potential problem because of the similarity.

1 Like

I didn’t dig too deep into it, but I saw this…

3 Likes

I honestly didn’t think Sky sounded anything like Scarlett Johansson or “Samantha.”

3 Likes

Hi @jesterk and welcome to the community.

I am with you. I never drew any connection before I was told to, and honestly couldn't pick her voice out of a lineup. So dumb.

3 Likes

Well well well, look what I just found:

I believe this came from her agent. NBC is already reporting on it.

7 Likes

I mean, there are a lot of people who sound alike among billions of humans.

Maybe Scarlett is trying to get money from the soon-to-be biggest company in the world? I dunno…

6 Likes

It’s interesting, but…

Unless it can be proved OpenAI deliberately set out to deceive people into believing the voice was that of Scarlett Johansson, it’s a non-issue.

It’s been fairly common for companies to use sound-alikes in all manner of media since basically forever.

I don’t remember any discussion anywhere in which anyone claimed they really believed Sky was Scarlett Johansson.

On this forum there was one mention of Scarlett Johansson related to Sky.

There were a number of people who made the comparison on Reddit, but if you objectively listen to samples of each, it’s clear they’re not the same voice.

Sky isn’t as raspy and doesn’t crack the way Scarlett Johansson’s voice does.

9 Likes

Oh, I completely agree!

The reason I shared this info, and why I find all of this fascinating, is more or less this point here:

This is not quite true, or rather, I don’t think that’s the point, because this is actually a copyright / IP law issue.

Scarlett Johansson can allege that OpenAI deliberately stole her likeness and bring OpenAI to court to prove one way or the other whether they actually did so. Whether or not they did would then be up to the court of law. Anyone with a lawyer willing to take the case, and the money to pay that lawyer, can do so.

I do not actually believe they did this. I think they did exactly what you said, which is hire a voice double, tell them “Okay, now sound like Scarlett Johansson,” and call it a day.

So, if the above is true, why would OpenAI take down Sky anyway?

Well, I think the biggest thing here is that if OpenAI were dragged into court for this, they would be obligated to demonstrate proof beyond a reasonable doubt that they did not do this, which would require them to divulge their process in great detail. They obviously do not want to explain on public record something that is likely a trade secret and under tight NDAs. So, to avoid this lawsuit, they removed it.

Now, I’m not saying that actors/actresses should not be protected from folks using their voices for things without their consent. Nobody likes people putting words in their mouths.

That being said, let’s not forget who these individuals are:

On one hand I get it; on the other, it’s very hard for me to find sympathy for some of these individuals and treat them like victims when they have $165 million in the bank. That is $165,000,000. So, even if OAI did pull a fast one, I do not see that affecting her literal dragon-hoarded mountain of cash. It’s the non-wealthy VAs who can’t incorporate themselves that I have much more sympathy for, but the reality is those VAs are the ones actually providing their voices for AI-generated content, because that gives them more money and opportunity than traditional voice acting gigs. The ones making the most noise against this are literally the wealthiest individuals on the planet. It’s really bizarre to me.

Notably, too, Scarlett Johansson is infamously litigious. She’s been in the courtroom a few times before, and that has contributed to her net worth. This is not new. This is a very tactical move on her part.

5 Likes

You don’t need proof beyond a reasonable doubt in a civil case; that’s for criminal cases.

You need a preponderance of the evidence in civil cases.

2 Likes

Ah, thanks for that catch. Clearly, I am not a lawyer lol.

2 Likes

What I have to say is that at the moment ChatGPT does not have a voice for the Spanish language.

What comes out as Spanish is not really Spanish; it is Hispanic-American Spanish, which is quite different.

I must confess, I don’t quite understand the concern. It is clearly stated that “Sky” is not voiced by Johansson, but by another voice actress. Surely, Johansson does not own the rights to all human voices that might sound similar to hers?

Personally, I like the voice of “Sky” a lot and have used it since its introduction, so I have grown accustomed to it and would not like to see it go away.
Edit: Sorry about the double post; I replied to the wrong person in the first one.

6 Likes

Of all of the AI technologies, I think voice cloning is probably the most dangerous.

That’s not what happened here, obviously, but I think it’s an example of how much impact it could potentially have.

Even a voice that just sounds sort of like someone famous is attracting national attention. As with deepfakes, we are definitely moving onto dangerous ground.

When you take all the AI tech together, it’s going to start generating quite a bit of confusion as to what is real. And it won’t be the fake stuff that confuses us, but the doubt we will begin to have about the real stuff:

This seems to me to be one of the bigger and more immediate AI risks. Curiously, it’s not really listed here - AI Risks that Could Lead to Catastrophe | CAIS

1 Like

I think the issue was that Sam said “her” on Twitter, alluding to the movie.

Probably not a great idea.

3 Likes

“Her” could mean a lot of things.

Maybe even that users are going to find this app very similar to the one in “Her.”

I guess we’ll see how it goes.

In my testing, Sky on ChatGPT, and Nova and Onyx on the API, are relatively more capable than the rest.
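For anyone wanting to compare the API voices themselves, here’s a minimal sketch using the official `openai` Python SDK. The sample sentence, voice list, and output filenames are just illustrative assumptions, and you’ll need an `OPENAI_API_KEY` in your environment. As far as I can tell, the ChatGPT app voices like Sky aren’t exposed under those names in the API.

```python
# Minimal sketch: generate the same sentence with a few API voices
# so you can compare them by ear. Assumes the official `openai`
# Python SDK and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

SAMPLE_TEXT = "The quick brown fox jumps over the lazy dog."

for voice in ["nova", "onyx", "alloy"]:
    speech = client.audio.speech.create(
        model="tts-1",      # standard text-to-speech model
        voice=voice,        # one of the built-in API voices
        input=SAMPLE_TEXT,
    )
    # Save each clip to its own file, e.g. sample_nova.mp3
    speech.write_to_file(f"sample_{voice}.mp3")
```

Listening to the generated clips side by side is the easiest way to form your own opinion on how the voices differ.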

3 Likes

If generated content that slightly resembles a famous person can be grounds for legal action, I think it sets a dangerous precedent that goes beyond a simple reference.

It’s unreasonable to think that a voice actress should be penalized simply because her natural voice resembles that of a famous actress. This could extend into other areas where people are restricted or penalized for attributes simply because they resemble a more famous individual.

4 Likes

Yep, though I think the legal action is just a symptom of the larger issue: the confusion that AI is creating between what’s human generated and what’s not.

1 Like

It’s similar with Swedish. Personally, I don’t find it such a big deal when Juniper speaks Swedish, which is the only voice I use.

To me it varies between sounding like some non-specific Swedish dialect and sounding like an American who is extremely fluent in Swedish, with perfect pronunciation and just a slight hint of not being Swedish due to the sometimes American-sounding r’s. :grin:

Maybe it’s worse in Spanish?

1 Like

I don’t think it’s primarily about money.

From what I read in the Variety article, Scarlett is freaked out because she suspects OpenAI might have used her voice, or intentionally directed someone to sound like her. Regardless of whether that is legal or not, I get why she is reacting. What’s really unsettling for her is the context, considering Altman’s involvement and how it all feels very “Her,” the movie she starred in where her voice was used as an AI assistant. The whole situation is hitting too close to home for her, which makes it personal and feel invasive.

From Variety: …setting out what they had done and asking them to detail the exact process by which they created the ‘Sky’ voice.

She wants OpenAI to prove how they did it because she has suspicions about the methods. If they used her voice without consent, that’s a huge invasion of privacy. Her trust in the company is low, so she won’t settle for just public comments. So it’s not just about money but about the right to control how her unique voice is used, especially given how distinctive and recognizable it is.

Sound-alikes are used all the time, sure, but it often comes down to context.

I would guess that one of the reasons she bowed out was that she understands how creepy a section of her fans can be, considering past events. She has already been hit by that previously. That makes her extra sensitive to all of this, which I think is understandable.

But I am sure this will all settle soon.