
They made the choice to use that voice after she declined; otherwise they would never have approached Scarlett. When the recording session occurred is irrelevant. They wanted her voice, couldn’t get it, and then chose to use something similar.


If they really recorded the Sky actor well in advance, then there's nothing wrong with them using it. They aren't required to suddenly delete the other voice actor's work because Scarlett Johansson refused a later proposal to use her voice instead. She owns the rights to her voice, not to any vaguely similar-sounding white female voice put together before they even talked to her.


OpenAI didn’t need to talk to her to create an imitation; there are many recordings of her freely available. Recording beforehand therefore has zero impact on their ability or intent to imitate such a public figure.

The core question eventually put before a jury would be whether the voice was unintentionally similar or an imitation. Attempting to pay her for the use of her voice is clear evidence that they either wanted the voice because of the resemblance or perceived the voices as overly similar. Either is possible, but using the voice anyway then becomes problematic.


It doesn't matter if they "could" have made a copy of her voice if they didn't actually do it, and the voice actress who recorded Sky's voice is saying just that.


Read up on Glover v. Universal: the likeness doesn’t have to be that close to run into issues, and that case involved the studio that owned the character.

I’m not saying the Sky actress did anything wrong. But the final result is close enough that many people assumed it was Scarlett without prompting; that’s problematic when OpenAI was making references to her role as an AI voice and obviously wanted to make a deal.


Glover v. Universal was settled out of court at the behest of Universal's insurer. We don't know who would have prevailed if it had continued.

And of course, some specific aspects of the case are unique. Universal used a face mold they had made of Glover's face, for instance.


The uncertainty is itself information. I would have assumed Universal would have been fine, but presumably expert legal advice thought the risk was fairly high.

I’ll give you that the mold makes the intent clearer, but I doubt the jury is going to be debating intent. Pulling the voice suggests OpenAI thinks this is either a real risk or bad publicity.


"Similar" in that it's a friendly flirty female voice? Is that entire category supposed to be off the table after SJ declined?


There are literally thousands of flirty female voices that would have sounded like some other actress, singer, etc. whom they didn’t approach, and which would therefore have been less risky.

Really, using a voice on a digital assistant that sounds like the actress who played a digital assistant in a film is just dumb. Especially if you then tweet about that film days before releasing the voice.


This changes absolutely nothing about the legality or ethics involved.


Using something similar is a possible legal problem. Creating evidence likely to persuade a jury that the similarity was intentional, even if it wasn’t intended that way, is a problem.



