Too Creepy for Words? The Uncanny Valley Claims Gabby Petito

Netflix takes a high-profile chance on voice cloning…and totally creeps out its audience


I’ve spent what feels like far too much time in the past couple of years thinking and writing about AI-generated synthetic voice overs and their impact on my profession. (Just look around my blogs here, and you’ll see.) Pretty much everyone in the professional voice acting community remains preoccupied with this topic too.

I promised myself that I would give it a break. After all, there are other things to talk about.

And besides, I had personally arrived at a fragile peace with this AI controversy. While AI is here to stay, I’m secure enough in my own skills and marketing ability that I don’t need to succumb to the white-knuckled panic that seems to have paralyzed some of my peers.

But just when I thought the whole AI voice over issue had been beaten into the ground and there was nothing new to say…

…along comes Netflix with its recent true crime mini-series American Murder: Gabby Petito.

This has managed to introduce an entirely new twist to the AI debate, not to mention opening up a fresh moral/ethical can of worms. And this time it’s not just voice actors — it’s got everyone talking.

The details of this headline-grabbing story were disturbing enough. But for many viewers, Netflix crossed the line when it applied voice cloning technology to audio samples of the young murder victim so it could “bring her to life” and let her digital voice replica recite entries from her own diary.

The eerie result is what sounds like Gabby herself speaking directly to you, revealing in her own words what she’s feeling in the weeks and days before her life is cruelly snuffed out by her abusive, possessive nut-job of a boyfriend. It feels oddly like she’s right there in your living room, narrating the events we already know will lead to her own demise.

And that just isn’t sitting well with audiences.

Netflix is careful to explain that it got the permission of Gabby Petito’s family to clone her voice, which supposedly should put viewers’ minds at ease.

Except it hasn’t.

The backlash in the weeks since its release has been widespread, with heated discussions found all over TikTok, X, Reddit, YouTube, Facebook, and countless online articles.

Although Netflix producers have defended their actions, saying they did it in service of telling Gabby’s story, this has touched a nerve. Even Gabby’s family admits that despite giving their blessing to this project, hearing her voice like this still upsets them.

Online reaction generally falls into one of the following buckets:


  1. This was uncomfortable, exploitative and completely unnecessary. Don’t ever do this again.
  2. Chill out. Her family gave permission, so it’s fine.
  3. Replicating her voice to share private, deeply personal content is unethical and sick.
  4. It didn’t bother me a bit. Got used to it after a while.
  5. No one has the right to grant “permission” to use a person’s voice after they have died. It’s disrespectful of the dead, especially the victim of a violent murder.
  6. Why didn’t they just hire a voice actor instead?


This brings to mind the term “uncanny valley,” the concept in AI and robotics that describes the freaky feeling of discomfort or unease you get when something artificial comes unsettlingly close to resembling an actual human being, but still isn’t quite right. It’s what most pisses off voice talents like me: we hate that some people marvel at how “lifelike” it all sounds, while at the same time it’s unmistakably, hopelessly phony. It’s devoid of emotion and nuance, and stripped of humanity.

It looks like Netflix has managed to put that uncanny valley notion on steroids. Some viewers noticed that even when expressing her deepest fears and anxieties, the “Gabby” AI voice feels emotionless and flat. One viewer went so far as to say it sounded like the producers gave her a “happy voice,” which he found totally inappropriate.


Here are a few stars from the past whose voices you could hear again someday.
But does that make this okay?


Before his death, James Earl Jones executed a legal agreement to allow AI to clone his voice specifically as his iconic character Darth Vader, for unlimited use in any potential future projects connected to the Star Wars franchise. Similarly, the estates of Burt Reynolds, James Dean, Judy Garland, and Sir Laurence Olivier cut deals that would permit the cloning of their voices.

Is this a slippery slope we should be heading down? Is it okay if it’s a celebrity or public figure, but not a murder victim? Is it okay for anyone? Should families or estates even have the right to grant this kind of ghoulish “permission”? Do the rights to a person’s unique voice print survive after that person is long gone?

Should we quite literally and figuratively be putting words in the mouths of the dead?

While it initially sounds kind of cool to imagine fresh new Darth Vader content, or to think about all the creative ways a celebrity voice from the past might pop up in some future show or movie, in the wake of this Netflix series I really don’t like some of the very dark places this could take us.

I think my feelings on the matter could be summed up in the old saying: just because you can, doesn’t mean you should.

Maybe let’s just heed another old saying, and let the dead rest in peace.
