More deepfakes are entering popular culture. A few weeks ago, two BBC Radio 1 hosts used voice-swap technology from Respeecher to appear on air with each other's voices. Voicebot.ai reported this week:
“From the very first briefing for The Capture Series 2 we knew we had to get some of the hypothetical tech used in the series and show it was possible today in a LIVE broadcast,” senior creative at BBC Creative Michael Tsim wrote in a LinkedIn post. “We didn’t know how, or whether it was even possible. But after literally months of negotiations, contracts and conversations with some incredibly clever guys in Ukraine (Respeecher!), we actually managed to pull it off. A live, completely real deepfake stunt with the brilliant Greg James and Matt Edmonson on Radio 1.”
Making Deepfakes Mainstream
It used to be that most news coverage of deepfake technology was negative. Aside from a few stories about its use in the Star Wars franchise or the novelty of Deep Tom Cruise on TikTok, it was mostly warnings about how the technology could be used for fraud. Even dramas such as The Capture further that narrative.
Then, the appearance of Metaphysic, and later Respeecher, on America’s Got Talent helped show deepfakes in a positive light and offered them a firmer foothold in popular culture. The appearance on BBC Radio 1 is another step in the same positive direction. Of course, not everyone is on board, as you will frequently see on Twitter.
The Lighter Side of Tech
So far, the incidence of voice-clone-initiated fraud hasn’t materialized at the rate the prognosticators predicted, as I covered in a recent Synthedia post. And such criticism will continue regardless of what happens.
I recall similar sentiment around voice assistants on smart speakers. Pundits were eager for negative stories, but very few materialized. It was the integration of smart speakers and voice assistants into popular culture that helped expose the fear-driven sentiment as a minority opinion. Recall that Alexa’s appearances in Super Bowl ads and on Saturday Night Live cast voice assistants in a positive, if sometimes humorous, light.
Something bad can still happen; deepfake technology can be used for fraud. However, it can also be used for entertainment and some fun. Let me know what you think about the video above.