The first part of this article can be found here: Good AI is Ethical AI: The Media and Entertainment Industry Must Rethink Itself
Here, we continue the conversation on AI ethics and regulation, as SMPTE Chair Renard Jenkins; Yves Bergquist, Director of the ETC AI and Neuroscience in Media project; and AMD Fellow Frederic Walls explore the importance of trial and error in AI.
Watch the full video below or read on for highlights.
The importance of experimentation
Bergquist hopes the idea that organizations and individuals have to be perfect, that they can't make mistakes, will go away. We're all imperfect.
Instead, he wants to move toward a culture of honesty and experimentation. “What I love about this technology is that it’s a work in progress. It’s an ongoing conversation. And we have to use it, we have to figure out what’s good for us, what’s not, when does it work, when doesn’t it work? When is it ethically right?”
Bergquist argued: “It’s actually very important for all of us as a community to use half-baked technology and participate in its development.
“From a creative point of view, from a technical point of view, from an ethical point of view, it is normal to make mistakes. It is normal to deploy a model that has biases, provided that you have a very honest and transparent dialogue with users and consumers.”
Jenkins added: “That’s the key, to do that kind of play when you’re in a safe space. But also having a diverse group of people as part of your UAT team, having a very diverse range of experience represented in the model itself. All of those things have to come into play.”
Additionally, organizations like SMPTE “need to be there to teach people what AI is and what it is not, giving them the tools to evaluate all the hype that is coming out of Silicon Valley,” Bergquist said.
“So that’s the first thing. The second thing is we have to be honest about what the tools are, right? We have to be honest with the fact that, look, these are very imperfect tools. We’re just at the very beginning of this journey toward artificial intelligence.”
Diving into deep fakes
Speaking of honesty, Jenkins admits, “I have some fear about how the tools are going to be used.”
One potentially problematic use case is deep fake technology, an ethical and regulatory issue that looms large in the public imagination.
Bergquist noted that there are different levels of sophistication when it comes to deep fakes. There are hoaxes carried out by high school students and then there are “state actors who create really elaborate deep fakes that are much harder to detect.”
Walls said: “The idea that a single citizen on the Internet can create very compelling media that could replicate someone else’s… opens up all sorts of possibilities for harm and problems and damage to society.”
“I am quite pessimistic about our ability to technically and reliably detect deep fakes made by sophisticated actors,” Bergquist admitted.
Still, “people are thinking about these problems and trying to engineer systems to address these concerns. There are also, of course, human-oriented ways to detect deep fakes,” Walls said. At least for now, “look for the guy with the wiggly chin, or six fingers, or whatever.”
He added: “There’s certainly research going on to… try to distinguish between real media and fake media. So I think these tools are also going to be very useful in helping people know if this is really something I can trust, right?”
Fortunately, Bergquist said, “there is a tremendous amount of energy in the media community to address this problem.”
Ultimately, “technology is an expression of who we are,” Bergquist said. “I don’t like the fact that deep fakes are now becoming a weapon in the hands of bad actors.” However, “it requires us to come together as a community and design community-wide solutions.”
He also pointed out that not all deep fakes are malicious. “There are also interesting use cases for deep fakes. … There is a documentary called ‘Welcome to Chechnya’ where the documentary maker basically used deep fakes. Instead of blurring the images or anonymizing the people who were testifying… they used deep fakes to give them real faces that were not their own faces. And that makes the documentary much more alive.”
Jenkins added: “People are so afraid. They forget that it’s never the tool that matters. It’s how the tool is used. A hammer is not going to do anything bad on its own.”