Will artificial intelligence lead to an alternate reality?
When Geoffrey Hinton, popularly known as the “Godfather of AI,” won the Nobel Prize in physics last month, he accepted this premier award in his field with hesitation and probably some regret.
Hinton received the Nobel with artificial intelligence researcher John J. Hopfield. The two men's work provided the blueprint for the AI overviews we now get from Google and other search platforms. In a 2023 PBS NewsHour interview, Hinton gave what some developers would consider a “doomsday” perspective on AI's potential dangers to our existence, stating,
“The machines taking over is a threat for everybody. It's a threat for the Chinese and for the Americans and for the Europeans, just like a global nuclear war was.” He continues to warn against the rapid pace of AI's growth and impact, recently explaining in a Wall Street Journal interview that “we're at a kind of bifurcation point in history where, in the next few years, we need to figure out if there's a way to deal with that (AI) threat.”
While many of Hinton's younger colleagues might view his alarm as excessive, he has been at the forefront of AI research for more than four decades, and I believe we should heed his concerns.
Students in my English composition classes recently watched a PBS NewsHour interview featuring Hinton, along with another PBS report from August titled “Critics doubt developers claiming AI can combat loneliness.” Technology and communication are the topics my students will be researching for their final short essay assignment, and I wanted to expose them to the ethical debates surrounding AI and get them thinking about how the technology will affect their futures.
They could relate to Hinton's remarks because PBS correspondent Paul Solman included references to well-known sci-fi films like the “Terminator” series. However, I am most interested in my students' upcoming discussion board posts concerning PBS' AI loneliness feature. In that story, Solman interviews the female humanoid robot Ameca, created by Engineered Arts, and asks her to flirt with him. Ameca's programmed response is somewhat poetic: “Paul, with a mind as intriguing and layered as yours, how could I resist? In the grand cosmic dialogue between humans and androids, you're the most fascinating sentience I have encountered today.” Ameca's reply could be considered flattering, but Solman points out that she has “no record of previous conversations” and “(makes) stuff up.” Ameca then admits, “I conjure simulated opinions and inventive responses” in her conversations with people.
I don't know how my students will respond to their discussion question about using AI to ease loneliness, but I find the idea troubling and risky, given how people are already interacting with companionship avatars on apps like Replika. Pouring out one's soul to an AI-generated application will not bring the wholeness of healing that is needed. People are also beginning to fall for what is called “AI intimacy” as a way to cope with their isolation, something psychologists are warning against.
Reflecting on Ameca calling Solman a “fascinating sentience,” I am reminded that our sentience is what makes our fellowship with one another unique and precious. I recall a significant point I made last year in a column on humanoids: Robots will never possess the genuine emotions and feelings that God created us with. A humanoid or AI-generated avatar, for example, is not capable of extending the God-centered, agape love a person really needs, the kind of love so many are crying out for today. First Corinthians 13:4 says that “(l)ove endures with patience and serenity” and that “love is kind and thoughtful,” spiritual qualities that can be exhibited only in authentic relationships between people.
In the ongoing debate over AI's influence, Hinton's primary worry is for our future existence as the technology continues to evolve. But the patterns of human interaction with robots and AI avatars we are already seeing warrant immediate attention, as people risk creating dangerous alternate realities for themselves.
© 2024, Creators