Can You Feel Me?
The awkward mating of man and machine
The car was hot and stuffy with boredom. We were on our way to the beach, and it seemed to be taking forever. Pop music rose up over the front seat with the dull metronomic quality of crickets – it seemed like every song had the same nauseating hook. “Dad,” I asked, frustrated, “why does every song have to be about love?” “Because,” said my dad, “it’s the greatest thing in the world.”
As with most things, my dad was probably right about this. Certainly the world agrees. Love sits at the center of our music, our movies, our conversations, and right now, even our debate stage. One of the most anticipated love-lush launches this fall is Modern Love, the new streaming TV series produced by Amazon. Based on the long-running New York Times column documenting the messy and often heartbreaking real-life stories of love, romance, and relationships, the series boasts stars like Anne Hathaway, Tina Fey, and Andy Garcia. And it makes sense. In a love-starved world, the Times column has spent 15 years operating as a sort of time-lapse capsule of the ever-multiplying variations of human connection. Since its inception in 2004, this lens on human behavior has spawned 80,000 submissions, 670 published columns, one book, and over 100 podcasts. While heartbreak and obsession still dominate, regrettable phone calls and emails have mostly given way to regrettable texts, Instagram posts, and Tinder swipes. Love is love, whatever vehicle carries it our way. And we love it.
How do we reconcile our insatiable hunger for the messy and volatile world of humanness with the rapidly scaling world of robots and AI? How will we merge our belief in the kismet-propelled pursuits of human existence with our desire for a personalized path to all the things we want and need? We say we want to detox from our devices, but we insist that they connect us with what we value most. We want the companies and brands we interact with to know us intimately – but we don’t really trust their intentions. We fiercely defend our right to privacy, yet demand genome-deep knowledge of our tastes to serve us the music and books we will delight in. (Hmm… talk about the humanity of it all!)
This tension is highlighted in two new reports: the most recent Euromonitor report and a white paper entitled “The Personalization Paradox,” published by the NRF (release date: August 7). The latter shares these stats:
- 63% of millennials like it when a website keeps track of their activities and makes recommendations to them
- 58% of all consumers are increasingly skeptical about trusting technology
It seems we want to have our cake and eat it too (though it all might depend on when and where we have that meal). According to a recent global survey conducted by MIT Technology Review Insights, 90% of companies are deploying AI across some aspect of their customer journey. But not all has been beautiful along the way. In fact, some brands that leaped into the world of chatbots and AI processing in its earliest iterations seem to have scaled back. Sephora now hosts a community-based recommendation platform on its ecommerce site, allowing beauty passionistas and stylists to weigh in with advice and ideas alongside algorithmic support. Amazon’s fashion stylist Echo Look also uses a recommendation process that blends data and human experts. We don’t seem to be quite ready to let go of the side of the pool.
In May 2018, Google released a demo of Duplex, an artificial-intelligence personal assistant that can make phone calls for you, stepping in when you might need to book a haircut or reserve a table at a restaurant. During these calls, Duplex inserts “umms” and “ahhs” into the conversation, imitating the syncopation of natural dialogue. Its multiple voices have unique personalities and use linguistic quirks like “I gotcha,” not to mention millennial uptalk. The response was chilly. Although we can all think of a thousand uses for this type of technology, there is a marked sense of discomfort in releasing a ‘fake human’ to play the role of us. Is it simply a form of catfishing? Or is it just trying to humanize a task we often use the least human of apps for? (Interestingly enough, it turns out that Google, like the other brands mentioned, is still relying on humans to decide where a real person is the better option for the task – up to 25% of calls made by Duplex are placed by humans. More time seems to be needed to build the reservoir of human behavior that informs the digital-assistance program.) As we march inexorably toward a place where our more pedestrian tasks can be offloaded to a swarm of VAs, we are questioning the cost. The fear, I believe, is not the science-fiction story of some future where we are all enslaved by our synthetic apprentices. I think we worry about somehow giving our emotions away to something that isn’t real.
Silicon Valley is certainly racing ahead in search of solutions to our troubled minds. From Google’s famed ‘Empathy Lab’ to the empathy consulting groups scattered across Northern California, companies intrinsically understand that scaling empathy is the ultimate Holy Grail. To quote Garage Magazine: “Empathy is what motivates artificial intelligence research and what makes the Turing test so alluring: Emulating human emotion is considered the height of technological sophistication, the effectual equivalent of the nuclear bomb.” We desire a life framed by a technology driven by something, if not someone, that intrinsically knows us – possibly better than we know ourselves. And we know the catch: if AI is really going to provide all the elements we seem to desire – deeply personal ways and ideas to make our lives more of what we imagine them to be – we need to become, eerily, more and more vulnerable to our devices. Just like any good relationship, it seems the more we share, the better our partners can try to make us happy. And so we all seem to be negotiating this awkward dance into the future. It will come, no doubt. The trick seems to be who leads and who follows.
Will stories created by machines replace our yearning for tangled tales of the heart? I will give Danielle Krettek, the Founder and Principal of Google’s Empathy Lab, the last word:
I think that when it comes to the magic and mystery of emotion, I think you can look at the idiosyncrasies of the dance of emotion in a person and think that there’s no pattern in that. But in truth we all do have our patterns—like literally there are emotional rhythms and emotional tendencies. So I think if we allow machines to observe us long enough, they’ll probably be able to mimic us very convincingly. But my personal opinion is that the real emotional connection—that real empathic connection, and the idea of being self-aware—I think is a uniquely human thing.
We may indeed be creatures of emotional habit. But it is our predictable unpredictability that provides the delight to this ride we are all on. I’ll be watching Modern Love come this Fall, applauding the whole glorious mess.