1001 10ve 0001

Spike Jonze’s new movie, “Her,” presents a novel iteration of the Western fear of artificial intelligence: the fear that we might fall in love with it. Everyday people express less extreme versions of this anxiety: the fear that texting, Facebook chatting and chatbots might replace human interaction, that a Candy Crush addiction has gone too far, that the ease and aptitude of technology will render some people’s skills obsolete. Perhaps fear is a strong word, but as society deepens its reliance on technology, the discomfort that stems from that dependency is far outshone by what underpins it: our infatuation with new technology. It may seem like a stretch, but human relationships with technology aren’t too far from the reciprocal romantic relationship portrayed in “Her.”

There’s no doubt that people have grown attached to technology. Aesthetically powerful games mesmerize users, and organizational apps orchestrate real-life events, from scheduling to reminding. Dating sites, whether Meet an Inmate, Farmers Only or the more traditional Match.com, go so far as to initiate and shape real-life relationships. But what makes a movie like “Her” seem far-fetched to us isn’t the reliance or dependence; it’s the sense of reciprocity. It seems ludicrous to think of a computer, or any brand of artificial intelligence, exhibiting humanity.

Yet that seems to be the next step. Technology is already remarkably good at responding to human prompting. Typing a line of code produces an output. Swiping advances to the next picture. Some phones vibrate slightly with every touch, a raw physical response. The question is: Can our gadgets reciprocate emotion? Human emotions, on a broad scale, aren’t impossible to predict. You can probably guess most stupid puns before your friends make them, and an understanding of hormones and the endocrine system provides further insight into the way people respond to different situations, behaviorally and emotionally. Throw in a dash of weighted randomness, and you could produce a machine that exhibits sometimes irrational, but largely rational, emotional responses to external stimuli. A lab in Singapore directed by Dr. Hooman Samani is working to develop a robot with artificial intelligence that is “an active participant in the communicative process, [adjusting] its affective state depending on its interactions with humans.” Namely, it becomes content (making happy beeps à la R2-D2 and flashing purple LEDs) when touched by a human companion, or appears jealous (making grunting noises and displaying green LEDs) when it isn’t receiving attention.
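
To make the idea concrete, here is a toy sketch in Python of what a stimulus-to-emotion mapping with weighted randomness might look like. The states, weights and names below are invented purely for illustration; they are not Dr. Samani’s actual model.

```python
import random

# Toy "affective state" sketch: a companion gadget whose emotional response
# to a stimulus is mostly predictable, with weighted randomness supplying the
# occasional irrational reaction. States, weights and outputs are invented
# for illustration; this is not the Lovotics lab's actual model.

STATES = ["content", "neutral", "jealous"]

# Hypothetical mapping from external stimulus to probabilities over states.
STIMULUS_WEIGHTS = {
    "touched": [0.80, 0.15, 0.05],  # usually content when given attention
    "ignored": [0.05, 0.15, 0.80],  # usually jealous when attention stops
}
DEFAULT_WEIGHTS = [0.20, 0.60, 0.20]  # anything else: mostly neutral


class ToyCompanion:
    def __init__(self):
        self.state = "neutral"

    def react(self, stimulus: str) -> str:
        weights = STIMULUS_WEIGHTS.get(stimulus, DEFAULT_WEIGHTS)
        # Weighted random draw: largely rational, sometimes surprising.
        self.state = random.choices(STATES, weights=weights, k=1)[0]
        return self.express()

    def express(self) -> str:
        # Crude stand-ins for the beeps and LEDs described above.
        if self.state == "content":
            return "happy beeps, purple flashing LEDs"
        if self.state == "jealous":
            return "grunting noises, green LEDs"
        return "quiet idle hum"


if __name__ == "__main__":
    robot = ToyCompanion()
    print("touched ->", robot.react("touched"))  # most likely: happy beeps
    print("ignored ->", robot.react("ignored"))  # most likely: grunting noises
```

A lookup table and a weighted coin flip are, of course, a caricature of emotion, which is exactly the point of the next paragraph.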

This isn’t to say that producing a response that simulates emotion is the same as producing an actual emotion, but it does raise the question: What does it take for a person to develop feelings for something or someone? Do they have to be confident that the same emotional and chemical processes that drive their own behavior also motivate the actions of the other?

For instance, as robots become more and more lifelike, something causes humans to hold a distinct distrust of them, a phenomenon often called the “uncanny valley.” The basic idea is that the neural pathway that lets us recognize motion and the one that lets us recognize facial features intersect at the parietal cortex. If a robot has a lifelike face but a limited, mechanized range of motion, the two pathways send mismatched signals: one piece of information calls the robot human, the other calls it mechanical. The incongruity prompts distrust.

I hypothesize that if Dr. Samani produced a robot capable of realistic emotional responses to human companions, a similar distrust would develop. Seeing a projection of realistic human emotion while recognizing that the technology is distinctly non-human (whether from visual cues of mechanized motion or simply from the knowledge that its behavior is defined by algorithms) would cause the same kind of discomfort.

Still, despite these imbalances in reciprocation, I think people have already fallen in love. Perhaps not in the manner of “Her,” but people have become obsessed with, dependent on and validated by their technology. All it takes is a click on one of your seven remotes, and your plasma/HD/flat-screen/able-to-predict-what-you-want-to-record television will show you hundreds of inter-human relationships that label themselves as love without half the qualifications a person might have with their iPhone. I fail to see any more depth in “The Bachelor” than in these strange human-technology relationships.

When it comes down to it, I think the true appeal of technology is that it makes you a better person. That line is said over and over in wedding vows, but possessing a 4G device really does make you better, superhuman even. It makes you a person who never needs to stop for directions and can always come up with the name of the actor in that one movie. Technology can be an engaging and empowering companion that requires minimal investment or emotional risk. Still, the responsibility falls on us to actively define our relationships with technology. Without that active definition, this age of high-volume tech use could easily become one of isolated people and hundreds of dollars mindlessly lost on Candy Crush upgrades. Loving our gadgets is more multi-faceted than it seems.

Lydia Thurman is a Trinity junior. Her biweekly column will run every other Tuesday. Send Lydia a message on Twitter @ThurmanLydia.
