AR, AI, and Emotional Labor

Cortney Harding
Jul 22, 2018

I recently spent a fascinating day at an event put on by Samsung Next and the NYC Media Lab, featuring wide-ranging discussions on artificial intelligence (AI) and characters — specifically, how we will interact with AI characters as the technology progresses. In the afternoon, I led a breakout session on AI and emotional labor, including a discussion of how our relationships with AI characters and robots will affect our relationships with other humans.

Already, we are starting to see a change in how we interact with service workers, driven by Alexa and Siri. Young children are adept at using digital assistants to get information and fulfill requests — but they have a hard time distinguishing between the devices that provide them with things and the humans who do the same work. More than one parent I know has told me their kids yell at Alexa and then turn around and yell at service workers, because they see the two as equivalent (for the record, all of those parents were horrified and immediately corrected the behavior).

And the trend toward this type of interaction will only continue, and become more human. One startup at the event has a product that lets customers use AR to ask a virtual human questions in a store. On one hand, this is useful and great: a boon for customers and a way for service workers to focus on other tasks. On the other hand, if a user comes to expect this kind of experience and is then forced to deal with a human who doesn't have immediate answers or provide flawless service, the annoyance will be that much greater.

If virtual people start replacing service providers, those left in service positions will be held to impossibly high and even abusive standards. And as always, it's worth considering how these virtual people will be gendered and presented — if current digital assistants are any indication, many of them will be female and extraordinarily compliant and helpful. Women in the service industry are already expected to do lots of emotional labor on top of their regular labor, and it's not uncommon for customers to mischaracterize that friendliness and get angry when it isn't reciprocated. If AI women flirt, cajole, and flatter while also providing data, it's a slippery slope toward terrible expectations.

All of this said, the name of my company is "Friends With Holograms," so I'm generally bullish on the idea of interacting with virtual people. Another point that came up was the need to be respectful, or at least non-judgmental, about the attachments some people will form with AI — as the technology progresses, it's not hard to imagine that we'll have virtual friends or even virtual romantic partners. For many people who struggle with human interaction, this will provide a valuable outlet and help them feel better. We just have to make sure that as all of this moves forward, we keep an eye on how it is being presented, and who it could harm.

Cortney Harding

Founder and CEO at Friends With Holograms. Adjunct at NYU. Bylines Billboard, Ad Week. Speaker. Ultrarunner in my spare time.