
Moxie Is the Robot Pal You Dreamed of as a Kid

It’s hard to imagine anything less personable than a vacuum cleaner—until you give it a mind of its own. Almost as soon as iRobot released the Roomba into the world, a community of autonomous vacuum enthusiasts started giving their Roombas names, backstories, and custom wardrobes. One of the company’s early TV ads acknowledged this unlikely bond, featuring people talking about their Roomba like it was a person. It’s a big emotional investment in a tool whose sole purpose is to suck up filth, but Paolo Pirjanian, former CTO of iRobot, totally gets it.

“There’s something innate in our mind that triggers when we see something move on its own,” says Pirjanian. “Our experience tells us that it’s a living thing with a life and a consciousness all its own.” It’s the same reason why we mourn the "death" of a Mars rover or laugh when Atlas doesn’t quite land a backflip. We can’t help seeing agency, even when we know full well these machines are just following coded instructions. Our attachment to these automatons is all the more remarkable because they weren’t designed to forge human connections; they were built to do a job. But what if we could tap into our natural empathy for the unnatural and build robots whose job is connecting with humans?

In 2016, Pirjanian cofounded Embodied with the roboticist Maja Matarić to build a better social robot. (Matarić left Embodied in 2018 to focus on her research at the University of Southern California.) This week the company started accepting preorders for Moxie, its first automaton, which will ship this autumn. Whereas other companion robots like the household assistant Jibo or Paro the robotic seal are designed for adults or the elderly, Moxie is built to foster social, cognitive, and emotional development in children. These are skills that are typically imparted to kids by their parents, teachers, and other adults, but Pirjanian noticed that many families want some extra help.

“Studies have shown that the current generation of children are falling behind on their social, emotional, and communication skills, relative to previous generations,” he says. “It’s partially attributed to a lot of screen time and social media, but also pressures at school that add to anxiety, depression, and so on. Every child can benefit from advancing their social and emotional skills.”

Moxie, whose teardrop-shaped head is perched upon a cylindrical, baby blue body, is a cross between a videogame, a pet, and a teacher. Its main purpose is to help children improve basic social skills (like making eye contact) and cognitive skills (like reading comprehension) as they complete tasks supplied by a gamified narrative. Moxie’s backstory is that it has been dispatched from a secret laboratory on a mission to learn how to be a better friend. The child becomes Moxie’s mentor, and Pirjanian’s idea is that they will also improve their own cognitive, emotional, and social skills by teaching the robot.

Robots are well suited for the kinds of repetitive skill-building activities that would quickly wear down a human teacher. They can’t totally replace human interaction (yet), but they may be able to augment it. “There’s evidence to support the idea that social robots can help with skill development in children,” says Kate Darling, a research specialist at MIT Media Lab and an expert in human-robot interaction. “I would call it preliminary evidence, but very promising.”

A growing body of research suggests that companion robots are especially effective for children with developmental conditions like autism. For example, children with autism often struggle with eye contact and reading facial expressions, so it helps to practice with a robot’s exaggerated facial expressions. Pirjanian says Moxie was originally developed for kids on the spectrum, but during testing, “parents who also had a neurotypical child were like, ‘Why can’t we use this for them as well?’ Overall it seems like there is a great need for helping children advance their social and emotional skills.”

But for all their promise, designing and building effective companion robots is a major challenge. The reason, says Erik Stolterman Bergqvist, a professor of human-computer interaction at Indiana University Bloomington, is that “social robots don’t have an obvious function.” They’re designed to be your friend, but companionship is a metric that defies easy quantification. This makes Moxie very different from robots that have a clear job. If you want to know whether a Roomba worked, just look for the dirt.

“What a lot of designers are struggling with is that as soon as you leave the design of things that have an obvious purpose, everything becomes more complicated,” says Stolterman Bergqvist. “You’re asking: ‘How do people relate to people?’ But they relate to each other in complex and diverse ways.”

To meet these challenges, Pirjanian and his colleagues relied on a heavy dose of artificial intelligence. Moxie’s head is packed with microphones and cameras that feed data to machine-learning algorithms so that the robot can carry on a natural conversation, recognize users, and look them in the eye. With the exception of Google’s automated speech-recognition software, all the data is crunched by Moxie’s onboard processor. The more a child interacts with Moxie, the more sophisticated those interactions become, as the robot learns to recognize the child’s face, speech patterns, and developmental needs.
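
At a high level, that division of labor (camera frames analyzed on the robot, only audio handed to a cloud speech-to-text service) can be pictured with a short Python sketch. To be clear, every name below is hypothetical; Embodied hasn't published Moxie's internals, so this only illustrates the on-device/cloud split described above.

```python
# Hypothetical sketch of an on-device perception loop. None of these classes
# or functions are Embodied's; they only illustrate the split described above.
from dataclasses import dataclass


@dataclass
class Perception:
    speaker_id: str | None                    # which enrolled child is present, if any
    gaze_target: tuple[float, float] | None   # where to aim the on-screen eyes
    transcript: str                           # what the child just said


def process_tick(image_frame, audio_chunk, face_model, gaze_model, asr_client) -> Perception:
    """One tick of the loop: vision stays local, only audio goes to the cloud."""
    speaker_id = face_model.identify(image_frame)      # on-device face recognition
    gaze_target = gaze_model.locate_eyes(image_frame)  # on-device, enables eye contact
    transcript = asr_client.transcribe(audio_chunk)    # the only data sent off-device
    return Perception(speaker_id, gaze_target, transcript)
```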

Each week, Moxie is updated with new content based on a certain theme like “being kind” or “making mistakes.” It then sends the child on thematic missions and asks them to report back about their experiences. For example, it might have a child write a nice note for their parents or make a new friend. Pirjanian says he considers Moxie a “springboard” to improve social interactions in day-to-day life. “We don’t want [children] to just binge on this, because five hours of games each day doesn’t help,” he says. “The robot encourages children to go out and practice things in the real world and report back, because that’s where we want them to succeed.”

Pirjanian says that Moxie’s voracious appetite for data is key to the robot’s effectiveness. Not only does the data allow the robot to tailor its interactions to individual kids, but it is also critical for providing feedback to parents. While the robot “sleeps,” it crunches the data from the day’s interactions, measuring things like the child’s reading comprehension and language use, and the amount of time they spent on various tasks. It sends that data to an app that parents can use to monitor their child’s progress on those tasks and their overall social, cognitive, and emotional development as determined by Moxie’s algorithms. Over time, the robot also provides recommendations. For example, if Moxie notices a recurring verbal tic, it might suggest that the parents take their child to a speech pathologist.

Parents might be queasy about letting an internet-connected robot collect data on their kid. Although there are lots of laws on the books governing how companies can collect and use data from children, some researchers are concerned that those laws aren’t equipped to handle the deluge of intimate personal data—including photos and conversations—that will be generated as social robots become more common. “Children are a particularly vulnerable population in terms of not fully appreciating the risks of their data being collected,” says Jason Borenstein, associate director of the Center for Ethics and Technology at Georgia Tech. “There certainly needs to be more discussion at various levels about what kinds of data can and should be collected from children when they’re interacting with robots.”

Pirjanian says Embodied has emphasized privacy and data security in Moxie from the beginning. Parents must consent to their child using the robot, and most of the data collected by Moxie is processed locally on a computer inside the robot. “There was no way in hell we were going to let any images leave the robot,” says Pirjanian. He says only audio data is sent over the internet, so that it can be transcribed using a speech-to-text algorithm. When Moxie “sleeps,” it analyzes these transcriptions and other data from the day, encrypts it, and sends it to a parent’s app. Pirjanian says this means that not even Embodied has access to an individual child’s data; the company only sees aggregated anonymized data from all its robots.
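
The overall shape of that sleep-time routine can also be sketched, with the caveat that the function names, metrics, and encryption choice below are assumptions for illustration rather than Embodied's actual implementation: the day's summary is computed and encrypted on the robot, and only coarse, de-identified aggregates go anywhere else.

```python
# Illustrative sketch of a nightly report pipeline, under the assumptions above.
import json
from cryptography.fernet import Fernet  # symmetric encryption; stands in for whatever Moxie uses


def nightly_report(transcripts: list[str], task_seconds: dict[str, float],
                   parent_key: bytes) -> tuple[bytes, dict]:
    """Summarize the day locally, encrypt the summary for the parent app,
    and return only de-identified aggregates for the company."""
    summary = {
        "words_spoken": sum(len(t.split()) for t in transcripts),        # crude language-use proxy
        "minutes_on_tasks": {task: s / 60 for task, s in task_seconds.items()},
    }
    # The plaintext summary never leaves the robot; only this encrypted blob does,
    # using a key shared with the parent's app (e.g., parent_key = Fernet.generate_key()).
    encrypted = Fernet(parent_key).encrypt(json.dumps(summary).encode())

    anonymized = {"total_minutes": sum(task_seconds.values()) / 60}      # aggregate, no identity
    return encrypted, anonymized
```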

But addressing technological problems was only half of the challenge of creating Moxie. The other half was overcoming the psychological barriers involved with human-robot interactions, which can be even trickier than teaching a robot how to talk. Although people readily ascribe agency to autonomous machines, there’s a limit to how human we like our robots to be. If a robot acts and looks too much like us, it will evoke the revulsion that characterizes the uncanny valley. But if it’s not like us at all, users might not form a connection with the robot in the first place.

There’s an ongoing debate among roboticists about how humanlike to make companion robots. So far, most have erred on the side of caution and limited the use of human features. Robots like Jibo and ElliQ have more abstract shapes and are about as faithful to the human form as a Picasso portrait. When a robot is endowed with eyes or a mouth, they are typically static or animated on a flat screen, which detracts from the robot’s humanness.

With Moxie, Pirjanian and his colleagues bucked many of these trends. Moxie’s teardrop head is fronted with a rounded screen that always displays two cartoonishly large eyes and a mouth. By using machine vision, Moxie can make direct eye contact with its user. “When you put eyes on a robot, you have a responsibility to use those eyes in a way that’s not creepy,” Pirjanian says. “Eye contact is a big part of that.”

Moxie can’t move around on its own, but it can tilt its head and bow at its middle. Unlike most companion bots, Moxie also comes with two flipper-like arms that it uses to accentuate its speech. Each of these design traits was carefully chosen to foster a connection between the robot and its user based on research from fields as diverse as animation and developmental psychology.

Unlike the Roomba, everything about Moxie, from the color of its body to the algorithms in its head, is designed to foster connections with its users. And if it succeeds, it might just foster better connections between users too.

Update 5-1-2020, 10:30 am EDT: Jason Borenstein is the associate director of the Center for Ethics and Technology at Georgia Tech, not the director as previously stated.
