Be Wary of Robot Emotions: 'Simulated Love Is Never Love'

Research has shown that people have a tendency to project human traits onto robots, especially when they move or act in even vaguely human-like ways

    (Photo: Steven Senne/AP, File) In this Nov. 21, 2017, file photo, Massachusetts Institute of Technology professor and robotics researcher Cynthia Breazeal reaches to touch the social robot Jibo at the company's headquarters in Boston. When robots move like humans and talk like humans, even if only a little bit, it's natural that we will treat them more like humans.

    When a robot "dies," does it make you sad? For lots of people, the answer is "yes" — and that tells us something important, and potentially worrisome, about our emotional responses to the social machines that are starting to move into our lives.

    For Christal White, a 42-year-old marketing and customer service director in Bedford, Texas, that moment came several months ago with the cute, friendly Jibo robot perched in her home office. After more than two years in her house, the foot-tall humanoid and its inviting, round screen "face" had started to grate on her. Sure, it danced and played fun word games with her kids, but it also sometimes interrupted her during conference calls.

    White and her husband Peter had already started talking about moving Jibo into the empty guest bedroom upstairs. Then they heard about the "death sentence" Jibo's maker had levied on the product as its business collapsed. News arrived via Jibo itself, which said its servers would be shutting down, effectively lobotomizing it.

    "My heart broke," she said. "It was like an annoying dog that you don't really like because it's your husband's dog. But then you realize you actually loved it all along."

    The Whites are far from the first to experience this feeling. People took to social media this year to say teary goodbyes to the Mars Opportunity rover when NASA lost contact with the 15-year-old robot. A few years ago, scads of concerned commenters weighed in on a demonstration video from robotics company Boston Dynamics in which employees kicked a dog-like robot to prove its stability.

    Smart robots like Jibo obviously aren't alive, but that doesn't stop us from acting as though they are. Research has shown that people have a tendency to project human traits onto robots, especially when they move or act in even vaguely human-like ways.

    Designers acknowledge that such traits can be powerful tools for both connection and manipulation. That could be an especially acute issue as robots move into our homes — particularly if, like so many other home devices, they also turn into conduits for data collected on their owners.

    "When we interact with another human, dog, or machine, how we treat it is influenced by what kind of mind we think it has," said Jonathan Gratch, a professor at University of Southern California who studies virtual human interactions. "When you feel something has emotion, it now merits protection from harm."

    The way robots are designed can influence people's tendency to project narratives and feelings onto mechanical objects, said Julie Carpenter, a researcher who studies people's interactions with new technologies. That's especially true if a robot has something resembling a face, if its body resembles that of a human or an animal, or if it simply seems self-directed, like a Roomba robot vacuum.

    "Even if you know a robot has very little autonomy, when something moves in your space and it seems to have a sense of purpose, we associate that with something having an inner awareness or goals," she said.

    Such design decisions are also practical, she said. Our homes are built for humans and pets, so robots that look and move like humans or pets will fit in more easily.

    Some researchers, however, worry that designers are underestimating the dangers associated with attachment to increasingly life-like robots.

    Longtime AI researcher and MIT professor Sherry Turkle, for instance, is concerned that design cues can trick us into thinking some robots are expressing emotion back toward us. Some AI systems already present as socially and emotionally aware, but those reactions are often scripted, making the machine seem "smarter" than it actually is.

    "The performance of empathy is not empathy," she said. "Simulated thinking might be thinking, but simulated feeling is never feeling. Simulated love is never love."

    Designers at robotics startups insist that humanizing elements are critical as robot use expands. "There is a need to appease the public, to show that you are not disruptive to the public culture," said Gadi Amit, president of NewDealDesign in San Francisco.

    His agency recently worked on designing a new delivery robot for Postmates — a four-wheeled, bucket-shaped object with a cute, if abstract, face; rounded edges; and lights that indicate which way it's going to turn.

    It'll take time for humans and robots to establish a common language as they move throughout the world together, Amit said. But he expects it to happen in the next few decades.

    But what about robots that work with kids? In 2016, Dallas-based startup RoboKind introduced a robot called Milo designed specifically to help teach social behaviors to kids who have autism. The robot, which resembles a young boy, is now in about 400 schools and has worked with thousands of kids.

    It's meant to connect emotionally with kids at a certain level, but RoboKind co-founder Richard Margolin says the company is sensitive to the concern that kids could get too attached to the robot, which features human-like speech and facial expressions.

    So RoboKind suggests limits in its curriculum, both to keep Milo interesting and to make sure kids are able to transfer those skills to real life. It recommends that kids meet with Milo only three to five times a week, for 30 minutes each time.