As we become able to build ever more complex robots, it also becomes clear that developing capable robots to serve us is about much more than technology: It’s about designing intelligent machines that we can actually relate to.
It’s certainly inspiring to see Boston Dynamics’ Atlas in action, and truly impressive advances are being made in robotics in recent years, from improved mechanics to new levels of artificial intelligence handling ever more sensory input. But physical capabilities are only part of the equation as robots become a more explicit part of our daily lives.
In fact, several companies are already making human experience and interaction central in their efforts to develop commercial robots that we’ll actually want to make part of our lives. Here are three examples, ranging from simple to advanced, that illustrate how and why – including one we may not intuitively consider a robot.
A delivery robot with personality
A rather simple example of making a robot that humans can relate to is the Savioke Relay. Engineered to perform hotel delivery services, the robot is basically a cylindrical container on wheels with a touch screen for simple interactions.
Expecting that being serviced by a robot might seem impersonal to hotel guests, the developers decided to address the user experience by adding some personality to the Relay. Using light, sound and movement, along with simple interactions, they managed to add various ‘happy gestures’ and even make the robot seem charming.
This helped create a more personal experience for the hotel guests, making them more comfortable at being serviced by a robot. Read more about how the team achieved it at Fast Company.
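As a thought experiment, the approach of composing a gesture from light, sound, and movement primitives could be sketched as below. All names and actions here are invented for illustration; this is not Savioke’s actual software, just a minimal sketch of the idea.

```python
# Hypothetical sketch: composing a "happy gesture" from simple
# light, sound, and movement primitives. All names are invented
# for illustration -- this is not Savioke's actual API.

def happy_gesture():
    """Return the sequence of primitive actions for a celebratory gesture."""
    return [
        ("light", "pulse_green"),   # soft green pulse on a status ring
        ("sound", "chirp_up"),      # short rising chirp
        ("move", "wiggle"),         # small left-right body wiggle
    ]

def perform(gesture, execute):
    # Run each (channel, action) pair through the robot's executor.
    for channel, action in gesture:
        execute(channel, action)

# Example: log the actions instead of driving real hardware.
log = []
perform(happy_gesture(), lambda ch, act: log.append(f"{ch}:{act}"))
```

Keeping each channel as a simple named primitive is what lets designers, not just engineers, tune how the robot comes across.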
Pepper is designed to make you feel good
One of the most sophisticated humanoid robots currently on the market is ‘Pepper’. Already deployed in several places around the world, Pepper greets and guides visitors at stores, gives directions to travelers at train stations – or even acts as a social robot in private homes.
At 120 cm tall, with a face designed to resemble that of an older child, Pepper is deliberately made to look friendly and innocent. The robot is designed for interaction with humans, either via its chest-mounted touch screen or by voice – or even by exchanging hand gestures such as ‘fist bumps’.
Despite its relatively simple appearance, Pepper is based on rather advanced technology – including sensors, AI, and a pair of fairly complex arms – that makes the robot able to recognize and respond to emotions.
Adjusting for human imperfection
Though we may not think of them that way, self-driving cars are some of the most advanced robots about to become part of our everyday life. From a technology perspective, one of the primary challenges with autonomous cars is making them better at detecting and reacting to human road users.
Human road users share an understanding, an intuition, and a range of small behavioral cues. Self-driving cars don’t share these, nor are they able to replicate those behaviors and cues. And if self-driving cars act against our expectations, we’re not likely to trust them – whether as passengers or as users of the same roads.
From a design perspective, the solution may be to make autonomous cars act less like robots and more like human drivers. That could include positioning on the road, driving style, or even exceeding speed limits so that human road users can better relate. This requires even better technology and roadside equipment – but also a lot of data to teach driving systems how to drive like humans.
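One way to picture “driving more like a human” is to let the car’s target speed drift toward the observed traffic flow rather than rigidly holding the posted limit. The sketch below is purely illustrative – the function name, the blending weight, and the numbers are all assumptions, not any manufacturer’s actual control logic.

```python
# Hypothetical sketch: nudging an autonomous car's target speed toward
# the surrounding traffic flow instead of rigidly holding the speed limit.
# The blending weight and all names are invented for illustration.

def target_speed(speed_limit_kmh, neighbor_speeds_kmh, blend=0.5):
    """Blend the posted limit with the observed average traffic speed."""
    if not neighbor_speeds_kmh:
        # No surrounding traffic observed: simply follow the posted limit.
        return float(speed_limit_kmh)
    traffic = sum(neighbor_speeds_kmh) / len(neighbor_speeds_kmh)
    return (1 - blend) * speed_limit_kmh + blend * traffic

# On a 50 km/h road where nearby cars average 58 km/h,
# the car settles between the limit and the flow.
print(target_speed(50, [56, 58, 60]))  # → 54.0
```

Even a toy rule like this shows why data matters: the blending behavior only feels natural if it is tuned against how humans actually drive on each kind of road.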
Designing robots – and for robots
These three examples illustrate how we need to think beyond technology to develop robots that function better with and alongside humans. In short, robots don’t just need to be engineered; they need to be designed so that we can relate to them.
But should we also start considering if we need to design for robots? With robots soon becoming an entirely new group of users, maybe we shouldn’t just develop products, buildings, and cities around humans in the future.
Post by Attention Group, 4 May 2016