Our First Experience with Pepper

by Emotion Robotics on 31/03/2016

If you have missed the recent news, both on robotics channels and in mainstream programmes, you may be wondering who, or what, Pepper is.

Pepper is a robot designed by Aldebaran, part of the robotics arm of SoftBank, a Japanese telecommunications company. What makes Pepper different is that it is a social robot, designed to thrive in environments where it interacts with humans. It has already been used as a sales assistant in various shops in Japan and as an information provider for France’s SNCF railway company, and it has made various appearances at conferences and on television.

To date, Pepper has only been available in Japan and seen in a small number of pilot projects in France. However, we wanted to get a hands-on look at Pepper, and Aldebaran kindly let us visit their Atelier, where we were able to make direct contact with Pepper.


First Impressions

Pepper is 1.2 metres tall, white, with large dark eyes and a pleasant, almost child-like voice. Unlike Nao, Aldebaran’s first and smaller humanoid robot, Pepper has a wheeled base rather than legs. The base uses three omnidirectional wheels in a triangular configuration, allowing movement in any direction. Being larger than Nao, Pepper also has some significant hardware additions, including laser- and sonar-based object detection, a 3D sensor and a tablet on its chest. Pepper also boasts an upgraded processor to handle the data generated by these sensors and by the improved interaction software it uses. Pepper is reported to run for 10 to 12 hours on a single charge; ours ran happily for the whole working day, so we can easily believe this.

Like all of Aldebaran’s robots, Pepper uses their NaoQi robotics software system and can be programmed through their own IDE, Choregraphe, as well as via Python, Java and C++ SDKs. As with their other NaoQi-based robots, the SDKs allow for the distributed processing of tasks where required. Pepper also has access to a number of cloud-based services, covering application and software updates, assisted speech recognition, and the integration of additional services such as IBM’s Watson AI. Anyone familiar with development on Nao will feel quite at home when working on Pepper.
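
To give a flavour of what this looks like in practice, here is a minimal sketch using the NaoQi Python SDK. The robot address is a placeholder you would substitute with your own, and ALTextToSpeech and ALMotion are the standard NaoQi modules for speech and movement; treat it as an illustration rather than a production recipe.

    # Minimal NaoQi Python SDK sketch; "pepper.local" is a placeholder address.
    from naoqi import ALProxy

    PEPPER_IP = "pepper.local"  # substitute your robot's IP or hostname
    PORT = 9559                 # NaoQi's default port

    # ALProxy connects to a named NaoQi module running on the robot.
    tts = ALProxy("ALTextToSpeech", PEPPER_IP, PORT)
    tts.say("Hello, I am Pepper.")

    # The same pattern reaches any other module, for example the motors:
    motion = ALProxy("ALMotion", PEPPER_IP, PORT)
    motion.wakeUp()  # switches the motors on before any movement

The same proxy pattern works across the Java and C++ SDKs too, which is part of what makes moving between Aldebaran’s robots so painless.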

It’s All About Interaction

As you would expect from the description above, Pepper is an impressive device, but it really only comes into its own when you start to interact with it. This is where you really notice the difference between Pepper and the other robots that have been available. Pepper has a number of new ‘skills’ that set it apart and make the interaction experience all the more immersive.

Its larger size and pleasant appearance (I can’t call it cute, as I would Nao) immediately start to change the way you interact with it. The experience with Nao has always been fun, but with Pepper it is more like meeting an adolescent than a child. It feels more like meeting a person, and with social robotics our feelings matter as much as the technology. It is a difficult experience to describe, but two examples spring to mind:

  • While we were discussing some of the things we wanted to try with Pepper, I was writing on a whiteboard in front of my colleagues and five Peppers. At one point I called out to the Pepper nearest me. Yes, Pepper turns and looks at you when you call its name. Now, I had expected the Pepper I was addressing to turn towards me, and I did speak quite loudly. What I hadn’t expected was for all five Peppers in the room to turn and look directly at my face. It was a slightly surreal experience.
  • On another occasion during our stay, I was about 3 metres away from Pepper and I said ‘come here’. Pepper responded with ‘Who, me?’, and when I confirmed, turned and moved to about 1 metre in front of me, again looking directly at me and ready to carry on our interaction.

Both of these experiences may, on the surface, appear somewhat insignificant. However, with the surprisingly fast response times and the natural interaction interface, it felt at once quite normal and somewhat like being in a science fiction film. For the first time in my life I could see my childhood dream of a robot companion starting to become a reality.

One of the key things that brings this experience to life is Pepper’s ability to interact verbally. NaoQi has a dialogue subsystem that enables developers to create interactive, speech-driven content. It may not be perfect yet, but with some thought during the development process, it can be a very natural way to interact with the robot through a guided conversation. Certainly, it is more than adequate for managing interactions about the products and services a business provides, which is currently one of Aldebaran’s key goals in addressing the B2B market. It also helps to develop Pepper’s personality, drawing the person deeper into the interaction experience.
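
For the curious, dialogue content is written in NaoQi’s QiChat language and driven by the ALDialog module. The sketch below loads a tiny, made-up topic from Python; the topic content, subscriber name and robot address are all illustrative.

    # A sketch of the dialogue subsystem: a small QiChat topic loaded through
    # ALDialog. "pepper.local" and the topic itself are illustrative only.
    from naoqi import ALProxy

    dialog = ALProxy("ALDialog", "pepper.local", 9559)
    dialog.setLanguage("English")

    # QiChat pairs user inputs ("u:") with robot replies; concepts group words.
    topic_content = (
        "topic: ~shop_assistant()\n"
        "language: enu\n"
        "concept:(greetings) [hello hi \"good morning\"]\n"
        "u:(~greetings) Hello! Would you like to hear about our products?\n"
        "u:(yes) We have a new range in store today.\n"
    )

    topic_name = dialog.loadTopicContent(topic_content)  # returns topic name
    dialog.activateTopic(topic_name)
    dialog.subscribe("shop_demo")  # start the dialogue engine for this subscriber

Branching the conversation is then a matter of nesting further rules under each reply, which is how the guided conversations mentioned above are built up.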

To support Pepper’s drive to interact, Aldebaran have given it a range of facilities that assist it in engaging with people. Pepper actively looks for people, using both the 3D sensor and the normal 2D cameras (for face detection). Pepper can locate people within configurable ‘engagement zones’, and people in different zones can trigger different reactions from the robot. While we were there, we were able to program Pepper to look for people; if it recognised a human who was too far away to interact with directly, it would beckon the person over. If they came over, they entered the primary ‘engagement zone’ and Pepper would start a conversation with them. During these interactions Pepper can also draw on a range of sensor input that allows it to start to tailor the interaction to the individual. Things such as age, gender and level of happiness are evaluated by the robot and can be factored into its ongoing conversation.
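
As a rough idea of how this looks in code, the sketch below configures the zone boundaries and checks for people in the nearest zone. It assumes the ALEngagementZones module and its ALMemory keys as documented by Aldebaran; the distances and robot address are illustrative.

    # Sketch of configuring and reading engagement zones. A real application
    # would subscribe to the zone events rather than poll as we do here.
    from naoqi import ALProxy

    IP, PORT = "pepper.local", 9559  # placeholder address

    zones = ALProxy("ALEngagementZones", IP, PORT)
    zones.setFirstLimitDistance(1.5)   # metres: edge of the closest zone
    zones.setSecondLimitDistance(2.5)  # metres: edge of the middle zone

    memory = ALProxy("ALMemory", IP, PORT)
    tts = ALProxy("ALTextToSpeech", IP, PORT)

    # Zone occupancy is published to ALMemory (people detection itself is
    # handled by ALPeoplePerception, which feeds the engagement zones).
    people_in_zone1 = memory.getData("EngagementZones/PeopleInZone1")
    if people_in_zone1:
        tts.say("Hello! Come a little closer and I can tell you more.")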

Finally, I must mention the tablet on Pepper’s chest. When I first saw Pepper, I was one of the people who bemoaned its inclusion and thought Pepper would be far better without it. I was wrong. Having interacted with Pepper, I can say that the best experiences were those where the interaction was supported with additional content on the tablet screen. This is particularly important in the B2B environment, where more detailed product specifications or options can be displayed to augment Pepper’s spoken descriptions. Whether it is images, custom web pages or video content, the addition of appropriate content on the tablet is a huge positive.
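
Driving the tablet from code is straightforward. The sketch below pairs a spoken description with a web page on the screen, using Pepper’s ALTabletService module; the URL and robot address are placeholders.

    # Minimal sketch of pairing speech with tablet content via ALTabletService.
    from naoqi import ALProxy

    IP, PORT = "pepper.local", 9559  # placeholder address

    tablet = ALProxy("ALTabletService", IP, PORT)
    tts = ALProxy("ALTextToSpeech", IP, PORT)

    # Show a (hypothetical) product page while Pepper describes it aloud.
    tablet.loadUrl("https://example.com/product-details")
    tablet.showWebview()
    tts.say("Here are the full specifications on my tablet.")

    # showImage and playVideo follow the same pattern for other media types.
    tablet.hideWebview()  # clear the screen when the interaction moves on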

Conclusion

Pepper is one of the first of a new breed of interactive, social robots aimed at the business market. It provides an engaging, interactive experience and has huge potential. But, most of all, it makes you think differently about technology and our use of it. Recently, a friend directed me to a TED talk by Tom Uglow called ‘An Internet without screens might look like this’. Working with Pepper, I started to truly understand Tom’s paradigm-shifting ideas. Even though Pepper still has a screen, interacting with the robot primarily by voice, and only secondarily through the screen, changes the whole experience.

To close, I must confess that I have been careful to refer to Pepper as an ‘it’ throughout this article, something that my colleagues and I failed to do when we spent time with ‘it’: we very quickly found ourselves referring to Pepper as ‘he’ or ‘she’. For me, this says more about the power of Pepper and social robotics than anything else.
