Pepper
Humanoid robot for customer engagement and interaction
Contact for pricing
SoftBank Pepper is a semi-humanoid social robot originally developed by Aldebaran Robotics and manufactured by SoftBank Robotics, first unveiled in 2014. Standing 121 cm and weighing 28 kg, Pepper was designed for human interaction — featuring 20 degrees of freedom, a 10.1-inch chest-mounted touchscreen, four directional microphones, and emotion recognition via cameras and audio. Over 27,000 units were produced before manufacturing paused in 2021. In February 2025, Aldebaran filed for bankruptcy; its assets were acquired by Maxvision Technology Corp. in July 2025.
Released: 2014
Overview
Pepper is SoftBank Robotics' flagship humanoid robot designed to bring social interaction and customer engagement to commercial environments. Launched in 2014 in Japan and expanding globally in subsequent years, Pepper was conceived as the world's first social robot capable of recognizing human emotions and adapting its behavior accordingly. With its friendly appearance, expressive gestures, and interactive touchscreen display, Pepper has become an iconic presence in retail stores, banks, airports, and healthcare facilities worldwide.
The robot stands 1.21 meters tall and moves on an omnidirectional wheeled base, allowing smooth navigation through crowded spaces. Pepper's design prioritizes approachability and engagement, featuring a white humanoid upper body with articulated arms and hands, a tablet interface positioned on its chest for interactive content, and an expressive head capable of various movements. This combination of physical presence and digital interface enables Pepper to serve as both a greeter and information kiosk.
Pepper represents a significant milestone in human-robot interaction, focusing on emotional intelligence rather than task-specific automation. Its ability to detect and respond to human emotions through facial recognition and voice analysis creates more natural and engaging interactions, making it particularly effective in customer-facing roles where human connection matters.
Key Features
- Emotion Recognition: Advanced AI analyzes facial expressions, voice tone, and body language to detect and respond to human emotions in real-time
- Natural Interaction: 20 degrees of freedom enable natural gestures and movements, with articulated arms and hands for expressive communication
- Multi-Sensory Perception: Equipped with 3D camera, dual HD cameras, four directional microphones, touch sensors, sonar, and laser sensors for comprehensive environmental awareness
- Interactive Touchscreen: 10.1-inch tablet interface on chest provides visual content, forms, and interactive applications to supplement verbal communication
- Multilingual Capabilities: Supports multiple languages and can be programmed for various conversational scenarios and cultural contexts
- Cloud Connectivity: WiFi and Ethernet connectivity enable cloud-based AI services, remote management, and continuous software updates
- Long Battery Life: Up to 12 hours of operation on a single charge (roughly 8-10 hours in active use), suitable for full-day commercial deployment
- Open SDK Platform: NAOqi framework allows developers to create custom applications, behaviors, and integrations for specific business needs
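To give a sense of what the NAOqi SDK looks like in practice, here is a minimal Python sketch that connects to a Pepper and speaks a greeting through the ALTextToSpeech service. The robot's IP address is an example; substitute your own unit's address:

```python
def pepper_url(host, port=9559):
    """Build the tcp:// endpoint that NAOqi sessions connect to (9559 is the default port)."""
    return f"tcp://{host}:{port}"

def greet(session, text):
    """Speak a phrase through Pepper's ALTextToSpeech service."""
    tts = session.service("ALTextToSpeech")
    tts.say(text)

if __name__ == "__main__":
    import qi  # NAOqi Python SDK
    session = qi.Session()
    session.connect(pepper_url("192.168.1.10"))  # example robot address
    greet(session, "Hello, welcome!")
```

The same session object is used to reach every other NAOqi service (motion, memory, perception), so most Pepper applications start with exactly this connect-then-get-service pattern.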
Applications
Pepper has been deployed across diverse industries globally, with particularly strong adoption in retail, hospitality, healthcare, and education. In retail environments, Pepper serves as a brand ambassador, product advisor, and queue management assistant, engaging customers with personalized recommendations and entertainment while collecting valuable interaction data. Banks and financial institutions use Pepper to greet customers, provide basic account information, and guide visitors to appropriate services. In healthcare settings, the robot assists with patient check-in and wayfinding, and provides companionship in elderly care facilities.
The education sector has embraced Pepper as a teaching assistant and learning companion, particularly for STEM education and programming instruction. Hotels and airports deploy Pepper for guest services, multilingual information delivery, and self-service check-in processes. Conference centers and exhibitions use the robot to attract visitors, deliver presentations, and create memorable interactive experiences. The robot's versatility and programmability have led to creative applications ranging from autism therapy support to corporate reception assistance, demonstrating its adaptability across various customer engagement scenarios.
Technical Highlights
Pepper's emotion recognition system represents a significant achievement in affective computing, combining computer vision, voice analysis, and machine learning algorithms to assess human emotional states. The robot's perception system integrates multiple sensor modalities including a 3D depth camera for spatial awareness, dual HD cameras for face and gesture recognition, four directional microphones for sound localization and speech recognition, and various proximity sensors for safe navigation. This sensor fusion enables Pepper to maintain situational awareness while simultaneously tracking and engaging with multiple individuals.
The robot's omnidirectional wheeled base provides smooth, stable movement at speeds up to 3 km/h, with advanced obstacle avoidance and autonomous navigation capabilities. Pepper's 20 degrees of freedom are strategically distributed across its body, including mobile head joints, articulated shoulders, elbows, wrists, and hands, plus hip joints that enable expressive upper-body movements. The NAOqi operating system and development framework provide extensive APIs and tools for customization, supporting Python, C++, Java, and JavaScript, with integration capabilities for popular cloud AI services and business systems. This technical foundation has established Pepper as a versatile platform for social robotics research and commercial applications.
The World's Most Recognized Social Robot
Pepper was conceived as a companion and assistant for public-facing environments. Its rounded white plastic body, large LED eyes, and child-like 120 cm stature were deliberately engineered to reduce human discomfort in shared spaces. Pepper launched commercially in Japan in June 2015 at ¥198,000 — the initial 1,000 units sold out in one minute. At peak deployment, Pepper operated in HSBC branches, Carrefour supermarkets, Renault dealerships, Costa cruise ships, French railway stations, hospitals, and hundreds of universities. In February 2025, Aldebaran filed for bankruptcy; Maxvision Technology Corp. acquired its assets in July 2025 with stated plans to continue development for eldercare, education, and security.
Key Technical Features
- 10.1-inch chest-mounted touchscreen tablet
- Emotion recognition via cameras and microphones — detects joy, sadness, anger, surprise
- Facial recognition — can identify and remember returning visitors
- Natural language processing in 15+ languages
- Three omnidirectional wheels for smooth multi-directional movement
- 4 directional microphones for reliable speech recognition
- Depth sensor and 3 cameras (2 HD + 1 3D) for environment awareness
- 6 laser sensors + 2 ultrasonic + 3 bumper sensors for collision avoidance
- Choregraphe visual programming environment accessible to non-programmers
- Python, C++, and Android SDK support
- 12-hour standby / 8-10 hour active use battery life
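The touch sensors listed above surface as events in NAOqi's ALMemory service. A minimal sketch of subscribing to the front head-touch event (the robot address is an example; the callback here just classifies press vs. release):

```python
def on_head_touched(value):
    """ALMemory reports 1.0 when the sensor is pressed and 0.0 when released."""
    return "pressed" if value > 0.5 else "released"

if __name__ == "__main__":
    import qi  # NAOqi Python SDK
    session = qi.Session()
    session.connect("tcp://192.168.1.10:9559")  # example robot address
    memory = session.service("ALMemory")
    # "FrontTactilTouched" fires when the front head touch sensor changes state
    touch = memory.subscriber("FrontTactilTouched")
    touch.signal.connect(lambda value: print(on_head_touched(value)))
    input("Listening for head touches; press Enter to quit.\n")
```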
NAOqi OS and Programming Platform
Pepper runs on NAOqi OS, a Linux-based operating system. The platform supports natural language processing across 15+ languages, basic object recognition, autonomous navigation with laser/sonar obstacle avoidance, and programmable behaviors via Choregraphe (visual drag-and-drop) or code (Python, C++, Android). Thousands of academic publications have used Pepper as a study platform for human-robot interaction. However, Pepper's NLP was built in the pre-LLM era and feels significantly outdated in 2026; some developers have integrated external LLM APIs to improve conversational quality.
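The LLM-bridging pattern mentioned above can be sketched as follows. The HTTP endpoint and its request/response shape are hypothetical placeholders for whatever LLM provider you use; the `trim_for_speech` helper keeps replies short enough for ALTextToSpeech:

```python
import json
import urllib.request

def trim_for_speech(text, max_chars=200):
    """Cut long LLM replies at a sentence boundary so spoken output stays brief."""
    if len(text) <= max_chars:
        return text
    cut = text[:max_chars]
    end = cut.rfind(".")
    return cut[:end + 1] if end > 0 else cut

def llm_reply(user_text, endpoint="http://localhost:8000/v1/chat"):
    """Send the recognized utterance to an external LLM (hypothetical API shape)."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps({"prompt": user_text}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["reply"]

if __name__ == "__main__":
    import qi  # NAOqi Python SDK
    session = qi.Session()
    session.connect("tcp://192.168.1.10:9559")  # example robot address
    tts = session.service("ALTextToSpeech")
    tts.say(trim_for_speech(llm_reply("What can you do?")))
```

In a full integration the user's utterance would come from Pepper's speech recognition (e.g. the "WordRecognized" key in ALMemory) rather than a hard-coded string.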
Real-World Deployments and Best Use Cases
Pepper excels in greeting and wayfinding in retail, hotel lobbies, and events where its physical novelty commands attention; STEM education where Choregraphe is accessible from middle school through university; and academic research where thousands of publications have used Pepper to study HRI, social cognition, and assistive robotics. Documented deployments include HSBC branches, Carrefour supermarkets, Renault car dealerships, Costa cruise ships, French railway stations, hospitals, and universities across Europe, Japan, and the US.
Deployment Evidence
Pepper Production Status — Internal Assessment
SoftBank Robotics paused Pepper production in 2021; Aldebaran entered receivership in February 2025 and its assets were acquired by Maxvision Technology Corp. in July 2025. No new Pepper production had been announced as of 2026.
Deployment Readiness
Integration & SDK
- ROS Support: Supported
- SDK Available: Supported
- API Docs: Available
- Supported Platforms: Linux, ROS 1, Python, C++, Java, JavaScript
Support & Service
Regions
Support details not available
Specifications
Connectivity
| Communication | WiFi, Ethernet |
Mechanics
| Degrees of Freedom | 20 |
| Locomotion | Wheeled |
Performance
| Max Speed | 0.83 m/s |
Physical
| Weight | 28 kg |
| Height | 1210 mm |
Power
| Battery Life | 12 hours |
Sensing
| Sensors | 2 RGB cameras, 3D camera, 4 microphones, touch sensors, sonar, laser, gyro, bumpers |
Software
| SDK Support | Yes |

