
Robot SAM
This project focused on how people experience and communicate with robots.
The Process
We began by defining the robot's goal. This was limited to delivering a drink to a customer in order to keep the scope clear and achievable.
This task was chosen for its potential to generate interesting interactions. Our design process combined research, a co-creation session, and scenario enactment to understand potential interactions. During the co-creation session, participants drew their ideal robot waiter, which yielded two key insights: they were comfortable taking a drink from a tray, and they consistently depicted the robot with a separate head, body, and wheels. From these findings, we identified three key requirements: (1) the robot should maintain eye contact while serving; (2) it should come across as kind, happy, and professional; (3) users expect a robot to interact differently from a human, so it does not need to mimic human behavior closely. We then brainstormed interaction flows for SAM, incorporating verbal communication, emotional expression, gaze, and movement into the design, which guided the development of the prototype.
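An interaction flow of this kind can be pictured as a simple state machine. The sketch below is illustrative only: the state names and transitions are assumptions chosen to match the drink-delivery scenario described above, not the flow as actually implemented on SAM.

```python
# Illustrative drink-delivery interaction flow as a linear state machine.
# State names and transitions are assumptions for illustration only.
FLOW = {
    "approach":   "greet",       # drive to the customer, direct gaze toward them
    "greet":      "take_order",  # verbal greeting with a happy expression
    "take_order": "deliver",     # listen for the drink request
    "deliver":    "confirm",     # present the tray, maintain eye contact
    "confirm":    "depart",      # wait until the drink is taken from the tray
    "depart":     None,          # thank the customer and drive away
}

def run_flow(start="approach"):
    """Walk the flow from a start state and return the visited states."""
    visited, state = [], start
    while state is not None:
        visited.append(state)
        state = FLOW[state]
    return visited

print(run_flow())
```

Sketching the flow this way made it easy to attach the other modalities (gaze, expression, movement) to individual states.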
My Role
This project covered the design, development, and evaluation of an interaction strategy for social robotics. It helped shape a workflow for this class of interaction problems: gathering user input about appropriate interactions within a given context, ideating on how to shape those interactions, and implementing the result in a working prototype.
For this project, participants' speech had to be captured and processed, after which the robot had to deliver an appropriate response. Google Dialogflow was used to detect intents in the participants' speech and to generate suitable textual responses for the robot. Google's Text-to-Speech API then converted these responses into a vocal reply. The whole pipeline was implemented in a Python-based dialog manager script.
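The shape of that pipeline (utterance in, intent detected, canned response out) can be sketched without the cloud services. The keyword matcher below is a stand-in for the real intent detection, and the intents and responses are invented for illustration, not taken from the project.

```python
# Minimal stand-in for the speech pipeline's intent-detection step.
# The real project used Google Dialogflow; this keyword matcher only
# illustrates the utterance -> intent -> response shape of the script.
INTENT_KEYWORDS = {
    "order_drink": ["drink", "water", "cola", "juice"],
    "greeting":    ["hello", "hi", "hey"],
    "thanks":      ["thanks", "thank you"],
}

RESPONSES = {
    "order_drink": "Of course! I will bring your drink right away.",
    "greeting":    "Hello! I am SAM. What can I get you?",
    "thanks":      "You're welcome! Enjoy your drink.",
    "fallback":    "Sorry, I didn't catch that. Could you repeat it?",
}

def detect_intent(utterance: str) -> str:
    """Return the first intent whose keywords appear in the utterance."""
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(kw in text for kw in keywords):
            return intent
    return "fallback"

def respond(utterance: str) -> str:
    """Map an utterance to the robot's textual response."""
    return RESPONSES[detect_intent(utterance)]
```

In the actual system, the response string would then be handed to the Text-to-Speech step instead of being printed.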
Hardware Design
In this project, a hardware prototype was developed with several modalities connected to a central microcontroller: LED modules for the eyes and mouth, a sensor for detecting when an item is placed on SAM's delivery plate, and two continuous-rotation servo motors for movement. This further improved my skills in Arduino development and hardware design.
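One common way to drive such a microcontroller setup from the Python side is a small serial command protocol. The helper below is a sketch under assumptions: the single-letter opcodes and comma-separated frame format are invented for illustration and are not the actual firmware interface used for SAM.

```python
# Sketch of a host-side helper that formats commands for a
# microcontroller over a serial link. The protocol (single-letter
# opcodes, comma-separated integer arguments, newline-terminated)
# is a hypothetical illustration, not SAM's actual firmware interface.
def make_command(opcode: str, *args: int) -> bytes:
    """Build a newline-terminated ASCII command frame, e.g. b'M,90,90\n'."""
    parts = [opcode] + [str(a) for a in args]
    return (",".join(parts) + "\n").encode("ascii")

# Example frames for the modalities described above:
drive = make_command("M", 90, 90)  # speeds for the two continuous servos
eyes  = make_command("E", 1)       # select an LED eye pattern
# With pyserial this would be sent as: serial.Serial(port).write(drive)
```

Keeping the frames plain ASCII makes the link easy to debug with any serial monitor.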


