Carnegie Mellon University Personal Robotics Lab

Information for Undergraduates

The Personal Robotics Lab is always looking for capable, hardworking undergraduates to join the lab. During the academic year, almost all of our undergraduates earn research course credit that can count as elective credit. During the summer, our CMU undergraduates are typically funded through CMU's SURG (Small Undergraduate Research Grants), and some have earned longer-term funding through CMU's SRC-URO (SRC Undergraduate Research Opportunities). Each summer we also host undergraduate interns from other schools through the Robotics Summer Scholars Program.

Undergraduate Projects

Below are a few prospective projects for incoming undergraduates, although we always welcome new ideas!

For more information, please contact Professor Srinivasa (siddh -- cs.cmu.edu).

YCB Tasks

The lab has been collaborating to develop the YCB (Yale-CMU-Berkeley) Object and Model Set to aid in benchmarking robotic tasks. More information can be found in the relevant paper. Tasks include packing groceries, sorting marbles into boxes, and folding clothes. We'd like HERB and ADA to be able to do these tasks! This project would mostly involve programming the robots in our Python framework.
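
For a flavor of what such a task script might look like, here is a minimal grocery-packing sketch. The Robot class, its grasp/place methods, and the item weights are all placeholders for illustration, not the lab's actual framework.

    class Robot:
        """Stub standing in for a real robot interface."""
        def grasp(self, item):
            print("grasping %s" % item)
        def place(self, item, location):
            print("placing %s in the %s" % (item, location))

    def pack_groceries(robot, items, bag="grocery bag"):
        # Pack the heaviest items first so they end up at the bottom.
        for item in sorted(items, key=lambda i: i["weight"], reverse=True):
            robot.grasp(item["name"])
            robot.place(item["name"], bag)

    if __name__ == "__main__":
        items = [
            {"name": "cracker box", "weight": 0.4},
            {"name": "soup can", "weight": 0.35},
            {"name": "mustard bottle", "weight": 0.6},
        ]
        pack_groceries(Robot(), items)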

Natural Language Processing Pipeline

Our assistive arm ADA is currently controlled through a joystick interface. However, many users of the arm have extremely limited mobility and cannot operate a joystick, so the lab is interested in exploring a control method based on natural language processing. Imagine a world where you could simply tell your robotic arm what you wanted it to do!
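
A real pipeline would combine speech recognition with language understanding; as a toy starting point, here is a keyword-matching sketch. The phrase-to-action table and the returned action names are assumptions for illustration.

    ACTIONS = {
        "pick up": "grasp",
        "hand me": "handoff",
        "put down": "place",
    }

    def parse_command(utterance):
        """Map a spoken utterance to an (action, target) pair, or None."""
        utterance = utterance.lower().strip()
        for phrase, action in ACTIONS.items():
            if utterance.startswith(phrase):
                target = utterance[len(phrase):].strip()
                return action, target
        return None

    if __name__ == "__main__":
        print(parse_command("Pick up the water bottle"))
        # -> ('grasp', 'the water bottle')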

Feeding Interface

We have previously worked on automated dining with ADA, our Kinova arm. However, the user currently has no control over which food they eat next! We imagine an easy-to-use GUI that would allow the user to select which food morsel they want to eat. Yum...
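
To make the idea concrete, here is a bare-bones selection screen written with Python's standard Tkinter toolkit. The morsel list and the on_select hook are made up for illustration; the real GUI would be driven by the arm's perception system.

    import tkinter as tk

    MORSELS = ["carrot", "celery", "cantaloupe"]  # illustrative detected items

    def on_select(morsel):
        # In the real system this would hand the choice to ADA's feeding
        # pipeline; here we just print it (hypothetical hook).
        print("Selected:", morsel)

    root = tk.Tk()
    root.title("What would you like next?")
    for morsel in MORSELS:
        tk.Button(root, text=morsel, width=20,
                  command=lambda m=morsel: on_select(m)).pack(padx=10, pady=4)
    root.mainloop()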

Enable Move-Until-Touch Actions for HERB and ADA

To interact with the world, the robots need to be able to touch objects. An important component of this is getting the robot to move until it feels a force, and then stop. We need a simple API for doing this that works across our robots. On HERB, the force can be measured with the force/torque sensor; on ADA, it will have to be estimated from the joint torques.
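
At its core, the API could be a guarded move loop like the sketch below. The read_force and step_forward callbacks are placeholders: on HERB, read_force would wrap the force/torque sensor, while on ADA it would compute an estimate from the joint torques. Hiding both robots behind the same two callbacks is what would make the API robot-agnostic.

    import time

    FORCE_THRESHOLD = 3.0  # newtons; would be tuned per robot and task

    def move_until_touch(read_force, step_forward, timeout=10.0, rate=50.0):
        """Step toward the target until the sensed force exceeds the
        threshold, then stop. Returns True if contact was made."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            if read_force() > FORCE_THRESHOLD:
                return True   # felt a force: stop
            step_forward()    # take one small motion increment
            time.sleep(1.0 / rate)
        return False          # timed out without making contact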

Utilize Human Skeleton Tracking to Enable HRI

HERB needs to be able to interact with humans. We have a skeleton tracker that can follow people as they move around. How should HERB use this information? We should put together a demonstration of HERB interacting with a human (through a gesture, a handoff, or some other interaction) that makes use of the skeleton tracker.
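
One simple demonstration would treat a raised hand as a request for a handoff. The sketch below assumes the tracker reports joints as named 3D points with z pointing up; that message format is our illustration, not the tracker's actual output.

    def hand_raised(skeleton, margin=0.1):
        """True if either hand is at least `margin` meters above the head."""
        head_z = skeleton["head"][2]
        return any(skeleton[joint][2] > head_z + margin
                   for joint in ("left_hand", "right_hand"))

    if __name__ == "__main__":
        person = {"head": (1.0, 0.2, 1.7),
                  "left_hand": (1.1, 0.0, 1.9),
                  "right_hand": (0.9, 0.4, 1.1)}
        if hand_raised(person):
            print("Gesture detected: start the handoff")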