I am a research specialist at the Interactive Robotics Group, MIT, supervised by Prof. Julie Shah. My research focuses on developing planning and learning algorithms to enable robots to safely and efficiently assist and collaborate with humans on physical tasks.
Previously, I obtained my M.S. in Robotics at the Robotics Institute, CMU, where I was a research assistant at the Personal Robotics Lab (now at UW), co-advised by Prof. Siddhartha Srinivasa and Prof. Stephanie Rosenthal. I also worked with Prof. Katia Sycara in summer 2017. I obtained my B.S. degrees in Computer Science and Psychology from Penn State.
From the perspective of robots, I develop planning and learning algorithms that enable a robot to physically interact with humans. From the perspective of humans, I draw inspiration from psychology and investigate the understandability, predictability, and trustworthiness of robot behaviors.
We are investigating planning under uncertainty about the latent states in human models. We constructed a POMDP and trained a human model at the motion level rather than the task level to synchronize robot and human motion speeds. We augmented motion primitives into macro actions with extended durations to improve motion smoothness under kinodynamic constraints, and modeled gripper actions as macro actions to capture their durations precisely. We mitigated the computational overhead of these fine-grained motions via adaptive resolutions for the state and action spaces.
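As a minimal illustrative sketch (not the actual planner), the macro-action idea can be expressed as bundling motion primitives, each with its own duration, into a single POMDP action whose total duration the planner can model precisely; all names and durations below are made up for illustration.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Primitive:
    """One fine-grained motion step with an explicit duration (seconds)."""
    name: str
    duration: float

@dataclass
class MacroAction:
    """A sequence of primitives executed as one POMDP action."""
    primitives: List[Primitive]

    @property
    def duration(self) -> float:
        # The macro action's duration is the sum of its primitive
        # durations, so the planner can reason about timing precisely.
        return sum(p.duration for p in self.primitives)

# Example: a "grasp" macro action with an extended-duration gripper close.
grasp = MacroAction([
    Primitive("approach", 0.5),
    Primitive("close_gripper", 1.2),
    Primitive("lift", 0.5),
])
print(grasp.duration)  # 2.2
```

Planning over macro actions like this shrinks the effective horizon: one decision covers several primitive steps, which is one way to keep fine-grained motion tractable.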
We developed a motion planner that enables a robot arm to avoid fast-moving obstacles by searching in real time for a safe trajectory with minimal execution time, given time-parameterized motion predictions of the obstacles. Our algorithm drastically reduces the search space while reasoning about novel obstacle-avoidance strategies in the robot configuration space and the time domain. (Submitted to RA-L/ICRA 2019)
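A toy 2D sketch of the underlying selection problem (illustrative only, not our algorithm): given a time-parameterized prediction of a moving obstacle, keep only candidate trajectories that maintain clearance at every sampled time, then pick the one with minimal execution time. The obstacle model, clearance, and candidates are all assumptions.

```python
import math

def obstacle_at(t):
    # Assumed prediction: obstacle moves left along y = 1 at constant speed.
    return (2.0 - t, 1.0)

def safe(traj, horizon, clearance=0.3, dt=0.05):
    """traj(t) -> (x, y); True if clearance holds for all sampled t."""
    t = 0.0
    while t <= horizon:
        ox, oy = obstacle_at(t)
        x, y = traj(t)
        if math.hypot(x - ox, y - oy) < clearance:
            return False
        t += dt
    return True

# Candidate trajectories paired with their execution times.
candidates = [
    (1.0, lambda t: (t, 1.0)),        # fast, but meets the obstacle head-on
    (2.0, lambda t: (0.5 * t, 0.0)),  # slower detour below the obstacle
]
# Minimal execution time among the safe candidates.
best = min((T for T, f in candidates if safe(f, T)), default=None)
```

Here the head-on candidate is rejected and `best` is the detour's 2.0 s; the real planner searches this space directly rather than enumerating candidates.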
We developed a human motion predictor for human-robot collaborative navigation. The system recognizes the goal the human is currently heading toward and predicts their time-parameterized motion in the near future. The motion planner then uses the predicted human trajectory to plan a robot trajectory that is optimal in space and time, avoiding the human while reaching the robot's goal.
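A minimal sketch of the two stages, under strong simplifying assumptions (not the paper's model): goal recognition scores each candidate goal by how well the person's velocity points toward it, and prediction rolls out a constant-speed straight line toward the inferred goal. Goal locations and the sharpness constant are invented for illustration.

```python
import math

GOALS = {"door": (5.0, 0.0), "desk": (0.0, 5.0)}  # assumed goal locations

def goal_likelihoods(pos, vel):
    """Normalized score per goal: higher when vel points at the goal."""
    scores = {}
    for name, (gx, gy) in GOALS.items():
        dx, dy = gx - pos[0], gy - pos[1]
        norm = math.hypot(dx, dy) * math.hypot(*vel) or 1e-9
        cos = (dx * vel[0] + dy * vel[1]) / norm
        scores[name] = math.exp(4.0 * cos)  # soft-max-style sharpening
    z = sum(scores.values())
    return {k: v / z for k, v in scores.items()}

def predict(pos, speed, goal, horizon, dt=0.5):
    """Time-parameterized rollout: list of (t, (x, y)) toward the goal."""
    gx, gy = GOALS[goal]
    x, y = pos
    traj = []
    for i in range(int(horizon / dt)):
        d = math.hypot(gx - x, gy - y)
        step = min(speed * dt, d)
        if d > 1e-9:
            x += step * (gx - x) / d
            y += step * (gy - y) / d
        traj.append(((i + 1) * dt, (x, y)))
    return traj

probs = goal_likelihoods((1.0, 1.0), (1.0, 0.0))  # walking toward the door
path = predict((1.0, 1.0), 1.0, "door", 2.0)
```

The downstream planner can then treat each predicted `(t, (x, y))` sample as a moving constraint when optimizing the robot's own trajectory in space and time.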
We developed Fast Online Segmentation of Activities from Partial Trajectories (FOSAPT) to recognize and segment human activities in real time. We integrated navigation, manipulation, and grasping planners with FOSAPT to enable a robot to fetch and deliver the correct parts to humans at the appropriate times in factory settings. We successfully demonstrated the system at a Honda manufacturing plant in Marysville, OH. (Submitted to RA-L/ICRA 2019) Video: YouTube Link
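To give a flavor of classifying from partial trajectories (a toy sketch, not FOSAPT itself): compare the observed prefix of a trajectory against per-activity templates and commit to a label only once one template is clearly closest. The templates, margin, and distance measure are all illustrative assumptions.

```python
import math

TEMPLATES = {  # assumed nominal hand trajectories for two reach activities
    "reach_left":  [(0.0, 0.0), (-0.2, 0.1), (-0.4, 0.2), (-0.6, 0.3)],
    "reach_right": [(0.0, 0.0), (0.2, 0.1), (0.4, 0.2), (0.6, 0.3)],
}

def prefix_distance(observed, template):
    """Mean pointwise distance between the observed prefix and a template."""
    n = min(len(observed), len(template))
    return sum(math.dist(observed[i], template[i]) for i in range(n)) / n

def classify_online(observed, margin=0.05):
    """Return a label once one template beats the rest by a clear margin."""
    scores = sorted((prefix_distance(observed, t), name)
                    for name, t in TEMPLATES.items())
    best, second = scores[0], scores[1]
    return best[1] if second[0] - best[0] > margin else None

# Only the first three points have been observed so far.
partial = [(0.0, 0.0), (0.21, 0.08), (0.39, 0.22)]
label = classify_online(partial)  # -> "reach_right"
```

Committing early on partial observations is what lets the robot start fetching the next part before the human finishes the current activity.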
In a GridWorld navigation scenario with roads, grass, rocks, trees, and humans, we found that critical points in demonstrations of robot paths shaped humans' understanding of robot cost functions and their predictions of future robot paths. (RO-MAN'17)
"The robot is avoiding rocks."
In a tabletop manipulation scenario, we investigated enabling a robot to convey its intention to pick up a block from a table to human collaborators using referring expressions, with the goal of improving the understandability of robot behaviors. We crowdsourced a corpus of referring expressions in a user study in which participants instructed a partner sitting across the table to pick up a block. (IJRR'18, RO-MAN'16, R:SS WS'16)
We extracted visual and spatial attributes, along with their visual and linguistic saliences, from the corpus and applied them in a referring-expression generation algorithm, which we further expedited via pruning and heuristics. (M.S. Thesis'17)
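A toy sketch of salience-ordered attribute selection (in the spirit of incremental referring-expression generation, not our exact algorithm): try attributes in descending salience and keep each one that rules out distractors, stopping once only the target matches. The scene, attributes, and salience order below are invented for illustration.

```python
BLOCKS = [
    {"id": 1, "color": "yellow", "size": "small", "position": "left"},
    {"id": 2, "color": "orange", "size": "small", "position": "left"},
    {"id": 3, "color": "orange", "size": "large", "position": "right"},
]

# Attributes tried in descending (assumed) salience: color first.
SALIENCE_ORDER = ["color", "size", "position"]

def describe(target, scene):
    """Accumulate attributes until only the target matches them."""
    distractors = [b for b in scene if b["id"] != target["id"]]
    chosen = {}
    for attr in SALIENCE_ORDER:
        value = target[attr]
        ruled_out = [d for d in distractors if d[attr] != value]
        if ruled_out:  # this attribute has discriminatory power; keep it
            chosen[attr] = value
            distractors = [d for d in distractors if d[attr] == value]
        if not distractors:
            break
    return chosen

print(describe(BLOCKS[0], BLOCKS))  # {'color': 'yellow'}
```

Trying high-salience attributes first tends to yield short, natural descriptions ("the yellow block") and only falls back to spatial terms when color and size fail to disambiguate.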
We demonstrated our system on "Ada" (a Kinova Mico arm), enabling it to specify the target blocks in natural language before picking them up. Video: YouTube Link
"Pick up the yellow block."
"Pick up the leftmost orange block from your perspective."
We built "Red Dragon" for the Trinity College Fire Fighting Home Robot Contest at Penn State!
I am working on "Robbie & Yuri" at MIT!
I worked on "Herb" (Home Exploring Robotic Butler) at CMU!
I am working on "Abbie" at MIT!