Summary of Team MIT's participation in Amazon Picking Challenge

June 2, 2015



Amazon held the first-ever Amazon Picking Challenge at the ICRA 2015 conference. As the largest online retailer in the world, Amazon operates numerous warehouses stocked with millions of items that must be ready to be packed and shipped at a moment's notice. With ever-increasing demand for online shopping, automating object handling in warehouses will become a necessity. Robots that can autonomously pick desired objects from shelves could assist the human workforce in warehouses and increase their efficiency. The Amazon Picking Challenge was a good platform for research teams from around the world to propose solutions to this interesting problem.

Over 30 companies and research organizations from three continents participated in the preliminary stages of the Amazon Picking Challenge. The most outstanding teams earned the right to compete at the finals at ICRA 2015. The finalists included renowned robotics groups such as Mitsubishi Motors, UC Berkeley, and Georgia Tech, among others.

Team MIT is striving to solve the general problem of autonomously retrieving objects from warehouse-type shelves. With eyes set firmly on that goal, we developed an entry for the Amazon Picking Challenge that has great potential for future real-life application. Team MIT proved the viability of its solution by dexterously and successfully picking the majority of the requested items during the competition, eventually earning second place.

For the Amazon Picking Challenge, Team MIT used an industrial ABB IRB 1600ID robot arm. This robot arm moves not only fast (about 1 meter per second in our competition settings) but also with sub-millimeter precision. The robot has purpose-built internal channels that route all cables and air lines, enabling it to maneuver in tight spaces without risk of pulling on a connector. ABB is seeking to push the boundaries of robotics in warehouse scenarios and lent us the robot arm, along with advice, for the competition.

Team MIT designed custom robot end-effector "fingers" for the competition. They are made from aviation-grade aluminum, which gives us the right compliance and endurance while also being extremely lightweight. At the outermost end of the bottom fingertip is a spatula-like fingernail. With it, the robot can scoop objects from underneath, or grasp objects that are flush against a shelf wall. On the top finger, there is a suction system for picking up items that are hard to grasp. To utilize these multi-functional fingers, we defined seven motion primitives, including grasping, suction-down, scooping, toppling, and push-rotate. The robot autonomously decides which primitive to execute based on the target object's characteristics. The robot motions are planned with the Drake package developed by the Locomotion Group at MIT.
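To make the idea of primitive selection concrete, here is a minimal sketch of how object characteristics might map to one of the motion primitives named above. The property names, thresholds, and rules are illustrative assumptions, not Team MIT's actual decision logic:

```python
def choose_primitive(obj):
    """Map simple object properties to a motion primitive.

    `obj` is a dict of hypothetical object characteristics; all
    thresholds below are illustrative, not the competition values.
    """
    if obj.get("porous"):            # suction cannot seal on porous items
        return "grasp"
    if obj.get("flush_with_wall"):   # use the fingernail to scoop wall-flush items
        return "scoop"
    if obj.get("thin"):              # topple thin items upright before picking
        return "topple"
    if obj.get("graspable_width_mm", 0) > 80:
        return "push-rotate"         # reorient items too wide to grasp directly
    return "suction-down"            # default for smooth-topped items

print(choose_primitive({"porous": True}))           # grasp
print(choose_primitive({"flush_with_wall": True}))  # scoop
print(choose_primitive({}))                         # suction-down
```

In practice the real selector would also weigh the object's pose inside the bin and neighboring clutter, but a rule table of this shape conveys the idea.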



For perception, we statically mounted two Microsoft Kinect2 cameras to the left and right of the robot, and one Intel RealSense camera on the robot arm, close to the gripper. The arm-mounted Intel camera lets us acquire finer depth images of small and/or reflective objects. To classify objects and estimate their poses, we use a software package from a startup, CapSen Robotics. Given pre-processed camera data and instructions on which objects to look for, the software returns the position and orientation of each target object.
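The shape of that perception hand-off can be sketched as follows: sensor data and a target list go in, object poses come out. The `PoseEstimate` type and the `estimate_poses` stub are assumptions for illustration, not CapSen Robotics' real API:

```python
from dataclasses import dataclass

@dataclass
class PoseEstimate:
    name: str
    position: tuple      # (x, y, z) in meters, assumed shelf-bin frame
    orientation: tuple   # quaternion (w, x, y, z)
    confidence: float    # detector confidence in [0, 1]

def estimate_poses(point_cloud, targets):
    """Stand-in for the external pose-estimation call.

    A real system would invoke the vendor library here; this stub
    returns a fixed answer so the interface shape is concrete.
    """
    return [
        PoseEstimate(t, (0.10, 0.05, 0.02), (1.0, 0.0, 0.0, 0.0), 0.75)
        for t in targets
    ]

poses = estimate_poses(point_cloud=None, targets=["crayola_64_ct"])
print(poses[0].name, poses[0].confidence)  # crayola_64_ct 0.75
```

The returned pose is what the motion planner consumes when choosing where and how to place the fingers.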



To orchestrate all the primitives and perception components, we built a heuristic engine that parses incoming product orders, sorts them, and decides which action to execute. Its guiding principle is to maximize reliability. Team MIT is thrilled to have completed the first version of the system and to have proved many of our proposed concepts. Our team members are from the MCube Lab at MIT. We look forward to competing in the Amazon Picking Challenge next year and, more importantly, to developing the technologies necessary for fully automated large-scale warehouses in the near future.
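A reliability-first scheduler of the kind described above can be sketched in a few lines. The per-(item, primitive) success estimates and the simple sort are assumptions standing in for the actual heuristics:

```python
# Assumed success-probability table; real values would come from testing.
SUCCESS_EST = {
    ("duck_toy", "grasp"): 0.9,
    ("glue_bottle", "suction-down"): 0.8,
    ("book", "scoop"): 0.6,
}

def plan(orders):
    """Sort (item, primitive) work orders so the most reliable picks go first."""
    scored = []
    for item, primitive in orders:
        score = SUCCESS_EST.get((item, primitive), 0.5)  # default for unknowns
        scored.append((score, item, primitive))
    scored.sort(reverse=True)
    return [(item, primitive) for _, item, primitive in scored]

orders = [("book", "scoop"), ("duck_toy", "grasp"), ("glue_bottle", "suction-down")]
print(plan(orders))
```

Attempting high-confidence picks first protects the score early in a timed run and defers risky items until the easy ones are banked.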

We hope to see you there!


Our shelf configuration:

Watch our competition video:


Link to our videos

Link to our pictures

Link to our arXiv paper