Duckietown

Duckietown, a.k.a. Probabilistic Mobile Robotics (2.166) at MIT, is a graduate project-based course on autonomous vehicles. Duckietown is a simplified version of the real world (a small version is shown above) in which duckiebots must commute autonomously using a front-facing camera as their only sensor. The duckiebots run on Raspberry Pis. Check out the course website for full details.

Here’s the backstory from the course syllabus:

“Duckietown is a pleasant little town in the sovereign state of Duckieland. You moved to Duckietown last summer, after graduating from MIT. You were following the love of your life. You were working remotely for your friend’s social networking startup. Life was good, for a while. But things didn’t quite work out the way they were supposed to—the startup went south and so did your love life. As the winter begins, you are now single and jobless in Duckietown. In a fortuitous late-night encounter at a karaoke bar, you meet a funny old man and you become best friends over saké. You learn that he is a high-ranking official in the Duckie Party. A couple of weeks later, the Duckieland Ministry of Transportation gives you the task of designing a mobility-on-demand system based on robo-taxis for the entire country of Duckieland. You have to build this system from scratch.”


I worked on visual odometry for system identification and on an AprilTags-based localization system.

System identification is performed to obtain a kinematic model for each robot. An accurate kinematic model is necessary so that the robot can navigate smoothly through areas where the camera, its only sensor, provides little or no information. For example, since the lane-finding algorithm relies on the painted lines on the road, a robot crossing an intersection depends entirely on its kinematic model to dead-reckon into the correct lane. The only data available for system identification, however, are the camera images and the voltage input to each wheel’s motor. We encountered two main challenges: first, the motors, being very inexpensive, showed significantly time-varying behavior, likely related to the motors heating up; and second, the limited computing power of the Raspberry Pi, together with the visual homogeneity of the “road”, meant that odometry measurements could only be derived from lane markings detected in the compressed camera image, resulting in very noisy odometry data.
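To make the dead-reckoning step concrete, here is a minimal differential-drive sketch in Python. The linear voltage-to-wheel-speed model, the `BASELINE` constant, and all function names are illustrative assumptions, not the course's actual implementation; a real identification would also need to cope with the time-varying motor gains mentioned above.

```python
import numpy as np

# Assumed motor model for illustration: wheel speed ~ gain * voltage.
# Real Duckiebot motors drift with temperature, so fitted gains are
# only locally valid and would need periodic re-identification.

BASELINE = 0.1  # distance between the wheels [m] (assumed value)

def fit_wheel_gain(voltages, wheel_speeds):
    """Least-squares fit of speed = gain * voltage for one wheel,
    from arrays of logged voltage commands and measured speeds."""
    v = np.asarray(voltages, dtype=float)
    s = np.asarray(wheel_speeds, dtype=float)
    return float(v.dot(s) / v.dot(v))  # closed-form 1-parameter least squares

def dead_reckon(pose, v_left, v_right, k_left, k_right, dt):
    """Integrate the differential-drive (unicycle) model one time step.

    pose: (x, y, theta); v_*: motor voltages; k_*: fitted gains.
    """
    x, y, theta = pose
    s_l, s_r = k_left * v_left, k_right * v_right  # wheel linear speeds
    v = 0.5 * (s_l + s_r)                          # forward speed
    omega = (s_r - s_l) / BASELINE                 # yaw rate
    return (x + v * np.cos(theta) * dt,
            y + v * np.sin(theta) * dt,
            theta + omega * dt)
```

With gains fitted from logged data, `dead_reckon` would be called at each control step while the robot crosses an intersection and no lane markings are visible.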

The following is the ROS node graph for system identification:

The localization system plays the same role that a GPS signal plays when driving in a city. We placed unique fiducial markers (AprilTags) at each intersection and recorded each tag’s pose with respect to a predefined world frame offline. When a robot sees one of the AprilTags, it measures its pose with respect to the tag and composes it with the tag’s known pose to compute its global pose. My contribution was to define a consistent yet concise representation of the city map and street signs, and to automate the generation of a dictionary of transforms for all the tags in a city; a sketch of how these pieces fit together follows below. This automation made it possible to quickly create a digital representation of any Duckietown layout with arbitrarily placed street signs (AprilTags).
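As an illustration, here is a minimal planar (SE(2)) sketch in Python of a tag dictionary generated from a compact map description, plus the pose composition performed when a tag is detected. The map format, tag poses, and all names (`CITY_MAP`, `global_pose`, etc.) are hypothetical; the project's actual representation and transform handling are not reproduced here.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous SE(2) transform from a planar pose."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

# Hypothetical compact map: tag id -> world-frame pose of the tag.
# In the project this dictionary was generated automatically from
# the city layout rather than written by hand.
CITY_MAP = {
    13: (1.0, 2.0,  np.pi / 2),   # tag 13 at one intersection
    27: (3.5, 2.0, -np.pi / 2),   # tag 27 at another intersection
}

# World->tag transforms for every tag in the city.
TAG_TRANSFORMS = {tag_id: se2(*pose) for tag_id, pose in CITY_MAP.items()}

def global_pose(tag_id, T_tag_robot):
    """Compose the stored world->tag transform with the measured
    tag->robot transform to obtain the robot's global pose."""
    T_world_robot = TAG_TRANSFORMS[tag_id] @ T_tag_robot
    x, y = T_world_robot[0, 2], T_world_robot[1, 2]
    theta = np.arctan2(T_world_robot[1, 0], T_world_robot[0, 0])
    return x, y, theta
```

Keeping the map as a plain id-to-pose dictionary is what makes the automation cheap: regenerating `TAG_TRANSFORMS` for a new city layout is a one-line comprehension over the map description.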


Duckiebot assembly: (photos of the assembly process)