Visual Servoing

Ariel Anders and Sertac Karaman
Summary In this assignment students use Image-Based Visual Servoing to program a mobile robot to park in front of solid-colored objects, such as orange cones. The control of the robot is based entirely on information extracted from a standard monocular camera. The assignment is structured in three parts: getting real-time images from the robot, blob detection, and closed-loop proportional control. Students leverage open source software tools like OpenCV and ROS to implement this reactive planner.
Topics robotics, linear feedback control systems, and computer vision
Audience This assignment has been tailored for audiences ranging from advanced high school students to junior- and senior-level undergraduate students in Engineering and Computer Science.
Difficulty The difficulty of the assignment is highly scalable based on the environment the robot encounters. For example, blob detection for orange cones in a plain environment is much easier to implement than harder detection tasks, such as finding a picture of a cat in a low-lighting environment. Additionally, the proportional feedback controller can be upgraded to include integral and derivative terms.
Strengths
  • Everything we use in the assignment is open source.
  • The difficulty of the assignment is easy to scale for different audiences.
  • The introduction to feedback control is intuitive and easy to visualize.
Weaknesses
  • The open source tools are only reliable with the latest Ubuntu LTS operating system.
  • Requires access to a mobile robot with a camera running the Robot Operating System (ROS).
Dependencies
  • Familiarity with Python or C++ programming
  • Have ROS installed
  • Have enough background with ROS to send wheel commands to their particular robot and subscribe to a ROS topic.
  • No prior experience with computer vision or control is required.
Variants As mentioned in the Difficulty section, this assignment can scale to different challenge levels based on the robot's environment. The task can also be modified for line following or added as a low-level primitive in a higher-level planning task.

Materials

We taught our collegiate and high school courses with the RACECAR, an open source robotic platform. The vehicle includes a Stereolabs ZED stereo camera; however, in these assignments we used only the rectified image. The same code has been tested on other robotic platforms with monocular cameras, such as the Duckiebot and the Willow Garage PR2.

Materials - Collegiate curriculum

Blob Detection and Visual Servoing The collegiate course had a single lab covering both blob detection and visual servoing. The instructions are written at a much higher level than those in the high school curriculum, leaving more of the implementation to the students.
OpenCV Cone Detecting Tutorial This is a website tutorial we provided on detecting cones. After our students completed the visual servoing lab, they faced the more challenging problem of swerving through cones. We provided this tutorial to make sure all students could implement visual servoing and other OpenCV algorithms in Python and C++.
Code for the tutorial

Materials - High school curriculum

Image Passthrough Assignment In this first assignment students create a Python program called echo.py. This simple program receives an image from the robot's camera and republishes it to a ROS topic. Students learn how to use the OpenCV bridge (cv_bridge), which prepares them for the next assignment.
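
A minimal sketch of what echo.py might look like is shown below, assuming a ROS 1 node written with rospy and cv_bridge. The topic names (/camera/rgb/image_rect_color and /echo_image) are placeholders and should be replaced with the topics used on your robot.

    #!/usr/bin/env python
    # echo.py sketch: subscribe to the camera image, convert it with cv_bridge,
    # and republish it unchanged. Later assignments insert processing here.
    import rospy
    from sensor_msgs.msg import Image
    from cv_bridge import CvBridge

    class Echo:
        def __init__(self):
            self.bridge = CvBridge()
            self.pub = rospy.Publisher("/echo_image", Image, queue_size=1)
            rospy.Subscriber("/camera/rgb/image_rect_color", Image, self.on_image)

        def on_image(self, msg):
            # Convert the ROS Image message to an OpenCV BGR array and back.
            frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
            self.pub.publish(self.bridge.cv2_to_imgmsg(frame, encoding="bgr8"))

    if __name__ == "__main__":
        rospy.init_node("echo")
        Echo()
        rospy.spin()
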
Blob Detector Assignment The second assignment covers a simple blob detection algorithm that segments an image based on a desired color, then finds the largest segmented region of the image to return as the blob. Students are encouraged to review other object detection algorithms from OpenCV.
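
The following is a rough sketch of the segmentation step, assuming OpenCV's Python bindings. The HSV bounds are illustrative values for orange and must be tuned for the actual target and lighting; the function name detect_blob is ours, not part of the assignment handout.

    # Blob detection sketch: threshold the image in HSV space and keep the
    # largest connected region as the blob.
    import cv2
    import numpy as np

    def detect_blob(bgr_image):
        hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
        # Rough orange range; tune these bounds for your cones and lighting.
        mask = cv2.inRange(hsv, np.array([5, 150, 100]), np.array([25, 255, 255]))
        # [-2] selects the contour list across OpenCV 2/3/4 return conventions.
        contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                    cv2.CHAIN_APPROX_SIMPLE)[-2]
        if not contours:
            return None
        largest = max(contours, key=cv2.contourArea)
        # Return the bounding box (x, y, width, height) of the largest blob.
        return cv2.boundingRect(largest)
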
Visual Servo Assignment In this last assignment students create a proportional controller that tries to center the detected blob in the middle of the robot's camera image.
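
A minimal sketch of the proportional control step is given below, assuming the robot accepts geometry_msgs/Twist drive commands (the RACECAR and other platforms may use a different drive interface). The gain and forward speed are placeholder values to tune experimentally.

    # Proportional visual servoing sketch: steer so that the blob's center
    # moves toward the horizontal center of the image.
    from geometry_msgs.msg import Twist

    KP = 0.005            # proportional steering gain (placeholder, tune it)
    FORWARD_SPEED = 0.3   # constant forward speed while the blob is visible

    def servo_command(blob_center_x, image_width):
        # Error is the horizontal pixel offset of the blob from the image center.
        error = (image_width / 2.0) - blob_center_x
        cmd = Twist()
        cmd.linear.x = FORWARD_SPEED
        cmd.angular.z = KP * error   # turn toward the blob
        return cmd

For the parking behavior, the forward speed can similarly be reduced as the blob grows in the image, and the integral and derivative terms mentioned under Difficulty would operate on this same error signal.
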
Instructor Solutions We provided these solution and bag files after students had time to work on the assignment:
echo.py
blob_detector.py
moving_blob_test.bag

Previous Courses

The visual servoing lab assignment has been offered in two courses, with another offering expected this Spring semester. We intend to update this site with any changes made to the curriculum. The following are the courses in which we taught this lab exercise:

MIT Course 6.141J/16.405J: Robotics: Science and Systems

The Robotics: Science and Systems course is a technical elective that teaches robotics to undergraduate students throughout MIT. The course features laboratory exercises in mechanical design, control systems implementation, and software development for planning and perception.

We taught this course with the visual servoing lab assignment in the following semesters:

  • Spring 2016
  • Spring 2017 (expected)

MIT Beaver Works Summer Institute 2016

The RACECAR class was offered for high school students. In the summer of 2016, 46 high school students from across the United States attended. In a 4-week residential program, the students learned the foundations of robotics in theory lectures and practiced their skills in hands-on laboratory exercises. The class also included lectures on teamwork and collaboration, as well as seminars from established researchers in the field and experienced entrepreneurs. Students demonstrated their learning in a final course challenge, designing and implementing software for fully autonomous mini race cars.

Contact

We are excited to offer the materials for our course open source on GitHub.