Lab: The Pilot and Localization

From Tekkotsu Wiki

Learning Goal: This lab will teach you how to get the Pilot to use visual landmarks to localize the robot on its world map.

Materials

To perform this lab using a real robot, you will need some panels with landmarks (AprilTags) that the robot can recognize. See the article on the VeeTags Maze for instructions on how to construct the required navigation environment.

You can also run this lab using the Mirage simulator. A simulated VeeTags world is available in /usr/local/Tekkotsu/tools/mirage/worlds/VeeTags.mirage.

Camera Alignment

You can skip this section if you will be using Mirage, or if you're using a robot with a pan/tilt camera mount, such as Calliope. But if you're running on a real (not simulated) Create/ASUS robot, read on.

To estimate the distance to an AprilTag used as a navigation landmark, the Pilot assumes that the center of the tag is exactly 7 inches above the floor. It then uses the height of the tag in the camera image, plus a bit of trigonometry, to calculate the distance. But in order for this scheme to work, the Pilot must know the camera angle. On the Create/ASUS robot, the webcam is built into the netbook and can be tilted at any angle. Therefore, the Pilot assumes a fixed camera angle, and it's your job to tilt the netbook to achieve that angle. We'll call this process "camera alignment".

  1. Mount an AprilTag on a vertical panel, or tape it to a wall, so that the center of the tag is exactly 7 inches above the floor.
  2. Park the robot directly in front of the tag, positioned so that the camera (not the edge of the body) is 1 meter from the tag. Make sure the camera is properly seated on the base plate.
  3. Run the AprilTest demo in Root Control > Framework Demos > Vision Demos > AprilTest
  4. Compare the reported distance to the AprilTag against the actual distance.
  5. Adjust the tilt of the netbook screen and run the demo again, until the reported distance is reasonably close to the actual distance. It doesn't have to be exact.

Localizing With The VeeTags Demo

File:VeeTags.png

The VeeTags demo is built on the PilotDemo class, so it has all the PilotDemo commands. In this section of the lab you will drive the robot around, making it uncertain of its position, and then direct the Pilot to localize using the landmarks it can see plus its built-in map. After localizing, the robot will be more certain of its position.

Begin by following the camera alignment procedure described above. Then set up your robot at a random position and orientation with respect to the VeeTags maze, but with the camera facing some of the markers. If you will be using Mirage instead of a physical robot, start Mirage with the command below:

      Mirage VeeTags.mirage           # if using Mirage instead of a real robot
  1. Start Tekkotsu, and in the ControllerGUI, go to Root Control > Framework Demos > Navigation Demos > VeeTags
  2. Type "msg rand" to the Tekkotsu command line to initialize the particle filter, randomizing the particles because the robot has no idea where it's starting out.
  3. Click on the "W" button in the ControllerGUI to display the world map. Notice that the robot starts out at (0,0) with a heading of 0. But since it does not know its actual location, the particles are distributed randomly.
  4. Type "msg loc" to the Tekkotsu command line to tell the Pilot to localize the robot.
  5. Click the Refresh button for the world map. The robot's position in the world map should now match its position in the physical world (or the Mirage world if you're using Mirage), and the particles should be clustered around that position.
  6. Use the PilotDemo commands to drive the robot around. You can also use the Walk Controller if you wish, but the PilotDemo is designed to use speeds that work best with odometry, so you'll accumulate less error that way.
  7. Refresh the world map and you'll see the particles begin to diverge. The more you drive the robot around, the greater the cumulative error.
  8. Repeat the "msg loc" command and confirm that the robot has again localized itself correctly on the map.
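Conceptually, "msg rand" and "msg loc" manipulate the particle filter's pose hypotheses. The toy sketch below illustrates the idea; the map size, sensor noise model, and function names are invented for the example and are not the Tekkotsu implementation:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <random>
#include <vector>

struct Particle { double x, y, heading, weight; };

// "msg rand": scatter the particles uniformly over the map, because the
// robot has no idea where it is starting out.
std::vector<Particle> randomizeParticles(int n, double mapW, double mapH,
                                         std::mt19937& rng) {
  std::uniform_real_distribution<double> ux(0.0, mapW), uy(0.0, mapH),
      uh(-M_PI, M_PI);
  std::vector<Particle> ps(n);
  for (auto& p : ps) p = {ux(rng), uy(rng), uh(rng), 1.0 / n};
  return ps;
}

// "msg loc": weight each particle by how well it predicts the measured
// distance to a known landmark, then report the best hypothesis.
Particle localize(std::vector<Particle>& ps, double lmX, double lmY,
                  double measuredMm) {
  const double sigma = 100.0;  // assumed 100 mm sensor noise
  for (auto& p : ps) {
    double err = std::hypot(p.x - lmX, p.y - lmY) - measuredMm;
    p.weight = std::exp(-err * err / (2.0 * sigma * sigma));
  }
  return *std::max_element(ps.begin(), ps.end(),
      [](const Particle& a, const Particle& b) { return a.weight < b.weight; });
}
```

A single range measurement only narrows the pose down to a circle around one landmark; the Pilot combines observations of multiple landmarks to pin down both position and heading, which is why the maze provides several tags.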

Building a World Map with the MapBuilder

The VeeTags demo showed you one way to build a world map: by manually constructing the necessary shapes in the buildMap() method. It's also possible to construct a map using the MapBuilder to look for landmarks. This is known as the SLAM (Simultaneous Localization and Mapping) problem. You can demonstrate a simple version of SLAM by following the steps below.

  1. Make a new robot environment by attaching AprilTags to some objects, such as a trash basket or some cardboard boxes. Make sure that the centers of the AprilTags are exactly 7 inches above the floor. Arrange the tags however you like.
  2. Turn the robot so some of the landmarks are in view.
  3. Start the PilotDemo behavior by going to Root Control > Framework Demos > Navigation Demos > PilotDemo.
  4. Type "msg build" on the Tekkotsu console. This will invoke the MapBuilder to look for landmarks and deposit them in the world map.
  5. Click on the "W" button in the ControllerGUI to bring up the world map and verify that the landmarks the robot was looking at are present.
  6. If necessary, turn the robot (using the PilotDemo turn commands) to bring other landmarks into view, then type "msg build" again to incorporate these into the world map as well.
  7. Once the map is complete, drive the robot around for a bit, then use the map to localize.

Including Localization Steps In Your Code

When writing your own state machines, you can ask the Pilot to localize using whatever world landmarks you designate. Sample code is shown below.

At the start of a behavior, Tekkotsu assumes the robot is at (0,0) and facing north. If the robot is actually starting out at an unknown location and needs to localize itself, you will need to randomly distribute the particles so that the particle filter can consider all possible hypotheses. You can do this with the following statement:

particleFilter->resetFilter();   // randomly redistribute the particles