Lab: The Pilot and Odometry

This lab will introduce you to the Pilot, a member of the Tekkotsu "Crew" responsible for moving the robot's body and keeping track of the robot's location.

Using the Pilot to Move

The simplest way to move the robot is to give it a distance to travel:

  • forward distance dx in millimeters
  • sideways distance dy in millimeters (positive is leftward; only possible for robots with legs or omnidirectional wheels; not Create or Calliope)
  • angular distance da in radians (positive is counterclockwise, i.e., a left turn)

To invoke the Pilot with a distance to travel, use a PilotNode. Try the example code below:

#include "Behaviors/StateMachine.h"

$nodeclass PilotLab1 : VisualRoutinesStateNode {

  $nodeclass Forward500 : PilotNode(PilotTypes::walk) : doStart {
    pilotreq.dx = 500;     // forward 500 millimeters
  }

  $nodeclass Left90 : PilotNode(PilotTypes::walk) : doStart {
    pilotreq.da = M_PI/2;  // left 90 degrees
  }

  $setupmachine {
    Forward500 =C=> Left90 =C=> SpeechNode("done")
  }

}

REGISTER_BEHAVIOR(PilotLab1);

The PilotNode contains a member named pilotreq, an instance of the PilotRequest class, that tells the Pilot what you want it to do. There are many options; a walk request, which moves a fixed distance determined by dx, dy, and da, is one of the simplest Pilot operations.
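
Sideways displacement is requested the same way, on robots that support it. The fragment below is a minimal sketch (the node name SidestepLeft is just illustrative) of a request that sets dy:

  $nodeclass SidestepLeft : PilotNode(PilotTypes::walk) : doStart {
    pilotreq.dy = 200;   // sidestep 200 millimeters to the left
                         // (only meaningful on robots with legs or omnidirectional wheels)
  }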

Exercises: These exercises can be done either on a real robot or using the Mirage simulator. Note: the simulated robot's motion might not exactly match the real robot's performance due to issues with the simulated physics.

  1. Measuring accuracy: Place a strip of masking tape or duct tape 600 millimeters long on the floor. (If running in Mirage, you can use the fact that the large tiles making up the floor are each 1 meter by 1 meter.) Start your robot at one end of the tape, facing the other end. Run the PilotLab1 behavior on your robot. Using a measuring tape and a protractor, estimate the robot's actual motion. Did it travel exactly 500 millimeters forward and turn exactly 90 degrees?
  2. Measuring repeatability: Does the robot do the same thing every time? Collect data from four more runs of PilotLab1 and calculate the mean and standard deviation for the actual dx and da from your measurements. If using Mirage, type "r" in the Mirage window to reload the world and put the robot back at the origin.
  3. If you know how to make Mirage worlds, create a world with measuring lines on the floor emanating from the origin, to make it easy to visually determine the robot's actual motion.

Odometry

"Odometry" means measurement of distance traveled. Some robots have built in odometry; one way to do that for wheeled robots is to put shaft encoders on the axles. Other robots do odometry by ded reckoning, which means integrating the requested velocity over time. Odometry is vulnerable to several sources of inaccuracy. For example, the robot's actual velocity may not always match its requested velocity because it can take some time to accelerate and decelerate. Encoders can help correct for that. But encoders cannot detect wheel or foot slip, and that can be a major source of error when a robot is moving on a slippery surface. Nonetheless, odometry is valuable for robot navigation, especially when combined with other sources of information. The Pilot automatically performs odometry for you.

Exercises:

  1. Observing odometry: Launch the world space SketchGUI by clicking on the "W" button in the "sketch" row of the ControllerGUI. Start the PilotLab1 behavior. Then click the Refresh button in the worldGUI window. The Agent shape, shown by a blue triangle in the worldView window, indicates the robot. Continue clicking Refresh and watch the robot's position and heading update on the map.
  2. Accuracy of odometry: When the robot has stopped moving, read its final estimate of its position and heading as shown by the Agent shape's coordinates. How closely does this match your measurements of the robot's actual motion?
  3. Hardware odometry on the Create: If you're using a Create or Calliope robot, you can monitor its odometry sensors using the Sensor Observer. In the ControllerGUI window, go to Root Control > Status Reports > Sensor Observer > Sensors, and check the boxes for Distance and Angle. Then click the Back button, and check the box for Console Output. You will see a stream of sensor values displayed on the Tekkotsu console. Run the PilotLab1 behavior and watch the values change. Distance is in millimeters; Angle is in degrees rather than radians.

The TagCourse Demo

The TagCourse demo uses the Pilot to run the robot through a twisty course called the TagCourse Maze. The course is defined by pairs of AprilTags. You can build your own TagCourse Maze using five cardboard boxes, or use the simulated maze supplied with Mirage.

  • Run the TagCourse demo by going to Root Control > Framework Demos > Navigation Demos > TagCourse. If you don't have a physical tag course, use Mirage with the built-in TagCourse world.
  • Notice that the robot can usually run the course successfully, but position and heading error gradually accumulate, so it generally doesn't end up perfectly positioned over the red destination circle. If the error is worse than usual on a run, it may even hit one of the boxes, or fail to find the next set of tags.
  • While the robot runs the course, monitor its position on the world map. By the end of the run, you will see that the particle cloud is quite a bit more diffuse than when it started.
  • Look at the source file for the TagCourse demo to see how the Pilot is used. You can find the source in /usr/local/Tekkotsu/Behaviors/Demos/Navigation/TagCourse.cc.fsm

Modeling Error Accumulation

The farther the robot travels, the less accurate its position and heading estimate, due to accumulated error. The Pilot's particle filter, which is used for localization, models this error by adding small random perturbations to its particles as the robot moves. You can watch the particles diverge as the robot executes a lengthy trajectory. Let's add a loop to our previous behavior and have the robot travel 1 meter at a time. We'll call the new version PilotLab2.

#include "Behaviors/StateMachine.h"

$nodeclass PilotLab2 : VisualRoutinesStateNode {

  $nodeclass Forward1000 : PilotNode(PilotTypes::walk) : doStart {
    pilotreq.dx = 1000;
  }

  $nodeclass Left90 : PilotNode(PilotTypes::walk) : doStart {
    pilotreq.da = M_PI/2;
  }

  $setupmachine {
    loop: 
      Forward1000 =C=> Left90 =C=>
      Forward1000 =C=> Left90 =C=>
      Forward1000 =C=> Left90 =C=>
      Forward1000 =C=> Left90 =C=>
      SpeechNode("Type 'msg' to continue") =TM=> loop
  }

}

REGISTER_BEHAVIOR(PilotLab2);

The =TM=> above is a text message transition: the robot will wait until you type a message on the console before returning to the node we've labeled loop. In this case we don't care what the message is, so there are no parameters to the TM transition.
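
If you do care about the message text, the shorthand can also take a string argument so that the transition fires only when that particular message is typed. A one-line sketch (assuming the string form of the text message transition):

      SpeechNode("Type 'go' to continue") =TM("go")=> loop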

Exercises:

  1. Divergence of particles: Line your robot up on the tape mark as before. Run the PilotLab2 behavior. Click the refresh button in the worldGUI window every few seconds to watch the particles move. After four forward/turn sequences, the robot should be back where it started, if it moved without error. How well did it do? How are the particles distributed now? Type 'msg' to the Tekkotsu command prompt to tell the behavior to repeat the loop.
  2. Estimating the divergence: All the particles initially start out at (0,0). As the robot moves, they form an expanding cloud approximately centered on the robot's estimated location. Find the coordinates of the particles farthest from the robot and use this to estimate the diameter of the cloud. (Putting the mouse over a particle will reveal its coordinates in the bottom left corner of the worldView window.)
  3. Growth of divergence: Make a table showing the diameter of the particle cloud after each of 8 loops of PilotLab2. How would you describe the rate of divergence?
  4. Changing the number of particles displayed: The particle filter starts out with 2000 particles, but by default it only displays 50 at a time. Change this to 100 by setting the displayParticles field of pilotreq (see the sketch after this list) and run the PilotLab2 behavior again. Do the results look any different? Was 50 too many? What is the minimum number of particles you would recommend displaying to still get reasonable results?
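
For exercise 4, the change is a single extra line in each Pilot request; a minimal sketch using the displayParticles field named in the exercise:

  $nodeclass Forward1000 : PilotNode(PilotTypes::walk) : doStart {
    pilotreq.dx = 1000;
    pilotreq.displayParticles = 100;   // show 100 particles instead of the default 50
  }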

At this point, all the particles have a weight of 0.0. (To see the weights, change displayParticles to displayIndividualParticles in your pilot request.) The weight indicates the particle filter's confidence in that particle's estimate of the robot's position. In the absence of sensory information, such as that provided by visual landmarks, all the estimates are equally plausible. In the next lab you will learn how the Pilot can use landmarks to help the particle filter evaluate its particles.

Reading the Robot's Position

The robot's position and orientation on the world map are maintained as properties of the Agent shape, accessible via the global variable theAgent. You can query the robot's position using the getCentroid() method, which returns a Point. The getOrientation() method returns an AngTwoPi, an angle normalized to lie between zero and two pi.

The example code below instructs the robot to move forward by 500 millimeters and then checks the robot's position to determine the distance traveled. Note that this may not be the actual distance traveled; it is the distance estimated by whatever odometry mechanism the robot is using.

#include "Behaviors/StateMachine.h"

$nodeclass MeasureDistance : VisualRoutinesStateNode {

  $provide Point initialPosition;
  $provide AngTwoPi initialOrientation;

  $nodeclass Move : PilotNode(PilotTypes::walk) : doStart {
    $reference MeasureDistance::initialPosition, MeasureDistance::initialOrientation;
    initialPosition = theAgent->getCentroid();
    initialOrientation = theAgent->getOrientation();
    cout << "Robot starts out at " << initialPosition << endl;
    pilotreq.dx = 500;
  }

  $nodeclass Report : VisualRoutinesStateNode : doStart {
    $reference MeasureDistance::initialPosition, MeasureDistance::initialOrientation;
    Point finalPosition = theAgent->getCentroid();
    AngTwoPi finalOrientation = theAgent->getOrientation();
    cout << "Robot stops at " << finalPosition << endl;
    cout << "Estimated distance traveled = " 
         << (finalPosition-initialPosition).xyNorm() << " mm" << endl;
    cout << "Orientation drift = "
         << float(finalOrientation-initialOrientation)*180/M_PI << " degrees" << endl;
  }

  $setupmachine {
    Move =C=> Report
  }

}

REGISTER_BEHAVIOR(MeasureDistance);

Collision Detection

The Pilot will do collision detection using whatever resources it has available. By default, it will terminate the walk request and report the collision via a PilotEvent; you can request different behavior by changing the value of the collisionAction element of the PilotRequest.

Your state machine can check for collisions by using a =PILOT(collisionDetected)=> transition as shown in PilotLab3 below. Note: you cannot use a =C=> transition with a PilotNode if you are using any =PILOT=> transitions. A =PILOT=> transition with no argument acts as a "default" case that will fire only if no other transition fires; thus the ordering of the two =PILOT=> transitions in the example below is unimportant.

#include "Behaviors/StateMachine.h"

$nodeclass PilotLab3 : VisualRoutinesStateNode {

  $nodeclass Forward500 : PilotNode(PilotTypes::walk) : doStart {
    pilotreq.dx = 500;
    pilotreq.forwardSpeed = 100;    // speed 100 millimeters/second
  }

  $nodeclass Backup : PilotNode(PilotTypes::walk) : doStart {
    pilotreq.dx = -100;             // negative displacement means back up
    pilotreq.forwardSpeed = 30;     // speeds are always non-negative
  }

  $setupmachine {
    forward: Forward500

    forward =PILOT=> SpeechNode("done")

    forward =PILOT(collisionDetected)=>
      SpeechNode("Ouch! I hit something.") =C=> Backup
  }

}

REGISTER_BEHAVIOR(PilotLab3);

If a collision occurs during the square loop behavior of PilotLab2, that PilotNode will stop and report the collision, but the transition to the next PilotNode will still fire, so the robot will immediately start moving again. If you only want the transition to fire if no error occurs, you can specify noError as the argument to the transition:

     Forward1000 =PILOT(noError)=> Left90 =PILOT(noError)=> ...
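
For example, the first leg of the square loop could be wired up in the same style as PilotLab3, continuing only on a clean finish and branching to a complaint otherwise. This is just a sketch of one way to lay out the transitions:

  $setupmachine {
    forward: Forward1000
    turn: Left90

    forward =PILOT(noError)=> turn
    forward =PILOT=> SpeechNode("trouble moving forward")

    turn =PILOT(noError)=> SpeechNode("done")
    turn =PILOT=> SpeechNode("trouble turning")
  }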

Exercises:

  1. Try running PilotLab3 with no obstacles in front of the robot. Then run it again with the robot facing a wall.
  2. The Create and Calliope robots use the Create's left and right bump sensors for collision detection. In the ControllerGUI, go to Root Control > Status Reports > Sensor Observer > Buttons and check the LBump and RBump boxes. Then click the Back button and check the Console Output box. Notice that when the robot isn't touching anything, both bump sensors read zero. Use PilotLab3 to run the robot into a wall and see how the Sensor Observer output changes.
  3. The Create base has no bump switches in back, so the Pilot uses the wheel "over current" sensors to detect that the robot is pushing against something when backing up. Demonstrate this using the Event Logger.
  4. Suppose you want the robot to push a box by running into it. In this case a collision means it contacted the box and should keep going. Read the documentation for PilotRequest::collisionAction to learn how to tell the Pilot not to stop when it detects a collision.
  5. Write a behavior that walks straight ahead until it makes contact with a box, and then pushes the box forward by 200 millimeters. If it fails to make contact with a box after traveling for a meter, it should give up and complain.

Advanced Topics

What can the Pilot do besides walk? Read the documentation for the PilotRequestType_t enumerated type to find out.

The particles diverge during locomotion because the accumulated noise means they are undergoing a random walk. Consider an insect that walks at a rate of 1 millimeter per step. If the insect changes direction randomly with each step, how far will it end up from the origin, on average, after N steps? Do a web search on "random walks" to research this.
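
If you'd rather measure than derive, the small Monte Carlo sketch below (plain C++, independent of Tekkotsu) estimates the mean final distance for a given number of steps; try a few values of N and compare how the result grows with N:

#include <cmath>
#include <cstdio>
#include <random>

// Monte Carlo estimate of the mean distance from the origin after N
// unit-length (1 mm) steps, each in a uniformly random direction.
int main() {
  const int N = 1000;         // steps per walk, 1 mm each
  const int trials = 10000;   // number of simulated walks
  std::mt19937 rng(42);
  std::uniform_real_distribution<double> angle(0, 2 * M_PI);

  double total = 0;
  for (int t = 0; t < trials; ++t) {
    double x = 0, y = 0;
    for (int i = 0; i < N; ++i) {
      double a = angle(rng);
      x += std::cos(a);
      y += std::sin(a);
    }
    total += std::sqrt(x * x + y * y);
  }
  std::printf("Mean distance after %d steps: %.1f mm\n", N, total / trials);
  return 0;
}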

The particle filter uses a motion model to update its particles as the robot moves. The motion model tries to model the inaccuracy in the robot's motion (e.g., due to wheel slippage, or acceleration/deceleration effects) by adding noise to the odometry measurements. For the Create platform, the motion model can be found in Tekkotsu/Localization/CreateMotionModel.h. What are the noise parameters used in this motion model?
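
Before digging into the source, it may help to see the general pattern. The sketch below is a generic motion-model update, not the actual CreateMotionModel code: each particle is moved by the odometry delta plus Gaussian noise whose spread grows with the size of the motion. The noise coefficients here are invented for illustration.

#include <cmath>
#include <cstdio>
#include <random>

// Generic particle motion-model sketch (illustrative only; the real Tekkotsu
// model lives in Tekkotsu/Localization/CreateMotionModel.h).
struct Particle { double x, y, theta; };

// Move one particle by an odometry delta (dx forward, dtheta turn), adding
// Gaussian noise that scales with the size of the motion.  The coefficients
// below are made up for the example, not Tekkotsu's values.
void moveParticle(Particle &p, double dx, double dtheta, std::mt19937 &rng) {
  std::normal_distribution<double> distNoise(0.0, 0.05 * std::fabs(dx));
  std::normal_distribution<double> turnNoise(0.0, 0.10 * std::fabs(dtheta));
  double noisyDx = dx + distNoise(rng);
  double noisyDtheta = dtheta + turnNoise(rng);
  p.x     += noisyDx * std::cos(p.theta);
  p.y     += noisyDx * std::sin(p.theta);
  p.theta += noisyDtheta;
}

int main() {
  std::mt19937 rng(7);
  Particle p = {0, 0, 0};
  moveParticle(p, 1000, M_PI/2, rng);   // one leg of the PilotLab2 square
  std::printf("Particle after one leg: x=%.1f y=%.1f theta=%.2f\n", p.x, p.y, p.theta);
  return 0;
}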

Background Reading

  • The Robotics Primer by Maja Mataric. Section 19.1 discusses odometry; section 8.3.7 explains how shaft encoders work.
  • See the article on the Create Odometry Bug for information about a quirk of the Create's odometry mechanism.
  • Probabilistic Robotics by Sebastian Thrun, Wolfram Burgard, and Dieter Fox. Chapter 5 explains motion models.
  • Sebastian Thrun has an article on Particle Filters in Robotics.