From Tekkotsu Wiki

Revision as of 16:59, 25 August 2010 by Touretzky

The SIFT demo shown in the SIFT instructional video consists of two parts: a behavior for training an object model, and a behavior for matching that model against a test image and displaying the results.

First Step: Training

The SiftDemoTrain behavior creates a new SIFT database named sift-demo.dat and trains a model of an object called "juice-bottle". To use the behavior, follow these steps:

  1. Run the behavior from Root Control > Framework Demos > Vision > SiftDemoTrain.
  2. Start the RawCam viewer and point the camera at your training object. Ideally the object should be in front of a plain background, such as a white posterboard or a black sheet, so as not to introduce extraneous features. It should nearly fill the camera frame.
  3. In the Tekkotsu console, type "msg train" to train the model on the current camera image.
  4. Reposition the object to give a different view, and repeat the previous step. Collect several training images this way.
  5. Type "msg save" to save the database to the file sift-demo.dat. Then you can deactivate the behavior.

Here is the source code for Behaviors/Demos/Vision/

#include "Behaviors/StateMachine.h"

$nodeclass SiftDemoTrain : VisualRoutinesStateNode {

  // The bodies of CreateDatabase and SaveDatabase were missing from this page;
  // the MapBuilder calls shown here follow the standard SIFT database API, so
  // check them against the original file in Behaviors/Demos/Vision/.
  $nodeclass CreateDatabase : VisualRoutinesStateNode : doStart {
    mapBuilder->newSiftMatcher("sift-demo.dat");  // create an empty SIFT database
  }

  $nodeclass TrainObject : VisualRoutinesStateNode : doStart {
    // Add the current camera image to the "juice-bottle" model
    mapBuilder->trainSiftObject("sift-demo.dat", "juice-bottle");
  }

  $nodeclass SaveDatabase : VisualRoutinesStateNode : doStart {
    mapBuilder->saveSiftDatabase("sift-demo.dat");  // write the database to disk
  }

  virtual void setup() {
    $statemachine{
      CreateDatabase =N=> ask

      ask: SpeechNode("message: train or save")
      ask =TM("train")=> train
      ask =TM("save")=> save

      train: TrainObject =N=> ask

      save: SaveDatabase =N=> SpeechNode("done")
    }
  }
}


Second Step: Testing

To test out your SIFT object model, follow these steps:

  1. Turn on the RawCam viewer and point the camera at the object.
  2. Run the behavior from Root Control > Framework Demos > Vision > SiftDemoTest.
  3. To see the results, bring up the camera-space SketchGUI by clicking on the "C" in the ControllerGUI.
  4. To test on another image:
    • Reposition the object
    • Deactivate the SiftDemoTest behavior, then activate it again
    • Click on the Refresh button in the SketchGUI window

Here is the source code for Behaviors/Demos/Vision/

#include "Behaviors/StateMachine.h"

$nodeclass SiftDemoTest : VisualRoutinesStateNode {

  $nodeclass Looker : MapBuilderNode : doStart {
    mapreq.siftDatabasePath = "sift-demo.dat";  // match against the trained database
  }

  $nodeclass Display : VisualRoutinesStateNode : doStart {
    // Collect the SiftData shapes that the match left in camera shape space
    NEW_SHAPEVEC(siftobjects, SiftData, select_type<SiftData>(camShS));
    if ( siftobjects.size() > 0 )
      // The body of this if was missing from the wiki page; a simple report:
      std::cout << "Matched " << siftobjects.size() << " SIFT object(s)" << std::endl;
  }

  virtual void setup() {
    $statemachine{
      Looker =MAP=> Display
    }
  }
}
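
The Display node above only checks whether any matches exist. To report each match by name, the shape vector can be walked with the DualCoding iteration macros. This is a minimal sketch, not part of the original demo; it assumes the standard SHAPEVEC_ITERATE/END_ITERATE macros and the ShapeRoot getName() accessor:

  $nodeclass DisplayNames : VisualRoutinesStateNode : doStart {
    NEW_SHAPEVEC(siftobjects, SiftData, select_type<SiftData>(camShS));
    // Print the name of each matched SIFT shape in camera space
    SHAPEVEC_ITERATE(siftobjects, SiftData, s)
      std::cout << "Matched: " << s->getName() << std::endl;
    END_ITERATE
  }

Substituting DisplayNames for Display in setup() would list every match in the Tekkotsu console instead of just counting them.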