
Queued Navigation Architecture

From lejos

Revision as of 22:01, 31 March 2010 by Lgriffiths (Talk | contribs)


Proposal for Queued Navigation Architecture

This architecture attempts to combine all the ideas we have recently been discussing.

Here is a version of the diagram that includes all the current and proposed interfaces:


The application constructs a pilot and an obstacle detector appropriate to the robot and calls the path planner, specifying the destination.

The path planner runs in its own thread. It consults the map, plans a path as a series of waypoints and feeds them into the waypoint queue.
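The planner-to-navigator handoff above can be sketched as a producer feeding a shared blocking queue. This is a minimal illustration, not the proposed leJOS API: the `Waypoint` class and the straight-line decomposition are assumptions made only for the sketch; a real planner would consult the map.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Hypothetical minimal Waypoint, standing in for the proposed class.
class Waypoint {
    final float x, y;
    Waypoint(float x, float y) { this.x = x; this.y = y; }
}

// Sketch of a planner that decomposes a straight-line route into
// evenly spaced waypoints and feeds them into the shared queue.
// In the architecture described here, this runs in the planner's own thread.
class PlannerSketch {
    final BlockingQueue<Waypoint> waypointQueue = new LinkedBlockingQueue<>();

    void goTo(Waypoint dest, int steps) {
        for (int i = 1; i <= steps; i++) {
            waypointQueue.add(new Waypoint(dest.x * i / steps, dest.y * i / steps));
        }
    }
}
```

The blocking queue is what decouples the two threads: the navigator simply blocks on `take()` until the planner produces the next waypoint.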

The navigator runs in its own thread. It reads waypoints from the waypoint queue, determines what moves the robot can make from the pilot (e.g. arc or rotate), and feeds moves into the move queue.
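The waypoint-to-move conversion can be sketched with plain geometry: rotate to face the waypoint, then travel the straight-line distance. The pose fields and the returned pair are illustrative only; the real navigator would query the pilot for its capabilities (arc vs. rotate) before choosing moves.

```java
// Sketch: how a navigator might turn one waypoint into primitive moves
// for a rotate-capable pilot. Not the proposed leJOS API.
class NavigatorSketch {
    float poseX, poseY, headingDeg;  // current estimated pose (assumed fields)

    // Returns {angleToTurnDeg, distanceToTravel} for the next waypoint.
    float[] movesFor(float wx, float wy) {
        float dx = wx - poseX, dy = wy - poseY;
        float targetDeg = (float) Math.toDegrees(Math.atan2(dy, dx));
        float turn = targetDeg - headingDeg;      // rotate first...
        float dist = (float) Math.hypot(dx, dy);  // ...then travel
        return new float[] { turn, dist };
    }
}
```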

The pilot and obstacle avoider run in parallel in their own threads, or possibly in the same thread. The pilot knows about the robot design. It implements a PID or similar algorithm to generate motor commands. It reads moves from the move queue, and informs the obstacle detector when each move starts and stops and what type it is. It also gives the same information to the pose provider and the pilot state estimator. The state estimator is given the current settings of velocity and angular velocity as well.
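The pilot's consumer side can be sketched as a loop that drains the move queue, announcing each move's start and stop. The string moves and the log are stand-ins for this sketch only; in the proposal the pilot would pass a `Move` event to each registered `MoveListener` and issue real motor commands between the two notifications.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of the pilot's consumer loop: take each move from the queue,
// announce the start, execute, announce the stop.
class PilotSketch {
    final BlockingQueue<String> moveQueue = new LinkedBlockingQueue<>();
    final List<String> log = new ArrayList<>();

    // In the architecture this loop runs in the pilot's own thread.
    void drainQueue() {
        String move;
        while ((move = moveQueue.poll()) != null) {
            log.add("started " + move);   // would call MoveListener.moveStarted
            // ... issue motor commands, wait for the move to complete ...
            log.add("stopped " + move);   // would call MoveListener.moveStopped
        }
    }
}
```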

When the obstacle detector detects an obstacle, it informs the navigator and path planner. These flush their queues. The path planner then constructs a new path that avoids the obstacle and feeds it to the waypoint queue.
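The flush-on-obstacle behaviour amounts to emptying both queues so that no stale waypoints or moves execute, then requesting a replan. The names and string-typed queues below are illustrative only; the proposal's `flush()` methods on the navigator and planner would do the equivalent work.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of what happens when the obstacle detector fires.
class ObstacleHandlerSketch {
    final BlockingQueue<String> waypointQueue = new LinkedBlockingQueue<>();
    final BlockingQueue<String> moveQueue = new LinkedBlockingQueue<>();
    boolean replanRequested = false;

    void obstacleDetected() {
        waypointQueue.clear();   // discard the stale plan
        moveQueue.clear();       // stop feeding the pilot
        replanRequested = true;  // planner builds a new path around the obstacle
    }
}
```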

The pose provider receives moves from the pilot and any amount of sensor data from sensor providers. It will typically be a localization Bayes filter, such as an MCL particle filter.

The pilot state estimator holds the current state of a move (position, velocity, angular velocity, acceleration, etc.) in terms that are independent of the design of the robot. It is told about moves and velocity by the pilot and reads sensor data from any number of sensor providers. It will typically be a Bayes filter, such as a Kalman filter. The pilot consults it to get the current state of the move.
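A minimal stand-in for the state estimator is plain dead reckoning from the commanded velocities, updated each tick. This is only a sketch of the state it holds; a real implementation would be a Kalman filter fusing these predictions with gyro or tachometer readings.

```java
// Dead-reckoning stand-in for the pilot state estimator.
// Tracks distance moved and angle turned since the current move started.
class PilotStateSketch {
    float distanceMoved, angleTurned;
    float forwardVelocity, angularVelocity;  // set by the pilot

    // Called periodically; integrates the commanded velocities.
    void update(float dtSeconds) {
        distanceMoved += forwardVelocity * dtSeconds;
        angleTurned += angularVelocity * dtSeconds;
    }
}
```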

Proposed Interfaces

public interface PathPlanner {
  public void goTo(Waypoint destination);
}

public interface WaypointController {
  public void goTo(Waypoint destination);
  public void flush();
  public void interrupt();
}

public interface MoveProvider {
  public Move getMovement();
  public void addMoveListener(MoveListener listener);
}

public interface MoveListener {
  public void moveStarted(Move event, MoveProvider mp);
  public void moveStopped(Move event, MoveProvider mp);
}

public interface PoseProvider {
  public Pose getPose();
}

public interface MoveController extends MoveProvider {
  public void flush();
  public void interrupt();
  public void forward();
  public void backward();
  public boolean stop();
  public boolean isMoving();
  public boolean travel(float distance);
  public boolean travel(float distance, boolean immediateReturn);
  public float getMovementIncrement();
  public void setMoveSpeed(float speed);
  public float getMoveSpeed();
  public float getMoveMaxSpeed();
}

public interface ArcMoveController extends MoveController {
  public float getMinRadius();
  public void setMinRadius(float radius);
  public boolean arcForward(float radius);
  public boolean arcBackward(float radius);
  public boolean arc(float radius, float angle);
  public boolean arc(float radius, float angle, boolean immediateReturn);
  public boolean travelArc(float radius, float distance);
  public boolean travelArc(float radius, float distance, boolean immediateReturn);
}

public interface RotateMoveController extends MoveController {
  public boolean rotate(float angle);
  public boolean rotate(float angle, boolean immediateReturn);
  public float getAngleIncrement();
  public void setTurnSpeed(float speed);
  public float getTurnSpeed();
  public float getTurnMaxSpeed();
}

public interface ArcRotateMoveController extends ArcMoveController, RotateMoveController {
}

public interface PilotStateEstimation {
  public float getAngleTurned();
  public float getDistanceMoved();
  public float getForwardVelocity();
  public float getAngularVelocity();
  public float getForwardAcceleration();
  public float getAngularAcceleration();
}

public interface ObstacleDetector {
  public void obstacleDetected(Obstacle obstacle);
}

public interface SensorConsumer {
  public void addSensorProvider(Measurement quantity, SensorProvider provider);
}

public interface SensorProvider {
}

public interface RangeMap {
  public float range(Pose pose);
  public boolean inside(Point p);
  public Rectangle getBoundingRect();
}

public interface TachoMotorListener {
  public void rotationStarted(TachoMotor motor, int tachoCount, boolean stalled, long timeStamp);
  public void rotationStopped(TachoMotor motor, int tachoCount, boolean stalled, long timeStamp);
}

public interface TachoMotor extends Tachometer, DCMotor {
  // As now
}
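To show how these interfaces compose, here is a sketch of a component (such as a pose provider) registering as a `MoveListener`. The stub interfaces below use `String` in place of the real `Move` event purely to keep the sketch self-contained; only the listener pattern itself is taken from the proposal.

```java
// Stubs standing in for the proposed interfaces, for this sketch only.
interface MoveProvider { void addMoveListener(MoveListener l); }
interface MoveListener {
    void moveStarted(String move, MoveProvider mp);
    void moveStopped(String move, MoveProvider mp);
}

// A pose-provider-like consumer that reacts to moves reported by the pilot.
class MoveCounter implements MoveListener {
    int completed = 0;
    public void moveStarted(String move, MoveProvider mp) {
        // a real pose provider would note the move's start pose here
    }
    public void moveStopped(String move, MoveProvider mp) {
        completed++;  // a real pose provider would update its pose estimate
    }
}
```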

Outline of calling application

// Construct a map of the environment
map = new LineMap(...);

// Construct an ObstacleDetector that uses an Ultrasonic sensor
obstacleDetector = new RangeObstacleDetector(sonic, THRESHOLD);

// Construct a state estimator for the pilot, using a Kalman filter with feedback from a Gyro Sensor
stateEstimator = new KalmanPilotStateEstimator();
stateEstimator.addSensorProvider(Measurement.ANGULAR_VELOCITY, gyro);

// Construct an MCL PoseProvider
poseProvider = new  MCLPoseProvider(map);
poseProvider.addSensorProvider(Measurement.RANGE_READINGS, rangeScanner);

// Construct the pilot
pilot = new SteeringPilot(Motor.A, Motor.C, obstacleDetector, poseProvider, stateEstimator, ...);

// Construct the navigator
navigator = new Navigator(pilot, poseProvider);

// Construct a path planner that uses the map
planner = new MapPathPlanner(map, navigator);

// Go to the destination
planner.goTo(destination);