Assignment 4, Part 2

[This assignment was adapted from Assignment 2 in the Autonomous Systems course offered within the Swarmlab of Maastricht University.  This material is being used with kind permission from Dr. Karl Tuyls.]

Due Date: Thursday, March 20 at 5:00 p.m.  Submit for your group using D2L


In this assignment a robot drives through a corridor environment and has to localize itself. To achieve this, your task is to implement the discrete Bayes filter.

The robot is controlled by an interface, called "gui", which has several buttons to perform the required actions.

[Screenshot of the GUI]

The robot world is discretized into grid cells of 1m x 1m.  The robot moves forward by 1m when the Move button is pressed, and turns 180 degrees when the Turn button is pressed.  This means the robot can move in both directions along the corridor, and every grid cell is represented by two belief states (one facing left, one facing right).  State 0 represents the right-most position with the robot facing left and state 9 represents the left-most position facing left.  State 10 is the left-most position facing right and state 19 is the right-most position facing right.  In other words, states 0-9 represent the beliefs for the robot facing left, and states 10-19 represent the beliefs for the robot facing right.

The robot's measurements are also discretized to detect walls at a maximum distance of 1m (done by the laser_to_wall node).  This means that at every position (state) the robot gets three possible measurements: wall_left, wall_right, and wall_front.  However, for the localization process we only use wall_left and wall_right.
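The state encoding above can be summarized with a small helper function.  (This is just an illustrative sketch; the function name and cell numbering are my own and are not part of the starter code.)

```python
def state_to_pose(state):
    """Map a state index (0-19) to a (cell, heading) pair.
    Cells are numbered 0 (right-most) to 9 (left-most)."""
    if state <= 9:               # states 0-9: robot facing left
        return state, "left"     # state 0: right-most cell, state 9: left-most
    return 19 - state, "right"   # state 10: left-most cell, state 19: right-most
```

Note that under this encoding a 180-degree turn in place maps state i to state 19 - i, since state i and state 19 - i share the same cell but have opposite headings; this is exactly the mapping that appears in the turn motion model below.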

We use RViz to visualize the robot's position and its belief representation.  You will see the robot and an overlay of the states projected onto the map.  States 0-9 are shown on the top row and states 10-19 below, but this vertical separation is for visualization purposes only.  Next to each state number its belief probability is printed; the higher the probability, the greater the intensity of red displayed.

Localization without uncertainty wouldn’t be much fun.  You can enable/disable both movement and measurement noise with the gui.  The following are the motion and measurement models to use (the same models are used to incorporate noise):

Motion model for forwards movement:
P(Xt+1 = i   | Xt = i ) = 0.1
P(Xt+1 = i+1 | Xt = i ) = 0.8
P(Xt+1 = i+2 | Xt = i ) = 0.1

Motion model for turns:
P(Xt+1 = 19 - i | Xt = i ) = 0.9
P(Xt+1 = i      | Xt = i ) = 0.1

Measurement model:
P(Zt = sense_wall | Xt = is_wall ) = 0.8
P(Zt = sense_door | Xt = is_wall ) = 0.2
P(Zt = sense_wall | Xt = is_door ) = 0.3
P(Zt = sense_door | Xt = is_door ) = 0.7

This means that for forwards movement the robot drives 1m 80% of the time, 2m 10% of the time, and fails to actually move 10% of the time. For a commanded turn there is a 90% chance of actually turning and a 10% chance of not turning.  In terms of sensing, when a wall is present the sensor returns the correct value 80% of the time.  When a door is present (i.e. there is no wall) the sensor returns the correct value 70% of the time.
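The models above can be turned into the two halves of the discrete Bayes filter: a prediction step for each commanded action, and a measurement update.  The following is a minimal sketch, not the provided starter code; the function names are my own, and the handling of mass that would be pushed past the end of the corridor is an assumption (the handout does not specify boundary behaviour).

```python
N = 20  # states 0-9: facing left, 10-19: facing right

def predict_forward(bel):
    """Prediction step for a commanded forward move.
    P(stay) = 0.1, P(advance 1 cell) = 0.8, P(advance 2 cells) = 0.1.
    ASSUMPTION: probability mass that would run past the end wall is
    lumped into the boundary state of the current heading."""
    new_bel = [0.0] * N
    for i, p in enumerate(bel):
        hi = 9 if i <= 9 else 19          # last state for this heading
        for steps, prob in ((0, 0.1), (1, 0.8), (2, 0.1)):
            j = min(i + steps, hi)        # clamp inside this heading's range
            new_bel[j] += prob * p
    return new_bel

def predict_turn(bel):
    """Prediction step for a commanded 180-degree turn.
    With probability 0.9 state i becomes 19 - i; with 0.1 it stays."""
    new_bel = [0.0] * N
    for i, p in enumerate(bel):
        new_bel[19 - i] += 0.9 * p
        new_bel[i] += 0.1 * p
    return new_bel

def meas_update(bel, z_wall, state_has_wall):
    """Measurement update for a single sensor (wall_left or wall_right).
    z_wall: True if the sensor reported a wall, False if it reported a door.
    state_has_wall[i]: whether state i actually has a wall on that side
    (this mapping comes from the corridor map, so it is a parameter here)."""
    likelihood = []
    for i in range(N):
        if state_has_wall[i]:
            likelihood.append(0.8 if z_wall else 0.2)
        else:
            likelihood.append(0.3 if z_wall else 0.7)
    new_bel = [l * p for l, p in zip(likelihood, bel)]
    eta = sum(new_bel)                    # normalizer
    return [p / eta for p in new_bel]
```

Starting from a uniform belief of 1/20 per state, alternating these prediction and update calls is what the localizer exercises through the GUI buttons.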


Download the following file and unzip it into your ~/catkin_ws/src directory.

Your job is to fill in the missing parts of the provided script, which is located within the scripts directory.  The other Python scripts within this directory are all necessary for the system to work.  Make sure that these scripts are executable (if not, execute chmod +x *.py in that directory).  The comp4766_a4_p2 package defines a message (look in the msg directory), so catkin_make will need to be called.

In the launch directory is the launch file bayes_world.launch.  Executing roslaunch comp4766_a4_p2 bayes_world.launch will do the following:

  1. It starts Stage, with the required robot model and map.
  2. It starts fake_localization, needed for simulation purposes.
  3. It starts map_server, which publishes the map so it can be used in RViz.
  4. It starts laser_to_wall, a custom node that translates laser data into wall_front, wall_left, and wall_right detections within a maximum distance of 1m.
  5. It starts the GUI, which provides control buttons for moving the robot, turning, applying the measurement update, and enabling/disabling noise.
  6. It starts RViz, in which we visualize the robot's position and the belief states of the robot.

Note that the GUI buttons will be unresponsive unless you also start the localizer node (take a look at its source to see what is going on).  The localizer node is not started by the launch file because it connects to your code, which you may be frequently editing and re-running.  The localizer node instantiates a BayesFilter object which you are required to flesh out.

To test your code execute rosrun comp4766_a4_p2 localizer.  You can exit this process and edit your code if you need to make changes (that is, you can leave everything from the launch file up and running).  Move the robot by using the buttons on the GUI.  Note that after every movement, the prediction function will be called, implementing the prediction step of the Bayes filter.  You have to call meas_update explicitly by pressing the button after each movement; this is done so that you can see the separate effect of each part of the Bayes filter.  You can enable or disable movement and measurement noise, but your filter code should always assume that noise may be present (i.e. use the movement and measurement models supplied above).  Note that it may take a number of movement and measurement update cycles before the belief converges, but it should eventually converge to a single cell representing the robot's true pose.


Submit your completed code for your group using D2L.  You should not modify the launch file provided.

Terminator Points

Terminator points will be awarded for any creative or interesting extensions of this assignment.  Moving to a more realistic situation and implementing full 3D grid localization would be interesting but is probably too much to tackle.  One interesting possibility would be to modify the world and the measurement model to incorporate walls at different distances.