Assignment 4

Submit on D2L prior to 9:00 a.m. on Wednesday, July 4.  Demonstrate to Rory within your assigned time slot (see D2L for schedule).

Setup

Please download the following zip file and unzip it into your catkin_ws/src directory:

Part 1: Wall Following

The package ~/catkin_ws/src/a4_extract_lines contains a modified version of the laser line extraction algorithm from assignment 3.  Your job is to complete the code in the following script to implement line-based wall-following:

The comments at the beginning of the file describe the desired algorithm.  The vectors referenced within the code are pictured below:

[Figure: wall_follow_vectors, the vectors used by the wall-following algorithm]
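
The comments in the file are the authoritative description of the algorithm; purely as an illustration of the kind of computation a line-based wall follower performs, the sketch below turns the nearest wall line into velocity commands with a simple proportional rule.  The gains, the desired wall distance, and the (r, alpha) parameterisation of the line are assumptions made for this example, not names taken from wall_follow.py.

    import math

    # Illustrative values only: these names and numbers are not from wall_follow.py.
    K_DIST = 1.0            # gain on the distance error
    K_ANGLE = 2.0           # gain on the heading error
    DESIRED_DISTANCE = 0.5  # desired distance to the wall, in metres
    FORWARD_SPEED = 0.3     # constant forward speed, in metres/second

    def wall_follow_cmd(r, alpha):
        """Return (linear, angular) velocities from the nearest wall line.

        r     -- perpendicular distance from the robot to the wall line
        alpha -- angle of the wall's normal in the robot frame
        """
        # Positive when the robot is farther from the wall than desired.
        dist_error = r - DESIRED_DISTANCE

        # For a wall on the robot's left, the wall normal sits near +90 degrees
        # in the robot frame; steer to hold that angle while also correcting
        # the distance error.
        angle_error = alpha - math.pi / 2.0

        angular = K_ANGLE * angle_error + K_DIST * dist_error
        return FORWARD_SPEED, angular

    if __name__ == "__main__":
        # Example: wall detected 0.8 m away with its normal at 100 degrees.
        print(wall_follow_cmd(0.8, math.radians(100)))

In the actual node these velocities would be published as a geometry_msgs/Twist on the robot's command topic; follow the structure laid out in the provided script rather than this sketch.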

Execute the simulation as follows:

Submit your version of wall_follow.py by the deadline. Remember to join a group (even of 1) prior to submission. If in a group of 2, indicate your partner’s name in the source code.

Part 2: Discrete Bayes Filter

Your job is to fill in the missing body of the compute function in the following script, which implements a discrete Bayes filter:

Once the launch file from above is running, you can execute this script directly.  You will see a depiction of the 4×4 belief grid overlaid on the simulation.

As discussed in class, the motion model used here is specifically tailored for the wall-following behaviour from part 1.  Since this behaviour leads to counter-clockwise rotation around the arena, the following motion model is proposed:

[Figure: diagram of the proposed motion model on the 4×4 grid]

The cells on the periphery have two 50% events (a null movement and one other).  The inner cells assign a 20% probability to each of their events.  In the code this motion model is incorporated as follows:

This is just one of many ways of encoding this information.  If you find it confusing, you can devise your own scheme.
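
As one concrete illustration of such an encoding (not the one used in the provided bayes_filter.py), each cell can map to a list of (delta_row, delta_col, probability) events: periphery cells get a 50% null event plus a 50% move along the loop, and inner cells spread 20% over their events.  The particular directions chosen below, and the assumption that inner cells have exactly five events, are placeholders; match them to the figure and the provided code.

    def make_motion_model():
        """Build a 4x4 row-major grid of (delta_row, delta_col, probability) lists."""
        model = [[None] * 4 for _ in range(4)]
        for j in range(4):          # row
            for i in range(4):      # column
                if j == 0 and i < 3:
                    # Top edge: stay put, or move one cell along the loop.
                    events = [(0, 0, 0.5), (0, 1, 0.5)]
                elif i == 3 and j < 3:
                    # Right edge.
                    events = [(0, 0, 0.5), (1, 0, 0.5)]
                elif j == 3 and i > 0:
                    # Bottom edge.
                    events = [(0, 0, 0.5), (0, -1, 0.5)]
                elif i == 0 and j > 0:
                    # Left edge.
                    events = [(0, 0, 0.5), (-1, 0, 0.5)]
                else:
                    # Inner cells: 20% for staying and for each neighbour
                    # (an assumption made for this sketch).
                    events = [(0, 0, 0.2), (-1, 0, 0.2), (1, 0, 0.2),
                              (0, -1, 0.2), (0, 1, 0.2)]
                model[j][i] = events
        return model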

For the measurement update, the only sensory observation used is a floor sensor that indicates whether the robot is on a black square or not.  This sensor is assumed to be correct 90% of the time, which means the following (a short code sketch follows the list):

  • p(z_t = True | x_t is on a black square) = 0.9
  • p(z_t = False | x_t is NOT on a black square) = 0.9
  • p(z_t = True | x_t is NOT on a black square) = 0.1
  • p(z_t = False | x_t is on a black square) = 0.1
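
All four cases reduce to one rule: the likelihood is 0.9 when the reading agrees with the map and 0.1 when it disagrees.  A minimal sketch, with placeholder names rather than the ones used in bayes_filter.py:

    def measurement_likelihood(z, cell_is_black):
        """Return p(z_t | x_t) for the floor sensor.

        z             -- True if the sensor reports a black square
        cell_is_black -- True if the map says this cell is black
        """
        # Correct 90% of the time, wrong 10% of the time.
        return 0.9 if z == cell_is_black else 0.1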

The measurement update requires a map of where the black squares lie, which is defined in the BayesFilter constructor:
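
For illustration only, such a map can be stored as a 4×4 row-major grid of booleans with True marking a black square; the positions flagged in the provided constructor are the ones that matter, not the example below.

    # Illustrative structure only: the real black-square positions are set
    # in the provided BayesFilter constructor.
    black_square_map = [[False] * 4 for _ in range(4)]
    black_square_map[2][1] = True   # e.g. mark row 2, column 1 as black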

Note that the belief, motion model, and map as given in the code are stored in row-major order.  Therefore, to access column i, row j of the belief matrix we reference self.belief[j][i].

This is important to note, because you might mistakenly write self.belief[i][j].  The same indexing applies to self.motion_model and self.map.
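
Putting the pieces together, each call to compute is a prediction over the motion model followed by a measurement update and renormalisation.  The sketch below shows one way the body might look; it assumes the illustrative (delta_row, delta_col, probability) encoding from above and is not the provided solution, so adapt it to the actual data structures in bayes_filter.py.

    def compute(self, z):
        """One step of the discrete Bayes filter.

        z -- floor sensor reading (True if a black square was detected).
        Assumes self.belief, self.motion_model and self.map are 4x4 and
        row-major, and that each motion-model entry is a list of
        (delta_row, delta_col, probability) events as sketched earlier.
        """
        n = 4

        # Prediction: push belief mass through the motion model.
        predicted = [[0.0] * n for _ in range(n)]
        for j in range(n):              # row
            for i in range(n):          # column
                for dj, di, p in self.motion_model[j][i]:
                    nj, ni = j + dj, i + di
                    if 0 <= nj < n and 0 <= ni < n:
                        predicted[nj][ni] += p * self.belief[j][i]
                    else:
                        # A move that would leave the grid keeps its mass
                        # in place (a design choice for this sketch).
                        predicted[j][i] += p * self.belief[j][i]

        # Measurement update: weight each cell by p(z | cell).
        for j in range(n):
            for i in range(n):
                likelihood = 0.9 if z == self.map[j][i] else 0.1
                predicted[j][i] *= likelihood

        # Normalise so the belief sums to one again.
        total = sum(sum(row) for row in predicted)
        if total > 0.0:
            for j in range(n):
                for i in range(n):
                    predicted[j][i] /= total

        self.belief = predicted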

Your implementation of the discrete Bayes filter using these motion and measurement models should allow the belief to eventually coalesce onto one or two neighbouring grid cells that overlap the robot's true position.  However, a couple of laps of the arena may be required before the probability converges.

Submit your version of bayes_filter.py by the deadline.  Remember to join a group (even of 1) prior to submission.  If in a group of 2, indicate your partner’s name in the source code.