**Submit on D2L prior to 9:00 a.m. on Wednesday, July 4. Demonstrate to Rory within your assigned time slot (see D2L for schedule).**

### Setup

Please download the following zip file and unzip it into your **catkin_ws/src** directory:

### Part 1: Wall Following

The package `~/catkin_ws/src/a4_extract_lines` contains a modified version of the laser line extraction algorithm from assignment 3. Your job is to complete the code in the following script to implement line-based wall-following:

```
~/catkin_ws/src/a4_wall_follow/scripts/wall_follow.py
```

The comments at the beginning of the file describe the desired algorithm. The vectors referenced within the code are pictured in the figure below:
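As a rough illustration of the general idea, a line-based wall follower typically selects the nearest extracted line and steers to hold a target distance from it. The sketch below is a hypothetical proportional controller only; the function name, gains, and sign conventions are assumptions for illustration, not the assignment's actual vectors or API:

```python
def wall_follow_cmd(line_distance, line_angle,
                    target_distance=0.5, k_dist=1.0, k_ang=2.0):
    """Hypothetical P-controller for line-based wall following.

    line_distance: perpendicular distance to the nearest extracted line.
    line_angle: angle of that line relative to the robot's heading.
    Returns (linear, angular) velocity commands.
    """
    # Steer to correct both the distance error and the heading error.
    # Gains and signs are illustrative; the real script defines its own.
    angular = k_dist * (line_distance - target_distance) + k_ang * line_angle
    linear = 0.3  # constant forward speed (illustrative)
    return linear, angular
```

When the robot is at the target distance and parallel to the wall, this controller commands zero rotation and drives straight ahead.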

Execute the simulation as follows:

```
roslaunch a4_world main.launch
```

**Submit your version of wall_follow.py by the deadline. Remember to join a group (even of 1) prior to submission. If in a group of 2, indicate your partner’s name in the source code.**

### Part 2: Discrete Bayes Filter

Your job is to fill in the missing body of the **compute** function in the following script which implements a discrete Bayes filter:

```
~/catkin_ws/src/a4_bayes_filter/scripts/bayes_filter.py
```

Once the launch file from above is running, you can execute this script directly. You will see a depiction of the 4×4 belief grid overlaid on the simulation.

As discussed in class, the motion model used here is specifically tailored for the wall-following behaviour from part 1. Since this behaviour leads to counter-clockwise rotation around the arena, the following motion model is proposed:

Each cell on the periphery has two events with probability 0.5 each: a null movement (staying put) and one directed movement that advances the robot counter-clockwise. Each inner cell assigns probability 0.2 to all five events. In the code, this motion model is encoded as follows:

```python
# Encodes the probabilities of all movements for each position.
# Each tuple encodes the following probabilities:
# (right_prob, up_prob, left_prob, down_prob, stay_prob).
self.motion_model = [
    [(.0,.0,.0,.5,.5),(.0,.0,.5,.0,.5),(.0,.0,.5,.0,.5),(.0,.0,.5,.0,.5)],
    [(.0,.0,.0,.5,.5),(.2,.2,.2,.2,.2),(.2,.2,.2,.2,.2),(.0,.5,.0,.0,.5)],
    [(.0,.0,.0,.5,.5),(.2,.2,.2,.2,.2),(.2,.2,.2,.2,.2),(.0,.5,.0,.0,.5)],
    [(.5,.0,.0,.0,.5),(.5,.0,.0,.0,.5),(.5,.0,.0,.0,.5),(.0,.5,.0,.0,.5)]]
```
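To apply this model in the prediction step, the belief mass at each cell is pushed to its neighbours according to the five probabilities. The following is one possible sketch; the function name `predict` and the convention that "up" decreases the row index are our assumptions, not part of the assignment code:

```python
def predict(belief, motion_model):
    """Prediction step: redistribute belief mass using the motion model."""
    rows, cols = len(belief), len(belief[0])
    new_belief = [[0.0] * cols for _ in range(rows)]
    # (row, column) offsets for (right, up, left, down, stay).
    # "Up" is assumed here to decrease the row index.
    offsets = [(0, 1), (-1, 0), (0, -1), (1, 0), (0, 0)]
    for j in range(rows):
        for i in range(cols):
            for p, (dj, di) in zip(motion_model[j][i], offsets):
                nj, ni = j + dj, i + di
                if p > 0 and 0 <= nj < rows and 0 <= ni < cols:
                    new_belief[nj][ni] += p * belief[j][i]
    return new_belief
```

Because every non-zero move in the given model stays inside the 4×4 grid, the total belief mass is preserved by this step.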

This is just one of many ways of encoding this information. If you find it confusing, you can devise your own scheme.

For the measurement update, the only sensory observation used is a floor sensor that indicates whether or not the robot is on a black square. This sensor is assumed to be correct 90% of the time, which means the following:

- p(z_t = True | x_t is on a black square) = 0.9
- p(z_t = False | x_t is NOT on a black square) = 0.9
- p(z_t = True | x_t is NOT on a black square) = 0.1
- p(z_t = False | x_t is on a black square) = 0.1

The measurement update requires a map of where the black squares lie, which is defined in the BayesFilter constructor:

```python
# Our map is just a representation of where the black squares are.
self.map = [[False, False, True, False],
            [False, False, False, False],
            [False, False, False, False],
            [True, False, True, False]]
```

Note that the belief, motion model, and map as given in the code are stored in row-major order. Therefore, to access column `i`, row `j` of the belief matrix we reference:

```python
self.belief[j][i]
```

This is important to note, because you might mistakenly refer to `self.belief[i][j]` instead. The same ordering applies to `self.motion_model` and `self.map`.
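Using that indexing, the measurement update can weight each cell by the sensor likelihood and renormalize. This is only a sketch assuming the 90% sensor model above; the function name `measurement_update` is illustrative:

```python
def measurement_update(belief, z, floor_map):
    """Weight the belief by p(z | x) and renormalize.

    z: the floor-sensor reading (True if a black square is reported).
    floor_map: row-major grid of booleans marking the black squares.
    """
    rows, cols = len(belief), len(belief[0])
    posterior = [[0.0] * cols for _ in range(rows)]
    for j in range(rows):
        for i in range(cols):
            # The sensor is right 90% of the time, so the likelihood is
            # 0.9 when the reading matches the map and 0.1 otherwise.
            likelihood = 0.9 if z == floor_map[j][i] else 0.1
            posterior[j][i] = likelihood * belief[j][i]
    total = sum(sum(row) for row in posterior)
    return [[v / total for v in row] for row in posterior]
```

Starting from a uniform belief, a single `z = True` reading concentrates probability on the three black-square cells of the map above.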

Your implementation of the discrete Bayes filter using these motion and measurement models should allow the belief to eventually coalesce at one or two neighbouring grid cells that overlap the robot's true position. A couple of laps may be required, however, before the probability converges.

**Submit your version of bayes_filter.py by the deadline. Remember to join a group (even of 1) prior to submission. If in a group of 2, indicate your partner’s name in the source code.**