Part 2: Control
If you are not familiar with the creation and use of classes in Python, you should read chapter 9 of the official Python tutorial.
Ensure that the following packages are installed on your system:
Note that a new virtual machine (dated Jan. 27) has been uploaded which has these two packages pre-installed. In addition, the comp4766_a2_world package referred to below is already in the virtual machine’s /home/pal/catkin_ws/src directory.
The following zip file contains a package called comp4766_a2_world. Unzip this file into your catkin_ws/src directory:
Execute the following command:
roslaunch comp4766_a2_world world.launch
You will see a little robot and a red target square. A laser range finder is simulated, although we will not be using this sensor until the next assignment (you should feel free to experiment with it, though). To teleoperate the robot, open another terminal and execute the following:
roslaunch comp4766_a2_world teleop.launch
Note that you will need to have the terminal selected for your keypresses to control the robot. You can combine launching the world with teleoperation by launching "all.launch". Take a few minutes to drive around, check out the topics published by Stage, and mouse around with the simulator. Once you are ready to work, create a package called comp4766_a2 and proceed with the instructions below:
catkin_create_pkg comp4766_a2 rospy
Note: In the tasks that follow we are assuming that the robot has perfect access to its own pose. This is an unrealistic assumption that we will tackle later in the course.
Create a scripts directory in comp4766_a2 and place go_to_goal.py in that directory.
Task 1: Smooth Controller 1
Implement "smooth controller 1" discussed in the third set of notes on kinematics. Your task is to create a class called SmoothController1 that fulfills the requirements in go_to_goal.py. This class should go in a file called smooth1.py. Note that the parameter settings for "smooth controller 1" are not critical (both can be set to 1, or you can experiment with different values). You can use this template for smooth1.py (making the appropriate changes for tasks 2 and 3).
Test your code by launching the Stage simulation in one terminal (roslaunch comp4766_a2_world world.launch) and executing the following in another terminal:
rosrun comp4766_a2 go_to_goal.py
The controller should drive the robot to the red square (which the robot can pass through). Note that there is no goal angle specified with smooth controller 1. Important: you should not use the tf library here. The arguments passed into the constructor and the get_twist method provide everything that you need. The same goes for task 2.
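The structure of such a controller can be sketched as follows. This is only an illustration, not the course's reference solution: the constructor signature, gain names, and get_twist arguments are assumptions standing in for whatever the go_to_goal.py template actually specifies, and the control law shown (linear speed proportional to distance, angular speed proportional to heading error) is the common formulation of smooth controller 1 — verify it against the kinematics notes.

```python
import math

def normalize_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

class SmoothController1:
    """Sketch of smooth controller 1: drive to a goal position
    with no specified goal heading.  Names are placeholders; match
    them to the go_to_goal.py template."""

    def __init__(self, goal_x, goal_y, k_rho=1.0, k_alpha=1.0):
        self.goal_x = goal_x
        self.goal_y = goal_y
        self.k_rho = k_rho      # gain on distance to goal
        self.k_alpha = k_alpha  # gain on heading error

    def get_twist(self, x, y, theta):
        # rho: distance to the goal.
        # alpha: bearing of the goal relative to the robot's heading.
        dx = self.goal_x - x
        dy = self.goal_y - y
        rho = math.sqrt(dx * dx + dy * dy)
        alpha = normalize_angle(math.atan2(dy, dx) - theta)
        # Drive faster when far away; turn harder when pointed away.
        v = self.k_rho * rho
        omega = self.k_alpha * alpha
        return v, omega
```

In the real node you would copy v and omega into the linear.x and angular.z fields of a geometry_msgs/Twist message and publish it at a fixed rate.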
Task 2: Smooth Controller 2
As above, but this time implement smooth controller 2. Your class should go in smooth2.py. Note that this controller should guide the robot to the red square such that it ends up pointing down the negative x-axis. You should experiment with setting different goal angles to ensure that it works reliably. Note that smooth controller 2 will work well only with certain parameter settings. I have found good results with the following:
- K_RHO = 0.6
- K_ALPHA = 1.6
- K_THETA = 0.3
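For reference, the gains above fit the standard three-gain formulation of this controller (often written with rho, alpha, and beta, as in Siegwart and Nourbakhsh). The sketch below uses that formulation with K_THETA playing the role of the (negated) beta gain; the exact variable names and sign conventions are assumptions, so check them against the course notes before reusing this:

```python
import math

def normalize_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

K_RHO, K_ALPHA, K_THETA = 0.6, 1.6, 0.3

def smooth2_twist(x, y, theta, gx, gy, gtheta):
    """Sketch of smooth controller 2: reach (gx, gy) with final
    heading gtheta.  Sign conventions assumed, not authoritative."""
    dx, dy = gx - x, gy - y
    rho = math.sqrt(dx * dx + dy * dy)
    # alpha: bearing of the goal relative to the robot's heading.
    alpha = normalize_angle(math.atan2(dy, dx) - theta)
    # beta: angle between the line of sight to the goal and the
    # desired final heading.
    beta = normalize_angle(gtheta - theta - alpha)
    v = K_RHO * rho
    omega = K_ALPHA * alpha - K_THETA * beta
    return v, omega
```

Note that these gains satisfy the usual stability conditions for this controller (K_RHO > 0, K_ALPHA - K_RHO > 0, beta gain negative), which is consistent with why they work well.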
Task 3: Smooth Controller 1 using TF
Complete the first set of tutorials on the tf library under “Learning tf”. You should follow the Python stream of tutorials.
Re-implement smooth controller 1 to utilize tf. This should allow you to eliminate almost all of the math from the get_twist method (which now takes no arguments, since we are using tf internally). The class should be named SmoothController1TF and should be placed in smooth1_tf.py.
Note that all of the required functions from the tf library are demonstrated through the tf tutorials. For this part you should pay special attention to the “Adding a frame” tutorial. However, you should be able to avoid creating an additional “broadcaster” node. Also, if you are confused about the relationships between frames, it is very helpful to visualize them using rviz.
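To see why tf eliminates most of the math, note that looking up the transform from the robot's frame to a goal frame gives you the goal's coordinates in the robot's own frame, after which rho and alpha fall out in one line each. The hypothetical sketch below does that frame change by hand with plain trigonometry (no ROS), purely to illustrate the arithmetic that tf performs for you:

```python
import math

def goal_in_robot_frame(x, y, theta, gx, gy):
    """Express a world-frame goal point in the robot's frame by
    rotating the offset vector by -theta.  This is the computation
    a tf transform lookup replaces."""
    dx, dy = gx - x, gy - y
    rx = math.cos(theta) * dx + math.sin(theta) * dy
    ry = -math.sin(theta) * dx + math.cos(theta) * dy
    return rx, ry

def twist_from_local_goal(rx, ry, k_rho=1.0, k_alpha=1.0):
    # With the goal already in the robot's frame, smooth controller 1
    # needs no pose subtraction and no angle normalization.
    return k_rho * math.hypot(rx, ry), k_alpha * math.atan2(ry, rx)
```

In SmoothController1TF, the two helper functions above collapse into a single tf lookup inside get_twist, which is why that method no longer needs any arguments.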
Submit smooth1.py, smooth2.py, and smooth1_tf.py to online.mun.ca by the deadline given above. Please take note of these constraints on your submission.
Some interesting things to try:
- Utilize rviz to interactively specify the position of the goal in the global reference frame (/odom).
- Modify the stage world to incorporate a moveable block that the robot can sense using the laser (you could use a different world or remove the walls so that you can easily focus on the block).
- Execute an extended trajectory by chaining together a list of waypoints and using one of the smooth controllers to move between distant rooms.
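For the last suggestion, the waypoint-chaining logic can be as simple as popping waypoints off a list once the robot gets close enough. This is a hypothetical sketch (the tolerance, the pose tuple, and the controller callback are all placeholders), reusing any (pose, goal) -> (v, omega) controller such as smooth controller 1:

```python
import math

def follow_waypoints(pose, waypoints, get_twist, tol=0.1):
    """Return the twist toward the first unreached waypoint,
    discarding waypoints as they come within tol of the robot.
    Stops (zero twist) when the list is exhausted."""
    x, y, _theta = pose
    while waypoints:
        gx, gy = waypoints[0]
        if math.hypot(gx - x, gy - y) > tol:
            return get_twist(pose, waypoints[0])
        waypoints.pop(0)  # close enough: advance to the next waypoint
    return 0.0, 0.0       # no waypoints left: stop
```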