Assignment 2



In this assignment the robot drives through a corridor with several doors in it and has to localize itself. To achieve this, you are going to implement a Bayes filter.

The robot can be controlled by an interface called “controller”, which has several buttons to perform the required actions.

The robot world is discretized into grid cells of 1m x 1m; there are 10 of these grid cells in this world. The robot moves forward by 1m when you press the “Move” button, and turns 180 degrees when you press the “Turn” button. The robot can therefore move in both directions through the corridor, and every grid cell is represented by two belief states (one facing left, one facing right). State 0 is at the upper right corner, increasing to state 9 at the upper left; state 10 is at the lower left, increasing to state 19 at the lower right. So states 0..9 represent the beliefs for the robot facing left, and states 10..19 the beliefs for the robot facing right. The robot’s measurements are also discretized, detecting walls at a maximum distance of 1m (done by the laser_to_wall node). This means that in every position (state) the robot gets three measurements (wall_left, wall_front, wall_right).
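
To make this state layout concrete, here is a small sketch of the indexing in C++; the helper names below are illustrative and not part of the provided code:

// Sketch of the state layout described above; these helpers are
// illustrative, not part of the provided bayes_filter.cpp.
const int NUM_CELLS  = 10;   // corridor cells of 1m x 1m
const int NUM_STATES = 20;   // two headings per cell

// States 0..9 face left (state 0 = rightmost cell), states 10..19
// face right (state 10 = leftmost cell).
bool facesLeft(int state) { return state < NUM_CELLS; }

// Corridor cell of a state, counted 0..9 from the left.
int cellOfState(int state) {
  return facesLeft(state) ? (NUM_CELLS - 1 - state) : (state - NUM_CELLS);
}

// A 180-degree turn keeps the cell but flips the heading: i -> 19 - i.
int turnedState(int state) { return (NUM_STATES - 1) - state; }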

The robot position and the robot’s belief for every possible state are visualized in RViz: you see the robot together with an overlay of the states onto the map. Next to each state number the belief probability is printed, and the higher the probability, the less opaque the red marker gets.

Localization without uncertainty wouldn’t be much fun, so you can enable/disable both movement and measurement noise with the controller buttons; by default noise is disabled. (After you press the button, you can check the noise state in the terminal.) The following measurement and movement models are used:

[latex]

Movement model:

\bigskip

$P(x_{i} | x_{i}) = 0.1$

$P(x_{i+1}| x_{i}) = 0.8$

$P(x_{i+2} | x_{i}) = 0.1$

\bigskip

Turning model ($N$ is the number of belief states; a turn maps state $x_i$ to state $x_{N-1-i}$):

\bigskip

$P(x_{N-1-i} | x_{i} ) = 0.9$

$P(x_{i} | x_{i} ) = 0.1$

\bigskip

Measurement model:

\bigskip

$P(z_{t} = \mathrm{sense \; wall} | x_{t} = \mathrm{is \; wall} ) = 0.8$

$P(z_{t} = \mathrm{sense \; door} | x_{t} = \mathrm{is \; wall} ) = 0.2$

$P(z_{t} = \mathrm{sense \; wall}  | x_{t} = \mathrm{is \; door} ) = 0.3$

$P(z_{t} = \mathrm{sense \; door} | x_{t} = \mathrm{is \; door} ) = 0.7$

\bigskip

This means that for movement with noise the robot has an $80\%$ chance of driving $1$m, a $10\%$ chance of driving $2$m, and a $10\%$ chance of not moving at all. For a turn there is a $90\%$ chance of turning and a $10\%$ chance of not turning. For sensing, measuring a wall when there is a wall is correct for $80\%$ of the measurements; detecting a door is a bit harder and is correct for $70\%$ of the measurements.

[/latex]
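
As a small worked example of the measurement update (the numbers are illustrative, not part of the assignment): suppose the belief is split evenly, $0.5$ each, between two candidate states, one with a wall on its left and one with a door on its left, and the robot senses wall_left. Bayes’ rule multiplies each belief by the corresponding likelihood:

[latex]

$\mathrm{bel}(\mathrm{wall}) \propto 0.8 \cdot 0.5 = 0.40$

$\mathrm{bel}(\mathrm{door}) \propto 0.3 \cdot 0.5 = 0.15$

\bigskip

Normalizing by $0.40 + 0.15 = 0.55$ gives beliefs of $\approx 0.73$ and $\approx 0.27$.

[/latex]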

The example code provides the general framework for running this simulation; you have to fill in the gaps!

  • Implement the measurement model in updateSensing()
  • Implement the motion model in updateMove() and updateTurn() (see the sketch below)
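
To give an idea of the shape of these updates, here is a minimal sketch, assuming the belief is stored as a plain array of 20 probabilities. The world-model arrays, helper names, and signatures are assumptions; the provided bayes_filter.cpp defines its own framework:

// Sketch only: the provided bayes_filter.cpp has its own framework;
// names, signatures, and the world-model arrays are assumptions.
#include <vector>

const int N = 20;                        // number of belief states
std::vector<double> belief(N, 1.0 / N);  // start with a uniform belief

// Hypothetical world model: for each state, whether there is a wall
// (true) or a door (false) to the left/front/right of the robot.
bool wallLeft[N], wallFront[N], wallRight[N];  // fill in from the map

void normalize(std::vector<double>& b) {
  double sum = 0.0;
  for (int i = 0; i < N; ++i) sum += b[i];
  for (int i = 0; i < N; ++i) b[i] /= sum;
}

// Hypothetical helper: the state reached from state i after driving k
// cells forward, or -1 if that would leave the corridor. In the state
// layout above, forward motion increases the index within each row.
int nextState(int i, int k) {
  int j = i + k;
  if (i < 10) return (j <= 9) ? j : -1;   // facing left:  states 0..9
  return (j <= 19) ? j : -1;              // facing right: states 10..19
}

// Measurement update (Bayes rule) for one sensor reading; call it once
// per side, e.g. updateSensing(sensedLeft, wallLeft).
void updateSensing(bool senseWall, const bool* isWall) {
  for (int i = 0; i < N; ++i) {
    double p = isWall[i] ? (senseWall ? 0.8 : 0.2)   // P(z | wall)
                         : (senseWall ? 0.3 : 0.7);  // P(z | door)
    belief[i] *= p;
  }
  normalize(belief);
}

// Motion update: shift the belief by the movement model. Mass that
// would be pushed past the corridor end stays put here (an assumption).
void updateMove() {
  std::vector<double> nb(N, 0.0);
  for (int i = 0; i < N; ++i) {
    nb[i] += 0.1 * belief[i];                       // did not move
    int s1 = nextState(i, 1), s2 = nextState(i, 2);
    nb[s1 >= 0 ? s1 : i] += 0.8 * belief[i];        // moved 1 cell
    nb[s2 >= 0 ? s2 : i] += 0.1 * belief[i];        // moved 2 cells
  }
  belief = nb;
}

// Turn update: state i flips to N-1-i with probability 0.9.
void updateTurn() {
  std::vector<double> nb(N, 0.0);
  for (int i = 0; i < N; ++i) {
    nb[N - 1 - i] += 0.9 * belief[i];   // turned
    nb[i]         += 0.1 * belief[i];   // failed to turn
  }
  belief = nb;
}

Note that updateMove() and updateTurn() never renormalize: each source state distributes exactly $0.1 + 0.8 + 0.1 = 1$ of its mass, so the belief stays normalized by construction.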

Hints:

  • Create an internal representation of the world to compare your measurements against.
  • The three sensor measurements (left, front, right) can be treated independently, each with its own probability update.
  • View the tutorials on Bayes filters.

Practical tip:

– For those of you who would like to solve the move/turn overshoot problem (particularly relevant if you run ROS in a VM), you might have to modify the “Move” and “Turn” actions of the simulated robot so that they take the robot’s own odometry into account. You might have to look into the quaternion orientation representation and see how to grab the current robot position/orientation at the time of movement.
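
For example, the position and yaw can be read from a nav_msgs/Odometry message roughly like this (a sketch; the topic name and how you wire up the subscriber depend on your setup):

#include <nav_msgs/Odometry.h>
#include <tf/transform_datatypes.h>

double current_x, current_y, current_yaw;

// Callback for a nav_msgs/Odometry subscription (e.g. on "odom" or
// "base_pose_ground_truth" in this Stage setup; topic name may differ).
void odomCallback(const nav_msgs::Odometry::ConstPtr& msg) {
  current_x = msg->pose.pose.position.x;
  current_y = msg->pose.pose.position.y;
  // tf converts the orientation quaternion to a yaw angle in radians.
  current_yaw = tf::getYaw(msg->pose.pose.orientation);
}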

Installation:

Download and unzip these packages into your ~/catkin/src workspace directory.
In a terminal, go to the /nodes directory of the controller and laser_to_wall packages and make the Python files in these packages executable; otherwise you can’t run them:

chmod +x filename.py

Create a new package called “bayes_filter” with dependencies “roscpp”, “laser_to_wall”, “controller”, and “geometry_msgs” (see Assignment 1 for how to create a package).
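
With catkin this boils down to a single command from the workspace src directory (assuming the standard catkin_create_pkg tool and the workspace location above):

cd ~/catkin/src
catkin_create_pkg bayes_filter roscpp laser_to_wall controller geometry_msgs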

Copy the example code file (bayes_filter.cpp) into the /src directory of this package.

Add this source file to the CMakeLists.txt (see Assignment 1 for how to do this) and build your catkin workspace.
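
For reference, the relevant CMakeLists.txt additions will look roughly like this (the target name is an assumption; match it to your own setup):

add_executable(bayes_filter src/bayes_filter.cpp)
target_link_libraries(bayes_filter ${catkin_LIBRARIES})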

 

Inside the “bayes_world” directory we have created a launch file, “bayes_world.launch”, which starts the environment you need for this assignment (see Assignment 1 for how to run a launch file). It starts:

  1. joint_state_publisher, needed for visualization purposes. It publishes messages about the robot’s joint states.
  2. robot_state_publisher, needed for visualization purposes. It publishes messages about the robot’s body state.
  3. Stage, with the required robot model and map.
  4. fake_localization, needed for simulation purposes. Note: this package might have to be installed through the package manager.
  5. map_server, that publishes the map so it can be used in RViz. Note: this package might have to be installed through the package manager.
  6. laser_to_wall, a custom node that translates laser data into wall_front, wall_left, and wall_right detections, with a maximum distance of 1 meter.
  7. controller, a custom node that provides control buttons for moving the robot around, turning, taking measurements, and enabling/disabling noise. Note: this package might require installing wxPython through the package manager.
  8. RViz, in which we visualize the robot position and, more importantly, the belief states of the robot. The view, sensor, and map settings you see are loaded from the view.vgz configuration file.

Run your bayes_filter node (see Assignment 1), and then you can move your robot around with the controller.
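
Assuming the package and executable names from above, that is:

rosrun bayes_filter bayes_filter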