Solution for Montecarlo Visual Loc and new exercise

Overview

Over these three weeks I have been developing a solution for the Montecarlo Visual Loc exercise, based on the one I did for Montecarlo Laser Loc. I also started developing a new mapping exercise for Robotics Academy.

Solution for Montecarlo Visual Loc

To do this one, I took the solution I did for Montecarlo Laser Loc and replaced the sensor-reading part, since here I'm using a camera pointing at the ceiling instead of a group of laser rays. In the part where I calculate the weights, I take the pixel at the center of the image and compute the Euclidean distance (taking the pixel's three color channels as coordinates) between the camera's pixel and the one the camera should see if the robot were at the particle's position; this gives me the error I use to calculate the weight of the particle. To get the pixel for each particle I created a color map, which has the ceiling textures placed in the same positions they occupy in the scenario.
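As a rough sketch of this weighting step (the helper names, map resolution, file path, and the Gaussian error-to-weight mapping below are illustrative assumptions, not the exact code of the solution):

```python
import numpy as np
import cv2

MAP_RESOLUTION = 0.01  # meters per color-map pixel (assumed value)
color_map = cv2.imread("ceiling_color_map.png")  # assumed path to the color map image

def world_to_map(x, y):
    """Convert a particle's world coordinates to color-map indices
    (assumes the map origin matches the world origin)."""
    return int(y / MAP_RESOLUTION), int(x / MAP_RESOLUTION)

def particle_weight(camera_image, particle, sigma=30.0):
    """Weight a particle by comparing the camera's center pixel with the
    color the camera should see at the particle's position."""
    h, w, _ = camera_image.shape
    observed = camera_image[h // 2, w // 2].astype(float)  # center pixel (B, G, R)

    row, col = world_to_map(particle.x, particle.y)
    expected = color_map[row, col].astype(float)

    # Euclidean distance in color space is the per-particle error
    error = np.linalg.norm(observed - expected)

    # Turn the error into a weight; a Gaussian kernel is one common choice
    return np.exp(-(error ** 2) / (2 * sigma ** 2))
```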

After many tests and much tweaking of the values for assigning weights, I decided that the textures I gave the ceiling back in weeks 6 and 7 weren't good enough and the robot tended to get lost, so I changed them. This time I made sure each room had a different color assigned to it, and then applied some noise and gradients to help the robot pinpoint its exact location inside that room. Here is the end result; this image is also used as the "color map" for the solution:

New textures for ceiling

With this, the solution is more accurate, but the texture still needs some changes: certain areas, like the red room, are still not good enough because the different red sections are too big.
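For illustration, a room patch along these lines could be generated with a gradient plus per-pixel noise (the sizes, colors, and parameters here are just an example, not the actual textures):

```python
import numpy as np
import cv2

def room_texture(width, height, base_color, noise_amplitude=25):
    """Generate a room patch: a horizontal gradient over the base color
    plus per-pixel noise, so nearby pixels inside the room still differ."""
    base = np.zeros((height, width, 3), dtype=float)
    base[:, :] = base_color

    # Horizontal gradient that darkens the color across the room
    gradient = np.linspace(1.0, 0.6, width).reshape(1, width, 1)
    textured = base * gradient

    # Uniform noise so every pixel is slightly different
    noise = np.random.uniform(-noise_amplitude, noise_amplitude, textured.shape)
    return np.clip(textured + noise, 0, 255).astype(np.uint8)

# Example: a 400x300 patch for a "red room" (BGR color, values are illustrative)
patch = room_texture(400, 300, base_color=(0, 0, 200))
cv2.imwrite("red_room_patch.png", patch)
```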

New Exercise: Laser Mapping

Another thing I’ve been working on is a new exercise for Robotics Academy called Laser Mapping, which consists of building a map of the environment from the laser sensor readings. It will also be developed in Gazebo Harmonic, since the Gazebo 11 + Gazebo Harmonic docker is finally implemented.
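As a very rough sketch of the idea behind the exercise (not its implementation; the grid layout and the laser data format are assumptions for illustration), mapping with a laser can be as simple as projecting each range reading into an occupancy grid:

```python
import numpy as np

GRID_SIZE = 400      # cells per side (assumed)
RESOLUTION = 0.05    # meters per cell (assumed)
grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.uint8)

def update_map(robot_x, robot_y, robot_yaw, laser_ranges, angle_min, angle_increment):
    """Project each laser return into the grid and mark that cell as occupied."""
    for i, r in enumerate(laser_ranges):
        if not np.isfinite(r):
            continue
        angle = robot_yaw + angle_min + i * angle_increment
        # End point of the ray in world coordinates
        hit_x = robot_x + r * np.cos(angle)
        hit_y = robot_y + r * np.sin(angle)
        # World coordinates to grid indices (map origin at the grid center, assumed)
        col = int(hit_x / RESOLUTION) + GRID_SIZE // 2
        row = int(hit_y / RESOLUTION) + GRID_SIZE // 2
        if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
            grid[row, col] = 255  # occupied
```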

For now, I’ve only done the basics: registering the new exercise in the Robotics Academy database and implementing the GUI and HAL functions, using the templates and world of the first Gazebo Harmonic exercise implemented in Robotics Academy, Marker Visual Loc. The HAL and GUI functions are almost the same, minus the getImage function. I still need to implement a function so the user can show the map they are building in the frontend.

This post is licensed under CC BY 4.0 by the author.
