libgazebo_ros_moveit_planning_scene

While connecting to the Mobile Manipulation exercise, we were facing two errors, one of which was Failed to load plugin libgazebo_ros_moveit_planning_scene.so: libgazebo_ros_moveit_planning_scene.so: cannot open shared object file: No such file or directory. As mentioned in the previous blogs, this is a model plugin that broadcasts MoveIt PlanningScene messages so that the planning scene stays up to date with the world simulation. This is useful if you want to “fake” perfect perception of the environment. Yijia Wu found this plugin in the ros_simulation repository, along with the pull request in which it was integrated into ros_simulation on kinetic-devel.
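For context, once the plugin loads correctly it should start publishing moveit_msgs/PlanningScene messages. A quick way to verify that is to listen to the topic from a small rospy script; the snippet below is only a minimal sketch and assumes the plugin publishes on /planning_scene (the topic name is configurable in the plugin's URDF/SDF block, so it may differ in our setup).

#!/usr/bin/env python
# Minimal check that PlanningScene updates are arriving from the plugin.
# Assumption: the plugin publishes on /planning_scene (configurable).
import rospy
from moveit_msgs.msg import PlanningScene

def callback(msg):
    names = [obj.id for obj in msg.world.collision_objects]
    rospy.loginfo("PlanningScene update, world objects: %s", names)

if __name__ == "__main__":
    rospy.init_node("planning_scene_listener")
    rospy.Subscriber("/planning_scene", PlanningScene, callback)
    rospy.spin()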

In the meeting, we tried to resolve this error by cloning the repo and building the workspace. Fortunately, that error was resolved, but it cost us some more errors related to the world and frames. Yijia asked me to integrate just the required plugins without the rest of the packages in the repo, considering that those redundant packages might be causing these errors, but it seems that wasn't the case. I removed all the redundant plugins and tried to connect to the exercise, but those errors were still there. All of the errors are depicted in the image below:

Machine Vision exercise with Rviz

As we decided to create a separate pull request integrating Rviz into the Machine Vision exercise, I started working on it this week. After exposing the VNC display, creating a launch file, and adding some APIs to spawn the objects, Rviz was ready and working fluently. The integrated Rviz window can be seen in the image below:
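To give a rough idea of what the object-spawning APIs do, the sketch below adds a simple box to the MoveIt planning scene (which Rviz then displays) through moveit_commander. The object name, frame, and dimensions are illustrative placeholders, not the actual HAL implementation.

# Illustrative sketch: add a box to the planning scene so that Rviz displays it.
# The object name, frame_id, and size below are placeholders.
import sys
import rospy
import moveit_commander
from geometry_msgs.msg import PoseStamped

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("spawn_objects_rviz")
scene = moveit_commander.PlanningSceneInterface()
rospy.sleep(1.0)  # give the scene interface time to connect

pose = PoseStamped()
pose.header.frame_id = "base_link"  # assumed planning frame
pose.pose.position.x = 0.5
pose.pose.position.z = 0.25
pose.pose.orientation.w = 1.0
scene.add_box("demo_box", pose, size=(0.1, 0.1, 0.5))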

Testing Exercises

Yijia suggested that I test the exercises while ignoring the cob and pcl errors, to ensure that everything else works apart from them.

Machine Vision

In order to test the Machine Vision exercise, I first wrote the code for the simulation of the exercise, taking the previous year's code as reference. After resolving some syntax errors and adding some missing HAL APIs, the simulation code was ready.

from GUI import GUI
from HAL import HAL
# Enter sequential code!

while True:
    # Enter iterative code!
    HAL.back_to_home()
    HAL.send_message("Building map")
    color = "green"
    shape = "cylinder"
    HAL.start_color_filter(color, 50, 0, 255, 100, 50, 0)
    HAL.start_shape_filter(color, shape, 0.05)
    object_name = "green_cylinder"
    position = HAL.get_object_position(object_name)
    HAL.pickup(object_name, position)

On testing the exercise with the code above, I found that the arm went to the home position but didn't move from there. On checking the console, it showed Attribute error: x is of None type. This happened because the pcl node isn't working, due to which the octomap wasn't created and the object wasn't detected.
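Until the pcl node and the octomap are working, a small guard around the detection call makes the failure explicit instead of crashing inside pickup. This is only a sketch built on the same HAL calls as above; the report-and-skip behaviour is my own addition.

position = HAL.get_object_position(object_name)
if position is None:
    # Detection failed (pcl node down, no octomap): report it and skip the pick
    HAL.send_message("Object '%s' not detected, skipping pick" % object_name)
else:
    HAL.pickup(object_name, position)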

Mobile Manipulation Exercise

Below is the code that I wrote in order to test the Mobile Manipulation exercise:

from GUI import GUI
from HAL import HAL
# Enter sequential code!

while True:
    # Enter iterative code!
    HAL.back_to_home()
    target_name = "conveyor1"
    pose = HAL.get_target_pose(target_name)
    HAL.send_goal_to_client(pose)
    while HAL.get_result_from_client() != True:
        pass
    print(HAL.get_robot_pose())
    HAL.move_to_pick_place_home()
    HAL.spawn_obstacle_rviz("conveyor1")
    HAL.spawn_all_objects()
    object_name = "green_cylinder"
    pose = HAL.get_object_pose(object_name)
    pose.position.z -= 0.01
    HAL.pickup(object_name, pose.position, 0.48)
    HAL.back_to_home(False)
    target_name = "conveyor2"
    pose = HAL.get_target_pose(target_name)
    HAL.send_goal_to_client(pose)
    while HAL.get_result_from_client() != True:
        pass
    print(HAL.get_robot_pose())
    HAL.move_to_pick_place_home(False)
    HAL.spawn_obstacle_rviz(target_name)
    pose = HAL.get_target_position(target_name)
    HAL.place(object_name, pose)
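One small improvement I would consider on top of this code: the while ... pass loops busy-wait on the navigation result. Assuming get_result_from_client() simply reports whether the goal has finished, the wait can be throttled with a short sleep, as in the sketch below.

import time
from HAL import HAL

def wait_for_result(poll_period=0.2):
    # Poll the action result without pegging the CPU
    while not HAL.get_result_from_client():
        time.sleep(poll_period)

Each HAL.send_goal_to_client(pose) call would then simply be followed by wait_for_result().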

The biggest problem while testing the code for the Mobile Manipulation exercise was that the exercise did not connect every time I tried to connect to it.