GSoC Coding Week 11 Progress Report

Week 11: Exploring the New HAL API and First Pick-and-Place Workflow

This week’s focus was on testing and adapting the new HAL API that was originally developed for the Pick and Place exercise. The good news is that it can be almost directly applied to the Machine Vision exercise, with only a few minor adjustments required.

I also added back useful functions from the legacy implementation, such as the buildmap() function from pick_and_place.py, now integrated into the new HAL.py.

What I Did This Week

  1. Explored and tested the new HAL API for pick-and-place actions.
    • Verified core functions such as MoveAbsJ, MoveJoint, MoveLinear, and the link attach/detach operations (a sketch of this interface follows the list).
    • Integrated buildmap() into the HAL for workspace initialization.
  2. Implemented a hardcoded pick-and-place sequence using object and target positions.
    • The workflow covers moving to home, picking the red cylinder, and placing it in the target position.
    • Verified that the sequence works smoothly when using link attach/detach instead of a physical gripper.
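
For context, here is a minimal sketch of how the new HAL interface is shaped. The function names match the calls used later in this post, but the signatures and docstrings are my own reading of them (for instance, I interpret the two trailing numeric arguments as a velocity scaling factor and a timeout), not the actual HAL.py source.

```python
# Hypothetical outline of the HAL interface. Names match the calls used in
# this post; signatures and docstrings are illustrative assumptions.

def set_home_position(joint_angles):
    """Record the joint configuration (in degrees) used as 'home'."""

def MoveAbsJ(joint_angles, velocity, timeout):
    """Plan and execute a motion to an absolute joint configuration."""

def MoveJoint(position, orientation, velocity, timeout):
    """Joint-space motion to a Cartesian pose; used for free-space travel."""

def MoveLinear(position, orientation, velocity, timeout):
    """Straight-line Cartesian motion; used for the final approach and retreat."""

def attach(object_name):
    """Attach a scene object to the end-effector link (simulated grasp)."""

def detach():
    """Release the currently attached object."""

def buildmap():
    """Initialize the workspace map, ported from pick_and_place.py."""
```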

Problems I Encountered

  • Planning failures during placing: while the robot could reliably move to the object and pick it up, placing was problematic. No matter how I adjusted the target position, planning kept failing.

  • Gripper planning issue: motion planning for the gripper is currently failing, which is why the demo relies on link attach/detach instead of an actual grasp.

Solutions Implemented

  • After investigation, the placing failures turned out to be related to the camera placement.
    • The old setup had the camera in a position where it occasionally collided with the robot during motion planning.
    • By adjusting the camera position and angle, the planner succeeded, and the placing sequence worked correctly (a hypothetical sketch of such an adjustment follows).
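
To make the fix concrete, here is a purely hypothetical illustration of repositioning the camera as a collision object in the MoveIt2 planning scene. The message types are real (moveit_msgs, shape_msgs), but the frame name, object id, box dimensions, and pose values are placeholders rather than the ones used in the exercise.

```python
# Hypothetical sketch: re-publish the camera's collision body at a pose that
# stays clear of the arm's workspace. All numeric values are placeholders.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Pose
from shape_msgs.msg import SolidPrimitive
from moveit_msgs.msg import CollisionObject, PlanningScene


def make_camera_object():
    """Build a box-shaped stand-in for the camera at its adjusted pose."""
    camera = CollisionObject()
    camera.header.frame_id = 'world'        # placeholder frame
    camera.id = 'camera'

    box = SolidPrimitive()
    box.type = SolidPrimitive.BOX
    box.dimensions = [0.1, 0.1, 0.1]        # rough camera bounding box

    pose = Pose()
    pose.position.x = 1.2                   # moved out of the arm's sweep
    pose.position.y = 0.0
    pose.position.z = 1.5
    pose.orientation.w = 1.0

    camera.primitives.append(box)
    camera.primitive_poses.append(pose)
    camera.operation = CollisionObject.ADD  # ADD with an existing id replaces it
    return camera


def main():
    rclpy.init()
    node = Node('camera_repositioner')
    pub = node.create_publisher(PlanningScene, '/planning_scene', 10)

    scene = PlanningScene()
    scene.is_diff = True                    # apply as a diff to the current scene
    scene.world.collision_objects.append(make_camera_object())

    # Give the publisher a moment to match with MoveIt before sending.
    rclpy.spin_once(node, timeout_sec=1.0)
    pub.publish(scene)

    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```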

Example Algorithm (Hardcoded Positions)

```python
def main():
    object_pos = [0.65, 0.09, 1.01]     # Red cylinder position
    target_pos = [-0.44, -0.06, 1.0]    # Target position
    down_orientation = [0, 90, 0]       # Gripper pointing down

    set_home_position([0.0, -90.0, 0.0, 0.0, -90.0, 0.0])
    MoveAbsJ([0.0, -90.0, 0.0, 0.0, -90.0, 0.0], 0.5, 2.0)

    buildmap()

    # Pick sequence
    above_object = [object_pos[0], object_pos[1], object_pos[2] + 0.15]
    MoveJoint(above_object, down_orientation, 0.3, 2.0)
    approach_object = [object_pos[0], object_pos[1], object_pos[2] + 0.05]
    MoveLinear(approach_object, down_orientation, 0.1, 1.5)
    attach('red_cylinder')

    # Place sequence
    above_target = [target_pos[0], target_pos[1], target_pos[2] + 0.15]
    MoveJoint(above_target, down_orientation, 0.5, 2.5)
    approach_target = [target_pos[0], target_pos[1], target_pos[2]]
    MoveLinear(approach_target, down_orientation, 0.1, 1.5)
    detach()
```
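
A note on the structure: each phase first uses MoveJoint to reach a pose 0.15 m above the object or target, then a short MoveLinear segment for the final vertical approach. Splitting the motion this way is a common pick-and-place pattern: the joint-space move handles the long free-space travel, while the Cartesian move keeps the descent a straight line so the attach and detach happen at a predictable pose.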

Conclusion

Pick-and-Place with HAL API Demo

After integrating the HAL API, the pick-and-place workflow with hardcoded positions is now working in ROS2.

  • The API enables smooth motion planning, attach/detach operations, and workspace initialization.
  • Camera repositioning resolved planning failures, allowing the sequence to complete reliably.

With manipulation now functional, the next phase will be to connect perception outputs to the HAL workflow so that objects can be picked and placed dynamically.
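
As a rough preview of that hookup, the sketch below replaces the hardcoded object_pos with detections. Here get_detected_objects() is a hypothetical placeholder for the perception output, not an existing function, and the detection format is my own guess.

```python
# Hypothetical glue between perception and the HAL workflow.
# get_detected_objects() does not exist yet; it stands in for whatever the
# vision pipeline will provide (an object id plus its 3D position).
def pick_and_place_detected(target_pos, down_orientation=(0, 90, 0)):
    for obj in get_detected_objects():  # e.g. [{'id': 'red_cylinder', 'pos': [x, y, z]}]
        pos = obj['pos']
        # Same pick sequence as above, driven by the detected position
        MoveJoint([pos[0], pos[1], pos[2] + 0.15], down_orientation, 0.3, 2.0)
        MoveLinear([pos[0], pos[1], pos[2] + 0.05], down_orientation, 0.1, 1.5)
        attach(obj['id'])
        # Place at the (still fixed) target position
        MoveJoint([target_pos[0], target_pos[1], target_pos[2] + 0.15], down_orientation, 0.5, 2.5)
        MoveLinear(target_pos, down_orientation, 0.1, 1.5)
        detach()
```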

For Week 12, I plan to:

  • Update the gripper driver
  • Integrate perception outputs with manipulation to replace hardcoded positions
  • Start integrating the exercise into the JdeRobot Robotics Academy infrastructure

📍 Posted from Barcelona, Spain
🧠 Project: Migration and Enhancement of Machine Vision Exercise to ROS2 + MoveIt2