Overview
Collaborated with the University of Almería Robotics Lab (Spain) to develop perception systems for a collaborative robot (cobot) operating in greenhouse environments alongside human workers. The project built the real-time sensing foundation that enables a robot arm to safely operate near humans while performing agricultural tasks like fruit harvesting.
The work addressed a core challenge in agricultural automation: how do you deploy a capable robotic arm in an environment shared with human workers, without requiring costly safety cages or extensive workspace separation? The answer is perception — giving the robot the ability to see and respond to human presence in real time.
The Challenge
Agricultural robots operating near humans require perception systems that are simultaneously accurate, real-time, and reliable — a single failure in human proximity detection could cause injury. The challenge is compounded by the greenhouse environment itself: varying natural and artificial lighting throughout the day, partial occlusions from plant foliage, dust accumulation on lenses, and the visual complexity of the scene.
The system also needed to run efficiently on the embedded NVIDIA Jetson TX2 hardware mounted on the robot — a constrained platform compared to a desktop GPU. TensorRT optimization was essential to meet latency requirements within the available compute budget.
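A typical path for this kind of deployment is to export the trained network to ONNX and build a TensorRT engine directly on the Jetson. The command below is a hedged sketch of that step using NVIDIA's standard `trtexec` tool; the file names and the choice of FP16 precision are illustrative assumptions, not the project's recorded settings.

```shell
# Build a TensorRT engine from an ONNX model on the Jetson TX2.
# File names are placeholders; --fp16 trades a small accuracy cost
# for substantially higher throughput on the TX2's GPU.
trtexec --onnx=pose_model.onnx \
        --saveEngine=pose_model.engine \
        --fp16
```

The resulting serialized engine is then loaded at node startup, so the costly optimization happens once per device rather than on every launch.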
Technical Approach
- Human pose detection ROS2 node implemented in C++ for low-latency human proximity awareness — provides the robot arm control system with real-time human body position data
- Fruit detection ROS2 node in C++ for automated harvesting target identification — enables the robot to locate and approach fruit without manual guidance
- TensorRT model optimization for deployment on NVIDIA Jetson TX2 — models converted to TensorRT engines and tuned for maximum throughput on constrained embedded hardware
- Real-time inference on the Jetson TX2 at latencies low enough to meet the timing requirements for safe collaborative operation
- ROS2 node architecture enabling modular integration with the existing robot arm control system — perception nodes publish to standard ROS2 topics consumed by motion planning
- Safe proximity zones defined based on detected human pose positions — robot arm automatically pauses or modifies trajectory when a worker enters a defined safety radius
- Perception-driven automation: fruit detection drives harvesting targets while human detection continuously monitors for safety override conditions
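The safety-override behavior above reduces to a simple decision rule: take the closest detected human keypoint and compare its distance to the arm against concentric thresholds. The sketch below illustrates that rule in C++; the function names and the specific radii (0.5 m pause, 1.2 m slow-down) are illustrative assumptions, not the project's actual parameters.

```cpp
#include <cmath>

// Illustrative sketch of a safety-zone check driven by human pose
// detections. Thresholds and names are hypothetical examples.

// A detected keypoint position in the robot's base frame.
struct Point3 {
    double x, y, z;
};

enum class SafetyAction { Continue, SlowDown, Pause };

// Euclidean distance between two points, e.g. a human keypoint
// and the robot arm's base or end effector.
double distanceBetween(const Point3& a, const Point3& b) {
    const double dx = a.x - b.x;
    const double dy = a.y - b.y;
    const double dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Map the closest detected human distance to an action:
// inside pauseRadius  -> pause the arm immediately,
// inside slowRadius   -> slow or replan the trajectory,
// otherwise           -> continue the harvesting task.
SafetyAction classifyProximity(double closestDistance,
                               double pauseRadius = 0.5,
                               double slowRadius = 1.2) {
    if (closestDistance <= pauseRadius) return SafetyAction::Pause;
    if (closestDistance <= slowRadius)  return SafetyAction::SlowDown;
    return SafetyAction::Continue;
}
```

In the deployed system this decision would run on every perception update, with the chosen action published for the motion-planning stack to act on.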
Key Outcomes
This project was conducted in collaboration with the same University of Almería team whose MVSim simulation software was used in Teladoc's autonomous navigation development, continuing a productive ongoing research relationship.