Depth error in a harvesting robot: a case of 3D forensic analysis

Published on May 11, 2026 | Translated from Spanish

A robotic arm designed for automated fruit harvesting began systematically damaging tree trunks on a smart farm. Its stereo vision system, responsible for calculating the exact distance to the branches, was failing critically. The technical investigation found that the disparity map generated by the cameras contained optical artifacts. The goal of the forensic analysis was to determine whether sunlight reflecting off the leaves had misled the depth estimation algorithm.

[Image: Agricultural robotic arm damaging a tree trunk due to a disparity-map error caused by solar reflections on the leaves]

Forensic analysis of the disparity map with MATLAB 🛠️

The first step of the investigation was to extract the raw frames from the robot's stereo cameras. Using MATLAB and its Computer Vision Toolbox, the disparity map of the scene was reconstructed. The results showed regions with anomalous disparity values, especially in areas where the leaves reflected intense glare. The solar reflection produced saturated pixels that the stereo correspondence algorithm matched incorrectly, interpreting them as objects at a much shallower depth than they actually were. As a result, when the robotic arm computed the grasping trajectory, it programmed a gripper movement far too close to the trunk and struck the bark. The arm's kinematics were then modeled in SolidWorks, and the geometry was exported to MeshLab to clean up the mesh at the impact points, confirming that the collisions occurred systematically at the erroneous coordinates.
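The disparity reconstruction described above can be reproduced with a short script. What follows is a minimal sketch, assuming the stereo frames are already rectified; the file names (left_frame.png, right_frame.png), the focal length, and the baseline are hypothetical placeholders, not values from the actual robot.

```matlab
% Minimal sketch of the disparity-map analysis (hypothetical file names and
% camera parameters; assumes the frames are already rectified).
I1 = im2gray(imread('left_frame.png'));   % left stereo frame
I2 = im2gray(imread('right_frame.png'));  % right stereo frame

% Semi-global matching disparity map (Computer Vision Toolbox).
disparityMap = disparitySGM(I1, I2, 'DisparityRange', [0 64]);

% Flag saturated pixels, the suspected source of the artifacts:
% specular glare from the leaves drives intensities toward 255.
saturated = I1 > 250;
fprintf('Saturated pixels: %.2f %%\n', 100 * nnz(saturated) / numel(saturated));

% Depth from disparity: Z = f * B / d. With an assumed focal length and
% baseline, an artificially large disparity in a glare region collapses the
% estimated depth and pulls the planned grasp point toward the trunk.
f = 800;    % focal length in pixels (assumed)
B = 0.12;   % stereo baseline in meters (assumed)
depthMap = f * B ./ disparityMap;   % meters; NaN where the match is unreliable

% Visualize the disparity map with the saturated regions outlined.
figure;
imagesc(disparityMap, [0 64]); axis image; colormap jet; colorbar; hold on;
visboundaries(saturated, 'Color', 'w');
title('Disparity map with saturated (glare) regions outlined');
```

In the real footage, the outlined glare regions are where the anomalous, inflated disparities appeared, which is consistent with the erroneously shallow depths fed to the trajectory planner.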

Practical solutions and simulation in Unity 🎯

To prevent future incidents, two corrective measures are proposed. The first is to install polarizing filters on the stereo camera lenses to attenuate specular reflections from the sun. The second, more advanced, is to implement a monocular depth neural network as a fallback, trained to ignore glare artifacts. To validate these solutions, the entire scenario was recreated in Unity, simulating dynamic sunlight and the robotic arm's behavior. The simulation showed that, with the virtual polarizing filter enabled, the disparity map remained stable and the robot harvested the fruit without damaging the trunk. This case study reinforces the need to build vision systems that are robust to adverse environmental conditions into agricultural robotics.
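As a software-side complement to the hardware fix, the same MATLAB pipeline can invalidate the glare-affected pixels before the grasp distance is computed, which roughly approximates what the polarizing filter achieves optically. The sketch below continues from the previous script; the region of interest around the planned grasp point and the clearance threshold are hypothetical values, not the farm's actual configuration.

```matlab
% Continues from the previous sketch (disparityMap, depthMap, saturated, f, B).
% Invalidate disparity values contaminated by specular glare before using the
% map for trajectory planning -- a software analogue of the polarizing filter.
cleanDisparity = disparityMap;
cleanDisparity(saturated) = NaN;
cleanDepth = f * B ./ cleanDisparity;

% Hypothetical region of interest around the planned grasp point (pixels).
roiClean = cleanDepth(200:260, 300:360);
roiRaw   = depthMap(200:260, 300:360);

% Compare the nearest obstacle distance before and after cleaning.
minSafeDistance = 0.35;  % meters, assumed clearance to the trunk
fprintf('Nearest depth (raw):     %.2f m\n', min(roiRaw(:),   [], 'omitnan'));
fprintf('Nearest depth (cleaned): %.2f m\n', min(roiClean(:), [], 'omitnan'));

if min(roiClean(:), [], 'omitnan') < minSafeDistance
    warning('Grasp point still too close to the trunk; abort trajectory.');
end
```

A check like this only discards corrupted measurements; the polarizing filter and the monocular fallback network remain the measures that restore depth coverage in the glare regions.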

How can a 3D investigation determine whether a harvesting robot's depth error stems from defective LiDAR sensor calibration or from a misinterpretation of the trunk geometry by the computer vision model?

(PS: Simulating robots is fun, until they decide not to follow your orders.)