RIEM News

New system helps robotic arm navigate using sound instead of vision
Source: interestingengineering
Author: @IntEngineering
Published: 7/4/2025

To read the full content, please visit the original article.
Researchers at Carnegie Mellon University have developed SonicBoom, a novel sensing system that enables robotic arms to navigate and localize objects using sound rather than visual sensors. Robotic arms typically depend on cameras and camera-based tactile sensors, which can be obstructed or damaged in cluttered environments such as agricultural fields. SonicBoom addresses these challenges by embedding contact microphones along the robot's arm that pick up the sound waves generated when the arm touches objects, such as branches. By analyzing subtle variations in these sound waves with AI, the system can accurately determine the point of contact, achieving localization errors as low as 0.43 cm on trained objects and maintaining strong accuracy (2.22 cm error) even on unfamiliar materials.

This acoustic approach offers several advantages: the microphones are well protected from harsh contact, the system is more affordable and practical than camera-based tactile sensors, and it functions effectively in visually occluded environments. The researchers demonstrated SonicBoom's utility by mapping occluded branch-like structures in a mock canopy.
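The core idea, inferring where along the arm a contact occurred from how the sound differs across several microphones, can be illustrated with a toy sketch. The article does not describe SonicBoom's actual model or signal pipeline; everything below (the exponential attenuation model, the spectral features, the nearest-neighbor lookup, and all names such as `simulate_tap` and `ContactLocalizer`) is a hypothetical simplification for illustration only.

```python
import numpy as np


def simulate_tap(x, mic_positions, n_samples=1024, rng=None):
    """Toy model: a tap at position x (meters along the arm) produces a tone
    whose amplitude decays with distance to each microphone, plus noise."""
    rng = rng if rng is not None else np.random.default_rng(0)
    t = np.arange(n_samples)
    tone = np.sin(2 * np.pi * 0.05 * t)
    channels = []
    for m in mic_positions:
        amp = np.exp(-4.0 * abs(x - m))  # assumed attenuation with distance
        channels.append(amp * tone + 0.01 * rng.standard_normal(n_samples))
    return np.array(channels)  # shape: (n_mics, n_samples)


def spectral_features(signals, n_bins=16):
    """Log-magnitude spectrum of each channel, pooled into coarse bins,
    concatenated into one feature vector."""
    feats = []
    for channel in signals:
        mag = np.abs(np.fft.rfft(channel))
        pooled = [mag[i::n_bins].sum() for i in range(n_bins)]
        feats.append(np.log1p(pooled))
    return np.concatenate(feats)


class ContactLocalizer:
    """Nearest-neighbor lookup over features of taps at known positions
    (standing in for the learned AI model described in the article)."""

    def fit(self, feature_vectors, positions):
        self.X = np.array(feature_vectors)
        self.y = np.array(positions)

    def predict(self, feature_vector):
        dists = np.linalg.norm(self.X - feature_vector, axis=1)
        return self.y[np.argmin(dists)]


if __name__ == "__main__":
    mics = [0.2, 0.5, 0.8]  # assumed microphone positions along a 1 m arm
    train_positions = np.linspace(0.1, 0.9, 9)
    rng = np.random.default_rng(1)

    localizer = ContactLocalizer()
    localizer.fit(
        [spectral_features(simulate_tap(x, mics, rng=rng)) for x in train_positions],
        train_positions,
    )

    # An unseen tap at 0.33 m should resolve to the nearest trained position.
    estimate = localizer.predict(spectral_features(simulate_tap(0.33, mics, rng=rng)))
    print(f"estimated contact at {estimate:.2f} m")
```

The sketch captures the geometry of the approach: each microphone hears the same contact differently, and those differences carry enough information to recover position. The real system reportedly learns this mapping with AI and reaches sub-centimeter accuracy on trained objects.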

Tags

robotics, robotic-arm, sound-sensing, AI, tactile-sensors, agricultural-robots, obstacle-navigation