In this work, we propose a novel visual navigation method that estimates the state of mobile and fixed cold-spray material deposition systems using a stereo camera installed in the workspace. Unlike visual localization algorithms that rely on costly onboard sensors such as LiDAR, or that depend on distinct visual cues on the robot and grid markers in the environment, our method reduces the cost and complexity of the sensory setup by using a cost-effective remote stereo vision system. This allows the target system to be localized regardless of its appearance or environment, and scales to tracking and operating multiple mobile material deposition systems simultaneously. To this end, deep neural networks, kinematic constraints, and learning-aided state observers are employed to detect the deposition system and estimate its position and orientation. We further propose a physical model of the system with bounded uncertainty, fused with the remote visual sensing module; this accounts for frames in which depth estimation accuracy is reduced due to the perceptually degraded conditions of the cold-spraying context. The algorithm is evaluated on both a fixed and a mobile setup, demonstrating the accuracy and reliability of the proposed method.
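The idea of fusing a bounded-uncertainty physical model with remote visual measurements can be illustrated with a minimal Kalman-style update in which perceptually degraded depth frames receive inflated measurement noise, so the motion model dominates on those frames. This is only a sketch under assumed values; the function name, noise parameters, and constant-position motion model are illustrative and are not taken from the paper's actual observer.

```python
def kalman_fuse(x, p, z, q=0.01, r_good=0.05, r_bad=5.0, degraded=False):
    """One predict/update step of a scalar Kalman-style filter.

    x, p     : prior state estimate and its variance
    z        : stereo depth measurement for this frame
    q        : process noise of the (assumed) constant-position model
    degraded : if True, inflate measurement noise so the physical
               model dominates on perceptually degraded frames
    """
    # Predict: propagate uncertainty through the motion model.
    p = p + q
    # Update: pick measurement variance based on frame quality.
    r = r_bad if degraded else r_good
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)      # fuse the depth measurement
    p = (1.0 - k) * p        # posterior variance
    return x, p
```

On a degraded frame the gain `k` is small, so the estimate stays close to the model prediction instead of jumping to the noisy depth reading.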
