
Innovative Robot Navigation System Developed by Scientists at South Ural State University
On March 2, 2026, researchers at South Ural State University unveiled a unique robot navigation system capable of accurately locating targets even under backlighting and interference conditions.
The team has developed an innovative method for object localization that utilizes only a panoramic camera and a laser line. This technique allows robots to precisely identify targets amidst complex optical disturbances. The research, titled “Structure Light Strip Extraction and Robot Target Localization Methods Under Complex Interference Based on Monocular Omnidirectional Vision,” was published in the Ain Shams Engineering Journal, a prestigious Q1 journal recognized in the top 5% of Scopus rankings and the top 1% of Web of Science.
Imagine a robot tasked with “tracking” a target marked by a laser among numerous red objects, where strong light creates glare on surfaces. This was the challenge that South Ural State University scientists successfully addressed by developing a method based on monocular omnidirectional vision and structured light strips.
Traditionally, robots required two cameras (stereo vision) or expensive laser rangefinders (LiDAR) to estimate distances. This new technology introduces a fundamentally different approach: it needs only a panoramic camera with a 180° field of view, a 650-nanometer line laser, and an algorithm that identifies the laser line on objects in real time and converts its image coordinates into three-dimensional target positions. “The system has become more compact and reliable,” explained Ivan Kholodilin, Associate Professor in the Department of Electrical Drives, Mechatronics, and Mechanical Engineering. “The reduction in the number of sensors means fewer potential issues with calibration and synchronization. Additionally, a single panoramic camera allows the robot to view almost all areas around it, including corners.”
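The article does not give the authors' omnidirectional projection model, but the general principle behind converting a detected laser point into a distance is classic single-camera laser triangulation. The sketch below is illustrative only; the function name, the planar geometry, and the angle parameters are assumptions, not the published method.

```python
import math

def triangulate(alpha_deg, beta_deg, baseline_m):
    """Classic laser triangulation in a plane (illustrative sketch).

    alpha_deg: angle between the camera's viewing ray to the lit point
               and the camera-laser baseline.
    beta_deg:  angle between the projected laser sheet and the baseline.
    baseline_m: distance between camera and laser emitter, in meters.

    The camera, the laser, and the lit point form a triangle whose base
    is known, so the law of sines gives the range along the camera ray.
    """
    a = math.radians(alpha_deg)
    b = math.radians(beta_deg)
    # Side opposite beta is the camera-to-point range; apex angle is
    # pi - a - b, and sin(pi - a - b) == sin(a + b).
    camera_range = baseline_m * math.sin(b) / math.sin(a + b)
    # Perpendicular distance from the baseline (the "depth"):
    return camera_range * math.sin(a)
```

For example, with a 1 m baseline, a camera ray perpendicular to the baseline (alpha = 90°) and a laser sheet at 45° (beta = 45°), the lit point lies 1 m from the baseline. A real omnidirectional system replaces this flat-image geometry with the panoramic camera's calibrated projection model.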
The major achievement of this technology lies in its algorithm’s ability to operate reliably in situations where traditional methods fail. These scenarios include dark, smooth surfaces, red objects (where the laser line may “bleed” into the background), mirror reflections, and even partial overlaps of laser lines.
The algorithm can “recover” the laser trajectory by employing special multi-threshold processing in color space to filter out noise, morphological operations to “repair” broken lines, and clustering and minimum spanning tree construction to restore the continuity of the stripes, even in cases of partial overlap.
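The clustering and minimum-spanning-tree step described above can be sketched in a simplified form: treat each broken stripe fragment as a node, then connect fragments with the cheapest set of bridges using Kruskal's algorithm and a union-find structure. This is a minimal illustration of the MST idea, not the authors' published pipeline; the function names and the use of fragment centroids as cluster representatives are my assumptions.

```python
import math
from itertools import combinations

def restore_stripe(fragments):
    """Reconnect broken laser-stripe fragments (illustrative sketch).

    fragments: list of fragments, each a list of (x, y) pixel points.
    Builds a minimum spanning tree over fragment centroids (Kruskal's
    algorithm with union-find) and returns the fragment-index pairs
    that should be bridged, in the order they are added.
    """
    def centroid(pts):
        return (sum(p[0] for p in pts) / len(pts),
                sum(p[1] for p in pts) / len(pts))

    cents = [centroid(f) for f in fragments]
    # All candidate bridges, sorted by centroid distance (cheapest first).
    edges = sorted(
        (math.dist(cents[i], cents[j]), i, j)
        for i, j in combinations(range(len(fragments)), 2))

    parent = list(range(len(fragments)))
    def find(x):                       # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    bridges = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                   # edge joins two separate pieces
            parent[ri] = rj
            bridges.append((i, j))
    return bridges
```

On three collinear fragments the sketch bridges each fragment to its nearest neighbor, recovering one continuous stripe; the real method additionally filters candidates by color thresholds and repairs small gaps with morphological operations before this stage.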
The effectiveness of this method has been experimentally validated on a real robotic setup equipped with a SCARA manipulator. The results are impressive: in noisy environments, the average coordinate measurement error is only 5.57 millimeters (compared to a benchmark method with an error of 18.08 millimeters). The depth reconstruction error decreased by 69%. In actual object capture experiments, the average error was no more than 6.435 millimeters. Furthermore, the operational time for the laser localization module is approximately 0.5 seconds, which is acceptable for industrial applications.
This technology proposed by scientists at South Ural State University is particularly suited for:
- Robot sorting and handling of parts;
- Control of object positions on conveyor belts;
- Operation in production areas with harsh optical conditions (glare, reflections).
The new technology also supports import substitution, as its cost is significantly lower than that of industrial LiDAR systems. The panoramic camera and line laser together cost around 10,000 rubles, whereas a high-quality foreign LiDAR can cost several hundred thousand rubles.
Currently, the main bottleneck of the technology is object recognition by the neural network, which takes approximately 2.93 seconds (86.2% of the total processing time). The team plans to optimize this stage using more powerful graphics accelerators and compact neural network architectures to bring the system closer to real-time operation.
This achievement, led by Dr. Maxim Grigoriev and Associate Professor Ivan Kholodilin from the Department of Electrical Drives, Mechatronics, and Mechanical Engineering, aligns perfectly with the global trend of moving away from expensive sensors towards the adoption of “smart vision” and active illumination technologies, paving the way for more economical and compact robotic systems in the industrial sector.
Original article by NenPower. If reposted, please credit the source: https://nenpower.com/blog/innovative-robot-navigation-system-developed-by-south-ural-state-university-scientists-for-accurate-target-positioning-in-challenging-light-conditions/
