A high-precision visual tracing method of variable boresight for robot guidance

Significance 

Robots are programmable machines that can autonomously carry out a complex series of actions and deliver desired results. They can be guided by an external control device, or the control may be embedded within the robot itself. Consequently, the development of robot guidance technology has been at the forefront of much research. So far, various robot guidance technologies have been successfully employed in industrial setups, military expeditions, scientific exploration, and even transportation. Using advanced autonomous navigation technology, industrial robots can adapt to dynamic, unknown, and complex external environments and carry out highly repetitive and high-risk tasks with high precision. Robot guidance methods in common use include electromagnetic induction, magnetic induction, optical guidance, inertial guidance, visual guidance, and GPS, among others. Robot guidance based on computer vision is currently a popular research topic and has already penetrated various application fields. In these vision-based guidance technologies, however, the difficulty lies in how to perceive the external environment or detect the positions of unknown targets. It is therefore important to obtain high-resolution images over a sufficiently large camera field of view.

In response, many scholars have focused on developing new methods and tools around this basic principle. For instance, a hybrid imaging device based on Risley prisms was recently developed to expand the field of view while maintaining high resolution. Nonetheless, most current Risley-prism-based imaging techniques focus on expanding the imaging field of view through boresight adjustment and then performing high-resolution image stitching. To this end, researchers from the School of Mechanical Engineering at Tongji University in Shanghai: Qiao Li, Zhaojun Deng and Yang Zhang, led by Professor Anhu Li, developed a new Risley-prism-based visual tracing technology to guide robots. Their work is currently published in the research journal, Journal of the Optical Society of America A.

By placing Risley prisms in front of the camera, the field of view can be dynamically adjusted so that the camera's imageable area is greatly expanded. Bearing this in mind, the researchers proposed two real-time visual tracing strategies for dynamic targets that effectively avoid target loss and tracking instability. The deviations between the reference trajectory generated by the manipulator and the actual trajectory detected by the proposed visual system were then measured.
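The boresight-steering principle behind the device can be illustrated with the standard first-order (thin-prism, paraxial) model of a Risley-prism pair, in which each prism of wedge angle alpha and refractive index n deviates the line of sight by (n - 1) * alpha, and rotating the two prisms combines these deviation vectors. This is a generic textbook sketch, not the authors' exact formulation; the function name and the default values for n and alpha are illustrative assumptions.

```python
import math

def risley_pointing(theta1, theta2, n=1.517, alpha_deg=10.0):
    """First-order pointing model for a Risley-prism pair (illustrative).

    Each thin prism deviates the boresight by delta = (n - 1) * alpha.
    Rotating the prisms to angles theta1, theta2 (radians) steers the
    combined deviation, returned as (dx, dy) in radians.
    """
    delta = (n - 1.0) * math.radians(alpha_deg)  # deviation of one prism
    # The two deviation vectors add; co-rotation sets the azimuth,
    # counter-rotation sets the magnitude (up to 2 * delta).
    dx = delta * (math.cos(theta1) + math.cos(theta2))
    dy = delta * (math.sin(theta1) + math.sin(theta2))
    return dx, dy
```

With the prisms aligned (theta1 = theta2) the deflection reaches its maximum of 2 * delta, and with the prisms opposed (theta1 - theta2 = pi) the deflections cancel, which is why spinning the pair can sweep the camera's boresight over a wide cone.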

Results of the experiments and simulations showed that the visual tracing system could detect trajectories dynamically over a large range. Specifically, the deviations were less than 1.5% within the 250 mm motion space of the manipulator. Furthermore, the results of their case study were in good agreement with the existing literature.

In summary, by taking full advantage of the characteristics of Risley prisms, the team established a visual tracing system model that can dynamically detect a target's trajectory over a wide range. The approach uses a calibration method to obtain the manipulator's position through binocular cameras, which realizes the guidance function of the visual tracing device for the manipulator. In a statement to Advances in Engineering, the authors highlighted that the proposed method can guide robots with high precision, thereby providing a promising new option for robot navigation.
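The binocular step above can be sketched with the standard rectified-stereo pinhole model, in which a 3D point is recovered from the horizontal disparity between the two views. This is a minimal generic illustration of stereo triangulation, not the authors' calibration procedure; the function name and all parameter values are assumptions for the example.

```python
def stereo_point(u_left, u_right, v, f, baseline, cx, cy):
    """Triangulate a 3D point from a rectified stereo pair (illustrative).

    Pinhole model: depth Z = f * B / d, where d is the horizontal
    disparity between the left and right image columns, f the focal
    length in pixels, B the baseline in metres, and (cx, cy) the
    principal point shared by both rectified views.
    """
    d = u_left - u_right
    if d <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    Z = f * baseline / d                # depth from disparity
    X = (u_left - cx) * Z / f           # back-project the left-image column
    Y = (v - cy) * Z / f                # back-project the shared row
    return X, Y, Z
```

For example, with f = 700 px, a 0.1 m baseline, and a 20 px disparity, the point lies 3.5 m in front of the cameras; feeding such positions of the manipulator back to the controller is what closes the visual guidance loop.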

Figure 1: Experimental platform of the Risley-prism-based visual tracing system for robot guidance

About the author

Anhu Li received his Ph.D. in Optical Engineering from the Shanghai Institute of Optics and Fine Mechanics, Chinese Academy of Sciences. He is currently a full Professor at Tongji University, China. He has led over ten major scientific research projects in the field of laser tracking and has developed many precision optical tracking and test devices. He has published a scientific monograph with Springer Nature and more than 80 articles in international journals, including Optics Letters, Optics Express, Journal of Lightwave Technology, Applied Optics, Journal of the Optical Society of America A, Optics and Laser Technology, Optical Engineering, and Optics Communications. He is an active reviewer for over twenty international journals. His research interests include laser tracking, vision measurement, optical instruments, and robot measurement and control.

About the author

Qiao Li received the B.S. degree from Qingdao University of Science and Technology, Qingdao, China, in 2017. He is currently working toward the M.S. degree at Tongji University, Shanghai, China. His current research interests include optical scanning devices and computer vision.

About the author

Zhaojun Deng received the B.S. and M.S. degrees from Harbin University of Commerce and Northeast Forestry University, China, in 2015 and 2018, respectively. He is currently working toward the Ph.D. degree at Tongji University, Shanghai, China. His current research interests include optical scanning devices and computer vision.

About the author

Yang Zhang received the B.S. and M.S. degrees from Tongji University, Shanghai, China, in 2016 and 2019, respectively. His current research interests include optical scanning devices and computer vision.

Reference

Anhu Li, Qiao Li, Zhaojun Deng, and Yang Zhang. Risley-prism-based visual tracing method for robot guidance. Journal of the Optical Society of America A, Volume 37, No. 4.

