Significance
Robots are programmable machines that can autonomously carry out complex series of actions and yield desired results. The control may come from an external device or be embedded within the robot itself. Consequently, the development of robot guidance technology has been at the forefront of research. To date, various robot guidance technologies have been successfully employed in industrial setups, military expeditions, scientific exploration, and even transportation. Using advanced autonomous navigation technology, industrial robots can adapt to dynamic, unknown, and complex external environments and carry out highly repetitive and high-risk tasks with high precision. Robot guidance methods commonly in use include electromagnetic induction, magnetic induction, optical guidance, inertial guidance, visual guidance, and GPS, among others. Robot guidance based on computer vision is a particularly popular research topic and has already penetrated various application fields. In vision-based guidance, however, the difficulty lies in how to perceive the external environment or detect the positions of unknown targets. It is therefore important to obtain high-resolution images over a sufficiently large camera field of view.
In response, many scholars have focused on developing new methods and tools around this basic principle. For instance, a hybrid imaging device based on Risley prisms was recently developed to expand the field of view while maintaining high resolution. Nonetheless, most current Risley-prism imaging techniques focus on expanding the imaging field of view through boresight adjustment and then stitching the images at high resolution. To this end, researchers from the School of Mechanical Engineering at Tongji University in Shanghai: Qiao Li, Zhaojun Deng, and Yang Zhang, led by Professor Anhu Li, developed a new Risley-prism-based visual tracing technology to guide robots. Their work is currently published in the Journal of the Optical Society of America A.
By placing Risley prisms in front of the camera, its field of view can be dynamically adjusted so that the imageable area is greatly expanded. With this in mind, the researchers proposed two real-time visual tracing strategies for dynamic targets that effectively avoid target loss and tracking instability. The deviations between the reference trajectory generated by the manipulator and the actual trajectory detected by the proposed visual system were then measured.
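The pointing geometry behind this field-of-view steering can be sketched with the standard first-order (thin-prism, paraxial) Risley model: each prism deflects the boresight by δ = (n − 1)α, and the two deflections add vectorially as the prisms rotate. The refractive index and wedge angle below are illustrative assumptions, not values from the paper:

```python
import math

def risley_deviation(theta1_deg, theta2_deg, n=1.517, alpha_deg=10.0):
    """First-order pointing model for a pair of identical Risley prisms.

    Each prism deflects the boresight by delta = (n - 1) * alpha
    (small-angle approximation); the total deflection is the vector
    sum of the two prisms' contributions at rotation angles theta1, theta2.
    Returns (deflection magnitude, azimuth), both in degrees.
    """
    delta = (n - 1.0) * alpha_deg  # per-prism deviation, thin-prism approx.
    t1, t2 = math.radians(theta1_deg), math.radians(theta2_deg)
    x = delta * (math.cos(t1) + math.cos(t2))
    y = delta * (math.sin(t1) + math.sin(t2))
    return math.hypot(x, y), math.degrees(math.atan2(y, x))

# Aligned prisms double the deflection; counter-rotating one prism by
# 180 degrees cancels it, returning the boresight to center.
print(risley_deviation(0, 0))    # maximum deflection, zero azimuth
print(risley_deviation(0, 180))  # near-zero deflection
```

Sweeping the two rotation angles scans the camera boresight over a cone, which is what lets a narrow high-resolution field of view cover a much larger imageable area.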
Results of the experiments and simulations showed that the visual tracing system could detect trajectories dynamically over a large range. Specifically, the deviations were less than 1.5% within the 250 mm motion space of the manipulator. Further, results from the case study were in good agreement with existing literature.
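The sub-1.5% figure can be read as a trajectory error normalized by the workspace size. A minimal sketch of such a metric, assuming the reference and detected trajectories are given as matched lists of 3-D points (the function name and normalization choice are illustrative, not taken from the paper):

```python
import math

def trajectory_deviation_pct(reference, detected, workspace_mm=250.0):
    """Largest point-wise gap between matched reference and detected
    trajectory points, as a percentage of the workspace size.

    reference, detected: equal-length lists of (x, y, z) points in mm.
    """
    worst = max(math.dist(r, d) for r, d in zip(reference, detected))
    return 100.0 * worst / workspace_mm

# A detected trajectory that is 3 mm off at one point of a 250 mm workspace:
ref = [(0.0, 0.0, 0.0), (100.0, 0.0, 0.0), (200.0, 0.0, 0.0)]
det = [(0.0, 0.0, 0.0), (100.0, 3.0, 0.0), (200.0, 0.0, 0.0)]
print(trajectory_deviation_pct(ref, det))  # 1.2
```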
In summary, by taking full advantage of the characteristics of Risley prisms, the authors established a visual tracing system model that can dynamically detect a target's trajectory over a wide range. Their approach uses a calibration method to obtain the manipulator's position through binocular cameras, which realizes the guidance function of the visual tracing device for the manipulator. In a statement to Advances in Engineering, the authors highlighted that the proposed method can guide robots with high precision, thereby providing an improved potential method for robot navigation.
Reference
Anhu Li, Qiao Li, Zhaojun Deng, Yang Zhang. Risley-prism-based visual tracing method for robot guidance. Journal of the Optical Society of America A, Volume 37, No. 4.