With the ever-rising global population, modernizing agricultural practices is key to ensuring sufficient food production. Accordingly, the application of robotics to planting, weeding, and even harvesting has greatly improved modern agriculture. On large farms, manual weeding is impractical due to prohibitive labor costs and regulations. Automated weed control systems therefore offer great potential to reduce both the economic and environmental costs of weed management.
Presently, four categories of weed removal mechanisms and three approaches to guiding and controlling the weeding tools are available. So far, one approach stands out: crop-row following based on machine vision and real-time kinematic GPS for weeding-tool guidance. Unfortunately, its limited ability to distinguish crop plants from weeds means it cannot be applied to intra-row weed control. Stereo vision has been proposed to resolve this shortcoming; however, correspondence-searching problems caused by the lack of leaf texture, the complexity of the canopy structure, occlusion, and variation in sunlight conditions have strongly hindered its application and must therefore be resolved.
Recently, Iowa State University researchers Dr. Ji Li and Professor Lie Tang of the Department of Agricultural and Biosystems Engineering developed a novel system to overcome the canopy occlusion and illumination variation problems encountered in modern weed control systems. Specifically, they set out to apply a 3D imaging sensor and develop corresponding machine vision algorithms to discriminate crop plants from weeds under challenging field conditions where weed infestation was more severe than normal. Their work is currently published in the Journal of Field Robotics.
In brief, the researchers began by selecting a 3D time-of-flight (ToF) camera as the sensor. Next, they built the data collection system by mounting the 3D sensor and a computer at a fixed height. Broccoli and green bean plants growing under complex field conditions served as the study objects. They then used the system to collect images of the plants in the field. Lastly, they developed separate segmentation algorithms for the broccoli and the green bean plants based on their 3D geometry and 2D amplitude characteristics.
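The core thresholding idea behind combining 3D geometry with 2D amplitude can be illustrated with a minimal sketch. This is not the authors' algorithm: the function, thresholds, and synthetic arrays below are hypothetical, chosen only to show how a ToF frame's depth channel (plants stand closer to a downward-facing camera than the soil) and amplitude channel (leaves return a stronger near-infrared signal) might jointly produce a plant mask.

```python
import numpy as np

def segment_plants(depth, amplitude, depth_max=0.6, amp_min=0.2):
    """Naive foreground segmentation for one ToF frame.

    A pixel is labeled 'plant' when it is closer than depth_max
    (vegetation rises above the soil plane) AND its amplitude
    exceeds amp_min (leaves reflect NIR strongly). Both thresholds
    are hypothetical placeholders.
    """
    return (depth < depth_max) & (amplitude > amp_min)

# Synthetic 4x4 frame with a plant patch in the top-left corner.
depth = np.array([[0.4, 0.4, 0.9, 0.9],
                  [0.4, 0.5, 0.9, 0.9],
                  [0.9, 0.9, 0.9, 0.9],
                  [0.9, 0.9, 0.9, 0.9]])
amplitude = np.array([[0.8, 0.7, 0.1, 0.1],
                      [0.9, 0.6, 0.1, 0.1],
                      [0.1, 0.1, 0.1, 0.1],
                      [0.1, 0.1, 0.1, 0.1]])

mask = segment_plants(depth, amplitude)
print(mask.sum())  # 4 pixels classified as plant
```

In a real pipeline this binary mask would only be a first step; the published system goes much further, using per-crop 3D geometric and amplitude features to separate crop plants from weeds rather than simple global thresholds.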
The authors established detection rates of 88.3% for broccoli and 91.2% for green bean under weedy conditions. From this they concluded that the low spatial resolution of the ToF camera limited the achievable crop plant detection rate and segmentation accuracy. They also noted that the 2D and 3D machine vision algorithms they had developed were highly optimized: the system processed images at over 30 frames per second for both crop types.
In summary, the Ji Li and Lie Tang study presented a new green bean and broccoli plant detection system based on a 3D ToF camera for automated weeding applications. In general, their system produced excellent results, although a design review of its sunlight-blocking capability could yield further improvement. Altogether, the 3D imaging-based crop plant recognition exhibited here holds promising potential for automated robotic weeding applications.
Ji Li, Lie Tang. Crop recognition under weedy conditions based on 3D imaging for robotic weed control. Journal of Field Robotics. 2018; 35: 596-611.