Review of Current Robotic Approaches for Precision Weed Management

The goal of this review is to provide an overview of current robotic approaches to precision weed management. This includes an investigation into applications within this field during the past 5 years, identifying the major technical areas that currently preclude more widespread use and the key topics that will drive future development and utilisation.

Recent Findings

Studies combining computer vision with traditional machine learning and deep learning are driving progress in weed detection and robotic approaches to mechanical weeding. By integrating key technologies for perception, decision-making, and control, autonomous weeding robots are emerging quickly. These robots reduce labour requirements while limiting the environmental pollution caused by pesticide use.

Summary

This review assesses different weed detection methods and weeder robots used in precision weed management and summarises the trends in this area in recent years. The limitations of current systems are discussed, and ideas for future research directions are proposed.


Introduction

According to the United Nations (UN), all countries must consider and address a common problem: global population growth, which is expected to reach nearly 10 billion by 2050 [1]. Population growth requires farmers to adapt how they control, manage, and monitor their farms to meet the growing demand for food [2]. According to [3••], food yields need to increase by 70%. In addition, billions of people worldwide are at risk from unsafe food, with millions becoming sick and hundreds of thousands dying annually [4]. This underscores the need for higher standards in agricultural production, the foundation of the food industry. However, many problems need resolving, such as the reduction of cultivated land and the loss of available labour. Other issues, including climate change [5], water pollution [6], and weeds, also affect agricultural productivity.

Weed Detection

Weeds are unwanted plants that grow on farmland and compete with crops for nutrients, space, and sunlight. If not removed, they obstruct crop growth, causing a reduction in crop yield and, consequently, a reduction in profit for farmers [7]. Therefore, weed control is an important means of improving crop productivity. Currently, large-scale spraying of pesticides is the most widely used weed control method, but this wastes resources and causes environmental pollution [8]. Therefore, the design of a weeding system that reduces pesticide use is urgently needed. Current weeding robots are designed around real-time image detection, as the early identification and control of weeds is paramount. The experiment in [9] investigated the significance of intervention timing: late treatment commencing at week 6 resulted in a weed survival probability of 0.54±0.08, versus 0.24±0.18 for earlier intervention at week 4.

Typical site-specific weed management (SSWM) includes four processes:

  1. Data collection: Use different equipment for data collection.
  2. Detection: Detect weeds through appropriate sensors to provide real-time data such as the location, area, and type of weeds, or generate a weed map.
  3. Weeding: Choose suitable methods and pesticides for weeding according to the above information.
  4. Evaluation: Evaluate the weeding effect for subsequent improvement (the whole loop is sketched below).
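
For illustration, the four processes can be read as one closed loop. The sketch below is a minimal, hypothetical Python rendering of that loop; all object and method names (sensor.capture, detector.detect, actuator.treat) are placeholders introduced here for illustration, not an existing API.

```python
# Minimal, hypothetical sketch of the four-stage SSWM loop described above.
def sswm_cycle(sensor, detector, actuator):
    # 1. Data collection: grab a frame from the chosen sensor.
    image = sensor.capture()
    # 2. Detection: locate weeds and report position, area, and species,
    #    e.g. [{'xy': (412, 300), 'species': 'amaranth'}, ...].
    detections = detector.detect(image)
    # 3. Weeding: pick a treatment per detection (spray dose, laser, hoe).
    for weed in detections:
        actuator.treat(weed['xy'], method='micro_spray')
    # 4. Evaluation: re-image the scene and log it for later improvement.
    follow_up = sensor.capture()
    return detections, follow_up
```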

For data collection, optical sensors are the most widely used technology in weed recognition [10]. Various types of sensors, including machine vision, visible and near-infrared (Vis–NIR) spectroscopy [11, 12], multi-/hyper-spectral imaging [13,14,15,16], and distance sensing techniques [17], have been tested. These sensors can be grouped into two categories: airborne remote sensing and ground-based techniques. The former commonly uses sensors mounted on balloons, airplanes, unmanned aerial vehicles (UAVs) [18,19,20,21], and satellites for data acquisition. The obtained images are then analysed off-line to generate weed maps for subsequent SSWM operations. Remote sensing techniques are helpful in map-based SSWM as they are well suited to larger areas, but they are not a real-time process and have a lower spatial resolution than ground-based techniques. Ground-based techniques collect and promptly process weed information, enabling real-time SSWM operation. Crop signalling [22, 23] is another method that can be applied to crops to aid weed detection.

Weed detection plays a key role in SSWM, since it provides the information necessary for successive procedures and sets the upper limit on weeding performance. Some early studies focused on the efficacy and reliability of using different light spectra and simple image processing techniques [24]. Thanks to advances in sensors, computational power, and algorithms, several breakthroughs have been made in weed detection within the past few years. Detection is based on using multiple labelled plant images to teach a model to distinguish desirable crops from weeds, recognise patterns in weed distribution, and identify weed edges/boundaries.

Weeding Robots

Agriculture still relies heavily on a human workforce, which can be affected by health problems such as the worldwide public health crisis generated by the coronavirus pandemic (COVID-19). In addition to causing many deaths around the world, the pandemic has imposed several forms of restriction on agricultural activities, including weed management. Thus, an effective weeding method is urgently needed. In [25], Lati et al. compared Robovator weed control with hand weeding, and results showed that robotic cultivators can reduce the dependency on hand weeding. Technological advancements and price reductions in these types of machines will improve their weed removal efficacy. In [26], Jha et al. studied various expert and wireless systems employed in agricultural sectors, including weed management. Results showed that machine learning is necessary for sustainable development in the farming sector. In [27], Zha conducted a review of AI use in soil and weed management with IoT technologies, and demonstrated that computer vision algorithms, such as deep belief networks (DBNs) and convolutional neural networks (CNNs), show promise in fruit classification and weed detection in complex environments. More specifically, environments with varying ambient lighting, background complexity, capture angles, and fruit/weed shapes and colours were explored.

Challenges

Due to the very complex agricultural environment, including but not limited to illumination, occlusion, and different growth stages under field conditions, weed identification performance is often unsatisfactory. To address these problems, transfer learning, model reuse, self-supervised, and even unsupervised methods are employed. Most methods employ supervised learning and require large amounts of annotated data for optimal performance. However, because of the complex agricultural environment and the time-consuming nature of image annotation, there are very few public image datasets.

Paper Organisation

This review is organised as follows: a brief overview of weed detection based on machine and deep learning is presented in "Weed Detection Methods." "Currently Emerging Weeding Robots" describes various types of weeding robots. An overall discussion is presented in "Discussion," which considers the remaining challenges, limitations, and recommendations. Finally, "Conclusion" concludes this review.

Weed Detection Methods

As the cost of labour has increased, and people have become more concerned about health and environmental issues, site-specific weed management (SSWM) has become attractive. The essential first step in developing SSWM is detecting and recognising weeds. Weed detection methods can be divided into two categories: machine learning (ML) and deep learning (DL). Figure 1 shows the difference between ML and DL. In this section, methods based on ML and DL are introduced.

[Figure 1: The difference between machine learning and deep learning]

Machine Learning Methods

In the early stages, many scholars used traditional ML algorithms to classify weeds and crops. A typical ML-based weed detection technique involves five steps [24, 28, 29]: data acquisition, pre-processing, segmentation, feature extraction, and classification. [30] reviewed plant image segmentation methods from 2008 to 2015 and highlighted the advantages and disadvantages of colour index-based and threshold-based approaches.

Since traditional machine learning has low computing power and data quantity requirements, and its results and visual features are easily understood, it is the most commonly used weed detection method. The visual characteristics that differentiate crops from weeds can be divided into four categories: texture [31,32,33,34], colour [33, 35], shape [36], and spectral features [37,38,39,40,41,42].
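
As an illustration of the texture category, the sketch below computes gray-level co-occurrence matrix (GLCM) descriptors with scikit-image, a common way such features are extracted. It is a generic example under the assumption of an 8-bit grayscale plant patch, not the exact procedure of any cited study.

```python
# A minimal sketch of GLCM texture feature extraction with scikit-image.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_patch: np.ndarray) -> np.ndarray:
    """gray_patch: 8-bit (uint8) grayscale image patch of a plant region."""
    # Co-occurrence of pixel pairs at distance 1 in four directions.
    glcm = graycomatrix(gray_patch, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    # Scalar texture descriptors, averaged over the four directions.
    props = ['contrast', 'homogeneity', 'energy', 'correlation']
    return np.array([graycoprops(glcm, p).mean() for p in props])
```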

Some unsupervised learning methods are also used in weed detection. A clustering approach without prior knowledge was proposed in [43], which eliminates the need for the system to be retrained for fields with different weed species. The similarity between weeds and crops, particularly during early growth, makes reliable identification using a single feature impossible. Therefore, researchers use multi-feature fusion to identify weeds with great success. The following subsection introduces multi-feature fusion methods used during the last 5 years.

Multi-feature Fusion

Gai et al. [35] fused colour and depth images to detect and localise crops in their early growth stages. Ahmad et al. [31] combined edge orientation and shape matrix histograms computed using Sobel filters, together with the feature coverage of each cell, to detect weeds during their early growth stages, and achieved a classification accuracy of 98.40%.

Lin et al. [44] combined 11 features, including four spectral features, three spatial features, and three texture features, to identify eight plant species. Sabzi et al. [45] extracted eight texture features based on the gray-level co-occurrence matrix (GLCM), two spectral texture descriptors, thirteen different colour features, five moment-invariant features, and eight shape features. Zou et al. [46••] extracted six features, including HOG, rotation-invariant local binary patterns (LBP), Hu invariant moments, the gray-level co-occurrence matrix (GLCM), and the gray-level gradient co-occurrence matrix (GGCM). A classifier was then employed to identify crop seedlings and weeds.
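
In the spirit of the fusion schemes above, the sketch below concatenates colour statistics, a HOG shape descriptor, and an LBP texture histogram into a single vector for a downstream classifier. It is a generic illustration (patches assumed to be RGB arrays of roughly 64×64 pixels or larger), not a reimplementation of [44,45,46••].

```python
# Hedged sketch of multi-feature fusion: colour + shape (HOG) + texture (LBP).
import numpy as np
from skimage.color import rgb2gray
from skimage.feature import hog, local_binary_pattern

def fused_features(rgb_patch: np.ndarray) -> np.ndarray:
    gray = rgb2gray(rgb_patch)
    # Colour: per-channel mean and standard deviation (6 values).
    colour = np.concatenate([rgb_patch.mean(axis=(0, 1)),
                             rgb_patch.std(axis=(0, 1))])
    # Shape/edge: histogram of oriented gradients over the patch.
    shape = hog(gray, orientations=9, pixels_per_cell=(16, 16),
                cells_per_block=(2, 2))
    # Texture: histogram of uniform LBP codes (values 0..9 for P=8).
    lbp = local_binary_pattern(gray, P=8, R=1, method='uniform')
    texture, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    return np.concatenate([colour, shape, texture])
```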

Classifier

As mentioned earlier, random forest (RF) [47], Bayesian decision [48], K-means [49], SVM [50, 51], and k-nearest neighbour (KNN) [52] have been widely used for weed and crop classification [53]. Other algorithms, including naive Bayes [54], artificial neural networks (ANNs), and AdaBoost [31], have also been used in weed detection. [55] compared the performances of ConvNets, SVM, RF, and AdaBoost. Results showed that methods based on ConvNets achieved excellent results, with higher than 98% accuracy in the classification of all classes. [56] proposed a system that performs vegetation detection, feature extraction, random forest classification, and smoothing via a Markov random field to obtain accurate crop and weed estimates. A weed detection system based on linear and quadratic classifiers was developed to target goldenrod weed in [57]. Experimental results showed that the proposed system has potential as a targeted application for goldenrod control.
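
A minimal sketch of how such classifiers are typically compared on pre-extracted feature vectors is given below, using scikit-learn. The feature matrix X and labels y are assumed inputs (e.g. from the fusion sketch above); this is a generic comparison harness, not the protocol of any cited study.

```python
# Compare several classifiers named above (RF, SVM, KNN) by cross-validation.
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def compare_classifiers(X, y):
    # X: (n_samples, n_features) feature matrix; y: 0 = crop, 1 = weed.
    models = {
        'RF': RandomForestClassifier(n_estimators=200, random_state=0),
        'SVM': make_pipeline(StandardScaler(), SVC(kernel='rbf')),
        'KNN': make_pipeline(StandardScaler(), KNeighborsClassifier(5)),
    }
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5)  # 5-fold accuracy
        print(f'{name}: {scores.mean():.3f} +/- {scores.std():.3f}')
```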

Deep Learning Methods

Designing features manually requires prior knowledge and expertise, which limits the accuracy improvements achievable with machine learning. Furthermore, with the substantial increase in computing power and the availability of large amounts of training data, neural networks can independently learn features and automatically optimise the weights of each layer, significantly improving DL performance. Following recent DL achievements, it is logical to apply it to weed detection using machine vision. Many classical neural network architectures, such as ResNet [58], DenseNet, and GoogLeNet, have achieved state-of-the-art performance. According to whether the input data lies in Euclidean space, deep learning can be divided into two categories: convolutional neural networks (CNNs) [59, 60•, 61] and graph neural networks (GNNs) [62].
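
For concreteness, the sketch below shows a compact CNN classifier of the kind used for crop/weed image classification, written in PyTorch. It is purely illustrative (architecture and sizes are our own choices), far smaller than the ResNet/DenseNet/GoogLeNet family named above.

```python
# Illustrative compact CNN for binary crop/weed image classification.
import torch
import torch.nn as nn

class SmallWeedNet(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # global pooling: accepts any input size
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

logits = SmallWeedNet()(torch.randn(4, 3, 128, 128))  # -> shape (4, 2)
```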

According to whether the data is labelled, deep learning can be divided into supervised learning, semi-supervised learning, and unsupervised learning. Several supervised and semi-supervised methods have recently emerged; these two categories are introduced in the following two subsections.

Supervised Learning

Supervised learning is the current mainstream deep learning approach used to detect weeds and crops. You et al. [63] proposed an improved semantic segmentation network integrating a hybrid dilated convolutional layer and DropBlock to enlarge the receptive field and learn robust features. Lottes et al. [64] exploited a fully convolutional network integrating sequential information to encode the spatial arrangement of plants in a row using 3D convolutions over an image sequence. Hu et al. [65•] designed a graph-based deep learning architecture, namely Graph Weeds Net (GWN), which involves multi-scale graph representations that precisely characterise weed patterns. Zou et al. [46••] designed a simplified U-Net, which was trained on a synthesised training set; the F1-score achieved on the test set was 93.59%. Lottes et al. [66] presented a novel system using an end-to-end trainable fully convolutional network to estimate the stem locations of weeds, which enables robots to perform precise mechanical treatment. In [67], three deep convolutional neural architectures, including DetectNet, GoogLeNet, and VGGNet, were evaluated for their weed detection capability in bermudagrass, and VGGNet was found most effective at detecting multiple broadleaf weed species at different growth stages.
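
The sketch below illustrates the receptive-field idea behind hybrid dilated convolutions of the kind used in [63]: stacking 3×3 convolutions with increasing dilation rates widens spatial context without downsampling. The channel width and rate schedule here are our own illustrative choices, not the architecture from [63].

```python
# Stacked 3x3 convolutions with dilation rates 1, 2, 3 (a common HDC pattern):
# the effective receptive field grows 3 -> 7 -> 13 pixels, with no pooling.
import torch
import torch.nn as nn

hybrid_dilated_block = nn.Sequential(
    nn.Conv2d(64, 64, kernel_size=3, padding=1, dilation=1), nn.ReLU(),
    nn.Conv2d(64, 64, kernel_size=3, padding=2, dilation=2), nn.ReLU(),
    nn.Conv2d(64, 64, kernel_size=3, padding=3, dilation=3), nn.ReLU(),
)

x = torch.randn(1, 64, 96, 96)
assert hybrid_dilated_block(x).shape == x.shape  # spatial size is preserved
```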

Semi-supervised Learning

Supervised deep learning methods require large training datasets with ground-truth annotations, which take considerable time and effort to build; this has become the main obstacle to supervised learning. Unsupervised learning is an approach that does not need annotated images. Semi-supervised methods, which need only a small amount of labelled data, lie between the two.

Pseudo-labelling uses a model trained on labelled data to make predictions on unlabelled data, filters samples based on the predicted results, and then feeds them back into the model for training. Zou et al. [46••] constructed a library of real in-situ plant images, which were then cut and randomly pasted onto a prepared soil image, with rotations, to build a synthetic image dataset that can be used to train neural architectures.
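
One round of the pseudo-labelling cycle just described can be sketched as follows. The sketch assumes a scikit-learn-style classifier exposing predict_proba (e.g. a random forest); the confidence threshold is an assumed hyperparameter, not a value from any cited study.

```python
# A minimal sketch of one pseudo-labelling round.
import numpy as np

def pseudo_label_round(model, X_labeled, y_labeled, X_unlabeled, thr=0.95):
    model.fit(X_labeled, y_labeled)              # train on labelled data
    proba = model.predict_proba(X_unlabeled)     # predict on unlabelled data
    keep = proba.max(axis=1) >= thr              # filter: keep confident samples
    X_new = np.vstack([X_labeled, X_unlabeled[keep]])
    y_new = np.concatenate([y_labeled, proba[keep].argmax(axis=1)])
    model.fit(X_new, y_new)                      # retrain on the enlarged set
    return model, keep
```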

Lottes et al. [68] observed that most crops are planted in rows and exploited this arrangement to adapt a vision-based classifier to a new field with little additional labelling effort. Building on this idea, Bah et al. [18] and Louargant et al. [69] analysed crop rows to identify inter-row weeds and crops, constructed a training dataset that does not require manual annotation, and then trained CNNs on it to build a model able to detect crops and weeds.

Khan et al. [70] proposed a semi-supervised generative adversarial network (SGAN) for crop and weed classification in the early stages of growth, achieving an average accuracy of 90% when 80% of the training data was unlabelled. Jiang et al. [71] proposed a CNN feature-based graph convolutional network that combines the features of unlabelled vertices with those of nearby labelled ones; the problem of weed and crop recognition was thus transferred to semi-supervised learning on a graph, reducing manual effort.

Currently Emerging Weeding Robots

Agricultural robots, or agribots, are robots used in agriculture. It is commonly believed that progress in robotics science and engineering may soon change the face of farming, and global spending and research in this area are experiencing near-exponential growth [72]. This section summarises the robots listed in Table 1 that have been used commercially and in research within the last 5 years.

[Figure 2: Commercial and research weeding robots (a–h)]

The Laser Weeder [75], shown in Fig. 2a, is designed for row-crop farms ranging from 200 to tens of thousands of acres. A single robot can weed 15–20 acres per day and replace several hand-weeding crews. The robots have undergone beta testing on specialty crop farms and multiple crop fields, including broccoli and onions.

EcoRobotix [74], shown in Fig. 2b, is a prototype designed in Switzerland for spot thinning and weeding. It is equipped with a computer system that identifies weeds and sprays the detected weeds with a small dose of herbicide; it is powered by a solar-charged battery mounted on its upper surface. According to the manufacturers of EcoRobotix, their prototype can reduce the volume of required herbicides by 20 times compared to conventional spray systems. A later robot named AVO [76], shown in Fig. 2c, was designed to perform autonomous weeding operations in open fields and row crops. Using machine learning, the robot detects and selectively sprays the weeds with a micro-dose of herbicide. The centimetre-precise detection and spraying reduce the herbicide volume by more than 95%, while ensuring crops are not sprayed, therefore preserving yield.

Franklin Robotics [73] designed a new gardening robot named Tertill. Its onboard sensors allow it to recognise weeds, which it cuts using small scissors. It uses solar energy as its power source and is waterproof, making it suitable for both gardens and farmland.

EarthSense [78] developed a small agricultural robot, TerraSentia, shown in Fig. 2d, for autonomous weed detection. Bonirob [81], developed by Bosch Deepfield Robotics, is credited with eliminating some of the most tedious tasks in modern farming: planting and weeding. The autonomous robot is built to function as a mobile plant lab, able to decide which strains of plant are most able to survive insects and viruses, and how much fertiliser they need, before smashing any weeds with a ramming rod.

State-of-the-Art in Research

In the past 5 years, several papers on weeding robots have been published. According to our assessment, these robots are still a long way from practical application.

Xiong et al. [82] developed a prototype robot equipped with a dual gimbal. The robot was able to detect weeds in indoor environments, carry lasers with which to target weeds, and control the platform in real time, enabling continuous weeding. Tests indicated that with a laser traversal speed of 30 mm/s and a dwell time of 0.64 s per weed, the robot achieved a high hit rate of 97%.

Sujaritha et al. [83] designed a weed-detecting robotic prototype using a Raspberry Pi micro-controller and suitable input/output subsystems such as cameras, small light sources, and powered motors. The prototype correctly identified sugarcane crop among nine different weed species using rotation- and scale-invariant texture analysis with a fuzzy real-time classifier. The system detects weeds with 92.9% accuracy and a processing time of 0.02 s.

Utstumo et al. [84] demonstrated an autonomous robot platform, Adigo, shown in Fig. 2e. The robot was designed for the specific task of drop-on-demand herbicide application and effectively controlled all weeds in the field trial with a ten-fold reduction in herbicide use.

The Ladybird was designed by the University of Sydney. As its name indicates, it is shaped like a ladybird; the shell shades the vision system on the underside of the robot, preventing it from being affected by changing light conditions. The wings can be raised or lowered to accommodate varying crop heights. To control weeds in the field, the Ladybird robot was fitted with a spraying end-effector attached to a 6-axis robotic arm. When the machine learning algorithm identifies a weed, the coordinates of the weed are transferred to the intelligent robotic system, which then positions itself directly over the weed. Once in position, a small, controlled volume of herbicide spray is fired at the weed exactly where it is required. The University of Sydney also proposed a smaller robot named RIPPA, which is the prototype for the commercial version.

Bawden et al. [77] developed a modular weeding robot called AgBotII. It identifies crops and weeds using LBP and covariance features in its image processing stage and removes weeds using three different tools: an arrow-shaped hoe, a toothed tool, and a cutting tool. This platform can already be used in commercial applications. To comprehensively exploit the advantages of multiple robots, Pretto et al. [85] provided an adaptive solution combining the aerial survey capabilities of a small, autonomous UAV with a multi-purpose agricultural unmanned ground vehicle.

Sori et al. [79] proposed a weeding robot, shown in Fig. 2f, that can navigate a paddy field autonomously while weeding. The robot removes weeds by churning up the soil and inhibits weed growth by blocking off sunlight. Thorvald [88], shown in Fig. 2h, is a family of different robots rolled into one, all built from the same basic modules and reconfigurable using only basic hand tools. A prototype robot named Sinobot was designed for weeding. As shown in Fig. 2g, the prototype, equipped with four independently steered wheels, can automatically plan routes and supports remote control.

Discussion

Having surveyed existing weed detection methods and weeder robots, this section discusses the research trends, common difficulties, and, finally, the desirable prerequisites for mechanical weeder robots.

The Challenges of Weed Detection

Changing Weather and Light Conditions

In actual production environments, to take full advantage of these technologies, robots that can run 24/7 are required, so weed recognition methods need to adapt to a variety of working environments, particularly different lighting and weather conditions. This places higher demands on data pre-processing methods and sensors.

Severe Occlusion

Plants grow randomly, which presents several issues, including occlusion. During actual weeding operations, to achieve better results, the robot must avoid crops that shade other plants and operate only on weeds, even when they are occluded. Therefore, beyond weed identification, the occlusion relationships between plants must be distinguished. Dyrmann et al. [89] used a fully convolutional neural network to detect single weeds in cereal fields despite heavy leaf occlusion.

Large-Scale Dataset

Large-scale datasets are essential for developing high-performance and robust deep learning models. Table 2 lists several common datasets compiled within the last 5 years that relate to weed detection and identification. A lack of large datasets has limited the development of advanced methods applicable to a large variety of fields and prevented the transition to commercial viability. Therefore, constructing large-scale datasets covering diverse and complex conditions to facilitate practical deployment is in high demand. Xie et al. [90•] and Gao et al. [91] proposed algorithms to generate high-fidelity synthetic data and combined synthetic data with raw data to train the network. The results showed that the proposed methods can effectively overcome the impact of imprecise and insufficient training samples. [92] presented an approach that uses real-world textures to create an explicit model of the target environment and generate a large variety of annotated data. This approach removes the need for human intervention in the labelling phase and reduces the time and effort needed to train a visual inference model.

Table 2 Publicly available datasets published within the last 5 years
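
A cut-and-paste synthesis step in the spirit of [46••, 92] can be sketched as follows with Pillow: plant cutouts with transparent backgrounds are rotated and pasted onto a soil image, and the alpha channel yields a pixel-accurate label mask for free. File paths are hypothetical, and cutouts are assumed smaller than the canvas; this is an illustration of the idea, not the cited pipelines.

```python
# Hedged sketch of cut-and-paste synthetic data generation.
import random
from PIL import Image

def synthesise(soil_path, cutout_paths, n_plants=8):
    canvas = Image.open(soil_path).convert('RGB')
    mask = Image.new('L', canvas.size, 0)        # 0 = soil, 255 = plant
    for path in random.choices(cutout_paths, k=n_plants):
        plant = Image.open(path).convert('RGBA')  # transparent background
        plant = plant.rotate(random.uniform(0, 360), expand=True)
        x = random.randrange(0, canvas.width - plant.width)
        y = random.randrange(0, canvas.height - plant.height)
        canvas.paste(plant, (x, y), plant)            # alpha-composited paste
        mask.paste(255, (x, y), plant.split()[-1])    # same alpha -> label
    return canvas, mask
```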

Reducing Annotation Effort

Since building large datasets is extremely costly, methods that reduce the cost of annotation are urgently needed. Many state-of-the-art methods and algorithms have been proposed for different tasks, and some can be used to label data automatically, with the labelled data then collated manually. A self-supervised method was proposed in [100] to automatically generate training data using a row detection and extraction method. Shorewala et al. [101] classified pixels on unlabelled images that are similar to each other, according to a response map, into two clusters (vegetation and background) using an unsupervised deep learning-based segmentation network. Sheikh et al. [102•] used k-means to determine 20 cluster categories from 10 randomly selected images and then manually determined which categories represent vegetation to obtain pseudo ground truth. Furthermore, Sheikh compared three sample selection methods, based on the loss, the L2 norm of the gradients, and gradient projection, against random sampling and entropy, and found the suggested methods achieved higher semantic segmentation accuracy with few training samples.
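
The clustering step reported for [102•] can be sketched as below: cluster pixel colours with k-means, then let a human map cluster ids to vegetation or background once, producing pseudo ground truth for whole image sets. This is our generic reconstruction of the idea, not the authors' code.

```python
# Sketch of k-means pseudo ground truth for vegetation segmentation.
import numpy as np
from sklearn.cluster import KMeans

def pixel_clusters(images, k=20, seed=0):
    # images: list of (H, W, 3) float arrays; pool pixels from all of them.
    pixels = np.concatenate([im.reshape(-1, 3) for im in images])
    return KMeans(n_clusters=k, random_state=seed, n_init=10).fit(pixels)

def pseudo_mask(km, image, vegetation_ids):
    # vegetation_ids: cluster indices a human marked as "plant" once.
    labels = km.predict(image.reshape(-1, 3)).reshape(image.shape[:2])
    return np.isin(labels, vegetation_ids)  # True where the pixel looks like plant
```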

Transfer Learning

Transfer learning methods aim to apply knowledge and skills learned in previous domains/tasks to novel domains/tasks. Depending on the target task, the examples in the source domain that are useful to the target domain are re-weighted so that the adjusted source domain is close to the distribution of the target domain, and a reliable algorithm is obtained from it. In deep learning, the typical transfer learning method is to fine-tune models trained on other, similar tasks to achieve enhanced results, and transfer learning has demonstrated very promising classification performance under varying ambient light conditions [103]. Bosilj et al. [104•] explored the role of knowledge transfer between deep-learning-based classifiers for different crop types, with the goal of reducing the retraining time and labelling effort required for a new crop. Results show that even when the data used for retraining is imperfectly annotated, the classification performance is within 2% of that of networks trained with laboriously annotated pixel-precision data.
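
The fine-tuning recipe described above can be sketched in a few lines with torchvision (the weights argument follows torchvision ≥ 0.13). This is a minimal, generic sketch of fine-tuning an ImageNet-pretrained backbone for crop/weed classification, not the setup of [103] or [104•].

```python
# Minimal transfer-learning sketch: fine-tune a pretrained ResNet-18.
import torch.nn as nn
from torchvision import models

def build_finetune_model(n_classes=2, freeze_backbone=True):
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    if freeze_backbone:
        for p in model.parameters():
            p.requires_grad = False      # reuse ImageNet features as-is
    # Replace the classification head; only this layer trains when frozen.
    model.fc = nn.Linear(model.fc.in_features, n_classes)
    return model
```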

Weakly Supervised and Unsupervised Learning

As manual annotation can be both costly and inaccurate, algorithms that identify weeds while requiring only a small number of annotations, or even none, are needed. In [105], Zhan et al. utilised a self-supervised method to learn occlusion ordering and recover the invisible parts of occluded objects without manual ordering annotations.

Method Fusion

Traditional and deep learning methods should be combined to improve weed detection. Traditional methods use a small volume of labelled data to find a usable classification criterion; however, they require hand-crafted features derived from prior knowledge, which limits their performance. Asad et al. [106] used maximum likelihood classification to segment the background and foreground, and a semantic segmentation model to detect weeds.
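
As an illustration of the traditional stage of such a hybrid pipeline, the sketch below masks vegetation with the Excess Green colour index and Otsu thresholding. Note that Asad et al. [106] used maximum likelihood classification; ExG plus Otsu is a common colour-index stand-in shown here only to make the two-stage idea concrete.

```python
# Illustrative traditional stage: Excess Green index + Otsu threshold.
import numpy as np
from skimage.filters import threshold_otsu

def vegetation_mask(rgb: np.ndarray) -> np.ndarray:
    rgb = rgb.astype(float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    total = r + g + b + 1e-6                        # avoid division by zero
    exg = (2 * g - r - b) / total                   # normalised Excess Green
    return exg > threshold_otsu(exg)                # True where vegetation
```

The resulting mask can restrict a downstream deep model to vegetation pixels only, so the network discriminates crop from weed rather than plant from soil.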

The Challenges of Weeder Robots

Multi-robot System

Multi-robot systems (MRS) are groups of robots designed to perform collective behaviours, making goals that are impossible for a single robot feasible and attainable [107]. Compared with a single robot, robot swarms can improve weeding efficiency. MRS have other advantages as well. Firstly, they have high fault tolerance: one robot's failure does not cause the entire system to crash. Secondly, swarms adapt better to the environment, and swarm algorithms can work toward an optimal plan. For multi-robot systems to become practical, we need coordination algorithms that can scale up to large teams of robots dealing with dynamically changing, failure-prone, contested, and uncertain environments [108, 109]. Bechar et al. [110] presented three strategies for coordinated multi-agent weeding under conditions of partial environmental information.
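
One basic coordination primitive, offered here only as an illustration and not as any of the cited strategies, is assigning robots to weed patches so that total travel distance is minimised. The sketch below uses the Hungarian method via SciPy's linear_sum_assignment.

```python
# Sketch: assign robots to weed patches by minimising total travel distance.
import numpy as np
from scipy.optimize import linear_sum_assignment

def assign_robots(robot_xy: np.ndarray, patch_xy: np.ndarray):
    # Cost matrix: Euclidean distance from every robot to every patch.
    cost = np.linalg.norm(robot_xy[:, None, :] - patch_xy[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(cost)   # optimal one-to-one assignment
    return list(zip(rows, cols)), cost[rows, cols].sum()

pairs, total = assign_robots(np.array([[0., 0.], [10., 0.]]),
                             np.array([[9., 1.], [1., 1.]]))
# pairs -> [(0, 1), (1, 0)]: each robot takes its nearest patch here.
```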

Modularisation

Robots and intelligent automation systems are generally highly complex, since they consist of several different sub-systems that must be integrated and correctly synchronised to perform tasks as a whole and to transfer the required information [181]. Modularisation refers to decomposing robots into mutually independent parts, or breaking cluster systems into different robots. The goal is to reduce build time and cost to a minimum, as this will enable low-cost swarms of high-quality robots [111].

Actuator Design

Evidence of negative environmental impacts from herbicides is growing, and herbicide resistance is increasingly prevalent [112]. Furthermore, with few new herbicides pending release, no new mechanism of action in 30 years, and an increasing number of herbicide-resistant weeds, the need for new weed control tools is overwhelming [113]. Thus, ways to minimise crop damage and pesticide dosage when removing weeds, and to improve weeding efficiency through actuator design, will become the focus of future development. McCool et al. [9] compared the efficacy of various mechanical tools, including an arrow hoe, a tine, and a whipper-snipper (W/S). Results showed that the W/S had better weeding efficacy on cottonweed, the tine performed better on feathertop Rhodes grass, and the arrow hoe worked better on wild oats. Furthermore, other methods such as infrared, laser [75], and microwave weeding need development.

Intelligent System

The system must be developed to overcome difficult problems such as continuously changing conditions, variability of the produce and environment, and hostile environmental conditions such as vibration, dust, extreme temperatures, and humidity. Even though much effort has been put into developing obstacle detection and avoidance algorithms and systems, these are still at the research stage [110]. Multi-sensor fusion has become a popular technology for improving recognition accuracy. According to [114], with high-end GPUs, vision systems need not be a bottleneck in detection.

Conclusion

This review summarises the current status of robotic approaches to mechanical weeding. Accurate weed detection is a prerequisite for weed management. Two categories of weed detection technology are discussed in "Weed Detection Methods"; of these, deep neural network architectures deliver better performance and enable quicker application development. Weed recognition still faces several issues, including light changes, irregular growth, severe occlusion, and difficulty in early recognition. Therefore, datasets containing these situations are required. More hybrid models using deep learning and traditional image processing are expected to be developed in the future. Another trend is to reduce the annotation effort using self-supervised methods.

Commercial weeding machines are emerging onto the market, but most of them only target a few specific crops and weeds. Designing a robot that can cope with a variety of scenarios and be adjusted quickly to new environments is a priority. Currently, the high cost of these robotic weeding systems hinders further commercialisation. How to reduce the cost and improve the efficiency of weeding has become the focus of current research.

References

Papers of particular interest, published recently, have been highlighted as: • Of importance •• Of major importance

  1. World population projections. https://www.worldometers.info/world-population/world-population-projections. Accessed 12 Dec 2021.
  2. Lottes P, Khanna R, Pfeifer J, Siegwart R, Stachniss C. UAV-based crop and weed classification for smart farming. In: 2017 IEEE International Conference on Robotics and Automation (ICRA), 2017, p. 3024–31. https://doi.org/10.1109/ICRA.2017.7989347.
  3. •• Hasan ASMM, Sohel F, Diepeveen D, Laga H, Jones MGK. A survey of deep learning techniques for weed detection from images. Comput Electron Agric. 2021;184:106067. https://doi.org/10.1016/j.compag.2021.106067. An excellent overview of deep learning methods used in weed detection.
  4. Fung F, Wang H-S, Menon S. Food safety in the 21st century. Biomed J. 2018;41(2):88–95. https://doi.org/10.1016/j.bj.2018.03.003.
  5. Kogo BK, Kumar L, Koech R. Climate change and variability in Kenya: a review of impacts on agriculture and food security. Environ Dev Sustain. 2021;23:23–43. https://doi.org/10.1007/s10668-020-00589-1.
  6. Berthet A, Vincent A, Fleury P. Water quality issues and agriculture: an international review of innovative policy schemes. Land Use Policy. 2021;109:105654. https://doi.org/10.1016/j.landusepol.2021.105654.
  7. Esposito M, Crimaldi M, Cirillo V, Sarghini F, Maggio A. Drone and sensor technology for sustainable weed management: a review. Chem Biol Technol Agric. 2021;8:18. https://doi.org/10.1186/s40538-021-00217-8.
  8. Rani L, Thapa K, Kanojia N, Sharma N, Singh S, Grewal AS, et al. An extensive review on the consequences of chemical pesticides on human health and environment. J Clean Prod. 2021;283:124657. https://doi.org/10.1016/j.jclepro.2020.124657.
  9. McCool C, Beattie J, Firn J, Lehnert C, Kulk J, Bawden O, et al. Efficacy of mechanical weeding tools: a study into alternative weed management strategies enabled by robotics. IEEE Robot Autom Lett. 2018;3(2):1184–90. https://doi.org/10.1109/lra.2018.2794619.
  10. Christensen S, Dyrmann M, Laursen MS, Jørgensen RN, Rasmussen J. Sensing for weed detection. In: Sensing approaches for precision agriculture. Cham: Springer International Publishing; 2021. p. 275–300.
  11. Shirzadifar A, Bajwa S, Mireei SA, Howatt K, Nowatzki J. Weed species discrimination based on SIMCA analysis of plant canopy spectral data. Biosyst Eng. 2018;171:143–54. https://doi.org/10.1016/j.biosystemseng.2018.04.019.
  12. Gao J, Nuyttens D, Lootens P, He Y, Pieters JG. Recognising weeds in a maize crop using a random forest machine-learning algorithm and near-infrared snapshot mosaic hyperspectral imagery. Biosyst Eng. 2018;170:39–50. https://doi.org/10.1016/j.biosystemseng.2018.03.006.
  13. Potena C, Nardi D, Pretto A. Fast and accurate crop and weed identification with summarized train sets for precision agriculture. Cham: Springer International Publishing; 2017. p. 105–21.
  14. Tao T, Wu S, Li L, Li J, Bao S, Wei X. Design and experiments of weeding teleoperated robot spectral sensor for winter rape and weed identification. Adv Mech Eng. 2018;10(5):1687814018776741. https://doi.org/10.1177/1687814018776741.
  15. Pantazi X-E, Moshou D, Bravo C. Active learning system for weed species recognition based on hyperspectral sensing. Biosyst Eng. 2016;146:193–202. https://doi.org/10.1016/j.biosystemseng.2016.01.014.
  16. Zhang Y, Gao J, Cen H, Lu Y, Yu X, He Y, et al. Automated spectral feature extraction from hyperspectral images to differentiate weedy rice and barnyard grass from a rice crop. Comput Electron Agric. 2019;159:42–9. https://doi.org/10.1016/j.compag.2019.02.018.
  17. Andújar D, Dorado J, Fernández-Quintanilla C, Ribeiro A. An approach to the use of depth cameras for weed volume estimation. Sensors. 2016. https://doi.org/10.3390/s16070972.
  18. Bah MD, Hafiane A, Canals R. Deep learning with unsupervised data labeling for weed detection in line crops in UAV images. Remote Sens. 2018. https://doi.org/10.3390/rs10111690.
  19. Huang H, Deng J, Lan Y, Yang A, Deng X, Wen S, et al. Accurate weed mapping and prescription map generation based on fully convolutional networks using UAV imagery. Sensors. 2018. https://doi.org/10.3390/s18103299.
  20. Huang H, Lan Y, Deng J, Yang A, Deng X, Zhang L, et al. A semantic labeling approach for accurate weed mapping of high resolution UAV imagery. Sensors. 2018. https://doi.org/10.3390/s18072113.
  21. Gao J, Liao W, Nuyttens D, Lootens P, Vangeyte J, Pižurica A, et al. Fusion of pixel and object-based features for weed mapping using unmanned aerial vehicle imagery. Int J Appl Earth Obs Geoinf. 2018;67:43–53. https://doi.org/10.1016/j.jag.2017.12.012.
  22. Raja R, Nguyen TT, Slaughter DC, Fennimore SA. Real-time weed-crop classification and localisation technique for robotic weed control in lettuce. Biosyst Eng. 2020;192:257–74. https://doi.org/10.1016/j.biosystemseng.2020.02.002.
  23. Raja R, Nguyen TT, Vuong VL, Slaughter DC, Fennimore SA. RTD-SEPs: real-time detection of stem emerging points and classification of crop-weed for robotic weed control in producing tomato. Biosyst Eng. 2020;195:152–71. https://doi.org/10.1016/j.biosystemseng.2020.05.004.
  24. Liu B, Bruch R. Weed detection for selective spraying: a review. Curr Robot Rep. 2020. https://doi.org/10.1007/s43154-020-00001-w.
  25. Lati RN, Siemens MC, Rachuy JS, Fennimore SA. Intrarow weed removal in broccoli and transplanted lettuce with an intelligent cultivator. Weed Technol. 2016;30(3):655–63. https://doi.org/10.1614/wt-d-15-00179.1.
  26. Jha K, Doshi A, Patel P, Shah M. A comprehensive review on automation in agriculture using artificial intelligence. Artif Intell Agric. 2019;2:1–12. https://doi.org/10.1016/j.aiia.2019.05.004.
  27. Zha J. Artificial intelligence in agriculture. J Phys: Conf Ser. 2020;1693(1):012058. https://doi.org/10.1088/1742-6596/1693/1/012058.
  28. Liakos KG, Busato P, Moshou D, Pearson S, Bochtis D. Machine learning in agriculture: a review. Sensors. 2018. https://doi.org/10.3390/s18082674.
  29. Júnior PCP, Monteiro A, Ribeiro RDL, Sobieranski AC, Wangenheim AV. Comparison of supervised classifiers and image features for crop rows segmentation on aerial images. Appl Artif Intell. 2020;34(4):271–91. https://doi.org/10.1080/08839514.2020.1720131.
  30. Hamuda E, Glavin M, Jones E. A survey of image processing techniques for plant extraction and segmentation in the field. Comput Electron Agric. 2016;125:184–99. https://doi.org/10.1016/j.compag.2016.04.024.
  31. Ahmad J, Muhammad K, Ahmad I, Ahmad W, Smith ML, Smith LN, et al. Visual features based boosted classification of weeds for real-time selective herbicide sprayer systems. Comput Ind. 2018;98:23–33. https://doi.org/10.1016/j.compind.2018.02.005.
  32. Bakhshipour A, Jafari A, Nassiri SM, Zare D. Weed segmentation using texture features extracted from wavelet sub-images. Biosyst Eng. 2017;157:1–12. https://doi.org/10.1016/j.biosystemseng.2017.02.002.
  33. Mekhalfa F, Yacef F. Supervised learning for crop/weed classification based on color and texture features. 2021. arXiv preprint: abs/2106.10581.
  34. Hamuda E, Mc Ginley B, Glavin M, Jones E. Automatic crop detection under field conditions using the HSV colour space and morphological operations. Comput Electron Agric. 2017;133:97–107. https://doi.org/10.1016/j.compag.2016.11.021.
  35. Gai J, Tang L, Steward BL. Automated crop plant detection based on the fusion of color and depth images for robotic weed control. J Field Robot. 2020;37(1):35–52. https://doi.org/10.1002/rob.21897.
  36. Bosilj P, Duckett T, Cielniak G. Connected attribute morphology for unified vegetation segmentation and classification in precision agriculture. Comput Ind. 2018;98:226–40. https://doi.org/10.1016/j.compind.2018.02.003.
  37. Tang J, Miao R, Zhang Z, Xin J, Wang D. Distance-based separability criterion of ROI in classification of farmland hyper-spectral images. Int J Agric Biol Eng. 2017;10(5):177–85. https://doi.org/10.25165/j.ijabe.20171005.2264.
  38. Huang Y, Lee MA, Thomson SJ, Reddy KN. Ground-based hyperspectral remote sensing for weed management in crop production. Int J Agric Biol Eng. 2016;9(2):98–109. https://doi.org/10.3965/j.ijabe.20160902.2137.
  39. Pignatti S, Casa R, Harfouche A, Huang W, Palombo A, Pascucci S. Maize crop and weed species detection using UAV VNIR hyperspectral data. In: IGARSS 2019 – IEEE International Geoscience and Remote Sensing Symposium, 2019, p. 7235–8. https://doi.org/10.1109/IGARSS.2019.8900241.
  40. Barrero O, Perdomo SA. RGB and multispectral UAV image fusion for Gramineae weed detection in rice fields. Precis Agric. 2018;19(5):809–22. https://doi.org/10.1007/s11119-017-9558-x.
  41. Zisi T, Alexandridis TK, Kaplanis S, Navrozidis I, Tamouridou A-A, Lagopodi A, et al. Incorporating surface elevation information in UAV multispectral images for mapping weed patches. J Imaging. 2018. https://doi.org/10.3390/jimaging4110132.
  42. Sa I, Chen Z, Popović M, Khanna R, Liebisch F, Nieto J, et al. weedNet: dense semantic weed classification using multispectral images and MAV for smart farming. IEEE Robot Autom Lett. 2018;3(1):588–95. https://doi.org/10.1109/lra.2017.2774979.
  43. Hall D, Dayoub F, Kulk J, McCool C. Towards unsupervised weed scouting for agricultural robotics. In: 2017 IEEE International Conference on Robotics and Automation (ICRA), 2017, p. 5223–30. https://doi.org/10.1109/ICRA.2017.7989612.
  44. Lin F, Zhang D, Huang Y, Wang X, Chen X. Detection of corn and weed species by the combination of spectral, shape and textural features. Sustainability. 2017. https://doi.org/10.3390/su9081335.
  45. Sabzi S, Abbaspour-Gilandeh Y, Arribas JI. An automatic visible-range video weed detection, segmentation and classification prototype in potato field. Heliyon. 2020;6(5):e03685. https://doi.org/10.1016/j.heliyon.2020.e03685.
  46. •• Zou K, Chen X, Wang Y, Zhang C, Zhang F. A modified U-Net with a specific data argumentation method for semantic segmentation of weed images in the field. Comput Electron Agric. 2021;187:106242. https://doi.org/10.1016/j.compag.2021.106242. A simple and fast neural network architecture with high weed segmentation accuracy.
  47. De Castro AI, Torres-Sánchez J, Peña JM, Jiménez-Brenes FM, Csillik O, López-Granados F. An automatic random forest-OBIA algorithm for early weed mapping between and within crop rows using UAV imagery. Remote Sens. 2018. https://doi.org/10.3390/rs10020285.
  48. Tang J-L, Chen X-Q, Miao R-H, Wang D. Weed detection using image processing under different illumination for site-specific areas spraying. Comput Electron Agric. 2016;122:103–11. https://doi.org/10.1016/j.compag.2015.12.016.
  49. Tang J, Wang D, Zhang Z, He L, Xin J, Xu Y. Weed identification based on K-means feature learning combined with convolutional neural network. Comput Electron Agric. 2017;135:63–70. https://doi.org/10.1016/j.compag.2017.01.001.
  50. Rojas CP, Guzmán LS, Toledo NV. Weed recognition by SVM texture feature classification in outdoor vegetable crops images. Ing Invest. 2017;37(1):68–74. https://doi.org/10.15446/ing.investig.v37n1.54703.
  51. Chen Y, Wu Z, Zhao B, Fan C, Shi S. Weed and corn seedling detection in field based on multi feature fusion and support vector machine. Sensors. 2021. https://doi.org/10.3390/s21010212.
  52. Khurana G, Bawa NK. Weed detection approach using feature extraction and KNN classification. Singapore: Springer Singapore; 2021. p. 671–9.
  53. Islam N, Rashid MM, Wibowo S, Xu C-Y, Morshed A, Wasimi SA, et al. Early weed detection using image processing and machine learning techniques in an Australian chilli farm. Agriculture. 2021. https://doi.org/10.3390/agriculture11050387.
  54. Wu X, Aravecchia S, Lottes P, Stachniss C, Pradalier C. Robotic weed control using automated weed and crop classification. J Field Robot. 2020;37(2):322–40. https://doi.org/10.1002/rob.21938.
  55. dos Santos Ferreira A, Matte Freitas D, Gonçalves da Silva G, Pistori H, Theophilo Folhes M. Weed detection in soybean crops using ConvNets. Comput Electron Agric. 2017;143:314–24. https://doi.org/10.1016/j.compag.2017.10.027.
  56. Lottes P, Hoeferlin M, Sander S, Müter M, Schulze P, Stachniss LC. An effective classification system for separating sugar beets and weeds for precision farming applications. In: 2016 IEEE International Conference on Robotics and Automation (ICRA), 2016, p. 5157–63. https://doi.org/10.1109/ICRA.2016.7487720.
  57. Rehman TU, Zaman QU, Chang YK, Schumann AW, Corscadden KW. Development and field evaluation of a machine vision based in-season weed detection system for wild blueberry. Comput Electron Agric. 2019;162:1–13. https://doi.org/10.1016/j.compag.2019.03.023.
  58. Sarker MI, Kim H. Farm land weed detection with region-based deep convolutional neural networks. 2019. arXiv preprint: abs/1906.01885.
  59. Milioto A, Lottes P, Stachniss C. Real-time semantic segmentation of crop and weed for precision agriculture robots leveraging background knowledge in CNNs. In: 2018 IEEE International Conference on Robotics and Automation (ICRA), 2018, p. 2229–35. https://doi.org/10.1109/ICRA.2018.8460962.
  60. • Chavan TR, Nandedkar AV. AgroAVNET for crops and weeds classification: a step forward in automatic farming. Comput Electron Agric. 2018;154:361–72. https://doi.org/10.1016/j.compag.2018.09.021. This article used incremental learning to learn new categories of weeds and crops more effectively.
  61. Ma X, Deng X, Qi L, Jiang Y, Li H, Wang Y, et al. Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields. PLoS ONE. 2019;14(4):1–13. https://doi.org/10.1371/journal.pone.0215676.
  62. Hu K, Wang Z, Coleman G, Bender A, Yao T, Zeng S, et al. Deep learning techniques for in-crop weed identification: a review. 2021. arXiv preprint: abs/2103.14872.
  63. You J, Liu W, Lee J. A DNN-based semantic segmentation for detecting weed and crop. Comput Electron Agric. 2020;178:105750. https://doi.org/10.1016/j.compag.2020.105750.
  64. Lottes P, Behley J, Milioto A, Stachniss C. Fully convolutional networks with sequential information for robust crop and weed detection in precision farming. IEEE Robot Autom Lett. 2018;3(4):2870–7. https://doi.org/10.1109/lra.2018.2846289.
  65. • Hu K, Coleman G, Zeng S, Wang Z, Walsh M. Graph weeds net: a graph-based deep learning method for weed recognition. Comput Electron Agric. 2020;174:105520. https://doi.org/10.1016/j.compag.2020.105520. Graph Weeds Net is proposed in this article to handle inputs containing multi-scale graph structures.
  66. Lottes P, Behley J, Chebrolu N, Milioto A, Stachniss C. Robust joint stem detection and crop-weed classification using image sequences for plant-specific treatment in precision farming. J Field Robot. 2019;37:20–34. https://doi.org/10.1002/rob.21901.
  67. Yu J, Sharpe SM, Schumann AW, Boyd NS. Deep learning for image-based weed detection in turfgrass. Eur J Agron. 2019;104:78–84. https://doi.org/10.1016/j.eja.2019.01.004.
  68. Lottes P, Stachniss C. Semi-supervised online visual crop and weed classification in precision farming exploiting plant arrangement. In: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2017, p. 5155–61. https://doi.org/10.1109/IROS.2017.8206403.
  69. Louargant M, Jones G, Faroux R, Paoli J-N, Maillot T, Gée C, et al. Unsupervised classification algorithm for early weed detection in row-crops by combining spatial and spectral information. Remote Sens. 2018. https://doi.org/10.3390/rs10050761.
  70. Khan S, Tufail M, Khan MT, Khan ZA, Iqbal J, Alam M. A novel semi-supervised framework for UAV based crop/weed classification. PLoS ONE. 2021;16(5):e0251008. https://doi.org/10.1371/journal.pone.0251008.
  71. Jiang H, Zhang C, Qiao Y, Zhang Z, Zhang W, Song C. CNN feature based graph convolutional network for weed and crop recognition in smart farming. Comput Electron Agric. 2020;174:105450. https://doi.org/10.1016/j.compag.2020.105450.
  72. Hajjaj SSH, Sahari KSM. Review of agriculture robotics: practicality and feasibility. In: 2016 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS), 2016, p. 194–8. https://doi.org/10.1109/IRIS.2016.8066090.
  73. Sanchez J, Gallandt ER. Functionality and efficacy of Franklin Robotics’ Tertill robotic weeder. Weed Technol. 2021;35(1):166–70. https://doi.org/10.1017/wet.2020.94.
  74. Switch to smart weeding with Ecorobotix. https://www.ecorobotix.com/en/. Accessed 12 Dec 2021.
  75. Autonomous LaserWeeder – Carbon Robotics. https://carbonrobotics.com/autonomous-weeder. Accessed 12 Dec 2021.
  76. Switch to smart weeding with Ecorobotix: AVO. https://www.ecorobotix.com/en/avo/. Accessed 27 Dec 2021.
  77. Bawden O, Kulk J, Russell R, McCool C, English A, Dayoub F, et al. Robot for weed species plant-specific management. J Field Robot. 2017;34(6):1179–99. https://doi.org/10.1002/rob.21727.
  78. EarthSense. https://www.earthsense.co/. Accessed 17 Dec 2021.
  79. Sori H, Inoue H, Hatta H, Ando Y. Effect for a paddy weeding robot in wet rice culture. J Robot Mechatron. 2018;30(2):198–205. https://doi.org/10.20965/jrm.2018.p0198.
  80. Uchida H, Hunaki T. Development of a remote control type weeding machine with stirring chains for a paddy field. In: Proceedings of the 22nd International Conference on Climbing and Walking Robots and Support Technologies for Mobile Machines (CLAWAR). Kuala Lumpur: CLAWAR Association Ltd; 2019. p. 61–8.
  81. Ruckelshausen A, Biber P, Dorna M, Gremmes H, Klose R, Linz A, et al. BoniRob—an autonomous field robot platform for individual plant phenotyping. In: Precision Agriculture ’09. 2009. p. 841–7.
  82. Xiong Y, Ge Y, Liang Y, Blackmore S. Development of a prototype robot and fast path-planning algorithm for static laser weeding. Comput Electron Agric. 2017;142:494–503. https://doi.org/10.1016/j.compag.2017.11.023.
  83. Sujaritha M, Annadurai S, Satheeshkumar J, Kowshik Sharan S, Mahesh L. Weed detecting robot in sugarcane fields using fuzzy real time classifier. Comput Electron Agric. 2017;134:160–71. https://doi.org/10.1016/j.compag.2017.01.008.
  84. Utstumo T, Urdal F, Brevik A, Dørum J, Netland J, Overskeid Ø, et al. Robotic in-row weed control in vegetables. Comput Electron Agric. 2018;154:36–45. https://doi.org/10.1016/j.compag.2018.08.043.
  85. Pretto A, Aravecchia S, Burgard W, Chebrolu N, Dornhege C, Falck T, et al. Building an aerial–ground robotics system for precision farming: an adaptable solution. IEEE Robot Autom Mag. 2021;28(3):29–49. https://doi.org/10.1109/mra.2020.3012492.
  86. Small Robot Company. https://www.smallrobotcompany.com/press-releases/end-to-end-milestone. Accessed 1 Mar 2022.
  87. Ben-Ari M, Mondada F. Elements of robotics. Cham: Springer; 2018. https://doi.org/10.1007/978-3-319-62533-1.
  88. Grimstad L, From PJ. The Thorvald II agricultural robotic system. Robotics. 2017;6(4):24.
  89. Dyrmann M, Jørgensen RN, Midtiby HS. RoboWeedSupport — detection of weed locations in leaf occluded cereal crops using a fully convolutional neural network. Adv Anim Biosci. 2017;8(2):842–7. https://doi.org/10.1017/S2040470017000206.
  90. • Xie S, Hu C, Bagavathiannan MV, Song D. Toward robotic weed control: detection of nutsedge weed in bermudagrass turf using inaccurate and insufficient training data. IEEE Robot Autom Lett. 2021;6:7365–72. This article proposed an algorithm to generate high-fidelity synthetic data and overcome the impact of imprecise and insufficient training samples.
  91. Gao J, French AP, Pound MP, He Y, Pridmore TP, Pieters JG. Deep convolutional neural networks for image-based Convolvulus sepium detection in sugar beet fields. Plant Methods. 2020;16(1):29. https://doi.org/10.1186/s13007-020-00570-z.
  92. Di Cicco M, Potena C, Grisetti G, Pretto A. Automatic model based dataset generation for fast and accurate crop and weeds detection. In: 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2017, p. 5188–95. https://doi.org/10.1109/IROS.2017.8206408.
  93. Chebrolu N, Lottes P, Schaefer A, Winterhalter W, Burgard W, Stachniss C. Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields. Int J Robot Res. 2017;36:1045–52. https://doi.org/10.1177/0278364917720510.
  94. Giselsson TM, Jørgensen RN, Jensen PK, Dyrmann M, Midtiby HS. A public image database for benchmark of plant seedling classification algorithms. 2017. arXiv preprint: abs/1711.05458.
  95. Lameski P, Zdravevski E, Trajkovik V, Kulakov A. Weed detection dataset with RGB images taken under variable light conditions. In: Trajanov D, Bakeva V, editors. ICT Innovations 2017. Communications in Computer and Information Science, vol 778. Cham: Springer; 2017. https://doi.org/10.1007/978-3-319-67597-8_11.
  96. Olsen A, Konovalov DA, Philippa B, Ridd P, Wood JC, Johns J, et al. DeepWeeds: a multiclass weed species image dataset for deep learning. Sci Rep. 2019. https://doi.org/10.1038/s41598-018-38343-3.
  97. Skovsen S, Mortensen AK, Laursen MS, Gislum R, Eriksen J, Farkhani S, Karstoft H, Jorgensen RN. The grass clover image dataset for semantic and hierarchical species understanding in agriculture. In: 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2019, p. 2676–84. https://doi.org/10.1109/CVPRW.2019.00325.
  98. Sudars K, Jasko J, Namatevs I, Ozola L, Badaukis N. Dataset of annotated food crops and weed images for robotic computer vision control. Data Brief. 2020;31:105833. https://doi.org/10.1016/j.dib.2020.105833.
  99. lincolnbeet dataset. https://github.com/LAR/lincolnbeet_dataset. Accessed 28 Jun 2022.
  100. Wendel A, Underwood J. Self-supervised weed detection in vegetable crops using ground based hyperspectral imaging. In: 2016 IEEE International Conference on Robotics and Automation (ICRA), 2016, p. 5128–35. https://doi.org/10.1109/ICRA.2016.7487717.
  101. Shorewala S, Ashfaque A, Sidharth R, Verma U. Weed density and distribution estimation for precision agriculture using semi-supervised learning. IEEE Access. 2021;9:27971–86. https://doi.org/10.1109/access.2021.3057912.
  102. • Sheikh R, Milioto A, Lottes P, Stachniss C, Bennewitz M, Schultz T. Gradient and log-based active learning for semantic segmentation of crop and weed for agricultural robots. In: 2020 IEEE International Conference on Robotics and Automation (ICRA), 2020, p. 1350–56. https://doi.org/10.1109/ICRA40945.2020.9196722. This article proposed gradient- and loss-based active learning methods for selecting informative samples, achieving high segmentation accuracy with few labelled images.
  103. Suh HK, Ijsselmuiden J, Hofstee JW, van Henten EJ. Transfer learning for the classification of sugar beet and volunteer potato under field conditions. Biosyst Eng. 2018;174:50–65. https://doi.org/10.1016/j.biosystemseng.2018.06.017.
  104. • Bosilj P, Aptoula E, Duckett T, Cielniak G. Transfer learning between crop types for semantic segmentation of crops versus weeds in precision agriculture. J Field Robot. 2020;37(1):7–19. https://doi.org/10.1002/rob.21869. This article showed that transfer learning between different crop types is possible, with no noticeable impact on classification performance from imperfectly annotated retraining data.
  105. Zhan X, Pan X, Dai B, Liu Z, Lin D, Loy CC. Self-supervised scene de-occlusion. In: 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020, p. 3783–91. https://doi.org/10.1109/CVPR42600.2020.00384.
  106. Asad MH, Bais A. Weed detection in canola fields using maximum likelihood classification and deep convolutional neural network. Inf Process Agric. 2020;7(4):535–45. https://doi.org/10.1016/j.inpa.2019.12.002.
  107. Khamis A, Hussein A, Elmogy A. Multi-robot task allocation: a review of the state-of-the-art. In: Koubâa A, Martínez-de Dios JR, editors. Cooperative Robots and Sensor Networks 2015. Cham: Springer International Publishing; 2015. p. 31–51.
  108. Zhou L, Tokekar P. Multi-robot coordination and planning in uncertain and adversarial environments. Curr Robot Rep. 2021;2(2):147–57. https://doi.org/10.1007/s43154-021-00046-5.
  109. McAllister W, Osipychev D, Chowdhary G, Davis A. Multi-agent planning for coordinated robotic weed killing. In: 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2018, p. 7955–60. https://doi.org/10.1109/IROS.2018.8593429.
  110. Bechar A, Vigneault C. Agricultural robots for field operations: concepts and components. Biosyst Eng. 2016;149:94–111. https://doi.org/10.1016/j.biosystemseng.2016.06.014.
  111. Grimstad L, From PJ. Thorvald II — a modular and re-configurable agricultural robot. In: 20th IFAC World Congress. 2017;50(1):4588–93. https://doi.org/10.1016/j.ifacol.2017.08.1005.
  112. MacLaren C, Storkey J, Menegat A, Metcalfe H, Dehnen-Schmutz K. An ecological future for weed science to sustain crop production and the environment. A review. Agron Sustain Dev. 2020;40:24. https://doi.org/10.1007/s13593-020-00631-6.
  113. Davis AS, Frisvold GB. Are herbicides a once in a century method of weed control? Pest Manag Sci. 2017;73(11):2209–20. https://doi.org/10.1002/ps.4643.
  114. Zhivkov T, Gomez A, Gao J, Sklar E, Parsons S. The need for speed: how 5G communication can support AI in the field. 2021. https://doi.org/10.31256/On8Hj9U.

Funding

This work is funded by the Shanghai Agricultural and Rural Committee (project number 202002080009F01466).