In precision agriculture, real-time action and decisions depend on accurate information about when and where a pest occurs. Using computer vision and machine learning technologies, Huajian Liu, Research Fellow (Plant Phenotyping) at the APPF’s The Plant Accelerator in Adelaide, has been able to accurately detect invertebrate pests on crops in real-time. He also evaluated the performance of state-of-the-art convolutional neural networks (CNNs) and proposed a standard training pipeline.

“Facing the challenge of rapidly developing comprehensive training data, we used a novel method to generate a virtual database which was then successfully used to train a deep residual CNN with an accuracy of 97.8% in detecting four species of pests in farming environments,” Dr Liu said.
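The paper itself details the virtual-image pipeline; as a rough illustration of the general approach, the following is a minimal sketch of training a deep residual CNN on a folder of synthetic pest images. It assumes a PyTorch/torchvision setup, and the directory name, class layout and hyperparameters are illustrative placeholders, not the authors’ exact configuration.

```python
# Minimal sketch: fine-tuning a deep residual CNN (ResNet-50) on a folder of
# synthetic ("virtual") pest images. Paths, class count and hyperparameters
# are illustrative assumptions, not the authors' exact pipeline.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Hypothetical directory: one sub-folder per pest species (plus background).
data = datasets.ImageFolder(
    "virtual_dataset/train",
    transform=transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ]),
)
loader = DataLoader(data, batch_size=32, shuffle=True, num_workers=4)

# Residual CNN backbone; the final layer is replaced to match the class count.
model = models.resnet50(weights=None)
model.fc = nn.Linear(model.fc.in_features, len(data.classes))

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
optimiser = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
criterion = nn.CrossEntropyLoss()

for epoch in range(10):  # illustrative epoch count
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimiser.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimiser.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```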

Deep CNNs naturally integrate low-, middle- and high-level features and classifiers in an end-to-end, multilayer fashion, and the levels of features can be enriched by stacking more layers, which has driven image classification technology forward. Adding deep residual learning increases accuracy further by making very deep networks easier to train.
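For readers unfamiliar with residual learning, the sketch below shows the core idea: each block learns a residual function F(x) and adds it back to its input via a skip connection, so the output is F(x) + x and very deep stacks remain trainable. This is a generic illustration in PyTorch with illustrative channel sizes, not code from the paper.

```python
# Minimal sketch of a residual block: the block learns F(x) and outputs
# F(x) + x via a skip connection. Channel and image sizes are illustrative.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x                        # shortcut (skip) connection
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.relu(out + identity)    # residual addition

# Example: a 64-channel feature map passes through with its shape unchanged.
block = ResidualBlock(64)
print(block(torch.randn(1, 64, 56, 56)).shape)  # torch.Size([1, 64, 56, 56])
```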

Traditionally, sample pests have been trapped, counted and classified by humans to estimate the level of infestation, a subjective, error-prone and labour-intensive process. Although semi-automatic pest detection is possible using computer vision technologies to classify and count pest samples in laboratories or insect traps, these laboratory-based or trap-based approaches deliver results too late for optimal pest management decisions. The proposed method can instead be applied to a robotic system for proximal detection of invertebrate pests on crops in real-time.

Dr Liu also found that the wide-angle cameras of smartphones, working at 20 cm to 50 cm from the crop, can capture high-quality images for pest detection. This imaging approach can be automated by using a ground-based platform with a robotic arm equipped with a low-cost camera. Processing speed was tested on a computer with a 4.2 GHz CPU, and the average time to process one window was 0.21 seconds. With parallel processing, this rate can support real-time pest detection in the field.
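To make the window-based timing concrete, the sketch below splits a frame into overlapping windows and classifies them in parallel. The function `classify_window` is a hypothetical stand-in for the trained CNN (here it just returns a dummy score), and the window size, stride and worker count are assumptions; the point is only that several ~0.21-second window evaluations can run concurrently to keep up with field use.

```python
# Minimal sketch: split a field image into windows and classify them in
# parallel. classify_window is a placeholder for the residual CNN.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def sliding_windows(image: np.ndarray, size: int = 224, stride: int = 112):
    """Yield (row, col, crop) windows over the image."""
    h, w = image.shape[:2]
    for r in range(0, h - size + 1, stride):
        for c in range(0, w - size + 1, stride):
            yield r, c, image[r:r + size, c:c + size]

def classify_window(args):
    r, c, window = args
    score = float(window.mean())  # dummy score; a real system would run the CNN
    return r, c, score

image = np.random.rand(1080, 1920, 3)  # stand-in for a smartphone frame
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(classify_window, sliding_windows(image)))
print(f"processed {len(results)} windows")
```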

Read the paper here: H. Liu and J. S. Chahl, “Proximal detecting invertebrate pests on crops using a deep residual convolutional neural network trained by virtual images,” Artificial Intelligence in Agriculture (2021).