Open access peer-reviewed chapter

Agricultural Mobile Robots for Plant Health Assessment and Drought Stress Detection

Written By

Maryam Behjati, Redmond R. Shamshiri and Ibrahim A. Hameed

Submitted: 21 March 2024 Reviewed: 19 June 2024 Published: 04 September 2024

DOI: 10.5772/intechopen.115219

From the Edited Volume

Precision Agriculture - Emerging Technologies

Edited by Redmond R. Shamshiri, Sanaz Shafian and Ibrahim A. Hameed

Abstract

The vulnerability of plants to various threats, such as insects, pathogens, and weeds, poses a significant risk to food security, particularly before harvest. Mobile robots are used in digital agriculture as a breakthrough approach to address challenges in crop production, such as plant health assessment and drought stress detection. This chapter aims to explore the application of agricultural mobile robots equipped with advanced sensing technologies and computer vision algorithms, along with their key features, to enhance crop management practices. An overview of some of the platforms with different steering mechanisms, sensors, interfaces, communication, and machine learning is provided, along with case studies on the use of robots for collecting data on plant health indicators such as physiological parameters, leaf coloration, and soil moisture levels. Recent trends in this area show that by utilizing machine learning techniques such as convolutional neural networks (CNNs) and support vector machines (SVMs), the collected data are analyzed to identify symptoms of plant diseases, nutrient deficiencies, and drought stress, facilitating timely interventions to mitigate crop losses. The integration of the Internet of Robotic Things (IoRT) into existing practices is also discussed with respect to cost-effectiveness, scalability, and user acceptance.

Keywords

  • agricultural robotics
  • modern precision agriculture
  • plant health
  • digital farming
  • mobile robots

1. Introduction

Plant health is a critical aspect of agriculture, and maintaining it is one of the central challenges in achieving farming goals. The complex interactions between plant hosts, various plant pathogens, and environmental factors influence plant health [1]. In addition, different crops face different types of biotic and abiotic stresses. Abiotic stresses include drought, temperature fluctuations, high soil salinity, and oxidative stress. These stresses can cause permanent damage to plants, such as arrested growth, disrupted metabolism, reduced yield, and altered genetic behavior [2]. Drought develops over a long period due to insufficient availability of fresh water, and rising global temperatures and climate change have intensified it. Beyond affecting food production, drought lowers groundwater levels, increases soil erosion, and can eventually lead to other disasters such as fire, flood, and the spread of diseases [3]. Drought is classified into four separate categories [2]:

1. Meteorological drought, which occurs in areas with a dry climate.
2. Hydrological drought, which occurs under water-shortage conditions, especially in surface and groundwater levels, and follows a few months of meteorological drought.
3. Agricultural drought, often associated with a decrease in soil water levels and, as a result, crop failure, which poses a threat to food supply around the world.
4. Socio-economic drought, which relates to the failure of supply and demand of various goods due to drought.

If drought stress can be detected in its early stages, these short-term responses will not cause much damage to the plants. However, if the plant remains under stress for a long period, more serious and permanent damage is caused [4]. Traditional methods for plant health assessment typically involve manual and destructive techniques. These methods are usually laborious, time-consuming, and expensive. Moreover, manual sampling and analysis procedures typically entail numerous steps that rely on human intervention and offer low precision [5].

To overcome these challenges, the adoption of new agricultural technologies and techniques is necessary. Precision agriculture emerges as a strategic approach that leverages electronic information and various technologies for data collection, processing, and analysis [6]. Modern agriculture relies on several important platforms, such as unmanned ground vehicles (UGVs) and unmanned aerial vehicles (UAVs). While drones can survey large areas in a short period, they have drawbacks such as dependence on weather conditions, sensor limitations, and variable data quality [7]. To overcome these limitations, UGVs prove to be suitable. The development of UGVs follows two primary approaches: autonomous conventional vehicles and independent mobile robots. Autonomous conventional vehicles modify and deploy existing farm machinery for automated agricultural tasks, including planting, spraying, fertilizing, and harvesting. Although these vehicles are highly reliable and perform well on uneven terrain, they may lack adaptability features required for specific tasks, such as integrating sensors or accommodating varying row spacing during experiments. Since farm vehicles are designed for transporting people and towing heavy equipment, they are typically heavier and larger, which can lead to soil compaction and the need for robust steering mechanisms [8]. In contrast to the conventional-vehicle approach, independent mobile robots are specifically designed to be lightweight and adaptable to different wheel spacings and sensor configurations. This enables them to be used for a wide range of experiments and terrains while minimizing soil compaction [9]. The integration of robots represents a revolutionary development in precision agriculture in recent years. The increasing prevalence of robots in agriculture is evident, with a diverse array of robotic solutions currently available on the market. The widespread use of agricultural robots has reduced the need for labor and increased the production of agricultural products [10]. The development of robotics integrated with remote sensing technology offers an opportunity to assess complex traits more effectively.

The objective of this chapter is to provide an overview of robotics in plant health assessment and its features, including steering, sensing, communication, and the application of machine vision and deep learning in robots.

2. Mobile robot platforms

Unmanned ground vehicles, particularly robots, are widely used in precision agriculture due to numerous advantages, including reduced reliance on human resources and higher accuracy. The mobile platform is an essential component of any agricultural mobile robot and plays a significant role in determining the overall system's effectiveness. One of the most important locomotion systems in robots is the wheels and their arrangement. The wheel system plays an important role in transportation, logistics, and distribution, and selecting the appropriate wheel type is crucial for kinematic and dynamic modeling. No universally optimal drive system can maximize maneuverability, controllability, and stability at the same time; instead, the best type should be chosen during robot design according to the environmental conditions and the specific application.

2.1 Differential steering

The most prevalent wheel arrangement in agrobots is differential steering. In a differential-drive mobile robot, steering is achieved by driving the left and right wheels at different velocities, while straight-line motion requires the actuators connected to the wheels to follow the same velocity profile [11]. Some differential steering robots are shown in Figure 1, including (a) Xf-Rovim [13] and (b) RobHortic [12]. The Xf-Rovim, designed for early detection of Xylella fastidiosa (X. fastidiosa) in olive orchards, employs two DC motor control drivers, specifically H-bridge modules. These modules give the operator precise control over the speed, torque, and direction of rotation of each wheel. The use of a dedicated driver for each wheel enhances maneuverability, which is particularly useful for achieving tighter turns and greater precision in the robot's movements. RobHortic is a field robot designed to detect the presence of pests and diseases in horticultural crops. The robot consists of a frame with four fat-bike wheels designed to navigate uneven terrain seamlessly. While the front wheels remain fixed, the two rear wheels, comparatively smaller, can rotate freely, enhancing the robot's adaptability to diverse field conditions.
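
As an illustration of this velocity-profile relationship, the following minimal Python sketch converts a commanded forward speed and yaw rate into left and right wheel angular velocities. The wheel radius and track width are hypothetical values, not parameters of Xf-Rovim or RobHortic.

```python
def diff_drive_wheel_speeds(v, omega, wheel_radius=0.15, track_width=0.5):
    """Map a body command (v [m/s], omega [rad/s]) to left/right wheel
    angular velocities [rad/s] for a differential-drive robot.
    Equal wheel speeds (omega = 0) give straight-line motion."""
    v_left = v - omega * track_width / 2.0   # linear speed of the left wheel
    v_right = v + omega * track_width / 2.0  # linear speed of the right wheel
    return v_left / wheel_radius, v_right / wheel_radius

# Example: gentle left arc at 0.8 m/s with a 0.3 rad/s yaw rate
w_left, w_right = diff_drive_wheel_speeds(0.8, 0.3)
print(f"left: {w_left:.2f} rad/s, right: {w_right:.2f} rad/s")
```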

Figure 1.

Examples of differential steering robots, (a) Xf-Rovim, (b) RobHortic [12, 13].

2.2 Skid steering

The skid steering mechanism operates like a differential wheel drive by controlling the relative speed of the left and right drives. Its main advantages are mechanical simplicity, robustness, and good maneuverability; however, drawbacks include increased wheel wear due to skidding, reduced lifespan, and difficulty maintaining a straight path [14]. Figure 2 shows examples of skid steering robotic platforms, including (a) Robotanist [16], (b) Vinobot [17], and (c) MARIA [15]. The Robotanist is an autonomous ground platform equipped with skid steering, designed for movement within the undergrowth of row crops such as sorghum or corn. Each of the four pneumatic tires is powered by a 200-watt brushless DC (BLDC) motor. This robotic system features a SCARA-style arm configuration, specifically tailored for measuring plant stem strength and collecting phenotypic data. Vinobot is a ground-based robotic platform designed for assessing plant phenotype and height and for estimating leaf area index. The robot is equipped with a linear slide guiding a JACO2 robotic arm from Kinova, enhancing lateral reach and enabling the handling of multiple sensors. MARIA is an indoor and outdoor robot developed using the Robot Operating System (ROS). MARIA's skid steering system, composed of off-the-shelf components, allows it to navigate crop rows or greenhouses while conducting phenotyping and soil measurements.
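
The relative-speed control described above is often implemented as simple command mixing. The sketch below is a generic illustration, not any of these platforms' actual controllers: it maps normalized throttle and turn commands to left and right drive commands.

```python
def skid_steer_mix(throttle, turn):
    """Arcade-style mixing for a skid-steered platform: throttle and turn
    are normalized commands in [-1, 1]; the two sides are driven at
    different speeds (or opposite directions for a pivot turn in place)."""
    left = max(-1.0, min(1.0, throttle + turn))
    right = max(-1.0, min(1.0, throttle - turn))
    return left, right

print(skid_steer_mix(0.5, 0.2))   # forward-right arc -> (0.7, 0.3)
print(skid_steer_mix(0.0, 1.0))   # pivot turn in place -> (1.0, -1.0)
```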

Figure 2.

Examples of skid steering robots, (a) Robotanist, (b) Vinobot, and (c) MARIA [15, 16, 17].

2.3 Ackerman steering

Ackerman robots integrate a rotating mechanism that prevents sliding on curved paths by aligning the rotation centers of all wheels to a single point. Since the rear wheels do not steer, this pivot point lies along the line extending from the rear axle. Additionally, the inner front wheel turns at a greater angle than the outer front wheel, ensuring compliance with this fundamental principle [18]. The advantages of this design are increased control, better stability, and improved maneuverability on roads. Figure 3 shows two different Ackerman robot platforms. AgriRover-01 is a mobile agricultural robot for collecting plant height information and estimating row spacing in corn. This robot is equipped with six motors: four in-wheel motors for driving and two motors for steering. The four driving wheels are divided into front and rear pairs, with each pair employing a gear mechanism to enable simultaneous rotation. This configuration implements Ackerman steering for effective and coordinated navigation in the agricultural environment [20]. Another example of Ackerman steering is DairyBioBot, a fully automated phenotyping platform used for ryegrass biomass estimation. It has a lightweight four-wheel platform that employs front-wheel drive to ensure efficient mobility and precise data collection in field operations [19].
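
The single-turning-point condition can be written directly as geometry. The following sketch uses hypothetical dimensions, not those of AgriRover-01 or DairyBioBot, to compute the distinct inner and outer front-wheel angles for a turn of a given radius measured at the rear-axle midpoint.

```python
import math

def ackermann_angles(wheelbase, track, turn_radius):
    """Inner/outer front-wheel steering angles [rad] so that all wheel
    axes intersect at one point on the rear-axle line. The inner wheel
    (closer to the turn center) always turns at a greater angle."""
    inner = math.atan(wheelbase / (turn_radius - track / 2.0))
    outer = math.atan(wheelbase / (turn_radius + track / 2.0))
    return inner, outer

# Example: 1.2 m wheelbase, 0.9 m track, 4 m turn radius
inner, outer = ackermann_angles(1.2, 0.9, 4.0)
print(f"inner: {math.degrees(inner):.1f} deg, outer: {math.degrees(outer):.1f} deg")
```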

Figure 3.

Examples of Ackerman steering robots, (a) AgriRover-01, (b) DairyBioBot [19, 20].

2.4 Articulated steering

Articulated steering resembles Ackerman steering in its goal of slip-free turning, but the robot achieves rotation by deforming the entire chassis rather than turning the wheels relative to it. The mechanical complexity of articulated steering robots is therefore higher than that of Ackerman robots. The chassis of an articulated-steering robot is divided into two halves, front and rear, connected by a vertical hinge, and a motor changes the angle of the front part of the chassis to induce the desired rotation [21]. Robots with articulated steering include PhenoBot 3.0 and Agri.Q, shown in Figure 4. PhenoBot 3.0 is a mobile robot designed for data acquisition in corn and sorghum through under-canopy navigation. Its rover body consists of two parts, front and back, connected by an articulated steering joint. It is equipped with a central articulated steering mechanism and a differential gear-based power transmission that enables the robot to move quickly and efficiently between narrow crop rows [23]. Agri.Q is an eight-wheel articulated robot designed for precision agriculture applications, particularly crop sampling and monitoring in unstructured agricultural environments. The robot combines a novel articulated mobile base with a commercial collaborative manipulator with seven degrees of freedom. This design ensures wheel contact with the ground even on highly uneven terrain, and the ability to distribute the robot's weight evenly among multiple contact points helps prevent excessive soil compaction during operation [22].
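
Under the usual planar, no-slip assumptions, the hinge angle fixes the turning radius. The sketch below illustrates this relationship for a generic articulated chassis; the hinge-to-axle distances are hypothetical, not PhenoBot 3.0 or Agri.Q parameters.

```python
import math

def articulated_turn_radius(l_front, l_rear, gamma):
    """Turning radius [m] of the rear-axle midpoint of an articulated
    chassis, assuming planar motion and no wheel slip. l_front and l_rear
    are the hinge-to-axle distances [m]; gamma is the articulation
    (hinge) angle [rad]. Derived from intersecting the two axle lines
    at the instantaneous center of rotation."""
    return (l_front + l_rear * math.cos(gamma)) / math.sin(gamma)

# Example: 0.6 m from the hinge to each axle, 20 degrees of articulation
print(f"{articulated_turn_radius(0.6, 0.6, math.radians(20)):.2f} m")
```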

Figure 4.

Examples of articulated steering robots, (a) PhenoBot 3.0, (b) Agri.Q [22, 23].

2.5 Other steering mechanisms

Due to the diverse types of agricultural tasks and the complex conditions of the field, agricultural robots employ various other steering mechanisms to accomplish their operational tasks. A weeding robot has been developed to address the challenges of using mobile robots to remove weeds in paddy fields. The robot churns up the soil to remove weeds and prevents further weed growth by blocking sunlight. It features two wheels whose rotational speed is regulated by pulse width modulation (PWM) signals [24]. BoniRob is an agricultural mobile robot featuring omnidirectional steering that incorporates diverse hardware modules according to the research application. These modules are strategically integrated to facilitate plant phenotyping, soil monitoring, weed control, and web-based communication. With its four driving and auto-steering wheels, BoniRob boasts a high level of maneuverability [25]. TERRA-MEPP is a tracked robot specifically designed for sorghum phenotype analysis that enables adaptable sensor positioning to address varying canopy heights during the growth cycle [26]. Figure 5 displays some of the robots that use these other steering mechanisms.
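
As an illustration of PWM-based wheel speed regulation, the following sketch assumes a Raspberry Pi with the RPi.GPIO library and hypothetical pin assignments (it is not the cited weeding robot's actual controller). Each wheel's speed is set through its PWM duty cycle.

```python
import RPi.GPIO as GPIO  # assumes a Raspberry Pi with RPi.GPIO installed

LEFT_PWM_PIN, RIGHT_PWM_PIN = 18, 19  # hypothetical BCM pin numbers

GPIO.setmode(GPIO.BCM)
GPIO.setup(LEFT_PWM_PIN, GPIO.OUT)
GPIO.setup(RIGHT_PWM_PIN, GPIO.OUT)

left = GPIO.PWM(LEFT_PWM_PIN, 1000)   # 1 kHz PWM carrier
right = GPIO.PWM(RIGHT_PWM_PIN, 1000)
left.start(0)                          # start stationary (0% duty)
right.start(0)

def set_wheel_speeds(left_pct, right_pct):
    """Set each wheel's speed as a PWM duty cycle in percent (0-100);
    unequal duty cycles make the two-wheeled platform turn."""
    left.ChangeDutyCycle(max(0.0, min(100.0, left_pct)))
    right.ChangeDutyCycle(max(0.0, min(100.0, right_pct)))

set_wheel_speeds(60, 40)  # e.g., a gentle right turn
```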

Figure 5.

Examples of other steering mechanisms, (a) paddy weeding robot, (b) BoniRob, and (c) TERRA-MEPP [24, 25, 26].

3. Mobile robot sensing system

In mobile robots, sensors serve various purposes: phenotyping sensors measure phenotypic traits, while perception sensors support navigation. Perception sensors play a crucial role in localization and route planning [14]. Phenotyping sensors include vision sensors and non-vision sensors. Standard optical sensors detect light in the 400-700 nm wavelength range and provide red, green, and blue (RGB) reflectance values; sensors with different spectral bands offer different advantages. The two current technologies for vision-based sensors are CCD and CMOS [7]. The most widely used sensors beyond RGB cameras are the hyperspectral camera, the multispectral camera, the thermal camera, and LiDAR. LiDAR stands out as a heavily employed sensor in robotics due to its reduced sensitivity to ambient light and its capacity to measure distances accurately without direct contact. Hyperspectral imaging combines imaging and spectroscopy to produce multidimensional data; these sensors image at high spectral resolution over narrow bands, so they can distinguish plant responses to stress. Unlike hyperspectral imaging, which uses hundreds of individual wavelengths or narrow wavebands to generate data, multispectral techniques collect data in a small number of broader wavelength bands. In addition, multispectral imaging is more affordable and more flexible than the hyperspectral method. Thermal cameras detect radiation in the infrared wavelength range and display the measurements as images whose pixels contain temperature values [27].
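
A common way such multispectral data are turned into a plant health indicator is a band-ratio index. The sketch below computes the widely used normalized difference vegetation index (NDVI) from co-registered near-infrared and red bands; it is a generic illustration, not a pipeline from any specific robot discussed here.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index from co-registered
    near-infrared and red reflectance bands (arrays scaled to [0, 1]).
    NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to 1."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)  # eps avoids division by zero

# Healthy, well-watered canopies typically yield NDVI around 0.6-0.9,
# while stressed vegetation and bare soil fall closer to 0.1-0.4.
```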

3.1 Plant health assessment sensors

LiDAR sensors are increasingly applied in assessing crop health because they can measure 3D point clouds of crops [7]. While such sensors offer additional information such as depth and temperature, their higher cost hinders practical implementation [28]. When selecting a sensor for an agricultural robot, efficiency should be emphasized rather than cost alone. The agricultural environment introduces variables like temperature, humidity, and dust, which directly impact sensor performance. It therefore becomes crucial to choose sensors with high Ingress Protection (IP) ratings such as IP65, IP66, or IP67. These sensors need to operate effectively across varying temperature and humidity conditions while remaining affordable [28]. An overview of some agricultural mobile robots and their sensors is shown in Table 1.

| Robot name | Summary of application | Sensors |
| --- | --- | --- |
| Ladybird [29] | Broccoli monitoring | Hyperspectral camera (Resonon Pika XC2), thermal camera (Xenics Gobi-640-GigE), stereo camera (GS3-U3-120S6C-C) |
| AgriRover-01 [20] | Maize phenotyping | LiDAR (Velodyne HDL64E-S3) |
| Scout 2.0 [30] | Grapevine phenotyping | LiDAR (VLP16), RealSense camera |
| Robotanist [31] | Sorghum and corn phenotyping | Custom stereo camera, Point Grey Flea3 cameras |
| RobHortic [12] | Pest and disease detection in carrot | Multispectral camera (CMS-V), DSLR camera (EOS 600D), NIR camera (400-1000 nm), hyperspectral camera (InSpectralVNIR), thermal camera (A320) |
| XF-ROVIM [13] | Xylella fastidiosa detection in olive groves | DSLR camera (EOS 600D, Canon Inc.), multispectral camera (CMS-V, Silios), hyperspectral NIR camera (Inspector V10 spectrograph), thermal camera (A320), LiDAR (LMS111) |

Table 1.

A summary of agricultural mobile robots and sensors.

3.2 Drought stress detection sensors

RGB sensors can detect plant responses to water stress based on visible parameters [27]. However, many factors can affect RGB imaging, such as lighting, environmental conditions, time of day, and spectral resolution; image processing and machine learning techniques can be used to overcome these challenges. Hyperspectral imaging can recover physiological and phytochemical parameters, which is why it is used to detect drought stress in crops [32]. When these sensors are used to detect stress, rather than comparing the spectral information of stressed and non-stressed regions across a complete leaf image, the affected area is segmented and specific imaging patterns and characteristics are identified [33]. Thermal imaging is the most useful modality for water stress indicators and drought detection, although environmental factors such as solar radiation, air temperature, wind speed, and background temperature can easily affect field measurements [27]. LiDAR acquires various canopy and leaf parameters, such as vegetation cover, height, canopy structure, leaf area index, and nitrogen status [27]. Using LiDAR sensors on ground-based platforms requires integration with GPS and wheel-encoder positioning systems for georeferencing. To characterize drought stress in corn, a terrestrial laser scanner has been used to collect LiDAR data, detecting three drought-related phenotypes: plant height, plant area index, and projected leaf area [34].
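
Thermal measurements of the kind described above are commonly condensed into a water stress indicator. Below is a minimal sketch of the crop water stress index (CWSI), one standard thermal index (not necessarily the one used in the cited studies), with illustrative temperatures.

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index: 0 = fully transpiring (unstressed),
    1 = non-transpiring (fully stressed). t_wet and t_dry are reference
    temperatures [deg C] of a fully transpiring and a non-transpiring
    surface under the same conditions."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Illustrative values: canopy at 31.5 C between 27 C wet and 36 C dry
# references gives CWSI = 0.5, i.e., moderate water stress.
print(cwsi(t_canopy=31.5, t_wet=27.0, t_dry=36.0))
```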

4. Internet of Robotic Things (IoRT)

IoRT combines the capabilities of robotic devices (such as sensors and actuators) with Internet of Things (IoT) infrastructure, enabling interconnected and intelligent robotic systems to collect, process, and exchange data autonomously. This convergence allows agricultural robots to operate in dynamic environments, interact with each other and with other IoT-enabled devices, and make data-driven decisions to perform a wide range of tasks across various farming domains. How robots communicate is influenced by factors such as the robot, the application, and the environment, which affect the choice of communication protocol and the robots' overall performance. Wireless communication protocols like WiFi [35], ZigBee [36], and Bluetooth [37] are frequently used in digital agriculture. ZigBee is used in agricultural robots for plant health assessment because of its hardware availability, low power consumption, and flexible communication models that permit both IP-based and simpler serial-like messaging. However, it also has drawbacks, including relatively low range and data rates. To address this issue, LoRaWAN emerges as a viable alternative, maintaining the positive attributes of ZigBee for robotic applications while offering extended transmission ranges of up to 16 km. Cellular networks like 4G overcome challenges faced by local-area network standards by employing a nearly ubiquitous infrastructure-based communication model and benefiting from economies of scale [38]. Today, 4G is available in 85% of global locations [39], with access latency on the order of 50 ms [40]. The 5G network provides a secure communication infrastructure with low-latency capabilities. Additionally, with the increasing adoption and success of 3G and 4G, the number of devices supporting 5G is expected to expand as technology advances, leading to more affordable and energy-efficient devices on the market.
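
In an IoRT deployment of this kind, a robot typically publishes its sensor readings to a broker over a lightweight messaging protocol. The following sketch assumes the paho-mqtt Python package and a hypothetical broker address and topic layout; it publishes one telemetry reading over MQTT.

```python
import json
import paho.mqtt.publish as publish  # assumes the paho-mqtt package

# Hypothetical broker address and topic layout for a farm deployment
BROKER = "broker.farm.local"
TOPIC = "farm/field3/rover01/telemetry"

reading = {
    "ts": "2024-06-19T10:42:00Z",       # UTC timestamp of the sample
    "soil_moisture_pct": 18.4,
    "canopy_temp_c": 31.5,
    "ndvi": 0.62,
    "gps": [52.52, 13.40],              # [lat, lon]
}

# QoS 1 gives at-least-once delivery over lossy farm networks; retain=True
# lets late-joining consumers see the most recent reading immediately.
publish.single(TOPIC, json.dumps(reading), qos=1, retain=True, hostname=BROKER)
```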

The multi-robot system, a recent technological development, involves mobile robots collaborating in the agricultural environment. Its purpose is to overcome the limitations of tasks that a single mobile robot cannot accomplish alone [31]. Multi-robot systems offer certain benefits over individual mobile robots in precision agriculture applications, including increased effectiveness, efficiency, flexibility, and fault tolerance. Moreover, for efficient operation control, all robots require a reliable, high-throughput network connection with low latency [39]. One example is the work of Amer et al. [41], in which an autonomous agricultural robot prototype (AgriBot) was developed for various agricultural activities such as seeding, weeding, and spraying fertilizers and insecticides. The robot is equipped with WiFi communication, enabling it to communicate with the operator and other robots. Pretto et al. [42] employed the WiFi protocol for communication between unmanned aerial and ground vehicles, enabling the monitoring of crop density, weed pressure, and nitrogen (N) nutrition status. The robots accurately classified and located weeds during the spraying process, and WiFi communication facilitated coordinated missions among the robots, enhancing their collaborative capabilities.

5. Machine vision algorithms

In agricultural mobile robots used for plant health assessment, computer vision algorithms, particularly deep learning, play a key role in extracting features from sensor images. Deep learning algorithms draw inspiration from the functioning of the human brain: a deep model is a neural network, such as a multilayer perceptron (MLP), trained to learn from datasets without manual feature extraction, with functions repeated across several layers to build the architecture [43]. These algorithms enable disease detection, weed detection, and yield estimation. An example of such a solution is the work of Blue River Technology, which developed a robotic system called "See & Spray" that uses computer vision and machine learning for plant health assessment. The system employs two sets of cameras, and deep learning algorithms analyze the images to differentiate between crops and weeds. This enables targeted herbicide application, reducing overall herbicide use [44]. Ladybird is another example of a robot that uses deep learning for crop detection: Faster R-CNN allows Ladybird to detect and analyze crops efficiently, providing valuable information for monitoring plant health [29]. Weed scouting is an important part of modern integrated weed management, and with the advancement of technologies such as machine learning and robotics, both time and money can be saved. Hall et al. [45] utilized DCNN algorithms on data collected by the AgBotII robot to develop a weed detection model for cotton plants. Similarly, McCool et al. [46] used a lightweight DCNN to identify weeds. Florance Mary and Yogaraman [47] applied a CNN for detecting weeds between crops and removing them using a blade attached to the end of the robotic arm.
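
To make the crop/weed discrimination step concrete, the following is a minimal PyTorch sketch of a CNN classifier for RGB patches. It is an illustrative toy, not the architecture used by See & Spray, Ladybird, or the cited DCNN studies.

```python
import torch
import torch.nn as nn

class CropWeedCNN(nn.Module):
    """Minimal CNN that classifies an RGB image patch as crop vs. weed."""

    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),     # global average pooling
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        x = self.features(x).flatten(1)  # (batch, 64)
        return self.classifier(x)        # (batch, n_classes) logits

model = CropWeedCNN()
patch = torch.randn(1, 3, 64, 64)        # one 64x64 RGB patch
logits = model(patch)
print(logits.shape)                      # torch.Size([1, 2])
```

In a deployed system, patches cropped around plant detections would be fed through such a network, and the per-class logits would gate the sprayer or weeding tool.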

Artificial intelligence algorithms also play a crucial role in autonomous navigation for agricultural robots. A typical agricultural environment poses many challenges, such as terrain roughness, soil conditions, obstacles, and multiple crop rows to navigate. These challenges can be tackled effectively by training navigation models on machine vision data. Compared to costly navigation hardware like Real-Time Kinematic Global Positioning Systems (RTK-GPS), machine vision sensors and algorithms offer a cost-effective alternative, enabling robots to identify optimal paths and navigate efficiently through agricultural fields [48]. U-Net has been used to process images captured by the Husky robot to predict crop rows and follow the central crop row in a sugar beet field [49]. De Silva et al. [49] presented an agricultural robot that uses an ANN for precise path planning. Oliveira et al. [28] presented work on the navigation of an agricultural robot in greenhouses, implementing an artificial potential field (APF), Hector SLAM, and LiDAR on the robot. AI techniques can minimize human error and significantly reduce time consumption by frequently providing data in real time, and combining various AI techniques can further enhance effectiveness in solving agricultural problems.
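
Once a network such as U-Net has segmented the crop row, the resulting mask can be reduced to a steering command. The sketch below is a generic proportional controller, not the cited systems' actual logic: it steers toward the horizontal centroid of a binary row mask.

```python
import numpy as np

def steering_from_row_mask(mask, k_p=0.8):
    """Proportional steering from a binary crop-row mask (H x W array).
    Returns a normalized steering command in [-1, 1]; negative steers
    left. k_p is the proportional gain on the lateral offset."""
    h, w = mask.shape
    cols = np.nonzero(mask)[1]                    # columns of row pixels
    if cols.size == 0:
        return 0.0                                # no row detected: hold course
    error = (cols.mean() - w / 2.0) / (w / 2.0)   # normalized lateral offset
    return float(np.clip(k_p * error, -1.0, 1.0))
```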

6. Challenges

Robots demonstrate remarkable potential to replace traditional methods for various agricultural tasks and plant care. Despite the advantages of using robots for plant health diagnosis, several challenges remain. For example, identifying pests and weeds necessitates comprehensive data acquisition for each plant, which requires extensive databases to store detailed information. Early detection of plant health issues can avert substantial economic losses in the agricultural sector [50]. Furthermore, vision-based robot and vehicle navigation offers promising prospects for plant health assessment, but images obtained from agricultural environments such as orchards or farmland often contain high levels of background noise [51]. Implementing vision navigation systems is challenging because varying lighting conditions affect image quality and navigation baseline extraction, and wind can cause plant movement, resulting in blurred images and inaccurate detection [52]. Terrain slope, soil type, weeds, and dust further complicate vision systems. In addition, most agricultural robots employ 4WD systems, which have drawbacks such as soil compaction and increased dependence on environmental conditions [28]. Finally, the complex and restrictive regulatory landscape poses obstacles for farmers integrating these technologies. Consequently, further studies are needed to validate the effectiveness of robotics in agriculture.

Future research should focus on improving motion systems, computer vision algorithms, and communication technologies for agricultural robots. One recent development is the use of legged robots, which can adapt to varying ground slopes and maintain continuous soil contact, thus reducing soil erosion [53]. Advanced neural networks, including convolutional neural networks (CNNs) [54] and deep neural networks (DNNs) [55], facilitate the identification of diseases and weeds, and deep learning techniques enhance visual information processing, such as image segmentation, feature detection, and target recognition, by reducing environmental noise and vegetation overlap. The integration of IoT technologies with robotic systems can advance smart agriculture by enhancing process control, monitoring, and standardization through artificial intelligence and machine-to-machine communications [56]. This integration holds promise for addressing immediate and long-term agricultural challenges and facilitating the development of precision multi-purpose robotic systems. Furthermore, multi-robot collaboration, involving both ground and aerial vehicles, can enhance vision-based navigation through real-time data exchange and a more comprehensive understanding of the environment.

7. Conclusion

This chapter provided insights into enhancing crop management practices through the exploration of agricultural mobile robots equipped with advanced sensing technologies and computer vision algorithms. These robots offer significant advantages over traditional manual methods, including increased efficiency, accuracy, and timeliness in data collection and analysis. However, the agricultural environment’s complexity and dynamism present challenges for robots to move beyond small-scale applications or the prototype phase. Research trends indicate that leveraging machine learning techniques enables these robots to effectively analyze data collected on plant health indicators, facilitating early detection and mitigation of plant diseases, nutrient deficiencies, and drought stress. This presents a promising solution for addressing critical challenges in crop production, particularly in plant health assessment and drought stress detection. The adoption of appropriate steering mechanisms tailored to specific agricultural tasks and environmental conditions is crucial for optimizing robot performance and maneuverability. The discussion on mobile robot sensing systems underscores the critical role of sensors in comprehensive data collection and analysis. While technologies such as LiDAR, multispectral imaging, and thermal imaging offer promising capabilities, challenges such as sensor reliability and cost-effectiveness remain to be addressed to ensure the robustness and scalability of sensing systems in agricultural robotics. Additionally, the potential of interconnected robotic systems, facilitated by the Internet of Robotic Things (IoRT), to revolutionize precision agriculture through enhanced communication and collaboration capabilities was highlighted. Despite the transformative potential of agricultural robotics, several challenges must be addressed, including the establishment of comprehensive databases for plant health diagnosis, the development of weather-independent robots, the mitigation of soil compaction, and optimization of AI algorithms for real-time decision-making.

Advertisement

Acknowledgments

We gratefully acknowledge the funding provided by Adaptive AgroTech Consultancy for this study.

References

  1. Lowenberg-DeBoer J, Erickson B. Setting the record straight on precision agriculture adoption. Agronomy Journal. 2019;111(4):1552-1569
  2. Ahluwalia O, Singh PC, Bhatia R. A review on drought stress in plants: Implications, mitigation and the role of plant growth promoting rhizobacteria. Resources, Environment and Sustainability. 2021;5:100032
  3. Zaidi NW, Dar MH, Singh S, Singh US. Chapter 38—Trichoderma species as abiotic stress relievers in plants. In: Gupta VK, Schmoll M, Herrera-Estrella A, Upadhyay RS, Druzhinina I, Tuohy MG, editors. Biotechnology and Biology of Trichoderma. Elsevier; 2014. pp. 515-525. DOI: 10.1016/B978-0-444-59576-8.00038-2. ISBN 9780444595768
  4. Kaur G, Asthir B. Molecular responses to drought stress in plants. Biologia Plantarum. 2017;61(2):201-209. DOI: 10.1007/s10535-016-0700-9
  5. Pautasso M, Dehnen-Schmutz K, Holdenrieder O, Pietravalle S, Salama N, Jeger MJ, et al. Plant health and global change–some implications for landscape management. Biological Reviews. 2010;85(4):729-755
  6. Shafi U, Mumtaz R, García-Nieto J, Hassan SA, Zaidi SAR, Iqbal N. Precision agriculture techniques and practices: From considerations to applications. Sensors. 2019;19(17):3796
  7. Wang H, Lin Y, Wang Z, Yao Y, Zhang Y, Wu L. Validation of a low-cost 2D laser scanner in development of a more-affordable mobile terrestrial proximal sensing system for 3D plant structure phenotyping in indoor environment. Computers and Electronics in Agriculture. 2017;140:180-189
  8. Yang G, Liu J, Zhao C, Li Z, Huang Y, Yu H, et al. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: Current status and perspectives. Frontiers in Plant Science. 2017;8:1111
  9. Edan Y, Adamides G, Oberti R. Agriculture automation. In: Nof SY, editor. Springer Handbook of Automation. Springer Handbooks. Cham: Springer; 2023. DOI: 10.1007/978-3-030-96729-1_49
  10. Atefi A, Ge Y, Pitla S, Schnable J. Robotic technologies for high-throughput plant phenotyping: Contemporary reviews and future perspectives. Frontiers in Plant Science. 2021;12:611940
  11. Rubio F, Valero F, Llopis-Albert C. A review of mobile robots: Concepts, methods, theoretical framework, and applications. International Journal of Advanced Robotic Systems. 2019;16(2):1729881419839596
  12. Cubero S, Marco-Noales E, Aleixos N, Barbé S, Blasco J. RobHortic: A field robot to detect pests and diseases in horticultural crops by proximal sensing. Agriculture. 2020;10(7):276
  13. Rey B, Aleixos N, Cubero S, Blasco J. Xf-Rovim. A field robot to detect olive trees infected by Xylella fastidiosa using proximal sensing. Remote Sensing. 2019;11(3):221
  14. Xu R, Li C. A review of high-throughput field phenotyping systems: Focusing on ground robots. Plant Phenomics. 2022;2022. DOI: 10.34133/2022/9760269
  15. Iqbal J, Xu R, Halloran H, Li C. Development of a multi-purpose autonomous differential drive mobile robot for plant phenotyping and soil sensing. Electronics. 2020;9(9):1550
  16. Mueller-Sim T, Jenkins M, Abel J, Kantor G. The Robotanist: A ground-based agricultural robot for high-throughput crop phenotyping. In: 2017 IEEE International Conference on Robotics and Automation (ICRA). Singapore; 2017. pp. 3634-3639. DOI: 10.1109/ICRA.2017.7989418
  17. Shafiekhani A, Kadam S, Fritschi FB, DeSouza GN. Vinobot and Vinoculer: Two robotic platforms for high-throughput field phenotyping. Sensors. 2017;17(1):214
  18. Zhang H, Zhang Y, Yang T. A survey of energy-efficient motion planning for wheeled mobile robots. Industrial Robot: The International Journal of Robotics Research and Application. 2020;47(4):607-621
  19. Nguyen P, Badenhorst PE, Shi F, Spangenberg GC, Smith KF, Daetwyler HD. Design of an unmanned ground vehicle and lidar pipeline for the high-throughput phenotyping of biomass in perennial ryegrass. Remote Sensing. 2020;13(1):20
  20. Qiu Q, Sun N, Bai H, Wang N, Fan Z, Wang Y, et al. Field-based high-throughput phenotyping for maize plant using 3D LiDAR point cloud generated with a "phenomobile". Frontiers in Plant Science. 2019;10:554. DOI: 10.3389/fpls.2019.00554
  21. Botta A, Cavallone P, Baglieri L, Colucci G, Tagliavini L, Quaglia G. In depth analysis of power balance, handling, and the traction subsystem of an articulated skid-steering robot for sustainable agricultural monitoring. SN Applied Sciences. 2023;5(4):103
  22. Colucci G, Botta A, Tagliavini L, Cavallone P, Baglieri L, Quaglia G. Kinematic modeling and motion planning of the mobile manipulator Agri.Q for precision agriculture. Machines. 2022;10(5):321
  23. Gai J, Xiang L, Tang L. Using a depth camera for crop row detection and mapping for under-canopy navigation of agricultural robotic vehicle. Computers and Electronics in Agriculture. 2021;188:106301
  24. Sori H, Inoue H, Hatta H, Ando Y. Effect for a paddy weeding robot in wet rice culture. Journal of Robotics and Mechatronics. 2018;30(2):198-205
  25. Biber P, et al. Navigation system of the autonomous agricultural robot BoniRob. In: Workshop on Agricultural Robotics: Enabling Safe, Efficient, and Affordable Robots for Food Production (Collocated with IROS 2012). Vilamoura, Portugal; 2012
  26. Young SN, Kayacan E, Peschel JM. Design and field evaluation of a ground robot for high-throughput phenotyping of energy sorghum. Precision Agriculture. 2019;20(4):697-722
  27. Kim J, Kim K, Kim Y-H, Chung YS. A short review: Comparisons of high-throughput phenotyping methods for detecting drought tolerance. Scientia Agricola. 2020;78(4):1-8. DOI: 10.1590/1678-992x-2019-0300
  28. Oliveira LF, Moreira AP, Silva MF. Advances in agriculture robotics: A state-of-the-art review and challenges ahead. Robotics. 2021;10(2):52. DOI: 10.3390/robotics10020052
  29. Bender A, Whelan B, Sukkarieh S. A high-resolution, multimodal data set for agricultural robotics: A Ladybird's-eye view of Brassica. Journal of Field Robotics. 2020;37(1):73-96. DOI: 10.1002/rob.21877
  30. Tiozzo Fasiolo D, Pichierri A, Sivilotti P, Scalera L. An analysis of the effects of water regime on grapevine canopy status using a UAV and a mobile robot. Smart Agricultural Technology. 2023;6:100344. DOI: 10.1016/j.atech.2023.100344
  31. Zhang C, Noguchi N, Yang L. Leader–follower system using two robot tractors to improve work efficiency. Computers and Electronics in Agriculture. 2016;121:269-281. DOI: 10.1016/j.compag.2015.12.015
  32. Behmann J, Steinrücken J, Plümer L. Detection of early plant stress responses in hyperspectral images. ISPRS Journal of Photogrammetry and Remote Sensing. 2014;93:98-111. DOI: 10.1016/j.isprsjprs.2014.03.016
  33. Zubler AV, Yoon J-Y. Proximal methods for plant stress detection using optical sensors and machine learning. Biosensors. 2020;10(12):193
  34. Su Y, Wu F, Ao Z, Jin S, Qin F, Liu B, et al. Evaluating maize phenotype dynamics under drought stress using terrestrial lidar. Plant Methods. 2019;15:11. DOI: 10.1186/s13007-019-0396-x
  35. Hajjaj SSH, Sahari KSM. Review of agriculture robotics: Practicality and feasibility. In: 2016 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS). Tokyo, Japan; 2016. pp. 194-198. DOI: 10.1109/IRIS.2016.8066090
  36. Bodunde O, Adie U, Ikumapayi O, Akinyoola J, Aderoba A. Architectural design and performance evaluation of a ZigBee technology based adaptive sprinkler irrigation robot. Computers and Electronics in Agriculture. 2019;160:168-178
  37. Sowjanya KD, Sindhu R, Parijatham M, Srikanth K, Bhargav P. Multipurpose autonomous agricultural robot. In: 2017 International Conference of Electronics, Communication and Aerospace Technology (ICECA). Coimbatore, India; 2017. pp. 696-699. DOI: 10.1109/ICECA.2017.8212756
  38. Gielis J, Shankar A, Prorok A. A critical review of communications in multi-robot systems. Current Robotics Reports. 2022;3(4):213-225. DOI: 10.1007/s43154-022-00090-9
  39. Zhivkov T, Sklar EI, Botting D, Pearson S. 5G on the farm: Evaluating wireless network capabilities and needs for agricultural robotics. Machines. 2023;11(12):1064. Available from: https://www.mdpi.com/2075-1702/11/12/1064
  40. Tahir MN, Katz M. Performance evaluation of IEEE 802.11p, LTE and 5G in connected vehicles for cooperative awareness. Engineering Reports. 2022;4(4):e12467. DOI: 10.1002/eng2.12467
  41. Amer G, Mudassir SMM, Malik MA. Design and operation of Wi-Fi agribot integrated system. In: 2015 International Conference on Industrial Instrumentation and Control (ICIC); Pune, India; 28-30 May 2015. pp. 207-212. DOI: 10.1109/IIC.2015.7150739
  42. Pretto A, Aravecchia S, Burgard W, Chebrolu N, Dornhege C, Falck T, et al. Building an aerial-ground robotics system for precision farming: An adaptable solution. arXiv e-prints, arXiv:1911.03098. 2019. DOI: 10.48550/arXiv.1911.03098
  43. Alzubaidi L, Zhang J, Humaidi AJ, Al-Dujaili A, Duan Y, Al-Shamma O, et al. Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. Journal of Big Data. 2021;8(1):53. DOI: 10.1186/s40537-021-00444-8
  44. Yeshe A, Gourkhede P, Vaidya PH. Blue River Technology: Futuristic approach of precision farming. Punjab, India: Just Agriculture; 2022
  45. Hall D, Dayoub F, Kulk J, McCool C. Towards unsupervised weed scouting for agricultural robotics. In: Chen IM, Nakamura Y, editors. Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA). Singapore; 2017. pp. 5223-5230. DOI: 10.1109/ICRA.2017.7989612
  46. McCool C, Perez T, Upcroft B. Mixtures of lightweight deep convolutional neural networks: Applied to agricultural robotics. IEEE Robotics and Automation Letters. 2017;2(3):1344-1351. DOI: 10.1109/LRA.2017.2667039
  47. Florance Mary M, Yogaraman D. Neural network based weeding robot for crop and weed discrimination. Journal of Physics: Conference Series. 2021;1979(1):012027. DOI: 10.1088/1742-6596/1979/1/012027
  48. Shamshiri R, Hameed I, Pitonakova L, Weltzien C, Balasundram S, Yule I, et al. Simulation software and virtual environments for acceleration of agricultural robotics: Features highlights and performance comparison. International Journal of Agricultural and Biological Engineering. 2018;11:15-31. DOI: 10.25165/ijabe.v11i4.4032
  49. De Silva R, Cielniak G, Wang G, Gao J. Deep learning-based crop row detection for infield navigation of agri-robots. Journal of Field Robotics. 2023;32:162-175
  50. Chapman SC, Merz T, Chan A, Jackway P, Hrabar S, Dreccer MF, et al. Pheno-copter: A low-altitude, autonomous remote-sensing robotic helicopter for high-throughput field-based phenotyping. Agronomy. 2014;4(2):279-301. Available from: https://www.mdpi.com/2073-4395/4/2/279
  51. Chen M, Tang Y, Zou X, Huang Z, Zhou H, Chen S. 3D global mapping of large-scale unstructured orchard integrating eye-in-hand stereo vision and SLAM. Computers and Electronics in Agriculture. 2021;187:106237
  52. Mavridou E, Vrochidou E, Papakostas GA, Pachidis T, Kaburlasos VG. Machine vision systems in precision agriculture for crop farming. Journal of Imaging. 2019;5(12):89
  53. Oliveira LFP, Rossini FL. Modeling, simulation and analysis of locomotion patterns for hexapod robots. IEEE Latin America Transactions. 2018;16:375-383
  54. Emmi L, Le Flécher E, Cadenat V, Devy M. A hybrid representation of the environment to improve autonomous navigation of mobile robots in agriculture. Precision Agriculture. 2021;22(2):524-549
  55. Adhikari SP, Kim G, Kim H. Deep neural network-based system for autonomous navigation in paddy field. IEEE Access. 2020;8:71272-71278
  56. Cui F. Deployment and integration of smart sensors with IoT devices detecting fire disasters in huge forest environment. Computer Communications. 2020;150:818-827
