Open access peer-reviewed chapter

Use of RGB Images in Field Conditions to Evaluate the Quality of Pastures in Farms in Antioquia: A Methodology

Written By

Valentina Parilli-Ocampo, Manuela Ortega Monsalve, Mario Cerón-Muñoz, Luis Galeano-Vasco and Marisol Medina-Sierra

Submitted: 25 September 2023 Reviewed: 15 January 2024 Published: 26 February 2024

DOI: 10.5772/intechopen.114198

From the Edited Volume

Precision Agriculture - Emerging Technologies

Edited by Redmond R. Shamshiri, Sanaz Shafian and Ibrahim A. Hameed


Abstract

The use of RGB (red, green, and blue) images is a useful technique for predicting diseases, moisture content, height, and nutritional composition in different crops of productive interest. It is important to adopt a field methodology that allows images to be acquired without losing the quality of the information in the RGB bands, since the prediction and adjustment of the grass quality parameters depend on it. Currently, few studies and methodologies support the validity of using RGB images in the field, since many environmental factors can distort the information collected. For this study, a field methodology was established in which RGB images were captured with a DJI Phantom 4 Pro unmanned aerial vehicle. A total of 270 images of grass crops for animal feed were taken on 15 farms in Antioquia. The images were pre-processed in the Python programming language: a region of interest was chosen for each image, and the average RGB values were extracted. Different indices were created from the RGB bands and, based on them, several models were fitted for the nutritional variables of the pasture, yielding suitable equations for acid detergent fiber, crude protein, and moisture.

Keywords

  • drone
  • region of interest
  • spectral bands
  • statistical models
  • vegetal nutrition
  • weather

1. Introduction

Light is radiant energy assessed visually: different wavelengths are perceived as different colors, and white light is electromagnetic radiation that is relatively balanced across all the wavelengths contained in the visible region [1].

The color of an object is determined by the selective absorption of different wavelengths within the visible spectrum. As indicated in Figure 1, the blue, green, and red colors of an object are quantified and depicted through spectrophotometric curves. These curves are graphical representations of the fraction of incident light, whether reflected or transmitted, as a function of wavelength across the visible spectrum [1, 2].

Figure 1.

Representation of spectrophotometric curves for red, green, and blue wavelengths (adapted from Eissa et al. [1] and Pharr et al. [2]).

Sensors using RGB cameras have become widely used tools in artificial vision and agriculture. These are passive cameras, transported by an unmanned aerial vehicle (UAV), that use natural light to capture the images [3]. Advances in digital agriculture have made it possible to extract information about the “vegetation indices” (VIs) of crops, which are mathematical operations on spectral data of the radiation reflected by the vegetation at various wavelengths [4]. These operations provide an understanding of characteristics such as the height of grasses by simply flying over the terrain and obtaining a photograph of the site.
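As a concrete illustration of such an index, the short sketch below computes the Excess Green index (ExG = 2g − r − b on chromatic coordinates), a widely used vegetation index that needs only the RGB bands. It is offered as a generic example of the idea, not as one of the indices fitted later in this chapter.

```python
import numpy as np

def excess_green(rgb):
    """Excess Green (ExG) vegetation index from an RGB array of shape (H, W, 3)."""
    # Normalize to chromatic coordinates so that r + g + b = 1 per pixel
    chromatic = rgb / np.clip(rgb.sum(axis=-1, keepdims=True), 1e-6, None)
    r, g, b = chromatic[..., 0], chromatic[..., 1], chromatic[..., 2]
    return 2 * g - r - b  # higher values indicate greener vegetation

# Example with a single bright-green pixel (values on the 0-255 scale)
pixel = np.array([[[80.0, 160.0, 60.0]]])
print(excess_green(pixel))  # ~0.6, i.e., strongly vegetated
```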

Experts in the agricultural industry have used unmanned aerial vehicles (UAVs), such as drones, to analyze pixels and RGB bands to identify nutritional factors in crops. These techniques offer ease of use, the tracking of areas that are difficult for humans to access, and the monitoring of different crop yields [5]. Studies have shown that UAVs can be used to monitor cultivation, with the advantage that the light source used is natural [6]. The use of UAVs has reached high levels of reliability in mapping areas of different sizes, becoming a solution and alternative at the farm level [7]. However, not enough research has been carried out showing a clear, field-applicable methodology along with adequate processing to verify the accuracy of the data collected. The use of RGB images in the field is intended to replace the bromatological analysis of plants, which plays a very important role in determining their nutritional composition before animals feed on them; moreover, there is a relationship between the elemental content of a plant and its nutritional status [8]. Knowing and quantifying the nutrient content of pastures is fundamental to decision making and to establishing adequate crop management in agricultural production [9]. Bromatological analysis of pastures is commonly used to determine the content of protein, fiber, sugars, and lipids, among others. The RGB imaging approach, however, additionally analyzes different characteristics through the image bands, allowing us to know, for example, the nutritional status of a plant [10].

The high spatial resolution and low cost of RGB cameras give them a great advantage, and they have been used extensively in pasture studies with UAVs [11]. When taking images in the field, it is important to consider several factors, such as the weather, light intensity, light angle, and altitude of the UAV. According to authors such as [12], these factors may have negative effects, making it challenging to conduct flights under unpredictable field conditions of lighting, shooting angle, and altitude. The flight methodology proposed by Tagle Casapía et al. [13] suggests that the flight be unmanned (autonomous), with the pilot remaining aware of the aircraft’s altitude and of any previously marked locations in the field. Additionally, a reflectance calibration panel should be used before and after the flight, as this data is essential to the construction of the model [14]. Overall, it is important to regulate certain field variables to obtain high-quality images and data, while maintaining a functional methodology that can be used by other researchers.

In addition to capturing images in the field, artificial intelligence (AI), intelligent automated systems (IAS), and machine learning models are necessary tools for extracting the information captured. Through the analysis of parameters, these tools also have the potential to help improve crops: they can streamline operations, observe environmental conditions, and predict the quantity and quality of yields [15, 16]. The goal of this chapter is to provide a methodology for using RGB images taken in the field to assess the quality of pastures on farms in Antioquia.

2. Methodology for the acquisition of RGB images in the field

Below is a practical, replicable example of applying the methodology for collecting field images using unmanned aerial vehicles. A series of steps and suggestions considered necessary to complete the process successfully is provided, followed by a practical example of the statistical processing of RGB data to fit several models.

2.1 Study area

The flights and bromatological analyses of the grass were carried out in different municipalities of the Department of Antioquia. The farms selected for this study, the number of bromatological analyses carried out, and the number of photographs taken with the drone in each of them are shown in Table 1. The only criteria considered in choosing the farms were their accessibility and the presence of pastures for dairy and beef animals.

| Subregion | Municipality | Farms | Bromatological analyses |
|---|---|---|---|
| West | Sopetrán | 2 | 2 |
| Magdalena Medio | Maceo | 1 | 2 |
| East | Abejorral | 2 | 4 |
| North | Yarumal | 2 | 4 |
| Southwest | Amagá | 2 | 4 |
| North | Santa Rosa de Osos | 2 | 4 |
| Urabá | Chigorodó | 2 | 4 |
| Bajo Cauca | Caucasia | 2 | 4 |
| Southwest | Támesis | 1 | 2 |

Table 1.

Region and municipality, sample size, and number of images for the properties chosen for the current study.

2.2 Flight planning

Planning a drone flight involves several steps that are critical to its success (Figure 2). These include the selection of the responsible crew, the appropriate choice of aircraft and flight programs, and the precise definition of the area of interest being covered. Each of these steps plays a crucial role in ensuring the efficient and effective execution of the flight to achieve optimal results.

Figure 2.

Steps for UAV flight planning.

2.2.1 Crew selection

In the use of a UAV, key roles are defined, each performing essential functions to ensure a safe and efficient operation. The pilot supervises operations and plans flight routes, while the visual observer plays a critical role in identifying potential airspace risks, maintaining detailed flight records, and assisting the pilot with equipment handling and calibration. The data manager, in turn, is responsible for managing the storage and transfer of information between flights, as well as assessing the quality of the images captured. For successful and safe flights, it is crucial to have a cooperative team that determines the objectives of the operation in advance.

2.2.2 Aircraft and software selection

Aircraft and flight planning software are critical to any aerial project. To achieve stable and effective flights, elements such as battery life and the availability of a stabilizer must be carefully considered. In addition, software must be selected that records the flight area and flight parameters, allowing autopilot flights under human supervision. This overall strategy ensures efficient data collection and the successful completion of the previously defined plan. Among the most common software programs for UAV control are DroneDeploy©, DJI Terra©, Litchi©, AirMap©, and Pix4Dcapture© (part of the Pix4D suite). These programs vary in approach and features, offering options suited to different needs and levels of experience.

Some of these programs are simple and do not require advanced knowledge; they offer the ability to plan automated flights and generate basic 3D maps and models. Others are specific to DJI drones, ensuring seamless integration with DJI equipment. Some offer more manual control over flights, allowing more detailed customization of the drone’s routes and actions. To keep flights safe and legal, their planning and control must factor in airspace regulations and constraints.

Programming the flight software requires a suitable flight plan. The UAV must fly directly over the region of interest, which defines the flight perimeter. Additionally, the flight statistics estimated by the flight software must be monitored; an average flight generates 150–200 photos. The drift angle can be modified to reduce flight time and to adjust the start and end points of the flight path. It is recommended to start the flight farther away and to end it closer to the starting point. Since image capture may entail some loss of resolution, it is advisable to maintain a target resolution; a resolution of 2–3 cm per pixel is recommended so that the processed images are of sufficient quality for the machine learning algorithms.
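To connect flight altitude with the 2–3 cm target, the sketch below estimates the ground sample distance (GSD) of a nadir image from the nominal DJI Phantom 4 Pro camera geometry (1-inch sensor roughly 13.2 mm wide, 8.8 mm focal length, 5472-pixel image width). These specifications are nominal assumptions, so treat the output as an approximation.

```python
def ground_sample_distance_cm(altitude_m,
                              sensor_width_mm=13.2,  # nominal 1-inch sensor width
                              focal_length_mm=8.8,   # nominal focal length
                              image_width_px=5472):  # image width in pixels
    """Approximate ground sample distance (cm/pixel) of a nadir image."""
    return (sensor_width_mm * altitude_m * 100) / (focal_length_mm * image_width_px)

for altitude in (60, 90, 120):
    print(f"{altitude} m -> {ground_sample_distance_cm(altitude):.2f} cm/px")
# 60 m gives ~1.6 cm/px and 120 m ~3.3 cm/px, bracketing the 2-3 cm target
```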

In this case, the PIX4Dcapture software (©Pix4D SA-2023) [17] was used for flight preparation. PIX4Dcapture is a tool dedicated to both flight mission organization and photogrammetry. It allows meticulous planning of flight paths, enabling the capture of images from different angles. The software additionally offers real-time tracking, automatic pre-launch inspections, and accurate image capture (Figure 3).

Figure 3.

Flight programming in PIX4Dcapture software.

Once the pre-flight procedures are completed, it is necessary to check the antenna alignment before starting the propeller motors. Take-off should be executed under manual control to prevent collisions during flight, and the pilot and the visual observer must keep regular visual contact. When landing, if an automatic landing is chosen, it is essential to monitor the behavior of the UAV during descent and to promptly rectify any aberrant behavior, such as misguided landing attempts. If, on the other hand, a manual landing is required, the drone should be guided carefully and slowly to the desired location. Once on the ground, it is important to download the data collected during the flight and shut down the drone. Integration with Pix4D provides a complete workflow, from planning to the advanced processing and analysis of captured data, making it easy to achieve accurate and detailed results.

2.2.3 Define the area of interest

The process of selecting the area of interest (AOI) is critically important to the objectives of the assessment, with a focus on ensuring access to the study area. An effective strategy for delineating this terrain is to create a polygon that outlines the boundaries of the AOI using a GPS system. Once established, the polygon can be uploaded to the PIX4Dcapture software in KMZ format. This allows flights to be planned within the specific area of interest, resulting in a more accurate and efficient use of flight resources (Figure 4).

Figure 4.

Delimitation of the area of interest in PIX4Dcapture.
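As a minimal sketch of the KMZ step described above, the snippet below builds an AOI polygon and saves it with the simplekml Python library; the corner coordinates are hypothetical placeholders.

```python
# pip install simplekml
import simplekml

# Hypothetical GPS corners of the AOI as (longitude, latitude) pairs
corners = [
    (-75.742, 6.498),
    (-75.740, 6.498),
    (-75.740, 6.500),
    (-75.742, 6.500),
    (-75.742, 6.498),  # repeat the first point to close the polygon
]

kml = simplekml.Kml()
kml.newpolygon(name="Pasture AOI", outerboundaryis=corners)
kml.savekmz("aoi.kmz")  # file that can be uploaded to PIX4Dcapture
```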

In this study, areas of interest were selected where the drone images would be taken. As shown in Figure 5, a circular frame 56.5 cm in diameter with a prominent color was used to distinguish the study area in the drone imagery, together with a white and yellow 130 × 130 cm square frame. Alternatively, high-contrast colored spray markings or weather-resistant rubber and vinyl markers can be used (DroneDeploy, 2017) [18]. Another option is a lawn frame, which can be clearly seen from flight altitude.

Figure 5.

Circular and square frames to differentiate the flight and interest area.

Prior to operating the UAV, several recommendations were considered, covering weather conditions, the reflectance panel, and UAV equipment:

2.2.4 Weather conditions

Safe drone operation requires consistent, stable weather conditions. Elevated precipitation, strong winds, and low visibility can compromise the safety of drone operations. Moreover, dense cloud cover can block sunlight, reducing image quality. For optimum resolution, it is recommended to plan drone operations between 11:00 am and 1:00 pm, when the sun is near its zenith, ensuring favorable lighting conditions.

2.2.5 Reflective panel

A reflectance panel must be used at the beginning and end of each flight so that remote sensing data are calibrated against the camera’s response to light.

2.2.6 AUV elements

It is essential to check the UAV battery charge prior to flight; the charging process can take up to one hour. For the Phantom 4 Pro UAV, flight time should not exceed 25 min, due to its battery capacity and design limitations. Having two high-capacity SD cards compatible with the UAV will facilitate the organization of the images and ensure that the expected quantity and quality of the captured images are obtained. In terms of flight altitude, it is suggested to keep the UAV at approximately 120 m, within operating temperatures of 0 to 40°C. This altitude is standard for most flights and provides an adequate balance between flight time and resolution.

2.3 Using image processing to create orthomosaics

Photogrammetry uses sophisticated methods to create and capture accurate three-dimensional data from images. Once the files have been downloaded from the UAV, the steps that follow in the Agisoft PhotoScan software are described in Figure 6.

Figure 6.

Steps for image processing in Agisoft PhotoScan software©.

The images captured by the drone are imported into the processing software. Once imported, the images undergo a standard processing procedure to eliminate distortions and rectify variations in lighting. At this point, the geographical coordinates of each image are also established to ensure precise spatial positioning. The next step is to orient the images, aligning them coherently and precisely with one another. Once this phase is completed, a three-dimensional point cloud is generated from the processed images. This point cloud contains detailed information about the terrain’s topography and the structure of the objects in the scanned area, which is critical for obtaining an accurate and complete representation of the environment.

The information in the point cloud is used to create the orthomosaic. This composite image combines all the individual images into an accurate panoramic view of the area, representing the terrain and objects in detail. The resulting orthomosaic offers a complete, geospatially precise visual depiction of the area of interest and can be used in a range of applications, including mapping, urban planning, and environmental monitoring. Together, these steps form a comprehensive process that transforms the original imagery into a visually coherent and geospatially accurate representation of the terrain and its surroundings, with the PhotoScan software as the key tool throughout. For this study, 18 images were taken at each of the selected properties to obtain different RGB values and allow correlation between them and the bromatological results of the pastures. The grass species found on these properties were mainly Urochloa varieties, native grass, kikuyu grass, and star grass. Image processing and sampling-point identification are shown in Figure 7.

Figure 7.

Drone footage capturing pastures.
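For readers who wish to automate the steps above, the sketch below shows how the same pipeline can be scripted through the software’s Python console. It is only a sketch under stated assumptions: the method names follow the classic PhotoScan 1.x API and differ in newer Agisoft Metashape releases, and the photo folder is hypothetical.

```python
import glob
import PhotoScan  # scripting module of Agisoft PhotoScan 1.x (renamed Metashape in 2.x)

doc = PhotoScan.app.document
chunk = doc.addChunk()
chunk.addPhotos(glob.glob("flight_photos/*.JPG"))   # hypothetical image folder

chunk.matchPhotos(accuracy=PhotoScan.HighAccuracy)  # detect and match features
chunk.alignCameras()                                # orient the images
chunk.buildDepthMaps()
chunk.buildDenseCloud()                             # three-dimensional point cloud
chunk.buildDem()                                    # elevation model from the cloud
chunk.buildOrthomosaic()                            # mosaic of all the images
chunk.exportOrthomosaic("orthomosaic.tif")
doc.save("pasture_project.psx")
```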

2.4 Collection and pre-processing of RGB images

The RGB images generated in the flights over the rural properties cover the total area of the property (when it is small). The images were taken at a height of 60 m with an overlap of 90% and stored in JPG format. The RGB band values were extracted from the images in Google Colab using the Python programming language.
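As a quick check of what a 90% forward overlap implies at 60 m, the sketch below estimates the distance between successive shots, again assuming the nominal Phantom 4 Pro sensor geometry (13.2 × 8.8 mm, 8.8 mm focal length).

```python
def capture_spacing_m(altitude_m, overlap,
                      sensor_dim_mm=8.8,    # along-track sensor dimension (assumed)
                      focal_length_mm=8.8): # nominal focal length (assumed)
    """Distance between successive shots for a given forward overlap fraction."""
    footprint_m = sensor_dim_mm * altitude_m / focal_length_mm  # ground footprint
    return footprint_m * (1 - overlap)

print(f"{capture_spacing_m(60, 0.90):.1f} m between shots")  # ~6.0 m at 60 m, 90% overlap
```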

The cv2, numpy, matplotlib, and pandas libraries were used to import the images, select the region of interest, and extract the RGB values. The region of interest (ROI) was selected by defining coordinates in the center of each image, considering it to be a homogeneous area; from this region of interest, the average values of the RGB bands were extracted. The steps to perform the pre-processing of the images in Google Colab are shown in Figure 8. The purpose of pre-processing the images is to obtain three numerical values of the RGB bands that will be applied to the statistical models.

Figure 8.

A flowchart indicating the Programming of RGB image pre-processing done in Google Colab.
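A minimal sketch of the pre-processing in Figure 8 follows, using the cv2, numpy, and pandas libraries mentioned above. The file names and the 200 × 200-pixel ROI are illustrative assumptions, not values fixed by the study.

```python
import cv2
import numpy as np
import pandas as pd

def mean_rgb_of_center_roi(path, roi_size=200):
    """Crop a square ROI at the image center and return the mean R, G, B values."""
    bgr = cv2.imread(path)                      # OpenCV loads images in BGR order
    rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB)  # reorder the channels to RGB
    h, w = rgb.shape[:2]
    half = roi_size // 2
    roi = rgb[h // 2 - half : h // 2 + half, w // 2 - half : w // 2 + half]
    return roi.mean(axis=(0, 1))                # one mean per band, on the 0-255 scale

rows = []
for path in ["farm01.jpg", "farm02.jpg"]:       # hypothetical file names
    r, g, b = mean_rgb_of_center_roi(path)
    rows.append({"image": path, "R": r, "G": g, "B": b})

print(pd.DataFrame(rows))
```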

The average of each band was computed for each image to obtain the RGB values, which lie on a scale from 0 to 255. The results are presented in Table 2.

| Municipality | R band | G band | B band |
|---|---|---|---|
| Sopetrán | 130.28 | 156.33 | 96.97 |
| Sopetrán | 149.28 | 164.61 | 118.22 |
| Maceo | 132.16 | 153.47 | 80.79 |
| Abejorral | 119.66 | 145.31 | 79.32 |
| Abejorral | 79.98 | 140.99 | 65.13 |
| Yarumal | 109.42 | 155.06 | 76.74 |
| Yarumal | 74.10 | 121.69 | 51.32 |
| Amagá | 129.99 | 150.60 | 89.56 |
| Amagá | 115.16 | 154.26 | 95.95 |
| Santa Rosa de Osos | 99.77 | 151.74 | 55.10 |
| Santa Rosa de Osos | 101.94 | 141.62 | 73.83 |
| Chigorodó | 132.07 | 149.83 | 56.84 |
| Chigorodó | 147.55 | 174.31 | 80.46 |
| Caucasia | 160.00 | 198.43 | 118.79 |
| Caucasia | 131.39 | 169.25 | 100.15 |
| Támesis | 145.30 | 180.33 | 82.93 |

Table 2.

Average RGB bands extracted from the images of each farm in the study.

2.5 Bromatological analysis of grass

The pastures were sampled in the field following the bromatology sampling guidelines established by the Department of Research Laboratories and Services of Agrosavia. The crude protein (CP), ether extract (EE), total moisture (H), neutral detergent fiber (NDF), and acid detergent fiber (ADF) contents were determined by wet chemical analysis. The values reported by the bromatological analysis of each farm were included in a database together with the RGB values of each image. This database was used in the R-Project statistical software to apply regression models and predict the nutritional characteristics of the pasture from the RGB bands.
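A small sketch of how such a database can be assembled with pandas follows: each image’s RGB means are paired with the laboratory results of the corresponding farm. The file and column names are hypothetical; the statistical work of the study itself was done in R.

```python
import pandas as pd

rgb = pd.read_csv("rgb_means.csv")            # hypothetical columns: farm, image, R, G, B
bromatology = pd.read_csv("bromatology.csv")  # hypothetical columns: farm, CP, EE, H, NDF, ADF

# Attach the laboratory results of each farm to every image taken on that farm
dataset = rgb.merge(bromatology, on="farm", how="inner")
dataset.to_csv("modeling_dataset.csv", index=False)
```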

2.6 Statistical processing

Statistical processing of the data was performed in the R-Project software (version 4.2.2). The “gramEvol” library [19] was used to generate different indices from the RGB bands through grammatical evolution, which searches expressions produced by a context-free grammar while minimizing a cost function. The logarithm, exponential, and square root functions and the “+”, “-”, “*”, “/” operators were included. In addition, a logarithmic transformation was applied to the pasture nutrient variables; however, the models showed a better fit when this transformation was not applied.
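To convey the idea behind this search, the sketch below is a much-simplified Python analogue of grammatical evolution: it randomly composes small expressions over the normalized R, G, and B means with the same functions and operators, and keeps the candidate most correlated with the target variable. The R, G, and B values are taken from Table 2, the target values are invented, and the study itself used the R gramEvol package rather than this code.

```python
import random
import numpy as np

# First five farms from Table 2, normalized to the 0-1 range
R = np.array([130.28, 149.28, 132.16, 119.66, 79.98]) / 255
G = np.array([156.33, 164.61, 153.47, 145.31, 140.99]) / 255
B = np.array([96.97, 118.22, 80.79, 79.32, 65.13]) / 255
target = np.array([28.1, 30.5, 27.9, 26.4, 22.0])  # invented nutrient values

operands = {"R": R, "G": G, "B": B}
unary = [np.log, np.exp, np.sqrt]
binary = [np.add, np.subtract, np.multiply, np.divide]

def random_index():
    """Generate one random candidate index with an optional unary wrapper."""
    (na, a), (nb, b) = random.sample(list(operands.items()), 2)
    op = random.choice(binary)
    values, name = op(a, b), f"{op.__name__}({na},{nb})"
    if random.random() < 0.5:
        fn = random.choice(unary)
        values, name = fn(np.abs(values) + 1e-9), f"{fn.__name__}({name})"
    return name, values

best_name, best_values = max(
    (random_index() for _ in range(5000)),
    key=lambda nv: abs(np.corrcoef(nv[1], target)[0, 1]))
print(best_name, abs(np.corrcoef(best_values, target)[0, 1]))
```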

In addition to the indices obtained with gramEvol, another 50 indices and their combinations were developed to find the best equations. The equations had to meet the criteria of highest R² and lowest Bayesian Information Criterion (BIC) for the nutritional variables. Of all the nutritional variables analyzed, good indices were obtained only for ADF, CP, and MOISTURE. The results are shown in Table 3.

| Variable | Best model | R²* |
|---|---|---|
| Acid detergent fiber (ADF) | y = 14.79·e^(R/G) | 0.99 |
| Moisture (MOISTURE) | y = 9.15·(R/G)^1.94·e^(R/B) | 0.97 |
| Crude protein (CP) | y = 63.36·e^(R/G) + 173.88·(R/G) | 0.95 |

Table 3.

Equations and fit obtained by means of linear models for the variables ADF, CP, and MOISTURE.

R²* = coefficient of determination.
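As an illustration of how one of these models can be fitted and scored, the sketch below fits the ADF-style form y = a·e^(R/G) by nonlinear least squares and reports R² together with a Gaussian-likelihood BIC. The band means are taken from Table 2, while the ADF values are invented for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

R = np.array([130.28, 119.66, 109.42, 147.55, 160.00])
G = np.array([156.33, 145.31, 155.06, 174.31, 198.43])
y = np.array([33.9, 33.6, 29.9, 34.4, 33.0])  # invented ADF values (%)

def model(ratio, a):
    """Exponential index model y = a * exp(R/G)."""
    return a * np.exp(ratio)

params, _ = curve_fit(model, R / G, y)
pred = model(R / G, *params)

rss = np.sum((y - pred) ** 2)              # residual sum of squares
tss = np.sum((y - y.mean()) ** 2)          # total sum of squares
n, k = len(y), 1                           # observations and fitted parameters
r2 = 1 - rss / tss
bic = n * np.log(rss / n) + k * np.log(n)  # Gaussian BIC up to an additive constant
print(f"a = {params[0]:.2f}, R^2 = {r2:.3f}, BIC = {bic:.2f}")
```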


Authors such as Barbosa et al. [20] employed various indices proposed by other researchers, enhancing the precision of results in images captured with RGB cameras. Furthermore, Vergara-Díaz et al. [10] used indices based on NDVI values, creating different combinations between this index and the RGB bands, with which they obtained fits of 0.71 in maize crops.

3. Conclusions

The application of a well-studied field methodology for RGB image acquisition in pastures can mitigate errors before, during, and after drone flights, providing greater accuracy and efficiency throughout the process. In addition, the equations constructed from the RGB band indices presented an excellent fit and can be used to predict pasture nutritional variables in future pasture research. The example presented in this chapter will facilitate image acquisition and information processing, as it can be extrapolated to similar studies. However, a larger quantity of RGB data is recommended for data processing and statistical modeling: the small sample size stands out as a noticeable constraint, and increasing it in data collection enhances the validity of research findings.

From a positive perspective, the utilization of RGB images proved beneficial by enhancing decision-making efficiency over extensive crop areas. In addition, the methodology, curated and carefully selected from previous studies, allows for easy replication of the technique in various locations. While these techniques demonstrate effectiveness in analyzing pasture crops, their applicability extends to diverse crops such as corn, rice, and avocado, among others. This tool can be used by government entities providing technical assistance to producers for general livestock feeding plans. Additionally, in certain instances, it can function as an adjunctive support to pasture fertilization plans.

In addition to RGB images taken by UAV, there are other more complex techniques such as multispectral images (MSI) and hyperspectral images (HSI), which can provide more precise information about the nutritional conditions of crops due to the greater number of bands they encompass compared to RGB. However, it is imperative to note that these methods necessitate further research and field testing for comprehensive validation.

Acknowledgments

This work received financial support for the fieldwork from the project “Development and Establishment of the Centre for Agro-biotechnological Development of Innovation and Territorial Integration, El Carmen de Viboral, Antioquia, Occident (CEDAIT)”, Expert System Component, with resources from the General Royalty System and the Government of Antioquia. Financial support was also received for the publication of this chapter from the project “Design and validation of predictive models to determine Cation Exchange Capacity (CEC), Organic Matter (OM) and Nitrogen (N) in soils from hyperspectral images” through the agreement 2022-7204, financed by the University of Antioquia Foundation. The authors express their gratitude to Tatiana Rodríguez Monroy, a master’s student in Animal Sciences, for her contributions.

Conflict of interest

The authors declare no conflict of interest.

References

  1. Eissa AHA, Khalik AA, Abdel AA. Understanding color image processing by machine vision for biological materials. In: Eissa AHA, editor. Structure and Function of Food Engineering. Chapter 10. IntechOpen; 2012. pp. 227-274, 416 p. DOI: 10.5772/50796
  2. Pharr M, Wenzel J, Humphreys G. Physically Based Rendering: From Theory to Implementation. 4th ed. MIT Press; 2023. 1312 p. Available from: https://pbr-book.org/4ed/Radiometry,_Spectra,_and_Color/Color
  3. Fu L, Gao F, Wu J, Li R, Karkee M, Zhang Q. Application of consumer RGB-D cameras for fruit detection and localization in field: A critical review. Computers and Electronics in Agriculture. 2020;177:105687
  4. Torres-Sánchez J, Peña JM, de Castro AI, López-Granados F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Computers and Electronics in Agriculture. 2014;103:104-113. DOI: 10.1016/j.compag.2014.02.009
  5. Puri V, Nayyar A, Raja L. Agriculture drones: A modern breakthrough in precision agriculture. Journal of Statistics and Management Systems. 2017;20(4):507-518. DOI: 10.1080/09720510.2017.1395171
  6. Li W, Niu Z, Wang C, Huang W, Chen H, Gao S, et al. Combined use of airborne lidar and satellite GF-1 data to estimate leaf area index, height, and aboveground biomass of maize during peak growing season. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing. 2015;8(9):4489-4501. DOI: 10.1109/JSTARS.2015.2496358
  7. Luna I, Lobo A. Mapping crop planting quality in sugarcane from UAV imagery: A pilot study in Nicaragua. Remote Sensing. 2016;8(6):1-18. DOI: 10.3390/rs8060500
  8. Deshmukh S, Jadhav V. Bromatological and mineral assessment of Clitoria ternatea linn. Energy (KJ). 2014;459(9):489-424
  9. Manzani L, Aparecido L, Ventura G, Ferreira L, Monteiro de Figueiredo PA. Bromatological and morphological characteristics of forage plants. Investigación Agraria. 2021;23(1):22-27. DOI: 10.18004/investig.agrar.2021.junio.2301602
  10. Vergara-Díaz O, Zaman-Allah MA, Masuka B, Hornero A, Zarco-Tejada P, Prasanna BM, et al. A novel remote sensing approach for prediction of maize yield under different conditions of nitrogen fertilization. Frontiers in Plant Science. 2016;7(666):1-13. DOI: 10.3389/fpls.2016.00666
  11. Karila K, Alves R, Ek J, Kaivosoja J, Koivumäki N, Korhonen P, et al. Estimating grass sward quality and quantity parameters using drone remote sensing with deep neural networks. Remote Sensing. 2022;14(11):2692. DOI: 10.3390/rs14112692
  12. Barbedo JG. Digital image processing techniques for detecting, quantifying, and classifying plant diseases. SpringerPlus. 2013;2(1):1-12
  13. Tagle Casapía X, Di Liberto S, Falen L, Flores G, Dávila A, Mendoza C, et al. Protocolo para sobrevuelos con RPAs PHANTOM 4 PRO y PHANTOM 4 RTK [Protocol for overflights with PHANTOM 4 PRO and PHANTOM 4 RTK RPAs]. 1st ed. Perú; 2021. Available from: https://repositorio.iiap.gob.pe/bitstream/20.500.12921/610/5/tagle_protocolo_2021.pdf
  14. Posada-Asprilla W, Cerón-Muñoz MF. Influencia del ángulo de iluminación solar y la altura de la toma de la imagen multiespectral sobre la estimación de biomasa de pasto kikuyo [Influence of the solar illumination angle and multispectral image capture height on the estimation of kikuyu grass biomass]. Revista UDCA Actualidad & Divulgación Científica. 2019;22(2):1-6. DOI: 10.31910/rudca.v22.n2.2019.1338
  15. Toskov B, Toskova A, Stoyanov S, Doychev E. Architecture of intelligent guard system in the virtual physical space. In: Sgurev V, Jotsov V, Kruse R, Hadjiski M, editors. IEEE 10th International Conference on Intelligent Systems (IS); August 28-30; Sofia, Bulgaria. 2020. pp. 265-269, 629 p. DOI: 10.1109/IS48319.2020.9200177
  16. Pajares G. Overview and current status of remote sensing applications based on unmanned aerial vehicles (UAVs). Photogrammetric Engineering & Remote Sensing. 2015;81(4):281-330
  17. Pix4Dcapture. A software package for mapping. 2023. Available from: pix4d.com/es/
  18. DroneDeploy. How Do I Use Ground Control Points? [Online]. 2017. Accessed: August 15, 2023. Available from: https://www.dronedeploy.com/blog/what-are-ground-control-points-gcps
  19. Noorian F, de Silva AM, Leong PHW. gramEvol: Grammatical evolution in R. Journal of Statistical Software. 2016;71(1):1-26. DOI: 10.18637/jss.v071.i01
  20. Barbosa BDS, Ferraz GAS, Gonçalves LM, Marin DB, Maciel DT, Ferraz PFP, et al. RGB vegetation indices applied to grass monitoring: A qualitative analysis. Agronomy Research. 2019;17(2):349-357. DOI: 10.15159/ar.19.119
