
Article

Smart Agricultural Machine with a Computer Vision-Based Weeding and Variable-Rate Irrigation Scheme

Chung-Liang Chang * and Kuan-Ming Lin

Department of Biomechatronics Engineering, National Pingtung University of Science and Technology, Pingtung 91201, Taiwan; [email protected] * Correspondence: [email protected]

 Received: 4 June 2018; Accepted: 17 July 2018; Published: 19 July 2018 

Abstract: This paper proposes a scheme that combines computer vision and multi-tasking processes to develop a small-scale smart agricultural machine that can automatically weed and perform variable rate irrigation within a cultivated field. Image processing methods such as HSV (hue (H), saturation (S), value (V)) color conversion, estimation of thresholds during the image binary segmentation process, and morphology operator procedures are used to confirm the position of the plant and weeds, and those results are used to perform weeding and watering operations. Furthermore, the data on the wet distribution area of surface soil (WDAS) and the moisture content of the deep soil are provided to a fuzzy logic controller, which drives pumps to perform variable rate irrigation and to achieve water savings. The proposed scheme has been implemented on a small machine, and the experimental results show that the system can classify plants and weeds in real time with an average classification rate of 90% or higher. This allows the machine to weed and water while maintaining the moisture content of the deep soil at 80 ± 10% and an average weeding rate of 90%.

Keywords: fuzzy logic; machine vision; field robotics

1. Introduction

Traditional agricultural models rely on inefficient and labor-intensive human work. As automation and mechanization technology matured, tractors and mobile farming machines took the place of manual labor. This was a solution to the issues of insufficient labor and an aging farming population [1]. Today's automated tilling machines are mainly categorized as planting and irrigation equipment; however, plant nursery operations often include weeding, the spreading of fertilizer, and watering, as these are critical cultivation tasks. To complete weed control tasks, machine devices such as weed mowers or spray machines can be used and operated via manual control or tractors [2]. However, these methods require indiscriminate spraying, watering, or weed removal operations. When utilized in inter-row or intra-row weed removal, human error can often cause damage to crops, and excessive watering can lead to the rotting of roots or leaves. In the early stages, the use of herbicides does not suppress weed growth effectively, due to the diversity of weed varieties and the issue of herbicide resistance. Next, inadequate dosage control can adversely affect the soil and plant growth; this is also true for watering and fertilizing. Lastly, these machines are powered by petrol engines that can cause air pollution and harm human health. Therefore, variable rate spraying of herbicides and the development of smart weeding machines have become the most common practices in recent years.

At present, many studies focus on the design and testing of smart weeding machines, including the use of digital cameras to capture large-scale field images and the adoption of machine vision techniques to find rows of crops and to perform weeding operations [3–6]. Shape descriptors in combination with a fuzzy decision-making method have also been employed to classify weed species [7]. Besides, several studies concentrate on the implementation and testing of automatic weeding machines, including the use of proximity sensors to identify crop positions [8–10] or the use of robots with visual imaging capabilities to locate the position of weeds and employ a special steering mechanism [11,12], tube stamp [13], or laser device [14,15] to remove the weeds. In addition, a combination of chemical and mechanical methods has also been utilized to destroy weeds [16], and some modular weeding mechanisms have been implemented [17]. These smart weeding machines rely on the performance of the machine vision system to conduct weeding and visual guidance. However, uncertainty factors, including illumination and the different colors of leaves or soil, affect the performance of the machine vision system [18]. The results of Hamuda et al. (2017) and Yang et al. (2015) indicated that HSV-based image processing methods improve the robustness of machine vision [19,20], and these methods can reduce the effect of natural illumination during image processing. Some studies also propose classification methods that depict the difference between shades of green under varying brightness or contrast for green crops [21,22]. In 2018, Suh et al. proposed a method based on color space conversion and multi-level thresholds [23]; it can be used to segment vegetation under shadowy conditions and has been proven to be feasible in real-world applications. Some start-up companies in the market, such as ecoRobotix and Blue River (which was acquired by John Deere in 2017), have developed smart machines which selectively use herbicides to destroy weeds [24,25]. This approach can reduce the amount of herbicide applied and decrease the damage to the soil. Other applications using machine vision technology, such as fruit sorting or harvesting, have also been presented in recent years [26,27]. Today, machine vision technology can be sensibly applied in the field to separate crops from weeds, and it can serve as a useful tool for precision agriculture in the future. Nevertheless, such technology still needs to be effectively integrated into the machine; inappropriate integration practices can result in unstable and inaccurate performance of the robot. Besides, the machines mentioned above only possess weeding functions, and other tasks require different machinery; they thus lack flexibility.

To address this issue, this paper proposes a multi-functional smart machine (Supplementary Materials) that can automatically remove weeds while also allowing for variable rate irrigation. The digital camera device of this machine captures images of the growth areas under the machine in real time and uses HSV and adaptive threshold methods to distinguish crops from weed areas and to estimate the wet distribution area of the surface soil (WDAS), such that the machinery can automatically respond to specific areas with weeding or watering. This scheme allows for the removal of weeds while leaving the crop unharmed. In addition, the modularized mechanism can be used to provide different functions, such as turning soil or sowing.
Furthermore, the fuzzy logic control method determines the amount of water that is given to crops depending on the soil moisture content at root depth and the wet distribution area of the surface soil, to ensure that the soil maintains an appropriate moisture rate. This paper is organized as follows: Section 2 details the procedures and steps in designing the method, including image processing, variable rate irrigation, and multi-tasking operations; Section 3 describes the design method of the operating mechanism and the computer vision system; Section 4 contains the experimental results and the discussion, while the final section contains the conclusion.

2. Materials and Methods

The scheme that is proposed in this paper includes the use of the image processing technique to classify crops and weeds and also to estimate the WDAS, the fuzzy logic control method to obtain the pulse width modulation (PWM) level to drive the water pump, and multi-tasking processes to execute weeding and variable rate irrigation.

2.1. Image Processing Technique

Since brightness and color contrast differ in each image, the adaptive threshold method uses brightness and color values to automatically adjust the threshold level that is required during the segmentation process, such that the features can be successfully extracted and classified from the image.

2.1.1. Weed/Plant Classification

The flowchart of image processing with the adaptive threshold method is depicted in Figure 1. The colors of an image are often described using red (R), green (G), and blue (B) as the three primary colors, which are combined to form the RGB color space. This expression conveys the color of every single pixel in the image in relation to the red, green, and blue channels; however, it is not easy to obtain pixel luminance, saturation, and hue this way, so a hue (H), saturation (S), and value (V) (known as HSV) color model can better emulate the human perception of color than an RGB model. This is why the 8-bit digital image that is captured by the camera is converted from RGB to HSV during image pre-processing. It is assumed that the image has been shrunk by 25% (see Figure 2a), that the new image's dimensions are length (I) × width (J), and that every pixel P_rgb(i, j) in the image matrix, where i = 1, 2, ..., I and j = 1, 2, ..., J, has red, green, and blue color values represented as P_r(i, j), P_g(i, j), and P_b(i, j), each between 0 and 255. Equation (1) gives the conversion from the RGB color model to the hue value P_h(i, j), saturation value P_s(i, j), and value P_v(i, j) of each pixel in the HSV color space:

P_h(i,j) = \cos^{-1}\left( \frac{0.5\,[(P_r(i,j)-P_g(i,j)) + (P_r(i,j)-P_b(i,j))]}{\sqrt{(P_r(i,j)-P_g(i,j))^2 + (P_r(i,j)-P_b(i,j))(P_g(i,j)-P_b(i,j))}} \right)

P_s(i,j) = \frac{\max(P_r(i,j),\,P_g(i,j),\,P_b(i,j)) - \min(P_r(i,j),\,P_g(i,j),\,P_b(i,j))}{\max(P_r(i,j),\,P_g(i,j),\,P_b(i,j))}        (1)

P_v(i,j) = \frac{\max(P_r(i,j),\,P_g(i,j),\,P_b(i,j))}{255}

where P_h ∈ [0, 360) and P_s(i, j), P_v(i, j) ∈ [0, 1].

Next, the adaptive thresholding method is employed to create the binary image from the H-channel or S-channel of the HSV image. First, the plants and weeds are defined as target objects with a similar color, and it is assumed that the obtained image only contains green plants and the color of soil (background); then the average hue value P_{h,AVG} = (1/(I × J)) \sum_{j=1}^{J} \sum_{i=1}^{I} P_h(i, j) and the average saturation value P_{s,AVG} = (1/(I × J)) \sum_{j=1}^{J} \sum_{i=1}^{I} P_s(i, j) are calculated from the H-channel and S-channel, respectively. Second, the lesser of the two average values is selected to determine the desired output channel (see Figure 2b). Selecting the lesser average value prevents too much soil in the image from being mistaken as a green object; it is also easier to carry out the segmentation process with the lesser average value than with the larger one. Third, the initial threshold is set as T_n = 0 (n = 0), and the binary segmentation process for the channel then begins. Assume that the threshold update is defined as

T_{n+1} = L_{h,AVG}^{n} \, P_{v,AVG}        (2)

where L_{h,AVG}^{n} represents the average hue value in the target area within the image that is filled with plants and weeds; this value is between 0 and 1. The average value of brightness on the V-channel is represented as P_{v,AVG} ∈ (0, 255). The final threshold is obtained once the termination condition |T_{n+1} − T_n| ≤ 10^{−4} is achieved during the binary segmentation process. This optimization method can effectively separate the image background and objects with an iteration process. After that, the morphology method is used on the binary image to emphasize the target features of the image, with processing steps that include masking, erosion, dilation, connected-component labeling, or local descriptors (see Figure 2c). Finally, connected markings are utilized to mark the positions of the plant and weeds and to count the number of weeds (see Figure 2d). The red box signifies a plant or weed, whereas blue signifies the serial number of the green object. The plants are acquired by using the maximum area method from the binary image, and the remaining objects are weeds.
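For concreteness, the following Python sketch re-creates this classification pipeline from the description above; it is not the authors' implementation. Because the paper mixes value ranges, all channels are normalized here to [0, 1]; the "target area" in the threshold update of Equation (2) is interpreted as the pixels above the current threshold; and the standard correction θ → 360° − θ for B > G is applied so that the hue covers [0, 360). OpenCV is used only for the morphology and connected-component steps.

```python
import cv2
import numpy as np

def rgb_to_hsv_eq1(rgb):
    """HSV conversion following Eq. (1); rgb is float, shape (I, J, 3), 0-255."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-9
    theta = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))
    h = np.where(b > g, 360.0 - theta, theta)        # hue covers [0, 360)
    mx, mn = rgb.max(axis=-1), rgb.min(axis=-1)
    s = (mx - mn) / (mx + 1e-9)                      # saturation in [0, 1]
    v = mx / 255.0                                   # value in [0, 1]
    return h / 360.0, s, v                           # hue normalized to [0, 1]

def adaptive_threshold(chan, v, tol=1e-4, max_iter=200):
    """Iterative threshold of Eq. (2), stopping when |T_{n+1} - T_n| <= tol."""
    t = 0.0                                          # initial threshold T_0 = 0
    for _ in range(max_iter):
        target = chan > t                            # assumed "target area"
        if not target.any():
            break
        t_next = chan[target].mean() * v.mean()      # Eq. (2), both in [0, 1]
        if abs(t_next - t) <= tol:
            return t_next
        t = t_next
    return t

def classify(rgb):
    """Binary segmentation, morphology, and plant/weed labeling (Figure 1)."""
    h, s, v = rgb_to_hsv_eq1(rgb.astype(np.float32))
    chan = h if h.mean() < s.mean() else s           # lesser-mean channel
    t = adaptive_threshold(chan, v)
    mask = ((chan > t) * 255).astype(np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n <= 1:                                       # background only
        return labels, 0, []
    areas = stats[1:, cv2.CC_STAT_AREA]
    plant = 1 + int(np.argmax(areas))                # largest green object
    weeds = [k for k in range(1, n) if k != plant]   # the rest are weeds
    return labels, plant, weeds
```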

Figure 1. Flowchart for the image processing with the adaptive threshold method.



Figure 2. Flowchart of image processing. (a) Original image; (b) output channel; (c) feature extraction; (d) mark the position of the plant and weeds.

2.1.2. Soil Surface Moisture Content Estimation

Excessive or insufficient soil moisture content at the roots of the plants can affect plant growth. In addition, the appropriate control of the moisture content of soil surfaces can delay the growth rate of weeds. The image processing method is also utilized to estimate the WDAS by first using the area description sub-operation method to convert the green objects in the ROI of the image into black objects. Next, with only the soil image remaining (small quantities of weeds are not considered), the averages of the R, G, and B channels of the soil image are calculated, and the three values are then averaged and defined as the WDAS. A larger value indicates that the color of the soil tends to be dark brown; that is, the larger the WDAS, the larger the surface area that is occupied by wet soil. Conversely, the smaller the value, the greater the area that is occupied by dry soil. This method provides similar performance compared to that of the previously proposed method in [28].
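For illustration, a minimal sketch of this estimate under the stated reading of the method: the green objects are removed via a vegetation mask (for instance, labels > 0 from the classification sketch above), and the mean of the three channel means of the remaining soil pixels is reported as the WDAS. The mask argument and the direct channel averaging are assumptions drawn from the paragraph, not the authors' code.

```python
import numpy as np

def estimate_wdas(rgb, vegetation_mask):
    """WDAS per Section 2.1.2: mean R, G, B of soil-only pixels, averaged.

    rgb: float or uint8 array of shape (I, J, 3).
    vegetation_mask: array of shape (I, J), non-zero where a green object
    (plant or weed) was detected.
    """
    soil_pixels = np.asarray(rgb, dtype=np.float64)[vegetation_mask == 0]
    if soil_pixels.size == 0:
        return 0.0                              # no soil visible in the ROI
    channel_means = soil_pixels.mean(axis=0)    # mean R, G, B over soil only
    return float(channel_means.mean())          # single WDAS value
```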

2.2. Variable Rate Irrigation Method

A fuzzy logic controller (FLC) utilizes linguistic rules that are based on professional experience and inference methods to make fuzzy comprehensive decisions [29,30]. In this paper, the WDAS (Sp) and the soil moisture content (SMC; Sq) are the input variables of the controller, while the output variable is the PWM drive voltage (Wo) that controls the speed of the pump. Figure 3 demonstrates the variable rate irrigation method using the FLC.


Figure 3. The diagram of the variable rate irrigation system.

The Sq is acquired by the moisture sensor, which is inserted in the soil under the roots of the plant. Triangular and trapezoidal membership functions serve as the fuzzy sets, and each fuzzy set has a linguistic label that is used for definition. The labels contain: wet (W), normal (N), dry (D), high (H), mid (M), and low (L). Variable values address the corresponding degree values in each fuzzy set. Table 1 details the input/output variables of the fuzzy sets and their membership function distribution areas.

Table 1. Input and output values and linguistic labels.

Sp                                        Sq                                 Wo
[a, b, c, d]          Linguistic Labels   [a, b, c]       Linguistic Labels  [a, b, c]        Linguistic Labels
[0, 0, 91, 109]       W                   [0, 0, 40]      H                  [0, 0, 145]      L
[97, 105, 113, 0]     N                   [30, 50, 70]    M                  [135, 170, 200]  M
[100, 118, 255, 255]  D                   [60, 100, 100]  L                  [185, 255, 255]  H

In Table 1, a, b, and c depict the physical values corresponding to the three vertexes of the triangular membership function; the symbol d is the extra vertex for the trapezoidal membership function. Sp, Sq, and Wo depict the WDAS, SMC, and PWM output variables. Then, the analysis of the fuzzy inference is executed, which is the core of the fuzzy logic control. The aim is to match the membership function degree to the input variable and to perform the intersection and union operations that are established by the simulated inference rule for decision-making. The paper employs the maximum-minimum method of the Mamdani fuzzy model for inference. The fuzzy rule syntax is "If ... and ... then ...". Expert experience is used to establish a rule database, as shown in Table 2. For example, the fuzzy rules in Table 2 are depicted as follows.

Rule 1: If {(Sp is W) and (Sq is L)} then {(Wo is H)}
Rule 2: If {(Sp is W) and (Sq is M)} then {(Wo is L)}
Rule 3: If {(Sp is W) and (Sq is H)} then {(Wo is L)}
...
Rule 9: If {(Sp is D) and (Sq is H)} then {(Wo is L)}

Figure 4 shows the rule surface of the FLC.

Table 2. Fuzzy logic inference table.

                        Sp (WDAS)
Wo                    W     N     D
Sq (SMC)    L         H     H     H
            M         L     M     H
            H         L     L     L

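The paper implements the controller with the Matlab Simulink fuzzy toolbox; as an illustration only, the sketch below re-creates it in Python from Tables 1 and 2. Two assumptions are flagged: the trailing 0 in the N row of Table 1 is read as a misprint (N becomes the triangle [97, 105, 113]), and the weighted-average defuzzification uses the centroids of the output triangles, since the paper does not spell out its exact weighted-average formula.

```python
def tri(x, a, b, c):
    """Triangular membership with vertexes a <= b <= c (shoulders allowed)."""
    if x < a or x > c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a) if b > a else 1.0
    return (c - x) / (c - b) if c > b else 1.0

def trap(x, a, b, c, d):
    """Trapezoidal membership with vertexes a <= b <= c <= d."""
    if x < a or x > d:
        return 0.0
    if x < b:
        return (x - a) / (b - a) if b > a else 1.0
    if x <= c:
        return 1.0
    return (d - x) / (d - c) if d > c else 1.0

# Membership functions from Table 1.
SP = {"W": lambda x: trap(x, 0, 0, 91, 109),
      "N": lambda x: tri(x, 97, 105, 113),          # trailing 0 read as misprint
      "D": lambda x: trap(x, 100, 118, 255, 255)}
SQ = {"H": lambda x: tri(x, 0, 0, 40),
      "M": lambda x: tri(x, 30, 50, 70),
      "L": lambda x: tri(x, 60, 100, 100)}
WO_CENTROID = {"L": (0 + 0 + 145) / 3,              # centroids of the output
               "M": (135 + 170 + 200) / 3,          # triangles in Table 1
               "H": (185 + 255 + 255) / 3}          # (assumed defuzz. targets)

# Rule base from Table 2: (Sp label, Sq label) -> Wo label.
RULES = {("W", "L"): "H", ("N", "L"): "H", ("D", "L"): "H",
         ("W", "M"): "L", ("N", "M"): "M", ("D", "M"): "H",
         ("W", "H"): "L", ("N", "H"): "L", ("D", "H"): "L"}

def flc_pwm(sp, sq):
    """Mamdani max-min inference with weighted-average defuzzification."""
    num = den = 0.0
    for (lp, lq), lo in RULES.items():
        w = min(SP[lp](sp), SQ[lq](sq))              # AND = min firing strength
        num += w * WO_CENTROID[lo]
        den += w
    return num / den if den > 0 else 0.0             # crisp PWM level, 0-255
```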


Figure 4. Rule surface for FLC.

Finally, a weighted average formula is used to infer the completed fuzzy output for the conversion to a crisp output Wo, otherwise known as defuzzification. The fuzzy inference algorithms are programmed in the Matlab Simulink toolbox.

2.3. Multi-Tasking Process

First, the region of interest (ROI) within the image is divided into the left, middle, and right segments, as shown by the red frames of Figure 5. The middle segment is for plant identification, while the left and right segments are for weed identification. The image processing method is used to obtain the identification results of each segment, including the wet distribution area of the surface soil in the ROI. Next, the identification results in these three areas are binary coded. When the code value for a segment is "1", it means that there is a weed or plant in the segment, while "0" means that there are no plants or weeds in the segment. As the middle segment is the plant area, it is identified as the segment that contains the plant and will not be weeded. The left and right segments will be weeded when the output value is "1". The operation process first removes the weed and then disperses water through the pump with values between 0 and 255 as defined by the FLC, so as to regulate the input voltage of the pump to achieve the variable rate irrigation.
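A compact sketch of this per-plant decision logic is given below; the segment-code arguments and action strings are illustrative, and flc_pwm is the controller sketched in Section 2.2.

```python
def plan_actions(left, middle, right, sp, sq):
    """Per-plant task sequence of Section 2.3. left/middle/right are the
    binary codes of the ROI segments (1 = green object detected)."""
    actions = []
    if left:                                     # side code 1 -> weed present
        actions.append("weed left segment")
    if right:
        actions.append("weed right segment")
    if middle:                                   # plant present: water it at
        duty = flc_pwm(sp, sq)                   # the FLC-derived PWM level
        actions.append("irrigate at PWM %.0f" % duty)
    return actions
```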

Figure 5. Illustration of the range of region of interest in the digital image.

3. Description of the Mechatronics System

The proposed mechatronics system includes the frames of the machine, the weeding and watering mechanism, the image and soil moisture sensor, the actuator, and the graphical user interface (GUI).

3.1. Mechanism Design

The drawing of the prototype design of the machine is shown in Figure 6. The body of the machine is assembled from multiple black iron support frames combined with herringbone-patterned wheels with a diameter of 300 mm. The dimensions of the body are 988 mm (L) × 806 mm (W) × 449 mm (H). The weeding mechanism is constructed from a linkage device, chain, gear, and a depth-adjustable mechanism. The motor drives the gear and the chain to pull the three-pronged weeding device in an arc to flick the soil backwards (see Figure 7a) and simultaneously remove the weeds. The mechanism is height adjustable and convenient for tilling soil. Three weeding mechanisms are replicated to facilitate the comprehensive coverage of the cultivation areas. The metal tubes on both sides of the weed remover are equipped with spray nozzles connected to two sets of water pumps (see Figure 7b) that draw water from the water tank at the back of the machine to irrigate the soil on both sides of the plant.

Figure 6. Prototype design of the proposed multi-functional farming robot.


Figure 7. Weeding mechanism and sprayer. (a) Linkage device; (b) weed mechanisms and sprayer.

3.2. Hardware/Software System Implementation

A laptop computer serves as the main processing unit that is used for executing the image classification and fuzzy logic inference. The data for the soil moisture content that is received from the soil moisture sensor can be wirelessly transmitted to the sub-microcontroller, and the data is then transferred via a universal serial bus (USB) interface to the main processing unit for the fuzzy logic inference operation. The results are transmitted to the main microcontroller, which then transmits the PWM signal to drive the pump and performs the variable rate irrigation. There is an additional graphical user interface (GUI) that allows for the real time observation of the ROI area to display the crop and weed dispersion, the completed rows, and the soil moisture rate of the operation; it allows the user to start and stop the system execution. The main controller program of the GUI begins the detection and classification of the targets within the different segments of the ROI, along with the estimation of the WDAS (see Figure 8). The multi-tasking process (as shown in Section 2.3) is executed once the detection and classification process is finished. In addition, to map the growth response of the plants, the image of each plant is recorded in the database during the experiment over the whole field.
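To make this data flow concrete, here is a hedged sketch of one control cycle on the main processing unit using pyserial. The port names, baud rate, plain-text sensor format, and one-byte PWM command are all illustrative assumptions; the paper does not document the actual protocol.

```python
import serial  # pyserial

def irrigation_step(sensor_link, pump_link, wdas):
    """One cycle: read the SMC forwarded by the sub-microcontroller over USB
    serial, run the fuzzy controller, and send the PWM duty to the main
    microcontroller that drives the pump."""
    line = sensor_link.readline().decode(errors="ignore").strip()
    if not line:
        return None                          # no fresh sensor sample
    smc = float(line)                        # assumed plain-text SMC reading
    duty = int(round(flc_pwm(wdas, smc)))    # FLC sketched in Section 2.2
    pump_link.write(bytes([max(0, min(255, duty))]))  # assumed 1-byte command
    return duty

# Example wiring (port names are placeholders):
# sensor = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)
# pump = serial.Serial("/dev/ttyUSB1", 9600, timeout=1)
# irrigation_step(sensor, pump, wdas=120.0)
```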

Figure 8. The layout of GUI for the smart machine.


4. Results and Discussion

The site of the experiment is an empty field situated in front of a building within the school of bio-electromechanical engineering (longitude: 120.6059°, latitude: 22.6467°). The square field is 4 m in length and width and contains six crop rows that each have 10 butter lettuces. The lettuces are spaced 40 cm apart from each other (as shown in Figure 9), and three experiments were conducted in the area to verify the performance of the system.


Figure 9. (a) The layout of the experiment site and the appearance of the farm machine; (b) smart machine to be used for cultivating work in the field.

4.1. Scenario 1

First, the crop and weed categorization was manually performed on the same row of plants at one-hour intervals between 10 A.M. and 4 P.M. Equation (2) was used to find the appropriate image processing computation threshold value so that the method could conduct categorization under different degrees of environmental lighting. The number of weeds grown in the area was then counted as a means to verify the identification capabilities of the algorithm. Figure 10 displays the classification results at 3 P.M. The results show that the weed classification rate is between 90.0 and 95.2%, while the crop classification rate is 100%.


Figure 10. Classification results of the image using the proposed image processing method at 3 p.m. (a) Original image (10 a.m.); (b) initial threshold T_n = 0; (c) threshold update (second iteration); (d) final iteration; (e) pixels that belong to the plant are highlighted in white; (f) pixels that belong to the weeds are highlighted in white.

Next, the camera on the machine randomly captured 10 images every hour in the growth area between 10:00 in the morning and 16:00 in the afternoon. The image data was collected over a period of seven days, with the weather each day being sunny with clouds. Figure 11 shows the classification results of the plants and weeds at each time point. From the figure, we can see that the classification results of the plants (Figure 11a) and weeds (Figure 11b) are better in the afternoon, achieving a classification rate of up to 80% or higher. However, more variation is shown in Figure 11 in the classification results in the afternoon, which takes place due to the low irradiation angle of the sunshine and the seasonally short daylight.


Figure 11. Classification results for the image processing technique with adaptive threshold: (a) plant; (b) weed.

4.2. Scenario 2

This experiment is carried out to evaluate the performance of the variable rate irrigation method that is proposed in this paper. Soil moisture sensors were installed around the roots of an entire row of plants (5 cm deep below the soil level) with wireless transmission devices to transmit the sensor values to the computer, so as to obtain the PWM signal for controlling the watering pump. Ten plants were used for testing. Figure 12 shows the soil moisture control results from the watering of two plants which were chosen randomly. The initial soil moisture content levels of these two plants were 4% and 20%. For the first plant, the FLC activated the pump eight times for a total duration of 25.5 s and achieved a soil moisture level of 80%. For the second plant, the FLC activated the pump five times for a total duration of 19.5 s. Based on the calculations, the variable rate irrigation device can achieve a soil moisture control range of 80 ± 10%.


Figure 12. Number of waterings in relation to the number of variable rate and soil moisture.

4.3. Scenario 3

In order to test the correction rate of the actions that were conducted through the multi-tasking measurement, each plant in the area was observed for the actions that require completion, after which an assessment was carried out to determine whether the machine actually completed these tasks within the designated area, in order to assess the correctness of the machine operations.

First, the machine was transported to the test area (field) and was placed at the starting position with its Logitech digital webcam focused on the first plant. The plant was placed in the center of the ROI in the graphical user interface (GUI). Then, the GUI activation button was pressed, at which point the indicator icon on the GUI became red. When the classification procedure was completed for the first plant, the indicator icon turned green. Next, the machine was manually pushed to the next plant and the process above was repeated. When changing rows, the stop button was pressed and the machine was re-activated after being moved to the next row. Figure 13a represents the operation success rate of the left, center, and right segments in the ROI when it was operating in the field. The figure shows that the mission success rate reaches a maximum of 95% and a minimum of 75%.

Figure 13b represents the weeding rate for each row of plants and shows that the average rate was about 90%. The total number of plants for measurement was 60. The weeding rate is defined as WD(%) = ((D_N − D_O)/D_N) × 100, with D_N and D_O representing the number of weeds before and after machine operation. It should be noted that the number of weeds in each row of the field was first counted manually.

Next, the measurement time of each plant for the cultivation task within rows of plants is evaluated. The tasks included the processing of 10 plants for each row, and a total of six rows of plants were measured. The operation time for each plant during the six rows was averaged (including the variable rate irrigation and no watering) to obtain the results shown in Figure 14. The figure shows that the operation time for the first and second plants in each row is longer than that for the other plants. It is inferred that this is caused by the greater number of iterations needed to search for the optimal threshold during the image identification process. The figure shows that it takes 15~26 s to water each plant, and the entire operation time averages at 3.5 min. However, if the machine does not conduct watering operations, the time for a single row averages approximately 40 s.
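As a quick numerical check of the weeding-rate formula above (the counts here are invented, not experimental data):

```python
def weeding_rate(dn, do):
    """WD(%) = ((DN - DO) / DN) * 100, with DN and DO the weed counts
    before and after machine operation (Section 4.3)."""
    return (dn - do) / dn * 100.0

print(weeding_rate(20, 2))   # 20 weeds before, 2 left afterwards -> 90.0
```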

Robotics 2018, 7, 38 14 of 17 Robotics 2018, 7, x FOR PEER REVIEW 14 of 17

(a)

(b)

Figure 13. (a) Measurement results with the proposed system for each of the ten plants; (b) weeding rate in different vegetable farmland.

Robotics 2018, 7, 38 15 of 17 Robotics 2018, 7, x FOR PEER REVIEW 15 of 17

Figure 14. The average measurement of time for each of the ten plants in one row.

5. Conclusions

The multi-tasking farm machine that was proposed in this paper can be attached to an automated guided vehicle to implement consistent field operations in a small field. The experimental results show that the average weeding rate is 90%, and that the average wet distribution area of surface soil can be maintained at 75%. Although variable rate irrigation requires a longer operation time at this stage, this method can save water, and it is believed that once hardware processing speeds increase in the future, this problem can be solved. The multi-tasking strategy that was proposed in this paper can also be expanded to different cultivation tasks and used in conjunction with the relevant farming equipment (such as seed planting, fertilizer, and chemical sprays) to achieve the objectives of maintaining standardized field operations and conserving manpower.

Supplementary Materials: The following are available online at http://www.mdpi.com/2218-6581/7/3/38/s1, Video S1: Smart Ag Machine.

Author Contributions: The original concept and method of the research were provided by C.-L.C.; he also wrote the paper, supervised the work, and designed the experiments at all of the stages. The model simulation, software program development, control system establishment, field experiment conduction, and image data analysis were evaluated by K.-M.L. Both authors have read and approved the final paper.

Funding: This work has been supported by an earmarked fund from the Ministry of Science and Technology (MOST 106-2221-E-020-011).

Acknowledgments: The authors would like to thank Wei-Chen Chen for their contribution during the field cultivation experiment shown in this research. Besides, many thanks are due to the anonymous reviewers for their valuable comments to refine this paper.

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Sims, B.; Heney, J. Promoting smallholder adoption of conservation agriculture through mechanization services. Agriculture 2017, 7, 64. [CrossRef]
2. Pullen, D.; Cowell, P. An evaluation of the performance of mechanical weeding mechanisms for use in high speed inter-row weeding of arable crops. J. Agric. Eng. Res. 1997, 67, 27–34. [CrossRef]
3. Hemming, J.; Nieuwenhuizen, A.T.; Struik, L.E. Image analysis system to determine crop row and plant positions for an intra-row weeding machine. In Proceedings of the CIGR International Symposium on Sustainable Bioproduction, Tokyo, Japan, 19–23 September 2011; pp. 1–7.


4. Burgos-Artizzu, X.; Ribeiro, A.; Guijarro, M.; Pajares, G. Real-time image processing for crop/weed discrimination in maize fields. Comput. Electron. Agric. 2011, 75, 337–346. [CrossRef]
5. Amir, H.; Kargar, B. Automatic weed detection system and smart herbicide sprayer robot for corn fields. Int. J. Res. Comput. Commun. Technol. 2013, 2, 55–58.
6. Choi, K.H.; Han, S.K.; Han, S.H.; Park, K.H.; Kim, K.S.; Kim, S. Morphology-based guidance line extraction for an autonomous weeding robot in paddy fields. Comput. Electron. Agric. 2015, 113, 266–274. [CrossRef]
7. Herrera, P.J.; Dorado, J.; Ribeiro, A. A Novel Approach for Weed Type Classification Based on Shape Descriptors and a Fuzzy Decision-Making Method. Sensors 2014, 14, 15304–15324. [CrossRef] [PubMed]
8. Cordill, C.; Grift, T. Design and testing of an intra-row mechanical weeding machine for corn. Biosyst. Eng. 2011, 110, 247–252. [CrossRef]
9. Andujar, D.; Weis, M.; Gerhards, R. An ultrasonic system for weed detection in cereal crops. Sensors 2012, 12, 17343–17357. [CrossRef] [PubMed]
10. Perez-Ruiz, M.; Slaughter, D.; Fathallah, F.; Gliever, C.; Miller, B. Co-robotic intra-row weed control system. Biosyst. Eng. 2014, 126, 45–55. [CrossRef]
11. Tillett, N.; Hague, T.; Grundy, A.; Dedousis, A. Mechanical within-row weed control for transplanted crops using computer vision. Biosyst. Eng. 2008, 99, 171–178. [CrossRef]
12. Gobor, Z. Mechatronic system for mechanical weed control of the intra-row area in row crops. KI-Künstliche Intell. 2013, 27, 379–383. [CrossRef]
13. Langsenkamp, F.; Sellmann, F.; Kohlbrecher, M.; Trautz, D. Tube Stamp for mechanical intra-row individual Plant Weed Control. In Proceedings of the 18th World Congress of CIGR, Beijing, China, 16–19 September 2014.
14. Blasco, J.; Aleixos, N.; Roger, J.M.; Rabatel, G.; Molto, E. Robotic weed control using machine vision. Biosyst. Eng. 2002, 83, 149–157. [CrossRef]
15. Xiong, Y.; Ge, Y.; Liang, Y.; Blackmore, S. Development of a prototype robot and fast path-planning algorithm for static laser weeding. Comput. Electron. Agric. 2017, 142, 494–503. [CrossRef]
16. Bawden, O.; Kulk, J.; Russell, R.; McCool, C.; English, A.; Dayoub, F.; Lehnert, C.; Perez, T. Robot for weed species plant-specific management. J. Field Robot. 2017, 34, 1179–1199. [CrossRef]
17. McCool, C.; Beattie, J.; Firn, J.; Lehnert, C.; Kulk, J.; Bawden, O.; Russell, R.; Perez, T. Efficacy of mechanical weeding tools: A study into alternative weed management strategies enabled by robotics. IEEE Robot. Autom. Lett. 2018, 3, 1184–1190. [CrossRef]
18. Teixido, M.; Font, D.; Palleja, T.; Tresanchez, M.; Nogues, M.; Palacin, J. Definition of linear color models in the RGB vector color space to detect red peaches in images taken under natural illumination. Sensors 2012, 12, 7701–7718. [CrossRef] [PubMed]
19. Hamuda, E.; Ginley, B.; Glavin, M.; Jones, E. Automatic crop detection under field conditions using the HSV colour space and morphological operations. Comput. Electron. Agric. 2017, 133, 97–107. [CrossRef]
20. Yang, W.; Wang, S.; Zhao, X.; Zhang, J.; Feng, J. Greenness identification based on HSV decision tree. Inf. Process. Agric. 2015, 2, 149–160. [CrossRef]
21. Romeo, J.; Pajares, J.; Montalvo, M.; Guerrero, J.; Guijarro, M. A new expert system for greenness identification in agricultural images. Expert Syst. Appl. 2013, 40, 2275–2286. [CrossRef]
22. Arroyo, J.; Guijarro, M.; Pajares, G. An instance-based learning approach for thresholding in crop images under different outdoor conditions. Comput. Electron. Agric. 2016, 127, 669–679. [CrossRef]
23. Suh, H.K.; Hofstee, J.W.; Henten, E.J. Improved vegetation segmentation with ground shadow removal using an HDR camera. Precis. Agric. 2018, 19, 218–237. [CrossRef]
24. Keely, M.; Ehn, E.; Patzoldt, W. Smart Machines for Weed Control & Beyond. In Proceedings of the 65th West Texas Agricultural Chemicals Institute Conference, Lubbock, TX, USA, 13 September 2017. Available online: http://www.plantmanagementnetwork.org/edcenter/seminars/2017AgChemicalsConference/SmartMachinesWeedControl/SmartMachinesWeedControl.pdf (accessed on 28 June 2018).
25. Tanner, S. Low-Herbicide Robotic Weeding. World Agri-Tech Innovation Summit, London, UK, 17–18 October 2017. Available online: http://worldagritechinnovation.com/wp-content/uploads/2017/10/EcoRobotix.pdf (accessed on 25 June 2018).
26. Amatya, S.; Karkee, M.; Zhang, Q.; Whiting, M.D. Automated detection of branch shaking locations for robotic cherry harvesting using machine vision. Robotics 2017, 6, 31. [CrossRef]
27. Cubero, S.; Lee, W.S.; Aleixos, N.; Albert, F.; Blasco, J. Automated systems based on machine vision for inspecting citrus fruits from the field to postharvest—A review. Food Bioprocess Technol. 2016, 9, 1623–1639. [CrossRef]
28. Chang, C.; Lin, K.; Fu, W. An intelligent crop cultivation system based on computer vision with a multiplex switch approach. In Proceedings of the ASABE Annual International Meeting, Spokane, WA, USA, 16–19 July 2017.
29. Zadeh, L.A. A rationale for fuzzy control. J. Dyn. Syst. Meas. Control 1972, 94, 3–4. [CrossRef]
30. Lee, C.C. Fuzzy logic in control systems: Fuzzy logic controller. I. IEEE Trans. Syst. Man Cybern. 1990, 20, 404–418. [CrossRef]

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).