
2017 International Conference on Mechanical and Mechatronics Engineering (ICMME 2017) ISBN: 978-1-60595-440-0

An Agricultural Robot for Multipurpose Operations in a Greenhouse

* -lin , - FAN, -xin and Yong FENG, College of Engineering, Nanjing Agricultural University, Nanjing 210031, China. *Corresponding author.

Keywords: Agricultural robot, Multipurpose, Modular design, Automation.

Abstract. A multipurpose robot was designed to perform several tasks, such as spraying and weeding, in a greenhouse. Different operations were realized by adding or removing sensing component(s), replacing actuator(s) and switching control software, with little or no change to the platform. A sliding mode controller, designed from the robot's kinematic model, was applied to control its motion. Vision-based algorithms were developed for navigation and for operations on row-planted crops, taking water spraying and mechanical weeding as examples. A vertical projection method was applied to calculate a guidance line for navigation. Preliminary tests were conducted in a greenhouse planted with green vegetables to assess guidance as well as the spraying and weeding operations. Results showed that the robot traveled with a maximum lateral error of 47 mm and sprayed with a productivity of 16-20 plants/min; the weeding action was also taken in time. The designed robot is adaptable and versatile because different operations can be achieved on the same platform.

Introduction

Agricultural robots have been regarded as a solution to reduce labor intensity and to improve operation efficiency and safety [1,2]. Several mobile robots have been developed and tested in greenhouse environments in previous works. González et al. [3] conducted autonomous navigation in narrow greenhouse corridors using deliberative techniques based on a map algorithm and a pseudo-reactive technique with a sensor feedback algorithm. Mehta et al. [4] presented a daisy-chaining strategy that finds the local coordinates of above-the-aisle targets to localize a wheeled mobile robot using machine vision. Janos and Matijevics [5] introduced a potential field method for robot navigation in a greenhouse with WSN support. Dario et al. [6] developed the AGROBOT platform, with stereo vision and a six-degree-of-freedom manipulator arm with a gripper, for greenhouse cultivation of tomatoes; its vision system controlled the moving direction and kept the vehicle at the center of the free path. Mandow et al. [7] described the AURORA platform for greenhouse spraying, whose navigation was based on a set of basic behaviors established by an operator beforehand; thus it did not apply navigation techniques in a strict sense, and detailed navigation results were lacking. Singh et al. [8] developed a robotic vehicle with six-wheel differential steering for greenhouse spraying; it was tested on sand and concrete surfaces through simulated greenhouse aisles using ultrasonic sensors. A robot was developed by Henten et al. [9] for harvesting cucumbers in greenhouses, in which the heating pipes on the ground in the aisles were used to guide and support the robot during crop maintenance and harvest. Similarly, a rail-type traveling robot was described by Rajendra et al. [10] for strawberry harvesting with a vision algorithm in a tabletop-culture greenhouse. Kitamura and Oka [11] designed a recognition and cutting system of sweet peppers for picking robots in greenhouse horticulture; the picking operation was conducted in two simple situations, with and without leaves, in the laboratory while the prototype robot was stationary. Ko et al. [12] presented a system-of-systems approach to develop a mobile robotic platform for pesticide spraying. Sulakhe and Karanjkar [13] developed and tested a prototype pesticide-spraying robot for greenhouse use, whose navigation experiment was conducted by tracking a signal wire on the ground.

Most past research focused on navigation control and a specific task to be performed by robots in greenhouses. However, it is difficult to popularize such single-purpose robots due to their low efficiency and high cost. A multipurpose robot, on the other hand, is more versatile since it can carry out a variety of agricultural tasks. Monta et al. [14] studied a multipurpose robot in a vineyard, which performed

tasks such as harvesting, thinning, spraying and bagging with appropriate end-effectors. Belforte et al. [15] developed a low-cost robot prototype that can perform precision spraying and precision fertilization in a greenhouse. The development of multipurpose robots represents a valid solution to problems in greenhouse operations such as low efficiency, high cost, and labor shortage.

The objective of this work is to develop a multipurpose robot platform for several tasks, such as spraying (water, liquid fertilizer or pesticide) and weeding, which can be implemented by adding or removing the sensing component(s), replacing actuator(s) and switching control software. In this work, taking the two tasks of water spraying and mechanical weeding as examples, vision-based guidance and operations were conducted to assess the performance of the two tasks in a green-vegetable greenhouse, and to validate that different operations can be achieved on the same mobile platform.

Structure Design

The designed multipurpose robot meets the following requirements: ① it is able to travel autonomously along a guidance line with steering control on a flat field; ② it has an open control system and platform for easy extension and reconfiguration, meaning that several farming operations such as spraying, weeding and cutting can be performed on the same platform by adding or removing sensor(s), replacing actuator(s), and switching control software. The platform was therefore designed according to the accepted concept of modular design [2]. It was divided into four parts: a sensing unit, a control unit, an actuating unit, and a mobile platform.

The sensing unit mainly consisted of sensors for target recognition in navigation and operations. Sensors such as vision, encoders, gyros, laser radar, or their combination can be used to acquire the necessary information for navigation and operation. The control unit processed the sensing information and output commands to the mobile platform for robot motion and to the actuating unit for field operations. The actuating unit was chosen and replaced according to the field operation task. The mobile platform carried the other units of the robot and implemented the robot's motion. A wheeled configuration was chosen, since it is used by most agricultural machines on flat land. Here, the mobile platform included four identical wheel modules and a frame. Each wheel module, composed of a wheel, a wheel chain and a DC motor with a reduction gear, had a simple mechanical interface that allowed it to be mounted on the 1.2 m by 0.6 m robot frame. The robot was driven with differential steering, with the two DC motors on the same side sharing one motor driver.

The actuating unit, installed in the middle of the frame, included an elevating and translating mechanism (Figure 1) and an actuating tool. The former consisted of a horizontal linear rail and a vertical linear rail, each driven by a stepping motor, and it controlled the up-down and left-right motion of the actuating tool. The appropriate actuating tool was chosen and installed on the elevating and translating mechanism for different agricultural operations. Here we chose spraying (water, liquid fertilizer or pesticide) and mechanical weeding. For spraying water or pesticide, the actuating tool was made up of a solenoid valve 8 and a nozzle 9, as shown in Figure 1(a). For mechanical weeding, the actuating tool was an electric hoe 11, as shown in Figure 1(b).

(a) spraying operation (b) weeding by machine
Figure 1. Robot models. 1-Wheel; 2-Chain wheel mechanism; 3-Frame; 4-Machine vision sensor for guidance; 5-Front box; 6-Elevating and translating mechanism; 7-Rear box; 8-Solenoid valve; 9-Nozzle; 10-Machine vision sensor for operation; 11-Electric hoe.

The robot was designed with an overall length of 1200 mm, an overall width of 780 mm, an overall height of 970 mm, a frame height of 554 mm, a tread of 540 mm and a wheelbase of 740 mm, and it was powered by 24 V batteries. The batteries, motor drivers and controllers were placed in the front box 5, while a liquid tank was placed in the rear box 7 when a spraying operation was needed. A machine vision sensor 4 was used for guidance in the field tests; a gyro, sonar, laser radar or their combination can also be installed if needed, and even GPS can be used when the robot travels in the open air. The vision sensor 10 was used for detecting operation targets.
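To make this modular division concrete, the sketch below mirrors the four-part design in software: a platform object accepts interchangeable sensing and actuating units plus task-specific control software. This is a minimal illustration assuming a Python implementation; all class and method names are hypothetical and do not come from the robot's actual codebase.

```python
# Hypothetical sketch of the modular unit interfaces implied by the design;
# none of these names are taken from the paper's software.
from abc import ABC, abstractmethod

class SensingUnit(ABC):
    @abstractmethod
    def read(self) -> dict:
        """Return the latest measurements for navigation and operation."""

class ActuatingUnit(ABC):
    @abstractmethod
    def execute(self, command: dict) -> None:
        """Carry out one operation command (e.g. open valve, lower hoe)."""

class Robot:
    """The platform stays fixed; the task changes by swapping units."""
    def __init__(self, sensor: SensingUnit, actuator: ActuatingUnit):
        self.sensor = sensor
        self.actuator = actuator

    def step(self, controller) -> None:
        measurement = self.sensor.read()
        command = controller(measurement)  # task-specific control software
        self.actuator.execute(command)
```

In this view, switching from spraying to weeding amounts to constructing the platform with a different actuating unit and a different controller, which matches the claim that the mobile platform itself needs little or no change.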

Control Systems

Basic Control Structure

The control system of the robot included all sensors, the controllers (a host computer and a slave computer), and all drivers for the DC motors, stepping motors and actuating tools (Figure 2). A hierarchical structure was applied to the control system. The host computer, usually a laptop with high speed and large memory capacity, acted as the decision-making unit and control core of the robot, performing data acquisition and processing from the sensor(s) and path planning for navigation and operations according to the sensing information. The host computer also coordinated commands and output them to the slave computer. The slave computer, usually a microcomputer, only drove the DC motors, stepping motors and actuating tool through the relevant drivers to complete the tasks assigned by the host computer. An RS232 connection was chosen for communication between the host computer and the slave computer. The whole system had the characteristics of a modular structure, that is, the appropriate operation was carried out as soon as the relevant software and hardware were changed.

Figure 2. Control system of the agricultural robot.
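For illustration, a minimal sketch of the host-to-slave link is given below using the pyserial library. The one-line ASCII framing ("PWM left right") is a hypothetical protocol invented for this example, and the port name and baud rate are assumptions; the paper does not specify its message format.

```python
# Minimal host-side sketch of the RS232 link to the slave computer.
# The "PWM <left> <right>" framing is hypothetical, not the paper's protocol.
import serial  # pyserial

def send_wheel_pwm(link: serial.Serial, pwm_left: int, pwm_right: int) -> None:
    """Send one PWM command pair for the left and right motor drivers."""
    link.write(f"PWM {pwm_left} {pwm_right}\n".encode("ascii"))

if __name__ == "__main__":
    link = serial.Serial("/dev/ttyS0", baudrate=9600, timeout=0.1)
    send_wheel_pwm(link, 120, 128)  # e.g. a slight right-turn correction
    link.close()
```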

SMC-Based Motion Control

Motion Control Mode. The SMC (sliding mode control) method was applied to control the motion of the designed robot with differential-speed driving, since this type of robot is a typical nonholonomic system with multivariable, nonlinear and strongly coupled characteristics [16]. Because path following control is a special case of trajectory tracking control, trajectory tracking control is used to control the motion of the robot here. The kinematic models of the designed robot and of a reference robot with differential-speed control are given in Eq. 1 and Eq. 2, respectively:

$$\dot{P} = \begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{\theta} \end{bmatrix} = \begin{bmatrix} \cos\theta & 0 \\ \sin\theta & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} v \\ \omega \end{bmatrix} = Jq, \qquad (1)$$

θ cos r 0   v  P = sinθ 0 r = Jq , (2) r  r ω   r  0 1  T T T T where the poses P=[x, y, θ] and Pr=[xr, yr, θr] , the control inputs q=[v, ω] and qr=[vr, ωr] , the velocities v and vr, the angular speeds ω and ωr for the controlled robot and the referred robot respectively, and J is the Jacobian matrix. And v and ω meet the following constraint conditions:


$$-v_{\max} \le v(t) \le v_{\max}, \quad -\omega_{\max} \le \omega(t) \le \omega_{\max}, \quad \forall t \ge 0, \qquad (3)$$

where $v_{\max}$ and $\omega_{\max}$ are positive, and a negative sign indicates backward motion of the robot. The values of $v$ and $\omega$ are calculated from two encoders on the output shafts of the left and right motors. The goal of trajectory tracking control is to design a suitable controller $(v, \omega)$ satisfying Eq. 4, which drives the tracking error to zero:

$$\lim_{t \to \infty} \big[\, |x_r - x| + |y_r - y| + |\theta_r - \theta| \,\big] = 0. \qquad (4)$$

Define the following transformation:

$$P_e = \begin{bmatrix} x_e \\ y_e \\ \theta_e \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta & 0 \\ -\sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_r - x \\ y_r - y \\ \theta_r - \theta \end{bmatrix} = T \begin{bmatrix} x_r - x \\ y_r - y \\ \theta_r - \theta \end{bmatrix}. \qquad (5)$$

Here $(x_e, y_e, \theta_e)$ is defined as the tracking error, and $(x_e, y_e, \theta_e) = (0, 0, 0)$ means $x_r - x = 0$, $y_r - y = 0$ and $\theta_r - \theta = 0$. So the control goal (Eq. 4) is equivalent to:

$$\lim_{t \to \infty} \big[\, |x_e(t)| + |y_e(t)| + |\theta_e(t)| \,\big] = 0. \qquad (6)$$

Differentiating Eq. 5 and substituting Eq. 1 and Eq. 2 gives

$$\dot{x}_e = \dot{\theta}\big[-(x_r - x)\sin\theta + (y_r - y)\cos\theta\big] + (\dot{x}_r - \dot{x})\cos\theta + (\dot{y}_r - \dot{y})\sin\theta = \omega y_e - v + v_r\cos\theta_e,$$

$$\dot{y}_e = -\dot{\theta}\big[(x_r - x)\cos\theta + (y_r - y)\sin\theta\big] - (\dot{x}_r - \dot{x})\sin\theta + (\dot{y}_r - \dot{y})\cos\theta = -\omega x_e + v_r\sin\theta_e,$$

$$\dot{\theta}_e = \dot{\theta}_r - \dot{\theta} = \omega_r - \omega.$$

Then we get

$$\begin{cases} \dot{x}_e = \omega y_e - v + v_r\cos\theta_e \\ \dot{y}_e = -\omega x_e + v_r\sin\theta_e \\ \dot{\theta}_e = \omega_r - \omega \end{cases} \qquad (7)$$

Define a new set of control inputs as

$$\begin{bmatrix} u_0 \\ u_1 \end{bmatrix} = \begin{bmatrix} \omega_r - \omega \\ v_r\cos\theta_e - v \end{bmatrix}. \qquad (8)$$

Then Eq. 7 is rewritten as

$$\begin{cases} \dot{x}_e = (\omega_r - u_0)\, y_e + u_1 \\ \dot{y}_e = -(\omega_r - u_0)\, x_e + v_r\sin\theta_e \\ \dot{\theta}_e = u_0 \end{cases} \qquad (9)$$

So the control goal is to design $v$ and $\omega$ satisfying Eq. 3 that drive the tracking error $(x_e, y_e, \theta_e)$ to zero.

SMC Design. Set the switching function $s = \theta_e$, and let $\dot{s} = -k_1\,\mathrm{sgn}\,s - k_2 s$, where $\mathrm{sgn}\,s = s/(|s| + \delta)$ and $\delta$ is a small positive constant. Design the control law as

$$q = \begin{bmatrix} v \\ \omega \end{bmatrix} = \begin{bmatrix} v_r\cos\theta_e + k_3 x_e + k_3 y_e^2 x_e + \omega y_e \\ \omega_r + k_1\,\mathrm{sgn}\,s + k_2 s \end{bmatrix}. \qquad (10)$$

Stability Analysis of the Control System. Consider the candidate Lyapunov function

$$V = \tfrac{1}{2} x_e^2 + \tfrac{1}{2} \theta_e^2,$$

whose derivative is $\dot{V} = x_e \dot{x}_e + \theta_e \dot{\theta}_e$. Then we have

$$\dot{V} = x_e\big(\omega y_e - v + v_r\cos\theta_e\big) + s\dot{s} = x_e\big[\omega y_e + v_r\cos\theta_e - (v_r\cos\theta_e + k_3 x_e + k_3 y_e^2 x_e + \omega y_e)\big] - k_1 |s| - k_2 s^2 = -k_3 x_e^2 - k_3 y_e^2 x_e^2 - k_1 |s| - k_2 s^2 \le 0.$$

When $k_1$, $k_2$ and $k_3$ are all positive constants, the control system is globally asymptotically stable under the control law of Eq. 10.
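To make the controller concrete, the following is a minimal numerical sketch of the control law of Eq. 10 applied to the kinematic model of Eq. 1. The gains $k_1$, $k_2$, $k_3$, the smoothing constant $\delta$, the saturation limits and the Euler step size are illustrative values chosen for this example, not the parameters tuned on the robot.

```python
# Minimal sketch of the SMC tracking law (Eq. 10) on the kinematics (Eq. 1).
import math

def smc_control(pose, ref_pose, v_r, omega_r,
                k1=0.5, k2=1.0, k3=1.0, delta=1e-3,
                v_max=0.5, omega_max=1.0):
    """Return (v, omega) from Eq. 10 under the constraints of Eq. 3."""
    x, y, theta = pose
    x_r, y_r, theta_r = ref_pose
    # Eq. 5: rotate the world-frame error into the robot frame.
    dx, dy = x_r - x, y_r - y
    x_e = math.cos(theta) * dx + math.sin(theta) * dy
    y_e = -math.sin(theta) * dx + math.cos(theta) * dy
    theta_e = theta_r - theta
    # Switching function s = theta_e with the smoothed sign function.
    s = theta_e
    sgn_s = s / (abs(s) + delta)
    omega = omega_r + k1 * sgn_s + k2 * s
    v = v_r * math.cos(theta_e) + k3 * x_e + k3 * y_e**2 * x_e + omega * y_e
    # Saturate to the constraints of Eq. 3.
    v = max(-v_max, min(v_max, v))
    omega = max(-omega_max, min(omega_max, omega))
    return v, omega

def unicycle_step(pose, v, omega, dt=0.05):
    """One Euler step of the kinematic model of Eq. 1."""
    x, y, theta = pose
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + omega * dt)

# Usage: track a straight row at 0.2 m/s from a slightly offset start pose.
pose, ref = (0.0, 0.05, 0.1), (0.0, 0.0, 0.0)
for _ in range(200):
    ref = unicycle_step(ref, 0.2, 0.0)   # reference robot moving along the row
    v, omega = smc_control(pose, ref, 0.2, 0.0)
    pose = unicycle_step(pose, v, omega)
print(pose)  # final pose for inspection against the reference
```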

Algorithms for Guidance and Operations

Taking the two operations of water spraying and mechanical weeding on row-planted crops as examples, vision-based algorithms were developed for guidance and operations.

Algorithm for Guidance

A vertical projection method was used to calculate the guidance line (Figure 3). First, a binarized image obtained from an original crop image was divided into several equidistant image strips. Then, a vertical projection was computed for each strip to find the position of its peak pixel point. Figure 4 shows the process of this method. A local window image of 640×240 pixels was extracted from a crop image (Figure 4(a)) to speed up image processing, and it was binarized (Figure 4(b)). The following discriminants, presented in earlier work [17], were introduced to segment the image according to prior knowledge:

$$G(i, j) - G(i - p,\, j) > \varepsilon \qquad (13)$$

and

$$G(i, j) - G(i + p,\, j) > \varepsilon \qquad (14)$$

and

$$\bar{G}(i, j) = \frac{1}{9} \sum_{n=j-1}^{j+1} \sum_{m=i-1}^{i+1} G(m, n) > Thresh, \qquad (15)$$

where $G(i, j)$ is the gray value of the pixel point $(i, j)$, $p$ the plant size in pixels, $\varepsilon$ a small positive integer, $\bar{G}(i, j)$ the average gray value of the 3×3 area around the pixel point $(i, j)$, and $Thresh$ the gray-scale threshold. If a detected pixel does not satisfy the discriminants, its gray value is set to zero.
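A direct NumPy rendering of the three discriminants might look like the sketch below. The offset p, the margin eps and the threshold are placeholders to be tuned from the prior-knowledge image analysis described later, and the border wraparound introduced by np.roll is ignored for brevity.

```python
# Sketch of the segmentation discriminants of Eqs. 13-15 in NumPy.
import numpy as np

def segment_plants(gray, p=10, eps=5, thresh=90):
    """Keep a pixel only if it is brighter than the pixels p rows away
    (Eqs. 13 and 14) and its 3x3 mean exceeds the threshold (Eq. 15)."""
    g = gray.astype(np.int32)
    up = np.roll(g, p, axis=0)     # G(i - p, j)
    down = np.roll(g, -p, axis=0)  # G(i + p, j)
    # 3x3 local mean around each pixel (edges wrap; ignored for brevity).
    local = sum(np.roll(np.roll(g, di, 0), dj, 1)
                for di in (-1, 0, 1) for dj in (-1, 0, 1)) / 9.0
    mask = (g - up > eps) & (g - down > eps) & (local > thresh)
    return np.where(mask, g, 0).astype(np.uint8)
```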

Figure 3. Algorithm for calculating the crop row lines.

The binary image was divided into several equidistant image strips (Figure 4(c)), and then the vertical projection was performed to find the peak pixel point of each image strip (Figure 4(d)). A linear fitting method was used to calculate the crop row lines from the position coordinates of the peak pixel points in each target zone of an image strip. If there were two or more peak pixel points in a target zone, their position coordinate was averaged from the positions of the first and the last peak pixel point. Lastly, the crop row lines were calculated, as shown in Figure 4(e). Here, the robot traveled along the middle line, which was the desired path (guidance line) to follow. It should be noted that the loss of a single plant in any crop row can be ignored during image processing, while the middle line is calculated as the central line of the left and right crop rows when a stretch of the middle crop row is missing.
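The strip-projection and line-fitting steps can be sketched as follows. The number of strips is an assumed parameter, and numpy.polyfit stands in for whatever linear fitting routine the authors used; averaging the first and last projection peaks follows the rule stated above.

```python
# Sketch of the vertical-projection guidance-line steps.
import numpy as np

def guidance_points(binary, n_strips=8):
    """Return one (row, col) peak point per horizontal image strip."""
    h = binary.shape[0] // n_strips
    points = []
    for k in range(n_strips):
        strip = binary[k * h:(k + 1) * h, :]
        projection = strip.sum(axis=0)            # vertical projection
        peaks = np.flatnonzero(projection == projection.max())
        col = 0.5 * (peaks[0] + peaks[-1])        # average first/last peak
        points.append((k * h + h / 2.0, col))
    return points

def fit_row_line(points):
    """Least-squares line col = a*row + b through the strip peaks."""
    rows, cols = zip(*points)
    a, b = np.polyfit(rows, cols, 1)
    return a, b
```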


(a) crop image (b) binary image (c) image strip division

(d) peak pixel calculation (e) crop row lines
Figure 4. Process of obtaining crop row lines with the vertical projection method.

The desired path was regarded as the reference trajectory for the robot, and trajectory tracking control was implemented with the SMC method described above. Figure 5 shows the principle of the straight-line trajectory tracking control of the robot. The desired path R1R2 represents the straight-line trajectory to be tracked, and the line Q1Q2 indicates the robot's centerline on the ground, which corresponds to the centerline of the image. The point G is the intersection of the center line of the camera lens with the line Q1Q2. Here, the point G was regarded as the starting point of the robot, while the reference robot, located at the intersection of the line R1R2 with its perpendicular through the point G, traveled along the line R1R2 with a constant velocity and an angular speed of zero. The control velocity and angular speed of the robot were therefore determined according to the control law of Eq. 10 and the constraint conditions of Eq. 3. Further, the host computer calculated the PWM values for the left and right motors from the control velocity and angular speed of the robot, and output them to the slave computer through the RS232 connection to control the wheel speeds.

Algorithm for Operations

The robot recognized the operation target using the information from the machine vision sensor installed on the actuating unit, and it determined the actual positions of operation objects such as crops or weeds from the correspondence between the actual target position and its pixel position in an image. The control system output commands to the elevating and translating mechanism and to the actuating tool to execute the desired operation. The control algorithm for spraying water was the same as that for other spraying operations (Figure 6).

Figure 5. Principle of the straight-line trajectory tracking. Figure 6. Control algorithm of the spraying operation.
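One common way to realize the pixel-to-ground correspondence mentioned above, which the paper does not detail, is a planar homography fitted from a few calibration points on the ground plane. The sketch below uses standard OpenCV calls; the calibration coordinates are made-up numbers for illustration only.

```python
# Assumed pixel-to-ground mapping via a planar homography (not necessarily
# the authors' calibration method).
import numpy as np
import cv2

# Hypothetical calibration: pixel corners of a marker vs. ground coords (mm).
pixel_pts = np.float32([[100, 400], [540, 400], [560, 120], [80, 120]])
ground_pts = np.float32([[-200, 300], [200, 300], [200, 900], [-200, 900]])
H, _ = cv2.findHomography(pixel_pts, ground_pts)

def pixel_to_ground(u, v):
    """Map an image point to ground-plane coordinates (mm)."""
    pt = np.float32([[[u, v]]])
    gx, gy = cv2.perspectiveTransform(pt, H)[0, 0]
    return float(gx), float(gy)
```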

For mechanical weeding, the algorithm differed mainly in the recognition of the operation objects. First, the soil and plants were separated by binarization of the raw images. Then, the binarized images were processed with a morphological open-close operation. Next, weeds and green vegetables were distinguished by shape features (e.g. area or elongation). Subsequently, the centroids of the weeds and crops were determined using a centroid coordinate formula. Lastly, the real coordinates of the weeds and crops were determined from the coordinate relationship between pixels in the image and the corresponding points on the ground plane. Figure 7 shows the image processing of weeds and crops based on the area feature. It should be noted that these preliminary tests were intended to verify the performance of guidance, operations and operation switching, so the target recognition algorithm for the operating objects was kept simple.
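A hedged sketch of this recognition pipeline is given below. The excess-green binarization and the single area threshold are assumed simplifications (the paper only names area or elongation as usable shape features), and the threshold value is a placeholder to be tuned.

```python
# Sketch of the weed/crop discrimination pipeline: binarize, open-close,
# then classify connected components by area and locate centroids.
import cv2
import numpy as np

def find_targets(bgr, area_thresh=2000):
    # Separate plants from soil using the excess-green channel (an assumed
    # choice; the paper only says the raw image was binarized).
    b, g, r = cv2.split(bgr.astype(np.int32))
    exg = np.clip(2 * g - r - b, 0, 255).astype(np.uint8)
    _, binary = cv2.threshold(exg, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Morphological open-close to remove noise and fill small holes.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    binary = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
    # Connected components: large blobs -> crops, small blobs -> weeds.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    crops, weeds = [], []
    for i in range(1, n):  # label 0 is the background
        target = crops if stats[i, cv2.CC_STAT_AREA] >= area_thresh else weeds
        target.append(tuple(centroids[i]))
    return crops, weeds
```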

Experiments and Results

Experimental Design

The two tasks of water spraying and mechanical weeding were chosen to test the multipurpose function of the designed robot in a plastic greenhouse with green vegetables at Nanjing Agricultural University. The greenhouse measured 38 m × 12 m; the green vegetables were transplanted in rows 35 m long, with an average intra-row spacing of 120 mm and an average inter-row spacing of 400 mm. A TL-N12 laptop and a BasicATOM Pro 40m microcontroller were chosen as the host computer and the slave computer respectively, with an RS232 connection for communication between the two computers.

(a) raw image (b) processed binary image (c) separated crop image

(d) edge extraction of crop (e) separated weed image (f) edge extraction of weed
Figure 7. Image processing of weed and crop.

Two web cameras with a maximum resolution of 640×480 pixels, a focal length of 4.3 mm and a field of view of 31°, were installed at mounting angles of 30° and 0° as the vision sensors for navigation and operations respectively. An image was acquired every 0.5 s. Before the tests, crop images were acquired by the two cameras at different times from 9:00 a.m. to 4:00 p.m. on sunny and cloudy days, and they were analyzed to determine the threshold values. It should be noted that the two tasks of spraying and weeding are mainly carried out in the early stages after transplanting of the green vegetables. Additionally, because sunlight is partially shielded by the plastic cover, it has little effect on image acquisition and processing on sunny days, while image segmentation is easy on cloudy days due to the clear color contrast between the green vegetables and the soil.


The robot traveled at an initial speed of 0.2 m/s, measured by the encoders. In the water-spraying experiment, the robot traveled above one green vegetable row while spraying water directly toward every green vegetable, as shown in Figure 8(a). The spraying experiment was repeated three times; each time the robot began at the same starting point and traveled the same 30 m along a row. After each test, the lateral error was measured every 150 mm from the starting point along one wheel track. The lateral error was calculated as the difference between the distance from the center of one wheel track to the crop row and the distance from the wheel center to the robot centerline. The spraying performance was evaluated by human observation. Meanwhile, the spraying productivity of the robot was estimated from the total time it spent spraying 100 vegetables in each test, with a spraying time of 2 s per vegetable.
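As a worked instance of this definition: with the 540 mm tread, the wheel-center-to-centerline offset is 270 mm, so the lateral error is simply the measured track-to-row distance minus that constant.

```python
# Worked instance of the lateral-error definition used in the tests.
def lateral_error(track_to_row_mm: float, half_tread_mm: float = 270.0) -> float:
    """Measured wheel-track-to-crop-row distance minus the fixed
    wheel-center-to-robot-centerline offset (half the 540 mm tread)."""
    return track_to_row_mm - half_tread_mm

print(lateral_error(317.0))  # 47.0 mm, the reported maximum error
```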

(a) spraying water (b) weeding
Figure 8. Spraying and weeding operations.

In the mechanical weeding experiment, the robot likewise traveled above one green vegetable row while weeding (Figure 8(b)). It should be noted that weeds with shapes different from the vegetables were transplanted around the green vegetables. The experimental design for weeding was the same as that for water spraying above, and the weeding performance, including its timeliness, was evaluated by human observation.

Experiments and Results

For the spraying tests, Table 1 shows the statistical analysis of the lateral error data, and Figure 9 illustrates a single trajectory taken by the robot, where a negative sign indicates that the error is biased to the left. For the weeding tests, Table 2 and Figure 10 give the corresponding statistics and a single trajectory under the same sign convention.

Table 1. Analysis of navigation test data for spraying. Table 2. Analysis of navigation test data for weeding.

Figure 9. One trajectory of the robot in the spraying tests. Figure 10. One trajectory of the robot in the weeding tests.

From Table 1 and Table 2, the maximum lateral errors in the individual tests were between 44 and 47 mm. For the spraying tests, the average lateral error showed small variation (Table 1); the maximum average error of 7.67 mm appeared in the first test and the minimum average error of 5.32 mm in the third, indicating that the guidance error was distributed evenly on both sides over the three replications.

For the weeding tests, the average-error data in Table 2 likewise show an even distribution. The standard deviations of the lateral error in Table 1 and Table 2 show a similar level of dispersion in each test. According to the data in Table 1 and the trajectory in Figure 9, the control algorithm was able to reduce the lateral error and drive the robot along the vegetable row. Meanwhile, human observation confirmed that the elevating and translating mechanism and the solenoid valve acted in time to execute the spraying operation. The spraying productivity was about 16-20 plants/min. This seems a little low compared with spraying by hand, but the robot can operate continuously for long hours, which is not feasible for human operators. The data in Table 2 and the trajectory in Figure 10 show guidance performance similar to that of the spraying operation, since neither the control algorithm nor the test conditions changed. Additionally, human observation showed that the actuating unit, including the electric hoe, took the weeding action in time, although the weeding productivity still needs to be evaluated in future work.

Conclusions

A multipurpose robot was designed for several field operations in this work. When a new task is needed, the corresponding operation can be realized by adding or removing sensing components, replacing the actuating unit and switching the control software, with little or no change to the mobile platform. The SMC method was applied to control the motion of the designed robot with differential-speed driving. Taking the two operations of water spraying and mechanical weeding as examples, vision-based guidance and operations were conducted to assess the performance of the two tasks in a vegetable greenhouse. The preliminary tests showed that the robot traveled well, with a maximum lateral error of 47 mm across the two tasks; it sprayed with a productivity of 16-20 plants/min, and the weeding action was taken in time. The designed robot was found to be adaptable and versatile because different operations can be achieved on the same mobile platform. The test environment and the operation algorithms employed in the preliminary tests were simple; therefore, future research will focus on a control algorithm with autonomous turning and on the target recognition algorithm, to improve guidance accuracy and operating performance in more complex environments.

Acknowledgements

This study was supported by the Jiangsu Provincial Natural Science Foundation of China (Grant No. BK20151436).

References

[1] F. Dong, O. Petzold, W. Heinemann, et al. Time-optimal guidance control for an agricultural robot with orientation constraints. Comput. Electron. Agric. 99 (2013), 124-131.
[2] T. Bakker, K. A. Van, J. Bontsema, et al. Systematic design of an autonomous platform for robotic weeding. J. Terramechanics 47 (2010), 63-73.
[3] R. González, F. Rodríguez, J. Sánchez-Hermosilla, et al. Navigation techniques for mobile robots in greenhouses. Appl. Eng. Agric. 25 (2009), 153-165.
[4] S. S. Mehta, T. F. Burks, W. E. Dixon. Vision-based localization of a wheeled mobile robot for greenhouse applications: a daisy-chaining approach. Comput. Electron. Agric. 63 (2008), 28-37.
[5] S. Janos, I. Matijevics. Implementation of potential field method for mobile robot navigation in greenhouse environment with WSN support. International Symposium on Intelligent Systems and Informatics, New York, 2010, pp. 319-323.
[6] P. Dario, G. Sandini, B. Allotta, et al. The AGROBOT project for greenhouse automation. Acta Horticulturae 361 (1994), 85-92.

[7] A. Mandow, J. M. Gómez-de-Gabriel, J. L. Martínez, et al. The autonomous mobile robot Aurora for greenhouse operation. IEEE Robot. Autom. Mag. 3 (1996), 18-28.
[8] S. Singh, W. S. Lee, T. F. Burks. Autonomous robotic vehicle development for greenhouse spraying. Transactions of the ASAE 48 (2005), 2355-2361.
[9] E. J. V. Henten, J. Hemming, B. A. J. V. Tuijl, et al. An autonomous robot for harvesting cucumbers in greenhouses. Auton. Robot. 13 (2002), 241-258.
[10] P. Rajendra, N. Kondo, K. Ninomiya, et al. Machine vision algorithm for robots to harvest strawberries in tabletop culture greenhouses. Engineering in Agriculture, Environment and Food 2 (2009), 24-30.
[11] S. Kitamura, K. Oka. Recognition and cutting system of sweet pepper for picking robot in greenhouse horticulture. IEEE International Conference on Mechatronics and Automation, New York, 2005, pp. 1807-1812.
[12] M. H. Ko, B. Ryuh, K. C. Kim, et al. Autonomous greenhouse mobile robot driving strategies from system integration perspective: review and application. IEEE-ASME T. Mech. 20 (2014), 1-12.
[13] A. Sulakhe, M. N. Karanjkar. Design and operation of agricultural based pesticide spraying robot. International Journal of Science and Research 4 (2013), 1286-1289.
[14] M. Monta, N. Kondo, Y. Shibano. Agricultural robot in grape production system. IEEE International Conference on Robotics and Automation, New York, 1995, pp. 2504-2509.
[15] G. Belforte, R. Deboli, P. Gay, et al. Robot design and testing for greenhouse applications. Biosyst. Eng. 95 (2006), 309-321.
[16] V. Sankaranarayanan, A. D. Mahindrakar. Switched control of a nonholonomic mobile robot. Commun. Nonlinear Sci. 14 (2009), 2319-2327.
[17] J. Xue, W. , Vision-based guidance line detection in row crop fields. 2010 International Conference on Intelligent Computation Technology and Automation, New York, 2010, pp. 1140-1143.
