Machine Vision and Soil Trace-based Guidance-Assistance System for Farm Tractors in Soil Preparation Operations

The main objective of this study was to develop a guidance-assistance system for agricultural vehicles in land preparation and tilling operations. The proposed system is a potential replacement for markers on the implement toolbar. The new automatic steering system controls the vehicle path based on the trace left on the field by the previous tilling pass. The Hough transform method was adapted to process forward-view images captured by a digital camera mounted on the tractor. When the trace coincides with the center line of the image, no correction is necessary; otherwise, the position of the trace relative to the image center line indicates the required steering wheel adjustment. The proposed system was evaluated under various conditions, including three illumination levels, three previous plant residue levels (9, 24 and 50%) and soil surfaces tilled by one of three implements (plow, subsoiler and row-crop planter furrow opener). For each test the accuracy of the algorithm was determined and the Line Prediction Error (LPE) was calculated by the algorithm. Field tests were conducted at three forward speeds (4, 7.5 and 11 km h-1) and three steering angles (10°, 16° and 20.5°). Distance Error (DE) and Response Time (RT) were measured and compared for each forward speed and steering angle. Results indicated that the various illumination and residue levels did not distort the output of the proposed system (LPE was limited to ±1.72 cm). Response time decreased with increasing forward speed and steering angle. The effect of forward speed on DE was significant, whereas the effect of steering angle on DE was not (p ≤ 0.05).


Introduction
Automated guidance can greatly reduce driver fatigue and, as a result, increase both the productivity and the safety of farm operations. Different approaches have been studied for realizing automatic guidance of agricultural machines, using combinations of existing solutions including global positioning and computer vision systems (Jahns, 2000; Pilarski et al., 2002). One of the most common types of navigation sensor is the global positioning system (GPS). A navigation system based on a four-antenna real-time kinematic GPS (RTK-GPS), a fiber-optic gyro and an inclination sensor demonstrated completely autonomous operation, guiding an agricultural tractor along desired paths with a tracking accuracy of 0.04-0.06 m at normal field operating speeds (Bell, 2000). To compensate for GPS positioning error associated with machinery attitude, Kise et al. (2001, 2002) integrated an inertial measurement unit (IMU) with an RTK-GPS to provide more accurate navigation information. This integrated navigation system could guide agricultural machinery performing all field operations, including planting, cultivating and spraying, at travel speeds of up to 10.8 km h-1 with a tracking error of less than 0.05 m on both straight and curved paths. A GPS positioning system measures the position of one of the tractor's GPS antennas relative to a fixed reference station antenna; the absolute position of the tractor can be determined only if the absolute position of the reference station antenna is known. GPS-based guidance systems require not only a satellite antenna but also complicated peripheral instruments, and are therefore not applicable in all situations. Machine vision based guidance systems, on the other hand, are less complicated and use only the picture frames captured by a camera fixed to the tractor. Such a system is associated with lower error, and its output does not necessarily depend on weather conditions as in satellite-based systems.
No previous study was found on a guidance system based on detecting the furrow or trace left by tillage machines. In a vision-based vehicle guidance system, extracting guidance information from a trace (such as a soil cut edge, plant row or furrow) is the key step in achieving accurate control of the vehicle. A number of image processing techniques have been used to automatically guide a vehicle when the crop row structure is distinguishable in a field. Typical applications include guiding a tractor for row-crop cultivation or guiding a combine during harvesting, using a guidance sensor, i.e., the camera. Machine vision guidance has the advantage of using local features to fine-tune the vehicle navigation course. Its technological characteristics closely resemble those of a human operator, and it therefore has great potential for implementation in a vehicle guidance system (Wilson, 2000). Tillett et al. (2002) described a vision system for guiding hoes between rows of sugar beet. Image acquisition was done using a camera with a near-infrared filter. A band-pass filter was used to extract the lateral crop row location at eight scan bands in the image. The position and orientation of the hoe with respect to the rows was tracked using an extended Kalman filter. Søgaard and Olsen (2003) mounted a camera on a hand-operated vehicle, and later on a weeder, to evaluate the precision of an algorithm based on image analysis. The camera height was 1.15 m and the inclination of the optical axis from the vertical was 56°. The images were divided into band strips, which were mathematically 'unrolled'. The centre of gravity gave the position and an estimate of the relative accuracy. A weighted linear regression gave the positions of the rows. The mean position returned by their algorithm (trueness) was centered on the reference trace, with no statistical differences. The standard deviation (precision) was below 5 mm in the centre of the image and about three times higher under the camera. The working speed was 1.44 km h-1.
A control mechanism aiming to position seed drills relative to the previous lines was devised by Leemans and Destain (2007). The position was measured by a machine vision system and used in a feedback control loop. An articulated mechanism was used to ensure the lateral displacement of the drill relative to the tractor. The standard deviation of the error, measured as the difference between the observed inter-row distance and its set value, was 23 mm and its range was less than 100 mm, which was sufficient to fulfill the requirements of the application.
The main objective of the present study was to establish a technique to detect the path of newly tilled soil. A further objective was to develop and mount the necessary hardware on the tractor to maintain the desired inter-row spacing via an automated steering wheel, with no need to install control hardware on the implement.

System Components
The automated guidance system developed consisted of two parts: the visual navigation-sensing unit and the steering-control unit. An inexpensive universal serial bus (USB) camera, a Logitech QuickCam, was used. It was a color mono-charge-coupled device (CCD) camera with a fixed aperture. The camera was fixed at the end of an arm mounted on the tractor chassis, normal to the line of travel, with the optical axis at an angle of 35° to the horizontal, at a height of 1-1.1 m, facing the forward direction. The camera was plugged into the USB port of a laptop computer mounted in the tractor cabin. The camera produces digital images at resolutions from 1600×1200 down to 320×240 pixels, at a rate of 30 images per second, in various lighting conditions. Two electric hydraulic valves and an electric spool valve were accommodated in the existing open-center hydraulic system of the tractor (MF-399) to attain automatic steering of the wheels. An electric circuit was assembled and installed next to the tractor seat. Once the first row in the field has been tilled, the operating switch can be turned on to initiate vision-based automatic steering of the tractor.

Development of the Recognition Algorithm
The experiment was conducted on a field with corn crop residue from the previous year. The soil of the experimental site was a fine, mixed, mesic Calcixerollic Xerochrepts. An algorithm was developed to recognize the trace of the furrow formed by the previous tillage operation. The algorithm was expected to find the edge of the tilled soil, measure its coordinates and finally compare them to the center line of the captured image. Any deviation could be quantified in pixels, converted to an equivalent distance, and finally used by the electronic circuit to invoke the hardware response. The comprehensive algorithm was implemented in two software environments: Matlab (tester program) and C++ (Visual Studio, real-time program) (Figure 2). The images were acquired with a size of 640×480 pixels. Since a smaller image contains sufficient information at a lower computational load, the size was reduced to 60×50 pixels, the scaling coefficient being adjusted accordingly. The three-channel images taken by the CCD camera were converted into a two-dimensional array of pixels, with each pixel represented by a gray level (GL) between 0 and 255. In a typical image scene (Figure 3, Row 1), the brighter pixels with higher gray level represent undisturbed soil and plant residue, and the darker pixels with lower GL represent disturbed soil. Variations in the soil relief induce differences in the irradiance, so clods and tillage furrows add high uncertainty. Aliasing effects were avoided by applying a Gaussian convolution filter. The Gaussian filter is a linear filter usually used to smooth fine edges within each frame (Gonzales & Woods, 1992). The output of the Gaussian filter at each pixel is the weighted mean of the input values, with the weights defined by:

G(x, y) = (1 / (2πσ²)) exp(−(x² + y²) / (2σ²))    (1)

where x is the distance from the origin along the horizontal axis, y is the distance from the origin along the vertical axis, and σ is the standard deviation of the Gaussian distribution (Figure 3, Row 2).
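The Gaussian weighted-mean smoothing described above can be sketched as follows in NumPy. This is a minimal illustration only: the kernel size and σ value are assumptions, since the paper does not report the exact filter parameters, and the function names are hypothetical.

```python
import numpy as np

def gaussian_kernel(size: int, sigma: float) -> np.ndarray:
    """Build a normalized 2-D Gaussian kernel from G(x, y) ∝ exp(-(x² + y²)/(2σ²))."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    g = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return g / g.sum()  # normalize so the weights form a weighted mean

def smooth(image: np.ndarray, size: int = 5, sigma: float = 1.0) -> np.ndarray:
    """Convolve a gray-level image with the Gaussian kernel (edge padding)."""
    k = gaussian_kernel(size, sigma)
    half = size // 2
    padded = np.pad(image.astype(float), half, mode="edge")
    out = np.zeros(image.shape, dtype=float)
    rows, cols = image.shape
    for i in range(rows):
        for j in range(cols):
            # weighted mean of the neighborhood around pixel (i, j)
            out[i, j] = np.sum(padded[i:i + size, j:j + size] * k)
    return out
```

In practice a library routine (e.g. a separable Gaussian filter) would be used for speed; the explicit loop here only mirrors the weighted-mean definition of equation (1).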
The number of pixels in a gray-level image is often too large for efficient application of the Hough transform. The objective of edge detection was therefore to replace the GL image by a binarized edge map, reducing the size of the data and facilitating further processing. The sharp edge lies in the vicinity of the furrow edge, because of its high-frequency variation, while the other edges bound regions of low-frequency variation (Figure 3, Row 3). The Gaussian filter removed all the tiny edges from the edge map, which helped reduce the Hough transform processing time. By choosing an appropriate threshold in the edge detector algorithm, maximum sensitivity for distinguishing tilled from untilled soil was obtained, so that changes in soil texture and moisture did not affect the LPE.
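The thresholded adjacent-pixel gray-level difference test described here (and in the pseudocode of Figure 2) can be sketched as follows. The threshold value and function name are illustrative assumptions, not the authors' C++ implementation.

```python
import numpy as np

def edge_map(gray: np.ndarray, threshold: int = 30) -> np.ndarray:
    """Binarize a gray-level image: mark pixel (i, j) as an edge when the
    absolute gray-level difference to its right-hand neighbor exceeds
    the threshold."""
    # horizontal difference between each pixel and its right neighbor
    diff = np.abs(np.diff(gray.astype(int), axis=1))
    edges = np.zeros_like(gray, dtype=np.uint8)
    edges[:, :-1] = (diff > threshold).astype(np.uint8)
    return edges
```

A larger threshold suppresses small texture edges (clods, residue) and keeps mainly the tilled/untilled boundary, which is what limits the data passed on to the Hough transform.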
Figure 3. Subsequent manipulations of picture frames of (a) plowed, (b) subsoiled and (c) furrowed soil (operations include gray-level, smoothed and edge-map images)

Application of the Hough Transform
The Hough transform is widely used for extracting a straight line from pixel points that potentially form a line in an image (Gonzales & Woods, 1992). The Hough transform is a non-linear transformation in which, for every white pixel (X, Y) of an image, the parameters (ρ, θ) are computed according to equation (2).

ρ = X cos θ + Y sin θ    (2)
where X and Y are the position, given in Cartesian coordinates, of the transformed point (white pixel) in the image space, and ρ and θ are the parameters represented in the Hough space. The representation of a line by the Hough transform is a purely mathematical expression: the parameters ρ and θ do not represent physical objects or distances and, unlike X and Y, which carry direct location information within the image, do not correspond to any feature that can be identified in the real scene. The conversion to the (ρ, θ) space, however, eases the data processing.
The transformation was conducted by applying equation (2) over the range from −90° to 90° with 2° increments for θ. Plotting the ρ values obtained for the different values of θ yields a sinusoid for every pixel transformed. This transformation method rests on the principle that, if points are aligned, all their sinusoids cross at one single point of the parameter space. The crossing point, denoted by the parameter pair (ρ₀, θ₀), substituted into equation (2) and solved for Y, gives the line in the image space:

Y = (ρ₀ − X cos θ₀) / sin θ₀    (3)

In theory, all the points in a row are collinear, and therefore their sinusoids in the Hough space meet at one crossing point, whose coordinates lead to the searched line. In practice, the outcome of the point-encoding operation is a series of points that are roughly collinear but not aligned in a perfectly straight line, as in the ideal case shown in Figure 4. As a result, the points yielded by the Hough transform have a cloud-shaped distribution in the parameter space. Multiple sinusoids crossing at a specific point create heavier areas in the parameter space, indicating a high probability that the corresponding image points are aligned; these areas contain the potential lines. A vote accumulator approach, in which one vote is added every time a sinusoid crosses a certain location, is used to determine the best estimated crossing point of a trajectory in the Hough space. High peaks in the Hough space represent straight lines in the image space; points with a low number of votes represent noise and must be discarded.
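The vote-accumulator procedure described above can be sketched as follows, using the same ρ = X cos θ + Y sin θ parameterization and 2° increments over [−90°, 90°). This is a minimal illustration under those assumptions, not the authors' real-time implementation.

```python
import numpy as np

def hough_line(edges: np.ndarray, theta_step_deg: float = 2.0):
    """Accumulate votes rho = x*cos(theta) + y*sin(theta) for every edge
    pixel over theta in [-90°, 90°) and return the best (rho, theta_deg)."""
    thetas = np.deg2rad(np.arange(-90.0, 90.0, theta_step_deg))
    rows, cols = edges.shape
    diag = int(np.ceil(np.hypot(rows, cols)))  # largest possible |rho|
    # accumulator: one row per integer rho in [-diag, diag], one column per theta
    acc = np.zeros((2 * diag + 1, len(thetas)), dtype=int)
    ys, xs = np.nonzero(edges)
    for x, y in zip(xs, ys):
        for t_idx, theta in enumerate(thetas):
            rho = int(round(x * np.cos(theta) + y * np.sin(theta)))
            acc[rho + diag, t_idx] += 1  # one vote per sinusoid crossing
    # the highest peak corresponds to the strongest straight line
    r_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
    return r_idx - diag, float(np.rad2deg(thetas[t_idx]))
```

Thresholding the accumulator (discarding bins with few votes) removes the noise-induced cells of the cloud-shaped distribution, leaving only the heavy areas that contain potential lines.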
Real field images inevitably contain noise, which causes some points to be located at a distance from the center of the imaged trace. Such points, known as outliers, are a constant source of error when a regression analysis is applied. The Hough transform, in contrast, is quite robust against noise and missing parts: an outlier, no matter how far it is from the expected position, does not affect the equation of the line. After the subsequent operations described above, the Hough transform yields a single straight line for each image (Figure 5).

Experiments
The trace detection method developed and described above was used to detect the relative position of the tractor with respect to the furrow formed by the tillage tools. To validate the real-time system, both a primary evaluation and a field validation were performed. Several hundred images of the traces, representing many kilometers of travel, were acquired in the field by the camera fixed to the tractor. Images were captured for three tillage operations, performed by a four-bottom semi-mounted moldboard plow, a three-shank mounted subsoiler, or a five-row crop planter, from March 2009 to December 2009 in daylight conditions, on fields with drying soil and at various plant residue levels. For the primary evaluation all the images were analyzed; for each image one line was found, and the deviation of this line with respect to the image center line (coinciding with the furrow edge) was computed by the algorithm. This deviation was named the Line Prediction Error (LPE).
The field evaluation was judged on two indices: Response Time (RT) and Distance Error (DE). The former is the time elapsed from the moment the system is invoked (when the driver switches on the automatic steering system) until the tractor front wheels establish the desired spacing to the target line. This time was measured by dividing the length of the front-wheel trace by the constant tractor forward speed. DE is the positive or negative deviation of the tractor front wheels from the target line and was measured with a tape measure while the tractor was operating in the field (Figure 6). The distance error is zero when the tractor front wheel is moving on the target line.

Figure 6. Experimental results of Response time and distance error
In order to damp unnecessary output responses, a threshold was added to the controller algorithm. The threshold interval was set to ±1 cm from the desired path, so only deviations exceeding this interval could produce a system output causing a steering wheel adjustment (Figure 6).
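The ±1 cm deadband logic can be sketched as a three-state decision, assuming a signed lateral deviation in centimeters; the function name and the return convention (−1 left, 0 hold, +1 right) are illustrative assumptions.

```python
def steering_command(deviation_cm: float, deadband_cm: float = 1.0) -> int:
    """Apply the ±1 cm deadband: deviations inside the interval produce
    no output; larger deviations request a correction toward the path.
    Returns -1 (steer left), 0 (hold) or +1 (steer right)."""
    if abs(deviation_cm) <= deadband_cm:
        return 0  # inside the threshold interval: damp the response
    return 1 if deviation_cm > 0 else -1
```

Without such a deadband, pixel-level jitter in the detected line would continuously toggle the electric spool valve.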

Primary Evaluation of the Algorithm
The effects of lighting conditions and plant residue levels on LPE were investigated in a factorial experiment arranged in an RCBD with three replications. To validate the trace detection algorithm and search for the optimal setting of system parameters, a laboratory evaluation was carried out. For image collection the camera was fixed above the furrow (focused on the center of the image scene), at the end of an arm mounted on the right-hand side of the tractor front axle. It faced forward at an angle of 30°; this arrangement resulted in successful capture of images of the previously tilled soil located on the right-hand side of the tractor. Image capturing was conducted in the morning, at noon and at sunset in early summer 2009 at the agricultural experiment station of Shiraz University. Three fields were considered for the experiments (laboratory and field). Field residues were estimated using camera images; analysis of the captured images indicated that 9, 24 and 50% of the surfaces of the first, second and third fields, respectively, were covered with residue. Care was taken to account for both green and dried residues on the field surface; to this effect appropriate algorithms were developed and used to analyze the captured images. Figure 7 shows the raw and processed field images for the three fields under study.
As mentioned earlier, the soil was tilled by the moldboard plow, the subsoiler or the row crop planter. A total of 81 plots was therefore considered (3 illumination levels × 3 residue levels × 3 implements × 3 replications). For each plot a total of 30 images was captured, and the LPE for each plot was estimated.
Figure 7. Raw and processed images of fields with surface coverage of 9, 24 and 50%

Field Evaluation of the Developed System
Throughout the field evaluation, data were collected at three different steering angles (10°, 16° and 20.5°). It was therefore necessary to determine the excitation time required for the electric spool valve to attain each of these steering angles. Figure 8 shows the relationship between the opening time of the spool valve and the steering angle at both idle and rated engine speeds. Three opening times, 100, 150 and 200 ms, proved able to maintain the steering angles of 10°, 16° and 20.5°, respectively.
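The three measured opening-time/steering-angle pairs could serve as a calibration table. The sketch below linearly interpolates between them; this interpolation is purely an assumption for illustration, since the study itself used only the three discrete opening times.

```python
# Calibration pairs reported in the field tests:
# spool-valve opening time (ms) -> steering angle (deg)
CALIBRATION = [(100, 10.0), (150, 16.0), (200, 20.5)]

def opening_time_ms(target_angle_deg: float) -> float:
    """Linearly interpolate the spool-valve opening time for a target
    steering angle between the measured calibration points; angles
    outside the measured range are clamped to the nearest point."""
    pts = sorted(CALIBRATION, key=lambda p: p[1])
    if target_angle_deg <= pts[0][1]:
        return float(pts[0][0])
    for (t0, a0), (t1, a1) in zip(pts, pts[1:]):
        if target_angle_deg <= a1:
            frac = (target_angle_deg - a0) / (a1 - a0)
            return t0 + frac * (t1 - t0)
    return float(pts[-1][0])
```

Note that the true valve response is unlikely to be exactly linear between the calibration points, which is why the study verified each opening time experimentally (Figure 8).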

Field tests were conducted to evaluate the performance of the comprehensive system incorporating both software and hardware. Tests were conducted in the field with no driver in the tractor cabin. The validation criterion for the field tests was that the vision-sensing unit should be capable of guiding the tractor automatically in the field at normal forward speeds (up to 11 km h-1). The system was validated in real time for the three tillage tools introduced earlier. The trace of the tillage tool, as detected by the machine vision system, provided navigation signals that were used to steer the tractor. The field evaluation was conducted at three forward speeds (4, 7.5 and 11 km h-1) and three steering angles (10°, 16° and 20.5°). The collected data on RT and DE were analyzed in a factorial experiment arranged in an RCBD with three replications.

Effects of Illuminations, Plant Residue and Tools on LPE at Preliminary Evaluation
As mentioned in the previous section, data on LPE were extracted from the field images captured in the preliminary study (Tables 2 and 3). Analysis of the LPE data indicated that no significant difference exists among these treatments or their combinations. The tables show that the range of deviation was limited to ±1.72 cm; this error resulted from unsharp, collapsed or smoothed furrow edges. It can therefore be concluded that the proposed algorithm can detect the border between tilled and untilled soil with a considerably small error, and that the developed algorithm can be used in the automatic steering system without disturbing effects of illumination level, residue level or type of tool. Field tests were also conducted to monitor the possible combined effects of the algorithm and the hardware of the automatic steering wheel control system developed in the study.

Response Time and Tractor Heading Deviation
Considering the previous results, which indicated insignificant effects of light conditions and residue levels, RT was measured in the field at three forward speeds and three steering angles with five replications (Figure 9). It should be noted that at the start of each test the tractor was placed at a lateral distance of 0.6 m, measured from the tractor right-hand wheel to the border line of tilled/untilled soil. The minimum standard deviation of 0.016 (corresponding to the forward speed of 4 km h-1 and the steering angle of 10°) is mainly due to the momentum of the tractor in the field. Performance was better at 4 km h-1 than at the two higher speed levels. The standard deviation was 0.0378 cm at the fastest speed, which still demonstrates a control performance similar to that of human driving; nevertheless, performance was better at the lower speed levels. Figure 10a shows that as the tractor forward velocity increases, the capability of the automatic steering is reduced; in other words, the maximum deviation from the desired path increases for high-speed operation. The DE values would ensure the compatibility of soil operations equipped with this guidance assistance with non-matching-width harvesting and plant protection machines using the machine vision system. Moreover, compared with GPS guidance systems, machine vision systems are much cheaper and, in some conditions, more accurate.

Conclusions
The software and hardware developed could satisfactorily track the transition between tilled and untilled soil. The algorithm utilized the image output from a camera mounted directly above the furrow edge. The system was evaluated both in the laboratory and in the field, and was capable of accurately locating the tilled/untilled edge using real-time images. The new controller system not only reduces driver fatigue but also increases the accuracy of tillage operations. Furthermore, the system is versatile: no hardware or software adjustment is needed when the tilling operation changes. It was found that the trueness of the system was influenced by the forward tractor speed. The precision was 30 mm and the range was 80 mm.
Other results are:
1. The proposed algorithm can be used in the automatic steering system without disturbing effects of illumination and previous residue levels.
2. The developed system can be used for the three operations of plowing, subsoiling and furrowing (when planting with a row crop planter).
3. The response time decreases as the forward speed and steering angle increase.
4. The experiment was conducted at forward speeds of 4, 7.5 and 11 km h-1, since agricultural vehicles operate within this speed range. The system was not tested at higher speeds, but it is clear that DE increases as forward speed increases.
5. To attain a lower DE, it is advisable to drive the tractor at the lower speed levels; forward speeds of 4 or 5 km h-1 are therefore recommended. The choice among the tested steering angles makes no difference to the system output.

Figure 1 .
Figure 1. Schematic diagram of a) the components of the hydraulic steering system of a typical MF-399 tractor and b) the components of the hydraulic steering system of the tractor equipped with extra controlling elements (MSV: open-center mechanical spool valve; ESV: open-center electrical spool valve; EHV1: normally open electrical hydraulic valve; EHV2: normally closed electrical hydraulic valve)

Figure 2 .
Figure 2. Flowchart of the comprehensive algorithm developed in this study. The edge detector algorithm used in the study (Section 2.2.1, Acquiring Image and Pre-processing):
For all rows and columns in the image
    For i from 1 to (row length − 1)
        If gray value(pixel i) − gray value(pixel i+1) exceeds the threshold, mark pixel i as an edge
Figure 4 graphically illustrates this concept.

Figure 4 .
Figure 4. Hough transform for line detection

Figure 5 .
Figure 5. Results of application of the Hough transform and output line for a) plowed, b) subsoiled and c) furrowed soil

Figure 8 .
Figure 8. Spool valve opening time to attain various tractor steering angles

Figure 10a .
Figure 10a. Characteristic performance of the developed guidance system heading for the target path at three steering angles and a forward speed of 4 km h-1 on a subsoiled field

Table 2 .
Results of LPE on three soil cover residues and three illumination levels for images of plowing and subsoiling operations

Table 3 .
Data on LPE at three illumination levels for images of row crop planter operation