CN113432597B - Composite visual navigation system applied to inter-row management of complex outdoor environment - Google Patents


Info

Publication number
CN113432597B
CN113432597B (application CN202110878640.2A)
Authority
CN
China
Prior art keywords
module
point
navigation system
openmv
management
Prior art date
Legal status
Active
Application number
CN202110878640.2A
Other languages
Chinese (zh)
Other versions
CN113432597A (en)
Inventor
夏俊芳
周明宽
张帅
衡一帆
国立伟
江锍
胡梦杰
刘政源
Current Assignee
Huazhong Agricultural University
Original Assignee
Huazhong Agricultural University
Priority date
Filing date
Publication date
Application filed by Huazhong Agricultural University filed Critical Huazhong Agricultural University
Priority to CN202110878640.2A
Publication of CN113432597A
Application granted
Publication of CN113432597B


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/005 Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C21/10 Navigation by using measurements of speed or acceleration
    • G01C21/12 Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning

Abstract

The invention discloses a composite visual navigation system for inter-row management in complex outdoor environments. The system comprises a detection unit electrically connected to a single-chip microcomputer module, which in turn is connected to a control unit. To counter the effect of changing illumination on outdoor agricultural visual navigation, the HSV color space is adopted so that hue is separated from illumination intensity; tests show that, once a threshold is determined, paths can be accurately identified under illumination changes throughout the day. To handle the dense road-surface weeds and complex, changeable environment of a citrus orchard, the natural path formed between the tree crowns and the sky is identified and used as the basis for path recognition.

Description

Composite visual navigation system applied to inter-row management of complex outdoor environment
Technical Field
The invention belongs to the technical field of agricultural management, and particularly relates to a composite visual navigation system applied to inter-row management in complex outdoor environments.
Background
The autonomous navigation of agricultural robots not only frees laborers from repetitive, monotonous work but also improves the accuracy and efficiency of agricultural operations. At present, research on autonomous agricultural navigation mainly covers three modes: laser positioning navigation, machine vision navigation, and satellite positioning navigation (GPS). Satellite positioning navigation (GPS) can obtain the absolute position for navigation, but it has poor anti-interference performance, easily loses its signal when occluded, acquires only a single kind of information, and is expensive. Laser positioning navigation has a high manufacturing cost and relatively high requirements on external light, the ground, visibility, and other environmental conditions. Machine vision navigation has the advantages of a wide signal detection range, rich and complete information, non-contact measurement, and high cost-effectiveness, and is widely applied in fields such as unmanned aerial vehicles, surgical instruments, transportation, and agricultural production.
However, in complicated outdoor environments with uneven ground, short trees, and thick weeds, managing crops by manual operation is difficult and costly.
Therefore, a composite visual navigation system for inter-row management in complex outdoor environments is needed to provide a powered mobile base platform for inter-row management.
Disclosure of Invention
The invention aims to overcome the above defects in the prior art and to provide a composite visual navigation system applied to inter-row management in complex outdoor environments, so as to solve the problems of difficult manual operation and high cost described in the background.
In order to achieve this purpose, the invention provides the following technical scheme: a composite visual navigation system applied to inter-row management in complex outdoor environments includes a detection unit comprising a speed sensor, an OPENMV module, a MATLAB module, and a camera, all electrically connected to the single-chip microcomputer module; the single-chip microcomputer module is electrically connected to the control unit, the control unit to the processing unit, and the processing unit to the execution unit.
Preferably, the camera is located at the top of the control cabinet with an upward viewing angle of 135°; an OPENMV module is arranged on one side of the control cabinet, and a baffle on the other side of the OPENMV module; the model of the camera is USB2.0 HD UVC WebCam.
Preferably, the execution unit comprises a left driving wheel, a right driving wheel, a crawler, a direct-current motor, and a crawler chassis. The execution unit is located below the baffle; the left driving wheel, right driving wheel, and direct-current motor are mounted on the crawler chassis; the crawler runs on the outer sides of the left and right driving wheels; the left and right driving wheels are in transmission connection with the output end of the direct-current motor; and speed sensors are arranged at the shaft ends of the left and right driving wheels.
Preferably, the MATLAB module obtains an MJPEG-format image with effective resolution of 320 × 240 pixels by calling the videoinput function; the image is an RGB color picture.
Preferably, an image preprocessing module, a morphology processing module, and a route reference line determining module are arranged inside the control cabinet. Based on an HSV color space model, the image preprocessing module divides the image into three mutually non-interfering channels: an H channel (hue), an S channel (saturation), and a V channel (brightness); the hue value H lies in the range 0.18 to 0.413.
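The hue-band segmentation described above can be sketched in a few lines of Python (a minimal illustration under the stated 0.18 to 0.413 hue thresholds; the patent itself performs this step in MATLAB, and the helper names here are illustrative):

```python
import colorsys

# Hue thresholds from the disclosure: green lies in H = 0.18 to 0.413.
H_MIN, H_MAX = 0.18, 0.413

def is_green(r, g, b):
    """True if an RGB pixel (components 0-255) falls in the green hue band.

    colorsys.rgb_to_hsv takes components in [0, 1] and returns (h, s, v)
    with h also scaled to [0, 1], matching the 0.18-0.413 range above.
    """
    h, _s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return H_MIN <= h <= H_MAX

def segment(image):
    """Binarize an image given as rows of (r, g, b) tuples: 1 = vegetation."""
    return [[1 if is_green(*px) else 0 for px in row] for row in image]

# One row: a green leaf pixel and a blue sky pixel.
print(segment([[(30, 200, 40), (120, 160, 255)]]))  # [[1, 0]]
```

Because hue is computed independently of the V (brightness) channel, the same thresholds keep working as illumination changes through the day, which is the point of choosing HSV over RGB.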
Preferably, the course datum line determining module performs secondary screening using the feature points on both sides of the intersection of a horizontal line with the edge, defined respectively as the left feature point left_point_x and the right feature point right_point_x, where (n, m) are the coordinates of a feature point and (n, m) ∈ (i, j), and
left:

left_point_x(i) = min{ n : p(n, m) = 1 }

right:

right_point_x(i) = max{ n : p(n, m) = 1 }

where p(n, m) is the pixel value of the edge. The feature point of the final path is the midpoint between the starting point and the ending point, calculated as:

path_point_x(i) = ( left_point_x(i) + right_point_x(i) ) / 2
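A minimal Python sketch of the per-row feature-point computation described above, assuming the left and right feature points are taken as the leftmost and rightmost edge pixels on each horizontal scan line (function names are illustrative):

```python
def row_feature_point(row):
    """Path feature point for one horizontal scan line of a binary edge image.

    Takes the leftmost and rightmost edge pixels (value 1) as the left and
    right feature points and returns the midpoint of their column indices,
    i.e. path_point_x = (left_point_x + right_point_x) / 2.
    Returns None when the row contains no edge pixel.
    """
    cols = [n for n, p in enumerate(row) if p == 1]
    if not cols:
        return None
    left_point_x, right_point_x = min(cols), max(cols)
    return (left_point_x + right_point_x) / 2.0

def path_feature_points(binary_image):
    """Apply the per-row extraction to every horizontal strip of the image."""
    return [row_feature_point(row) for row in binary_image]

# Edge pixels at columns 2 and 8 give the midpoint column 5.
print(row_feature_point([0, 0, 1, 0, 0, 0, 0, 0, 1, 0]))  # 5.0
```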
preferably, the MATLAB module includes a Hough function, a Houghpeaks function, and a Houghlines function.
Preferably, the OPENMV module determines the threshold value through the IDE tool based on the OPENMV visual processing chip.
Preferably, the processing unit adopts STM32 hierarchical control.
Preferably, the angle threshold value of the straight line driving of the crawler chassis is +5 degrees to-3 degrees.
Compared with the prior art, the invention provides a composite visual navigation system applied to inter-row management in complex outdoor environments, with the following beneficial effects:
1. Because outdoor agricultural visual navigation is susceptible to illumination changes, an HSV color space is adopted in which hue is separated from illumination intensity; tests show that, once a threshold is determined, paths are accurately identified under illumination changes throughout the day.
2. For the dense road-surface weeds and complex, changeable environment of a citrus orchard, the natural path formed between the tree crown and the sky is identified and used as the basis for path recognition; tests show that this path identification algorithm is fast and accurate and meets the requirements of actual autonomous navigation operation.
3. For path identification errors caused by uneven tree crowns, an improved Hough line-fitting algorithm is proposed: a threshold determines the farthest point of the effective path, which is taken as the target point and connected with the reference point of the navigation platform to form the navigation path; tests show that this makes path identification faster and more reliable.
4. The auxiliary OPENMV module enables the navigation platform to steer and operate autonomously, improving the completeness and originality of the platform, which can thus achieve inter-row autonomous navigation during field operation.
The device is scientifically and reasonably structured, safe and convenient to use, and of great practical help.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and constitute a part of this specification, illustrate embodiments of the invention and, together with the description, serve to explain its principles without limiting the invention:
FIG. 1 is a block diagram of a composite visual navigation system for interrow management in a complex outdoor environment according to the present invention;
FIG. 2 is a block diagram of a test platform system of a composite visual navigation system applied to interrow management in a complex outdoor environment according to the present invention;
FIG. 3 is a system logic diagram of a composite visual navigation system for interrow management in a complex outdoor environment according to the present invention;
FIG. 4 is a schematic diagram of the relationship between the angle and time obtained by the test of the composite visual navigation system applied to the row management of the complex outdoor environment according to the present invention;
FIG. 5 is a schematic diagram of a binary image obtained by a morphological processing module in a composite visual navigation system applied to interline management in a complex outdoor environment according to the present invention;
FIG. 6 is a schematic diagram of an HSV-based image segmentation result in a composite visual navigation system for interrow management in a complex outdoor environment according to the present invention;
fig. 7 is a schematic view of a main view structure of a composite visual navigation system applied to inter-row management in a complex outdoor environment according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
Referring to fig. 1-7, the present invention provides a technical solution: a composite visual navigation system applied to inter-row management in complex outdoor environments includes a detection unit comprising a speed sensor, an OPENMV module 3, a MATLAB module, and a camera 4, all electrically connected to the single-chip microcomputer module; the single-chip microcomputer module is electrically connected to the control unit, the control unit to the processing unit, and the processing unit to the execution unit 1.
Preferably, the camera 4 is located at the top end of the control cabinet 2 with an upward viewing angle of 135°; an OPENMV module 3 is arranged on one side of the control cabinet 2, and a baffle 5 on the other side of the OPENMV module 3; the model of the camera 4 is USB2.0 HD UVC WebCam.
In the invention, preferably, the execution unit 1 comprises a left driving wheel 11, a right driving wheel 11, a crawler 12, a direct-current motor, and a crawler chassis. The execution unit 1 is located below the baffle 5; the left driving wheel 11, right driving wheel 11, and direct-current motor are mounted on the crawler chassis; the crawler 12 is arranged on the outer sides of the left and right driving wheels 11; the left and right driving wheels 11 are in transmission connection with the output end of the direct-current motor; and speed sensors are arranged at their shaft ends.
In the present invention, preferably, the MATLAB module obtains an MJPEG-format image with effective resolution of 320 × 240 pixels by calling the videoinput function; the image is an RGB color picture.
In the present invention, preferably, an image preprocessing module, a morphology processing module, and a route reference line determining module are disposed inside the control cabinet 2. Based on an HSV color space model, the image preprocessing module divides the image into three mutually non-interfering channels: an H channel (hue), an S channel (saturation), and a V channel (brightness); the hue lower limit is Hmin = 0.18.
In the invention, preferably, the route reference line determining module performs secondary screening using the feature points on both sides of the intersection of a horizontal line with the edge, defined respectively as the left feature point left_point_x and the right feature point right_point_x, where (n, m) are the coordinates of a feature point and (n, m) ∈ (i, j), and
left:

left_point_x(i) = min{ n : p(n, m) = 1 }

right:

right_point_x(i) = max{ n : p(n, m) = 1 }

where p(n, m) is the pixel value of the edge. The feature point of the final path is the midpoint between the starting point and the ending point, calculated as:

path_point_x(i) = ( left_point_x(i) + right_point_x(i) ) / 2
in the present invention, preferably, the MATLAB module includes a Hough function, a Houghpeaks function, and a Houghlines function.
In the present invention, preferably, the OPENMV module 3 determines the threshold value through the IDE tool based on the OPENMV visual processing chip.
In the present invention, preferably, the processing unit adopts STM32 hierarchical control.
In the invention, the angle threshold value of the straight line running of the crawler chassis is preferably between +5 degrees and-3 degrees.
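The morphology processing module of this embodiment removes noise with one opening and one closing operation. A minimal pure-Python sketch of those operations follows, using a 3 x 3 square structuring element for brevity where the patent uses an octagon (function names are illustrative):

```python
def _neigh_vals(img, i, j):
    """Values of the 3x3 neighbourhood of (i, j); out-of-bounds counts as 0."""
    h, w = len(img), len(img[0])
    return [img[i + di][j + dj] if 0 <= i + di < h and 0 <= j + dj < w else 0
            for di in (-1, 0, 1) for dj in (-1, 0, 1)]

def erode(img):
    """Erosion: a pixel survives only if its whole 3x3 neighbourhood is 1."""
    return [[1 if all(v == 1 for v in _neigh_vals(img, i, j)) else 0
             for j in range(len(img[0]))] for i in range(len(img))]

def dilate(img):
    """Dilation: a pixel turns on if any 3x3 neighbour is 1."""
    return [[1 if any(_neigh_vals(img, i, j)) else 0
             for j in range(len(img[0]))] for i in range(len(img))]

def open_close(img):
    """One opening (erode then dilate) to drop noise, then one closing
    (dilate then erode) to fill small holes, as the morphology module does."""
    opened = dilate(erode(img))
    return erode(dilate(opened))

# A lone noise pixel in an otherwise empty 5x5 image is removed.
noise = [[0] * 5 for _ in range(5)]
noise[0][4] = 1
print(sum(map(sum, open_close(noise))))  # 0
```

Opening deletes isolated noise pixels smaller than the structuring element; closing fills comparably small holes while preserving the overall crown silhouette.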
Example 2
Referring to fig. 1-7, the present invention provides a technical solution: a composite visual navigation system applied to inter-row management in complex outdoor environments includes a detection unit comprising a speed sensor, an OPENMV module 3, a MATLAB module, and a camera 4, all electrically connected to the single-chip microcomputer module; the single-chip microcomputer module is electrically connected to the control unit, the control unit to the processing unit, and the processing unit to the execution unit 1.
In the invention, preferably, the camera 4 is located at the top end of the control cabinet 2 with an upward viewing angle of 135°; an OPENMV module 3 is arranged on one side of the control cabinet 2, and a baffle 5 on the other side of the OPENMV module 3; the model of the camera 4 is USB2.0 HD UVC WebCam.
In the invention, preferably, the execution unit 1 comprises a left driving wheel 11, a right driving wheel 11, a crawler 12, a direct-current motor, and a crawler chassis. The execution unit 1 is located below the baffle 5; the left driving wheel 11, right driving wheel 11, and direct-current motor are mounted on the crawler chassis; the crawler 12 is arranged on the outer sides of the left and right driving wheels 11; the left and right driving wheels 11 are in transmission connection with the output end of the direct-current motor; and speed sensors are arranged at their shaft ends.
In the present invention, preferably, the MATLAB module obtains an MJPEG-format image with effective resolution of 320 × 240 pixels by calling the videoinput function; the image is an RGB color picture.
In the present invention, preferably, an image preprocessing module, a morphology processing module, and a route reference line determining module are disposed inside the control cabinet 2. Based on an HSV color space model, the image preprocessing module divides the image into three mutually non-interfering channels: an H channel (hue), an S channel (saturation), and a V channel (brightness); the hue upper limit is Hmax = 0.413.
In the invention, preferably, the route reference line determining module performs secondary screening using the feature points on both sides of the intersection of a horizontal line with the edge, defined respectively as the left feature point left_point_x and the right feature point right_point_x, where (n, m) are the coordinates of a feature point and (n, m) ∈ (i, j), and
left:

left_point_x(i) = min{ n : p(n, m) = 1 }

right:

right_point_x(i) = max{ n : p(n, m) = 1 }

where p(n, m) is the pixel value of the edge. The feature point of the final path is the midpoint between the starting point and the ending point, calculated as:

path_point_x(i) = ( left_point_x(i) + right_point_x(i) ) / 2
in the present invention, preferably, the MATLAB module includes a Hough function, a Houghpeaks function, and a Houghlines function.
In the present invention, preferably, the OPENMV module 3 determines the threshold value through the IDE tool based on the OPENMV visual processing chip.
In the present invention, preferably, the processing unit adopts STM32 hierarchical control.
In the invention, the angle threshold value of the straight running of the crawler chassis is preferably +5 degrees to-3 degrees.
Example 3
Referring to fig. 1-7, the present invention provides a technical solution: a composite visual navigation system applied to inter-row management in complex outdoor environments includes a detection unit comprising a speed sensor, an OPENMV module 3, a MATLAB module, and a camera 4, all electrically connected to the single-chip microcomputer module; the single-chip microcomputer module is electrically connected to the control unit, the control unit to the processing unit, and the processing unit to the execution unit 1.
In the invention, preferably, the camera 4 is located at the top end of the control cabinet 2 with an upward viewing angle of 135°; an OPENMV module 3 is arranged on one side of the control cabinet 2, and a baffle 5 on the other side of the OPENMV module 3; the model of the camera 4 is USB2.0 HD UVC WebCam.
In the invention, preferably, the execution unit 1 comprises a left driving wheel 11, a right driving wheel 11, a crawler 12, a direct-current motor, and a crawler chassis. The execution unit 1 is located below the baffle 5; the left driving wheel 11, right driving wheel 11, and direct-current motor are mounted on the crawler chassis; the crawler 12 is arranged on the outer sides of the left and right driving wheels 11; the left and right driving wheels 11 are in transmission connection with the output end of the direct-current motor; and speed sensors are arranged at their shaft ends.
In the present invention, preferably, the MATLAB module obtains an MJPEG-format image with effective resolution of 320 × 240 pixels by calling the videoinput function; the image is an RGB color picture.
In the invention, preferably, an image preprocessing module, a morphology processing module, and a route reference line determining module are arranged inside the control cabinet 2. Based on an HSV color space model, the image preprocessing module divides the image into three mutually non-interfering channels: an H channel (hue), an S channel (saturation), and a V channel (brightness); the hue value H lies in the range 0.18 to 0.413.
In the invention, preferably, the route reference line determining module performs secondary screening using the feature points on both sides of the intersection of a horizontal line with the edge, defined respectively as the left feature point left_point_x and the right feature point right_point_x, where (n, m) are the coordinates of a feature point and (n, m) ∈ (i, j), and
left:

left_point_x(i) = min{ n : p(n, m) = 1 }

right:

right_point_x(i) = max{ n : p(n, m) = 1 }

where p(n, m) is the pixel value of the edge. The feature point of the final path is the midpoint between the starting point and the ending point, calculated as:

path_point_x(i) = ( left_point_x(i) + right_point_x(i) ) / 2
in the present invention, preferably, the MATLAB module includes a Hough function, a Houghpeaks function, and a Houghlines function.
In the present invention, preferably, the OPENMV module 3 determines the threshold value through the IDE tool based on the OPENMV visual processing chip.
In the present invention, preferably, the processing unit adopts STM32 hierarchical control.
In the invention, the angle threshold value of the straight line running of the crawler chassis is preferably between +5 degrees and-3 degrees.
The working principle and use of the invention are as follows. In use, the camera 4 (model USB2.0 HD UVC WebCam) acquires images at an elevation angle of 135°. The OPENMV module 3 serves as a secondary sensor that captures steering information while the navigation system travels: the system judges from the OPENMV image whether steering information has been acquired; if a left/right-turn judgment is obtained, the corresponding turning program is run to drive the bottom-layer steering. When no steering information is acquired, the MATLAB module acquires path information, and the navigation system performs color-space extraction and segmentation, morphological image processing, and navigation-path fitting; the angle of the fitted navigation line is processed by the processing unit and converted into a PWM signal that drives the execution unit 1 to track the path.
After the camera 4 finishes image acquisition, the MATLAB module calls videoinput('winvideo', 1, 'MJPEG_320x240') to obtain an MJPEG-format image with effective resolution of 320 × 240 pixels. Because outdoor seasonal weeds and irregular pits make tree form and size uneven, the gap between the tree crown and the sky on one side is used as the basis for extracting the road datum line. Image acquisition is triggered by the MATLAB module, and the next acquisition is triggered immediately after each navigation datum line extraction is finished.
The directly obtained color image is an RGB image. To accurately extract green tree and vegetation information, a segmentation method based on HSV is adopted, exploiting the independence of the three HSV variables in color space. The HSV color space model divides the image into three mutually non-interfering channels: an H channel (hue), an S channel (saturation), and a V channel (brightness). After the RGB image is converted to HSV, the threshold of H corresponding to green is determined from the HSV color-ring distribution table: green is mainly concentrated between H values of 0.201 and 0.498, and with the test-statistics upper limit Hmax = 0.413 and lower limit Hmin = 0.18, the green information of the citrus orchard is extracted accurately and with high applicability. The segmentation result is shown in fig. 6; it shows that the segmentation algorithm is accurate and meets actual requirements. Because the HSV color space separates hue from brightness, the influence of the light of the working environment is greatly reduced.
Since the binary image after segmentation contains many black noise points and white holes, and the crown edge is irregular, the morphology processing module, to satisfy both effectiveness and real-time requirements, constructs an octagonal structuring element (defined by the distance from the element origin to the octagon edge) and performs one opening and one closing operation to remove noise and smooth edges, finally obtaining a smooth binary image, as shown in fig. 5.
Because outdoor fruit trees grow irregularly, the center points of the fruit-tree/sky edges are extracted along the vertical coordinate of the image and used as reference points of the navigation path. Assume the original image has size u × v, where v and u are the row and column pixels of the image, so the number of horizontal strips is v; the pixel coordinate point p(i, j) of the characteristic edge obtained on the i-th row is the pixel value at (i, j) on that horizontal strip, from which the coordinate values of the tree-edge information can be extracted. However, the irregular growth of outdoor trees means the extracted characteristic edge may contain edge information on one side only, and the extracted single-side edge information may contain erroneous repeated information. Therefore the sharp point Q(l, k) formed by the edge is taken as the reference point of the path center, and the feature points on both sides where a horizontal line intersects the edge are screened again, defined respectively as the left feature point left_point_x and the right feature point right_point_x, where (n, m) are the coordinates of a feature point and (n, m) ∈ (i, j), and
left:

left_point_x(i) = min{ n : p(n, m) = 1 }

right:

right_point_x(i) = max{ n : p(n, m) = 1 }

where p(n, m) is the pixel value of the edge. The feature point of the final path is the midpoint between the starting point and the ending point, calculated as:

path_point_x(i) = ( left_point_x(i) + right_point_x(i) ) / 2
To handle the varying influence of tree growth conditions, a random sample consensus (RANSAC) algorithm is adopted. First, two points are selected at random and the model equation y = ax + b represented by them is computed; all data points are then tested against this model to compute their errors, and all points satisfying an error threshold are collected. This process is repeated for a certain number of iterations, and the model with the most support is selected as the solution. If the probability that a point in the data set is an inlier is t, then
P = 1 - (1 - t^n)^k
Determining the model requires n points, where n is defined by the problem: n = 2 when fitting a straight line, n = 3 when fitting a plane, n = 3 when solving a rigid transformation between point clouds, n = 4 when solving a projective transformation between images, and so on. Here k is the number of iterations, i.e., the number of times n points are randomly selected to compute the model, and P is the probability of obtaining a correct solution under these parameters. The probability that all n selected points are inliers is t^n, so the probability that at least one of the n points is an outlier is 1 - t^n, and (1 - t^n)^k is the probability that an all-inlier sample is never drawn in any of the k random samples, i.e., that a wrong solution is obtained; hence P = 1 - (1 - t^n)^k. The inlier probability t is a prior value for which a rough robust estimate can be given; even if t is estimated too optimistically, the probability P of a correct solution can still be ensured by increasing the iteration number k. Conversely, the required iteration number k can be computed from the equation above: assuming the desired success probability is P (e.g., a 99% probability of obtaining the correct solution), then
k = log(1 - P) / log(1 - t^n)
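The random sample consensus procedure and the iteration-count formula above can be sketched as follows (a minimal line-fitting illustration; the tolerance, iteration count, and function names are illustrative choices, not values from the patent):

```python
import math
import random

def ransac_line(points, n_iter=200, tol=1.0, seed=0):
    """Fit y = a*x + b by random sampling: repeatedly pick two points,
    build the line through them, and keep the model supported by the
    most inliers (|y - (a*x + b)| <= tol)."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # vertical pair; skip for this simple parametrisation
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = sum(1 for x, y in points if abs(y - (a * x + b)) <= tol)
        if inliers > best_inliers:
            best, best_inliers = (a, b), inliers
    return best, best_inliers

def iterations_needed(P, t, n=2):
    """k = log(1 - P) / log(1 - t**n): iterations needed for success
    probability P when each point is an inlier with probability t and a
    model needs n points (n = 2 for a straight line)."""
    return math.log(1 - P) / math.log(1 - t ** n)

# Points on y = 2x + 1 plus two gross outliers.
pts = [(x, 2 * x + 1) for x in range(10)] + [(3, 40), (7, -30)]
model, support = ransac_line(pts)
print(model, support)  # the line through the 10 collinear points
print(round(iterations_needed(0.99, 0.5), 1))  # 16.0
```

With t = 0.5 and n = 2, about 16 iterations already give a 99% chance of drawing at least one all-inlier pair, which is why RANSAC tolerates the gross outliers produced by uneven crowns.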
Then the Hough function, Houghpeaks function and Houghlines function in the MATLAB module are used for the transform. The binary image is first scanned row by row to obtain the center points of each row of crown-sky gaps, which are voted cumulatively in the parameter space; when the maximum vote exceeds a threshold, a straight line is judged to exist, the points on that line are removed from the image, and the process repeats until the maximum of the accumulated center feature points in the parameter space falls below the threshold, at which point the Houghpeaks function exits. The Houghlines function then determines the longest straight line; the farthest point of the path formed by the tree crowns and the sky is extracted from this longest line as the target point of the citrus orchard path, the bottom center point of the image serves as the reference point of the navigation system, and the reference point and the obtained target point are connected to form the navigation datum line of the citrus orchard visual navigation.
Because the navigation datum line is extracted from the sky background, the turning information at the end of a row cannot be determined accurately from it, so an auxiliary sensing device is added. The OPENMV visual processing chip is a programmable camera that performs color-block tracking efficiently in the Python language, so it can obtain steering information accurately and in real time as an auxiliary sensor. Based on the OPENMV visual processing chip, the threshold is determined with the OPENMV IDE tool and the turn mark is extracted: a blue frame first determines the largest connected region, and when the area of that region exceeds 50% of the sampled picture, the system judges that the turning point of the navigation system has been reached; four equally spaced green frames then determine the center positions of different regions segment by segment, the center points of the four extracted green frames are used to calculate whether the tilt angle toward the farthest point exceeds 10 degrees, and the sign of that angle determines the final turning direction. The program processing time is 0.465 s; recognition is accurate and fast and meets the requirement of identifying the turn mark.
According to field experiments, in the initial state the heading angle is determined from the farthest point of the tree crown, and the angle threshold for straight-line travel of the crawler chassis is set between +5 degrees and -3 degrees. When the deviation exceeds this straight-travel threshold, the crawler chassis is adjusted in real time to about 0 degrees by closed-loop differential speed control. Tests show a starting-point error of 2.862 degrees and an angle RMS error of 2.145 degrees. As can be seen from Fig. 4, when the angle deviation exceeds the threshold the tracks are corrected automatically, showing that the visual navigation feedback system is effective. At about 12 minutes the serial port receives the steering signal transmitted by the OPENMV module and the crawler chassis runs the steering program; the steering action is completed at about 19 minutes and the visual navigation feedback program continues to run. The tests show that the visual navigation system runs normally and can meet the autonomous navigation requirements of inter-row management.
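The cumulative-voting and peak-extraction step performed by the Houghpeaks stage can be sketched in plain Python/NumPy. This is a minimal illustration of the voting idea, not the patented MATLAB implementation; the function name, image size, and vote threshold are assumptions:

```python
import numpy as np

def hough_peak_line(binary, n_theta=180, vote_threshold=20):
    """Accumulate votes for each white pixel over (rho, theta) space and
    return the strongest line, mimicking the Houghpeaks voting step."""
    h, w = binary.shape
    ys, xs = np.nonzero(binary)                      # white (crown-gap) pixels
    thetas = np.deg2rad(np.arange(n_theta))          # angles 0..179 degrees
    diag = int(np.ceil(np.hypot(h, w)))              # max possible |rho|
    acc = np.zeros((2 * diag, n_theta), dtype=int)   # accumulator array
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for x, y in zip(xs, ys):
        rhos = np.round(x * cos_t + y * sin_t).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1           # cumulative voting
    peak = np.unravel_index(acc.argmax(), acc.shape)
    if acc[peak] < vote_threshold:                   # below threshold: exit
        return None
    return peak[0] - diag, float(thetas[peak[1]])    # (rho, theta in radians)
```

A vertical crown-gap column at x = 10 in a 50 × 50 binary image, for example, is recovered as rho = 10 with theta = 0 (the line x·cosθ + y·sinθ = rho at θ = 0).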
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that various changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (6)

1. A composite visual navigation system applied to inter-row management of a complex outdoor environment, comprising a detection unit, characterized in that: the detection unit comprises a speed sensor, an OPENMV module (3), a MATLAB module and a camera; the speed sensor, the OPENMV module (3), the MATLAB module and the camera are all electrically connected with a single-chip microcomputer module; the single-chip microcomputer module is electrically connected with a control unit; the control unit is electrically connected with a processing unit; and the processing unit is electrically connected with an execution unit (1);
the camera is arranged at the top end of the control cabinet (2) with an upward viewing angle of 135 degrees; the OPENMV module (3) is arranged on one side of the control cabinet (2), and a baffle (5) is arranged on the other side of the OPENMV module (3); the camera is a USB 2.0 HD UVC webcam;
an image preprocessing module, a morphology processing module and a route datum line determining module are arranged in the control cabinet (2), the image preprocessing module divides an image into three mutually non-interfering channels, namely an H channel, an S channel and a V channel based on an HSV color space model, and the hue value H is within the range of 0.18-0.413; the H channel is the hue of the color, the S channel is the saturation of the color, and the V channel is the brightness of the color;
the OPENMV module (3) determines a threshold value through an IDE tool based on an OPENMV visual processing chip;
and the route datum line determining module performs secondary screening through the feature points on the two sides of the intersection of the horizontal line and the edge; the feature points are respectively defined as a left-side feature point left_point_x and a right-side feature point right_point_x, where (n, m) are the coordinates of the feature point, (n, m) ∈ (i, j), and
left:
Figure FDA0003885652080000011
and (3) right:
Figure FDA0003885652080000012
in the formula, p(n, m) is the pixel value of the edge; the feature point of the final path is the midpoint between the starting point and the ending point, and is calculated as:
Figure FDA0003885652080000013
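The formula images above (Figure FDA...) are not reproduced in this text, so the following pure-Python sketch shows one plausible reading of the midpoint computation in claim 1: per scan row, the leftmost and rightmost edge pixels are taken as left_point_x and right_point_x, and their midpoint is the final path feature point. The function name and row-scanning convention are assumptions:

```python
def path_feature_points(edge):
    """For each scan row m of a binary edge image, take the leftmost and
    rightmost edge pixels (left_point_x, right_point_x) and return their
    midpoint as the final path feature point for that row."""
    points = []
    for m, row in enumerate(edge):
        cols = [n for n, p in enumerate(row) if p]   # n where p(n, m) != 0
        if cols:                                     # row intersects the edge
            left_x, right_x = cols[0], cols[-1]
            points.append(((left_x + right_x) // 2, m))
    return points
```

The string of per-row midpoints can then be fitted into the navigation datum line described earlier.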
2. The composite visual navigation system applied to inter-row management of a complex outdoor environment according to claim 1, wherein: the execution unit (1) comprises left and right drive wheels (11), tracks (12), a direct-current motor and a track chassis; the execution unit (1) is located below the baffle (5); the left and right drive wheels (11) and the direct-current motor are arranged on the track chassis; the tracks (12) are arranged outside the left and right drive wheels (11); the left and right drive wheels (11) are in transmission connection with the output end of the direct-current motor; and the speed sensor is arranged at the shaft ends of the left and right drive wheels (11).
3. The composite visual navigation system applied to inter-row management of a complex outdoor environment according to claim 1, wherein: the MATLAB module acquires an image in MJPEG format with effective pixels of 320 × 240 by calling the videoinput function, the image being an RGB color picture.
4. The composite visual navigation system applied to inter-row management of a complex outdoor environment according to claim 1, wherein: the MATLAB module comprises a Hough function, a Houghpeaks function and a Houghlines function.
5. The composite visual navigation system applied to inter-row management of a complex outdoor environment according to claim 1, wherein: the processing unit adopts STM32 hierarchical control.
6. The composite visual navigation system applied to inter-row management of a complex outdoor environment according to claim 2, wherein: the angle threshold for straight-line travel of the crawler chassis is between +5 degrees and -3 degrees.
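The end-of-row turn decision described in the specification (a turn-mark region exceeding 50% of the frame, then a tilt of the farthest ROI centre beyond 10 degrees, with its sign giving the direction) can be sketched as follows. The function name, ROI layout and angle convention are assumptions for illustration, not the patented OPENMV implementation:

```python
import math

def turn_decision(blob_area, frame_area, roi_centers, angle_threshold=10.0):
    """Return None until the turn mark fills more than 50% of the frame,
    then decide the turn direction from the tilt of the farthest ROI centre
    relative to the nearest one (image y grows downward)."""
    if blob_area <= 0.5 * frame_area:
        return None                       # turning point not yet reached
    x_near, y_near = roi_centers[0]       # ROI centre nearest the vehicle
    x_far, y_far = roi_centers[-1]        # farthest ROI centre
    # tilt from the vertical image axis, in degrees
    angle = math.degrees(math.atan2(x_far - x_near, y_near - y_far))
    if abs(angle) <= angle_threshold:
        return "straight"
    return "left" if angle < 0 else "right"
```

On the real system the ROI centres would come from the four green frames found by the OPENMV blob tracker, and the returned direction would be sent to the chassis over the serial port.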
CN202110878640.2A 2021-08-02 2021-08-02 Composite visual navigation system applied to inter-row management of complex outdoor environment Active CN113432597B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110878640.2A CN113432597B (en) 2021-08-02 2021-08-02 Composite visual navigation system applied to inter-row management of complex outdoor environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110878640.2A CN113432597B (en) 2021-08-02 2021-08-02 Composite visual navigation system applied to inter-row management of complex outdoor environment

Publications (2)

Publication Number Publication Date
CN113432597A CN113432597A (en) 2021-09-24
CN113432597B true CN113432597B (en) 2022-12-13

Family

ID=77762469

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110878640.2A Active CN113432597B (en) 2021-08-02 2021-08-02 Composite visual navigation system applied to inter-row management of complex outdoor environment

Country Status (1)

Country Link
CN (1) CN113432597B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN211552867U (en) * 2020-02-29 2020-09-22 华南理工大学 Visual navigation system for assisting unmanned trolley
CN112506208A (en) * 2020-12-22 2021-03-16 江苏理工学院 Transport robot system and control method
CN112806342A (en) * 2021-03-15 2021-05-18 江西农业大学 Automatic targeted pesticide application control system based on openMV

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7570783B2 (en) * 2005-07-01 2009-08-04 Deere & Company Method and system for vehicular guidance using a crop image
CN105706637A (en) * 2016-03-10 2016-06-29 西北农林科技大学 Autonomous-navigation crawler-type multi-mechanical-arm apple picking robot
CN108509928A (en) * 2018-04-09 2018-09-07 石河子大学 For Cold region apple jujube garden field pipe operation vision guided navigation path extraction method
CN109085845B (en) * 2018-07-31 2020-08-11 北京航空航天大学 Autonomous air refueling and docking bionic visual navigation control system and method
DE102019122755A1 (en) * 2019-08-23 2021-02-25 Horsch Leeb Application Systems Gmbh Safety system and method for operating a mobile agricultural work machine
CN110554706A (en) * 2019-09-25 2019-12-10 江苏理工学院 visual navigation self-balancing vehicle and balancing method
CN112544204A (en) * 2020-12-03 2021-03-26 武汉理工大学 Weeding robot based on RISC-V embedded processor and control method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN211552867U (en) * 2020-02-29 2020-09-22 华南理工大学 Visual navigation system for assisting unmanned trolley
CN112506208A (en) * 2020-12-22 2021-03-16 江苏理工学院 Transport robot system and control method
CN112806342A (en) * 2021-03-15 2021-05-18 江西农业大学 Automatic targeted pesticide application control system based on openMV

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Design and production of an intelligent vehicle based on OpenMV image recognition; Wang Zechuan et al.; Electronics World; 2019-12-30 (No. 24); full text *

Also Published As

Publication number Publication date
CN113432597A (en) 2021-09-24

Similar Documents

Publication Publication Date Title
Bai et al. Vision-based navigation and guidance for agricultural autonomous vehicles and robots: A review
CN110243372B (en) Intelligent agricultural machinery navigation system and method based on machine vision
Meng et al. Development of agricultural implement system based on machine vision and fuzzy control
AU2020103026A4 (en) A Single Tree Crown Segmentation Algorithm Based on Super-pixels and Topological Features in Aerial Images
US20060095207A1 (en) Obstacle detection using stereo vision
CN110765916B (en) Farmland seedling ridge identification method and system based on semantics and example segmentation
Juman et al. A novel tree trunk detection method for oil-palm plantation navigation
CN102914967A (en) Autonomous navigation and man-machine coordination picking operating system of picking robot
CN103891697A (en) Drug spraying robot capable of moving indoors autonomously and variable drug spraying method thereof
CN106969730A (en) A kind of top fruit sprayer volume measuring method based on unmanned plane Detection Techniques
Smitt et al. Pathobot: A robot for glasshouse crop phenotyping and intervention
Singh et al. Comprehensive automation for specialty crops: Year 1 results and lessons learned
CN111727457B (en) Cotton crop row detection method and device based on computer vision and storage medium
Lin et al. Automatic detection of plant rows for a transplanter in paddy field using faster r-cnn
CN109753075B (en) Agriculture and forestry park robot navigation method based on vision
CN108509928A (en) For Cold region apple jujube garden field pipe operation vision guided navigation path extraction method
CN113432597B (en) Composite visual navigation system applied to inter-row management of complex outdoor environment
CN114485667A (en) Light and intelligent orchard ground navigation method
Peng et al. A combined visual navigation method for greenhouse spray robot
Benet et al. Multi-sensor fusion method for crop row tracking and traversability operations
CN115560754A (en) Visual navigation method based on weed removal
Chen et al. Measurement of the distance from grain divider to harvesting boundary based on dynamic regions of interest.
CN114625114A (en) Ground spraying system traveling path planning method based on machine vision
CN107862248A (en) A kind of vehicle and the recognition methods of pedestrian
Bakken et al. Robot-supervised learning of crop row segmentation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant