CN115683102A - Unmanned agricultural machinery navigation method, equipment, device and storage medium - Google Patents
- Publication number: CN115683102A
- Application number: CN202210635783.5A
- Authority
- CN
- China
- Prior art keywords
- unmanned agricultural
- agricultural machine
- image
- coordinates
- point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Analysis (AREA)
Abstract
The invention discloses a navigation method, device, apparatus and storage medium for an unmanned agricultural machine, wherein the method comprises the following steps: determining the geodetic coordinates of characteristic corner points of graphic signs arranged on, or on both sides of, the driving path of the unmanned agricultural machine; positioning the unmanned agricultural machine during driving to obtain its geodetic coordinates, collecting an image of the surrounding environment, extracting the edge contour of the pattern in the image using an image edge detection technique so as to identify the vertices of the pattern, taking the vertices as characteristic corner points, and calculating the pixel coordinates of the characteristic corner points; and correcting and calibrating the geodetic coordinates of the unmanned agricultural machine using the geodetic coordinates and the pixel coordinates of the characteristic corner points. The method corrects the position coordinates of the unmanned agricultural machine with computer vision, without depending on vehicle-mounted RTK equipment and technology, achieving centimetre-level high-precision positioning of the unmanned agricultural machine while reducing equipment cost and improving positioning and navigation accuracy.
Description
Technical Field
The invention relates to the fields of computer vision and assisted automatic driving, and more particularly to an unmanned agricultural machine navigation method, device, apparatus and storage medium.
Background
China is a major agricultural country whose cultivated land accounts for about 7 percent of the world's total. However, compared with developed countries in Europe and America, its degree of agricultural mechanization is still relatively low, and the industry has considerable room to develop. Meanwhile, as urbanization deepens, more and more young rural workers leave the countryside for city jobs, so the rural young labor force keeps shrinking, which strongly impacts agricultural operations and the rural economy. Unmanned agricultural machinery has therefore become one of the important application scenarios of automatic (unmanned) driving technology in recent years. Supported by automatic driving technology, agricultural machines can perform diversified operations such as automatic plowing, leveling, sowing and harvesting, largely making up for the shortage of young labor and professional operators, which is of great significance for improving the efficiency and automation level of farming operations.
Because of its precision-operation requirements, an unmanned agricultural machine demands high navigation and positioning accuracy; it generally obtains and maintains centimetre-level positioning using on-board Real-Time Kinematic (RTK) carrier-phase differential measurement equipment, which underpins driving, control, path planning and other operations of the machine. However, wide application of RTK equipment in farming still faces limitations today. First, the cost of using RTK remains high: when relying on third-party commercial services, users must pay substantial equipment and service fees, and this cost becomes even more pronounced when many machines are deployed. Second, many remote areas currently have no RTK service coverage; if users erect their own RTK reference stations and communication facilities, they must also build and maintain them themselves, adding labor and equipment costs. How to obtain an accurate, reliable and cost-controllable high-precision position reference has therefore become one of the prominent pain points for large-scale application of unmanned agricultural machinery.
Disclosure of Invention
In order to solve the above technical problems, or at least partially solve them, the present disclosure provides an unmanned agricultural machine navigation method, device, apparatus and storage medium, which achieve accurate and reliable navigation and positioning of an unmanned agricultural machine without depending on vehicle-mounted RTK equipment and technology.
An embodiment of the invention provides an unmanned agricultural machine navigation method, comprising the following steps: determining the geodetic coordinates of characteristic corner points of graphic signs arranged on, or on both sides of, the driving path of the unmanned agricultural machine; positioning the unmanned agricultural machine during driving to obtain its geodetic coordinates, collecting an image of the surrounding environment, extracting the edge contour of the pattern in the image using an image edge detection technique so as to identify the vertices of the pattern, taking the vertices as characteristic corner points, and calculating the pixel coordinates of the characteristic corner points; and correcting and calibrating the geodetic coordinates of the unmanned agricultural machine using the geodetic coordinates and the pixel coordinates of the characteristic corner points.
Preferably, determining the geodetic coordinates of the characteristic corner points of the graphic signs arranged on, or on both sides of, the driving path of the unmanned agricultural machine comprises: statically measuring and collecting the coordinate position of the graphic sign over a period of time with an RTK high-precision receiver, averaging the coordinate positions of the graphic sign collected over that period, and calculating the geodetic coordinates of the characteristic corner points of the graphic sign.
Preferably, extracting the edge contour of the pattern in the surrounding environment image using an image edge detection technique comprises: calculating the gradient and the angle of the pattern in the surrounding environment image; performing non-maximum suppression on the gradient; and connecting the edges of the pattern using a double threshold until the complete edge contour of the pattern is extracted.
Preferably, the calculating of the gradient and the angle of the pattern in the surrounding environment image comprises: calculating the gradient and the angle of the pattern in the surrounding environment image according to the following formulas:

G = sqrt((∂f/∂x)² + (∂f/∂y)²), θ = arctan((∂f/∂y) / (∂f/∂x))

wherein x is the abscissa of a pixel point of the pattern in the surrounding environment image, y is the ordinate of a pixel point of the pattern in the surrounding environment image, f is the gray value of the pattern in the surrounding environment image, G is the gradient modulus, and θ, calculated at all pixel points of the pattern in the surrounding environment image, forms the angle matrix.
Preferably, the non-maximum suppression of the gradient comprises: judging whether the gray value of the currently detected point C in the gradient image is the maximum within its 8-connected neighborhood; if so, further checking whether the gray values of the first interpolated point dTmp1 and the second interpolated point dTmp2 along the gradient direction are greater than that of C; if C is greater than both dTmp1 and dTmp2, determining that C is a maximum and setting its value to 1, otherwise determining that C is a non-maximum and setting its value to 0; and traversing all points C in the gradient image to find the local maxima of the pixel points, completing the non-maximum suppression of the gradient.
Preferably, the edges of the pattern are connected using a double threshold until the complete edge contour of the pattern is extracted, the two thresholds comprising a low threshold and a high threshold: a point below the low threshold is regarded as a false edge and set to 0, and a point above the high threshold is regarded as a strong edge and set to 1. Starting from the high-threshold points in the image, these points are first connected into a contour; when a breakpoint of the contour is reached, the algorithm searches the 8-neighborhood of the breakpoint for a point satisfying the low threshold and collects new edge points from it, until the contour of the whole image is closed.
Preferably, the correcting and calibrating of the geodetic coordinates of the unmanned agricultural machine using the geodetic coordinates and the pixel coordinates of the characteristic corner points comprises: calculating the corrected coordinates of the unmanned agricultural machine according to the following formula:

min (1/2) Σᵢ Σⱼ ‖e_ij‖² = min (1/2) Σᵢ Σⱼ ‖z′_ij − h(T_i, p′_j)‖²

wherein p′_j are the geodetic coordinates of the characteristic corner points, p_j are the corrected coordinates of the unmanned agricultural machine to be optimized, z′_ij is the pixel measurement data generated when the unmanned agricultural machine observes the characteristic corner point p′_j at pose T_i, e_ij is the error function, and h(T_i, p_j) is the projection function from world coordinates to pixel coordinates.
In another aspect, an embodiment of the invention provides an unmanned agricultural machine navigation apparatus, comprising: a characteristic-corner-point geodetic coordinate determination device for determining the geodetic coordinates of characteristic corner points of graphic signs arranged on, or on both sides of, the driving path of the unmanned agricultural machine; a characteristic-corner-point pixel coordinate calculation device for positioning the unmanned agricultural machine during driving to obtain its geodetic coordinates, acquiring an image of the surrounding environment, extracting the edge contour of the pattern in the image using an image edge detection technique, identifying the vertices of the pattern, taking the vertices as characteristic corner points, and calculating the pixel coordinates of the characteristic corner points; and a coordinate calibration device for correcting and calibrating the geodetic coordinates of the unmanned agricultural machine using the geodetic coordinates and the pixel coordinates of the characteristic corner points.
In another aspect, an embodiment of the present invention further provides an unmanned agricultural machine navigation device, the device comprising a processor and a memory, the memory including processor-executable program instructions that, when executed by the processor, cause the device to perform the following operations: determining the geodetic coordinates of characteristic corner points of graphic signs arranged on, or on both sides of, the driving path of the unmanned agricultural machine; positioning the unmanned agricultural machine during driving to obtain its geodetic coordinates, collecting an image of the surrounding environment, extracting the edge contour of the pattern in the image using an image edge detection technique so as to identify the vertices of the pattern, taking the vertices as characteristic corner points, and calculating the pixel coordinates of the characteristic corner points; and correcting and calibrating the geodetic coordinates of the unmanned agricultural machine using the geodetic coordinates and the pixel coordinates of the characteristic corner points.
In yet another aspect, an embodiment of the present invention provides a computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, implementing the method described in any of the above aspects.
Compared with the prior art, the technical solution provided by the embodiments of the disclosure has the following advantages. Graphic signs with characteristic corner points are arranged along the driving path of the unmanned agricultural machine; during driving, an image edge detection technique identifies the characteristic corner points of the signs and computes their pixel coordinates; the geodetic coordinates and the pixel coordinates of the corner points are then used to correct and calibrate the geodetic coordinates of the machine. The position coordinates of the unmanned agricultural machine can thus be corrected with computer vision, without depending on vehicle-mounted RTK equipment and technology, achieving centimetre-level high-precision positioning that meets the machine's high-precision operation requirements while greatly reducing equipment cost and improving positioning and navigation accuracy.
Additional features and advantages of embodiments of the present invention will be described in the detailed description which follows.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
In order to illustrate the embodiments of the present disclosure, or the technical solutions in the prior art, more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below; it will be apparent to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a flow chart of an unmanned agricultural vehicle navigation method according to an embodiment of the present disclosure;
FIG. 2 is an exemplary graphical sign according to an embodiment of the present disclosure;
FIG. 3 is a schematic view of a graphical sign arrangement according to an embodiment of the present disclosure;
FIG. 4 is a schematic illustration of non-maximum suppression of a gradient according to an embodiment of the disclosure;
FIG. 5 is a block diagram of an unmanned agricultural vehicle navigation apparatus according to an embodiment of the present disclosure;
fig. 6 is a schematic view of an unmanned agricultural machine navigation device according to an embodiment of the present disclosure.
Detailed Description
The present disclosure will be described in detail below with reference to the accompanying drawings and specific embodiments in order to enable those skilled in the art to better understand the technical solutions of the present disclosure. Embodiments of the disclosure are described in further detail below with reference to the figures and the detailed description, but the disclosure is not limited thereto.
With the rapid construction of China's autonomous BeiDou satellite navigation system, the BeiDou-3 satellite navigation system has begun to provide open, free sub-meter Satellite-Based Augmentation System (SBAS) service and decimetre-level Precise Point Positioning (PPP) service to industry and the public. Furthermore, unmanned agricultural machines are generally equipped, or can readily be equipped, with devices such as an Inertial Navigation System (INS) and camera sensors. The inventors therefore propose that, by combining the BeiDou SBAS/PPP services with inertial and visual navigation means, an unmanned agricultural machine can obtain continuous, stable and reliable navigation and positioning results with decimetre-level absolute accuracy.
Based on this, as shown in fig. 1, the invention provides an unmanned agricultural machinery navigation method, which comprises the following steps:
s101, determining geodetic coordinates of characteristic corner points of graphic signs arranged on a driving path or on two sides of the driving path of the unmanned agricultural machine;
On the driving path of the unmanned agricultural machine, or on both sides of it, graphic signs are arranged whose characteristic corner points can be clearly identified by computer vision.
In some embodiments, the graphic signs may take the form of squares, rectangles, triangles, two-dimensional codes and the like, as shown in fig. 2.
When working in the field, the unmanned agricultural machine usually adopts A/B-line operation, driving from point A to point B along a preset path, as shown in fig. 3. Thus, in some embodiments, the graphic signs can be arranged on the driving path of the unmanned agricultural machine, such as at the two ends of a ridge, or on both sides of the driving path, such as along the two sides of a ridge.
In a preferred embodiment, in order to obtain the geodetic coordinates of the graphic sign accurately and to correct the subsequent positioning results of the unmanned agricultural machine, the coordinate position of the graphic sign can be statically measured and collected in advance over a period of time, for example with a high-precision satellite positioning receiver; the coordinate positions collected over that period are averaged, and the geodetic coordinates of the characteristic corner points of the graphic sign are calculated.
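The static-averaging step above can be sketched in a few lines. The following is a minimal illustration, not taken from the patent: the function name and the (lat, lon, height) tuple layout are assumptions of this sketch.

```python
import numpy as np

def average_static_fixes(fixes):
    """Average static geodetic fixes (lat_deg, lon_deg, height_m)
    collected over a period of time to estimate the coordinates of a
    graphic sign's characteristic corner point."""
    arr = np.asarray(list(fixes), dtype=float)
    if arr.size == 0:
        raise ValueError("no fixes collected")
    return tuple(arr.mean(axis=0))

# Three noisy static fixes around the same corner point
fixes = [(30.0000001, 120.0000002, 15.02),
         (29.9999999, 119.9999998, 14.98),
         (30.0000000, 120.0000000, 15.00)]
lat, lon, h = average_static_fixes(fixes)
```

For rigor one would average in a local ENU frame rather than in raw latitude/longitude, but plain averaging is adequate for the centimetre-scale scatter of a static survey.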
Further preferably, when collecting the coordinate position of a characteristic corner point, the phase center of the satellite positioning receiver antenna should coincide as closely as possible with the characteristic corner point of the graphic sign, so that the geodetic coordinates of the corner point are measured accurately. For example, the satellite positioning receiver antenna is placed at a vertex of a square or rectangular sign. If the phase center of the antenna cannot be made to coincide accurately with the characteristic corner point, surveying personnel can calibrate and correct the offset manually.
S102, positioning the unmanned agricultural machine during driving to obtain its geodetic coordinates, collecting an image of the surrounding environment, extracting the edge contour of the pattern in the image using an image edge detection technique so as to identify the vertices of the pattern, taking the vertices as characteristic corner points, and calculating the pixel coordinates of the characteristic corner points;
In some embodiments, the two-dimensional pixel coordinates of the characteristic corner points may be obtained as follows: images of the surrounding environment are acquired by a camera sensor mounted on the unmanned agricultural machine; for each frame, an image edge detection technique extracts the edge contours of the graphics or patterns, and the vertices of the graphics or patterns are identified and used as characteristic corner points. The edge detection operator may be the Canny operator, the Roberts operator, the Sobel operator, the Marr-Hildreth operator, or the like.
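As a concrete illustration of the gradient stage that underlies any of these operators, the Roberts cross operator, the simplest of those listed, can be implemented with plain NumPy. This is an illustrative sketch, not the operator the patent mandates:

```python
import numpy as np

def roberts_edges(img):
    """Roberts cross operator: diagonal-difference kernels.
    Returns the gradient-magnitude image (zero along the last
    row/column, where the 2x2 kernel does not fit)."""
    img = img.astype(float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:-1, :-1] = img[:-1, :-1] - img[1:, 1:]   # f(x,y) - f(x+1,y+1)
    gy[:-1, :-1] = img[1:, :-1] - img[:-1, 1:]   # f(x+1,y) - f(x,y+1)
    return np.hypot(gx, gy)

# Synthetic frame: a bright square sign on a dark background
img = np.zeros((8, 8))
img[2:6, 2:6] = 255.0
mag = roberts_edges(img)  # large only along the square's border
```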
To find an optimal edge (identifying as many of the actual edges in the image as possible, with the identified edges lying as close as possible to the edges in the actual image), in a preferred embodiment, extracting the edge contour of the pattern in the surrounding environment image using an image edge detection technique may comprise the following steps:
s201, calculating the gradient and the angle of the pattern in the surrounding environment image.
S202, performing non-maximum suppression on the gradient;
and S203, connecting the edges of the pattern by using the double thresholds until the complete edge profile of the pattern is extracted.
In step S201, the gradient is a very important concept in artificial intelligence, running through the fields of machine learning and deep learning. The first-order derivative of a one-dimensional function is defined as formula (1):

f′(x) = lim_{ε→0} [f(x + ε) − f(x)] / ε (1)

where f(x) is a differentiable function of the unknown x, x is the unknown, and ε is a small increment of the unknown.
Filtering of an image is usually based on its gray-scale map, and the image is two-dimensional. Therefore, two-dimensional differentiation is required, i.e. formulas (2) and (3):

∂f(x, y)/∂x = lim_{ε→0} [f(x + ε, y) − f(x, y)] / ε (2)

∂f(x, y)/∂y = lim_{ε→0} [f(x, y + ε) − f(x, y)] / ε (3)

where f(x, y) is a two-dimensional function of the unknowns x and y.
As the above formulas show, the image gradient consists of the partial derivatives of the current pixel point along the X and Y axes; in the image processing field the gradient can therefore be understood as the rate of change of pixel gray values.
The modulus of the gradient represents the amount by which f(x, y) increases per unit distance in the direction of its maximum rate of change, i.e.:

G = sqrt((∂f/∂x)² + (∂f/∂y)²) (4)

where G is the modulus of the image gradient.
The calculation of the gradient angle is simpler and serves as the direction reference for non-maximum suppression. The calculation formula is:

θ = arctan((∂f/∂y) / (∂f/∂x)) (5)

wherein x is the abscissa of a pixel point in the image, y is the ordinate of a pixel point in the image, and f is the image gray value; θ, calculated at all pixel points, forms the angle matrix.
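Formulas (4) and (5) translate directly into array operations. A minimal NumPy sketch follows, using forward differences for the partial derivatives and `arctan2` instead of `arctan` so that the angle keeps its quadrant (both choices are ours, not prescribed by the text):

```python
import numpy as np

def gradient_and_angle(f):
    """Gradient modulus G (formula (4)) and angle matrix theta
    (formula (5)) of a grayscale image f, using forward differences
    for the partial derivatives."""
    f = f.astype(float)
    gx = np.zeros_like(f)
    gy = np.zeros_like(f)
    gx[:, :-1] = f[:, 1:] - f[:, :-1]   # df/dx ~ f(x+1,y) - f(x,y)
    gy[:-1, :] = f[1:, :] - f[:-1, :]   # df/dy ~ f(x,y+1) - f(x,y)
    G = np.hypot(gx, gy)                # modulus of the gradient
    theta = np.arctan2(gy, gx)          # angle matrix used by NMS
    return G, theta

ramp = np.tile(np.arange(5.0), (5, 1))  # gray rises by 1 per column
G, theta = gradient_and_angle(ramp)
```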
In step S202, the gradient is subjected to non-maximum suppression.
The gradient obtained in step S201 suffers from coarse, wide edges, weak-edge interference, and similar problems. Non-maximum suppression can therefore be used to search for the local maxima of the pixel points and set the gray values corresponding to non-maxima to 0, eliminating most non-edge pixel points.
As shown in FIG. 4, C represents the currently detected point, and g1-g4 are points in its 8-connected neighborhood. The oblique line in the figure indicates the gradient direction of point C calculated in the previous step.
Whether the gray value of the currently detected point C in the gradient image is the maximum within its 8-connected neighborhood is judged; if so, it is further checked whether the gray values of the first interpolated point dTmp1 and the second interpolated point dTmp2 along the gradient direction are greater than that of C. If C is greater than both dTmp1 and dTmp2, C is determined to be a maximum and its value is set to 1; otherwise C is a non-maximum and its value is set to 0. All points C in the gradient image are traversed to find the local maxima of the pixel points, completing the non-maximum suppression of the gradient.
It should be noted that the intersection points along the gradient direction do not necessarily fall on the 8 integer positions of the 8-neighborhood, so in practice dTmp1 and dTmp2 are gray values obtained by bilinear interpolation between two adjacent points.
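The suppression step can be sketched as follows. For simplicity this illustration quantizes the gradient direction to the nearest 45° sector and compares against the two neighbours of that sector, rather than forming dTmp1/dTmp2 by bilinear interpolation as described above; the structure of the traversal is otherwise the same:

```python
import numpy as np

def non_max_suppression(G, theta):
    """Keep pixel (i, j) only if its gradient modulus G[i, j] is not
    smaller than its two neighbours along the gradient direction
    theta (radians); all other pixels are set to 0."""
    out = np.zeros_like(G)
    angle = (np.rad2deg(theta) + 180.0) % 180.0
    rows, cols = G.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            a = angle[i, j]
            if a < 22.5 or a >= 157.5:      # gradient ~ horizontal
                n1, n2 = G[i, j - 1], G[i, j + 1]
            elif a < 67.5:                   # ~ 45 degrees
                n1, n2 = G[i - 1, j + 1], G[i + 1, j - 1]
            elif a < 112.5:                  # gradient ~ vertical
                n1, n2 = G[i - 1, j], G[i + 1, j]
            else:                            # ~ 135 degrees
                n1, n2 = G[i - 1, j - 1], G[i + 1, j + 1]
            if G[i, j] >= n1 and G[i, j] >= n2:
                out[i, j] = G[i, j]
    return out

# A blurred vertical edge: the 3-pixel-wide ridge thins to 1 pixel
G = np.zeros((5, 5))
G[:, 1], G[:, 2], G[:, 3] = 4.0, 10.0, 4.0
thinned = non_max_suppression(G, np.zeros_like(G))
```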
In step S203, after the processing of steps S201 and S202, the edge quality of the pattern is already high, but many false edges still remain. To remove the false edges, the edges of the pattern can be connected using a double threshold. Specifically, two thresholds are chosen, a low threshold and a high threshold: a point below the low threshold is regarded as a false edge and set to 0, a point above the high threshold is regarded as a strong edge and set to 1, and pixel points between the two thresholds need to be examined further.
Starting from the high-threshold points in the image, these points are first connected into a contour; when a breakpoint of the contour is reached, the 8-neighborhood of the breakpoint is searched for a point satisfying the low threshold, and new edge points are collected from it, until the contour of the whole image is closed.
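The double-threshold linking can be sketched as a breadth-first growth from the strong-edge seeds. In this illustration (the function and parameter names are ours, not the patent's), a weak point survives only when it is 8-connected to a strong edge:

```python
import numpy as np
from collections import deque

def hysteresis_threshold(nms, low, high):
    """Double-threshold edge linking on a non-maximum-suppressed
    gradient image: points >= high are strong edges (seeds); points
    in [low, high) survive only if 8-connected to a strong edge;
    points < low are discarded as false edges."""
    strong = nms >= high
    weak = (nms >= low) & ~strong
    edges = strong.copy()
    queue = deque(zip(*np.nonzero(strong)))
    rows, cols = nms.shape
    while queue:                        # grow contours from the seeds
        i, j = queue.popleft()
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if (0 <= ni < rows and 0 <= nj < cols
                        and weak[ni, nj] and not edges[ni, nj]):
                    edges[ni, nj] = True
                    queue.append((ni, nj))
    return edges

nms = np.array([[0., 0., 0., 0.],
                [9., 5., 5., 0.],    # strong seed plus two weak points
                [0., 0., 0., 0.],
                [0., 0., 5., 0.]])   # isolated weak point -> dropped
edges = hysteresis_threshold(nms, low=4.0, high=8.0)
```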
And S204, identifying and calculating the pixel coordinates of the characteristic corner points on the edge contour lines of the pattern.
For a pixel point j on the edge contour line, denote its pixel coordinates (u_j, v_j), and those of its two adjacent contour points (u_{j−1}, v_{j−1}) and (u_{j+1}, v_{j+1}). Form the two segment vectors v₁ = (u_{j−1} − u_j, v_{j−1} − v_j) and v₂ = (u_{j+1} − u_j, v_{j+1} − v_j), and the included angle

θ_j = arccos(v₁ · v₂ / (‖v₁‖ ‖v₂‖)).

On a straight stretch of contour, θ_j is close to 180°; if θ_j instead falls near the threshold Γ, pixel point j is a characteristic corner point of the pattern.
The threshold Γ should be set according to the geometry of the graphic sign. For example, when the sign is square or rectangular, Γ should be set close to 90°; when the sign is an equilateral triangle, Γ should be set close to 60°.
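The corner test at the contour points can be sketched as follows; the tolerance parameter `tol` is an assumption of this illustration, not specified by the text:

```python
import numpy as np

def contour_angle(p_prev, p, p_next):
    """Included angle (degrees) at contour point p between the
    segments to its two neighbouring contour points: ~180 on a
    straight run, ~90 at a square's corner, ~60 at an equilateral
    triangle's corner."""
    v1 = np.asarray(p_prev, float) - np.asarray(p, float)
    v2 = np.asarray(p_next, float) - np.asarray(p, float)
    c = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def is_corner(p_prev, p, p_next, gamma, tol=15.0):
    """Flag p as a characteristic corner point when its included
    angle lies within tol degrees of the expected corner angle
    gamma (e.g. 90 for a square sign, 60 for a triangular one)."""
    return abs(contour_angle(p_prev, p, p_next) - gamma) <= tol
```

For example, `is_corner((0, 1), (0, 0), (1, 0), gamma=90.0)` holds, while the same test on three collinear contour points does not.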
Further preferably, before step S201, the collected surrounding environment image can be denoised with Gaussian filtering. Gaussian filtering is a linear smoothing filter that can eliminate Gaussian white noise and is widely applied to noise reduction in image processing.
And S103, correcting and calibrating the geodetic coordinates of the unmanned agricultural machine by using the geodetic coordinates and the pixel coordinates of the characteristic corner points.
This is in effect the process of building and solving a constrained Bundle Adjustment (BA) model. As in the classical Simultaneous Localization and Mapping (SLAM) problem, taking into account the intrinsic and extrinsic parameters and the distortion of the camera mounted on the unmanned agricultural machine, projecting a point p in the world coordinate system into pixel coordinates requires the following steps:
s301, converting the world coordinates into camera coordinates, where the camera external parameters (R, t):
P′=Rp+t=[X′,Y′,Z′] T (6)
wherein P 'is a point [ X', Y ', Z' which is located on the physical imaging plane O '-X' -Y 'after the point P in the world coordinate system passes through the pinhole O projection'] T Is the coordinate of P', R is the rotation matrix from world coordinates to camera coordinates, and t is the translation vector from world coordinates to camera coordinates.
S302, projecting P′ onto the normalization plane to obtain the normalized coordinates:

P_c = [u_c, v_c, 1]ᵀ = [X′/Z′, Y′/Z′, 1]ᵀ (7)

wherein P_c is the point at which P′ is projected onto the normalized plane, and [u_c, v_c, 1]ᵀ are the coordinates of P_c.
S303, applying the distortion model to the normalized coordinates to obtain the distorted coordinates [u′_c, v′_c]ᵀ; for the moment only radial distortion is considered:

u′_c = u_c (1 + k₁ r_c² + k₂ r_c⁴), v′_c = v_c (1 + k₁ r_c² + k₂ r_c⁴), r_c² = u_c² + v_c² (8)

wherein k₁ and k₂ are the distortion-correction polynomial parameters and r_c is the distance from the normalized point to the optical axis.
S304, calculating the pixel coordinates according to the intrinsic model:

u_s = f_x u′_c + c_x, v_s = f_y v′_c + c_y (9)

wherein f_x and f_y are the focal lengths in pixels and (c_x, c_y) is the principal point of the camera.
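Steps S301-S304 compose into the projection function h(T, p) used later in the bundle adjustment. A minimal sketch follows; the parameter layout is an assumption of this illustration, and only the radial distortion terms k1, k2 are modelled:

```python
import numpy as np

def project(p_world, R, t, K, k1=0.0, k2=0.0):
    """h(T, p): world point -> camera frame (S301) -> normalized
    plane (S302) -> radial distortion (S303) -> pixel coords (S304)."""
    X, Y, Z = R @ np.asarray(p_world, float) + t   # S301: P' = R p + t
    u_c, v_c = X / Z, Y / Z                        # S302: divide by depth
    r2 = u_c ** 2 + v_c ** 2                       # S303: radial distortion
    d = 1.0 + k1 * r2 + k2 * r2 ** 2
    u_d, v_d = u_c * d, v_c * d
    fx, fy = K[0, 0], K[1, 1]                      # S304: apply intrinsics
    cx, cy = K[0, 2], K[1, 2]
    return np.array([fx * u_d + cx, fy * v_d + cy])

K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
# A point on the optical axis lands exactly on the principal point
uv = project([0.0, 0.0, 2.0], np.eye(3), np.zeros(3), K)
```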
the above process can be abstractly written as equation (10):
z=h(x,y) (10)
the detailed parameterization of the BA process is given above. Specifically, x here refers to the pose of the camera at this time, i.e., the external parameters R and T, and its corresponding lie group is denoted as T. The landmark feature point y is the three-dimensional point p, and the observation data is the pixel coordinateBased on the principle of least squares, the error equation of this observation can be listed:
e=z-h(T,p) (11)
Then the observations at other times are taken into account and indices are added to the error. Let z_ij be the data generated when the unmanned agricultural machine observes road sign p_j at pose T_i; the overall cost function is then

(1/2) Σᵢ Σⱼ ‖e_ij‖² = (1/2) Σᵢ Σⱼ ‖z_ij − h(T_i, p_j)‖² (12)
Solving formula (12) is equivalent to adjusting the pose of the unmanned agricultural machine and the landmark features in the environment simultaneously, i.e. performing BA.
On this basis, for the method of the invention, note that the recognizable characteristic corner points in the graphic signs are p′_j, and that z′_ij is the data generated when the unmanned agricultural machine observes the characteristic corner point p′_j at pose T_i. Since the geodetic coordinates of the characteristic corner points p′_j are known, they provide a condition for correction and constraint; therefore p′_j and z′_ij can be introduced on the basis of formula (12), and the cost function rewritten as formula (13):

(1/2) Σᵢ Σⱼ ‖e_ij‖² = (1/2) Σᵢ Σⱼ ‖z′_ij − h(T_i, p′_j)‖² (13)

Formula (13) is not a linear function; solving it by nonlinear optimization finally yields the vision-correction-based high-precision navigation and positioning result of the unmanned agricultural machine.
The corrected coordinates of the unmanned agricultural machine are thus calculated according to formula (13), wherein p′_j are the geodetic coordinates of the characteristic corner points, p_j are the corrected coordinates of the unmanned agricultural machine to be optimized, z′_ij is the pixel measurement data generated when the characteristic corner point p′_j is observed at pose T_i, e_ij is the error function, and h(T_i, p_j) is the projection function from world coordinates to pixel coordinates.
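To make the corner-constrained correction concrete without a full BA solver, the following toy replaces the nonlinear reprojection function h with a linearized relative-position observation z_j ≈ p′_j − p. This simplification is ours, not the patent's model; it only illustrates how known corner coordinates constrain the machine position. Minimizing Σ ‖z_j − (p′_j − p)‖² over the position p then has a closed-form least-squares solution:

```python
import numpy as np

def correct_position(corners, rel_obs):
    """Least-squares estimate of the machine position p from known
    corner coordinates p'_j and vision-derived relative observations
    z_j ~ p'_j - p. Each observation votes p = p'_j - z_j, and the
    minimizer of sum ||z_j - (p'_j - p)||^2 is the mean of the votes."""
    corners = np.asarray(corners, float)
    rel_obs = np.asarray(rel_obs, float)
    return (corners - rel_obs).mean(axis=0)

corners = [(10.0, 5.0), (12.0, 5.0)]   # known p'_j (sign corners)
rel_obs = [(2.0, 1.1), (4.0, 0.9)]     # noisy z_j measured by vision
p_hat = correct_position(corners, rel_obs)  # corrected machine position
```

In the full method of formula (13), h is the nonlinear projection of steps S301-S304, and the problem is solved by iterative nonlinear least squares such as Gauss-Newton or Levenberg-Marquardt.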
It is also worth noting that the above correction and constraint remain effective throughout the period in which a graphic sign is visible during the driving of the unmanned agricultural machine. If additional graphic signs are suitably placed along the driving route, the method achieves even better performance.
An embodiment of the invention also provides unmanned agricultural machinery navigation equipment, the equipment comprising:
the characteristic angular point geodetic coordinate determination device is used for determining geodetic coordinates of characteristic angular points of the graphic signs arranged on the driving path of the unmanned agricultural machine or on two sides of the driving path;
the characteristic angular point pixel coordinate calculation device is used for positioning in the running process of the unmanned agricultural machine to obtain the geodetic coordinates of the unmanned agricultural machine, acquiring an image of the surrounding environment, extracting the edge contour of a pattern in the image of the surrounding environment by using an image edge detection technology, identifying the vertex of the pattern in the image of the surrounding environment, taking the vertex as a characteristic angular point, and calculating to obtain the pixel coordinates of the characteristic angular point;
and the coordinate calibration device is used for correcting and calibrating the geodetic coordinates of the unmanned agricultural machine by using the geodetic coordinates and the pixel coordinates of the characteristic corner point.
On the other hand, an embodiment of the present invention further provides an unmanned agricultural machinery navigation apparatus, as shown in fig. 5, the apparatus includes:
a processor 601, and
a memory 602 comprising program instructions executable by the processor which, when executed by the processor, cause the apparatus to perform the unmanned agricultural machine navigation method, namely to:
determining geodetic coordinates of characteristic corner points of the graphic signs arranged on the driving path or on two sides of the driving path of the unmanned agricultural machine;
positioning to obtain a geodetic coordinate of the unmanned agricultural machine in the driving process of the unmanned agricultural machine, collecting a peripheral environment image, extracting an edge contour of a pattern in the peripheral environment image by using an image edge detection technology, thereby identifying a vertex of the pattern in the peripheral environment image, taking the vertex as a characteristic angular point, and calculating to obtain a pixel coordinate of the characteristic angular point;
and correcting and calibrating the geodetic coordinates of the unmanned agricultural machine by using the geodetic coordinates and the pixel coordinates of the characteristic angular point.
An embodiment of the present disclosure provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements each process of the above unmanned agricultural machine navigation method and achieves the same technical effects; the details are not repeated here to avoid repetition.
The use of "first," "second," and similar terms in this disclosure does not indicate any order, quantity, or importance; such terms are used only to distinguish one element from another. The word "comprising" or "comprises" means that the element preceding the word covers the elements listed after it, without excluding other elements. "Upper," "lower," "left," "right," and the like indicate only relative positional relationships, which may change accordingly when the absolute position of the described object changes.
All terms (including technical or scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs unless specifically defined otherwise. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
Claims (10)
1. An unmanned agricultural machinery navigation method, comprising:
determining the geodetic coordinates of characteristic corner points of the graphic signs arranged on the driving path of the unmanned agricultural machine or on two sides of the driving path;
positioning to obtain geodetic coordinates of the unmanned agricultural machine in the driving process of the unmanned agricultural machine, acquiring an image of a peripheral environment, extracting an edge contour of a pattern in the image of the peripheral environment by using an image edge detection technology, thereby identifying a vertex of the pattern in the image of the peripheral environment, taking the vertex as a characteristic angular point, and calculating to obtain pixel coordinates of the characteristic angular point;
and correcting and calibrating the geodetic coordinates of the unmanned agricultural machine by using the geodetic coordinates and the pixel coordinates of the characteristic angular points.
2. The unmanned agricultural machine navigation method of claim 1, wherein determining the geodetic coordinates of characteristic corner points of the graphic signs arranged on the driving path of the unmanned agricultural machine or on two sides of the driving path comprises:
using an RTK high-precision receiver to statically measure and collect the coordinate position of the graphic sign over a period of time, averaging the coordinate positions collected during that period, and calculating the geodetic coordinates of the characteristic corner points of the graphic sign.
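The static-survey averaging in claim 2 can be sketched as follows; the coordinates, sample count, and 2 cm noise level are synthetic assumptions, not values from the patent:

```python
import numpy as np

# Average RTK fixes collected over a static window to obtain the sign's
# reference coordinate. Synthetic samples stand in for receiver output.
rng = np.random.default_rng(0)
true_pos = np.array([4_331_250.123, 500_123.456, 87.650])  # hypothetical E/N/U (m)
samples = true_pos + rng.normal(0.0, 0.02, size=(600, 3))  # ~2 cm RTK noise

sign_coord = samples.mean(axis=0)
print(np.round(sign_coord - true_pos, 4))  # residual shrinks roughly as 1/sqrt(N)
```

Averaging N independent fixes reduces the random error by about 1/√N, which is why a longer static window tightens the sign's surveyed coordinate.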
3. The unmanned agricultural machine navigation method of claim 1, wherein extracting an edge contour of a pattern in the surrounding environment image using an image edge detection technique comprises:
calculating the gradient and the angle of the pattern in the surrounding environment image;
performing non-maximum suppression on the gradient;
the edges of the pattern are connected using dual thresholds until a complete edge profile of the pattern is extracted.
4. The unmanned agricultural machine navigation method of claim 3, wherein calculating the gradient and angle of the pattern in the surrounding environment image comprises:
the gradient and angle of the pattern in the surrounding environment image are calculated according to the following formulas:

G = sqrt((∂f/∂x)² + (∂f/∂y)²),  θ = arctan((∂f/∂y) / (∂f/∂x)),

wherein x is the abscissa of a pixel point of the pattern in the surrounding environment image, y is the ordinate of a pixel point of the pattern in the surrounding environment image, f is the gray value of the pattern in the surrounding environment image, G is the gradient magnitude, and θ is the angle matrix formed over all pixel points of the pattern in the surrounding environment image.
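A minimal sketch of this gradient-and-angle computation, using simple first differences as the derivative kernels (the patent does not name a specific kernel) and a tiny synthetic gray image:

```python
import numpy as np

# Gradient magnitude and angle of a gray image f (claim 4).
f = np.array([[0, 0, 0, 0],
              [0, 10, 10, 0],
              [0, 10, 10, 0],
              [0, 0, 0, 0]], dtype=float)

gx = f[:, 1:] - f[:, :-1]        # df/dx (horizontal difference)
gy = f[1:, :] - f[:-1, :]        # df/dy (vertical difference)
gx, gy = gx[:-1, :], gy[:, :-1]  # crop to a common shape

magnitude = np.hypot(gx, gy)            # sqrt(gx^2 + gy^2)
angle = np.degrees(np.arctan2(gy, gx))  # angle matrix over all pixels

print(magnitude.max())  # largest response on the diagonal corner pixels
```

Production implementations typically use Sobel kernels and a smoothing pre-pass; the differencing above only illustrates the formulas.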
5. The unmanned agricultural machine navigation method of claim 3, wherein performing non-maximum suppression on the gradient comprises:
determining whether the gradient magnitude of the currently detected point C is the maximum within its 8-connected neighborhood; if so, further checking whether C is larger than the first intersection point dTmp1 and the second intersection point dTmp2 interpolated along the gradient direction; if C is larger than both dTmp1 and dTmp2, C is a local maximum and its value is set to 1, otherwise C is not a maximum and its value is set to 0; traversing all points C in the gradient map in this way finds the local maxima of the pixel points and completes the non-maximum suppression of the gradient.
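The suppression rule of claim 5 can be sketched as below. For brevity, the two interpolated points along the gradient direction are reduced to axis-aligned neighbours, and surviving points keep their magnitude (rather than being binarized to 1 as in the claim) so that the next thresholding stage still has values to work with:

```python
import numpy as np

def non_max_suppression(mag, angle):
    """Keep a point only if it is the maximum along its gradient direction."""
    out = np.zeros_like(mag)
    rows, cols = mag.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            a = angle[r, c] % 180.0
            if a < 45 or a >= 135:               # mostly horizontal gradient
                d1, d2 = mag[r, c - 1], mag[r, c + 1]
            else:                                 # mostly vertical gradient
                d1, d2 = mag[r - 1, c], mag[r + 1, c]
            out[r, c] = mag[r, c] if mag[r, c] >= max(d1, d2) else 0.0
    return out

mag = np.array([[0, 0, 0, 0, 0],
                [0, 1, 3, 1, 0],
                [0, 1, 3, 1, 0],
                [0, 0, 0, 0, 0]], dtype=float)
angle = np.zeros_like(mag)           # all gradients horizontal
thin = non_max_suppression(mag, angle)
print(thin)                          # the thick response thins to one column
```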
6. The unmanned agricultural machine navigation method of claim 3, wherein using the dual thresholds to connect the edges of the pattern until a complete edge profile of the pattern is extracted comprises:
selecting two thresholds, a low threshold and a high threshold: a point below the low threshold is regarded as a false edge and set to 0, and a point above the high threshold is regarded as a strong edge and set to 1;
starting from the high-threshold points in the image, the high-threshold points are first connected into a contour; when a breakpoint in the contour is reached, the algorithm searches the 8-neighborhood of the breakpoint for a point satisfying the low threshold and continues collecting the edge from that point, until the contour of the whole image is closed.
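The double-threshold linking of claim 6 is hysteresis thresholding. A minimal sketch (threshold values and the toy magnitude map are illustrative assumptions): strong points seed the contour, and low-threshold points are appended only when reachable from a strong point through an 8-neighbourhood:

```python
import numpy as np
from collections import deque

def hysteresis(mag, low, high):
    """Double-threshold edge linking: grow contours from strong points."""
    strong = mag >= high
    weak = mag >= low
    edges = strong.copy()
    q = deque(zip(*np.nonzero(strong)))
    while q:                                  # extend across breakpoints
        r, c = q.popleft()
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if (0 <= rr < mag.shape[0] and 0 <= cc < mag.shape[1]
                        and weak[rr, cc] and not edges[rr, cc]):
                    edges[rr, cc] = True      # weak point joined to contour
                    q.append((rr, cc))
    return edges.astype(int)                  # 1 = edge, 0 = false edge

mag = np.array([[9, 4, 0, 4],
                [0, 0, 0, 9]], dtype=float)
result = hysteresis(mag, low=3, high=8)
print(result)
```

Note how the weak point at (0, 1) survives because it touches a strong point, while an isolated weak point with no strong neighbour would be discarded as a false edge.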
7. The unmanned agricultural machine navigation method of claim 1, wherein correcting and calibrating the geodetic coordinates of the unmanned agricultural machine using the geodetic coordinates and the pixel coordinates of the characteristic corner points comprises:
calculating the corrected coordinates of the unmanned agricultural machine according to the following formula:

e_ij = z'_ij − h(T_i, p_j), with the corrected coordinates obtained by minimizing ½ Σ_i Σ_j ‖e_ij‖²,

wherein p'_j is the geodetic coordinate of the characteristic corner point, p_j is the corrected coordinate of the unmanned agricultural machine to be optimized, z'_ij is the pixel measurement obtained by observing the characteristic corner point p'_j at pose T_i, e_ij is the error function, and h(T_i, p_j) is the projection function from world coordinates to pixel coordinates.
8. An unmanned agricultural machine navigation apparatus, wherein the apparatus comprises:
the characteristic angular point geodetic coordinate determination device is used for determining geodetic coordinates of characteristic angular points of the graphic signs arranged on the driving path of the unmanned agricultural machine or on two sides of the driving path;
the characteristic angular point pixel coordinate calculation device is used for positioning in the running process of the unmanned agricultural machine to obtain the geodetic coordinates of the unmanned agricultural machine, acquiring an image of the surrounding environment, extracting the edge contour of a pattern in the image of the surrounding environment by using an image edge detection technology, identifying the vertex of the pattern in the image of the surrounding environment, taking the vertex as a characteristic angular point, and calculating to obtain the pixel coordinates of the characteristic angular point;
and the coordinate calibration device is used for correcting and calibrating the geodetic coordinates of the unmanned agricultural machine by using the geodetic coordinates and the pixel coordinates of the characteristic corner point.
9. An unmanned agricultural machine navigation device, the device comprising:
a processor,
a memory comprising program instructions executable by the processor which, when executed by the processor, cause the apparatus to perform the unmanned agricultural machine navigation method, namely to:
determining the geodetic coordinates of characteristic corner points of the graphic signs arranged on the driving path of the unmanned agricultural machine or on two sides of the driving path;
positioning to obtain a geodetic coordinate of the unmanned agricultural machine in the driving process of the unmanned agricultural machine, collecting a peripheral environment image, extracting an edge contour of a pattern in the peripheral environment image by using an image edge detection technology, thereby identifying a vertex of the pattern in the peripheral environment image, taking the vertex as a characteristic angular point, and calculating to obtain a pixel coordinate of the characteristic angular point;
and correcting and calibrating the geodetic coordinates of the unmanned agricultural machine by using the geodetic coordinates and the pixel coordinates of the characteristic angular point.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210635783.5A CN115683102A (en) | 2022-06-07 | 2022-06-07 | Unmanned agricultural machinery navigation method, equipment, device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115683102A true CN115683102A (en) | 2023-02-03 |
Family
ID=85060211
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210635783.5A Pending CN115683102A (en) | 2022-06-07 | 2022-06-07 | Unmanned agricultural machinery navigation method, equipment, device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115683102A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN118429497A (en) * | 2024-05-11 | 2024-08-02 | 徐州徐工农业装备科技有限公司 | Animation display method for agricultural machinery navigation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||