CN115112115A - High-precision real-time positioning method for robot orchard - Google Patents

High-precision real-time positioning method for robot orchard

Info

Publication number
CN115112115A
CN115112115A (application CN202210712594.3A)
Authority
CN
China
Prior art keywords
positioning
gnss
fusion
positioning information
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210712594.3A
Other languages
Chinese (zh)
Inventor
郭健
孙瑜
蔡云飞
徐胜元
陈祥龙
李晨星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology filed Critical Nanjing University of Science and Technology
Priority to CN202210712594.3A
Publication of CN115112115A
Legal status: Pending (current)

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/40 Correcting position, velocity or attitude
    • G01S19/41 Differential correction, e.g. DGPS [differential GPS]
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/485 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Manufacturing & Machinery (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a high-precision real-time positioning method for an orchard robot, in which the positioning accuracy lost when the GNSS positioning signal is interfered with is compensated by a binocular vision camera and a lidar sensor. The method comprises the following steps: obtain the robot's positioning information separately from the binocular vision camera and the lidar; fuse the two sets of positioning information; while the GNSS RTK positioning state is normal (not losing lock), calibrate the fused positioning information against the high-precision GNSS positioning signal so that the fused sensor positioning reaches GNSS-level accuracy; and when the system detects that the GNSS signal is interfered with, replace the GNSS positioning signal with the calibrated high-precision fused positioning information. The method has low complexity, a small computational load, and fast signal calibration, and can provide high-precision real-time positioning information for the robot when the GNSS loses lock.

Description

High-precision real-time positioning method for robot orchard
Technical Field
The invention relates to the technical field of robot positioning, in particular to a high-precision real-time positioning method for an orchard robot.
Background
In recent years, China has attached great importance to the development of agricultural robots, investing more personnel and capital in research and development and actively promoting automated agricultural machinery throughout society to modernize agricultural production. With the implementation of these policies, intelligent agricultural robots of many kinds have emerged. In robot navigation and positioning technology, satellite navigation positioning systems are mostly used, but owing to the particularity and complexity of the operating environment, agricultural robots face various kinds of interference and occlusion during field navigation, which easily causes satellite positioning signals to be lost or misaligned and greatly reduces the reliability and practicability of the robots.
Taking an orchard picking robot as an example: when the robot travels between rows of fruit trees, it is often shaded or covered by dense branches and leaves, so GNSS signals are blocked by the trees during transmission. The accuracy of the positioning signal is therefore reduced, the user cannot obtain the exact position of the autonomous robot, and the misaligned position signal disrupts route planning and subsequent work such as picking. The mainstream methods for dealing with misaligned robot positioning signals currently include GNSS combined with visual positioning, GNSS combined with lidar positioning, and map-based positioning. Although these improve the positioning effect, they still cannot guarantee positioning accuracy when the GNSS is occluded, and such fusion positioning methods place certain requirements on the application environment and do not work well in complex environments. Research on a high-precision real-time positioning method for robots in complex field environments such as orchards is therefore urgent.
Disclosure of Invention
The invention aims to provide a high-precision real-time positioning method for an orchard robot that, when the robot's satellite positioning signal is lost or misaligned by interference, can still achieve real-time positioning with accuracy approaching that of the satellite positioning signal using the robot's own sensors, solving the problem of satellite positioning losing lock in complex field environments such as orchards.
The technical scheme for realizing the purpose of the invention is as follows:
a high-precision real-time positioning method for a robot orchard calibrates fusion positioning information of each sensor through high-precision positioning information of a GNSS (global navigation satellite system), so that the high-precision fusion positioning information can make up a GNSS positioning signal. The method comprises the following steps:
Step 1: binocular vision positioning — acquire road surface information with a binocular vision camera and calculate the robot's positioning information.
Step 2: lidar positioning — scan the surrounding environment with a lidar to determine the robot's positioning information.
Step 3: sensor positioning information fusion — fuse the positioning signals of all positioning modules to obtain a relatively accurate positioning result.
Step 4: fused positioning information calibration — while the GNSS RTK positioning state is normal, compare the high-precision GNSS positioning signal with the fused positioning information produced by the fusion algorithm, and adjust the parameters of the fusion algorithm according to the comparison result.
Step 5: detect whether the GNSS positioning signal is normal.
Step 6: positioning signal switching — when the GNSS positioning signal is detected to be erroneous, the system stops using the GNSS positioning result and switches to the calibrated high-precision fused positioning information.
Further, the specific process of step 1 includes:
Step 1.1: adjust the binocular vision camera to shoot the horizontal road surface vertically, and set the shooting frequency according to the robot's moving speed so that adjacent frames have a large overlapping area.
Step 1.2: convert two adjacent captured images into the frequency domain by Fourier transform, and establish log-polar coordinates with each image centre as the origin.
Step 1.3: identify the overlapping area of the two adjacent images, and convert its rotation and scale change between the two images into translations along the two log-polar coordinate axes, thereby obtaining the angle and scale change coefficients of the two images.
Step 1.4: obtain the robot's lateral and longitudinal translation from the position change of the overlapping area in the two-dimensional plane; combined with the angle and scale change coefficients of the two adjacent frames, the robot's three-dimensional rectangular coordinate change between the two frames can be calculated, giving the robot's three-dimensional rectangular coordinates relative to the starting instant.
Step 1.5: repeat steps 1.2 to 1.4, continuously computing with adjacent frames to obtain the robot's three-dimensional coordinate change over a period of time.
Further, the specific process of step 2 includes:
Step 2.1: scan the surrounding environment through 360° with a 3D lidar sensor and extract the laser points of the surrounding fruit trees.
Step 2.2: after the sensor receives the laser-point data of the surrounding fruit trees, the system establishes a three-dimensional rectangular coordinate system centred on the lidar and converts each laser point into three-dimensional coordinate data.
Step 2.3: select a sufficient number of fruit-tree laser points as positioning reference points; as the robot moves, the coordinates of these reference points change continuously, and the robot's displacement is calculated from the coordinate change of the reference points between the start and end instants, giving the robot's three-dimensional rectangular coordinates relative to the starting instant.
Further, the specific process of step 3 includes:
Step 3.1: convert the two sets of positioning coordinates obtained from the binocular vision camera and the lidar sensor into world geodetic coordinates.
Step 3.2: establish a fusion model, initialize the fusion parameters, and linearly fuse the two sets of sensor positioning data that have been converted to geodetic coordinates.
Further, the specific process of step 4 includes:
Step 4.1: first confirm that the GNSS RTK positioning is in a normal positioning state, ensuring the accuracy of the GNSS positioning result.
Step 4.2: compare the high-precision GNSS positioning result with the sensors' fused positioning result, and adjust the relevant parameters of the fusion positioning algorithm according to the comparison.
Step 4.3: repeat steps 4.1 and 4.2, continuously adjusting the parameters of the fusion algorithm so that the fused positioning result keeps approaching the GNSS positioning result, improving the accuracy and reliability of the fused positioning.
Further, the method for detecting whether the GNSS positioning signal is normal in step 5 comprises a direct method and an indirect method. The direct method uses the GGA sentence of the GNSS transmission protocol directly; taking the Beidou satellite system as an example, the mode-indicator field of the BDGGA sentence is inspected: if it is 0, positioning is invalid, and if it is not 0, positioning is valid. The indirect method calculates and stores the positioning-time difference and the signal-reception time difference between two adjacent frames of positioning data and compares them with preset time thresholds; if the differences for some pair of frames exceed the thresholds, the GNSS positioning signal is judged to be interfered with.
Further, step 5 runs throughout the robot positioning process: while the GNSS signal is detected to be normal, the fused positioning data are calibrated; once the accuracy of the fused data approaches GNSS accuracy, calibration stops while the GNSS signal remains under real-time detection; and when the GNSS positioning signal is detected to be interfered with, step 6 is executed.
Further, the specific process of step 6 includes:
Step 6.1: when the GNSS positioning signal is detected to be interfered with, the system immediately switches the positioning signal to the calibrated sensor fusion positioning result, whose accuracy is consistent with that of the GNSS before losing lock.
Step 6.2: after switching, the GNSS positioning signal continues to be monitored; when the interference disappears and GNSS positioning accuracy recovers, the system switches the positioning signal back to the GNSS positioning result.
Compared with the prior art, the invention has the following beneficial effects. The positioning accuracy lost when the GNSS positioning signal is interfered with is compensated by the binocular vision camera and the lidar sensor: a fusion algorithm fuses the robot positioning information acquired by the binocular vision camera and the lidar; the high-precision positioning signal obtained while the GNSS RTK positioning state is normal is then used to calibrate the fused positioning information so that it reaches GNSS positioning accuracy; and when the system detects that the GNSS signal is interfered with, the calibrated high-precision fused positioning information replaces the GNSS positioning signal. The method effectively solves the loss of positioning accuracy when the robot's satellite positioning signal is frequently interfered with in complex field environments such as orchards; it has low complexity and a small computational load, and can calibrate and switch signals quickly and frequently, providing the robot with high-precision real-time positioning information.
Drawings
Fig. 1 is a scene schematic diagram of the high-precision real-time positioning method for the orchard robot.
Fig. 2 is a flow chart of the high-precision real-time positioning method for the orchard robot.
Detailed Description
In the method, a binocular vision camera and a lidar sensor are used to maintain the accuracy of the robot's positioning signal when the GNSS positioning signal is interfered with, so that the robot has a high-precision real-time positioning signal in complex environments such as orchards.
Referring to fig. 1, when the robot travels along a tree row in an orchard, the GNSS positioning signal is frequently blocked by the trees on either side. In the method, the binocular vision camera shoots ground images at high frequency, and the robot's lateral, longitudinal, and height changes are continuously calculated from the changes between adjacent frames. Meanwhile, the lidar scans the surrounding environment and selects fruit trees as reference points (such as the 7 example reference laser points in fig. 1); the robot's coordinate change is calculated from the change in its distances to the trees. The positioning signals of the two sensors are fused, and the fused signal is calibrated during the periods when the GNSS signal is good in the gaps between the trees, so that its accuracy reaches that of the GNSS positioning signal. When the satellite signal is blocked again, the system switches to the fused positioning information to maintain high-precision real-time positioning of the robot.
Referring to fig. 2, the high-precision real-time positioning method for the orchard robot comprises the following steps:
Step 1: binocular vision positioning — acquire road surface information with the binocular vision camera and calculate the robot's positioning information (a code sketch follows step 1.5). The specific steps are:
Step 1.1: adjust the binocular vision camera to shoot the horizontal road surface vertically, and set the shooting frequency according to the robot's moving speed so that adjacent frames have a large overlapping area;
Step 1.2: convert two adjacent captured images into the frequency domain by Fourier transform, and establish log-polar coordinates with each image centre as the origin;
Step 1.3: identify the overlapping area of the two adjacent images, and convert its rotation and scale change between the two images into translations along the two log-polar coordinate axes, obtaining the angle and scale change coefficients of the two images;
Step 1.4: obtain the robot's lateral and longitudinal translation from the position change of the overlapping area in the two-dimensional plane; combined with the angle and scale change coefficients of the two adjacent frames, the robot's three-dimensional rectangular coordinate change between the two frames can be calculated, giving its three-dimensional rectangular coordinates relative to the starting instant;
Step 1.5: repeat steps 1.2 to 1.4, continuously computing with adjacent frames to obtain the robot's three-dimensional coordinates (X1, Y1, Z1) relative to the initial position over a period of time.
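The frequency-domain registration of steps 1.2 to 1.4 is essentially Fourier-Mellin image registration. The following is a minimal Python sketch of that technique, not code from the patent: it assumes grayscale float images of equal size, all function names are illustrative, and the translation estimate is taken directly between the raw frames without first compensating the recovered rotation and scale, which a full implementation would do.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def phase_correlation(a, b):
    """Peak of the cross-power spectrum gives the (row, col) shift of b relative to a."""
    R = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    R /= np.maximum(np.abs(R), 1e-12)            # keep phase only
    corr = np.abs(np.fft.ifft2(R))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > a.shape[0] // 2:                     # wrap to signed shifts
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

def log_polar(img, n_theta=360, n_rho=256):
    """Resample img onto log-polar axes centred on the image centre (step 1.2)."""
    cy, cx = (img.shape[0] - 1) / 2.0, (img.shape[1] - 1) / 2.0
    rho = np.exp(np.linspace(0.0, np.log(min(cy, cx)), n_rho))
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    ys = cy + np.outer(np.sin(theta), rho)
    xs = cx + np.outer(np.cos(theta), rho)
    return map_coordinates(img, [ys, xs], order=1), rho

def frame_motion(prev, curr):
    """Rotation and scale from log-polar Fourier magnitudes (step 1.3),
    then in-plane translation by plain phase correlation (step 1.4)."""
    m_prev = np.abs(np.fft.fftshift(np.fft.fft2(prev)))
    m_curr = np.abs(np.fft.fftshift(np.fft.fft2(curr)))
    lp_prev, rho = log_polar(m_prev)
    lp_curr, _ = log_polar(m_curr)
    d_theta, d_rho = phase_correlation(lp_prev, lp_curr)
    angle_deg = 360.0 * d_theta / lp_prev.shape[0]
    scale = (rho[-1] / rho[0]) ** (d_rho / (lp_prev.shape[1] - 1))
    dy, dx = phase_correlation(prev, curr)
    return angle_deg, scale, dy, dx
```

With the camera pointed vertically at the ground, dy and dx map to the robot's longitudinal and lateral translation through the known pixel-to-metre scale, and accumulating the per-frame motions gives the coordinates (X1, Y1, Z1) of step 1.5.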
Step 2: lidar positioning — scan the surrounding environment with the lidar to determine the robot's positioning information (a code sketch follows step 2.3). The specific steps are:
Step 2.1: scan the surrounding environment with a 3D lidar sensor and extract the laser points of the surrounding fruit trees.
Step 2.2: after the sensor receives the laser-point data of the surrounding fruit trees, the system establishes a three-dimensional rectangular coordinate system centred on the lidar and converts each laser point into three-dimensional coordinate data.
Step 2.3: select a sufficient number of fruit-tree laser points as positioning reference points; as the robot moves, the coordinates of these reference points change continuously, and the robot's displacement is calculated from the coordinate change of the reference points between the start and end instants, giving the robot's three-dimensional rectangular coordinates (X2, Y2, Z2) relative to the starting instant.
Step 3: sensor positioning information fusion — fuse the positioning signals of all positioning modules to obtain a relatively accurate positioning result. The specific steps are:
Step 3.1: convert the two sets of positioning coordinates obtained from the binocular vision camera and the lidar sensor into world geodetic coordinates:

[Equation: conversion of the sensor rectangular coordinates (X1, Y1, Z1) and (X2, Y2, Z2) to geodetic coordinates (B1, L1, H1) and (B2, L2, H2)]

where (B1, L1, H1) are the geodetic coordinates converted from the binocular vision positioning coordinates and (B2, L2, H2) are the geodetic coordinates converted from the lidar positioning coordinates.
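The conversion formula of step 3.1 is published only as an image. A workable stand-in, assuming each sensor's rectangular frame is a local north/east/up frame anchored at a known geodetic start point (B0, L0, H0) and that the per-run offsets are small, is the standard WGS-84 small-offset conversion below; the function name and frame convention are assumptions, while the ellipsoid constants are the usual WGS-84 values:

```python
import math

WGS84_A = 6378137.0            # semi-major axis (m)
WGS84_E2 = 6.69437999014e-3    # first eccentricity squared

def enu_offset_to_geodetic(b0_deg, l0_deg, h0, d_north, d_east, d_up):
    """Shift a geodetic start point (B0, L0, H0) by a small local
    north/east/up offset in metres, returning (B, L, H) in degrees/metres."""
    b0 = math.radians(b0_deg)
    s = math.sin(b0)
    n_rad = WGS84_A / math.sqrt(1.0 - WGS84_E2 * s * s)                    # prime vertical radius
    m_rad = WGS84_A * (1.0 - WGS84_E2) / (1.0 - WGS84_E2 * s * s) ** 1.5  # meridian radius
    b = b0_deg + math.degrees(d_north / (m_rad + h0))
    l = l0_deg + math.degrees(d_east / ((n_rad + h0) * math.cos(b0)))
    return b, l, h0 + d_up
```

Under the further assumption that X points east and Y north, (B1, L1, H1) is enu_offset_to_geodetic(B0, L0, H0, Y1, X1, Z1), and the lidar coordinates convert the same way.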
Step 3.2: the fusion model was established as follows:
Figure BDA0003708615250000062
wherein (B, L, H) represents the localization result after fusion, α 1 And alpha 2 As a fusion coefficient of latitude coordinates, beta 1 And beta 2 As a fusion system of longitudinal coordinatesNumber, gamma 1 And gamma 2 As a fusion coefficient of the height coordinate, a 1 、a 2 、b 1 、b 2 、c 1 And c 2 The speed parameter is adjusted for the fusion coefficient,
Figure BDA0003708615250000063
and
Figure BDA0003708615250000064
fine tuning parameters for coordinates.
Initializing a fusion coefficient alpha of fusion parameters, latitude, longitude and altitude 1 、α 2 、β 1 、β 2 、γ 1 And gamma 2 Default is 0.5, and the fusion coefficient adjusts the speed parameter a 1 、a 2 、b 1 、b 2 、c 1 And c 2 Default to 1, coordinate fine tuning parameter
Figure BDA0003708615250000065
And
Figure BDA0003708615250000066
default to 0. The two sets of sensor positioning data that have been converted to geodetic coordinates are then data fused.
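Written out in code, the fusion model and its initialization of step 3.2 look like the following sketch (ΔB, ΔL and ΔH are written dB, dL and dH; the class name and structure are illustrative, not from the patent):

```python
from dataclasses import dataclass, field

@dataclass
class FusionModel:
    # fusion coefficients for latitude, longitude, height (default 0.5 each)
    alpha: list = field(default_factory=lambda: [0.5, 0.5])
    beta:  list = field(default_factory=lambda: [0.5, 0.5])
    gamma: list = field(default_factory=lambda: [0.5, 0.5])
    # adjustment speed parameters a1, a2, b1, b2, c1, c2 (default 1)
    a: list = field(default_factory=lambda: [1.0, 1.0])
    b: list = field(default_factory=lambda: [1.0, 1.0])
    c: list = field(default_factory=lambda: [1.0, 1.0])
    # coordinate fine-tuning parameters (default 0)
    dB: float = 0.0
    dL: float = 0.0
    dH: float = 0.0

    def fuse(self, vis, lidar):
        """vis, lidar: (B, L, H) geodetic fixes from the two sensors."""
        B = self.alpha[0] * vis[0] + self.alpha[1] * lidar[0] + self.dB
        L = self.beta[0]  * vis[1] + self.beta[1]  * lidar[1] + self.dL
        H = self.gamma[0] * vis[2] + self.gamma[1] * lidar[2] + self.dH
        return B, L, H
```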
Step 4: calibrate the fused positioning information. While the GNSS RTK positioning state is normal, the high-precision GNSS RTK positioning signal (B_GNSS, L_GNSS, H_GNSS) is compared with the positioning result of each sensor for the first-step parameter adjustment; taking the latitude coordinate B as an example:

[Equation: first-step update of the latitude fusion coefficients α1 and α2, driven by the deviations B_GNSS − B1 and B_GNSS − B2 and scaled by the adjustment speed parameters a1 and a2]

If the change amplitude of the parameters α1 and α2 during adjustment is too small or too large, too many adjustment iterations are required; the parameters a1 and a2 can then be tuned: when a1 and a2 are greater than 1, the adjustment amplitude of the latitude parameters α1 and α2 becomes larger, and when a1 and a2 are less than 1, it becomes smaller. If the GNSS latitude coordinate is already close to the latitude coordinate of some sensor, the corresponding adjustment of that latitude parameter α is 0, indicating that the latitude contribution of that sensor needs no adjustment. The latitude parameters α thus accelerate the latitude fusion. The parameters for the longitude coordinate L and the height coordinate H are adjusted in the same way during fusion.

After the first-step parameter adjustment, the high-precision GNSS positioning signal (B_GNSS, L_GNSS, H_GNSS) is compared with the fused positioning information (B, L, H) produced by the fusion algorithm, and the second-step parameter adjustment is performed:

[Equation: second-step update of the coordinate fine-tuning parameters ΔB, ΔL and ΔH from the residuals B_GNSS − B, L_GNSS − L and H_GNSS − H]

The first-step and second-step parameter adjustments are repeated until the difference between the fused positioning information (B, L, H) and the high-precision GNSS positioning signal (B_GNSS, L_GNSS, H_GNSS) falls below a threshold, i.e. the accuracy of the fused positioning information approaches that of the GNSS positioning signal.
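The two adjustment steps are published only as equation images, so the update rules in this sketch are an assumed proportional form chosen to match the surrounding description: each coefficient moves by its sensor's deviation from the GNSS fix scaled by the corresponding speed parameter (hence zero correction when a sensor already agrees with the GNSS), and the fine-tuning parameters absorb the residual of the fused result. It reuses the FusionModel sketch above:

```python
def calibrate_step(model, gnss, vis, lidar):
    """One first-step + second-step parameter adjustment against a GNSS RTK
    fix 'gnss'; assumed proportional updates, not the patent's exact formulas."""
    # first step: nudge each coefficient by its sensor's deviation from the
    # GNSS fix, scaled by the adjustment speed parameters (a, b, c)
    for i, sensor in enumerate((vis, lidar)):
        model.alpha[i] += model.a[i] * (gnss[0] - sensor[0])
        model.beta[i]  += model.b[i] * (gnss[1] - sensor[1])
        model.gamma[i] += model.c[i] * (gnss[2] - sensor[2])
    # second step: move the fine-tuning parameters toward the fused residual
    B, L, H = model.fuse(vis, lidar)
    model.dB += gnss[0] - B
    model.dL += gnss[1] - L
    model.dH += gnss[2] - H

def calibrate(model, gnss, vis, lidar, tol=1e-7, max_iter=100):
    """Repeat both steps until the fused fix is within tol of the GNSS fix."""
    for _ in range(max_iter):
        fused = model.fuse(vis, lidar)
        if max(abs(g - f) for g, f in zip(gnss, fused)) < tol:
            break
        calibrate_step(model, gnss, vis, lidar)
```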
Step 5: detect whether the GNSS positioning signal is normal. The detection methods comprise a direct method and an indirect method. The direct method uses the GGA sentence of the GNSS transmission protocol directly: taking the Beidou satellite system as an example, the mode-indicator field of the BDGGA sentence is inspected; if it is 0, positioning is invalid, and if it is not 0, positioning is valid. The indirect method calculates and stores the positioning-time difference and the signal-reception time difference between two adjacent frames of positioning data and compares them with preset time thresholds; if the differences for some pair of adjacent frames exceed the thresholds, the GNSS positioning signal is judged to be interfered with. A code sketch of both checks follows.
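In the sketch below, the field layout follows the standard NMEA-0183 GGA sentence, in which field 6 (counting the talker field as 0) is the fix-quality/mode indicator; the 0.2 s thresholds are illustrative values, not values taken from the patent:

```python
def gga_fix_valid(sentence: str) -> bool:
    """Direct method: field 6 of a $--GGA sentence (e.g. $BDGGA for Beidou)
    is the fix-quality/mode indicator; 0 means positioning is invalid."""
    parts = sentence.strip().split(',')
    if not parts[0].endswith('GGA') or len(parts) < 7 or parts[6] == '':
        return False
    return int(parts[6]) != 0

def timing_ok(prev_fix_t, curr_fix_t, prev_rx_t, curr_rx_t,
              fix_dt_max=0.2, rx_dt_max=0.2) -> bool:
    """Indirect method: compare the positioning-time and reception-time gaps
    of two adjacent frames against preset thresholds (seconds)."""
    return (curr_fix_t - prev_fix_t) <= fix_dt_max and \
           (curr_rx_t - prev_rx_t) <= rx_dt_max

# example: a (hypothetical) Beidou GGA frame whose mode indicator is 0
assert not gga_fix_valid("$BDGGA,060105.00,,,,,0,00,,,M,,M,,")
```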
In particular, GNSS interference detection runs throughout the robot positioning process. While the GNSS signal is detected to be normal, the fused positioning data are calibrated; once the accuracy of the fused data approaches GNSS accuracy, calibration stops while the GNSS signal remains under real-time detection. When the GNSS positioning signal is detected to be interfered with, the process continues with step 6.
Step 6: positioning signal switching. The measurement of the GNSS signal is updated continuously; if the GNSS is not interfered with, the fused-positioning/GNSS selection switch outputs the GNSS positioning signal. When the GNSS positioning signal is detected to be interfered with, the system further judges whether the calibrated accuracy of the fused positioning information has reached the positioning accuracy of the GNSS in normal operation: if it has, the selection switch outputs the fused positioning information; if it has not, fusion parameter adjustment continues until the accuracy of the fused positioning information meets the requirement or the GNSS returns to normal. A sketch of this selection logic follows.
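The selection switch reduces to a small piece of state logic; a minimal sketch, assuming the validity checks above and a flag maintained by the calibration loop indicating whether the fused accuracy has reached GNSS level:

```python
def select_position(gnss_ok: bool, fused_calibrated: bool,
                    gnss_fix, fused_fix):
    """Output the GNSS fix while it is healthy; fall back to the calibrated
    fused fix when GNSS is interfered with and the fused accuracy is ready.
    Returns (source, (B, L, H))."""
    if gnss_ok:
        return "gnss", gnss_fix
    if fused_calibrated:
        return "fused", fused_fix
    # GNSS interfered with but fusion not yet at GNSS accuracy:
    # keep adjusting the fusion parameters and hold the fused estimate
    return "fused-uncalibrated", fused_fix
```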
The positioning in steps 1 and 2 is not limited to binocular vision and lidar sensors; other sensors may be selected as positioning signal sources for fusion positioning, and the method of the invention is not limited to the order of the steps given.
The foregoing detailed description is provided for purposes of illustration, to help those of ordinary skill in the art understand the patent, and is not to be construed as limiting its scope; all technical solutions obtained by equivalent substitution or equivalent transformation fall within the protection scope of the invention.

Claims (10)

1. A high-precision real-time positioning method for an orchard robot, characterized by comprising the following steps:
determining a plurality of pieces of positioning information of the robot in a plurality of ways;
fusing the plurality of pieces of positioning information through a fusion algorithm to obtain fused positioning information;
calibrating the fused positioning information: while the GNSS RTK positioning state is normal, comparing the GNSS positioning signal with the fused positioning information, and adjusting the parameters of the fusion algorithm according to the comparison result until the fused positioning information meets the accuracy requirement;
detecting whether the GNSS positioning signal is normal: if so, repeating the calibration of the fused positioning information; otherwise, executing the following step;
switching the positioning signal to the calibrated fused positioning information while monitoring the GNSS positioning signal, and switching the positioning signal back to GNSS positioning once the GNSS positioning signal is normal.
2. The high-precision real-time positioning method for the orchard robot according to claim 1, wherein calibrating the fused positioning information specifically comprises:
step 4.1: first determining that the GNSS RTK positioning is in a normal positioning state, ensuring that the accuracy of the GNSS positioning result meets the set accuracy requirement;
step 4.2: comparing the GNSS positioning result with the plurality of pieces of positioning information and with the fused positioning information, and adjusting the parameters of the fusion algorithm according to the respective comparison results;
step 4.3: repeating steps 4.1 and 4.2, continuously adjusting the parameters of the fusion algorithm so that the fused positioning information keeps approaching the GNSS positioning result, until the fused positioning information meets the accuracy requirement.
3. The high-precision real-time positioning method for the orchard robot according to claim 1 or 2, wherein determining the plurality of pieces of positioning information of the robot in a plurality of ways comprises acquiring binocular vision positioning information and lidar positioning information with a binocular vision camera and a lidar sensor, respectively.
4. The high-precision real-time positioning method for the orchard robot according to claim 3, wherein acquiring the binocular vision positioning information with the binocular vision camera specifically comprises:
step 1.1: adjusting the binocular vision camera to shoot the horizontal road surface vertically, and setting the shooting frequency according to the robot's moving speed so that adjacent frames have an overlapping area;
step 1.2: converting two adjacent captured images into the frequency domain by Fourier transform, and establishing log-polar coordinates with each image centre as the origin;
step 1.3: identifying the overlapping area of the two adjacent images, and converting its rotation and scale change between the two images into translations along the two log-polar coordinate axes, obtaining the angle and scale change coefficients of the two images;
step 1.4: obtaining the robot's lateral and longitudinal translation from the position change of the overlapping area in the two-dimensional plane, calculating the robot's three-dimensional rectangular coordinate change between the two adjacent frames from the angle and scale change coefficients of the two adjacent frames, and obtaining the robot's three-dimensional rectangular coordinates relative to the starting instant;
step 1.5: repeating steps 1.2 to 1.4, continuously using adjacent frames to obtain the robot's three-dimensional coordinate change over a period of time.
5. The high-precision real-time positioning method for the orchard robot according to claim 3, wherein acquiring the lidar positioning information with the lidar sensor specifically comprises:
step 2.1: scanning the surrounding environment through 360° with a 3D lidar sensor, and extracting the laser points of the surrounding fruit trees;
step 2.2: based on the laser-point data, establishing a three-dimensional rectangular coordinate system centred on the lidar, and converting each laser point into three-dimensional coordinate data;
step 2.3: selecting a plurality of fruit-tree laser points as positioning reference points, and calculating the robot's displacement from the coordinate change of the reference points between the start and end instants, obtaining the robot's three-dimensional rectangular coordinates relative to the starting instant.
6. The high-precision real-time positioning method for the orchard robot according to claim 3, wherein fusing the plurality of pieces of positioning information through the fusion algorithm to obtain the fused positioning information specifically comprises:
step 3.1: converting the binocular vision positioning information and the lidar positioning information into world geodetic coordinates;
step 3.2: establishing a fusion model, initializing the fusion parameters, and linearly fusing the two sets of positioning data converted into world geodetic coordinates; the fusion model being:

B = α1·B1 + α2·B2 + ΔB
L = β1·L1 + β2·L2 + ΔL
H = γ1·H1 + γ2·H2 + ΔH

wherein (B, L, H) is the fused positioning information; α1 and α2 are the fusion coefficients of the latitude coordinate, β1 and β2 are the fusion coefficients of the longitude coordinate, and γ1 and γ2 are the fusion coefficients of the height coordinate; a1, a2, b1, b2, c1 and c2 are the fusion-coefficient adjustment speed parameters; and ΔB, ΔL and ΔH are the coordinate fine-tuning parameters;
the fusion parameters being initialized as follows: the latitude, longitude and height fusion coefficients α1, α2, β1, β2, γ1 and γ2 are 0.5; the fusion-coefficient adjustment speed parameters a1, a2, b1, b2, c1 and c2 are 1; and the coordinate fine-tuning parameters ΔB, ΔL and ΔH are 0.
7. The high-precision real-time positioning method for the orchard robot according to claim 6, wherein adjusting the parameters of the fusion algorithm comprises:
comparing the GNSS positioning signal (B_GNSS, L_GNSS, H_GNSS) with each piece of positioning information one by one to perform the first-step parameter adjustment; for the latitude coordinate B:

[Equation: first-step update of the latitude fusion coefficients α1 and α2, driven by the deviations B_GNSS − B1 and B_GNSS − B2 and scaled by the adjustment speed parameters a1 and a2]

wherein B1 is the latitude coordinate converted from the binocular vision positioning coordinates and B2 is the latitude coordinate converted from the lidar positioning coordinates;
comparing the GNSS positioning signal (B_GNSS, L_GNSS, H_GNSS) with the fused positioning information (B, L, H) to perform the second-step parameter adjustment:

[Equation: second-step update of the coordinate fine-tuning parameters ΔB, ΔL and ΔH from the residuals B_GNSS − B, L_GNSS − L and H_GNSS − H]
8. the method for high-precision real-time positioning in robotic orchards according to claim 1, wherein the detecting whether the positioning signals of the GNSS are normal comprises a direct method and an indirect method.
9. The high-precision real-time positioning method for the orchard robot according to claim 8, wherein the direct method directly uses the GGA sentence of the GNSS transmission protocol: for the Beidou satellite system, the mode-indicator field of the BDGGA sentence is detected; if it is 0, positioning is invalid, and if it is not 0, positioning is valid.
10. The high-precision real-time positioning method for the orchard robot according to claim 8, wherein the indirect method calculates and stores the positioning-time difference and the signal-reception time difference between two adjacent frames of positioning data and compares them with preset time thresholds; if the positioning-time or reception-time difference of some pair of frames exceeds the threshold, the GNSS positioning signal is judged to be interfered with.
CN202210712594.3A 2022-06-22 2022-06-22 High-precision real-time positioning method for robot orchard Pending CN115112115A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210712594.3A CN115112115A (en) 2022-06-22 2022-06-22 High-precision real-time positioning method for robot orchard

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210712594.3A CN115112115A (en) 2022-06-22 2022-06-22 High-precision real-time positioning method for robot orchard

Publications (1)

Publication Number Publication Date
CN115112115A true CN115112115A (en) 2022-09-27

Family

ID=83327932

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210712594.3A Pending CN115112115A (en) 2022-06-22 2022-06-22 High-precision real-time positioning method for robot orchard

Country Status (1)

Country Link
CN (1) CN115112115A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116660916A (en) * 2023-05-26 2023-08-29 广东省农业科学院设施农业研究所 Positioning method, mapping method and electronic equipment for orchard mobile robot
CN116660916B (en) * 2023-05-26 2024-02-02 广东省农业科学院设施农业研究所 Positioning method, mapping method and electronic equipment for orchard mobile robot

Similar Documents

Publication Publication Date Title
US11441899B2 (en) Real time position and orientation tracker
Kraus et al. Advanced DTM generation from LIDAR data
CN111812649A (en) Obstacle identification and positioning method based on fusion of monocular camera and millimeter wave radar
Bergelt et al. Improving the intrinsic calibration of a Velodyne LiDAR sensor
CN106323267A (en) Orchard work agricultural robot interrow positioning method
CN111077907A (en) Autonomous positioning method of outdoor unmanned aerial vehicle
CN111521195A (en) Intelligent robot
CN114485654A (en) Multi-sensor fusion positioning method and device based on high-precision map
CN113763548A (en) Poor texture tunnel modeling method and system based on vision-laser radar coupling
CN113327296A (en) Laser radar and camera online combined calibration method based on depth weighting
CN110672075A (en) Remote water area detection system and method based on three-dimensional stereo imaging
CN115112115A (en) High-precision real-time positioning method for robot orchard
CN114111791B (en) Indoor autonomous navigation method, system and storage medium for intelligent robot
CN114923477A (en) Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology
CN114296097A (en) SLAM navigation method and system based on GNSS and LiDAR
CN116893425A (en) Ultrahigh-precision positioning method for orchard picking robot
Vaidis et al. Extrinsic calibration for highly accurate trajectories reconstruction
CN115930948A (en) Orchard robot fusion positioning method
CN116026323A (en) Positioning and regional error proofing method for engine oil filling machine
CN116242372A (en) UWB-laser radar-inertial navigation fusion positioning method under GNSS refusing environment
Aggarwal Machine vision based SelfPosition estimation of mobile robots
CN114485613A (en) Multi-information fusion underwater robot positioning method
CN114910062A (en) Navigation positioning method for multi-source information fusion
CN113093155A (en) Laser radar combined calibration method and system
CN114025320A (en) Indoor positioning method based on 5G signal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination