CN116086447A - Fusion navigation method and device for unmanned vehicle - Google Patents

Fusion navigation method and device for unmanned vehicle

Info

Publication number
CN116086447A
CN116086447A
Authority
CN
China
Prior art keywords
unmanned vehicle
running
information
target point
path
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310271939.0A
Other languages
Chinese (zh)
Inventor
高飞
张建民
王郭俊
刘振华
刘小平
赵吉敏
连尧
刘明政
林婉妮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Systems Engineering of PLA Academy of Military Sciences
Original Assignee
Institute of Systems Engineering of PLA Academy of Military Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Systems Engineering of PLA Academy of Military Sciences filed Critical Institute of Systems Engineering of PLA Academy of Military Sciences
Priority to CN202310271939.0A
Publication of CN116086447A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation by using measurements of speed or acceleration
    • G01C21/12: Navigation by measurements executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/206: Instruments specially adapted for indoor navigation
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00: Road transport of goods or passengers
    • Y02T10/10: Internal combustion engine [ICE] based vehicles
    • Y02T10/40: Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a fusion navigation method and device for an unmanned vehicle. The fusion navigation method comprises the following steps: driving the unmanned vehicle along a preset travel path to the path end point within a first travel area; performing point location calibration on the preset travel path with the sensor module of the unmanned vehicle to obtain a calibrated position information set; performing position-space judgment processing on the calibrated position information set to obtain real-time operation position information of the unmanned vehicle within the first operation area; processing the operation target path with a navigation error discrimination model to generate planned path information for the unmanned vehicle at each operation target point; and driving the unmanned vehicle according to the planned path information of the operation target points until it reaches the final target point. By using three types of sensors simultaneously, the invention ensures that the unmanned vehicle obtains accurate driving logic during operation, and it features few interference factors, simple logic, and fast computation.

Description

Fusion navigation method and device for unmanned vehicle
Technical Field
The invention relates to the technical field of unmanned vehicle navigation, and in particular to a fusion navigation method and device for an unmanned vehicle.
Background
At present, navigation of unmanned vehicles mainly relies on satellite navigation, laser navigation, and inertial navigation. Laser navigation is costly and places high demands on the site. Inertial navigation accumulates serious errors over long periods of use, and if these errors are not corrected in time the unmanned vehicle deviates from its normal path. Satellite navigation cannot provide effective positioning in signal-shielded scenes such as indoor warehouses and underground facilities, which degrades navigation and positioning performance.
Even in unmanned vehicles equipped with multiple navigation modes, only one mode is used at a time during actual navigation, and the navigation information is not effectively fused. In a uniform environment, a single navigation mode can achieve high accuracy. However, in scenes with large environmental differences, for example when the travel environment spans indoor and outdoor areas, a single navigation mode suffers from low accuracy and poor applicability; in scenes with severe interference it may fail entirely, leaving the unmanned vehicle without effective navigation.
Disclosure of Invention
In order to solve the problem that existing unmanned vehicle navigation methods use only one navigation mode at a time, and therefore achieve low navigation accuracy in scenes with large environmental differences or severe interference, a first aspect of the embodiments of the invention discloses a fusion navigation method for an unmanned vehicle, which comprises the following steps:
s1, acquiring a preset running path and a first running area of an unmanned vehicle, and storing the preset running path and the first running area in a navigation module of the unmanned vehicle;
s2, driving the unmanned vehicle to travel to a preset travel path end point in the first travel area according to the preset travel path;
s3, in the running process of the unmanned vehicle according to the preset running path, calibrating the point position of the preset running path by using a sensor module of the unmanned vehicle to obtain a calibrated position information set; the calibration position information set comprises position parameter information and point position arrival time information of a plurality of point positions acquired by the sensor module; the sensor module of the unmanned vehicle comprises a satellite positioning navigation sensor, a laser sensor and an inertial navigation sensor;
s4, performing position space judgment processing on the calibration position information set to obtain real-time operation position information of the unmanned vehicle in the first operation area;
s5, collecting the real-time operation positions to obtain an operation target path of the unmanned vehicle; the running target path comprises a plurality of running target point location information;
s6, processing the running target path by using a navigation error judging model to generate planning path information of the unmanned vehicle at the running target point position; driving the unmanned vehicle to run according to the planned path information of the running target point position, and reaching the final target point position; the navigation error discriminating model comprises a satellite navigation error discriminating model and a laser navigation error discriminating model.
In a first aspect of the embodiment of the present invention, in a running process of the unmanned vehicle according to the preset running path, the performing, by using a sensor module of the unmanned vehicle, point location calibration on the preset running path to obtain a calibration position information set includes:
s31, in the running process of the unmanned vehicle according to the preset running path, respectively utilizing the satellite positioning navigation sensor, the laser sensor and the inertial navigation sensor to acquire data of point position information of the unmanned vehicle according to a preset time interval, and obtaining position parameter information and point arrival time information of the point; the position parameter information of the point position comprises satellite positioning parameters, laser sensor parameters and inertial navigation parameters;
s32, fusion processing is carried out on the position parameter information of the point position and the point position arrival time information to obtain a calibration position information set.
In a first aspect of the embodiment of the present invention, the performing a position space determination process on the calibration position information set to obtain real-time operation position information of the unmanned vehicle in the first operation area includes:
s41, judging the position parameter information of the point position in the calibration position information set, when the position parameter information is positioned in the first operation area, listing the position parameter information as effective sensor position parameter information, and when the position parameter information is positioned outside the first operation area, listing the position parameter information as ineffective sensor position parameter information;
s42, weighting the effective sensor position parameter information by using the sensor confidence coefficient to obtain point location real-time position information; the sensor confidence is determined according to the cumulative occurrence number of the invalid sensor position parameter information or the variance of the position parameter information;
s43, constructing and obtaining real-time operation position information of the unmanned vehicle by using the point position real-time position information and the point position arrival time information.
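The judgment and weighting of steps S41 and S42 can be sketched in Python as follows (a minimal illustration; the rectangular shape of the operation area, the function names, and the confidence-weighted mean are assumptions, since the patent states only that valid fixes are weighted by sensor confidence):

```python
def in_first_operation_area(p, x_min, x_max, y_min, y_max):
    """S41 sketch: keep a sensor fix only if it lies inside the first
    operation area (modelled here, by assumption, as a rectangle)."""
    return x_min <= p[0] <= x_max and y_min <= p[1] <= y_max

def fuse_position(sensor_positions, confidences):
    """S42 sketch: combine the valid sensor fixes into one real-time
    point location by a confidence-weighted mean."""
    total = sum(confidences)
    x = sum(c * p[0] for c, p in zip(confidences, sensor_positions)) / total
    y = sum(c * p[1] for c, p in zip(confidences, sensor_positions)) / total
    return (x, y)
```

A fix outside the area is simply excluded from `sensor_positions` before weighting, matching the valid/invalid split of S41.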
In an optional implementation manner, in a first aspect of the embodiment of the present invention, the aggregating the real-time operation positions to obtain an operation target path of the unmanned vehicle includes:
s51, sorting the point location real-time position information according to the sequence of the point location arrival time information to obtain unmanned vehicle point location list information;
s52, utilizing the unmanned vehicle point position list information to conduct path planning on unmanned vehicle point positions in the unmanned vehicle point position list, and obtaining an operation target path of the unmanned vehicle.
In a first aspect of the embodiment of the present invention, the navigation error discrimination model is used to process the operation target path to generate planned path information of the unmanned vehicle at the operation target point; driving the unmanned vehicle to travel according to the planned path information of the operation target point position to reach the final target point position, comprising:
s61, driving the unmanned vehicle to travel to the operation target point position by utilizing the operation target path;
s62, generating a target point position running information set of the unmanned vehicle by using the current position information of the unmanned vehicle and the running target path;
s63, processing the target point position running information set by using a navigation error judging model to obtain planned path information of the unmanned vehicle at the running target point position;
s64, driving the unmanned vehicle to travel according to the planned path information of the operation target point position, and reaching the next operation target point position;
s65, judging the reached operation target point, and completing the unmanned vehicle fusion navigation process when the operation target point is the final target point; when the operation target point is not the final target point, step S62 is performed.
In an optional implementation manner, in a first aspect of the embodiment of the present invention, the generating, using current location information of the unmanned vehicle and the running target path, a target point location running information set of the unmanned vehicle includes:
s621, acquiring current position information of the unmanned vehicle by using the sensor module;
S622, establishing a two-dimensional running coordinate system by taking the center of the unmanned vehicle as the origin, the direction of the vehicle head as the X axis, and the direction perpendicular to the vehicle head as the Y axis; the coordinates of the head of the unmanned vehicle in the two-dimensional running coordinate system are (X0, Y0);
s623, determining that the coordinate of the next operation target point of the unmanned vehicle is (X1, Y1) in a two-dimensional running coordinate system according to the current position information of the unmanned vehicle and the operation target path;
s624, processing the head coordinates (X0, Y0) of the unmanned vehicle and the coordinates (X1, Y1) of the operation target point by using the satellite positioning navigation sensor to generate first driving vector coordinates (X2, Y2);
s625, processing the unmanned vehicle head coordinates (X0, Y0) and the operation target point position coordinates (X1, Y1) by using the laser sensor to generate second running vector coordinates (X3, Y3);
s626, processing the unmanned vehicle head coordinates (X0, Y0) and the operation target point position coordinates (X1, Y1) by using the inertial navigation sensor to generate third driving vector coordinates (X4, Y4);
s627, constructing a target point location running information set of the unmanned vehicle by using the first running vector, the second running vector and the third running vector.
As an optional implementation manner, in the first aspect of the embodiment of the present invention, the navigation error discriminating model includes a satellite navigation error discriminating model and a laser navigation error discriminating model, including:
the expression of the satellite navigation error discrimination model is as follows:
∣X4-X2∣≤a,
∣Y4-Y2∣≤a,
∣(X2-X1)(Y2-Y1)∣≤a2,
wherein the inertial navigation error threshold is a, and the satellite navigation error area constraint threshold is a2;
the laser navigation error discrimination model has the expression:
∣X4-X3∣≤a,
∣Y4-Y3∣≤a,
∣(X3-X1)(Y3-Y1)∣≤a4,
wherein the laser navigation error area constraint threshold is a4.
In a first aspect of the embodiment of the present invention, the processing the target point location running information set by using a navigation error discrimination model to obtain planned path information of the unmanned vehicle at the running target point location includes:
s631, judging the target point location running information set by using a navigation error judging model; if the target point position running information set meets a satellite navigation error judging model, judging that the first running vector is valid, and if the target point position running information set does not meet the satellite navigation error judging model, judging that the first running vector is invalid; if the target point position running information set meets a laser navigation error judging model, judging that the second running vector is valid, and if the target point position running information set does not meet the laser navigation error judging model, judging that the second running vector is invalid;
s632, when only the third running vector is valid, using the third running vector as planned path information of the unmanned vehicle at the running target point; when only the first running vector and the third running vector are valid, the first running vector is used as planned path information of the unmanned vehicle at the running target point; when only the second running vector and the third running vector are valid, the second running vector is used as planning path information of the unmanned vehicle at the running target point; and when the first running vector, the second running vector and the third running vector are simultaneously effective, the three running vectors are processed by using a weighting criterion, so that the planned path information of the unmanned vehicle at the running target point position is obtained.
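The selection rule of step S632 can be sketched as follows (a minimal illustration assuming the validity flags from S631 are already known; the equal default weights are an assumption, since the patent requires only some weighting criterion):

```python
def select_planned_vector(v_sat, v_laser, v_inertial, sat_valid, laser_valid,
                          weights=(1/3, 1/3, 1/3)):
    """S632 sketch: pick the planned driving vector from the three
    candidate running vectors according to their validity."""
    if sat_valid and laser_valid:
        # all three vectors valid: combine them by the weighting criterion
        w1, w2, w3 = weights
        return (w1 * v_sat[0] + w2 * v_laser[0] + w3 * v_inertial[0],
                w1 * v_sat[1] + w2 * v_laser[1] + w3 * v_inertial[1])
    if sat_valid:
        return v_sat       # only the first and third vectors are valid
    if laser_valid:
        return v_laser     # only the second and third vectors are valid
    return v_inertial      # only the third (inertial) vector is valid
```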
The second aspect of the embodiment of the invention discloses a fusion navigation device of an unmanned vehicle, which comprises:
a memory storing executable program code;
a processor coupled to the memory;
the processor calls the executable program codes stored in the memory to execute the unmanned vehicle fusion navigation method according to the first aspect of the embodiment of the invention.
The third aspect of the embodiment of the invention discloses a computer storage medium, which stores computer instructions, and the computer instructions are used for executing the unmanned vehicle fusion navigation method according to the first aspect of the embodiment of the invention when being called.
Advantageous effects
1. The invention provides an unmanned vehicle fusion navigation method, wherein three types of sensors are used simultaneously in the navigation process, so that an unmanned vehicle for logistics transportation can obtain accurate driving logic in the operation process, and the unmanned vehicle fusion navigation method has the characteristics of few interference factors, simple logic, quick operation and good applicability.
2. In the method of the invention, the running path of the unmanned vehicle is completed through the selection of calibration points. Since other unmanned vehicles can themselves serve as calibration points, the unmanned vehicle can complete unmanned forklift goods-delivery work, or single-task multi-destination delivery work, during operation, making its goods delivery function more intelligent.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is evident that the drawings in the following description are only some embodiments of the present invention and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art.
FIG. 1 is a flow chart of a fusion navigation method of an unmanned vehicle of the present invention;
fig. 2 is a schematic diagram of basic components of the unmanned vehicle of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more clear, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It will be apparent that the described embodiments are some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
FIG. 1 is a flow chart of a fusion navigation method of an unmanned vehicle of the present invention; fig. 2 is a schematic diagram of basic components of the unmanned vehicle of the present invention.
The invention is further described below with reference to examples.
Example 1
The embodiment discloses a fusion navigation method of an unmanned vehicle, which comprises the following steps:
s1, acquiring a preset running path and a first running area of an unmanned vehicle, and storing the preset running path and the first running area in a navigation module of the unmanned vehicle;
s2, driving the unmanned vehicle to travel to a preset travel path end point in the first travel area according to the preset travel path;
s3, in the running process of the unmanned vehicle according to the preset running path, calibrating the point position of the preset running path by using a sensor module of the unmanned vehicle to obtain a calibrated position information set; the calibration position information set comprises position parameter information and point position arrival time information of a plurality of point positions acquired by the sensor module; the sensor module of the unmanned vehicle comprises a satellite positioning navigation sensor, a laser sensor and an inertial navigation sensor;
s4, performing position space judgment processing on the calibration position information set to obtain real-time operation position information of the unmanned vehicle in the first operation area;
s5, collecting the real-time operation positions to obtain an operation target path of the unmanned vehicle; the running target path comprises a plurality of running target point location information;
s6, processing the running target path by using a navigation error judging model to generate planning path information of the unmanned vehicle at the running target point position; driving the unmanned vehicle to run according to the planned path information of the running target point position, and reaching the final target point position; the navigation error discriminating model comprises a satellite navigation error discriminating model and a laser navigation error discriminating model.
A certain number of laser reflectors are arranged along the preset running path of the unmanned vehicle. By irradiating a reflector with its laser beam, the laser sensor obtains the angle and distance between the unmanned vehicle and that reflector.
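As a hedged sketch of how such a measurement could be turned into a position fix (the function name and the single-reflector simplification are assumptions; practical systems match several reflectors against a map for an unambiguous fix):

```python
import math

def position_from_reflector(reflector_xy, distance, bearing_rad):
    """Infer the vehicle position from the measured range and bearing
    to a reflector at a known map position: the vehicle lies `distance`
    away from the reflector, opposite the measured bearing."""
    rx, ry = reflector_xy
    return (rx - distance * math.cos(bearing_rad),
            ry - distance * math.sin(bearing_rad))
```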
In the running process of the unmanned vehicle according to the preset running path, the sensor module of the unmanned vehicle is utilized to perform point location calibration on the preset running path to obtain a calibrated position information set, and the method comprises the following steps:
s31, in the running process of the unmanned vehicle according to the preset running path, respectively utilizing the satellite positioning navigation sensor, the laser sensor and the inertial navigation sensor to acquire data of point position information of the unmanned vehicle according to a preset time interval, and obtaining position parameter information and point arrival time information of the point; the position parameter information of the point position comprises satellite positioning parameters, laser sensor parameters and inertial navigation parameters;
s32, fusion processing is carried out on the position parameter information of the point position and the point position arrival time information to obtain a calibration position information set.
The step S32 includes:
and fusing the position parameter information of the point position and the corresponding point position arrival time information to form a group of position sets, wherein all the position sets form a calibration position information set.
The step of performing position space judgment processing on the calibration position information set to obtain real-time operation position information of the unmanned vehicle in the first operation area comprises the following steps:
s41, judging the position parameter information of the point position in the calibration position information set, when the position parameter information is positioned in the first operation area, listing the position parameter information as effective sensor position parameter information, and when the position parameter information is positioned outside the first operation area, listing the position parameter information as ineffective sensor position parameter information;
s42, weighting the effective sensor position parameter information by using the sensor confidence coefficient to obtain point location real-time position information; the sensor confidence is determined according to the cumulative occurrence number of the invalid sensor position parameter information or the variance of the position parameter information;
the sensor confidence, determined from the cumulative number of occurrences of the invalid sensor position parameter information, comprises:
After the position parameter information of each point in the calibration position information set has been judged, the cumulative occurrence count of invalid sensor position parameter information is updated for each sensor, and the result of normalizing each sensor's cumulative count is used as that sensor's confidence.
The sensor confidence, determined from the variance of the location parameter information, comprises:
After the position parameter information of each point in the calibration position information set has been judged, the variance of all valid sensor position parameter information is calculated for each sensor, and the result of normalizing each variance is used as that sensor's confidence.
S43, constructing and obtaining real-time operation position information of the unmanned vehicle by using the point position real-time position information and the point position arrival time information.
The step S43 includes:
forming a real-time position group by utilizing the point position real-time position information and the point position arrival time information corresponding to the point position real-time position information; and constructing a set by utilizing all the real-time position sets, wherein the set is real-time operation position information of the unmanned vehicle.
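The variance-based confidence of S42 can be sketched as follows (the inverse-variance normalization is an assumption: the patent states only that the variance values are normalized, and a lower variance should plausibly yield a higher confidence):

```python
def confidences_from_variance(variances):
    """Turn per-sensor variances into normalized confidences that sum
    to 1, weighting low-variance (stable) sensors more heavily."""
    inv = [1.0 / v for v in variances]   # inverse variance per sensor
    total = sum(inv)
    return [w / total for w in inv]
```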
The step of collecting the real-time operation positions to obtain an operation target path of the unmanned vehicle comprises the following steps:
s51, sorting the point location real-time position information according to the sequence of the point location arrival time information to obtain unmanned vehicle point location list information;
s52, utilizing the unmanned vehicle point position list information to conduct path planning on unmanned vehicle point positions in the unmanned vehicle point position list, and obtaining an operation target path of the unmanned vehicle.
The step S52 includes:
and carrying out connection processing on the unmanned vehicle points according to the appearance sequence of the unmanned vehicle points by utilizing the unmanned vehicle point position list information to obtain an unmanned vehicle operation path, and taking the unmanned vehicle operation path as an operation target path of the unmanned vehicle.
The navigation error judging model is utilized to process the running target path, and planning path information of the unmanned vehicle at the running target point position is generated; driving the unmanned vehicle to travel according to the planned path information of the operation target point position to reach the final target point position, comprising:
s61, driving the unmanned vehicle to travel to the operation target point position by utilizing the operation target path;
s62, generating a target point position running information set of the unmanned vehicle by using the current position information of the unmanned vehicle and the running target path;
s63, processing the target point position running information set by using a navigation error judging model to obtain planned path information of the unmanned vehicle at the running target point position;
s64, driving the unmanned vehicle to travel according to the planned path information of the operation target point position, and reaching the next operation target point position;
s65, judging the reached operation target point, and completing the unmanned vehicle fusion navigation process when the operation target point is the final target point; when the operation target point is not the final target point, step S62 is performed.
In step S64, the unmanned vehicle is driven according to the planned path information of the operation target point. During travel, the sensors collect the position of the unmanned vehicle in real time and compare it with the point locations in the operation target path; when the distance between them is smaller than a preset threshold, the unmanned vehicle is judged to have reached the next operation target point.
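The arrival test described for step S64 is a simple distance threshold; a sketch (the function name and the use of Euclidean distance are assumptions):

```python
import math

def reached(current_xy, target_xy, threshold):
    """S64 sketch: the vehicle has arrived at the next operation target
    point when its distance to that point falls below the threshold."""
    dx = current_xy[0] - target_xy[0]
    dy = current_xy[1] - target_xy[1]
    return math.hypot(dx, dy) < threshold
```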
the generating a target point location running information set of the unmanned vehicle by using the current position information of the unmanned vehicle and the running target path comprises the following steps:
s621, acquiring current position information of the unmanned vehicle by using the sensor module;
S622, establishing a two-dimensional running coordinate system by taking the center of the unmanned vehicle as the origin, the direction of the vehicle head as the X axis, and the direction perpendicular to the vehicle head as the Y axis; the coordinates of the head of the unmanned vehicle in the two-dimensional running coordinate system are (X0, Y0);
s623, determining that the coordinate of the next operation target point of the unmanned vehicle is (X1, Y1) in a two-dimensional running coordinate system according to the current position information of the unmanned vehicle and the operation target path;
S624, processing the head coordinates (X0, Y0) of the unmanned vehicle and the coordinates (X1, Y1) of the operation target point by using the satellite positioning navigation sensor to generate first driving vector coordinates (X2, Y2); a driving vector is the movement required for the unmanned vehicle to reach the operation target point when navigating with the corresponding sensor. For example, a driving vector of (X2, Y2) from the satellite positioning navigation sensor means that, when navigating by satellite positioning, the unmanned vehicle must travel X2 along the X axis and Y2 along the Y axis to reach the operation target point.
S625, processing the unmanned vehicle head coordinates (X0, Y0) and the operation target point position coordinates (X1, Y1) by using the laser sensor to generate second running vector coordinates (X3, Y3);
s626, processing the unmanned vehicle head coordinates (X0, Y0) and the operation target point position coordinates (X1, Y1) by using the inertial navigation sensor to generate third driving vector coordinates (X4, Y4);
s627, constructing a target point location running information set of the unmanned vehicle by using the first running vector, the second running vector and the third running vector.
A running vector set is constructed from the first running vector, the second running vector and the third running vector, and this set is taken as the target point location running information set of the unmanned vehicle.
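The construction in S621–S627 can be sketched as below. The sensor names and the representation of each driving vector as simply (X1−X0, Y1−Y0), computed from that sensor's own estimate of the head position, are illustrative assumptions.

```python
def driving_vector(head_xy, target_xy):
    """Movement required to reach the running target point from the head
    position: (X1 - X0, Y1 - Y0) in the two-dimensional running frame."""
    return (target_xy[0] - head_xy[0], target_xy[1] - head_xy[1])

def build_travel_info_set(head_estimates, target_xy):
    """Target point location running information set: one driving vector
    per sensor, each computed from that sensor's estimate of the head
    coordinates (X0, Y0).  Sensor names here are illustrative."""
    return {sensor: driving_vector(xy, target_xy)
            for sensor, xy in head_estimates.items()}
```

For example, with head estimates `{"satellite": (0.0, 0.0), "laser": (0.5, 0.0), "inertial": (0.0, 0.5)}` and target (2.0, 1.0), the set contains the first, second and third driving vectors keyed by sensor.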
The navigation error discrimination model comprises a satellite navigation error discrimination model and a laser navigation error discrimination model, wherein:
the expression of the satellite navigation error discrimination model is as follows:
∣X4-X2∣≤a,
∣Y4-Y2∣≤a,
∣(X2-X1)(Y2-Y1)∣≤a2,
the inertial navigation error threshold value is a, and the satellite navigation error area constraint threshold value is a2;
the laser navigation error discrimination model has the expression:
∣X4-X3∣≤a,
∣Y4-Y3∣≤a,
∣(X3-X1)(Y3-Y1)∣≤a4,
wherein the laser navigation error area constraint threshold is a4.
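The two discrimination models above translate directly into code; variable names mirror the patent's X2…Y4, and the thresholds a, a2 and a4 are left as parameters since the text does not assign them values.

```python
def satellite_model_holds(sat_v, inertial_v, target_xy, a, a2):
    """Satellite navigation error discrimination model:
    |X4-X2| <= a, |Y4-Y2| <= a, |(X2-X1)(Y2-Y1)| <= a2."""
    (x2, y2), (x4, y4), (x1, y1) = sat_v, inertial_v, target_xy
    return (abs(x4 - x2) <= a and abs(y4 - y2) <= a
            and abs((x2 - x1) * (y2 - y1)) <= a2)

def laser_model_holds(laser_v, inertial_v, target_xy, a, a4):
    """Laser navigation error discrimination model:
    |X4-X3| <= a, |Y4-Y3| <= a, |(X3-X1)(Y3-Y1)| <= a4."""
    (x3, y3), (x4, y4), (x1, y1) = laser_v, inertial_v, target_xy
    return (abs(x4 - x3) <= a and abs(y4 - y3) <= a
            and abs((x3 - x1) * (y3 - y1)) <= a4)
```

In both models the inertial vector (X4, Y4) acts as the reference against which the satellite and laser vectors are checked, with the additional area constraint bounding the deviation from the target point.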
The processing the target point location running information set by using the navigation error discrimination model to obtain the planned path information of the unmanned vehicle at the running target point location comprises the following steps:
s631, judging the target point location running information set by using a navigation error judging model; if the target point position running information set meets a satellite navigation error judging model, judging that the first running vector is valid, and if the target point position running information set does not meet the satellite navigation error judging model, judging that the first running vector is invalid; if the target point position running information set meets a laser navigation error judging model, judging that the second running vector is valid, and if the target point position running information set does not meet the laser navigation error judging model, judging that the second running vector is invalid;
s632, when only the third running vector is valid, using the third running vector as the planned path information of the unmanned vehicle at the running target point; when only the first running vector and the third running vector are valid, using the first running vector as the planned path information of the unmanned vehicle at the running target point; when only the second running vector and the third running vector are valid, using the second running vector as the planned path information of the unmanned vehicle at the running target point; and when the first, second and third running vectors are all valid simultaneously, processing the three running vectors with a weighting criterion to obtain the planned path information of the unmanned vehicle at the running target point.
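The S631–S632 case analysis reduces to a short selection function. As the patent's cases read, the inertial (third) vector is always available, so it serves as the fallback; `fuse` stands in for the weighting criterion described afterwards and is a caller-supplied function here.

```python
def select_planned_path(v_sat, v_laser, v_inertial, sat_valid, laser_valid,
                        fuse=None):
    """S632 selection: all three valid -> weighted fusion; satellite
    (plus inertial) valid -> first vector; laser (plus inertial) valid
    -> second vector; otherwise fall back to the inertial vector."""
    if sat_valid and laser_valid:
        # Weighted fusion of the three vectors; without a fuse function
        # this sketch simply prefers the satellite vector.
        return fuse(v_sat, v_laser, v_inertial) if fuse else v_sat
    if sat_valid:
        return v_sat
    if laser_valid:
        return v_laser
    return v_inertial
```

The validity flags would come from the two discrimination models applied to the target point location running information set.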
The processing the three running vectors by using the weighting criteria to obtain the planned path information of the unmanned vehicle at the running target point location comprises the following steps:
determining a ratio value of a first running vector weight and a second running vector weight by using the number of the current service satellites of the satellite positioning navigation sensor and the number of the laser irradiation reflectors set by the running target point positions;
determining a third driving vector weight by utilizing the value range of the sum of the number of the current service satellites of the satellite positioning navigation sensor and the number of the laser irradiation reflectors set by the operation target point position;
solving to obtain a first running vector weight and a second running vector weight by using a constraint condition that the sum of the three running vector weights is 1;
and carrying out weighted summation operation on the three running vectors by utilizing the weights of the three running vectors to obtain the planned path information of the unmanned vehicle at the running target point.
The processing the three running vectors by using the weighting criteria to obtain the planned path information of the unmanned vehicle at the running target point location comprises the following steps:
the calculation formula of the weighting criteria is as follows:
X1=X2×ik+X3×k+X4×p,
Y1=Y2×ik+Y3×k+Y4×p,
wherein k is the weight coefficient of the second running vector, i is the ratio of the weight coefficient of the first running vector to that of the second running vector, and p is the weight coefficient of the third running vector; i and k are determined from the number of current service satellites of the satellite positioning navigation sensor and the number of laser-irradiated reflectors set at the running target point, as shown in Table 1.
Table 1 determination of the ratio of weighting coefficients
When the sum of the number of current service satellites of the satellite positioning navigation sensor and the number of laser-irradiated reflectors set at the running target point is greater than or equal to 6, p = 0.1; when this sum, denoted y, is smaller than 6, the value of p is 1 - 0.15y.
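The weighting criterion above can be sketched as follows. The rule for p is taken from the text (0.1 when the satellite count plus reflector count y is at least 6, otherwise 1 − 0.15y); the ratio i of the first weight to the second comes from Table 1, which is not reproduced here, so i is a hypothetical parameter; k is solved from the constraint that the three weights sum to one, i·k + k + p = 1.

```python
def fuse_vectors(v_sat, v_laser, v_inertial, n_satellites, n_reflectors,
                 i=1.0):
    """Weighted fusion of the first, second and third driving vectors.

    `i` (the Table 1 ratio) is an assumed parameter; p follows the
    stated rule, and k = (1 - p) / (i + 1) so that i*k + k + p = 1."""
    y = n_satellites + n_reflectors
    p = 0.1 if y >= 6 else 1.0 - 0.15 * y
    k = (1.0 - p) / (i + 1.0)
    return (v_sat[0] * i * k + v_laser[0] * k + v_inertial[0] * p,
            v_sat[1] * i * k + v_laser[1] * k + v_inertial[1] * p)
```

When all three vectors agree, the fused vector equals them regardless of the weights, which is a useful sanity check on the weight normalisation.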
One of the main application scenarios of unmanned vehicles is automated logistics transport. AGV is the abbreviation of "Automated Guided Vehicle": a transport vehicle equipped with an automatic guidance device, such as an electromagnetic or optical one, that can travel along a predetermined guidance path and provides safety protection and various transfer functions. The AGV automatically transports goods to a designated place by navigation; the most common navigation modes include magnetic-stripe guidance, laser guidance and ultra-high-frequency (UHF) RFID guidance. Magnetic-stripe navigation imposes certain restrictions on the site and affects its decoration style to some extent. UHF RFID navigation has moderate cost, high guidance precision and more convenient site setup; it can accommodate complex site layouts without changing the overall decoration environment, and it also offers a level of safety and stability that magnetic-stripe guidance and laser navigation do not.
The embodiment provides an unmanned vehicle fusion navigation method, three types of sensors are used simultaneously in the navigation process, so that an unmanned vehicle for logistics transportation can obtain accurate driving logic in the operation process, and the unmanned vehicle fusion navigation method has the characteristics of few interference factors, simple logic, quick operation and good applicability.
In the method disclosed in this embodiment, the running path of the unmanned vehicle is completed through the selection of calibration points, and other unmanned vehicles can themselves be used as calibration points during that selection, so that an unmanned vehicle can complete goods handover between vehicles or multi-destination delivery within a single task during operation, making the delivery function of the unmanned vehicle more intelligent.
Example two
The embodiment discloses a fusion navigation device of unmanned vehicles, the device includes:
a memory storing executable program code;
a processor coupled to the memory;
the processor invokes the executable program code stored in the memory to execute the unmanned vehicle fusion navigation method of embodiment one.
Example III
The embodiment discloses a computer storage medium, wherein the computer storage medium stores computer instructions, and the computer instructions are used for executing the unmanned vehicle fusion navigation method in the first embodiment when being called.
Example IV
This embodiment discloses a fusion navigation method of an unmanned vehicle, wherein the unmanned vehicle comprises sensors, a navigation module, a walking controller, a motor and the like, connected in sequence; the sensors comprise a satellite positioning navigation sensor, a laser sensor and an inertial navigation sensor. The navigation module forms the control system of the unmanned vehicle; the walking controller and the motor form its walking system, and the composition of the unmanned vehicle is shown in Fig. 2. The fusion navigation method of the unmanned vehicle comprises the following steps:
First, path calibration: the unmanned vehicle is driven manually along the target path to calibrate it, each point of the path is calibrated, and the parameters of each sensor are recorded at each point, as follows:
Point No. 1 (laser sensor point parameters (including measurement data of at least three reflectors: angle and distance differences), differential satellite sensor point parameters, inertial navigation point parameters);
Point No. 2 (laser sensor point parameters, differential satellite sensor point parameters, inertial navigation point parameters);
……
Point No. x (laser sensor point parameters, differential satellite sensor point parameters, inertial navigation point parameters);
The calibration generates data for each point, which is stored in the navigation module and simultaneously transmitted to the dispatching system;
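The per-point record produced by the calibration drive might be organised as below. The field names are illustrative, and the parameter dictionaries are left as empty placeholders for the sensor readings listed above.

```python
from dataclasses import dataclass

@dataclass
class CalibrationPoint:
    """One calibrated point of the target path, recorded during the
    manual calibration drive (field names are illustrative)."""
    number: int
    laser_params: dict       # >= 3 reflector readings: angle and distance differences
    satellite_params: dict   # differential satellite sensor point parameters
    inertial_params: dict    # inertial navigation point parameters

# The calibrated path is the ordered list of points 1..x, stored in the
# navigation module and transmitted to the dispatching system.
calibrated_path = [
    CalibrationPoint(1, {}, {}, {}),
    CalibrationPoint(2, {}, {}, {}),
]
```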
the real-time position of each unmanned vehicle in the running area is confirmed through the calibrated running-path positions and the electronic map stored in the navigation hardware;
collecting the real-time positions to obtain a running target path of the unmanned vehicle; the running target path comprises a plurality of running target point location information;
processing the operation target path by using a navigation error discrimination model to generate planning path information of the unmanned vehicle at the operation target point position; driving the unmanned vehicle to run according to the planned path information of the running target point position, and reaching the final target point position; the navigation error discriminating model comprises a satellite navigation error discriminating model and a laser navigation error discriminating model.
A database is constructed, each unmanned vehicle in the running area is named, a data storage subfolder is established in the database according to each vehicle's name, and the running data of each unmanned vehicle is stored separately in real time.
The navigation hardware is deployed in advance of unmanned vehicle operation and is used to perform path-image analysis of the unmanned vehicle running area; the running paths in the area are extracted from the path image, and the electronic map is designed according to these running paths and uploaded to the navigation hardware.
When the unmanned vehicle is driven according to the data stored in the navigation hardware, a calibration period and a calibration distance are set synchronously, and the running path of the unmanned vehicle is calibrated during operation according to the calibration period; the calibration distance is set to 1.5-2 m.
When the running path of the unmanned vehicle is calibrated, the satellite navigation positioning sensor, laser sensor and inertial navigation sensor deployed on the vehicle are applied, and, following the calibration period, the parameters captured by these sensors at each turnover of the calibration period are stored in the unmanned vehicle as calibration data.
The satellite navigation positioning sensor has a differential function.
Each time new unmanned vehicle running data is received, the data storage subfolder compares it with the existing data already stored there and identifies whether any stored data has running logic opposite to the newly received data; if so, the newly received data and the conflicting stored data are both deleted; if not, the newly received data is transmitted to and stored in the data storage subfolder.
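The subfolder behaviour just described can be sketched as follows. Since the patent does not define what makes two run records "opposite in running logic", the comparison is left as a caller-supplied predicate.

```python
def store_run_data(subfolder, new_record, is_opposite):
    """Compare a newly received run record with the records already in
    the vehicle's storage subfolder.  If any stored record has opposite
    running logic, delete those records and discard the new one;
    otherwise append the new record."""
    conflicts = [old for old in subfolder if is_opposite(old, new_record)]
    if conflicts:
        for old in conflicts:
            subfolder.remove(old)
        return False   # new record discarded together with its conflicts
    subfolder.append(new_record)
    return True
```

For example, with a predicate that treats "north" and "south" headings as opposite, storing a southbound record into a folder holding a northbound one deletes both.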
In summary, executing the steps of the method enables an unmanned vehicle used for logistics transport to obtain accurate driving logic during operation, with few interference factors, simple logic, fast operation and good applicability. In addition, the running path of the unmanned vehicle is completed through the selection of calibration points, and other unmanned vehicles can be used as calibration points during that selection, so that the vehicle can complete goods handover or multi-destination delivery of a single task during operation, making its delivery function more intelligent.
The above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (10)

1. The fusion navigation method of the unmanned vehicle is characterized by comprising the following steps of:
s1, acquiring a preset running path and a first running area of an unmanned vehicle, and storing the preset running path and the first running area in a navigation module of the unmanned vehicle;
s2, driving the unmanned vehicle to travel to a preset travel path end point in the first travel area according to the preset travel path;
s3, in the running process of the unmanned vehicle according to the preset running path, calibrating the point position of the preset running path by using a sensor module of the unmanned vehicle to obtain a calibrated position information set; the calibration position information set comprises position parameter information and point position arrival time information of a plurality of point positions acquired by the sensor module; the sensor module of the unmanned vehicle comprises a satellite positioning navigation sensor, a laser sensor and an inertial navigation sensor;
s4, performing position space judgment processing on the calibration position information set to obtain real-time operation position information of the unmanned vehicle in the first operation area;
s5, collecting the real-time operation positions to obtain an operation target path of the unmanned vehicle; the running target path comprises a plurality of running target point location information;
s6, processing the running target path by using a navigation error judging model to generate planning path information of the unmanned vehicle at the running target point position; driving the unmanned vehicle to run according to the planned path information of the running target point position, and reaching the final target point position; the navigation error discriminating model comprises a satellite navigation error discriminating model and a laser navigation error discriminating model.
2. The fusion navigation method of the unmanned vehicle according to claim 1, wherein the performing point calibration on the preset running path by using the sensor module of the unmanned vehicle during the running of the unmanned vehicle according to the preset running path to obtain the calibration position information set comprises:
s31, in the running process of the unmanned vehicle according to the preset running path, respectively utilizing the satellite positioning navigation sensor, the laser sensor and the inertial navigation sensor to acquire data of point position information of the unmanned vehicle according to a preset time interval, and obtaining position parameter information and point arrival time information of the point; the position parameter information of the point position comprises satellite positioning parameters, laser sensor parameters and inertial navigation parameters;
s32, fusion processing is carried out on the position parameter information of the point position and the point position arrival time information to obtain a calibration position information set.
3. The method for fusion navigation of an unmanned vehicle according to claim 1, wherein the performing a position space judgment process on the calibration position information set to obtain real-time operation position information of the unmanned vehicle in the first operation region comprises:
s41, judging the position parameter information of the point position in the calibration position information set, when the position parameter information is positioned in the first operation area, listing the position parameter information as effective sensor position parameter information, and when the position parameter information is positioned outside the first operation area, listing the position parameter information as ineffective sensor position parameter information;
s42, weighting the effective sensor position parameter information by using the sensor confidence coefficient to obtain point location real-time position information; the sensor confidence is determined according to the cumulative occurrence number of the invalid sensor position parameter information or the variance of the position parameter information;
s43, constructing and obtaining real-time operation position information of the unmanned vehicle by using the point position real-time position information and the point position arrival time information.
4. The fusion navigation method of the unmanned vehicle according to claim 1, wherein the collecting the real-time operation positions to obtain the operation target path of the unmanned vehicle comprises:
s51, sorting the point location real-time position information according to the sequence of the point location arrival time information to obtain unmanned vehicle point location list information;
s52, utilizing the unmanned vehicle point position list information to conduct path planning on unmanned vehicle point positions in the unmanned vehicle point position list, and obtaining an operation target path of the unmanned vehicle.
5. The fusion navigation method of the unmanned vehicle according to claim 1, wherein the navigation error discrimination model is used for processing the operation target path to generate planning path information of the unmanned vehicle at the operation target point; driving the unmanned vehicle to travel according to the planned path information of the operation target point position to reach the final target point position, comprising:
s61, driving the unmanned vehicle to travel to the operation target point position by utilizing the operation target path;
s62, generating a target point position running information set of the unmanned vehicle by using the current position information of the unmanned vehicle and the running target path;
s63, processing the target point position running information set by using a navigation error judging model to obtain planned path information of the unmanned vehicle at the running target point position;
s64, driving the unmanned vehicle to travel according to the planned path information of the operation target point position, and reaching the next operation target point position;
s65, judging the reached operation target point, and completing the unmanned vehicle fusion navigation process when the operation target point is the final target point; when the operation target point is not the final target point, step S62 is performed.
6. The method for fusion navigation of an unmanned vehicle according to claim 5, wherein generating the target point travel information set of the unmanned vehicle using the current position information of the unmanned vehicle and the travel target path comprises:
s621, acquiring current position information of the unmanned vehicle by using the sensor module;
s622, establishing a two-dimensional running coordinate system by taking the center of the unmanned vehicle as the origin, the direction of the head of the unmanned vehicle as the X axis, and the direction perpendicular to the head as the Y axis; the coordinates of the head of the unmanned vehicle in the two-dimensional running coordinate system are (X0, Y0);
s623, determining that the coordinate of the next operation target point of the unmanned vehicle is (X1, Y1) in a two-dimensional running coordinate system according to the current position information of the unmanned vehicle and the operation target path;
s624, processing the head coordinates (X0, Y0) of the unmanned vehicle and the coordinates (X1, Y1) of the operation target point by using the satellite positioning navigation sensor to generate first driving vector coordinates (X2, Y2);
s625, processing the unmanned vehicle head coordinates (X0, Y0) and the operation target point position coordinates (X1, Y1) by using the laser sensor to generate second running vector coordinates (X3, Y3);
s626, processing the unmanned vehicle head coordinates (X0, Y0) and the operation target point position coordinates (X1, Y1) by using the inertial navigation sensor to generate third driving vector coordinates (X4, Y4);
s627, constructing a target point location running information set of the unmanned vehicle by using the first running vector, the second running vector and the third running vector.
7. The fusion navigation method of the unmanned vehicle according to claim 6, wherein the navigation error discrimination model comprises a satellite navigation error discrimination model and a laser navigation error discrimination model, wherein:
the expression of the satellite navigation error discrimination model is as follows:
∣X4-X2∣≤a,
∣Y4-Y2∣≤a,
∣(X2-X1)(Y2-Y1)∣≤a2,
the inertial navigation error threshold value is a, and the satellite navigation error area constraint threshold value is a2;
the laser navigation error discrimination model has the expression:
∣X4-X3∣≤a,
∣Y4-Y3∣≤a,
∣(X3-X1)(Y3-Y1)∣≤a4,
wherein the laser navigation error area constraint threshold is a4.
8. The fusion navigation method of the unmanned vehicle according to claim 6, wherein the processing the target point location traveling information set by using a navigation error discrimination model to obtain planned path information of the unmanned vehicle at the operation target point location comprises:
s631, judging the target point location running information set by using a navigation error judging model; if the target point position running information set meets a satellite navigation error judging model, judging that the first running vector is valid, and if the target point position running information set does not meet the satellite navigation error judging model, judging that the first running vector is invalid; if the target point position running information set meets a laser navigation error judging model, judging that the second running vector is valid, and if the target point position running information set does not meet the laser navigation error judging model, judging that the second running vector is invalid;
s632, when only the third running vector is valid, using the third running vector as the planned path information of the unmanned vehicle at the running target point; when only the first running vector and the third running vector are valid, using the first running vector as the planned path information of the unmanned vehicle at the running target point; when only the second running vector and the third running vector are valid, using the second running vector as the planned path information of the unmanned vehicle at the running target point; and when the first, second and third running vectors are all valid simultaneously, processing the three running vectors with a weighting criterion to obtain the planned path information of the unmanned vehicle at the running target point.
9. A fusion navigation device for an unmanned vehicle, the device comprising:
a memory storing executable program code;
a processor coupled to the memory;
the processor invokes the executable program code stored in the memory to perform the unmanned vehicle fusion navigation method of any one of claims 1-8.
10. A computer storage medium storing computer instructions which, when invoked, are operable to perform the unmanned vehicle fusion navigation method of any one of claims 1-8.
CN202310271939.0A 2023-03-16 2023-03-16 Fusion navigation method and device for unmanned vehicle Pending CN116086447A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310271939.0A CN116086447A (en) 2023-03-16 2023-03-16 Fusion navigation method and device for unmanned vehicle


Publications (1)

Publication Number Publication Date
CN116086447A true CN116086447A (en) 2023-05-09

Family

ID=86201017



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination