CN117310756B - Multi-sensor fusion positioning method and system and machine-readable storage medium - Google Patents

Publication number
CN117310756B
Authority
CN
China
Prior art keywords
positioning
measurement data
auxiliary
target
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311617207.9A
Other languages
Chinese (zh)
Other versions
CN117310756A (en)
Inventor
刘腾君
秦亚飞
兰骏
徐文质
霍梦晨
Current Assignee
Ningbo Lutes Robotics Co ltd
Original Assignee
Ningbo Lutes Robotics Co ltd
Priority date
Filing date
Publication date
Application filed by Ningbo Lutes Robotics Co ltd filed Critical Ningbo Lutes Robotics Co ltd
Priority to CN202311617207.9A
Publication of CN117310756A
Application granted
Publication of CN117310756B
Active legal status
Anticipated expiration


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/01 Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/13 Receivers
    • G01S 19/22 Multipath-related issues
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS, GLONASS or GALILEO
    • G01S 19/40 Correcting position, velocity or attitude
    • G01S 19/41 Differential correction, e.g. DGPS [differential GPS]
    • G01S 19/42 Determining position
    • G01S 19/43 Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
    • G01S 19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S 19/49 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled

Abstract

The invention provides a multi-sensor fusion positioning method and system, and a machine-readable storage medium. The method comprises: separately acquiring target-end positioning measurement data, collected by a plurality of positioning sensors on a positioning target device, and auxiliary-end positioning measurement data, collected by a plurality of positioning sensors on an auxiliary mobile device; detecting the spatial environment states of the positioning target device and the auxiliary mobile device to determine whether a multipath effect is present at each; and, according to the multipath determination results for the two devices, selecting the target-end and/or auxiliary-end positioning measurement data for fusion calculation to obtain an optimal estimated position of the positioning target device. By fusing the target-end and auxiliary-end positioning measurement data, the scheme overcomes the observation bias caused by the multipath effect and achieves accurate positioning.

Description

Multi-sensor fusion positioning method and system and machine-readable storage medium
Technical Field
The invention relates to the field of navigation and positioning, and in particular to a multi-sensor fusion positioning method and system and a machine-readable storage medium.
Background
In autonomous positioning and navigation, mobile devices such as intelligent vehicles with automatic driving functions generally adopt a multi-sensor fusion positioning scheme. RTK (Real-Time Kinematic carrier-phase differential positioning), based on the differential positioning principle, can largely eliminate delay errors of satellite positioning signals at the transmitting end and along the transmission path (troposphere and ionosphere). Compared with a GNSS (Global Navigation Satellite System) single-point positioning scheme, RTK can improve absolute positioning accuracy from the meter level to the centimeter level, and has therefore been widely adopted.
However, RTK is still subject to interference from occlusion, multipath and the like. In urban environments dense with high-rise buildings in particular, multipath effects readily occur during satellite signal transmission and reception, which greatly restricts the use of RTK in urban scenes and limits the development of automatic driving technology in urban environments.
Disclosure of Invention
It is an object of the present invention to address positioning deviations caused by multipath effects due to environmental occlusion.
A further object of the present invention is to achieve continuous, accurate positioning of a positioning target device.
According to one aspect of the present invention, there is provided a multi-sensor fusion positioning method, including:
acquiring, respectively, target-end positioning measurement data collected by a plurality of positioning sensors arranged on a positioning target device, and auxiliary-end positioning measurement data collected by a plurality of positioning sensors arranged on an auxiliary mobile device;
detecting the spatial environment state of the positioning target device to determine whether a multipath effect is present at the positioning target device;
detecting the spatial environment state of the auxiliary mobile device to determine whether a multipath effect is present at the auxiliary mobile device; and
selecting the target-end and/or auxiliary-end positioning measurement data for fusion calculation according to the multipath determination results for the positioning target device and the auxiliary mobile device, to obtain an optimal estimated position of the positioning target device.
Optionally, the step of selecting the target-end and/or auxiliary-end positioning measurement data for fusion calculation includes:
when it is determined that neither the positioning target device nor the auxiliary mobile device is affected by a multipath effect, fusing the target-end and auxiliary-end positioning measurement data in a loosely coupled manner;
when the positioning target device is free of multipath effect but the auxiliary mobile device is affected, performing fusion calculation using the target-end positioning measurement data only;
when the positioning target device is affected by a multipath effect but the auxiliary mobile device is not, replacing the multipath-affected data in the target-end positioning measurement data with the corresponding data in the auxiliary-end positioning measurement data, and performing fusion calculation using the substituted target-end positioning measurement data; and
when both the positioning target device and the auxiliary mobile device are affected by a multipath effect, screening out the multipath-affected data from the target-end positioning measurement data and performing fusion calculation using the remaining target-end positioning measurement data.
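The four selection cases above form a simple decision table over the two multipath determinations. A minimal Python sketch (the enum and function names are illustrative, not from the patent):

```python
from enum import Enum

class FusionSource(Enum):
    LOOSE_COUPLE_BOTH = "fuse target-end and auxiliary-end data (loose coupling)"
    TARGET_ONLY = "use target-end data only"
    SUBSTITUTE_AUXILIARY = "replace multipath-affected target data with auxiliary data"
    FILTER_TARGET = "screen out multipath-affected target data"

def select_fusion_strategy(target_multipath: bool, auxiliary_multipath: bool) -> FusionSource:
    """Map the multipath determination results for the positioning target
    device and the auxiliary mobile device to one of the four fusion cases."""
    if not target_multipath and not auxiliary_multipath:
        return FusionSource.LOOSE_COUPLE_BOTH
    if not target_multipath and auxiliary_multipath:
        return FusionSource.TARGET_ONLY
    if target_multipath and not auxiliary_multipath:
        return FusionSource.SUBSTITUTE_AUXILIARY
    return FusionSource.FILTER_TARGET
```
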
Optionally, the positioning sensors configured on the positioning target device include at least an inertial navigation system, a real-time kinematic differential sensor, a wheel speed sensor and a relative positioning signal receiver, and the step of acquiring the target-end positioning measurement data includes:
collecting the attitude, velocity and position updates measured by the inertial navigation system configured on the positioning target device;
collecting the pseudo-range and carrier phase measured by the real-time kinematic differential sensor configured on the positioning target device;
collecting the wheel speed measured by the wheel speed sensor arranged on the positioning target device; and
acquiring the relative position of the positioning target device and the auxiliary mobile device measured by the relative positioning signal receiver arranged on the positioning target device; and
the positioning sensors configured on the auxiliary mobile device include at least an inertial navigation system and a real-time kinematic differential sensor, and the step of acquiring the auxiliary-end positioning measurement data includes:
collecting the attitude, velocity and position updates measured by the inertial navigation system configured on the auxiliary mobile device; and
collecting the pseudo-range and carrier phase measured by the real-time kinematic differential sensor configured on the auxiliary mobile device.
Optionally, the step of fusing the target-end and auxiliary-end positioning measurement data in a loosely coupled manner includes:
performing navigation solution, using a Kalman filter, on a linear observation equation of the error state constructed at the target end from the target-end positioning measurement data, to obtain a target-end optimal estimate;
performing navigation solution, using a Kalman filter, on a linear observation equation of the error state constructed at the auxiliary end from the auxiliary-end positioning measurement data, to obtain an auxiliary-end optimal estimate;
converting the auxiliary-end optimal estimate into auxiliary positioning data for the positioning target device according to the relative position of the positioning target device and the auxiliary mobile device; and
fusing the target-end optimal estimate with the auxiliary positioning data.
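The loose-coupling steps above can be illustrated with a simplified covariance-weighted combination of two position estimates, standing in for the Kalman measurement update. This is a sketch, not the patent's filter: `fuse_loose` and `aux_to_target_position` are hypothetical names, and the auxiliary estimate is assumed to already be resolved in the same navigation frame as the target estimate.

```python
import numpy as np

def aux_to_target_position(p_aux, r_rel):
    """Convert the auxiliary (drone) position estimate into auxiliary
    positioning data for the target, using the measured relative vector
    r_rel (auxiliary device -> target, same navigation frame)."""
    return np.asarray(p_aux) + np.asarray(r_rel)

def fuse_loose(p_target, P_target, p_aux, P_aux):
    """Covariance-weighted fusion of two independent position estimates:
    x = x1 + P1 (P1 + P2)^-1 (x2 - x1), a simplified stand-in for the
    Kalman filter's measurement update."""
    K = P_target @ np.linalg.inv(P_target + P_aux)  # gain on the innovation
    p_fused = p_target + K @ (p_aux - p_target)
    P_fused = P_target - K @ P_target
    return p_fused, P_fused
```

With equal covariances the fused position lands midway between the two estimates, and the fused covariance halves, which matches the intuition that two independent equally good observations are being combined.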
Optionally, the step of replacing the multipath-affected data in the target-end positioning measurement data with the corresponding data in the auxiliary-end positioning measurement data includes:
converting the attitude, velocity and position updates, pseudo-range and carrier phase of the auxiliary mobile device into substitute measurement data according to the relative position of the positioning target device and the auxiliary mobile device; and
replacing the corresponding data in the target-end positioning measurement data with the substitute measurement data, which then serve as the input data of the fusion calculation.
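The replacement step amounts to a keyed substitution over the measurement set. A minimal sketch, assuming the substitute values have already been converted per the relative position as described above (the function and key names are illustrative, not from the patent):

```python
def substitute_measurements(target_meas: dict, substitute_meas: dict, affected_keys) -> dict:
    """Return a copy of the target-end measurement dict in which every
    multipath-affected entry is replaced by the corresponding
    (already converted) auxiliary-end substitute value; unaffected
    entries and the original dict are left untouched."""
    out = dict(target_meas)
    for key in affected_keys:
        if key in substitute_meas:
            out[key] = substitute_meas[key]
    return out
```
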
Optionally, the auxiliary mobile device is an unmanned aerial vehicle, and the process of acquiring the auxiliary-end positioning measurement data further includes:
adjusting the flight state of the unmanned aerial vehicle according to the travelling state of the positioning target device, and planning the flight path of the unmanned aerial vehicle.
Optionally, the process of acquiring the auxiliary-end positioning measurement data further includes:
adjusting the output range of the signal broadcast by the unmanned aerial vehicle according to the road direction of the travel path of the positioning target device, so that the extent of the output range along the road direction exceeds its extent across the road width.
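One way to picture the elongated output range is an elliptical broadcast footprint whose semi-axis along the road exceeds the semi-axis across it. The patent does not specify a footprint shape, so the following is only a geometric sketch; the function name, parameters, and the heading convention (measured clockwise from north) are assumptions:

```python
import math

def in_broadcast_footprint(dx_east, dy_north, road_heading_rad,
                           along_road_half_len, cross_road_half_len):
    """Check whether a ground offset (metres east/north of the drone's
    ground point) lies inside an elliptical footprint elongated along the
    road direction (along_road_half_len > cross_road_half_len)."""
    c, s = math.cos(road_heading_rad), math.sin(road_heading_rad)
    along = dx_east * s + dy_north * c   # component along the road axis
    cross = dx_east * c - dy_north * s   # component across the road
    return (along / along_road_half_len) ** 2 + (cross / cross_road_half_len) ** 2 <= 1.0
```
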
Optionally, the auxiliary mobile device is an unmanned aerial vehicle configured to provide auxiliary-end positioning measurement data to positioning target devices within a set area.
Optionally, a vision sensor is further arranged on the positioning target device and/or the auxiliary mobile device; and
the step of detecting the spatial environment state of the positioning target device further comprises: acquiring an image of the surroundings of the positioning target device captured by the vision sensor, and determining from it the signal occlusion state of the positioning target device as its spatial environment state; and
the step of detecting the spatial environment state of the auxiliary mobile device further comprises: acquiring an image of the surroundings of the auxiliary mobile device captured by the vision sensor, and determining from it the signal occlusion state of the auxiliary mobile device as its spatial environment state.
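One plausible realization of the vision-based occlusion check, not specified by the patent, is to threshold the visible-sky fraction in an upward-facing camera image. The segmentation that produces `sky_mask` is assumed to exist upstream, and the threshold value is illustrative:

```python
import numpy as np

def multipath_suspected(sky_mask: np.ndarray, min_open_sky_ratio: float = 0.6) -> bool:
    """Decide whether a multipath effect should be suspected from the
    signal occlusion state. `sky_mask` is a boolean array marking open-sky
    pixels (from an assumed upstream segmentation step); when the visible
    sky fraction falls below the threshold, the receiver is treated as
    occluded and multipath is suspected."""
    open_ratio = float(sky_mask.mean())
    return open_ratio < min_open_sky_ratio
```
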
According to another aspect of the present invention, there is also provided a machine-readable storage medium having stored thereon a machine-executable program which, when executed by a processor, implements any of the above multi-sensor fusion positioning methods.
According to another aspect of the present invention, there is also provided a multi-sensor fusion positioning system, including:
a positioning target device, on which a plurality of positioning sensors are configured to collect target-end positioning measurement data;
an auxiliary mobile device, on which a plurality of positioning sensors are configured to collect auxiliary-end positioning measurement data; and
a fusion positioning apparatus comprising a memory, a processor, and a machine-executable program stored in the memory and run on the processor, the processor implementing any of the above multi-sensor fusion positioning methods when executing the machine-executable program.
According to the multi-sensor fusion positioning method and system of the invention, an auxiliary mobile device is added in addition to the plurality of positioning sensors configured on the positioning target device. The positioning target device collects target-end positioning measurement data; the auxiliary mobile device collects auxiliary-end positioning measurement data. By detecting the spatial environment states of the positioning target device and the auxiliary mobile device, it is determined whether each is affected by a multipath effect, and the target-end and/or auxiliary-end positioning measurement data are selected for fusion calculation according to the multipath determination results. Under this scheme, even when the surrounding environment produces a multipath effect, the fusion calculation of target-end and auxiliary-end positioning measurement data overcomes the observation bias caused by multipath and achieves accurate positioning.
Further, in the multi-sensor fusion positioning method and system of the invention, the positioning sensors configured on the positioning target device include an inertial navigation system (INS), a real-time kinematic (RTK) differential sensor, a wheel speed sensor (WSS) and a relative positioning signal receiver (e.g. a carrier communication signal receiver, ultrasonic communication signal receiver, laser signal receiver or radar signal receiver), and the positioning sensors configured on the auxiliary mobile device include an INS, an RTK differential sensor and the like. The positioning target device can solve for its relative position to the auxiliary mobile device and use it, together with the positioning result of the auxiliary mobile device, as input data for the positioning fusion. The scheme of the invention not only exploits the centimeter-level positioning performance of RTK, but also suppresses the multipath effect at its root, avoiding its interference with the positioning result.
Furthermore, the multi-sensor fusion positioning method and system of the invention use an unmanned aerial vehicle as the auxiliary mobile device, reducing the likelihood of multipath effects caused by occlusion from obstacles such as tall buildings, and can meet the high-precision positioning requirements of dense high-rise urban environments. The unmanned aerial vehicle can move along with the vehicle, or provide an auxiliary positioning function within a set area, making the application scenes more flexible.
Furthermore, compared with prior-art roadside positioning devices, the multi-sensor fusion positioning method and system of the invention reduce the number of devices required for integrated navigation and lower the difficulty of data processing.
The above, as well as additional objectives, advantages, and features of the present invention will become apparent to those skilled in the art from the following detailed description of a specific embodiment of the present invention when read in conjunction with the accompanying drawings.
Drawings
Some specific embodiments of the invention will be described in detail hereinafter by way of example and not by way of limitation with reference to the accompanying drawings. The same reference numbers will be used throughout the drawings to refer to the same or like parts or portions. It will be appreciated by those skilled in the art that the drawings are not necessarily drawn to scale. In the accompanying drawings:
FIG. 1 is a schematic illustration of an application of a multi-sensor fusion positioning system according to one embodiment of the invention;
FIG. 2 is a schematic block diagram of a multi-sensor fusion positioning system in accordance with an embodiment of the invention;
FIG. 3 is a data flow diagram of a multi-sensor fusion positioning system in accordance with an embodiment of the invention;
FIG. 4 is a schematic diagram of a multi-sensor fusion positioning method according to one embodiment of the invention; and
fig. 5 is a schematic diagram of data fusion under different spatial environment states in a multi-sensor fusion positioning method according to an embodiment of the present invention.
Detailed Description
It should be understood by those skilled in the art that the embodiments described below are only some embodiments of the present invention, but not all embodiments of the present invention, and the some embodiments are intended to explain the technical principles of the present invention and are not intended to limit the scope of the present invention. All other embodiments, which can be obtained by a person skilled in the art without any inventive effort, based on the embodiments provided by the present invention, shall still fall within the scope of protection of the present invention.
Fig. 1 is a schematic diagram of an application of a multi-sensor fusion positioning system 10 according to one embodiment of the invention, and fig. 2 is a schematic block diagram of the multi-sensor fusion positioning system 10, schematically showing its data flow. The multi-sensor fusion positioning system 10 of this embodiment may generally include: a positioning target device 100, an auxiliary mobile device 200, and a fusion positioning apparatus 300.
A plurality of positioning sensors 110 are configured on the positioning target device 100 and collect target-end positioning measurement data. The positioning target device 100 may be a vehicle, or another mobile device to be positioned, such as a ship or an aircraft, whose positioning signals may be affected by multipath effects. In some embodiments, the positioning sensors 110 configured on the positioning target device 100 include at least an inertial navigation system (INS), a real-time kinematic (RTK) differential sensor, a wheel speed sensor (WSS), and a relative positioning signal receiver (e.g. a carrier communication signal receiver, an ultrasonic communication signal receiver, a laser signal receiver, a radar signal receiver, etc.).
One arrangement of the positioning sensors 110 of the positioning target device 100 is as follows. The IMU acquires inertial data during movement of the positioning target device; the IMU data format may be: frame number, timestamp, linear acceleration, angular velocity. The WSS acquires the wheel speed of the positioning target device (vehicle); the number of wheel revolutions is calculated from the vehicle's wheel encoder data, with the data format: frame number, timestamp, revolutions. The RTK may be placed on top of the positioning target device to acquire satellite positioning signals and determine the absolute position of the vehicle. Other positioning sensors, such as vision sensors, lidar and millimeter-wave radar, may also be used, providing their respective positioning signals as further reference data.
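The record formats above (frame number, timestamp, payload) can be sketched as plain data classes, with a helper deriving linear wheel speed from two consecutive encoder records. The class names are illustrative, and the wheel-circumference parameter is an assumption, since the text only mentions counting revolutions:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ImuRecord:
    frame: int                         # frame number
    t: float                           # timestamp [s]
    accel: Tuple[float, float, float]  # linear acceleration [m/s^2]
    gyro: Tuple[float, float, float]   # angular velocity [rad/s]

@dataclass
class WssRecord:
    frame: int     # frame number
    t: float       # timestamp [s]
    turns: float   # accumulated wheel revolutions from the encoder

def wheel_speed(prev: WssRecord, cur: WssRecord, wheel_circumference_m: float) -> float:
    """Derive linear wheel speed [m/s] from two consecutive encoder records."""
    dt = cur.t - prev.t
    return (cur.turns - prev.turns) * wheel_circumference_m / dt
```
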
The relative positioning signal receiver 120 has a well-defined lever-arm value to the positioning center point, which can be used to lock onto the position of the auxiliary mobile device 200 (the unmanned aerial vehicle), so that the real-time relative position between the relative positioning signal receiver 120 and the relative positioning signal transmitter 220 can be determined. The relative positioning signal receiver 120 acquires and parses the positioning result information of the auxiliary mobile device 200, and passes the real-time relative position of the positioning target device 100 to the auxiliary mobile device 200, together with the positioning information of the auxiliary mobile device 200, to the fusion positioning apparatus.
A plurality of positioning sensors 210 are configured on the auxiliary mobile device 200 and collect auxiliary-end positioning measurement data. The auxiliary mobile device 200 may be an unmanned aerial vehicle, or another auxiliary platform in the air, on the ground, underground, on or under water whose positioning-signal transmission path differs from that of the positioning target device, so that multipath effects can be avoided. In some embodiments, the auxiliary mobile device 200 may also be movably deployed or fixedly installed in place. In some embodiments, the positioning sensors 210 configured on the auxiliary mobile device 200 include at least an inertial navigation system (INS), a real-time kinematic (RTK) differential sensor, and the like.
The auxiliary mobile device 200 may further be provided with a relative positioning signal transmitter 220 and a communication module (not shown), which broadcast positioning signals such as UWB (Ultra-Wide Band), ultrasonic, laser or radar signals; the communication module broadcasts the positioning information of the auxiliary mobile device 200 for reception by the positioning target device 100. In addition, the auxiliary mobile device 200 may carry other positioning sensors such as magnetometers and vision sensors.
One arrangement of the positioning sensors 210 of the auxiliary mobile device 200 is as follows. The IMU acquires inertial data during movement of the auxiliary mobile device; the IMU data format is: frame number, timestamp, linear acceleration, angular velocity. The RTK acquires satellite positioning signals and determines the absolute position of the auxiliary mobile device. Vision sensors may be installed around the auxiliary mobile device 200 to confirm whether a multipath effect may exist in the environment in which the auxiliary mobile device 200 receives RTK signals. The relative positioning signal transmitter 220 broadcasts the positioning signals of the auxiliary mobile device. The communication module communicates with the positioning target device 100 and broadcasts the INS/RTK positioning result of the auxiliary mobile device 200.
The fusion positioning apparatus 300 processes the data of the positioning target device 100 and the auxiliary mobile device 200. It may be provided on the positioning target device 100 or the auxiliary mobile device 200 as needed; alternatively, it may be arranged on a network service device and exchange data with the positioning target device 100 and the auxiliary mobile device 200 over a network.
The fusion positioning device 300 may include a memory 320, a processor 310, and a machine executable program 321 stored on the memory 320 and running on the processor 310, and the processor 310 implements the multi-sensor fusion positioning method of the present embodiment when executing the machine executable program 321. The processor 310 is adapted to execute stored instructions; memory 320 provides storage space for the operation of the instructions during operation; processor 310 may be a single core processor, a multi-core processor, a computing cluster, or any number of other configurations. Memory 320 may include Random Access Memory (RAM), read only memory, flash memory, or any other suitable storage system.
The following description mainly takes an unmanned aerial vehicle as the auxiliary mobile device 200 and a vehicle travelling on a road as the positioning target device 100; on this basis, those skilled in the art can apply the multi-sensor fusion positioning method of this embodiment to other positioning scenes. While the vehicle 100 travels on the road, signals from the positioning satellite 420 are occluded and reflected by obstacles 410 such as buildings, changing the propagation direction, amplitude, polarization and phase of the signals and introducing observation errors, i.e. the multipath effect. The unmanned aerial vehicle 200 relays the signals of the positioning satellite 420, and its flying height keeps it clear of the multipath effect.
The IMU (Inertial Measurement Unit), WSS, RTK and other positioning sensors of the positioning target device 100 (the vehicle end) provide position, velocity and attitude estimates from the vehicle's perspective; the IMU and RTK positioning module of the auxiliary mobile device 200 (the drone end) provide vehicle-end position and velocity estimates from the drone's perspective. The system 10 of this embodiment processes these data together with the environmental space state, conveniently and reasonably fuses the positioning information acquired by the vehicle 100 and the drone 200, and comprehensively exploits the positioning advantages of the sensors of both under their different viewing angles, realizing a multi-sensor fusion navigation positioning scheme with RTK multipath suppression. Particularly in driving scenarios with severe urban multipath effects, the system 10 can provide wide-range, high-precision, continuous positioning output for the vehicle 100 by suppressing the influence of RTK multipath on the positioning result.
The RTK signals acquired by the unmanned aerial vehicle 200 at high altitude are free from multipath interference; tightly coupling the INS data and the multipath-free RTK data with a Kalman filtering algorithm yields the optimal estimated position output of the unmanned aerial vehicle 200. After the positioning signal source of the unmanned aerial vehicle 200 is received by the vehicle 100, the relative position of the vehicle 100 and the unmanned aerial vehicle 200 is obtained by calculation, and the relative position together with the positioning result of the unmanned aerial vehicle 200 is used as a positioning observation of the vehicle 100 for positioning fusion. In some embodiments, the signal output range of the signal source broadcast by the unmanned aerial vehicle 200 may be set to be narrow perpendicular to the road direction and wide along the road direction, so as to reliably transmit the positioning information of the unmanned aerial vehicle 200 to the vehicle 100.
FIG. 3 is a data flow diagram of a multi-sensor fusion positioning system in accordance with one embodiment of the invention. The positioning sensor 110 of the vehicle 100 includes: IMU 111, WSS 112, RTK 113, vision sensor 114, and other positioning sensors 115. The positioning sensor 210 of the drone 200 includes: IMU 211, RTK 212, and vision sensor 213.
As shown in fig. 3, the IMU 111 performs error compensation on the raw gyroscope and accelerometer data, using the zero-bias errors estimated online by the Kalman filter, before INS mechanization; after the INS finishes its initial alignment, the updates of attitude, velocity, and position are completed according to the INS mechanization algorithm. The RTK 113 sensor obtains the pseudorange and carrier phase observation values of the RTK and performs measurement of the vehicle end. The WSS 112 obtains the wheel speed measurement and relies on nonholonomic constraints to measure the state of the vehicle end. The relative positioning signal receiver 120 receives the positioning ranging signal of the unmanned aerial vehicle 200, acquires the relative distance to the unmanned aerial vehicle 200, and performs measurement on it. The vision sensor 114 or other environment detection device determines the current environment by preset conditions and decides which of the two measurement sources is used for the system measurement: the vehicle-end RTK 113, or the unmanned aerial vehicle RTK 212 together with the relative ranging value. The IMU 111, WSS 112, RTK 113, and the final relative position estimate are fused through a Kalman filter, and the navigation state error and IMU sensor error of the system are optimally estimated to obtain the final PVA estimate of the vehicle 100. The other positioning sensors 115 are compatible with positioning information given by other positioning sources, such as laser, millimeter-wave radar, ultrasonic, visual, and map measurement sources, and perform fused output of these sensors. The navigation state error estimated by the Kalman filter is used for feedback correction of the navigation result of the INS mechanization, and at the same time the error compensation of the raw IMU observation values is completed.
Finally, navigation parameters such as position, velocity, and attitude are output.
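The feedback-correction loop described above can be sketched minimally. This is an illustrative simplification only; the function names, and the representation of states as plain arrays and dictionaries, are assumptions rather than the patent's implementation:

```python
import numpy as np

def compensate_imu(gyro_raw, accel_raw, gyro_bias, accel_bias):
    # Remove the zero biases estimated online by the Kalman filter
    # from the raw gyroscope and accelerometer samples.
    return gyro_raw - gyro_bias, accel_raw - accel_bias

def feedback_correct(nav_state, error_estimate):
    # Subtract the Kalman-estimated navigation state errors
    # (position, velocity, attitude) from the INS result.
    return {k: nav_state[k] - error_estimate.get(k, 0.0) for k in nav_state}

# Example: a 0.01 rad/s gyro zero bias estimated by the filter is removed.
gyro, accel = compensate_imu(np.array([0.11, 0.0, 0.0]),
                             np.array([0.0, 0.0, 9.81]),
                             np.array([0.01, 0.0, 0.0]),
                             np.zeros(3))
corrected = feedback_correct({"position": 100.0}, {"position": 0.4})
```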
In the unmanned aerial vehicle 200, the IMU 211 likewise performs error compensation on the raw gyroscope and accelerometer data, using the zero-bias errors estimated online by a Kalman filter, before INS mechanization; after the INS finishes its initial alignment, the INS mechanization algorithm completes the updates of attitude, velocity, and position. The RTK 212 obtains the pseudorange and carrier phase observations of the RTK and performs position measurement of the drone 200. The relative positioning signal transmitter 220 broadcasts the positioning information of the unmanned aerial vehicle 200 and broadcasts positioning ranging signals for the relative positioning signal receiver 120 to receive and use to calculate the relative position. The vision sensor 213 or other environment detection means determines whether there is a multipath effect by judging, through preset conditions, whether the current environment is shielded.
The unmanned aerial vehicle 200 may also determine its current flight status and flight altitude and perform its path planning through the vision sensor 213, an altimeter, and the communication module of the vehicle 100.
FIG. 4 is a schematic diagram of a multi-sensor fusion positioning method according to one embodiment of the invention. The multi-sensor fusion positioning method may generally include:
step S401, target end positioning measurement data and auxiliary end positioning measurement data are respectively acquired, wherein the target end positioning measurement data are acquired by a plurality of positioning sensors configured on the positioning target device, and the auxiliary end positioning measurement data are acquired by a plurality of positioning sensors configured on the auxiliary mobile device.
The positioning sensors configured on the positioning target device at least comprise: an inertial navigation system, a real-time dynamic differential sensor, a wheel speed sensor, and a relative positioning signal receiver, and the step of acquiring the target end positioning measurement data comprises the following steps: collecting the updated attitude, velocity, and position data measured by the inertial navigation system configured on the positioning target device; collecting the pseudorange and carrier phase measured by the real-time dynamic differential sensor configured on the positioning target device; collecting the wheel speed measured by the wheel speed sensor arranged on the positioning target device; and acquiring the relative position of the positioning target device and the auxiliary mobile device measured by the relative positioning signal receiver arranged on the positioning target device. That is, the target end positioning measurement data may include: the updated attitude, velocity, and position data measured by the inertial navigation system, the pseudorange and carrier phase measured by the real-time dynamic differential sensor, the wheel speed measured by the wheel speed sensor, and the relative position of the positioning target device and the auxiliary mobile device.
The positioning sensors configured on the auxiliary mobile device include at least: an inertial navigation system and a real-time dynamic differential sensor. The step of acquiring the auxiliary end positioning measurement data comprises the following steps: collecting the updated attitude, velocity, and position data measured by the inertial navigation system configured on the auxiliary mobile device; and acquiring the pseudorange and carrier phase measured by the real-time dynamic differential sensor configured on the auxiliary mobile device. That is, the auxiliary end positioning measurement data may include: the updated attitude, velocity, and position data, and the pseudorange and carrier phase.
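For illustration only, the two measurement-data groups enumerated above can be mirrored as plain containers; the class and field names below are hypothetical and merely restate the quantities listed in the text:

```python
from dataclasses import dataclass

@dataclass
class TargetEndMeasurements:
    attitude: tuple           # INS-updated attitude
    velocity: tuple           # INS-updated velocity
    position: tuple           # INS-updated position
    pseudorange: float        # RTK raw observation
    carrier_phase: float      # RTK raw observation
    wheel_speed: float        # wheel speed sensor measurement
    relative_position: tuple  # relative to the auxiliary mobile device

@dataclass
class AuxiliaryEndMeasurements:
    attitude: tuple
    velocity: tuple
    position: tuple
    pseudorange: float
    carrier_phase: float

sample = TargetEndMeasurements((0.0, 0.0, 0.0), (0.0, 0.0, 0.0),
                               (0.0, 0.0, 0.0), 2.1e7, 1.1e8,
                               12.5, (0.0, 0.0, 50.0))
```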
In step S402, the spatial environment state of the positioning target device is detected to determine whether the positioning target device has a multipath effect.
In step S403, the spatial environment status of the secondary mobile device is detected to determine whether the secondary mobile device has a multipath effect.
In some embodiments, visual sensors may also be provided on the positioning target device and/or the secondary mobile device; and the spatial environment state may be detected using a visual sensor. For example, step S402 may include: and acquiring an ambient environment image of the positioning target device acquired by the vision sensor, and determining a signal shielding state of the positioning target device by utilizing the ambient environment image to serve as a space environment state of the positioning target device. Step S403 may include: and acquiring an ambient environment image of the auxiliary mobile device acquired by the vision sensor, and determining a signal shielding state of the auxiliary mobile device by using the ambient environment image as a space environment state of the auxiliary mobile device. In the positioning scene of urban roads, the space environment state is mainly the shielding state of surrounding high buildings.
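As a hedged sketch of how a vision sensor could yield such a shielding decision, the fraction of visible sky in an upward-facing camera mask can be thresholded; the mask representation and the threshold value are assumptions, not taken from the patent:

```python
import numpy as np

def multipath_likely(sky_mask, occlusion_threshold=0.3):
    # sky_mask: boolean image mask, True where sky is visible.
    # Flag a likely multipath environment when too much of the
    # sky is shielded by surrounding structures.
    occluded_fraction = 1.0 - float(sky_mask.mean())
    return occluded_fraction > occlusion_threshold

open_sky = np.ones((4, 4), dtype=bool)       # unobstructed scene
urban_canyon = np.zeros((4, 4), dtype=bool)  # heavily shielded scene
```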
Step S404, selecting target end positioning measurement data and/or auxiliary end positioning measurement data to perform fusion calculation according to the multipath effect determination results of the positioning target equipment and the auxiliary mobile equipment, and obtaining an optimal estimated position result of the positioning target equipment.
Fig. 5 is a schematic diagram of data fusion under different spatial environment states in the multi-sensor fusion positioning method according to an embodiment of the present invention. Selecting the target end positioning measurement data and/or the auxiliary end positioning measurement data for fusion calculation may include the following steps:
In step S501, when it is determined that neither the positioning target device nor the auxiliary mobile device has a multipath effect, the target end positioning measurement data and the auxiliary end positioning measurement data are fused in a loose coupling manner. For example, when it is judged by a preset condition that the current vehicle end is unshielded and has no multipath effect, that is, the vehicle-end RTK signal is normal, and at the same time the unmanned aerial vehicle end is unshielded and has no multipath effect, that is, the unmanned-aerial-vehicle-end RTK signal is normally available, a Kalman filtering algorithm is adopted to fuse the sensor data of the vehicle end and the positioning data observation values provided by the unmanned aerial vehicle end in a loose coupling manner, and the result is output as the optimal estimated position result. The vehicle-end data and the unmanned-aerial-vehicle-end data may also be fused in a loose coupling manner or a tight coupling manner respectively, selected according to actual requirements.
In loose coupling, the positioning data sources operate independently and each outputs its own positioning result, and the positioning results are then fused. For example, fusing the target end positioning measurement data and the auxiliary end positioning measurement data by loose coupling refers to a positioning algorithm in which the vehicle end and the unmanned aerial vehicle end work independently, each providing its own positioning result, and the positioning result of the vehicle end is fused with the positioning information of the unmanned aerial vehicle end. In the specific implementation, the position and velocity of the vehicle end and the positioning result (position and velocity) of the unmanned aerial vehicle end are used as inputs of a Kalman filter; the difference between the two is compared, and an error model is established to correct the positioning result of the vehicle end, obtaining the fused velocity, position, and attitude. The loosely coupled navigation mode is relatively easy to implement and stable.
In tight coupling, the raw RTK measurement values, the pseudorange and pseudorange rate, are taken as observations, differenced with the corresponding values predicted by the INS, and the differences are fed back to the Kalman filter to estimate the errors of the IMU, obtaining the combined navigation result of velocity, position, and attitude. Because the tightly coupled navigation mode observes the raw RTK measurement values, richer and deeper operations can be realized: good and reliable raw observation data are screened out before fusion, so the obtained result is more accurate, but more computing power is required.
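The contrast between the two modes can be illustrated with a minimal one-dimensional sketch: loose coupling fuses an already-solved position fix, while tight coupling differences a raw pseudorange against the range predicted from the INS position. This is a didactic simplification with hypothetical function names, not the patent's implementation:

```python
import numpy as np

def loose_coupling_update(x_ins, P, z_pos, R):
    # Loose coupling: fuse an already-solved position fix with the
    # INS position via a scalar Kalman update.
    K = P / (P + R)
    x = x_ins + K * (z_pos - x_ins)
    return x, (1.0 - K) * P

def tight_coupling_innovation(pseudorange_meas, sat_pos, p_ins):
    # Tight coupling: difference a raw pseudorange against the
    # range predicted from the INS position.
    return pseudorange_meas - float(np.linalg.norm(sat_pos - p_ins))
```

With equal prior and measurement variances, the loose update lands halfway between the INS and the fix, which matches the intuition that equally trusted sources are averaged.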
In step S502, when it is determined that the positioning target device has no multipath effect while the auxiliary mobile device has a multipath effect, fusion calculation is performed using the target end positioning measurement data. For example, when it is judged by the preset condition that the current vehicle end is unshielded and has no multipath effect, that is, the vehicle-end RTK signal can be used normally, while the unmanned aerial vehicle end is shielded and a multipath effect exists, that is, the unmanned-aerial-vehicle-end RTK signal cannot be used normally, a Kalman filtering algorithm is adopted to fuse the sensor data observation values of the vehicle end in a loose coupling manner, and the result is output as the optimal estimated position result.
In step S503, when it is determined that the positioning target device has a multipath effect and the auxiliary mobile device does not, the data affected by the multipath effect in the target end positioning measurement data are replaced with the corresponding data in the auxiliary end positioning measurement data, and fusion calculation is performed using the replaced target end positioning measurement data. For example, when it is judged by the preset condition that the current vehicle end is shielded and a multipath effect exists, that is, the vehicle-end RTK signal cannot be used normally, while the unmanned aerial vehicle end is unshielded and has no multipath effect, that is, the unmanned-aerial-vehicle-end RTK signal can be used normally, the RTK of the unmanned aerial vehicle end is relayed and acquired: the RTK data of the vehicle end are replaced by the relative position of the unmanned aerial vehicle and the vehicle as the satellite positioning signal, fused with the other sensors of the vehicle end in a loose coupling manner using a Kalman filtering algorithm, and output as the optimal estimated position result.
In step S504, when it is determined that both the positioning target device and the auxiliary mobile device have multipath effects, the data affected by the multipath effect in the target end positioning measurement data are screened out, and fusion calculation is performed using the remaining target end positioning measurement data. For example, when it is judged by the preset conditions that both the current vehicle end and the unmanned aerial vehicle end have multipath effects and the RTK signals cannot be used normally, multi-sensor fusion is performed with a Kalman filtering algorithm using the vehicle-end sensors other than the RTK, and the result is output as the optimal estimated position result.
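The four cases of steps S501 to S504 amount to a two-bit decision table, which can be sketched as a simple dispatch function; the mode labels are hypothetical names for the fusion strategies described in the text:

```python
def choose_fusion_strategy(target_multipath: bool, aux_multipath: bool) -> str:
    # Steps S501-S504: map the two shielding decisions to a fusion mode.
    if not target_multipath and not aux_multipath:
        return "loose_couple_target_and_aux"   # S501
    if not target_multipath and aux_multipath:
        return "target_end_only"               # S502
    if target_multipath and not aux_multipath:
        return "replace_target_rtk_with_aux"   # S503
    return "target_end_without_rtk"            # S504
```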
When it is determined that neither the positioning target device nor the auxiliary mobile device has a multipath effect, the step of fusing the target end positioning measurement data and the auxiliary end positioning measurement data in a loose coupling manner may include the following steps: performing navigation calculation on a linear observation equation of the error state constructed at the target end by the Kalman filtering method according to the target end positioning measurement data, to obtain the target-end optimal estimate; performing navigation calculation on a linear observation equation of the error state constructed at the auxiliary end by the Kalman filtering method according to the auxiliary end positioning measurement data, to obtain the auxiliary-end optimal estimate; converting the auxiliary-end optimal estimate into auxiliary positioning data of the positioning target device according to the relative positions of the positioning target device and the auxiliary mobile device; and fusing the target-end optimal estimate with the auxiliary positioning data.
When it is determined that the positioning target device has a multipath effect and the auxiliary mobile device does not, the step of replacing the data affected by the multipath effect in the target end positioning measurement data with the corresponding data in the auxiliary end positioning measurement data may include: converting the updated attitude, velocity, and position data of the auxiliary mobile device and the pseudorange and carrier phase into replacement measurement data according to the relative positions of the positioning target device and the auxiliary mobile device; and replacing the corresponding data in the target end positioning measurement data with the replacement measurement data as input data of the fusion calculation.
In an embodiment in which the auxiliary mobile device is an unmanned aerial vehicle, the unmanned aerial vehicle may move along with the positioning target device, that is, in a process of acquiring the positioning measurement data of the auxiliary end, the method may further include: and adjusting the flight state of the unmanned aerial vehicle according to the running state of the positioning target equipment, and planning the flight path of the unmanned aerial vehicle.
Considering the signal transmission characteristics between the unmanned aerial vehicle and the positioning target device on an urban road, the process of acquiring the auxiliary end positioning measurement data further includes: adjusting the output range of the signal sent by the unmanned aerial vehicle to the positioning target device according to the road direction of the driving path of the positioning target device, so that the size of the output range along the road direction exceeds the size along the road width direction.
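One possible way to model such a road-aligned output range is an ellipse whose semi-axis along the road exceeds the semi-axis across it; the numeric extents and function name below are illustrative assumptions:

```python
def in_broadcast_footprint(offset_along_road, offset_across_road,
                           half_extent_along=200.0, half_extent_across=30.0):
    # Elliptical footprint elongated along the road: the vehicle's
    # offsets from the drone's ground point are resolved along and
    # across the road direction (metres).
    return ((offset_along_road / half_extent_along) ** 2
            + (offset_across_road / half_extent_across) ** 2) <= 1.0
```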
Alternatively, in other embodiments, the drone is configured to provide auxiliary positioning measurement data for positioning target devices within a set area. For example, unmanned aerial vehicles may be arranged in road areas where multipath effects occur, and vehicles traveling in the surrounding area can use these unmanned aerial vehicles for positioning and navigation.
In the method of this embodiment, the data processing process of the vehicle end may include:
and respectively establishing a discrete time system error state model at a vehicle end, then constructing a linear observation equation of an error state, and performing integrated navigation calculation by using a basic equation of Kalman filtering to obtain the PVA (position, velocity, attitude) optimal estimation of the vehicle end. The system state of the vehicle end is that Wherein->Position, speed, attitude error of three axes respectively, < >>Zero bias of accelerometer and gyroscope in IMU.
Based on the initial position, velocity, and attitude, and through the measurement data output by the current IMU, the INS calculates the position, velocity, and attitude information of the point to be measured at each update instant, achieving the inertial navigation update:

$$\dot{C}_b^n = C_b^n\,[\omega_{nb}^b \times], \qquad \dot{v}^n = C_b^n f^b - \left(2\omega_{ie}^n + \omega_{en}^n\right)\times v^n + g^n, \qquad \dot{p} = v^n$$

where $p$, $v^n$, $C_b^n$ represent the position, velocity, and attitude information updated by the IMU; $f^b$ represents the acceleration (specific force) vector measured by the accelerometer; $g^n$ indicates the local gravity; and $\omega_{nb}^b$, $\omega_{ie}^n$, $\omega_{en}^n$ respectively represent the instantaneous angular velocity vector of the carrier coordinate system relative to the navigation coordinate system, the angular velocity vector caused by the Earth's rotation in the navigation coordinate system, and the angular velocity of the navigation coordinate system caused by the carrier's motion over the Earth's surface;
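A minimal discrete-time sketch of one mechanization step can be written as follows; the Earth-rate and transport-rate terms are omitted for brevity, so this is only a simplification of the mechanization described above, with hypothetical names:

```python
import numpy as np

def skew(w):
    # Cross-product (skew-symmetric) matrix of a 3-vector.
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def ins_step(p, v, C_bn, f_b, w_nb_b, g_n, dt):
    # One simplified mechanization step: attitude, then velocity,
    # then position update.
    C_bn = C_bn @ (np.eye(3) + skew(w_nb_b) * dt)
    v_new = v + (C_bn @ f_b + g_n) * dt
    p_new = p + v_new * dt
    return p_new, v_new, C_bn

# Stationary, level carrier: the measured specific force cancels gravity,
# so velocity and position remain zero.
p, v, C = ins_step(np.zeros(3), np.zeros(3), np.eye(3),
                   np.array([0.0, 0.0, 9.81]), np.zeros(3),
                   np.array([0.0, 0.0, -9.81]), 0.01)
```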
Based on the system state quantity, the Kalman model is established as follows.

Prediction:

$$\hat{x}_{k|k-1} = \Phi_{k-1}\,\hat{x}_{k-1}, \qquad P_{k|k-1} = \Phi_{k-1} P_{k-1} \Phi_{k-1}^{\mathrm{T}} + Q_{k-1}$$

Update:

$$K_k = P_{k|k-1} H_k^{\mathrm{T}} \left( H_k P_{k|k-1} H_k^{\mathrm{T}} + R_k \right)^{-1}$$

$$\hat{x}_k = \hat{x}_{k|k-1} + K_k \left( z_k - H_k \hat{x}_{k|k-1} \right), \qquad P_k = \left( I - K_k H_k \right) P_{k|k-1}$$

where $\hat{x}_{k-1}$, $\hat{x}_{k|k-1}$ are the optimal estimate at the previous instant and the one-step prediction of the current state, respectively; $P_{k-1}$, $P_{k|k-1}$ are the state estimation covariance at the previous instant and the predicted covariance at the current instant; $\Phi_{k-1}$, $H_k$ are the state transition matrix and the observation matrix; and $Q_{k-1}$, $R_k$, $K_k$, $z_k$ are the process noise covariance, the measurement noise covariance, the Kalman gain, and the measurement value, respectively. With this model, appropriate parameters are set for the different sensors to perform fusion positioning.
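The prediction and update steps can be sketched directly with NumPy; this is a generic textbook Kalman filter, with the matrices standing in for the state transition, observation, and noise covariance quantities named above:

```python
import numpy as np

def kf_predict(x, P, Phi, Q):
    # One-step state and covariance prediction.
    return Phi @ x, Phi @ P @ Phi.T + Q

def kf_update(x_pred, P_pred, z, H, R):
    # Innovation covariance, Kalman gain, state correction,
    # and covariance update.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P

# One-state example: prior 0 with unit variance, measurement 1 with
# unit noise variance; the posterior lands halfway, at 0.5.
x, P = kf_predict(np.zeros(1), np.eye(1), np.eye(1), np.zeros((1, 1)))
x, P = kf_update(x, P, np.array([1.0]), np.eye(1), np.eye(1))
```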
As described above for the different multipath situations: when the RTK signals of both the vehicle end and the unmanned aerial vehicle end are normally available, a Kalman filtering algorithm is adopted, and the sensor data of the vehicle end and the positioning data observation values provided by the unmanned aerial vehicle end are fused together in a loose coupling manner and output as the optimal estimated position result. When the vehicle-end RTK signal is normal and the unmanned-aerial-vehicle-end RTK signal is abnormal, a Kalman filtering algorithm is adopted, and the sensor data observation values of the vehicle end are fused in a loose coupling manner and output as the optimal estimated position result. When the vehicle-end RTK signal is abnormal and the unmanned-aerial-vehicle-end RTK signal is normal, the RTK of the unmanned aerial vehicle end is relayed and acquired: the RTK data of the vehicle end are replaced by the relative position of the unmanned aerial vehicle and the vehicle as the satellite positioning signal, fused with the other sensors of the vehicle end in a loose coupling manner using a Kalman filtering algorithm, and output as the optimal estimated position result. When the RTK signals of both the vehicle end and the unmanned aerial vehicle end are abnormal, multi-sensor fusion is performed with a Kalman filtering algorithm using the vehicle-end sensors other than the RTK, and the result is output as the optimal estimated position result.
In the method of this embodiment, the data processing process of the unmanned aerial vehicle side may include:
the system state of the unmanned aerial vehicle is thatWhereinPosition, speed, attitude error of three axes respectively, < >>Zero offset of the IMU accelerometer and the gyroscope respectively. And the unmanned aerial vehicle end also adopts a Kalman filtering algorithm to carry out optimal estimation of the state quantity.
After the positioning signal source of the unmanned aerial vehicle terminal is received by the vehicle terminal, the relative position of the vehicle terminal and the unmanned aerial vehicle terminal can be obtained through resolving, and the relative position and the positioning result of the unmanned aerial vehicle are used as the positioning observation of the vehicle terminal to perform positioning fusion of the vehicle terminal.
The ranging and measuring process from the vehicle end to the unmanned aerial vehicle end comprises the following steps:
According to the position, velocity, and attitude calculated at the vehicle end and the coordinate values of the unmanned aerial vehicle positioning, the estimated distance from the vehicle end to the unmanned aerial vehicle end is calculated; the difference between the raw distance measurement obtained from the ranging signal and the estimated distance is taken, and the obtained difference is used as the measurement input of the Kalman filter.
The state quantity of the system is the vehicle-end error state $x$ described above.

The measurement vector of the system is

$$z = \begin{bmatrix} \hat{\rho}_1 - \rho_1 & \hat{\rho}_2 - \rho_2 & \cdots & \hat{\rho}_n - \rho_n \end{bmatrix}^{\mathrm{T}}$$

where $\hat{\rho}_i$ represent the estimated distances from the vehicle end to the unmanned aerial vehicles, $\rho_i$ represent the measured distances from the vehicle end to the unmanned aerial vehicle end, and $n$ represents the number of unmanned aerial vehicles participating in the calculation.

The measurement matrix of the system contains, for the $i$-th unmanned aerial vehicle, the unit line-of-sight row

$$H_i = \frac{\left( p_v - p_{d,i} \right)^{\mathrm{T}}}{\left\| p_v - p_{d,i} \right\|}$$

where $p_{d,i}$ is the coordinate of the $i$-th unmanned aerial vehicle in the ECEF coordinate system (Earth-Centered, Earth-Fixed rectangular coordinate system), and $p_v$ is the position of the vehicle end in the ECEF coordinate system. Through this model, the distance measurements from the vehicle end to the unmanned aerial vehicle end measure the vehicle-end position. Those skilled in the art can achieve positioning in other kinds of coordinate systems on this basis.
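Assuming ECEF coordinates for both ends, the measurement vector (estimated minus measured range per drone) and the geometry rows of the measurement matrix can be sketched as follows; the function name is hypothetical:

```python
import numpy as np

def range_innovations_and_geometry(vehicle_ecef, drone_ecef_list,
                                   measured_ranges):
    # Per drone: estimated-minus-measured range, and the unit
    # line-of-sight vector (the geometry row of the measurement matrix).
    z, H = [], []
    for p_d, rho_meas in zip(drone_ecef_list, measured_ranges):
        diff = vehicle_ecef - p_d
        rho_est = float(np.linalg.norm(diff))
        z.append(rho_est - rho_meas)
        H.append(diff / rho_est)
    return np.array(z), np.array(H)

# Vehicle at [3, 4, 0], one drone at the origin, measured range 5 m:
z, H = range_innovations_and_geometry(np.array([3.0, 4.0, 0.0]),
                                      [np.zeros(3)], [5.0])
```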
According to the above description, an application procedure of the multi-sensor fusion positioning method of the present embodiment can be summarized as follows:
at the vehicle end, the combined navigation positioning of the RTK/INS/WSS and other sensors performs the optimal estimation of the vehicle-end state quantity (PVA); at the same time, the spatial environment state is judged to determine the multipath-effect situation, and a decision is made as to which combination of the vehicle RTK and the relayed positioning result of the unmanned aerial vehicle is used as the measurement for fusion;
at the unmanned aerial vehicle end, RTK/INS multi-sensor integrated navigation is used to determine the optimal position estimate of the unmanned aerial vehicle end in ECEF, and the corresponding result is broadcast; at the same time, a relative positioning ranging signal is sent to the vehicle;
the vehicle end uses the relative positioning signal receiver to acquire the positioning ranging signal of the unmanned aerial vehicle and the relative positioning information broadcast by the unmanned aerial vehicle, which are passed as measurement data into the system state quantity for combined navigation calculation.
Through the above process, the complementary characteristics of the vehicle signal view angle and the unmanned aerial vehicle signal view angle are utilized: the influence of the multipath effect is suppressed by modifying the propagation path of the RTK positioning signal, and an optimized combined positioning result is obtained by means of multi-sensor fusion, greatly improving the performance and usability of the automatic driving positioning algorithm in urban high-rise scenes.
The flowcharts provided by this embodiment are not intended to indicate that the operations of the method must be performed in any particular order, or that all of the operations of the method are included in every case. Furthermore, the method may include additional operations. Additional variations may be made to the above-described methods within the scope of the technical ideas provided by this embodiment.
By now it should be appreciated by those skilled in the art that while a number of exemplary embodiments of the invention have been shown and described herein in detail, many other variations or modifications of the invention consistent with the principles of the invention may be directly ascertained or inferred from the present disclosure without departing from the spirit and scope of the invention. Accordingly, the scope of the present invention should be understood and deemed to cover all such other variations or modifications.

Claims (10)

1. A multi-sensor fusion positioning method comprises the following steps:
respectively acquiring target end positioning measurement data and auxiliary end positioning measurement data, wherein the target end positioning measurement data is acquired by a plurality of positioning sensors arranged on positioning target equipment, and the auxiliary end positioning measurement data is acquired by a plurality of positioning sensors arranged on auxiliary mobile equipment;
detecting the space environment state of the positioning target device to determine whether the positioning target device has a multipath effect;
detecting a spatial environment state of the auxiliary mobile device to determine whether a multipath effect exists for the auxiliary mobile device;
according to the multipath effect determination result of the positioning target device and the auxiliary mobile device, selecting target end positioning measurement data and/or the auxiliary end positioning measurement data to perform fusion calculation to obtain an optimal estimated position result of the positioning target device, wherein the step of selecting the target end positioning measurement data and/or the auxiliary end positioning measurement data to perform fusion calculation includes:
under the condition that the positioning target equipment and the auxiliary mobile equipment are judged to have no multipath effect, fusing the target end positioning measurement data and the auxiliary end positioning measurement data in a loose coupling mode;
Under the condition that the positioning target equipment is judged to have no multipath effect and the auxiliary mobile equipment has multipath effect, fusion calculation is carried out by using the target end positioning measurement data;
under the condition that the positioning target equipment has multipath effect and the auxiliary mobile equipment does not have multipath effect, replacing data influenced by the multipath effect in the target end positioning measurement data by using corresponding data in the auxiliary end positioning measurement data, and performing fusion calculation by using the replaced target end positioning measurement data;
and screening out the data influenced by the multipath effect in the target end positioning measurement data under the condition that the multipath effect exists in the positioning target equipment and the auxiliary mobile equipment, and carrying out fusion calculation by using the screened out target end positioning measurement data.
2. The multi-sensor fusion positioning method according to claim 1, wherein,
the positioning sensor configured on the positioning target device at least comprises: an inertial navigation system, a real-time dynamic differential sensor, a wheel speed sensor, and a relative positioning signal receiver, wherein the step of acquiring target end positioning measurement data comprises the following steps of:
collecting updated data of the attitude, the velocity and the position measured by an inertial navigation system configured on the positioning target device;
collecting pseudo-range and carrier phase measured by a real-time dynamic differential sensor configured on the positioning target equipment;
collecting wheel speeds measured by wheel speed sensors arranged on the positioning target equipment;
acquiring relative positions of the positioning target device and the auxiliary mobile device measured by a relative positioning signal receiver arranged on the positioning target device, and
the positioning sensor configured on the auxiliary mobile device comprises at least: the inertial navigation system, the real-time dynamic differential sensor, and the step of obtaining the auxiliary end positioning measurement data comprises the following steps:
collecting updated data of the attitude, the velocity and the position measured by an inertial navigation system configured on the auxiliary mobile device;
and acquiring pseudo-range and carrier phase measured by a real-time dynamic differential sensor configured on the auxiliary mobile device.
3. The multi-sensor fusion positioning method according to claim 2, wherein the step of fusing the target end positioning measurement data and the auxiliary end positioning measurement data in a loosely coupled manner comprises:
performing navigation calculation on a linear observation equation of the error state constructed at the target end by a Kalman filtering method according to the target end positioning measurement data, to obtain a target end optimal estimate;
performing navigation calculation on a linear observation equation of the error state constructed at the auxiliary end by a Kalman filtering method according to the auxiliary end positioning measurement data, to obtain an auxiliary end optimal estimate;
converting the auxiliary end optimal estimate into auxiliary positioning data for the positioning target device according to the relative position of the positioning target device and the auxiliary mobile device; and
fusing the target end optimal estimate with the auxiliary positioning data.
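The loose coupling of claim 3 runs a separate error-state Kalman filter on each end and then combines the two estimates. The patent does not give formulas, so the following is a minimal numerical sketch only, assuming a linear observation model and an information-weighted fusion rule; the function names `kalman_update` and `fuse_loose` are hypothetical:

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """One measurement update of a linear (error-state) Kalman filter:
    x, P = state estimate and covariance; z = measurement; H, R = model."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

def fuse_loose(x_t, P_t, x_a, P_a, rel_pos):
    """Translate the auxiliary estimate onto the target by the measured
    relative position, then combine the two position estimates with
    covariance (information) weighting."""
    x_a_at_target = x_a + rel_pos
    info_t, info_a = np.linalg.inv(P_t), np.linalg.inv(P_a)
    P_fused = np.linalg.inv(info_t + info_a)
    x_fused = P_fused @ (info_t @ x_t + info_a @ x_a_at_target)
    return x_fused, P_fused
```

With equal covariances the fused position is the midpoint of the two estimates, which is the expected degenerate case of information weighting.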
4. The multi-sensor fusion positioning method according to claim 2, wherein the step of replacing the data affected by the multipath effect in the target end positioning measurement data with the corresponding data in the auxiliary end positioning measurement data comprises:
converting the attitude, velocity and position update data of the auxiliary mobile device into substitute measurement data according to the relative position of the positioning target device and the auxiliary mobile device; and
replacing the corresponding data in the target end positioning measurement data with the substitute measurement data, which then serves as input data for the fusion calculation.
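The substitution step of claim 4 can be sketched as a keyed replacement, assuming the auxiliary position is mapped into the target's frame by adding the measured relative offset and that velocity is directly transferable; the dictionary keys and function name are assumptions of this sketch, not terms from the patent:

```python
import numpy as np

def substitute_multipath_data(target_meas, aux_meas, rel_pos, affected_keys):
    """Replace multipath-affected entries of the target-end measurement dict
    with auxiliary-end measurements converted into the target's frame."""
    out = dict(target_meas)  # leave unaffected entries untouched
    for key in affected_keys:
        if key == "position":
            # auxiliary position plus measured auxiliary-to-target offset
            out[key] = np.asarray(aux_meas[key]) + np.asarray(rel_pos)
        else:
            # velocity/attitude assumed directly transferable in this sketch
            out[key] = aux_meas[key]
    return out
```

The returned dictionary is what would then feed the fusion calculation in place of the corrupted target-end data.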
5. The multi-sensor fusion positioning method according to claim 1, wherein
the auxiliary mobile device is an unmanned aerial vehicle, and the process of acquiring the auxiliary end positioning measurement data further comprises:
adjusting the flight state of the unmanned aerial vehicle and planning the flight path of the unmanned aerial vehicle according to the driving state of the positioning target device.
6. The multi-sensor fusion positioning method according to claim 5, wherein the process of acquiring the auxiliary end positioning measurement data further comprises:
adjusting the output range of the signal sent by the unmanned aerial vehicle to the positioning target device according to the road direction of the driving path of the positioning target device, so that the extent of the output range along the road direction exceeds its extent along the road width direction.
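The anisotropic output range of claim 6 can be modeled, under the assumption of an elliptical signal footprint whose semi-major axis is aligned with the road heading, as follows; the function names, the default extents, and the heading convention (degrees clockwise from north) are all illustrative choices, not specified by the patent:

```python
import math

def footprint_semi_axes(road_heading_deg, along_m=120.0, across_m=40.0):
    """Build an elliptical footprint whose along-road extent exceeds its
    across-road extent, as claim 6 requires."""
    assert along_m > across_m, "claim 6: along-road size must exceed road-width size"
    return {"heading_rad": math.radians(road_heading_deg),
            "semi_major": along_m, "semi_minor": across_m}

def point_in_footprint(fp, dx_east, dy_north):
    """Check whether an (east, north) offset from the UAV lies inside the footprint."""
    c, s = math.cos(fp["heading_rad"]), math.sin(fp["heading_rad"])
    u = dx_east * s + dy_north * c   # component along the road direction
    v = dx_east * c - dy_north * s   # component across the road
    return (u / fp["semi_major"]) ** 2 + (v / fp["semi_minor"]) ** 2 <= 1.0
```

For a road heading due north, a vehicle 100 m ahead along the road is covered while one 100 m off to the side is not, reflecting the claimed elongation.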
7. The multi-sensor fusion positioning method according to claim 1, wherein
the auxiliary mobile device is an unmanned aerial vehicle, and the unmanned aerial vehicle is configured to provide the auxiliary end positioning measurement data to positioning target devices within a set area range.
8. The multi-sensor fusion positioning method according to claim 1, wherein
a vision sensor is further configured on the positioning target device and/or the auxiliary mobile device; and
the step of detecting the spatial environment state of the positioning target device further comprises: acquiring a surrounding environment image of the positioning target device captured by the vision sensor, and determining the signal shielding state of the positioning target device from the surrounding environment image as the spatial environment state of the positioning target device; and
the step of detecting the spatial environment state of the auxiliary mobile device further comprises: acquiring a surrounding environment image of the auxiliary mobile device captured by the vision sensor, and determining the signal shielding state of the auxiliary mobile device from the surrounding environment image as the spatial environment state of the auxiliary mobile device.
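In the simplest reading of claim 8, the vision-based shielding check reduces to thresholding the open-sky fraction of a segmented surrounding-environment image. The binary-mask input, the threshold value, and the two state labels below are assumptions of this sketch; the patent does not specify how the image is classified:

```python
import numpy as np

def signal_shielding_state(sky_mask, open_sky_threshold=0.5):
    """Classify the signal shielding state from a binary sky mask of the
    surrounding-environment image (1 = open-sky pixel, 0 = obstruction)."""
    open_fraction = float(np.asarray(sky_mask, dtype=float).mean())
    return "open" if open_fraction >= open_sky_threshold else "shielded"
```

A "shielded" result for the target device would trigger the multipath substitution of claim 4, while the same check on the auxiliary device guards the quality of the substitute data.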
9. A machine-readable storage medium having stored thereon a machine-executable program which, when executed by a processor, implements the multi-sensor fusion positioning method of any one of claims 1 to 8.
10. A multi-sensor fusion positioning system comprising:
a positioning target device and a plurality of positioning sensors configured on the positioning target device and configured to acquire target end positioning measurement data;
an auxiliary mobile device and a plurality of positioning sensors configured on the auxiliary mobile device and configured to acquire auxiliary end positioning measurement data; and
a fusion positioning device comprising a memory, a processor, and a machine-executable program stored in the memory and running on the processor, wherein the processor implements the multi-sensor fusion positioning method according to any one of claims 1 to 8 when executing the machine-executable program.
CN202311617207.9A 2023-11-30 2023-11-30 Multi-sensor fusion positioning method and system and machine-readable storage medium Active CN117310756B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311617207.9A CN117310756B (en) 2023-11-30 2023-11-30 Multi-sensor fusion positioning method and system and machine-readable storage medium


Publications (2)

Publication Number Publication Date
CN117310756A CN117310756A (en) 2023-12-29
CN117310756B true CN117310756B (en) 2024-03-29

Family

ID=89281563

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311617207.9A Active CN117310756B (en) 2023-11-30 2023-11-30 Multi-sensor fusion positioning method and system and machine-readable storage medium

Country Status (1)

Country Link
CN (1) CN117310756B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7719461B1 (en) * 2008-08-05 2010-05-18 Lockheed Martin Corporation Track fusion by optimal reduced state estimation in multi-sensor environment with limited-bandwidth communication path
CN111221018A (en) * 2020-03-12 2020-06-02 南京航空航天大学 GNSS multi-source information fusion navigation method for inhibiting marine multipath
CN113405545A (en) * 2021-07-20 2021-09-17 阿里巴巴新加坡控股有限公司 Positioning method, positioning device, electronic equipment and computer storage medium
WO2021213432A1 (en) * 2020-04-21 2021-10-28 北京三快在线科技有限公司 Data fusion
CN114396943A (en) * 2022-01-12 2022-04-26 国家电网有限公司 Fusion positioning method and terminal
CN115097508A (en) * 2022-06-17 2022-09-23 东南大学 Satellite/inertia deep coupling method with multipath error estimator
CN115950418A (en) * 2022-12-09 2023-04-11 青岛慧拓智能机器有限公司 Multi-sensor fusion positioning method
CN116399351A (en) * 2023-04-23 2023-07-07 燕山大学 Vehicle position estimation method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10295365B2 (en) * 2016-07-29 2019-05-21 Carnegie Mellon University State estimation for aerial vehicles using multi-sensor fusion

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
UAV velocity estimation and positioning based on multi-sensor data fusion; Lv Tao; Zhang Changli; Wang Shuwen; Wang Runtao; Zhang Ling; Liu Chao; Luan Jiling; Zhou Yanan; Journal of Agricultural Mechanization Research, No. 10, pp. 1-5 *
Research on a UWB positioning system for UAVs in GNSS-denied environments; Li Jiansheng et al.; Microcontrollers & Embedded Systems, Vol. 22, No. 10, pp. 67-71 *


Similar Documents

Publication Publication Date Title
EP3566021B1 (en) Systems and methods for using a global positioning system velocity in visual-inertial odometry
CN101382431B (en) Positioning system and method thereof
US9488480B2 (en) Method and apparatus for improved navigation of a moving platform
EP0679973B1 (en) Integrated vehicle positioning and navigations system, apparatus and method
US6496778B1 (en) Real-time integrated vehicle positioning method and system with differential GPS
US7400956B1 (en) Satellite position and heading sensor for vehicle steering control
JP5673071B2 (en) Position estimation apparatus and program
US8497798B2 (en) Device and method for three-dimensional positioning
US20120059554A1 (en) Automatic Blade Control System during a Period of a Global Navigation Satellite System ...
CA2733032C (en) Method and apparatus for improved navigation of a moving platform
US8922426B1 (en) System for geo-location
US20230033404A1 (en) 3d lidar aided global navigation satellite system and the method for non-line-of-sight detection and correction
KR20160038319A (en) Method for displaying location of vehicle
EP3056926B1 (en) Navigation system with rapid gnss and inertial initialization
JP2011033413A (en) Wireless device
US20240085567A1 (en) System and method for correcting satellite observations
CN115683094A (en) Vehicle-mounted double-antenna tight coupling positioning method and system in complex environment
WO2017039000A1 (en) Moving body travel trajectory measuring system, moving body, and measuring program
JP7111298B2 (en) Satellite selection device and program
JP7201219B2 (en) Positioning device, velocity measuring device, and program
CN117310756B (en) Multi-sensor fusion positioning method and system and machine-readable storage medium
El-Mowafy et al. Reliable positioning and journey planning for intelligent transport systems
JP7392839B2 (en) Measuring device, measuring method, and program
JP2019168257A (en) Moving body information estimation device and program
JP3569015B2 (en) GPS navigation device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant