CN110398765B - Positioning method and device and unmanned equipment - Google Patents

Positioning method and device and unmanned equipment

Info

Publication number
CN110398765B
CN110398765B (granted publication of application CN201810375538.9A)
Authority
CN
China
Prior art keywords
data
positioning
time
sampled
point cloud
Prior art date
Legal status
Active
Application number
CN201810375538.9A
Other languages
Chinese (zh)
Other versions
CN110398765A (en)
Inventor
张金凤
吴迪
李雨倩
董秋伟
黄玉玺
Current Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Original Assignee
Beijing Jingdong Qianshi Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Jingdong Qianshi Technology Co Ltd
Priority claimed from CN201810375538.9A
Publication of CN110398765A
Application granted
Publication of CN110398765B
Status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
    • G01S19/49 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled

Abstract

The present disclosure provides a positioning method, a positioning apparatus, and an unmanned device. The positioning apparatus samples, at a predetermined sampling interval Δt, speed information provided by a wheel speedometer, angular velocity information provided by an inertial navigation system, positioning information provided by a positioning system, first point cloud data provided by a lidar sensor, and second point cloud data provided by a vision sensor; judges, using the sampling results at times t and t+Δt, whether the positioning information, the first point cloud data and the second point cloud data sampled at time t+Δt are valid; and, when they are valid, performs positioning processing using the valid positioning information, first point cloud data and second point cloud data. By jointly processing the positioning results fed back by the different positioning modes, the method and apparatus can provide high-precision positioning information.

Description

Positioning method and device and unmanned equipment
Technical Field
The present disclosure relates to the field of positioning, and in particular to a positioning method, a positioning apparatus, and an unmanned device.
Background
Currently, in order to achieve safe and stable operation of unmanned devices, positioning and orientation control is implemented using GPS (Global Positioning System), inertial navigation systems, laser SLAM (Simultaneous Localization And Mapping), visual SLAM, landmark (feature point) positioning, wheel speed meters (odometers), and the like.
However, a GPS receiver board is affected by multipath and occlusion in urban road environments, so its positioning accuracy can be poor. The board may even report a good positioning state while the actual accuracy is poor. A combined GPS/INS scheme inherits these board problems, so its positioning and orientation accuracy degrades in certain environments. Laser SLAM, visual SLAM and landmark positioning are affected by the operating environment (illumination, temperature, etc.) and cannot provide effective positioning in every environment. A wheel speed meter requires heading information from other equipment to perform dead reckoning; when that equipment cannot provide an accurate heading, or the meter's own scale factor is inaccurate, its positioning accuracy is poor.
Disclosure of Invention
One technical problem that embodiments of the present disclosure solve is that existing approaches cannot provide a high-precision positioning service for unmanned equipment.
In accordance with an aspect of one or more embodiments of the present disclosure, a positioning method is provided, including: sampling, at a predetermined sampling interval Δt, speed information provided by a wheel speedometer, angular velocity information provided by an inertial navigation system, positioning information provided by a positioning system, first point cloud data provided by a lidar sensor, and second point cloud data provided by a vision sensor; judging, using the sampling results at times t and t+Δt, whether the positioning information, the first point cloud data and the second point cloud data sampled at time t+Δt are valid; and, when they are valid, performing positioning processing using the valid positioning information, first point cloud data and second point cloud data.
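As a sketch of the method just summarized, the following Python example strings the two validity checks together for scalar (along-track) positions and headings. All names, the trapezoidal form of the moving distance S, and the thresholds are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class Samples:
    """One sampling instant; field names are illustrative."""
    v: float   # wheel speed (wheel speedometer)
    wn: float  # angular velocity (inertial navigation system)
    pg: float  # position from the positioning system (e.g. GPS)
    hg: float  # heading from the positioning system
    pl: float  # position from lidar point cloud data
    hl: float  # heading from lidar point cloud data
    ps: float  # position from vision point cloud data
    hs: float  # heading from vision point cloud data

def validate(prev: Samples, cur: Samples, dt: float,
             ds: float = 0.5, dh: float = 0.1) -> bool:
    """Two-stage check described above: positions are compared
    against the dead-reckoned distance S, then headings against
    the lidar relative heading HL."""
    s = 0.5 * (prev.v + cur.v) * dt  # assumed form of the distance S
    if not all(abs(abs(c - p) - s) < ds for p, c in
               [(prev.pg, cur.pg), (prev.pl, cur.pl), (prev.ps, cur.ps)]):
        return False
    hl = cur.hl - prev.hl  # relative heading HL from lidar
    return all(abs((c - p) - hl) < dh for p, c in
               [(prev.hg, cur.hg), (prev.hs, cur.hs)])
```

If every source passes both checks, the samples at t+Δt are declared valid and handed to the positioning step; otherwise, as the later claims describe, the offending samples are regenerated by dead reckoning.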
Optionally, the positioning information includes positioning data Pg and orientation data Hg; the first point cloud data includes positioning data Pl and orientation data Hl; and the second point cloud data includes positioning data Ps and orientation data Hs.
Optionally, judging whether the positioning information, the first point cloud data and the second point cloud data sampled at time t+Δt are valid using the sampling results at times t and t+Δt includes: determining the moving distance S of the unmanned device within the sampling interval Δt using the sampled speed and angular velocity information; judging, using the moving distance S, whether the positioning data Pg_{k+1}, Pl_{k+1} and Ps_{k+1} sampled at time t+Δt are valid; when Pg_{k+1}, Pl_{k+1} and Ps_{k+1} are valid, determining the relative heading HL of the unmanned device within the sampling interval Δt using the orientation data in the sampled first point cloud data; judging, using the relative heading HL, whether the orientation data Hg_{k+1} and Hs_{k+1} sampled at time t+Δt are valid; and, when Hg_{k+1} and Hs_{k+1} are valid, determining that the positioning information, the first point cloud data and the second point cloud data sampled at time t+Δt are valid.
Optionally, judging whether the positioning data Pg_{k+1}, Pl_{k+1} and Ps_{k+1} sampled at time t+Δt are valid using the moving distance S includes: determining a moving distance Sg using the positioning data Pg_k sampled at time t and the positioning data Pg_{k+1} sampled at time t+Δt; determining a moving distance Sl using Pl_k and Pl_{k+1}; determining a moving distance Ss using Ps_k and Ps_{k+1}; determining that Pg_{k+1} is valid when the difference between Sg and S is within a predetermined range; determining that Pl_{k+1} is valid when the difference between Sl and S is within a predetermined range; and determining that Ps_{k+1} is valid when the difference between Ss and S is within a predetermined range.
Optionally, when the difference between the moving distance Sg and the moving distance S is not within the predetermined range, Pg_{k+1} is determined to be invalid and is regenerated using S and the positioning data Pg_k; when the difference between Sl and S is not within the predetermined range, Pl_{k+1} is determined to be invalid and is regenerated using S and Pl_k; and when the difference between Ss and S is not within the predetermined range, Ps_{k+1} is determined to be invalid and is regenerated using S and Ps_k.
Optionally, judging whether the orientation data Hg_{k+1} and Hs_{k+1} sampled at time t+Δt are valid using the relative heading HL includes: determining a relative heading HG using the orientation data Hg_k sampled at time t and the orientation data Hg_{k+1} sampled at time t+Δt; determining a relative heading HS using Hs_k and Hs_{k+1}; determining that Hg_{k+1} is valid when the difference between HG and HL is within a predetermined range; and determining that Hs_{k+1} is valid when the difference between HS and HL is within a predetermined range.
Optionally, when the difference between the relative heading HG and the relative heading HL is not within the predetermined range, Hg_{k+1} is determined to be invalid and is regenerated using the orientation data Hg_k, the sampling interval Δt, and the angular velocity Wn_k sampled at time t; when the difference between HS and HL is not within the predetermined range, Hs_{k+1} is determined to be invalid and is regenerated using Hs_k, Δt and Wn_k.
Optionally, determining the moving distance S of the unmanned device within the sampling interval Δt using the sampled speed and angular velocity information includes: determining the moving distance S of the unmanned device from time t to time t+Δt using the speed V_k and angular velocity Wn_k sampled at time t and the speed V_{k+1} and angular velocity Wn_{k+1} sampled at time t+Δt.
Optionally, determining the relative heading HL of the unmanned device within the sampling interval Δt using the orientation data in the sampled first point cloud data includes: determining HL as the difference between the orientation data Hl_{k+1} sampled at time t+Δt and the orientation data Hl_k sampled at time t.
In accordance with another aspect of one or more embodiments of the present disclosure, a positioning apparatus is provided, including: a sampling module configured to sample, at a predetermined sampling interval Δt, speed information provided by a wheel speedometer, angular velocity information provided by an inertial navigation system, positioning information provided by a positioning system, first point cloud data provided by a lidar sensor, and second point cloud data provided by a vision sensor; an information identification module configured to judge, using the sampling results at times t and t+Δt, whether the positioning information, the first point cloud data and the second point cloud data sampled at time t+Δt are valid; and a positioning processing module configured to perform positioning processing using the valid positioning information, first point cloud data and second point cloud data when they are valid.
Optionally, the positioning information includes positioning data Pg and orientation data Hg; the first point cloud data includes positioning data Pl and orientation data Hl; and the second point cloud data includes positioning data Ps and orientation data Hs.
Optionally, the information identification module is configured to: determine the moving distance S of the unmanned device within the sampling interval Δt using the sampled speed and angular velocity information; judge, using the moving distance S, whether the positioning data Pg_{k+1}, Pl_{k+1} and Ps_{k+1} sampled at time t+Δt are valid; when Pg_{k+1}, Pl_{k+1} and Ps_{k+1} are valid, determine the relative heading HL of the unmanned device within the sampling interval Δt using the orientation data in the sampled first point cloud data; judge, using the relative heading HL, whether the orientation data Hg_{k+1} and Hs_{k+1} sampled at time t+Δt are valid; and, when Hg_{k+1} and Hs_{k+1} are valid, determine that the positioning information, the first point cloud data and the second point cloud data sampled at time t+Δt are valid.
Optionally, the information identification module is configured to: determine a moving distance Sg using the positioning data Pg_k sampled at time t and the positioning data Pg_{k+1} sampled at time t+Δt; determine a moving distance Sl using Pl_k and Pl_{k+1}; determine a moving distance Ss using Ps_k and Ps_{k+1}; determine that Pg_{k+1} is valid when the difference between Sg and the moving distance S is within a predetermined range; determine that Pl_{k+1} is valid when the difference between Sl and S is within a predetermined range; and determine that Ps_{k+1} is valid when the difference between Ss and S is within a predetermined range.
Optionally, the information identification module is configured to: determine that Pg_{k+1} is invalid when the difference between the moving distance Sg and the moving distance S is not within the predetermined range, and regenerate Pg_{k+1} using S and Pg_k; determine that Pl_{k+1} is invalid when the difference between Sl and S is not within the predetermined range, and regenerate Pl_{k+1} using S and Pl_k; and determine that Ps_{k+1} is invalid when the difference between Ss and S is not within the predetermined range, and regenerate Ps_{k+1} using S and Ps_k.
Optionally, the information identification module is configured to: determine a relative heading HG using the orientation data Hg_k sampled at time t and Hg_{k+1} sampled at time t+Δt; determine a relative heading HS using Hs_k and Hs_{k+1}; determine that Hg_{k+1} is valid when the difference between HG and the relative heading HL is within a predetermined range; and determine that Hs_{k+1} is valid when the difference between HS and HL is within a predetermined range.
Optionally, the information identification module is further configured to: determine that Hg_{k+1} is invalid when the difference between the relative heading HG and the relative heading HL is not within the predetermined range, and regenerate Hg_{k+1} using the orientation data Hg_k, the sampling interval Δt, and the angular velocity Wn_k sampled at time t; and determine that Hs_{k+1} is invalid when the difference between HS and HL is not within the predetermined range, and regenerate Hs_{k+1} using Hs_k, Δt and Wn_k.
Optionally, the information identification module is further configured to determine the moving distance S of the unmanned device from time t to time t+Δt using the speed V_k and angular velocity Wn_k sampled at time t and the speed V_{k+1} and angular velocity Wn_{k+1} sampled at time t+Δt.
Optionally, the information identification module is further configured to determine the relative heading HL of the unmanned device within the sampling interval Δt as the difference between the orientation data Hl_{k+1} sampled at time t+Δt and the orientation data Hl_k sampled at time t.
In accordance with another aspect of one or more embodiments of the present disclosure, a positioning apparatus is provided, including: a memory configured to store instructions; and a processor coupled to the memory, the processor configured to perform the method of any of the above embodiments based on the instructions stored in the memory.
According to another aspect of one or more embodiments of the present disclosure, an unmanned device is provided, comprising the positioning apparatus of any of the above embodiments, and
a wheel speed meter configured to provide speed information;
an inertial navigation system configured to provide angular velocity information;
a positioning system configured to provide positioning information;
a lidar sensor configured to provide first point cloud data;
a vision sensor configured to provide second point cloud data.
According to another aspect of one or more embodiments of the present disclosure, a computer-readable storage medium is provided, storing computer instructions that, when executed by a processor, implement the method of any of the above embodiments.
Other features of the present disclosure and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present disclosure; those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is an exemplary flowchart of a positioning method according to an embodiment of the present disclosure.
Fig. 2 is an exemplary flow chart for determining whether sampled data is valid, according to one embodiment of the present disclosure.
Fig. 3 is an exemplary block diagram of a positioning device according to an embodiment of the present disclosure.
Fig. 4 is an exemplary block diagram of a positioning device according to another embodiment of the present disclosure.
Fig. 5 is an exemplary block diagram of an unmanned device according to one embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the present disclosure, not all of them. The following description of at least one exemplary embodiment is merely illustrative and in no way limits the disclosure, its application, or its uses. All other embodiments derived by a person skilled in the art from the embodiments disclosed herein without creative effort shall fall within the protection scope of the present disclosure.
The relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
Meanwhile, it should be understood that, for ease of description, the sizes of the various parts shown in the drawings are not drawn to scale.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail, but where appropriate they are intended to be part of the specification.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Fig. 1 is an exemplary flowchart of a positioning method according to an embodiment of the present disclosure. Optionally, the method steps of this embodiment may be performed by a positioning apparatus.
In step 101, speed information provided by a wheel speedometer, angular velocity information provided by an inertial navigation system, positioning information provided by a positioning system, first point cloud data provided by a lidar sensor, and second point cloud data provided by a vision sensor are sampled at a predetermined sampling interval Δt.
Optionally, the positioning information includes positioning data Pg and orientation data Hg, the first point cloud data includes positioning data Pl and orientation data Hl, and the second point cloud data includes positioning data Ps and orientation data Hs.
For example, the positioning system may be a GPS or other positioning system that provides positioning services.
In step 102, the sampling results at the time t and the time t + Δ t are used to determine whether the positioning information, the first point cloud data and the second point cloud data sampled at the time t + Δ t are valid.
Fig. 2 is an exemplary flow chart for determining whether sampled data is valid, according to one embodiment of the present disclosure.
In step 201, the moving distance S of the unmanned device within the sampling interval Δt is determined using the sampled speed and angular velocity information.
Optionally, the moving distance S of the unmanned device from time t to time t+Δt is determined using the speed V_k provided by the wheel speed meter and the angular velocity Wn_k provided by the inertial navigation system, both sampled at time t, together with the speed V_{k+1} and angular velocity Wn_{k+1} sampled at time t+Δt.
For example, the moving distance S can be calculated by formula (1). (Formula (1) appears only as an image in the source text; it combines V_k, V_{k+1}, Wn_k, Wn_{k+1} and Δt.) Using both Wn_k and Wn_{k+1} effectively eliminates the influence of gyroscope zero offset on the heading angle.
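Since formula (1) survives only as an image, the sketch below is a hedged stand-in: it integrates the two wheel-speed samples trapezoidally over Δt, and averages the two gyro rates for the heading increment so that a constant zero offset enters symmetrically. The patent's actual formula may differ in form:

```python
def moving_distance(v_k, v_k1, dt):
    """Assumed stand-in for formula (1): trapezoidal integration of
    the wheel speed over one sampling interval dt."""
    return 0.5 * (v_k + v_k1) * dt

def heading_increment(wn_k, wn_k1, dt):
    """Average the gyro rates at t and t+dt - one plausible reading
    of the remark that using both Wn_k and Wn_{k+1} mitigates the
    gyroscope zero offset's effect on the heading angle."""
    return 0.5 * (wn_k + wn_k1) * dt

# Wheel speed rising from 2.0 m/s to 2.2 m/s over a 0.1 s interval:
s = moving_distance(2.0, 2.2, 0.1)  # ~0.21 m
```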
In step 202, the moving distance S is used to judge whether the positioning data Pg_{k+1} in the positioning information, Pl_{k+1} in the first point cloud data, and Ps_{k+1} in the second point cloud data, all sampled at time t+Δt, are valid.
Optionally, the moving distance Sg is determined using the positioning data Pg_k sampled at time t and the positioning data Pg_{k+1} sampled at time t+Δt; the moving distance Sl is determined using Pl_k and Pl_{k+1}; and the moving distance Ss is determined using Ps_k and Ps_{k+1}.
When the difference between the moving distance Sg and the moving distance S is within a predetermined range, Pg_{k+1} is determined to be valid. For example, if |Sg − S| < ΔS1, the positioning data Pg_{k+1} provided by the positioning system and sampled at time t+Δt is determined to be valid.
Conversely, when the difference between Sg and S is not within the predetermined range, Pg_{k+1} is determined to be invalid and is regenerated from the moving distance S and the positioning data Pg_k. For example, when the unmanned device is moving forward, Pg_{k+1} = Pg_k + S; when it is backing up, Pg_{k+1} = Pg_k − S.
When the difference between the moving distance Sl and S is within a predetermined range, Pl_{k+1} is determined to be valid. For example, if |Sl − S| < ΔS2, the positioning data Pl_{k+1} provided by the lidar sensor and sampled at time t+Δt is determined to be valid.
Conversely, when the difference between Sl and S is not within the predetermined range, Pl_{k+1} is determined to be invalid and is regenerated from S and Pl_k: Pl_{k+1} = Pl_k + S when moving forward, and Pl_{k+1} = Pl_k − S when backing up.
When the difference between the moving distance Ss and S is within a predetermined range, Ps_{k+1} is determined to be valid. For example, if |Ss − S| < ΔS3, the positioning data Ps_{k+1} provided by the vision sensor and sampled at time t+Δt is determined to be valid.
Conversely, when the difference between Ss and S is not within the predetermined range, Ps_{k+1} is determined to be invalid and is regenerated from S and Ps_k: Ps_{k+1} = Ps_k + S when moving forward, and Ps_{k+1} = Ps_k − S when backing up.
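The per-sensor check-and-regenerate logic above can be sketched as follows. Positions are treated as scalar along-track values and forward motion is assumed, which simplifies the patent's description:

```python
def check_position(p_prev, p_new, s_expected, threshold):
    """Validate one sensor's new position sample against the
    dead-reckoned moving distance S. Returns (valid, position):
    the sample itself if plausible, otherwise a value regenerated
    from the previous sample and S (forward motion assumed)."""
    s_sensor = abs(p_new - p_prev)  # distance implied by this sensor
    if abs(s_sensor - s_expected) < threshold:
        return True, p_new
    return False, p_prev + s_expected  # Pg_{k+1} = Pg_k + S

# A GPS sample that jumped 5 m while dead reckoning saw 0.2 m:
valid, p = check_position(10.0, 15.0, s_expected=0.2, threshold=0.5)
```

The same routine applies to Pl and Ps with their own thresholds ΔS2 and ΔS3; for backward motion the regenerated value would be Pg_k − S.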
In step 203, when the positioning data Pg_{k+1}, Pl_{k+1} and Ps_{k+1} are valid, the relative heading HL of the unmanned device within the sampling interval Δt is determined using the orientation data in the sampled first point cloud data.
Optionally, the relative heading HL of the unmanned device within the sampling interval Δt is determined as the difference between the orientation data Hl_{k+1} sampled at time t+Δt and the orientation data Hl_k sampled at time t.
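Taking a heading difference needs care at the ±180° boundary. The wrap handling below is an added implementation detail, since the text only says HL is the difference of the two orientation samples:

```python
import math

def relative_heading(hl_k, hl_k1):
    """HL = Hl_{k+1} - Hl_k, wrapped into (-pi, pi] so that a small
    turn across the +/-180 degree boundary is not mistaken for a
    nearly full rotation."""
    d = hl_k1 - hl_k
    return math.atan2(math.sin(d), math.cos(d))

# Crossing the boundary: 175 deg -> -175 deg is really a +10 deg turn.
hl = relative_heading(math.radians(175), math.radians(-175))
```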
In step 204, the relative heading HL is used to judge whether the orientation data Hg_{k+1} in the positioning information and Hs_{k+1} in the second point cloud data, both sampled at time t+Δt, are valid.
Optionally, the relative heading HG is determined using the orientation data Hg_k sampled at time t and Hg_{k+1} sampled at time t+Δt, and the relative heading HS is determined using Hs_k and Hs_{k+1}.
Determining the orientation data Hg in case the difference between the relative heading HG and the relative heading HL is within a predetermined rangek+1Is effective. For example, if | HG-HL<Δ H1, the positioning data Ps provided by the positioning system, sampled at the time t +. DELTA.t, are determinedk+1Is effective.
Conversely, in the case where the difference between the relative heading HG and the relative heading HL is not within the predetermined range, the orientation data HG is determinedk+1Null, using orientation data HgkSampling interval Deltat and angular velocity Wn sampled at time tkRegeneration of orientation data Hgk+1. E.g. Hgk+1=Hgk+Wnk×△t。
In the case where the difference between the relative heading HS and the relative heading HL is within a predetermined range, the orientation data Hs_{k+1} is determined to be valid. For example, if |HS - HL| < ΔH2, the orientation data Hs_{k+1} provided by the vision sensor and sampled at time t + Δt is determined to be valid.
Conversely, in the case where the difference between the relative heading HS and the relative heading HL is not within the predetermined range, the orientation data Hs_{k+1} is determined to be invalid, and the orientation data Hs_k, the sampling interval Δt, and the angular velocity Wn_k are used to regenerate the orientation data Hs_{k+1}, e.g., Hs_{k+1} = Hs_k + Wn_k × Δt.
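The heading gate of steps 203 and 204 and the gyro-based regeneration can be sketched in the same style. The caveats as before apply: hypothetical names, scalar headings in radians, and no angle wrapping.

```python
def validate_heading(h_k, h_k1, hl_rel, wn_k, dt, tolerance):
    """Gate a sensor's heading change over dt against the lidar reference HL.

    h_k, h_k1: orientation data sampled at t and t + dt (e.g. Hg_k, Hg_{k+1})
    hl_rel:    relative heading HL derived from the first point cloud data
    wn_k:      angular velocity Wn_k sampled at time t
    Returns (heading to use, validity flag).
    """
    h_rel = h_k1 - h_k                     # relative heading HG or HS
    if abs(h_rel - hl_rel) <= tolerance:   # e.g. |HG - HL| < dH1
        return h_k1, True                  # H_{k+1} is valid as sampled
    # Invalid: regenerate H_{k+1} = H_k + Wn_k * dt from the gyro
    return h_k + wn_k * dt, False
```

The same gate is applied once with the positioning-system orientation (HG vs. HL, threshold ΔH1) and once with the vision-sensor orientation (HS vs. HL, threshold ΔH2).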
In step 205, in the case where the orientation data Hg_{k+1} and Hs_{k+1} are valid, the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are determined to be valid.
Returning to fig. 1: in step 103, in the case where the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are valid, positioning is performed using the validated positioning information, first point cloud data, and second point cloud data.
After the valid data are determined, the position, attitude, and velocity errors of the unmanned device, the accelerometer and gyroscope zero biases of the inertial navigation system, and the wheel-speed-meter scale factor are taken as state variables, and the validated positioning and orientation data given by each system are taken as measurements in an extended Kalman filter, thereby achieving high-precision positioning and orientation of the unmanned device.
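The patent does not reproduce the filter equations. As an illustration only, the sketch below shows a standard linear Kalman measurement update into which each validated position or heading measurement could be folded; the patent's actual filter is an error-state extended Kalman filter whose state stacks position, attitude, and velocity errors, the IMU zero biases, and the wheel-speed-meter scale factor.

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update: fold measurement z into state x.

    x, P: prior state estimate and covariance
    z:    measurement vector (e.g. a validated position or heading)
    H, R: measurement matrix and measurement noise covariance
    """
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)              # corrected state estimate
    P = (np.eye(len(x)) - K @ H) @ P     # corrected covariance
    return x, P
```

Each validated source (positioning system, lidar, vision) would contribute its own z, H, and R at every sampling step, which is how the joint processing of the different positioning modes is realized.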
Based on the positioning method provided by the above embodiment of the present disclosure, high-precision positioning information can be provided by jointly processing the positioning results fed back by the different positioning modes.
Fig. 3 is an exemplary block diagram of a positioning device according to an embodiment of the present disclosure. As shown in fig. 3, the positioning apparatus includes a sampling module 31, an information identifying module 32, and a positioning processing module 33.
As shown in fig. 3, the sampling module 31 is configured to sample, at a predetermined sampling interval Δt, speed information provided by the wheel speed meter, angular velocity information provided by the inertial navigation system, positioning information provided by the positioning system, first point cloud data provided by the lidar sensor, and second point cloud data provided by the vision sensor.
Optionally, the positioning information includes positioning data Pg and orientation data Hg, the first point cloud data includes positioning data Pl and orientation data Hl, and the second point cloud data includes positioning data Ps and orientation data Hs.
Optionally, the information identifying module 32 is configured to determine whether the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are valid using the sampling results at time t and time t + Δt.
In some embodiments, the information identification module 32 is configured to determine, using the sampled speed and angular velocity information, the moving distance S of the unmanned device within the sampling interval Δt. Optionally, the information identification module uses the speed V_k and angular velocity Wn_k sampled at time t, together with the speed V_{k+1} and angular velocity Wn_{k+1} sampled at time t + Δt, to determine the moving distance S of the unmanned device from time t to time t + Δt. For example, the moving distance S can be calculated by the above equation (1).
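Equation (1) is not reproduced in this excerpt. As an assumption, a common choice is trapezoidal integration of the two wheel-speed samples over the interval; the angular velocities Wn_k and Wn_{k+1} would additionally be needed to resolve the scalar distance into a planar displacement, which is omitted here.

```python
def moving_distance(v_k, v_k1, dt):
    """Trapezoidal estimate of the distance S traveled over one sampling
    interval dt (assumed form; equation (1) is not shown in this excerpt).

    v_k, v_k1: wheel speeds sampled at t and t + dt
    """
    return 0.5 * (v_k + v_k1) * dt
```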
The information recognition module 32 uses the moving distance S to determine whether the positioning data Pg_{k+1}, Pl_{k+1}, and Ps_{k+1} sampled at time t + Δt are valid.
In the case where the positioning data Pg_{k+1}, Pl_{k+1}, and Ps_{k+1} are valid, the information identification module 32 further determines the relative heading HL of the unmanned device within the sampling interval Δt using the orientation data in the sampled first point cloud data.
Optionally, the difference between the orientation data Hl_{k+1} sampled at time t + Δt and the orientation data Hl_k sampled at time t determines the relative heading HL of the unmanned device within the sampling interval Δt.
The information recognition module 32 uses the relative heading HL to determine whether the orientation data Hg_{k+1} and Hs_{k+1} sampled at time t + Δt are valid; in the case where the orientation data Hg_{k+1} and Hs_{k+1} are valid, it determines that the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are valid.
In some embodiments, the information identification module 32 determines the moving distance Sg using the positioning data Pg_k sampled at time t and the positioning data Pg_{k+1} sampled at time t + Δt; determines the moving distance Sl using the positioning data Pl_k acquired at time t and the positioning data Pl_{k+1} acquired at time t + Δt; and determines the moving distance Ss using the positioning data Ps_k acquired at time t and the positioning data Ps_{k+1} acquired at time t + Δt.
In the case where the difference between the moving distance Sg and the moving distance S is within a predetermined range, the information identification module 32 determines that the positioning data Pg_{k+1} is valid. In the case where the difference is not within the predetermined range, it determines that the positioning data Pg_{k+1} is invalid and regenerates the positioning data Pg_{k+1} using the moving distance S and the positioning data Pg_k. For example, when the unmanned device is advancing, Pg_{k+1} = Pg_k + S; when it is retreating, Pg_{k+1} = Pg_k - S.
In the case where the difference between the moving distance Sl and the moving distance S is within a predetermined range, the information identification module 32 determines that the positioning data Pl_{k+1} is valid. In the case where the difference is not within the predetermined range, it determines that the positioning data Pl_{k+1} is invalid and regenerates the positioning data Pl_{k+1} using the moving distance S and the positioning data Pl_k. For example, when the unmanned device is advancing, Pl_{k+1} = Pl_k + S; when it is retreating, Pl_{k+1} = Pl_k - S.
In the case where the difference between the moving distance Ss and the moving distance S is within the predetermined range, the information identification module 32 determines that the positioning data Ps_{k+1} is valid. In the case where the difference is not within the predetermined range, it determines that the positioning data Ps_{k+1} is invalid and regenerates the positioning data Ps_{k+1} using the moving distance S and the positioning data Ps_k. For example, when the unmanned device is advancing, Ps_{k+1} = Ps_k + S; when it is retreating, Ps_{k+1} = Ps_k - S.
In some embodiments, the information identification module 32 determines the relative heading HG using the orientation data Hg_k sampled at time t and the orientation data Hg_{k+1} sampled at time t + Δt, and determines the relative heading HS using the orientation data Hs_k acquired at time t and the orientation data Hs_{k+1} acquired at time t + Δt.
In the case where the difference between the relative heading HG and the relative heading HL is within a predetermined range, the orientation data Hg_{k+1} is determined to be valid. In the case where the difference is not within the predetermined range, the orientation data Hg_{k+1} is determined to be invalid and is regenerated using the orientation data Hg_k, the sampling interval Δt, and the angular velocity Wn_k sampled at time t, e.g., Hg_{k+1} = Hg_k + Wn_k × Δt.
In the case where the difference between the relative heading HS and the relative heading HL is within a predetermined range, the orientation data Hs_{k+1} is determined to be valid. In the case where the difference is not within the predetermined range, the orientation data Hs_{k+1} is determined to be invalid and is regenerated using the orientation data Hs_k, the sampling interval Δt, and the angular velocity Wn_k, e.g., Hs_{k+1} = Hs_k + Wn_k × Δt.
The positioning processing module 33 is configured to perform positioning processing using the validated positioning information, first point cloud data, and second point cloud data in the case where the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are valid.
That is, the positioning processing is carried out using the valid Pg_{k+1}, Hg_{k+1}, Pl_{k+1}, Hl_{k+1}, Ps_{k+1}, and Hs_{k+1}.
Based on the positioning device provided by the above embodiment of the present disclosure, high-precision positioning information can be provided by jointly processing the positioning results fed back by the different positioning modes.
Fig. 4 is an exemplary block diagram of a positioning device according to another embodiment of the present disclosure. As shown in fig. 4, the positioning device includes a memory 41 and a processor 42.
The memory 41 is used for storing instructions, the processor 42 is coupled to the memory 41, and the processor 42 is configured to execute the method according to any one of the embodiments in fig. 1 and fig. 2 based on the instructions stored in the memory.
As shown in fig. 4, the positioning apparatus further includes a communication interface 43 for information interaction with other devices. Meanwhile, the device also comprises a bus 44, and the processor 42, the communication interface 43 and the memory 41 are communicated with each other through the bus 44.
The memory 41 may comprise high-speed RAM and may also include non-volatile memory, such as at least one disk memory. The memory 41 may also be a memory array. The memory 41 may also be partitioned into blocks, and the blocks may be combined into virtual volumes according to certain rules.
Further, the processor 42 may be a central processing unit CPU, or may be an application specific integrated circuit ASIC, or one or more integrated circuits configured to implement embodiments of the present disclosure.
The present disclosure also relates to a computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions, and the instructions, when executed by a processor, implement a method according to any one of the embodiments of fig. 1 and fig. 2.
Fig. 5 is an exemplary block diagram of an unmanned device according to one embodiment of the present disclosure. As shown in fig. 5, the unmanned device includes a positioning device 51, a wheel speed meter 52, an inertial navigation system 53, a positioning system 54, a lidar sensor 55, and a vision sensor 56.
As shown in fig. 5, the positioning device 51 is the positioning device according to any one of the embodiments of fig. 3 and 4. The wheel speed meter 52 is configured to provide speed information, the inertial navigation system 53 is configured to provide angular velocity information, the localization system 54 is configured to provide localization information, the lidar sensor 55 is configured to provide first point cloud data, and the vision sensor 56 is configured to provide second point cloud data.
The positioning device can provide high-precision positioning information by acquiring information provided by each device and system at preset time intervals and performing combined processing.
Alternatively, the functional unit modules described above may be implemented as a general-purpose processor, a programmable logic controller (PLC), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof for performing the functions described in this disclosure.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The description of the present disclosure has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the disclosure in the form disclosed. Many modifications and variations will be apparent to practitioners skilled in this art. The embodiments were chosen and described in order to best explain the principles of the disclosure and its practical application, and to enable others of ordinary skill in the art to understand the disclosure in its various embodiments, with various modifications as are suited to the particular use contemplated.

Claims (17)

1. A method of positioning, comprising:
sampling, at a predetermined sampling interval Δt, speed information provided by a wheel speed meter, angular velocity information provided by an inertial navigation system, positioning information provided by a positioning system, first point cloud data provided by a laser radar sensor, and second point cloud data provided by a vision sensor, wherein the positioning information comprises positioning data Pg and orientation data Hg, the first point cloud data comprises positioning data Pl and orientation data Hl, and the second point cloud data comprises positioning data Ps and orientation data Hs;
judging, using the sampling results at time t and time t + Δt, whether the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are valid;
in the case where the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are valid, performing positioning using the validated positioning information, first point cloud data, and second point cloud data;
wherein judging, using the sampling results at time t and time t + Δt, whether the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are valid comprises:
determining the moving distance S of the unmanned device within the sampling interval Δt using the sampled speed information and angular velocity information;
determining, using the moving distance S, whether the positioning data Pg_{k+1}, Pl_{k+1}, and Ps_{k+1} sampled at time t + Δt are valid;
in the case where the positioning data Pg_{k+1}, Pl_{k+1}, and Ps_{k+1} are valid, determining the relative heading HL of the unmanned device within the sampling interval Δt using the orientation data in the sampled first point cloud data;
determining, using the relative heading HL, whether the orientation data Hg_{k+1} and Hs_{k+1} sampled at time t + Δt are valid;
in the case where the orientation data Hg_{k+1} and Hs_{k+1} are valid, determining that the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are valid.
2. The positioning method according to claim 1, wherein determining, using the moving distance S, whether the positioning data Pg_{k+1}, Pl_{k+1}, and Ps_{k+1} sampled at time t + Δt are valid comprises:
determining the moving distance Sg using the positioning data Pg_k sampled at time t and the positioning data Pg_{k+1} sampled at time t + Δt; determining the moving distance Sl using the positioning data Pl_k acquired at time t and the positioning data Pl_{k+1} acquired at time t + Δt; determining the moving distance Ss using the positioning data Ps_k acquired at time t and the positioning data Ps_{k+1} acquired at time t + Δt;
determining that the positioning data Pg_{k+1} is valid in the case where the difference between the moving distance Sg and the moving distance S is within a predetermined range;
determining that the positioning data Pl_{k+1} is valid in the case where the difference between the moving distance Sl and the moving distance S is within a predetermined range;
determining that the positioning data Ps_{k+1} is valid in the case where the difference between the moving distance Ss and the moving distance S is within a predetermined range.
3. The positioning method according to claim 2, wherein:
the positioning data Pg_{k+1} is determined to be invalid in the case where the difference between the moving distance Sg and the moving distance S is not within the predetermined range, and the positioning data Pg_{k+1} is regenerated using the moving distance S and the positioning data Pg_k;
the positioning data Pl_{k+1} is determined to be invalid in the case where the difference between the moving distance Sl and the moving distance S is not within the predetermined range, and the positioning data Pl_{k+1} is regenerated using the moving distance S and the positioning data Pl_k;
the positioning data Ps_{k+1} is determined to be invalid in the case where the difference between the moving distance Ss and the moving distance S is not within the predetermined range, and the positioning data Ps_{k+1} is regenerated using the moving distance S and the positioning data Ps_k.
4. The positioning method according to claim 1, wherein determining, using the relative heading HL, whether the orientation data Hg_{k+1} and Hs_{k+1} sampled at time t + Δt are valid comprises:
determining the relative heading HG using the orientation data Hg_k sampled at time t and the orientation data Hg_{k+1} sampled at time t + Δt; determining the relative heading HS using the orientation data Hs_k acquired at time t and the orientation data Hs_{k+1} acquired at time t + Δt;
determining that the orientation data Hg_{k+1} is valid in the case where the difference between the relative heading HG and the relative heading HL is within a predetermined range;
determining that the orientation data Hs_{k+1} is valid in the case where the difference between the relative heading HS and the relative heading HL is within a predetermined range.
5. The positioning method according to claim 4, wherein:
the orientation data Hg_{k+1} is determined to be invalid in the case where the difference between the relative heading HG and the relative heading HL is not within the predetermined range, and the orientation data Hg_{k+1} is regenerated using the orientation data Hg_k, the sampling interval Δt, and the angular velocity Wn_k sampled at time t;
the orientation data Hs_{k+1} is determined to be invalid in the case where the difference between the relative heading HS and the relative heading HL is not within the predetermined range, and the orientation data Hs_{k+1} is regenerated using the orientation data Hs_k, the sampling interval Δt, and the angular velocity Wn_k.
6. The positioning method according to claim 1, wherein determining the moving distance S of the unmanned device within the sampling interval Δt using the sampled speed information and angular velocity information comprises:
determining the moving distance S of the unmanned device from time t to time t + Δt using the speed V_k and angular velocity Wn_k sampled at time t and the speed V_{k+1} and angular velocity Wn_{k+1} sampled at time t + Δt.
7. The positioning method according to claim 1, wherein determining the relative heading HL of the unmanned device within the sampling interval Δt using the orientation data in the sampled first point cloud data comprises:
determining the relative heading HL of the unmanned device within the sampling interval Δt from the difference between the orientation data Hl_{k+1} sampled at time t + Δt and the orientation data Hl_k sampled at time t.
8. A positioning device, comprising:
a sampling module configured to sample speed information provided by a wheel speedometer, angular speed information provided by an inertial navigation system, positioning information provided by a positioning system, first point cloud data provided by a laser radar sensor, and second point cloud data provided by a vision sensor at a predetermined sampling interval Δ t, the positioning information including positioning data Pg and orientation data Hg, the first point cloud data including positioning data Pl and orientation data Hl, and the second point cloud data including positioning data Ps and orientation data Hs;
an information identification module configured to determine, using the sampling results at time t and time t + Δt, whether the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are valid, wherein the moving distance S of the unmanned device within the sampling interval Δt is determined using the sampled speed and angular velocity information; the moving distance S is used to determine whether the positioning data Pg_{k+1}, Pl_{k+1}, and Ps_{k+1} sampled at time t + Δt are valid; in the case where the positioning data Pg_{k+1}, Pl_{k+1}, and Ps_{k+1} are valid, the relative heading HL of the unmanned device within the sampling interval Δt is determined using the orientation data in the sampled first point cloud data; the relative heading HL is used to determine whether the orientation data Hg_{k+1} and Hs_{k+1} sampled at time t + Δt are valid; and in the case where the orientation data Hg_{k+1} and Hs_{k+1} are valid, the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are determined to be valid;
a positioning processing module configured to perform positioning processing using the validated positioning information, first point cloud data, and second point cloud data in the case where the positioning information, the first point cloud data, and the second point cloud data sampled at time t + Δt are valid.
9. The positioning device of claim 8, wherein:
the information identification module is configured to utilize the positioning data Pg sampled at the time tkAnd positioning data Pg sampled at time t +. DELTA.tk+1Determining a moving distance Sg; using positioning data Pl acquired at time tkAnd positioning data Pl acquired at the time of t plus delta tk+1Determining a moving distance Sl; using the positioning data Ps acquired at time tkAnd positioning data Ps acquired at the time of t plus delta tk+1Determining a moving distance Ss; determining the positioning data Pg in the case that the difference between the moving distance Sg and the moving distance S is within a predetermined rangek+1The method is effective; determining the positioning data Pl in the case where the difference between the moving distance Sl and the moving distance S is within a predetermined rangek+1The method is effective; in the case where the difference between the moving distance Ss and the moving distance S is within a predetermined range, the positioning data Ps is determinedk+1Is effective.
10. The positioning device of claim 9, wherein:
the information identification module is configured to determine the positioning data Pg in the case where the difference between the moving distance Sg and the moving distance S is not within a predetermined rangek+1Invalidation using distance of movement S and positioning data PgkRegenerating the positioning data Pgk+1(ii) a Determining the positioning data Pl in the case where the difference between the moving distance Sl and the moving distance S is not within a predetermined rangek+1Invalid, using the distance of movement S and the location data PlkRegenerating the positioning data Plk+1(ii) a In the case where the difference between the moving distance Ss and the moving distance S is not within the predetermined range, the positioning data Ps is determinedk+1Invalidation using the distance of movement S and the positioning data PskRegeneration of the positioning data Psk+1
11. The positioning device of claim 8, wherein:
the information identification module is configured to utilize the orientation data Hg sampled at time tkAnd directional data Hg sampled at time t +. DELTA.tk+1Determining a relative course HG; using orientation data Hs acquired at time tkAnd directional data Hs collected at the time of t +. DELTA.tk+1Determining a relative course HS; determining the orientation data Hg in case the difference between the relative heading HG and the relative heading HL is within a predetermined rangek+1The method is effective; determining orientation data Hs in the case that the difference between the relative heading HS and the relative heading HL is within a predetermined rangek+1Is effective.
12. The positioning device of claim 11, wherein:
the information identification module is further configured to determine the orientation data Hg if the difference between the relative heading HG and the relative heading HL is not within a predetermined rangek+1Null, using orientation data HgkSample interval deltat and the angular velocity Wn sampled at time tkRegeneration of orientation data Hgk+1(ii) a Determining orientation data Hs in the case that the difference between the relative heading HS and the relative heading HL is not within a predetermined rangek+1Invalidation, using directional data HskSampling interval Deltat and angular velocity WnkRegenerating orientation data Hsk+1
13. The positioning device of claim 8, wherein:
the information identification module is further configured to utilize the speed V sampled at time tkAnd angular velocity WnkAnd speed V sampled at time t +. DELTA.tk+1And angular velocity Wnk+1And determining the moving distance S of the unmanned equipment from the time t to the time t plus delta t.
14. The positioning device of claim 8, wherein:
the information identification module is further configured to utilize the orientation data Hl sampled at time t +. DELTA.tk+1With directional data Hl sampled at time tkThe difference determines the relative heading HL of the drone within the sampling interval Δ t.
15. A positioning device, comprising:
a memory configured to store instructions;
a processor coupled to the memory, the processor configured to implement the method of any one of claims 1-7 based on the instructions stored by the memory.
16. An unmanned device comprising a positioning apparatus according to any of claims 8-15, and
a wheel speed meter configured to provide speed information;
an inertial navigation system configured to provide angular velocity information;
a positioning system configured to provide positioning information;
a lidar sensor configured to provide first point cloud data;
a vision sensor configured to provide second point cloud data.
17. A computer-readable storage medium, wherein the computer-readable storage medium stores computer instructions which, when executed by a processor, implement the method of any one of claims 1-7.
CN201810375538.9A 2018-04-25 2018-04-25 Positioning method and device and unmanned equipment Active CN110398765B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810375538.9A CN110398765B (en) 2018-04-25 2018-04-25 Positioning method and device and unmanned equipment


Publications (2)

Publication Number Publication Date
CN110398765A CN110398765A (en) 2019-11-01
CN110398765B true CN110398765B (en) 2022-02-01

Family

ID=68322363

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810375538.9A Active CN110398765B (en) 2018-04-25 2018-04-25 Positioning method and device and unmanned equipment

Country Status (1)

Country Link
CN (1) CN110398765B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114323035A (en) * 2020-09-30 2022-04-12 华为技术有限公司 Positioning method, device and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011120141A1 (en) * 2010-03-31 2011-10-06 Ambercore Software Inc. Dynamic network adjustment for rigorous integration of passive and active imaging observations into trajectory determination
KR101133037B1 (en) * 2011-12-01 2012-04-04 국방과학연구소 Path updating method for collision avoidance of autonomous vehicle and the apparatus
CN106289275A (en) * 2015-06-23 2017-01-04 沃尔沃汽车公司 For improving unit and the method for positioning precision
CN105954783B (en) * 2016-04-26 2017-03-29 武汉大学 A kind of real-time tight integration of improvement GNSS/INS navigates the method for real-time performance
CN206479647U (en) * 2017-01-25 2017-09-08 北京经纬恒润科技有限公司 Alignment system and automobile




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210304

Address after: Room a1905, 19 / F, building 2, No. 18, Kechuang 11th Street, Daxing District, Beijing, 100176

Applicant after: Beijing Jingdong Qianshi Technology Co.,Ltd.

Address before: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant before: Beijing Jingbangda Trading Co.,Ltd.

Effective date of registration: 20210304

Address after: 101, 1st floor, building 2, yard 20, Suzhou street, Haidian District, Beijing 100080

Applicant after: Beijing Jingbangda Trading Co.,Ltd.

Address before: 100195 Beijing Haidian Xingshikou Road 65 West Cedar Creative Garden 4 District 11 Building East 1-4 Floor West 1-4 Floor

Applicant before: BEIJING JINGDONG SHANGKE INFORMATION TECHNOLOGY Co.,Ltd.

Applicant before: BEIJING JINGDONG CENTURY TRADING Co.,Ltd.

GR01 Patent grant