CN113777644B - Unmanned positioning method based on weak signal scene
- Publication number
- CN113777644B (application CN202111016117.5A)
- Authority
- CN
- China
- Prior art keywords
- data
- point cloud
- positioning
- laser radar
- obstacle
- Prior art date: 2021-08-31
- Legal status: Active (an assumption, not a legal conclusion)
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
- G01S15/931—Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/485—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an optical system or imaging system
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Acoustics & Sound (AREA)
- Navigation (AREA)
- Position Fixing By Use Of Radio Waves (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
The invention provides an unmanned positioning method based on a weak-signal scene. The method can accurately position a vehicle in scenes where satellite signals are weak, ensuring navigation accuracy. Visual point cloud image data, laser radar point cloud data, and ultrasonic radar data at the vehicle to be positioned are collected, perceived, and fused, and an accurate position is obtained after correction; data collection is convenient, and processing is fast and accurate. The required sensing equipment is easy to obtain and install, accurate positioning is achieved in weak-signal and even no-signal scenes, and the method is suitable for wide adoption and has good market prospects.
Description
Technical Field
The invention belongs to the technical field of navigation and positioning, and particularly relates to an unmanned positioning method for weak-signal scenes.
Background
At present, the common navigation modes for unmanned carts are laser navigation, optical navigation, magnetic-tape navigation, visual navigation, and electromagnetic-induction navigation. Electromagnetic-induction navigation has the advantages that the guide line is hidden and therefore not easily contaminated or damaged, the principle is simple and reliable, and the cost is low; however, the navigation path has limited complexity, and expanding or modifying the route is very troublesome and lacks flexibility. Optical navigation paints or pastes color bands along the travel path, and an optical sensor performs simple recognition of the color-band image signal to realize navigation; the navigation path is flexible to set up, but the method is sensitive to contamination or damage of the color bands and is easily limited by the site environment. Magnetic-tape navigation has the advantages that laying the tape is relatively simple and the navigation path is easy to change, but the tape is easily contaminated and constrained by the external environment, so it suits sites in good condition. Laser navigation offers good flexibility, requires no treatment of the ground, allows flexible and convenient path changes, suits various site environments, and lets motion parameters and driving paths be modified conveniently and quickly; however, its control and navigation algorithms are the most complex, its positioning accuracy depends on the laser head and the algorithm, and the manufacturing cost of the vehicle is higher. Visual navigation has the typical advantages of low vehicle cost, a large amount of acquired information, and the ability to build a panoramic three-dimensional map and realize fully automatic navigation; however, it is strongly affected by site lighting, the information-processing load is large, current hardware can hardly meet real-time requirements, and image-processing algorithms are not yet mature. Whatever the navigation mode, its advantages and characteristics can be brought into play effectively only when the vehicle is positioned accurately enough.
However, with the continuous development of urban construction, satellite signals are increasingly blocked by overpasses, tunnels, and the like. In such weak-signal scenes, the vehicle's position estimate often deviates, causing navigation delays and errors.
Therefore, developing an accurate positioning method suitable for weak-signal scenes is a problem that urgently needs to be solved by those skilled in the art.
Disclosure of Invention
To solve the above problems, the invention discloses an unmanned positioning method for weak-signal scenes.
In order to achieve the above purpose, the present invention provides the following technical solutions:
In the unmanned positioning method based on a weak-signal scene, positioning data of the vehicle to be positioned are collected and perceived, and an accurate position and pose are obtained after fusion.
Further, the specific method is as follows:
S1, start the cruising central control system of the intelligent vehicle; if satellite signals can be acquired, perform accurate positioning directly by satellite; if no satellite signal is found, jump to S2;
s2, collecting and sensing positioning data of a vehicle to be positioned, wherein the positioning data comprise visual point cloud image data, laser radar point cloud data and ultrasonic radar data;
S3, fuse the visual point cloud image data, the laser radar point cloud data, and the ultrasonic radar data obtained in S2 to obtain a point-cloud block map;
if the point-cloud block map shows no drift, perform accurate positioning to obtain the position and pose;
if the point-cloud block map shows drift, calibrate the position first and then perform accurate positioning to obtain the position and pose.
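For illustration only, the S1-S3 flow can be sketched as the following runnable skeleton. Every function below is a stub standing in for a real subsystem (satellite receiver, sensor drivers, fusion, drift check), so all names, values, and return types are assumptions rather than part of the claimed method:

```python
# Minimal runnable skeleton of the S1-S3 flow; all stubs are illustrative.
import numpy as np

def search_satellite_fix():
    return None  # stub: None simulates a weak-signal scene without a fix

def collect_sensor_clouds():
    # stub: visual, laser radar, and ultrasonic point clouds (Nx2, metres)
    rng = np.random.default_rng(0)
    return [rng.uniform(0.0, 5.0, size=(100, 2)) for _ in range(3)]

def fuse_point_clouds(clouds):
    return np.vstack(clouds)      # S3: merge into one point-cloud block map

def has_drift(block_map):
    return False                  # stub: would compare block map to the map

def calibrate(block_map):
    return block_map              # stub: would re-align the block map

def locate_vehicle():
    fix = search_satellite_fix()              # S1: prefer the satellite fix
    if fix is not None:
        return fix
    block_map = fuse_point_clouds(collect_sensor_clouds())   # S2 + S3
    if has_drift(block_map):
        block_map = calibrate(block_map)      # calibrate only on drift
    return block_map.mean(axis=0)             # stub for the precise estimate

print(locate_vehicle())
```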
Further, the specific method of step S2 is as follows:
Read a visual point-cloud image, intercept visual point-cloud data of three planes, and convert them into laser-radar-format data; if the matching degree between the obtained laser-radar-format data and the map exceeds 60%, wait for point-cloud fusion; otherwise, re-read the visual point-cloud image and repeat until the matching degree exceeds 60%.
Read the laser radar point-cloud data and extract laser radar obstacle data from it; compare these with the map's obstacle data; if the overlap ratio with the map obstacles exceeds 70%, wait for point-cloud fusion; otherwise, re-read the laser radar point-cloud data and repeat until the overlap ratio exceeds 70%.
Read the ultrasonic radar data and filter them: based on the reflection opening angle, read only obstacles within a range consistent with the size of the cart, and add reflection-obstacle marks on an ultrasonic layer; if the matching degree between the reflection-obstacle marks and the map exceeds 50%, convert the corresponding ultrasonic radar data into point-cloud data and wait for point-cloud fusion; otherwise, re-read the ultrasonic radar data and repeat until the matching degree exceeds 50%.
Further, the visual point-cloud image data in step S2 are obtained by extracting depth information with a depth camera and extracting feature points from the RGB image information.
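As a concrete illustration of this step, the sketch below back-projects a depth image into a 3D point cloud through a pinhole model and extracts ORB feature points from the RGB image with OpenCV; the intrinsics (FX, FY, CX, CY) are assumed example values, since the patent does not specify a camera model:

```python
# Sketch: build the visual point cloud from depth + RGB feature points.
# Camera intrinsics are illustrative assumptions, not from the patent.
import numpy as np
import cv2

FX = FY = 525.0          # assumed focal lengths (pixels)
CX, CY = 319.5, 239.5    # assumed principal point

def depth_to_point_cloud(depth_m: np.ndarray) -> np.ndarray:
    """Back-project an HxW depth image (metres) into an Nx3 point cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    pts = np.stack([x, y, depth_m], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]               # drop pixels without depth

def rgb_feature_points(rgb: np.ndarray):
    """Extract ORB feature points and descriptors from the RGB image."""
    gray = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=500)
    return orb.detectAndCompute(gray, None)

# usage with synthetic data: a flat wall two metres in front of the camera
cloud = depth_to_point_cloud(np.full((480, 640), 2.0))
```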
Compared with the prior art, the invention has the following beneficial effects:
1. the vehicle can be accurately positioned in weak-signal scenes, ensuring navigation accuracy;
2. the visual point cloud image data, the laser radar point cloud data, and the ultrasonic radar data at the vehicle to be positioned are collected, perceived, and fused, and an accurate position is obtained after correction; data collection is convenient, and processing is fast and accurate;
3. the required sensing equipment is easy to obtain and install; the method provides a brand-new approach to accurate positioning and navigation for current unmanned driving, is suitable for wide adoption, and has good market prospects.
Drawings
Fig. 1 is a flow chart of the method of the invention.
Detailed Description
The technical solution provided by the invention will be described in detail below with reference to specific embodiments. It should be understood that the following embodiments only illustrate the invention and are not intended to limit its scope.
As shown in fig. 1, which is a flow chart of the method, the invention relates to an unmanned positioning method based on a weak-signal scene: positioning data of the vehicle to be positioned are collected and perceived, and an accurate position and pose are obtained after fusion.
The specific method comprises the following steps:
S1, start the cruising central control system of the intelligent vehicle; if satellite signals can be acquired, perform accurate positioning directly by satellite; if no satellite signal is found, jump to S2;
S2, extract depth information through a depth camera and extract feature points from the RGB image information to obtain a visual point-cloud image; intercept visual point-cloud data of three planes and convert them into laser-radar-format data; if the matching degree between the obtained laser-radar-format data and the map exceeds 60%, wait for point-cloud fusion; otherwise, re-read the visual point-cloud image and repeat until the matching degree exceeds 60%.
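A hedged sketch of this conversion and check: 3D points near the sensor plane are collapsed into a 2D laser-radar-style range scan, and the matching degree is scored as the fraction of scan endpoints that land on occupied map cells. The slice height, beam count, and grid resolution are assumptions; only the 60% threshold comes from the text above:

```python
# Sketch: visual point cloud -> laser-radar-format scan -> map match degree.
import numpy as np

def cloud_to_scan(points: np.ndarray, n_beams: int = 360) -> np.ndarray:
    """Collapse Nx3 points near the sensor plane into a 2D range scan."""
    slab = points[np.abs(points[:, 1]) < 0.05]     # thin horizontal slice
    angles = np.arctan2(slab[:, 2], slab[:, 0])
    ranges = np.hypot(slab[:, 0], slab[:, 2])
    scan = np.full(n_beams, np.inf)
    bins = ((angles + np.pi) / (2 * np.pi) * n_beams).astype(int) % n_beams
    np.minimum.at(scan, bins, ranges)              # keep the nearest return
    return scan

def match_degree(endpoints: np.ndarray, grid: np.ndarray,
                 res: float = 0.1) -> float:
    """Fraction of scan endpoints (Nx2, metres) on occupied map cells."""
    cells = np.floor(endpoints / res).astype(int)
    inside = ((cells >= 0) & (cells < grid.shape)).all(axis=1)
    hits = grid[cells[inside, 0], cells[inside, 1]].sum()
    return float(hits) / max(len(endpoints), 1)

# re-read the visual point cloud until match_degree(...) > 0.60
```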
Read the laser radar point-cloud data and extract laser radar obstacle data from it; compare these with the map's obstacle data; if the overlap ratio with the map obstacles exceeds 70%, wait for point-cloud fusion; otherwise, re-read the laser radar point-cloud data and repeat until the overlap ratio exceeds 70%.
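A minimal sketch of the overlap test, with obstacles represented as occupied grid cells; the 0.1 m cell size is an assumption, while the 70% acceptance threshold is the one stated above:

```python
# Sketch: overlap ratio between laser radar obstacle cells and map obstacles.
import numpy as np

def obstacle_cells(points_xy: np.ndarray, res: float = 0.1) -> set:
    """Quantize obstacle points (Nx2, metres) into grid-cell coordinates."""
    return {tuple(c) for c in np.floor(points_xy / res).astype(int)}

def overlap_ratio(detected: set, map_cells: set) -> float:
    return len(detected & map_cells) / len(detected) if detected else 0.0

detected = obstacle_cells(np.array([[1.0, 2.0], [1.2, 2.0], [1.4, 2.0]]))
mapped = obstacle_cells(np.array([[1.0, 2.0], [1.2, 2.0],
                                  [1.4, 2.0], [5.0, 5.0]]))
accept = overlap_ratio(detected, mapped) > 0.70   # else re-read the lidar
```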
Read the ultrasonic radar data to detect obstacles that the surrounding optical sensors cannot detect, and filter them: based on the reflection opening angle, read only obstacles within a range consistent with the size of the cart, and add reflection-obstacle marks on an ultrasonic layer; if the matching degree between the reflection-obstacle marks and the map exceeds 50%, convert the corresponding ultrasonic radar data into point-cloud data and wait for point-cloud fusion; otherwise, re-read the ultrasonic radar data and repeat until the matching degree exceeds 50%.
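The filtering can be sketched as follows; the 15° half opening angle, cart width, and usable range window are illustrative assumptions (the patent only states that, via the reflection opening angle, reading is restricted to a range consistent with the cart's size):

```python
# Sketch: opening-angle filter for one ultrasonic return; accepted returns
# become reflection-obstacle points on the ultrasonic layer.
import math

HALF_ANGLE = math.radians(15.0)   # assumed reflection half opening angle
CART_WIDTH = 0.60                 # assumed cart width (m)
MIN_RANGE, MAX_RANGE = 0.02, 3.0  # assumed usable range window (m)

def ultrasonic_to_point(range_m, sensor_xy, sensor_yaw):
    """Return an (x, y) obstacle point for one return, or None if filtered."""
    if not (MIN_RANGE < range_m < MAX_RANGE):
        return None                       # outside the usable window
    beam_width = 2.0 * range_m * math.tan(HALF_ANGLE)
    if beam_width > 2.0 * CART_WIDTH:
        return None    # cone already wider than the cart: echo too ambiguous
    return (sensor_xy[0] + range_m * math.cos(sensor_yaw),
            sensor_xy[1] + range_m * math.sin(sensor_yaw))

# usage: a 1.2 m echo from a front-facing sensor at the cart's origin
print(ultrasonic_to_point(1.2, (0.0, 0.0), 0.0))   # -> (1.2, 0.0)
```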
S3, determine the current approximate position and pose of the vehicle through UWB (ultra-wideband) positioning;
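The patent does not detail the UWB solver; as one common approach, the sketch below recovers a coarse 2D position by linear least-squares trilateration from ranges to fixed anchors (the anchor layout and ranges are synthetic):

```python
# Sketch: coarse UWB position via linearized least-squares trilateration.
import numpy as np

def uwb_trilaterate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Solve for (x, y) from >=3 anchor positions (Nx2) and ranges (N)."""
    # Subtracting the first circle equation linearizes the system:
    # 2(a_i - a_0) . p = d_0^2 - d_i^2 + |a_i|^2 - |a_0|^2
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
truth = np.array([3.0, 4.0])
ranges = np.linalg.norm(anchors - truth, axis=1)
print(uwb_trilaterate(anchors, ranges))   # ~[3. 4.]
```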
S4, fuse the visual point cloud image data, the laser radar point cloud data, and the ultrasonic radar data obtained in S2 to obtain a point-cloud block map, and combine it with the approximate position and pose obtained in S3;
if the point-cloud block map shows no drift, perform accurate positioning to obtain the position and pose;
if the point-cloud block map shows drift, calibrate the position first and then perform accurate positioning to obtain the position and pose.
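A hedged sketch of the drift test and calibration: the fused block map is compared with the reference map via nearest-neighbour residuals, and when the mean residual exceeds a threshold (0.2 m here, an assumed value), a small rigid 2D ICP loop re-aligns it before the final precise positioning:

```python
# Sketch: drift detection and ICP-style calibration of the point-cloud
# block map against the reference map (both Nx2 arrays, metres).
import numpy as np
from scipy.spatial import cKDTree

DRIFT_THRESHOLD = 0.2   # assumed mean-residual threshold (m)

def has_drift(block_map: np.ndarray, map_pts: np.ndarray) -> bool:
    dists, _ = cKDTree(map_pts).query(block_map)
    return float(dists.mean()) > DRIFT_THRESHOLD

def icp_calibrate(block_map: np.ndarray, map_pts: np.ndarray,
                  iters: int = 20) -> np.ndarray:
    """Rigidly align the block map to the reference map (2D ICP)."""
    src, tree = block_map.copy(), cKDTree(map_pts)
    for _ in range(iters):
        _, idx = tree.query(src)                 # nearest map-point pairs
        tgt = map_pts[idx]
        mu_s, mu_t = src.mean(axis=0), tgt.mean(axis=0)
        U, _, Vt = np.linalg.svd((src - mu_s).T @ (tgt - mu_t))
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                 # guard against reflections
            Vt[-1] *= -1.0
            R = Vt.T @ U.T
        src = (src - mu_s) @ R.T + mu_t          # best-fit rotation + shift
    return src
```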
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the invention, not to limit it. Those skilled in the art should understand that modifications and equivalent substitutions may be made to the technical solution of the invention without departing from its spirit and scope, and all such modifications and equivalents fall within the scope of the claims of the invention.
Claims (2)
1. An unmanned positioning method based on a weak-signal scene, characterized in that positioning data of the vehicle to be positioned are collected and perceived, and an accurate position and pose are obtained after fusion and processing of the information, specifically as follows:
S1, start the cruising central control system of the intelligent vehicle; if satellite signals can be acquired, perform accurate positioning directly by satellite; if no satellite signal is found, jump to S2;
s2, collecting and sensing positioning data of a vehicle to be positioned, wherein the positioning data comprise visual point cloud image data, laser radar point cloud data and ultrasonic radar data;
the specific method comprises the following steps:
read a visual point-cloud image, intercept visual point-cloud data of three planes, and convert them into laser-radar-format data; if the matching degree between the obtained laser-radar-format data and the map exceeds 60%, wait for point-cloud fusion; otherwise, re-read the visual point-cloud image and repeat until the matching degree exceeds 60%;
read the laser radar point-cloud data and extract laser radar obstacle data from it; compare these with the map's obstacle data; if the overlap ratio with the map obstacles exceeds 70%, wait for point-cloud fusion; otherwise, re-read the laser radar point-cloud data and repeat until the overlap ratio exceeds 70%;
read the ultrasonic radar data and filter them: based on the reflection opening angle, read only obstacles within a range consistent with the size of the cart, and add reflection-obstacle marks on an ultrasonic layer; if the matching degree between the reflection-obstacle marks and the map exceeds 50%, convert the corresponding ultrasonic radar data into point-cloud data and wait for point-cloud fusion; otherwise, re-read the ultrasonic radar data and repeat until the matching degree exceeds 50%;
S3, fuse the visual point cloud image data, the laser radar point cloud data, and the ultrasonic radar data obtained in S2 to obtain a point-cloud block map;
if the point-cloud block map shows no drift, perform accurate positioning to obtain the position and pose;
if the point-cloud block map shows drift, calibrate the position first and then perform accurate positioning to obtain the position and pose.
2. The unmanned positioning method based on a weak-signal scene according to claim 1, characterized in that the visual point cloud image data in step S2 are obtained by extracting depth information with a depth camera and extracting feature points from the RGB image information.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111016117.5A (granted as CN113777644B) | 2021-08-31 | 2021-08-31 | Unmanned positioning method based on weak signal scene
Publications (2)
Publication Number | Publication Date |
---|---|
CN113777644A CN113777644A (en) | 2021-12-10 |
CN113777644B (en) | 2023-06-02
Family
ID=78840475
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111016117.5A | CN113777644B (en), Active | 2021-08-31 | 2021-08-31
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113777644B (en) |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108732603B (en) * | 2017-04-17 | 2020-07-10 | 百度在线网络技术(北京)有限公司 | Method and device for locating a vehicle |
US11113584B2 (en) * | 2020-02-04 | 2021-09-07 | Nio Usa, Inc. | Single frame 4D detection using deep fusion of camera image, imaging RADAR and LiDAR point cloud |
- 2021-08-31: application CN202111016117.5A filed; granted as CN113777644B (status: Active)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109116374A (en) * | 2017-06-23 | 2019-01-01 | 百度在线网络技术(北京)有限公司 | Determine the method, apparatus, equipment and storage medium of obstacle distance |
CN109031304A (en) * | 2018-06-06 | 2018-12-18 | 上海国际汽车城(集团)有限公司 | Vehicle positioning method in view-based access control model and the tunnel of millimetre-wave radar map feature |
CN211765500U (en) * | 2019-11-27 | 2020-10-27 | 北京新能源汽车技术创新中心有限公司 | Intelligent driving environment sensing system used in closed scene and automobile |
CN111257892A (en) * | 2020-01-09 | 2020-06-09 | 武汉理工大学 | Obstacle detection method for automatic driving of vehicle |
CN111796299A (en) * | 2020-06-10 | 2020-10-20 | 东风汽车集团有限公司 | Obstacle sensing method and device and unmanned sweeper |
CN111796287A (en) * | 2020-07-29 | 2020-10-20 | 上海坤聿智能科技有限公司 | Automatic drive car road data acquisition and analysis device |
CN113109821A (en) * | 2021-04-28 | 2021-07-13 | 武汉理工大学 | Mapping method, device and system based on ultrasonic radar and laser radar |
Non-Patent Citations (2)
Title |
---|
Li Yang. A survey of obstacle detection technology for intelligent vehicles. Popular Science & Technology, 2019, pp. 65-68. *
Wang Dongmin; Peng Yongsheng; Li Yongle. A depth image acquisition method fusing vision and laser point clouds. Journal of Military Transportation University, 2017, (10), pp. 80-84. *
Also Published As
Publication number | Publication date |
---|---|
CN113777644A (en) | 2021-12-10 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication |
| SE01 | Entry into force of request for substantive examination |
| GR01 | Patent grant |