CN111964673A - Unmanned vehicle positioning system - Google Patents
Unmanned vehicle positioning system
- Publication number
- CN111964673A CN111964673A CN202010865439.6A CN202010865439A CN111964673A CN 111964673 A CN111964673 A CN 111964673A CN 202010865439 A CN202010865439 A CN 202010865439A CN 111964673 A CN111964673 A CN 111964673A
- Authority
- CN
- China
- Prior art keywords
- unmanned vehicle
- controller
- camera
- point cloud
- cloud data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/87—Combinations of systems using electromagnetic waves other than radio waves
- G01S17/875—Combinations of systems using electromagnetic waves other than radio waves for determining attitude
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
- G01S19/49—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system whereby the further system is an inertial position system, e.g. loosely-coupled
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Automation & Control Theory (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
Abstract
An embodiment of the invention discloses an unmanned vehicle positioning system. The system comprises a controller, a camera and a laser radar. The camera collects images of road signs preset in the park and sends the collected images to the controller. The laser radar is electrically connected with the camera, collects point cloud data of the road signs and sends the point cloud data to the controller. The controller is electrically connected with the camera and the laser radar respectively, receives the images and point cloud data of the road signs, and determines the position and direction of the unmanned vehicle based on them. Accurate positioning of the unmanned vehicle in the park is thereby achieved without the need to construct and maintain map data.
Description
Technical Field
The embodiment of the invention relates to a positioning technology, in particular to an unmanned vehicle positioning system.
Background
At present, unmanned driving technology has spread into military, industrial and civilian fields, and in the course of its continuous development, autonomous navigation and positioning has become an important mark of the intelligence of unmanned driving technology.
At present, the following modes are mostly adopted for positioning and navigating unmanned vehicles in a park: GPS-inertial-navigation differential positioning, visual odometry positioning, and visual lane line and road sign assisted positioning.
With GPS-inertial-navigation differential positioning, GPS signals can be lost under occlusion by trees, buildings and the like, and positioning accuracy cannot be guaranteed. With visual odometry positioning and visual lane line and road sign assisted positioning, lane lines are absent in most intersection environments, so they cannot be relied upon and the positioning requirements of automatic driving cannot be met. Meanwhile, map data of the park may change over time or with the surrounding environment, so the map data needs to be maintained in real time; if maintenance is not timely, the unmanned vehicle may navigate inaccurately while driving in the park, and maintaining the map data requires a large amount of time and capital cost.
Disclosure of Invention
The embodiment of the invention provides an unmanned vehicle positioning system, which aims to realize the effect of accurately positioning a park unmanned vehicle under the condition of not constructing and maintaining map data.
The embodiment of the invention provides an unmanned vehicle positioning system, which comprises: the system comprises a controller, an identification positioning device and a GPS inertial navigation device;
the identification positioning device is used for acquiring an image and point cloud data of a road sign preset in the park and sending the image and point cloud data of the road sign to the controller;
the GPS inertial navigation equipment is used for transmitting a GPS signal to a base station and sending the returned GPS signal data to the controller;
the controller is in communication connection with the identification positioning device and the GPS inertial navigation device respectively, and is used for receiving the images and point cloud data of the road signs and the GPS signal data; when the GPS signal data is greater than or equal to a preset signal threshold, the position and direction of the unmanned vehicle are determined by combining identification positioning with GPS inertial navigation positioning, and when the GPS signal data is smaller than the preset signal threshold, the position and direction of the unmanned vehicle are determined by identification positioning alone.
According to the technical scheme of the embodiments of the invention, an unmanned vehicle positioning system is designed. The system comprises a controller, a camera and a laser radar. The camera collects images of road signs preset in the park and sends the collected images to the controller; the laser radar is electrically connected with the camera, collects point cloud data of the road signs and sends the point cloud data to the controller; the controller is electrically connected with the camera and the laser radar respectively, receives the images and point cloud data of the road signs, and determines the position and direction of the unmanned vehicle based on them. The position and direction of the unmanned vehicle in the park can thus be accurately determined by the camera and the laser radar, so accurate positioning of unmanned vehicles in the park can be achieved without constructing map data or keeping it maintained over time.
Drawings
Fig. 1 is a schematic structural diagram of an unmanned vehicle positioning system according to a first embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating a determination of point cloud data that does not match landmarks in an image according to a first embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an unmanned vehicle positioning system according to a second embodiment of the present invention;
fig. 4 is a schematic view of a positioning process of an unmanned vehicle according to a second embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a schematic structural diagram of an unmanned vehicle positioning system according to an embodiment of the present invention, which is applicable to accurately positioning an unmanned vehicle in a park without constructing and maintaining map data. As shown in fig. 1, the system includes: controller 10, camera 20 and lidar 30.
The camera 20 is configured to collect an image of a road sign preset in the park, and send the collected image of the road sign to the controller 10; the laser radar 30 is electrically connected with the camera 20 and used for collecting point cloud data of the road sign and sending the point cloud data to the controller 10; and the controller 10 is respectively electrically connected with the camera 20 and the laser radar 30 and is used for receiving the images and the point cloud data of the road signs and determining the position and the direction of the unmanned vehicle based on the images and the point cloud data of the road signs.
For example, the preset road sign may be a sign previously set at a specific location in the park. The road sign may be a plate of fixed shape and size on which an ArUco code is drawn. A specific location in the park may be, for example, a place where a turn is required.
It should be noted that the shape and size of the road sign are determined when it is set; for example, the road sign may be a square structure 1 meter in length and width, and may be a flat panel of rigid plastic. The shape, size and material of the road sign can be set according to the user's requirements and are not limited herein.
When the unmanned vehicle drives on a straight road, it can drive directly according to the lane lines and the like; but when it needs to turn, particularly when there are many curves spaced closely together, the lane lines may be confused and the vehicle cannot follow them well. Road signs can therefore be arranged at such places, so that the identification positioning device can determine the position of the unmanned vehicle from the preset road signs.
It should be noted that a road sign is arranged on each road section of the park where a turn is required, and the ArUco code on each road sign is unique, so as to distinguish different road sections.
The camera 20 is provided on the unmanned vehicle; it collects an image at predetermined intervals (for example, every half minute) and transmits the collected image to the controller 10. When the controller 10 receives the image, it recognizes the image, and if it is an image of a road sign, it recognizes the ArUco code on the road sign to determine on which turning road section the unmanned vehicle is. For example, if the controller determines that the ArUco code on the road sign is No. 1, it determines that the unmanned vehicle is on turning road section No. 1.
It should be noted that the controller 10 may identify whether the image is a road sign image, and recognize the ArUco code on the road sign, using a neural network or other AI techniques, so as to improve the efficiency of image recognition. Identifying whether an image is a road sign image and recognizing the ArUco code on a road sign with such techniques are prior art and are not described in detail herein.
Of course, other ways of identifying whether the image is a road sign image and of recognizing the ArUco code on a road sign are also possible, and any such way falls within the protection scope of the present invention.
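The mapping from a recognized ArUco code to a turning road section described above can be sketched as a simple lookup table; the marker IDs and section names below are illustrative assumptions, not values from the patent.

```python
# Illustrative lookup from a recognized ArUco marker ID to the turning
# road section it marks (IDs and names are assumed for the example).
SECTION_BY_MARKER_ID = {
    1: "turning section 1",
    2: "turning section 2",
    3: "turning section 3",
}

def section_for_marker(marker_id):
    """Return the turning road section for a marker ID, or None if unknown."""
    return SECTION_BY_MARKER_ID.get(marker_id)
```

Because each road sign carries a unique code, a single dictionary lookup suffices to tell the controller which turn the vehicle is approaching.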
Optionally, here the controller 10 is further configured to: determining a first relative pose of the landmark and the unmanned vehicle based on the received image of the landmark, the attribute information of the landmark and an imaging position of the landmark in the image, wherein the attribute information at least comprises: the size of the road sign and the shape of the road sign.
For example, the first relative pose may be a relative position and angular relationship of the landmark and the unmanned vehicle determined by the camera and the controller.
When the controller 10 receives the image of the road sign and recognizes the ArUco code on it, the approximate position of the unmanned vehicle is determined, and the controller can calculate the relative pose of the road sign and the unmanned vehicle from the homography of the image, according to the size and shape of the road sign and its imaging position in the image.
It should be noted that, the relative pose between the landmark and the unmanned vehicle can be calculated according to the homography of the image, which belongs to the prior art and is not described herein again.
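The homography-based pose recovery mentioned above can be sketched as follows, assuming a pinhole camera with a known intrinsic matrix K and a planar square road sign of known size; a production system would typically use a calibrated PnP solver, so this is only a minimal illustration of the principle.

```python
import numpy as np

def homography_dlt(obj_pts, img_pts):
    # Direct linear transform: homography mapping planar road-sign
    # coordinates (metres, Z = 0) to pixel coordinates.
    rows = []
    for (X, Y), (u, v) in zip(obj_pts, img_pts):
        rows.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        rows.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # fix scale and sign

def sign_pose(H, K):
    # For a planar target, H = K [r1 r2 t]; undo K and renormalize the
    # columns to recover the sign's rotation and translation in the
    # camera frame.
    B = np.linalg.inv(K) @ H
    s = 1.0 / np.linalg.norm(B[:, 0])
    r1, r2, t = s * B[:, 0], s * B[:, 1], s * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    return R, t
```

Given the four detected corner pixels of a 1 m square sign, `homography_dlt` followed by `sign_pose` yields the first relative pose (rotation and translation) used later for positioning.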
Optionally, the controller 10 is further configured to: based on the received point cloud data of the road signs and the images of the road signs, deleting the point cloud data which are not matched with the road signs in the images in the point cloud data to obtain filtered point cloud data; and determining a second relative pose of the road sign and the unmanned vehicle through distance calculation and point cloud plane fitting based on the filtered point cloud data.
For example, the point cloud data that does not match the road sign in the image can be determined by projecting the received point cloud data onto the image of the road sign according to the projection relationship between the laser radar and the camera. Referring to fig. 2, a schematic diagram of determining point cloud data that does not match the road sign in the image: box 1 is the acquired image of the road sign, box 2 is the road sign, the point cloud data inside box 2 is the point cloud data that matches the road sign in the image, and the point cloud data outside box 1 (point cloud Q) is the point cloud data that does not match the road sign in the image.
The second relative pose may be the relative position and angular relationship of the landmark to the unmanned vehicle as determined by the lidar and the controller.
The laser radar 30 may be provided on an unmanned vehicle and electrically connected to the camera 20. The laser radar 30 is used for collecting point cloud data of the road sign and transmitting the point cloud data to the controller 10. The controller 10 may delete the point cloud data that is not matched with the road sign in the image from the point cloud data according to the projection relationship between the laser radar and the camera to obtain filtered point cloud data, perform distance calculation on the filtered point cloud data to obtain the relative position relationship between the road sign and the unmanned vehicle, and obtain the relative angle (orientation) relationship between the road sign and the unmanned vehicle through a point cloud plane fitting algorithm. Thus, the relative position and angle relationship between the road sign and the unmanned vehicle can be determined.
It should be noted that, here, the distance calculation is performed on the filtered point cloud data to obtain the relative position relationship between the road sign and the unmanned vehicle, and the relative angle (orientation) relationship between the road sign and the unmanned vehicle can be obtained through a point cloud plane fitting algorithm, and the distance calculation algorithm and the point cloud plane fitting algorithm both belong to the prior art and are not described here again.
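The filtering, distance calculation and plane-fitting steps can be sketched as follows, assuming the lidar points have already been transformed into the camera frame by an extrinsic calibration (an assumption; the patent does not give these details) and using the sign's bounding box in the image to stand in for the projection-matching step.

```python
import numpy as np

def filter_sign_points(cloud, K, u_min, u_max, v_min, v_max):
    # Project lidar points (assumed to be in the camera frame) into the
    # image and keep only those inside the road sign's bounding box;
    # points outside the box are the unmatched cloud and are discarded.
    pts = np.asarray(cloud, dtype=float)
    pts = pts[pts[:, 2] > 0]            # keep points in front of the camera
    proj = (K @ pts.T).T
    uv = proj[:, :2] / proj[:, 2:3]
    keep = ((uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) &
            (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max))
    return pts[keep]

def fit_plane(points):
    # Least-squares plane through the filtered points: the normal is the
    # singular vector with the smallest singular value of the centred
    # points; the centroid gives the distance to the sign along it.
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, Vt = np.linalg.svd(pts - centroid)
    normal = Vt[-1] / np.linalg.norm(Vt[-1])
    return normal, centroid
```

The centroid's range gives the relative position of the sign, while the fitted normal gives the relative orientation, together forming the second relative pose.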
Optionally, the controller 10 is further configured to: determine a first position and a first direction of the unmanned vehicle based on the first relative pose; determine a second position and a second direction of the unmanned vehicle based on the second relative pose; and fuse the first position with the second position, and the first direction with the second direction, to obtain the position and direction of the unmanned vehicle.
For example, the first position may be a position of the unmanned vehicle determined based on the first relative pose; the first direction may be a direction of the unmanned vehicle determined based on the first relative pose.
The second position may be a position of the unmanned vehicle determined based on the second relative pose; the second direction may be a direction of the unmanned vehicle determined based on the second relative pose.
After the first relative pose of the road sign and the unmanned vehicle is determined, since the position and orientation of the road sign are known, the position and direction of the unmanned vehicle can be determined from this relative pose, that is, the actual position and direction of the unmanned vehicle in the park: the first position and the first direction.
Likewise, after the second relative pose of the road sign and the unmanned vehicle is determined, since the position and orientation of the road sign are known, the position and direction of the unmanned vehicle can be determined from this relative pose, that is, the actual position and direction of the unmanned vehicle in the park: the second position and the second direction.
The first position and the second position are fused by Kalman filtering to obtain an accurate position of the unmanned vehicle in the park, and the first direction and the second direction are fused by Kalman filtering to obtain an accurate direction of the unmanned vehicle in the park.
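For a single static update, the Kalman-filter fusion of two independent estimates reduces to an information-weighted average; the sketch below assumes Gaussian estimates with known covariances (the patent does not give the filter details), and note that fusing directions in practice also requires angle wrapping, omitted here.

```python
import numpy as np

def fuse_estimates(x1, P1, x2, P2):
    # Static Kalman update fusing two independent estimates of the same
    # state: fused covariance is (P1^-1 + P2^-1)^-1 and the fused mean
    # weights each estimate by its information (inverse covariance).
    I1 = np.linalg.inv(np.asarray(P1, dtype=float))
    I2 = np.linalg.inv(np.asarray(P2, dtype=float))
    P = np.linalg.inv(I1 + I2)
    x = P @ (I1 @ np.asarray(x1, dtype=float) + I2 @ np.asarray(x2, dtype=float))
    return x, P
```

With equal covariances the result is the plain average; a less certain sensor (larger covariance) is automatically down-weighted.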
In this way, the position and direction of the unmanned vehicle in the park can be accurately determined by the camera and the laser radar, so accurate positioning of unmanned vehicles in the park can be achieved without constructing map data or keeping it maintained over time.
According to the technical scheme of this embodiment of the invention, an unmanned vehicle positioning system is designed. The system comprises a controller, a camera and a laser radar. The camera collects images of road signs preset in the park and sends the collected images to the controller; the laser radar is electrically connected with the camera, collects point cloud data of the road signs and sends the point cloud data to the controller; the controller is electrically connected with the camera and the laser radar respectively, receives the images and point cloud data of the road signs, and determines the position and direction of the unmanned vehicle based on them. The position and direction of the unmanned vehicle in the park can thus be accurately determined by the camera and the laser radar, so accurate positioning of unmanned vehicles in the park can be achieved without constructing map data or keeping it maintained over time.
Example two
Fig. 3 is a schematic structural diagram of an unmanned vehicle positioning system according to a second embodiment of the present invention, and the second embodiment of the present invention may be combined with various alternatives in the foregoing embodiments. In this embodiment of the present invention, optionally, the system further includes: a GPS inertial navigation device 40 electrically connected to the controller 10.
Optionally, the GPS inertial navigation device 40 is configured to transmit a GPS signal to the base station and send the returned GPS signal data to the controller 10. The controller 10 is configured to receive the GPS signal data; when the GPS signal data is greater than or equal to a preset signal threshold, the position and direction of the unmanned vehicle are determined by cooperative positioning of the camera, the laser radar and GPS inertial navigation positioning, and when the GPS signal data is smaller than the preset signal threshold, the position and direction of the unmanned vehicle are determined by cooperative positioning of the camera and the laser radar.
For example, the GPS inertial navigation device may be a GPS inertial navigation system that integrates GPS and inertial navigation technologies to locate the unmanned vehicle.
The preset signal threshold may be a preset threshold of GPS signal intensity. When the signal intensity of the GPS signal data received by the controller is greater than or equal to this threshold, the GPS inertial navigation device can also be used to position the unmanned vehicle at that moment, and the position and direction of the unmanned vehicle can be determined by cooperative positioning of the camera, the laser radar and GPS inertial navigation positioning.
When the signal intensity of the GPS signal data received by the controller is smaller than this threshold, the positioning effect of the GPS inertial navigation device is poor at that moment (possibly due to occlusion by buildings, trees and the like) and it cannot position the unmanned vehicle; the position and direction of the unmanned vehicle can then be determined by cooperative positioning of the camera and the laser radar.
Determining the position and direction of the unmanned vehicle by cooperative positioning of the camera, the laser radar and GPS inertial navigation positioning may mean fusing the first position and first direction determined by the camera with the second position and second direction determined by the laser radar, as in the first embodiment, and further fusing the position and direction determined by the GPS inertial navigation device, so as to obtain a more accurate position and direction of the unmanned vehicle.
Determining the position and direction of the unmanned vehicle by cooperative positioning of the camera and the laser radar means determining the position and direction of the unmanned vehicle in the park using the method of the first embodiment.
Optionally, the controller 10 is further configured to: and judging whether the camera 20 collects the image of the road sign.
For example, referring to the schematic positioning flow chart of the unmanned vehicle in fig. 4, the controller first needs to judge whether the camera has acquired an image of a road sign, that is, whether the unmanned vehicle is currently within a preset distance of a corner (for example, 10 meters from the corner), so that the positioning mode can be chosen according to the result.
Optionally, the controller 10 is further configured to: when the controller 10 determines that the road sign is not acquired by the camera 20 and the controller 10 determines that the GPS signal data is greater than or equal to a preset signal threshold, determining the position and the direction of the unmanned vehicle by adopting a GPS inertial navigation positioning mode; when the controller 10 determines that the road sign is not acquired by the camera 20 and the controller 10 determines that the GPS signal data is smaller than a preset signal threshold, it is determined that the positioning of the unmanned vehicle fails.
For example, as shown in fig. 4, when the controller 10 determines that the camera 20 does not collect a landmark and the controller 10 determines that the GPS signal data is greater than or equal to the preset signal threshold, the unmanned vehicle may be on a straight road at this time, and then the position and the direction of the unmanned vehicle are determined by using a GPS inertial navigation positioning manner.
When the controller 10 determines that the camera 20 has not collected a road sign and that the GPS signal data is smaller than the preset signal threshold, the unmanned vehicle may be on a straight road with the GPS signal weakened by building occlusion. The position and direction of the unmanned vehicle cannot then be determined by GPS inertial navigation positioning; and since no road sign has been collected, the camera and laser radar positioning mode has not started either, so the vehicle cannot be positioned by camera-lidar cooperative positioning. The position and direction of the unmanned vehicle therefore cannot be determined at this moment.
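The mode-selection flow of fig. 4 described above can be summarized as a small decision function; the mode names and the scalar GPS strength measure are illustrative assumptions, not terms from the patent.

```python
def select_positioning_mode(sign_detected, gps_strength, threshold):
    # Decision flow sketched from fig. 4: a road-sign detection enables
    # camera+lidar positioning, and the GPS signal strength relative to
    # the preset threshold gates GPS inertial navigation positioning.
    gps_ok = gps_strength >= threshold
    if sign_detected and gps_ok:
        return "camera+lidar+gps-ins"
    if sign_detected:
        return "camera+lidar"
    if gps_ok:
        return "gps-ins"
    return "positioning-failed"
```

The four return values correspond to the four outcomes described in this embodiment, including the failure case when neither a road sign nor an adequate GPS signal is available.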
Thus, when the unmanned vehicle cannot be positioned by camera-lidar cooperative positioning, if the GPS inertial navigation device is usable, the position and direction of the unmanned vehicle can be determined by GPS inertial navigation positioning. In this way, even when the camera and laser radar have not started positioning, the unmanned vehicle can still be positioned with the aid of the GPS inertial navigation device, so positioning of the unmanned vehicle can be realized at any time.
According to the technical scheme of this embodiment of the invention, by providing the GPS inertial navigation device, when the unmanned vehicle cannot be positioned by camera-lidar cooperative positioning, the position and direction of the unmanned vehicle can still be determined by GPS inertial navigation positioning if that device is usable. Thus, even when the camera and laser radar have not started positioning, the unmanned vehicle can be positioned with the aid of the GPS inertial navigation device, and positioning of the unmanned vehicle can be realized at any time.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.
Claims (8)
1. An unmanned vehicle positioning system, comprising: the device comprises a controller, a camera and a laser radar;
the camera is used for collecting images of road signs preset in the park and sending the collected images of the road signs to the controller;
the laser radar is electrically connected with the camera and used for collecting point cloud data of the road sign and sending the point cloud data to the controller;
the controller is respectively electrically connected with the camera and the laser radar and is used for receiving the images and the point cloud data of the road signs and determining the position and the direction of the unmanned vehicle based on the images and the point cloud data of the road signs.
2. The system of claim 1, wherein the controller is further configured to:
identifying the landmark, and determining a first relative pose between the landmark and the unmanned vehicle based on the received landmark image, attribute information of the landmark, and the imaging position of the landmark in the image, wherein the attribute information comprises at least: the size of the landmark and the shape of the landmark.
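The geometry behind claim 2 can be sketched as follows. The claim does not specify the computation, so the pinhole model, the function name and its parameters are illustrative assumptions: a landmark of known physical size subtends a pixel width that yields its range, and its horizontal imaging position yields a bearing.

```python
import numpy as np

def first_relative_pose(f_px, cx, landmark_width_m, bbox_u):
    """Estimate landmark range and bearing from its apparent width and
    horizontal imaging position, using a pinhole camera model.
    f_px: focal length in pixels; cx: principal point x (pixels);
    landmark_width_m: known physical width from the attribute information;
    bbox_u: (u_min, u_max) horizontal extent of the landmark in the image."""
    u_min, u_max = bbox_u
    width_px = u_max - u_min
    range_m = f_px * landmark_width_m / width_px    # similar triangles
    u_center = 0.5 * (u_min + u_max)
    bearing_rad = np.arctan2(u_center - cx, f_px)   # angle off the optical axis
    return range_m, bearing_rad
```

For instance, a 0.5 m wide landmark spanning 100 px under a 1000 px focal length is estimated at 5 m range; its bearing follows from how far its center sits from the principal point.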
3. The system of claim 1, wherein the controller is further configured to:
based on the received point cloud data of the landmark and the landmark image, deleting from the point cloud data those points that do not match the landmark in the image, to obtain filtered point cloud data;
and determining a second relative pose between the landmark and the unmanned vehicle through distance calculation and point cloud plane fitting based on the filtered point cloud data.
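The filtering and plane-fitting steps of claim 3 can be sketched like this. The patent does not disclose the exact matching or fitting procedure, so the range-gating filter (a stand-in for matching points against the image) and the SVD plane fit are assumptions for illustration:

```python
import numpy as np

def filter_points(points, expected_range_m, tol_m=0.5):
    """Keep only points whose range agrees with an expected landmark range;
    a simple stand-in for discarding points that do not match the landmark."""
    ranges = np.linalg.norm(points, axis=1)
    return points[np.abs(ranges - expected_range_m) < tol_m]

def second_relative_pose(points):
    """Fit a plane to the filtered landmark points via SVD and return the
    unit plane normal and the perpendicular distance from the sensor origin."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                    # direction of least variance
    distance = abs(normal @ centroid)  # point-to-plane distance from origin
    return normal, distance
```

On a synthetic flat landmark at z = 5 m with one stray point, the filter drops the outlier and the fit recovers a normal along the z-axis at 5 m distance.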
4. The system of claim 1, wherein the controller is further configured to:
determining a first position and a first direction of the unmanned vehicle based on the first relative pose;
determining a second position and a second direction of the unmanned vehicle based on the second relative pose;
and fusing the first position with the second position, and the first direction with the second direction, to obtain the position and direction of the unmanned vehicle.
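Claim 4 does not name a fusion method; one plausible scheme is an inverse-variance weighted average, with directions fused through their unit vectors so that wraparound at ±π is handled correctly. The functions below are an illustrative sketch under that assumption:

```python
import math

def fuse_scalar(a, var_a, b, var_b):
    """Inverse-variance weighted average of two position estimates --
    one plausible fusion scheme; the claim does not specify a method."""
    wa, wb = 1.0 / var_a, 1.0 / var_b
    return (wa * a + wb * b) / (wa + wb)

def fuse_direction(theta_a, var_a, theta_b, var_b):
    """Fuse two headings via their sine/cosine components so that angles
    near the +/-pi seam average correctly instead of cancelling."""
    wa, wb = 1.0 / var_a, 1.0 / var_b
    s = wa * math.sin(theta_a) + wb * math.sin(theta_b)
    c = wa * math.cos(theta_a) + wb * math.cos(theta_b)
    return math.atan2(s, c)
```

Fusing equally confident positions of 10 m and 12 m yields 11 m, while headings of π − 0.1 and −π + 0.1 fuse to ±π rather than the naive arithmetic mean of 0.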
5. The system of claim 1, wherein the camera identifies the landmark by means of a neural network.
6. The system of claim 1, further comprising: a GPS inertial navigation device electrically connected with the controller;
the GPS inertial navigation device is configured to transmit a GPS signal to a base station and to send the returned GPS signal data to the controller;
and the controller is configured to receive the GPS signal data; when the GPS signal data is greater than or equal to a preset signal threshold, the controller determines the target position and target direction of the unmanned vehicle through cooperative positioning of the camera, the laser radar and GPS inertial navigation, and when the GPS signal data is smaller than the preset signal threshold, the controller determines the position and direction of the unmanned vehicle through cooperative positioning of the camera and the laser radar.
7. The system of claim 6, wherein, before determining whether the GPS signal data is greater than the preset signal threshold, the controller is further configured to:
determine whether the camera has captured and identified a landmark.
8. The system of claim 7, wherein the controller is further configured to:
when the controller determines that the camera has not captured a landmark and that the GPS signal data is greater than or equal to the preset signal threshold, determine the position and direction of the unmanned vehicle through GPS inertial navigation positioning;
and when the controller determines that the camera has not captured a landmark and that the GPS signal data is smaller than the preset signal threshold, determine that positioning of the unmanned vehicle has failed.
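The mode-selection logic of claims 6-8 reduces to a small decision table. The sketch below restates it in code; the mode names and the function signature are illustrative, not part of the patent:

```python
def select_positioning_mode(landmark_identified, gps_signal, threshold):
    """Positioning-mode selection following claims 6-8:
    landmark seen + strong GPS -> camera/lidar/GPS cooperative positioning;
    landmark seen + weak GPS   -> camera/lidar cooperative positioning;
    no landmark  + strong GPS  -> GPS inertial navigation positioning;
    no landmark  + weak GPS    -> positioning failure."""
    gps_ok = gps_signal >= threshold
    if landmark_identified:
        return "camera+lidar+gps" if gps_ok else "camera+lidar"
    return "gps-ins" if gps_ok else "positioning-failed"
```

Each of the four branches corresponds to exactly one outcome claimed in the system.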
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010865439.6A CN111964673A (en) | 2020-08-25 | 2020-08-25 | Unmanned vehicle positioning system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111964673A true CN111964673A (en) | 2020-11-20 |
Family
ID=73390918
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010865439.6A Pending CN111964673A (en) | 2020-08-25 | 2020-08-25 | Unmanned vehicle positioning system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111964673A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114003041A (en) * | 2021-11-02 | 2022-02-01 | 中山大学 | Multi-unmanned vehicle cooperative detection system |
CN114383620A (en) * | 2021-11-30 | 2022-04-22 | 江铃汽车股份有限公司 | Vehicle accurate position obtaining method and system, readable storage medium and vehicle |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102012110595A1 (en) * | 2012-11-06 | 2014-05-08 | Conti Temic Microelectronic Gmbh | Method and device for detecting traffic signs for a vehicle |
CN108303721A (en) * | 2018-02-12 | 2018-07-20 | 北京经纬恒润科技有限公司 | Vehicle positioning method and system |
CN108694731A (en) * | 2018-05-11 | 2018-10-23 | 武汉环宇智行科技有限公司 | Fusion positioning method and device based on low-beam lidar and binocular camera |
CN108871314A (en) * | 2018-07-18 | 2018-11-23 | 江苏实景信息科技有限公司 | Positioning and orientation method and device |
CN109100741A (en) * | 2018-06-11 | 2018-12-28 | 长安大学 | Object detection method based on 3D lidar and image data |
CN109341706A (en) * | 2018-10-17 | 2019-02-15 | 张亮 | Method for producing a multi-feature fusion map for driverless vehicles |
CN109583415A (en) * | 2018-12-11 | 2019-04-05 | 兰州大学 | Traffic light detection and recognition method based on lidar and camera fusion |
CN110243358A (en) * | 2019-04-29 | 2019-09-17 | 武汉理工大学 | Multi-source fusion indoor and outdoor localization method and system for unmanned vehicles |
CN110533718A (en) * | 2019-08-06 | 2019-12-03 | 杭州电子科技大学 | Navigation and localization method using monocular-vision artificial landmarks to aid INS |
CN210129116U (en) * | 2019-08-07 | 2020-03-06 | 苏州索亚机器人技术有限公司 | Autonomous driving vehicle for park scenarios |
History
- 2020-08-25: Application filed as CN202010865439.6A (publication CN111964673A); status: Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110174093B (en) | Positioning method, device, equipment and computer readable storage medium | |
CN106352867B (en) | Method and device for determining the position of a vehicle | |
CN109084786B (en) | Map data processing method | |
CN108257410B (en) | Parking space accurate navigation method with cooperation of field monitoring and navigation system | |
Brenner | Extraction of features from mobile laser scanning data for future driver assistance systems | |
US8134480B2 (en) | Image processing system and method | |
US9948853B2 (en) | Camera parameter calculation device, navigation system and camera parameter calculation method | |
CN112074885A (en) | Lane sign positioning | |
CN107664500B (en) | garage vehicle positioning and navigation method based on image feature recognition | |
CN110763246A (en) | Automatic driving vehicle path planning method and device, vehicle and storage medium | |
US20200033153A1 (en) | System for creating a vehicle surroundings model | |
CN113034960A (en) | Object change detection system for updating precision route map and method thereof | |
US10942519B2 (en) | System and method for navigating an autonomous driving vehicle | |
CN104748736A (en) | Positioning method and device | |
US20200174492A1 (en) | Autonomous driving method and system using road view or aerial view map information | |
CN111123334B (en) | Multi-vehicle cooperative positioning platform and positioning method under limit working condition | |
CN108805930A (en) | The localization method and system of automatic driving vehicle | |
CN211792049U (en) | Vehicle-road cooperative auxiliary system and vehicle | |
EP4040111A1 (en) | Map processing method and apparatus | |
CN102997926A (en) | Method for acquiring navigation data | |
CN113034540A (en) | Automatic precise road map generation system and method based on camera | |
CN114518122A (en) | Driving navigation method, driving navigation device, computer equipment, storage medium and computer program product | |
CN111964673A (en) | Unmanned vehicle positioning system | |
CN114724398A (en) | Parking appointment method and system based on automatic driving and readable storage medium | |
TW202118995A (en) | Vehicle navigation switching device for golf course self-driving cars |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2020-11-20