CN210038170U - Tightly-coupled automatic driving sensing system - Google Patents
- Publication number: CN210038170U
- Application number: CN201822154553.9U
- Authority
- CN
- China
- Prior art keywords
- module
- satellite
- inertial
- navigation
- stereoscopic vision
- Prior art date
- Legal status: Expired - Fee Related
Abstract
The utility model discloses a tightly coupled automatic driving perception system, comprising: a binocular direct-method stereoscopic vision image processing module, a satellite navigation module, an inertial navigation module, and a system tight-coupling module. The inertial navigation module acquires inertial measurement data with an inertial sensor; the stereoscopic vision image processing module acquires image data with a binocular camera; the satellite navigation module acquires raw measurement data from navigation satellites through a receiver; and the system tight-coupling module performs tight-coupling processing on the inertial sensor measurement data, the image data, and the raw satellite navigation measurement data. Through tight coupling, the utility model corrects the drift error of the inertial navigation module, improving positioning accuracy without resorting to a laser scanning radar and thereby reducing the cost of the autonomous vehicle.
Description
Technical Field
The utility model relates to the technical field of computer applications, and in particular to a tightly coupled automatic driving perception system.
Background
In recent years, with growing awareness of automobile safety and the development of information technology, the field of automatic driving has attracted increasing attention. Many companies and research institutions worldwide have begun investing in the research and development of automatic-driving products, and autonomous vehicles are expected to enter the market around 2021, bringing great change to the automobile industry. Studies show that the development of automatic driving can be transformative in many areas: it can improve road traffic safety, relieve congestion, and reduce environmental pollution. Automatic driving technology is also an important measure of a nation's research strength and industrial level, with broad application prospects in national defense and the national economy.
Automatic driving is a technology in which a vehicle senses the road environment through an on-board perception system and controls its own steering and speed according to the sensed road, vehicle position, and obstacle information, so as to plan a driving route automatically and drive the vehicle to a preset destination.
At present, each major company pursues its own technical direction in automatic driving. The prior art includes a combined system of a binocular direct-method vision system and an inertial navigation system, but the errors generated by the vision and inertial subsystems cannot be effectively bounded: in the prolonged absence of image gradients, the error of the combined system grows without limit and its perception fails.
The prior art also includes an automatic driving perception system that tightly couples a monocular feature-point vision system, an inertial navigation system, and satellite navigation, but a monocular camera cannot detect featureless obstacles such as highway guardrails, bicycles, or animals. Another existing system couples a binocular stereo vision system, but it still uses the feature-point method, which is computationally expensive and demands high hardware performance.
Therefore, an automatic driving perception system is proposed that is based on the binocular direct method and tightly couples a stereoscopic vision image processing module, an inertial navigation module, and a satellite navigation module.
SUMMARY OF THE UTILITY MODEL
A primary object of the present utility model is to provide a tightly coupled automatic driving perception system that maximizes the positioning accuracy of the perception system while improving computational efficiency and reliability.
To achieve the above object, the utility model provides a tightly coupled automatic driving perception system, the system comprising: a binocular direct-method stereoscopic vision image processing module, a satellite navigation module, an inertial navigation module, and a system tight-coupling module;
the inertial navigation module is configured to acquire inertial measurement data with an inertial sensor; the stereoscopic vision image processing module is configured to acquire image data with a binocular camera; the satellite navigation module is configured to acquire raw satellite navigation measurement data through a receiver; and the system tight-coupling module is configured to perform tight-coupling processing on the inertial sensor measurement data, the image data of the stereoscopic vision module, and the raw satellite navigation measurement data;
the stereoscopic vision image processing module, the satellite navigation module and the inertial navigation module are all connected with the system tight coupling module.
Preferably, the inertial navigation module specifically comprises:
an inertial sensor for measuring the 3-axis acceleration and 3-axis angular velocity of the autonomous vehicle in a body-fixed coordinate system;
and a vehicle data calculation unit for rotating the acceleration and angular velocity into the navigation coordinate system, solving the inertial navigation mechanization equations, and calculating the position and attitude angles of the autonomous vehicle.
Preferably, the stereoscopic vision image processing module specifically comprises:
a binocular camera for capturing images of objects around the autonomous vehicle;
and a photometric error calculation unit for calculating the photometric measurement error from the captured images.
Preferably, the satellite navigation module specifically comprises:
a receiver for receiving the signals of navigation satellites;
a satellite first data unit for decoding the ephemeris information of each satellite from its signal and calculating each satellite's position and velocity from the ephemeris;
and a satellite second data unit for calculating the pseudorange, Doppler frequency shift, and carrier phase of each satellite.
Preferably, the system tight-coupling module is specifically configured to correct the drift error of the inertial navigation module using the raw satellite navigation measurement data in combination with the image data of the stereoscopic vision module.
Preferably, the inertial sensor is fixed to the autonomous vehicle, and the inertial sensor comprises an accelerometer and a gyroscope.
Compared with the prior art, the utility model has the following beneficial effects: by tightly coupling the output data of the stereoscopic vision image processing module, the satellite navigation module, and the inertial navigation module in the system tight-coupling module, the drift error of the inertial navigation module is corrected, improving positioning accuracy without resorting to an expensive laser scanning radar and thereby reducing the cost of the autonomous vehicle.
Drawings
Fig. 1 is a schematic structural diagram of a tightly coupled automatic driving perception system according to an embodiment of the present utility model.
Detailed Description
To make the technical means, creative features, objects, and functions of the present utility model easy to understand, the utility model is further described below with reference to specific embodiments.
The utility model provides an automatic driving perception system 100 based on the tight coupling of a stereoscopic vision module, an inertial navigation module, and a satellite navigation module. Inertial navigation provides data continuously with high short-term accuracy, but its positioning error accumulates over time; satellite navigation has good long-term stability but is easily disturbed; stereo vision calculates the distance from an obstacle to the camera from the parallax between the two cameras, but a vision system cannot effectively locate itself or measure obstacle distances in environments where the image lacks gradients. Combining stereo vision, inertial navigation, and satellite navigation into an integrated navigation system allows the vehicle state and environment to be sensed with mutual assistance, with the sensors complementing one another across different environments; this improves reliability and navigation accuracy, can replace the existing laser scanning radar, and reduces cost.
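The parallax-based distance computation mentioned above can be illustrated with a minimal sketch. For a rectified stereo pair, depth follows Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity. The numeric parameters below are illustrative, not taken from the patent.

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Depth of a point from its stereo disparity (rectified image pair).

    Z = f * B / d: f is the focal length in pixels, B the baseline in
    metres between the two cameras, d the left/right disparity in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point at infinity or mismatch)")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 700 px focal length, 12 cm baseline,
# a 10 px disparity places the obstacle 8.4 m ahead.
z = depth_from_disparity(10.0, 700.0, 0.12)
```

Note the inverse relationship: small disparities (distant obstacles) are where a short-baseline binocular rig loses depth resolution fastest.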
Fig. 1 is a schematic structural diagram of a tightly coupled automatic driving perception system 100 according to an embodiment of the present utility model. As shown in Fig. 1, the system 100 specifically comprises: a binocular direct-method stereoscopic vision image processing module 10, a satellite navigation module 20, an inertial navigation module 30, and a system tight-coupling module 40. The modules of the tightly coupled automatic driving perception system 100 are described in detail below.
the stereoscopic vision image processing module 10 of this embodiment is configured to acquire image data of the stereoscopic vision module by using a binocular camera, and the stereoscopic vision image processing module 10 specifically includes: a binocular camera 11 and a luminosity error calculation unit 12. The binocular camera 11 is used to capture images of objects around the autonomous vehicle. Furthermore, a photometric error calculation unit 12 is used to calculate a photometric error from the captured image.
Further, the satellite navigation module 20 of this embodiment is configured to acquire raw satellite navigation measurement data through a receiver 21, and specifically comprises: the receiver 21, a satellite first data unit 22, and a satellite second data unit 23. The receiver 21 receives the signals of the navigation satellites and calculates a pseudorange from the signal propagation time; each pseudorange defines a sphere with the satellite position as its center and the measured distance as its radius, and the intersection of the spheres from four or more satellites gives the position of the receiver 21 antenna. The satellite first data unit 22 decodes the ephemeris information of each satellite from its signal and calculates each satellite's position and velocity from the ephemeris; the satellite second data unit 23 calculates the pseudorange, Doppler frequency shift, and carrier phase of each satellite.
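The sphere-intersection described above is solved in practice as an iterative least-squares problem over the receiver position and clock bias (the clock bias is why at least four satellites are needed). The following is a minimal Gauss-Newton sketch with synthetic, noise-free measurements; the coordinates and function names are illustrative, not from the patent.

```python
import numpy as np

def solve_position(sat_pos, pseudoranges, iters=10):
    """Receiver position and clock bias from >= 4 pseudoranges by
    Gauss-Newton least squares -- the practical form of the
    sphere-intersection positioning described in the text.

    sat_pos: (N, 3) satellite positions; pseudoranges: (N,).
    Clock bias is kept in metres (c * dt); units are illustrative.
    """
    x = np.zeros(4)                       # [px, py, pz, clock_bias]
    for _ in range(iters):
        diff = x[:3] - sat_pos            # (N, 3) satellite-to-receiver vectors
        rho = np.linalg.norm(diff, axis=1)
        predicted = rho + x[3]            # geometric range + clock bias
        H = np.hstack([diff / rho[:, None], np.ones((len(rho), 1))])
        dx, *_ = np.linalg.lstsq(H, pseudoranges - predicted, rcond=None)
        x += dx
    return x[:3], x[3]

# Synthetic check: 5 satellites, receiver at (100, 200, 50), 30 m clock bias.
true_p, true_b = np.array([100.0, 200.0, 50.0]), 30.0
sats = np.array([[20000, 0, 0], [0, 20000, 0], [0, 0, 20000],
                 [15000, 15000, 0], [0, 15000, 15000]], dtype=float)
pr = np.linalg.norm(sats - true_p, axis=1) + true_b
p_est, b_est = solve_position(sats, pr)
```

A tightly coupled system feeds these raw pseudoranges (and Doppler/carrier-phase observables) into the fusion filter directly, rather than only the solved position, which is what distinguishes tight from loose coupling.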
Further, the inertial navigation module 30 is configured to acquire inertial sensor measurement data with an inertial sensor 31, and comprises the inertial sensor 31 and a vehicle data calculation unit 32. The inertial sensor 31 measures the 3-axis acceleration and 3-axis angular velocity of the autonomous vehicle in a body-fixed coordinate system; preferably, the inertial sensor 31 is fixed to the autonomous vehicle and comprises an accelerometer for measuring acceleration and a gyroscope for measuring angular velocity. The vehicle data calculation unit 32 rotates the acceleration and angular velocity into the navigation coordinate system, solves the inertial navigation mechanization equations, and calculates the position and attitude angles of the autonomous vehicle.
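The rotate-then-integrate step performed by the vehicle data calculation unit can be sketched in a simplified 2-D (flat-ground) form. A real mechanization also handles gravity compensation, Earth rotation, and full 3-D attitude (quaternions or direction cosines); this minimal illustration keeps only the body-to-navigation rotation and the double integration.

```python
import math

def ins_step(state, accel_body, yaw_rate, dt):
    """One simplified 2-D mechanization step: rotate the body-frame
    acceleration into the navigation frame using the current heading,
    then integrate velocity and position, and integrate the gyro's
    yaw rate into the heading. Gravity/Earth-rate terms are omitted.
    """
    x, y, vx, vy, yaw = state
    ax_b, ay_b = accel_body
    c, s = math.cos(yaw), math.sin(yaw)
    ax_n = c * ax_b - s * ay_b   # body -> navigation frame
    ay_n = s * ax_b + c * ay_b
    vx += ax_n * dt
    vy += ay_n * dt
    x += vx * dt
    y += vy * dt
    yaw += yaw_rate * dt
    return (x, y, vx, vy, yaw)

# Vehicle heading along +x (yaw = 0), accelerating forward at 2 m/s^2
# for 1 s in 100 steps of 10 ms.
state = (0.0, 0.0, 0.0, 0.0, 0.0)
for _ in range(100):
    state = ins_step(state, (2.0, 0.0), 0.0, 0.01)
```

Because position comes from double integration, any constant accelerometer bias grows quadratically in position error over time — exactly the drift the tight-coupling module must correct.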
Further, the system tight-coupling module 40 of this embodiment performs tight-coupling processing on the inertial sensor measurement data, the image data of the stereoscopic vision module, and the raw satellite navigation measurement data. The stereoscopic vision image processing module 10, the satellite navigation module 20, and the inertial navigation module 30 are all connected to the system tight-coupling module 40: their output parameters are fed into the tight-coupling module, where the drift error of the vehicle's inertial navigation module is corrected to improve positioning accuracy. The output parameter of the stereoscopic vision image processing module 10 is the image data; the output parameters of the satellite navigation module 20 are the satellite positions, satellite velocities, pseudoranges, Doppler shifts, and carrier phases; and the output parameters of the inertial navigation module 30 are the position and attitude angles of the autonomous vehicle.
Specifically, the system tight-coupling module 40 of this embodiment corrects the drift error of the inertial navigation module using the raw satellite navigation measurement data in combination with the image data of the stereoscopic vision module.
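The drift-correction principle can be illustrated with a scalar Kalman measurement update. This is only a one-dimensional sketch of the idea: a real tight-coupling filter estimates the full error state and fuses pseudorange, Doppler, carrier-phase, and photometric residuals jointly; the patent does not specify a particular filter, so the Kalman form here is an assumption.

```python
def kalman_correct(x_ins, P, z_meas, R):
    """Scalar Kalman measurement update: blend a drifting inertial
    estimate x_ins (variance P) with an external measurement z_meas
    (variance R), e.g. a satellite- or vision-derived position.
    The gain weights the correction by relative uncertainty -- the
    essence of how a tight-coupling module bounds inertial drift.
    """
    K = P / (P + R)                    # Kalman gain
    x = x_ins + K * (z_meas - x_ins)   # corrected estimate
    P_new = (1.0 - K) * P              # reduced uncertainty
    return x, P_new, K

# The inertial estimate has drifted to 105.0 m (variance 4); the
# satellite measurement says 100.0 m (variance 1).
x, P_new, K = kalman_correct(105.0, 4.0, 100.0, 1.0)
```

The corrected estimate moves most of the way toward the more trusted measurement, and the posterior variance shrinks, so the inertial error no longer accumulates without bound between updates.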
Compared with the prior art, the utility model has the following beneficial effects: by tightly coupling the output data of the stereoscopic vision image processing module 10, the satellite navigation module 20, and the inertial navigation module 30 in the system tight-coupling module 40, the error of the inertial sensor measurement data is corrected, improving positioning accuracy without resorting to an expensive laser scanning radar and thereby reducing the cost of the autonomous vehicle.
The basic principles, main features, and advantages of the utility model have been shown and described above. Those skilled in the art will understand that the utility model is not limited to the above embodiments; the foregoing embodiments and description merely illustrate its principles, and various changes and improvements may be made without departing from its spirit and scope. The scope of the utility model is defined by the appended claims and their equivalents.
Claims (6)
1. A tightly coupled automatic driving perception system, the system comprising: a binocular direct-method stereoscopic vision image processing module, a satellite navigation module, an inertial navigation module, and a system tight-coupling module;
the inertial navigation module is configured to acquire inertial measurement data with an inertial sensor; the stereoscopic vision image processing module is configured to acquire image data with a binocular camera; the satellite navigation module is configured to acquire raw satellite navigation measurement data through a receiver; and the system tight-coupling module is configured to perform tight-coupling processing on the inertial sensor measurement data, the image data of the stereoscopic vision module, and the raw satellite navigation measurement data;
the stereoscopic vision image processing module, the satellite navigation module and the inertial navigation module are all connected with the system tight coupling module.
2. The tightly coupled automatic driving perception system of claim 1, wherein the inertial navigation module specifically comprises:
an inertial sensor for measuring the 3-axis acceleration and 3-axis angular velocity of the autonomous vehicle in a body-fixed coordinate system;
and a vehicle data calculation unit for rotating the acceleration and angular velocity into the navigation coordinate system, solving the inertial navigation mechanization equations, and calculating the position and attitude angles of the autonomous vehicle.
3. The tightly coupled automatic driving perception system of claim 1, wherein the stereoscopic vision image processing module comprises:
a binocular camera for capturing images of objects around the autonomous vehicle;
and a photometric error calculation unit for calculating the photometric measurement error from the captured images.
4. The tightly coupled automatic driving perception system of claim 1, wherein the satellite navigation module comprises:
a receiver for receiving the signals of navigation satellites;
a satellite first data unit for decoding the ephemeris information of each satellite from its signal and calculating each satellite's position and velocity from the ephemeris;
and a satellite second data unit for calculating the pseudorange, Doppler frequency shift, and carrier phase of each satellite.
5. The tightly coupled automatic driving perception system of claim 1, wherein the system tight-coupling module is specifically configured to correct the drift error of the inertial navigation module using the raw satellite navigation measurement data in combination with the image data of the stereoscopic vision module.
6. The tightly coupled automatic driving perception system of claim 2, wherein the inertial sensor is affixed to the autonomous vehicle, the inertial sensor comprising an accelerometer and a gyroscope.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201822154553.9U CN210038170U (en) | 2018-12-20 | 2018-12-20 | Tightly-coupled automatic driving sensing system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN210038170U true CN210038170U (en) | 2020-02-07 |
Family
ID=69342963
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201822154553.9U Expired - Fee Related CN210038170U (en) | 2018-12-20 | 2018-12-20 | Tightly-coupled automatic driving sensing system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN210038170U (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109725339A (en) * | 2018-12-20 | 2019-05-07 | 东莞市普灵思智能电子有限公司 | A kind of tightly coupled automatic Pilot cognitive method and system |
CN111707256A (en) * | 2020-05-13 | 2020-09-25 | 苏州天炯信息科技有限公司 | Comprehensive positioning navigation equipment for rapidly arranging special vehicle by aid of navigation lamp |
CN112229362A (en) * | 2020-10-19 | 2021-01-15 | 南京朗禾智能控制研究院有限公司 | Vehicle-mounted device for accurately measuring area in real time |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee ||
Granted publication date: 20200207 Termination date: 20201220 |