WO2020124624A1 - Autonomous driving sensing method and system employing close coupling - Google Patents

Autonomous driving sensing method and system employing close coupling

Info

Publication number
WO2020124624A1
Authority
WO
WIPO (PCT)
Prior art keywords
module
measurement data
satellite
inertial navigation
stereo vision
Application number
PCT/CN2018/123652
Other languages
French (fr)
Chinese (zh)
Inventor
王峰
智凯旋
揭云飞
钟有东
肖飞
黄祖德
Original Assignee
东莞市普灵思智能电子有限公司
Application filed by 东莞市普灵思智能电子有限公司
Publication of WO2020124624A1 publication Critical patent/WO2020124624A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/45Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial

Definitions

  • the present invention relates to the technical field of computer applications, and in particular to a tightly coupled automatic driving perception method and system.
  • Automated driving is technology by which a car senses the road environment through its onboard sensor system, controls the vehicle's steering and speed according to the road, vehicle-position, and obstacle information obtained by that perception, and then automatically plans the driving route and drives the vehicle to a predetermined destination.
  • the existing technology includes a combined system of a binocular direct-method vision system and an inertial navigation module, but the errors generated by the vision system and the inertial navigation module in the combined system cannot be effectively bounded; in the prolonged absence of image gradient, the error of the combined system grows without limit, causing its perception to fail.
  • the prior art also includes a tightly coupled automatic driving perception system combining a monocular feature-point vision system, an inertial navigation module, and satellite navigation, but a monocular camera cannot detect featureless obstacles such as highway guardrails, bicycles, or animals.
  • some existing vision systems instead couple a binocular stereo vision system, but they still use the feature-point method, which is computationally expensive and demands high hardware performance.
  • to this end, a tightly coupled automatic driving perception method and system is proposed, based on a binocular direct-method stereo vision image processing module, an inertial navigation module, and a satellite navigation module.
  • the main purpose of the present invention is to provide a tightly coupled automatic driving perception method and system, which maximizes the positioning accuracy of the automatic driving perception system and improves computational efficiency and reliability.
  • the present invention provides a tightly coupled automatic driving perception method, which includes:
  • Step S1 Obtain the image data of the stereo vision module, the measurement data of the inertial navigation module and the original measurement data of the satellite navigation;
  • Step S2 Tightly couple the image data of the stereo vision module, the original measurement data of the satellite navigation and the measurement data of the inertial navigation module, and correct the drift error of the inertial navigation module.
  • the stereo vision module is processed by a direct method of a binocular camera, and the image data includes static measurement data and dynamic measurement data of the binocular camera.
  • the tight coupling forms a cost function from the weighted reprojection error of stereo vision, the satellite navigation error, and the time error from inertial navigation.
  • the step S1 specifically includes:
  • Step S21 Use a binocular camera to obtain the image data of the stereo vision module
  • Step S22 Obtain the measurement data of the inertial navigation module by using an inertial sensor
  • Step S23 Obtain the satellite navigation original measurement data through the receiver
  • Step S24 Perform tight coupling processing on the inertial navigation module measurement data, the image data, and the satellite navigation original measurement data.
  • the step S22 specifically includes:
  • Step S221 Using an inertial sensor to measure the 3-axis acceleration and 3-axis angular velocity of the autonomous vehicle in a fixed coordinate system;
  • Step S222 Rotate the acceleration and the angular velocity into the navigation coordinate system, solve the inertial navigation mechanization equations, and calculate the position and attitude angle of the autonomous vehicle.
  • the step S23 specifically includes:
  • Step S231 Receive the signal of the navigation satellite with the receiver
  • Step S232 Analyze the ephemeris information of each satellite, and calculate the satellite position and satellite velocity of each satellite according to the ephemeris information;
  • Step S233 Calculate the pseudorange, Doppler frequency shift, and carrier phase of each satellite.
  • the step S24 specifically includes: correcting the drift error of the inertial navigation module through the satellite navigation raw measurement data combined with the image data of the stereo vision module.
  • a tightly coupled automatic driving perception system includes: a binocular direct-method stereo vision image processing module, a satellite navigation module, an inertial navigation module, and a system tight coupling module.
  • the inertial navigation module is used to acquire the measurement data of the inertial navigation module with an inertial sensor; the stereo vision image processing module uses a binocular camera to obtain the image data of the stereo vision module; the satellite navigation module is used to obtain the raw satellite navigation measurement data through a receiver; and the system tight coupling module is used to tightly couple the inertial navigation module measurement data, the image data of the stereo vision module, and the raw satellite navigation measurement data;
  • the stereo vision image processing module, the satellite navigation module and the inertial navigation module are all connected to the system tight coupling module.
  • the stereo vision image processing module includes a binocular camera.
  • the inertial navigation module includes an inertial sensor, and the inertial sensor is fixed on the autonomous vehicle.
  • the present invention tightly couples the measurement data of the inertial navigation module, the image data of the stereo vision module, and the raw satellite navigation measurement data, and corrects the error of the inertial navigation module's measurement data; this correction improves positioning accuracy and removes the need for expensive laser scanning radar, thereby reducing the cost of self-driving cars.
  • FIG. 1 is a flowchart of a tightly coupled automatic driving perception method according to an embodiment of the present invention.
  • FIG. 2 is a flowchart of a system tight coupling module according to an embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of a tightly coupled automatic driving perception system according to an embodiment of the present invention.
  • the invention provides an automatic driving perception method based on tight coupling of a stereo vision module, an inertial navigation module and a satellite navigation module.
  • Inertial navigation provides information continuously with high short-term accuracy, but its positioning error accumulates over time; satellite navigation has good long-term stability, but it is susceptible to interference and has a low data update rate; stereo vision computes the distance from obstacles to the cameras from the parallax between cameras, but the vision system cannot effectively localize or measure obstacle distances in environments where the image lacks gradient.
  • Stereo vision, inertial navigation, and satellite navigation together constitute an integrated navigation system whose components assist one another in sensing the state of the vehicle and of the environment, complement one another in different environments, improve reliability and navigation accuracy, and extract distances precisely, so that the system can replace existing laser scanning radar and reduce cost.
  • FIG. 1 is a flowchart of a tightly coupled automatic driving perception method according to an embodiment of the present invention. As shown in FIG. 1, the tightly coupled automatic driving perception method specifically includes:
  • the image data of the stereo vision module is obtained by the stereo vision image processing module of the binocular direct method.
  • the following describes the detailed process of acquiring the image data of the stereo vision module by using a binocular camera:
  • the direct method is based on the assumption of constant grayscale: the pixel grayscale of the same spatial point is fixed across images. The direct method does not need to extract features and works well even in environments lacking corners and edges or with weak lighting changes; moreover, it processes less data and therefore achieves high computational efficiency.
  • Π_K and Π_K⁻¹ are the projection and back-projection functions of the camera image frame; d_p is the inverse depth of point p; T_ji is the transformation between image frames, in which R_ji is a 3×3 rotation matrix and t is a translation vector; ω_p is a weight that reduces the influence of pixels with large image gradients, so that the final energy function reflects the photometric error of most pixels rather than the error of a few high-gradient pixels; I_i[p] and I_j[p'] are the gray values of P_i at the corresponding image points.
  • the camera motion is obtained by minimizing the photometric energy; when the photometric energy reaches its minimum, the image data of the stereo vision module is solved. Formula (1-3) is the photometric error energy function of the monocular camera.
  • when the camera is a binocular camera, the stereo vision module is processed with the binocular direct method, and the image data of the stereo vision module includes static measurement data taken while the binocular camera is stationary and dynamic measurement data taken while it is in motion.
  • the binocular direct method balances the relative weight of each image frame of the camera and static stereo vision in motion by introducing a coupling factor ⁇ .
  • in the error energy function, obs_t(p) denotes the observation points in all image frames and E_p^s is the error energy function of static stereo vision; from it, the objective function of the new binocular camera can be obtained.
  • the R_ji and t appearing in the transformation between the two image frames constitute the pose of the camera, which is obtained by solving the objective function with the gradient descent method or the Gauss-Newton method.
  • the camera pose is used to correct the error of the measurement data of the inertial navigation module of the self-driving vehicle in the inertial navigation.
  • the camera pose obtained by the binocular direct method does not need to extract features, and the ability to respond to road and environmental changes is significantly improved.
  • the measurement data of the inertial navigation module is obtained through the inertial navigation module.
  • the measurement data of the inertial navigation module includes the speed and displacement angular rate of the autonomous driving vehicle. The following describes the detailed process of acquiring the measurement data of the inertial navigation module:
  • the inertial sensors are used to measure the 3-axis acceleration and 3-axis angular velocity of the autonomous vehicle in a fixed coordinate system.
  • the inertial sensors are installed on the vehicle chassis and consist of an accelerometer and a gyroscope, where the accelerometer measures the vehicle's acceleration and the gyroscope measures its angular velocity.
  • the sensors suffer measurement errors such as zero bias; these errors make the positioning error grow with the square of time and the attitude-angle error grow in proportion to time, and if they are not bounded the navigation system quickly becomes useless. Modeling and compensating the errors reduces the deterministic errors and the random drift errors.
  • the acceleration and angular velocity are rotated into the navigation coordinate system, the inertial navigation mechanization equations are solved, and the position and attitude angle of the autonomous vehicle are calculated.
  • the navigation frame used for pose updating and error compensation is the n-frame, usually the East-North-Up (ENU) frame.
  • v^n is the projection of the carrier velocity in the n-frame; λ, L, and h denote the longitude, latitude, and height of the carrier; a denotes the semi-major axis of the WGS-84 reference ellipsoid, and e its eccentricity. Differential equations in λ, L, and h are obtained, and solving them yields λ, L, and h, from which the position of the self-driving vehicle is computed; R_N is the radius of curvature in the prime vertical, and R_M is the radius of curvature of the meridian.
  • in the velocity update, v^n is the projection of the carrier velocity in the n-frame; ω_en^n is the displacement (transport) angular rate computed from the relative velocity of the carrier; ω_ie^n is the projection of the earth's rotation rate in the n-frame; g^n is the projection of the local gravitational acceleration in the n-frame; f^b is the measured output of the accelerometer; and the symbol "×" denotes the vector cross product.
  • the elements of the noise vector are mutually uncorrelated zero-mean Gaussian white-noise processes; the specific force is measured by the accelerometer, and g_W is the earth's gravitational acceleration vector. In contrast to the gyroscope bias, which is modeled as a random walk, the accelerometer bias is modeled as a bounded random walk with time constant τ > 0; the matrix Ω is composed from the angular rate estimated from the gyroscope measurement, and in the prediction error the subscript 1:3 denotes the 1, 2, and 3 components of the quaternion.
  • the satellite navigation raw measurement data is obtained through the satellite navigation module.
  • the raw satellite navigation measurement data includes the satellite position, satellite velocity, pseudorange, Doppler frequency shift, and carrier phase; the acquisition process is described below:
  • the Global Navigation Satellite System has high navigation accuracy.
  • the GNSS receiver on the vehicle receives the signals of the navigation satellites in orbit; the ephemeris information of each satellite is decoded from the received satellite signals, and the satellite position and satellite velocity of each satellite are calculated from the ephemeris information.
  • the pseudorange, Doppler frequency shift, and carrier phase of each satellite can also be calculated from the ephemeris information.
  • the satellite navigation receiver uses the single-point positioning method to determine the pseudorange, i.e., the relative distance between a satellite and the user.
  • the pseudorange ρ^(n)(t) is calculated as the time difference between the reception time t_u(t) of satellite signal n and its transmission time t_s^(n)(t−τ), multiplied by the propagation speed c of the radio wave.
  • the symbol τ represents the actual time interval from the transmission of the GNSS signal to its reception by the user.
  • since the clocks of the GNSS satellite and of the user receiver are generally not synchronized with GNSS time, the satellite time leads GNSS time by δt^(n), and the receiver time leads GNSS time by δt_u(t).
  • the ionospheric delay I^(n)(t) and the tropospheric delay T^(n)(t) must be subtracted from τ to obtain the propagation time over the geometric distance r^(n) from the satellite position to the receiver position.
  • the satellite clock offset, the ionospheric delay d_iono, and the tropospheric delay d_trop are known quantities; ε^(n) denotes the unknown pseudorange measurement noise; r^(n) is the geometric distance in physical space from the receiver (x, y, z) to the n-th satellite (x^(n), y^(n), z^(n)); the coordinates (x^(n), y^(n), z^(n)) of each satellite can be computed from the ephemeris it broadcasts.
  • in the carrier phase observation equation, λ is the carrier wavelength; N is the integer ambiguity; t_s is the time at which the satellite transmits the signal; t_r is the time at which the receiver receives the signal; dt_s and dt_r are the clock offsets of the satellite and the receiver; c is the speed of light; d_iono is the ionospheric delay and d_trop the tropospheric delay; d_SA is the SA effect; ε is the carrier measurement noise; and ρ(t_s, t_r) is the geometric distance between the satellite at time t_s and the receiver antenna at time t_r, which involves the station coordinates, the satellite orbit, and earth rotation parameters.
  • in the Doppler measurement equation, v^(i) is the velocity of satellite i; a_u^(i) is the unit vector from user u toward satellite i; c is the speed of light in vacuum; δf_u is the clock drift of user u; and δf^(i) is the clock drift of satellite i.
  • FIG. 2 is a specific flowchart of the system tight coupling module.
  • the inertial sensor obtains the 3-axis acceleration and 3-axis angular velocity of the vehicle in a fixed coordinate system and feeds them to the inertial navigation module; the binocular camera collects images as input to the direct method; and the GPS receiver obtains the raw satellite measurement data.
  • the tight coupling is expressed as a cost function composed of the weighted reprojection error of stereo vision, the satellite navigation error, and the time error from inertial navigation, as follows:
  • e r is the weighted reprojection error of stereo vision
  • e g is the satellite navigation error
  • e s is the time error term of inertial navigation
  • i represents the camera label
  • k represents the camera frame label
  • j represents the landmark label.
  • the landmark labels visible in the k-th frame and the i-th camera are written as the set J(i, k).
  • W denotes the information matrix of the measurement with the corresponding subscript; W_s^k denotes the inertial navigation error information matrix for the k-th frame; and W_g^{s,t} denotes the error information matrix of the s-th satellite at time t.
  • e_r is the weighted reprojection error of stereo vision, which takes the form:
  • h_i represents the projection model of camera i; z_{i,j,k} represents the image coordinates of the feature; T_{SC_i} represents the extrinsic parameters between the inertial navigation unit and the camera; and l_j represents the feature coordinates.
  • e g is the satellite navigation error.
  • the error of each navigation satellite s at each time t comprises three error terms, of the form:
  • e p represents a pseudorange error
  • e d denotes the Doppler error
  • e c represents the carrier phase error
  • the value of the cost function J(x) is minimized, completing the tight coupling among the stereo vision image processing module, the inertial navigation module, and the satellite navigation module; the raw satellite navigation measurement data combined with the image data of the stereo vision module corrects the drift error of the inertial navigation module.
  • even when the receiver cannot obtain a position fix on its own, the raw satellite navigation measurement data combined with the image data of the stereo vision module corrects the systematic error of the inertial navigation module, improving navigation accuracy and greatly enhancing the robustness of the navigation system; this helps provide high-precision measurements and maps of the environment, eliminates the need for expensive laser scanning radar, and reduces the cost of autonomous vehicles.
  • FIG. 3 is a schematic structural diagram of the tightly coupled automatic driving perception system 100 according to an embodiment of the present invention. As shown in FIG. 3, the tightly coupled automatic driving perception system 100 includes a binocular direct-method stereo vision image processing module 10, a satellite navigation module 20, an inertial navigation module 30, and a system tight coupling module 40.
  • the inertial navigation module 30 is used to acquire inertial navigation module measurement data by using an inertial sensor; the details are as described above.
  • the stereo vision image processing module 10 adopts a binocular direct method to obtain the image data of the stereo vision module; as described above.
  • the satellite navigation module 20 is used to obtain the satellite navigation raw measurement data of the navigation satellite through the receiver; as described above.
  • the system tight coupling module 40 is used to perform tight coupling processing on the measurement data of the inertial navigation module, the image data of the stereo vision module, and the original measurement data of the satellite navigation; as described above.
  • the connection relationship between each module is as follows: the stereo vision image processing module 10, the satellite navigation module 20, and the inertial navigation module 30 are all connected to the system tight coupling module 40.
  • the stereo vision image processing module 10 includes a binocular camera, and the binocular camera is installed on the self-driving vehicle; as described above.
  • the inertial navigation module 30 includes an inertial sensor, which is fixed to the autonomous vehicle; specifically as described above.
  • the present invention tightly couples the measurement data of the inertial navigation module, the image data of the stereo vision module, and the raw satellite navigation measurement data, and corrects the error of the inertial navigation module's measurement data; this correction improves positioning accuracy and removes the need for expensive laser scanning radar, thereby reducing the cost of self-driving cars.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

An autonomous driving sensing method employing close coupling comprises: acquiring measurement data of an inertial navigation module, image data of a stereo vision module, and raw satellite navigation measurement data of an autonomous driving vehicle (S1); and closely coupling the image data of the stereo vision module, the raw satellite navigation measurement data, and the measurement data of the inertial navigation module, and correcting a drift error of the inertial navigation module (S2). An autonomous driving sensing system employing the autonomous driving sensing method employing close coupling is also provided. The method closely couples the measurement data of the inertial navigation module, the image data of the stereo vision module, and the raw satellite navigation measurement data, and limits an increase in the drift error of the inertial navigation module, thereby improving positioning accuracy, eliminating the need for a high-cost laser scanning radar, and reducing costs of the autonomous driving vehicle.

Description

Tightly coupled automatic driving perception method and system
Technical field
The present invention relates to the technical field of computer applications, and in particular to a tightly coupled automatic driving perception method and system.
Background art
In recent years, with growing awareness of automobile safety and the development of information technology, the field of autonomous driving has attracted more and more attention, and many companies and research institutions around the world have begun to invest in the development of autonomous-driving products; autonomous vehicles are expected to enter the market in 2021 and to bring enormous change to the automotive industry. Research shows that the development of autonomous driving technology will be disruptive in many fields: it can improve road traffic safety, ease traffic congestion, and reduce environmental pollution. Autonomous driving technology is also an important indicator of a country's research strength and industrial level, with broad application prospects in national defense and the national economy.
Automated driving is technology by which a car senses the road environment through its onboard sensor system, controls the vehicle's steering and speed according to the road, vehicle-position, and obstacle information obtained by that perception, and then automatically plans the driving route and drives the vehicle to a predetermined destination.
Today each major company has its own technical direction in autonomous driving. The existing technology includes a combined system of a binocular direct-method vision system and an inertial navigation module, but the errors generated by the vision system and the inertial navigation module cannot be effectively bounded; in the prolonged absence of image gradient, the error of the combined system grows without limit, causing its perception to fail.
The prior art also includes a tightly coupled automatic driving perception system combining a monocular feature-point vision system, an inertial navigation module, and satellite navigation, but a monocular camera cannot detect featureless obstacles such as highway guardrails, bicycles, or animals. Some existing vision systems instead couple a binocular stereo vision system, but they still use the feature-point method, which is computationally expensive and demands high hardware performance.
To this end, we propose a tightly coupled automatic driving perception method and system based on a binocular direct-method stereo vision image processing module, an inertial navigation module, and a satellite navigation module.
Summary of the invention
The main purpose of the present invention is to provide a tightly coupled automatic driving perception method and system that maximizes the positioning accuracy of the automatic driving perception system and improves its computational efficiency and reliability.
To achieve the above object, the present invention provides a tightly coupled automatic driving perception method, which includes:
Step S1: obtaining the image data of the stereo vision module, the measurement data of the inertial navigation module, and the raw satellite navigation measurement data;
Step S2: tightly coupling the image data of the stereo vision module, the raw satellite navigation measurement data, and the measurement data of the inertial navigation module, and correcting the drift error of the inertial navigation module.
Preferably, the stereo vision module is processed with the binocular-camera direct method, and the image data includes static measurement data and dynamic measurement data of the binocular camera.
Preferably, the tight coupling forms a cost function from the weighted reprojection error of stereo vision, the satellite navigation error, and the time error from inertial navigation.
Preferably, the step S1 specifically includes:
Step S21: using a binocular camera to obtain the image data of the stereo vision module;
Step S22: using an inertial sensor to obtain the measurement data of the inertial navigation module;
Step S23: obtaining the raw satellite navigation measurement data through a receiver;
Step S24: tightly coupling the inertial navigation module measurement data, the image data, and the raw satellite navigation measurement data.
Preferably, the step S22 specifically includes:
Step S221: using an inertial sensor to measure the 3-axis acceleration and 3-axis angular velocity of the autonomous vehicle in a fixed coordinate system;
Step S222: rotating the acceleration and the angular velocity into the navigation coordinate system, solving the inertial navigation mechanization equations, and calculating the position and attitude angle of the autonomous vehicle.
Preferably, the step S23 specifically includes:
Step S231: receiving the signals of the navigation satellites with a receiver;
Step S232: decoding the ephemeris information of each satellite, and calculating the satellite position and satellite velocity of each satellite from the ephemeris information;
Step S233: calculating the pseudorange, Doppler frequency shift, and carrier phase of each satellite.
Preferably, the step S24 specifically includes: correcting the drift error of the inertial navigation module with the raw satellite navigation measurement data combined with the image data of the stereo vision module.
A tightly coupled automatic driving perception system includes a binocular direct-method stereo vision image processing module, a satellite navigation module, an inertial navigation module, and a system tight coupling module. The inertial navigation module is used to acquire the measurement data of the inertial navigation module with an inertial sensor; the stereo vision image processing module uses a binocular camera to obtain the image data of the stereo vision module; the satellite navigation module is used to obtain the raw satellite navigation measurement data through a receiver; and the system tight coupling module is used to tightly couple the inertial navigation module measurement data, the image data of the stereo vision module, and the raw satellite navigation measurement data.
The stereo vision image processing module, the satellite navigation module, and the inertial navigation module are all connected to the system tight coupling module.
Preferably, the stereo vision image processing module includes a binocular camera.
Preferably, the inertial navigation module includes an inertial sensor, and the inertial sensor is fixed on the autonomous vehicle.
Compared with the prior art, the present invention has the following beneficial effects: the present invention tightly couples the measurement data of the inertial navigation module, the image data of the stereo vision module, and the raw satellite navigation measurement data, and corrects the error of the inertial navigation module's measurement data, thereby improving positioning accuracy without resorting to expensive laser scanning radar and reducing the cost of self-driving cars.
Brief description of the drawings
FIG. 1 is a flowchart of a tightly coupled automatic driving perception method according to an embodiment of the present invention.
FIG. 2 is a flowchart of the system tight coupling module according to an embodiment of the present invention.
FIG. 3 is a schematic structural diagram of a tightly coupled automatic driving perception system according to an embodiment of the present invention.
Detailed description
To make the technical means, creative features, objectives, and effects of the present invention easy to understand, the present invention is further described below in conjunction with specific embodiments.
The present invention provides an automatic driving perception method based on the tight coupling of a stereo vision module, an inertial navigation module, and a satellite navigation module. Inertial navigation provides information continuously with high short-term accuracy, but its positioning error accumulates over time; satellite navigation has good long-term stability, but it is susceptible to interference and has a low data update rate; stereo vision computes the distance from obstacles to the cameras from the parallax between cameras, but the vision system cannot effectively localize or measure obstacle distances in environments where the image lacks gradient. Combining stereo vision, inertial navigation, and satellite navigation into an integrated navigation system lets the three assist one another in sensing the state of the vehicle and of the environment, complement one another in different environments, improve reliability and navigation accuracy, and extract distances precisely, so that the system can replace existing laser scanning radar and reduce cost.
FIG. 1 is a flowchart of a tightly coupled automatic driving perception method according to an embodiment of the present invention. As shown in FIG. 1, the method specifically includes:
S1. Obtain the inertial navigation module measurement data, the image data of the stereo vision module, and the raw satellite navigation measurement data of the autonomous vehicle. S2. In the tight coupling process, tightly couple the image data of the stereo vision module and the raw satellite navigation measurement data with the inertial navigation module measurement data, and correct the error of the inertial navigation module measurement data, thereby sensing the state of the vehicle and of the environment. The specific process of the automatic driving perception method is described in detail below:
(1) The image data of the stereo vision module is obtained by the binocular direct-method stereo vision image processing module. The detailed process of acquiring the image data of the stereo vision module with a binocular camera is as follows:
The direct method is based on the assumption of constant grayscale: the pixel grayscale of the same spatial point is fixed across images. The direct method does not need to extract features and works well even in environments lacking corners and edges or with weak lighting changes; moreover, it processes less data and therefore achieves high computational efficiency.
The specific process of using the binocular direct method to process the images captured by the stereo camera is as follows:
1. Use the binocular camera to acquire grayscale images of the measured object, two or more, from different positions; denote the images obtained by the binocular camera at times t and t+1 as I_i and I_j.
2. Calibrate the camera with MATLAB or OpenCV to obtain the camera intrinsics.
3. Undistort the acquired images.
4. Input the above image data into the system tight coupling module and compute the photometric energy from the captured images. Specifically, a spatial point P_i of image I_i also appears in another image I_j; select a spatial point P_i appearing in both I_i and I_j. By the assumption that the gray value measured at the same 3-D spatial point is invariant across viewpoints, the projection p' of the point p of image I_i onto image I_j is computed as:
$$ p' = \Pi_K\big( T_{ji}\, \Pi_K^{-1}(p,\, d_p) \big) \tag{1-1} $$
where Π_K and Π_K⁻¹ are the projection and back-projection functions of the camera image frame, d_p is the inverse depth of point p, and T_ji is the transformation between image frames:
$$ T_{ji} = \begin{bmatrix} R_{ji} & t \\ 0 & 1 \end{bmatrix} \tag{1-2} $$
where R_ji is a 3×3 rotation matrix and t is a translation vector.
Its photometric error energy function is:
$$ E_{p,j} = \omega_p \,\big\| I_j[p'] - I_i[p] \big\|_{\gamma} \tag{1-3} $$
where ‖·‖_γ is the Huber norm, which prevents the error energy function from growing too fast with the photometric error; ω_p is a weight that reduces the influence of pixels with large image gradients, so that the final energy function reflects the photometric error of most pixels rather than the error of a few high-gradient pixels; and I_i[p] and I_j[p'] are the gray values of P_i at the corresponding image points.
The camera motion is obtained by optimization that minimizes the photometric energy; when the photometric energy reaches its minimum, the image data of the stereo vision module is solved. The objective function is:
$$ \min_{T_{ji}} \; \sum_{p} E_{p,j} \tag{1-4} $$
Formula (1-3) above is the photometric error energy function of a monocular camera. When the camera is a binocular camera, the stereo vision module is processed with the binocular direct method, and the image data of the stereo vision module includes static measurement data taken while the binocular camera is stationary and dynamic measurement data taken while it is in motion. The binocular direct method balances the relative weights of each temporal image frame of the moving camera and of static stereo vision by introducing a coupling factor λ; its error energy function is:
$$ E = \sum_{p} \Big( \lambda \, E_p^{s} + \sum_{j \in \mathrm{obs}_t(p)} E_{p,j} \Big) \tag{1-5} $$
where obs_t(p) denotes the observation points in all temporal image frames and E_p^s is the error energy function of static stereo vision. From formulas (1-4) and (1-5), the objective function of the new binocular camera is obtained. The R_ji and t appearing in the transformation between the two image frames constitute the camera pose, which is obtained by solving the objective function with the gradient descent method or the Gauss-Newton method. The camera pose is then used to correct the error of the inertial navigation module measurement data of the self-driving vehicle; since the camera pose obtained by the binocular direct method requires no feature extraction, the ability to react to road and environmental changes is significantly improved.
(2) The measurement data of the inertial navigation module is obtained through the inertial navigation module and includes the velocity and displacement angular rate of the autonomous vehicle. The detailed process of acquiring the inertial navigation module measurement data is described below:
First, the inertial sensors measure the 3-axis acceleration and 3-axis angular velocity of the autonomous vehicle in a fixed coordinate system. Preferably, the inertial sensors are installed on the vehicle chassis and consist of an accelerometer and a gyroscope, where the accelerometer measures the vehicle's acceleration and the gyroscope measures its angular velocity. However, the sensors suffer measurement errors such as zero bias; these errors make the positioning error grow with the square of time and the attitude-angle error grow in proportion to time, and if they are not bounded the navigation system quickly becomes useless. Modeling and compensating the errors reduces the deterministic errors and the random drift errors.
Further, the acceleration and angular velocity are rotated into the navigation coordinate system, the inertial navigation mechanization equations are solved, and the position and attitude angle of the autonomous vehicle are calculated. Specifically, the navigation frame used for pose updating and error compensation is the n-frame, usually the East-North-Up (ENU) frame.
In the ENU coordinate system, the position update equations of the vehicle are:
$$ \dot{L} = \frac{v_N}{R_M + h} \tag{2-1} $$
$$ \dot{\lambda} = \frac{v_E}{(R_N + h)\cos L} \tag{2-2} $$
$$ \dot{h} = v_U \tag{2-3} $$
In the above formulas, v^n = [v_E, v_N, v_U]^T is the projection of the carrier velocity in the n-frame; λ, L, and h denote the longitude, latitude, and height of the carrier; a denotes the semi-major axis of the WGS-84 reference ellipsoid, and e its eccentricity. Differential equations in λ, L, and h are obtained, and solving them yields λ, L, and h, from which the position of the self-driving vehicle is computed; R_N is the radius of curvature in the prime vertical, and R_M is the radius of curvature of the meridian.
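As an illustration, the sketch below performs one Euler integration step of the position equations (2-1) to (2-3). The closed forms used for R_M and R_N are the standard WGS-84 meridian and prime-vertical radii of curvature, which the text names but does not spell out, so they are stated here as assumptions.

```python
import numpy as np

A_WGS84 = 6378137.0            # semi-major axis a of the WGS-84 ellipsoid (m)
E2_WGS84 = 6.69437999014e-3    # first eccentricity squared e^2 (standard value)

def position_update(lam, L, h, v_n, dt):
    """One Euler step of (2-1)-(2-3); lam/L in radians, h in metres,
    v_n = [v_E, v_N, v_U] in the ENU frame."""
    s2 = np.sin(L) ** 2
    R_N = A_WGS84 / np.sqrt(1.0 - E2_WGS84 * s2)                     # prime vertical
    R_M = A_WGS84 * (1.0 - E2_WGS84) / (1.0 - E2_WGS84 * s2) ** 1.5  # meridian
    L_new = L + dt * v_n[1] / (R_M + h)                     # latitude rate (2-1)
    lam_new = lam + dt * v_n[0] / ((R_N + h) * np.cos(L))   # longitude rate (2-2)
    h_new = h + dt * v_n[2]                                 # height rate (2-3)
    return lam_new, L_new, h_new, R_M, R_N
```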
In the ENU coordinate system, the velocity update equations of the vehicle are:
$$ \dot{v}^n = C_b^n f^b - \big(2\,\omega_{ie}^n + \omega_{en}^n\big) \times v^n + g^n \tag{2-4} $$
$$ \omega_{en}^n = \Big[\, -\frac{v_N}{R_M + h},\;\; \frac{v_E}{R_N + h},\;\; \frac{v_E \tan L}{R_N + h} \,\Big]^T \tag{2-5} $$
$$ \omega_{ie}^n = \big[\, 0,\;\; \omega_{ie}\cos L,\;\; \omega_{ie}\sin L \,\big]^T \tag{2-6} $$
In the above formulas, C_b^n is the transpose of the quaternion direction-cosine matrix C_n^b; v^n is the projection of the carrier velocity in the n-frame; ω_en^n is the displacement (transport) angular rate computed from the relative velocity of the carrier; ω_ie^n is the projection of the earth's rotation rate in the n-frame; g^n is the projection of the local gravitational acceleration in the n-frame; and f^b is the measured output of the accelerometer. The symbol "×" denotes the vector cross product. Differential equations in v^n and ω_en^n are finally obtained, and solving them yields v^n and ω_en^n, from which the velocity and displacement angular rate of the autonomous vehicle are computed.
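Likewise, a hedged sketch of one Euler step of the velocity equation (2-4), reusing the R_M and R_N returned by the position sketch above; the constant-gravity model g^n = [0, 0, -g] is a simplification assumed for illustration.

```python
import numpy as np

OMEGA_IE = 7.292115e-5   # earth rotation rate (rad/s)

def velocity_update(v_n, C_bn, f_b, L, h, R_M, R_N, dt, g=9.80665):
    """One Euler step of (2-4) with the transport and earth rates of (2-5)-(2-6).
    C_bn: body-to-ENU rotation matrix; f_b: accelerometer output (specific force)."""
    w_ie_n = OMEGA_IE * np.array([0.0, np.cos(L), np.sin(L)])   # earth rate (2-6)
    w_en_n = np.array([-v_n[1] / (R_M + h),                     # transport rate (2-5)
                       v_n[0] / (R_N + h),
                       v_n[0] * np.tan(L) / (R_N + h)])
    g_n = np.array([0.0, 0.0, -g])            # simplified constant local gravity
    v_dot = C_bn @ f_b - np.cross(2.0 * w_ie_n + w_en_n, v_n) + g_n
    return v_n + dt * v_dot
```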
Furthermore, combining the inertial navigation kinematics with a simple dynamic bias model yields the following system of equations:
$$ \dot{r}_W = v_W $$
$$ \dot{q}_{WS} = \tfrac{1}{2}\, \Omega\big(\tilde{\omega} - b_g - \eta_g\big)\, q_{WS} $$
$$ \dot{v}_W = C_{WS}\big(\tilde{a} - b_a - \eta_a\big) + g_W $$
$$ \dot{b}_g = \eta_{b_g} $$
$$ \dot{b}_a = -\tfrac{1}{\tau}\, b_a + \eta_{b_a} $$
where the elements of the noise vector η = [η_g^T, η_a^T, η_{b_g}^T, η_{b_a}^T]^T are mutually uncorrelated zero-mean Gaussian white-noise processes; ã is the accelerometer measurement, and g_W is the earth's gravitational acceleration vector. In contrast to the gyroscope bias, which is modeled as a random walk, the accelerometer bias is modeled as a bounded random walk with time constant τ > 0. The matrix Ω is composed from the estimated angular rate ω = ω̃ − b_g − η_g, where ω̃ is the gyroscope measurement:
$$ \Omega(\omega) = \begin{bmatrix} -\lfloor \omega \rfloor_{\times} & \omega \\ -\omega^T & 0 \end{bmatrix} $$
Finally, we obtain the prediction error e_s of the inertial navigation part of the integrated navigation, where e_s is the time-error term of inertial navigation:
$$ e_s = \begin{bmatrix} \hat{r}_{WS} - r_{WS} \\ 2\big(\hat{q}_{WS} \otimes q_{WS}^{-1}\big)_{1:3} \\ \hat{v}_W - v_W \\ \hat{b}_g - b_g \\ \hat{b}_a - b_a \end{bmatrix} $$
where the subscript 1:3 denotes the 1, 2, and 3 (vector) components of the quaternion.
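A minimal propagation sketch of the kinematic and bias equations above, with the noise terms set to their zero mean; the vector-first quaternion convention and the simple Euler integrator are assumptions made for illustration rather than the patent's scheme.

```python
import numpy as np

def quat_to_rot(q):
    """Rotation matrix C_WS from a unit quaternion q = [x, y, z, w] (vector-first)."""
    x, y, z, w = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def omega_matrix(w):
    """4x4 quaternion-rate matrix Omega(w) for the vector-first convention."""
    wx, wy, wz = w
    return np.array([[0.0,  wz, -wy, wx],
                     [-wz, 0.0,  wx, wy],
                     [ wy, -wx, 0.0, wz],
                     [-wx, -wy, -wz, 0.0]])

def propagate(r, q, v, b_g, b_a, w_meas, a_meas, dt, tau=3600.0,
              g_W=np.array([0.0, 0.0, -9.81])):
    """One Euler step of the kinematic/bias ODEs above (noise at its zero mean)."""
    w = w_meas - b_g                               # bias-corrected angular rate
    C_WS = quat_to_rot(q)
    r_new = r + dt * v                             # position kinematics
    q_new = q + dt * 0.5 * (omega_matrix(w) @ q)   # quaternion kinematics
    q_new = q_new / np.linalg.norm(q_new)          # keep the quaternion unit-norm
    v_new = v + dt * (C_WS @ (a_meas - b_a) + g_W) # velocity kinematics
    b_a_new = b_a - dt * b_a / tau                 # bounded random walk (tau > 0)
    return r_new, q_new, v_new, b_g, b_a_new       # gyro bias: plain random-walk mean
```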
(3) The raw satellite navigation measurement data is obtained through the satellite navigation module and includes the satellite position, satellite velocity, pseudorange, Doppler frequency shift, and carrier phase. The detailed acquisition process is described below:
The Global Navigation Satellite System (GNSS) offers high navigation accuracy. The GNSS receiver on the vehicle receives the signals of the navigation satellites in orbit; the ephemeris information of each satellite is decoded from the received satellite signals, and the satellite position and satellite velocity of each satellite are calculated from the ephemeris information. The pseudorange, Doppler frequency shift, and carrier phase of each satellite can likewise be calculated from the ephemeris information.
Specifically, the satellite navigation receiver uses the single-point positioning method to determine the pseudorange, i.e., the relative distance between a satellite and the user. In satellite navigation, the pseudorange ρ^(n)(t) is computed as the time difference between the reception time t_u(t) of satellite signal n and its transmission time t_s^(n)(t−τ), multiplied by the propagation speed c of the radio wave in vacuum:
$$ \rho^{(n)}(t) = c\,\big[\, t_u(t) - t_s^{(n)}(t-\tau) \,\big] \tag{3-1} $$
where the symbol τ represents the actual time interval from the transmission of the GNSS signal to its reception by the user. Since the clocks of the GNSS satellite and of the user receiver are generally not synchronized with GNSS time t, the amount by which the satellite time leads GNSS time is written δt^(n), and the amount by which the receiver time leads GNSS time is written δt_u(t), that is:
$$ t_s^{(n)}(t-\tau) = (t-\tau) + \delta t^{(n)}(t-\tau) \tag{3-2} $$
$$ t_u(t) = t + \delta t_u(t) \tag{3-3} $$
Moreover, structures such as the ionosphere and the troposphere delay the propagation of electromagnetic waves to some extent, so the ionospheric delay I^(n)(t) and the tropospheric delay T^(n)(t) must be subtracted from τ to obtain the propagation time over the geometric distance r^(n) from the satellite position to the receiver position, namely:
$$ \tau = r^{(n)}/c + I^{(n)}(t) + T^{(n)}(t) \tag{3-4} $$
Substituting (3-2), (3-3), and (3-4) into (3-1) gives the final formula for the pseudorange ρ^(n)(t), namely:
$$ \rho^{(n)}(t) = r^{(n)} + c\,\big[\delta t_u(t) - \delta t^{(n)}(t-\tau)\big] + d_{iono} + d_{trop} + \varepsilon^{(n)} \tag{3-5} $$
$$ d_{trop} = c\,T^{(n)}(t), \qquad d_{iono} = c\,I^{(n)}(t) $$
where the satellite clock offset cδt^(n), the ionospheric delay d_iono, and the tropospheric delay d_trop are all known quantities; ε^(n) denotes the unknown pseudorange measurement noise; and r^(n) is the geometric distance in physical space from the receiver (x, y, z) to the n-th satellite (x^(n), y^(n), z^(n)). The coordinates (x^(n), y^(n), z^(n)) of each satellite can be computed from the ephemeris it broadcasts.
Once the pseudorange ρ^(n)(t) has been calculated, the observation equation of the carrier phase in satellite navigation is obtained as follows:
φλ=ρ(t s,t r)+c(dt r-dt s)+d trop-d iono+d rel+d SA+d multi+Nλ+ε  (3-6) φλ=ρ(t s ,t r )+c(dt r -dt s )+d trop -d iono +d rel +d SA +d multi +Nλ+ε (3-6)
where φ is the carrier phase observation, λ is the carrier wavelength, N is the integer ambiguity, t_s is the time at which the satellite transmits the signal, t_r is the time at which the receiver receives the signal, dt_s and dt_r are the clock offsets of the satellite and the receiver, c is the speed of light, d_iono is the ionospheric delay, d_trop is the tropospheric delay, d_SA is the SA effect, d_multi is the multipath effect, ε is the carrier measurement noise, and ρ(t_s, t_r) is the geometric distance between the satellite at time t_s and the receiver antenna at time t_r, which involves the station coordinates, the satellite orbit, and earth rotation parameters.
Since the Doppler frequency-shift observation is an instantaneous observation of the carrier phase rate, the time variation of the ionospheric and tropospheric delays can be ignored, and differentiating the carrier phase observation equation gives:
$$ \dot{\varphi}\lambda = \dot{\rho}(t_s, t_r) + c\,\big(\dot{dt}_r - \dot{dt}_s\big) $$
where φ̇ = f_d is the Doppler frequency shift. The Doppler frequency-shift measurement equation is then:
$$ f_d^{(i)} = -\frac{1}{\lambda}\,\big(v^{(i)} - v_u\big)\cdot a_u^{(i)} + \big(\delta f_u - \delta f^{(i)}\big) + \varepsilon_f^{(i)} $$
where λ is the wavelength of the L1 carrier (f1 = 1575.42 MHz), f_D,u^(i) is the Doppler shift of user u relative to satellite i, v^(i) is the velocity of satellite i, v_u is the velocity of user u, a_u^(i) is the unit vector from user u toward satellite i, c is the speed of light in vacuum, δf_u is the clock drift of user u, δf^(i) is the clock drift of satellite i, and ε_f,u^(i) is the Doppler measurement noise of user u relative to satellite i.
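The Doppler equation above can likewise be evaluated numerically. The sketch below follows the equation exactly as written here (sign conventions for the line-of-sight direction vary between receiver conventions), and the names are ours.

```python
import numpy as np

C = 299_792_458.0          # speed of light in vacuum (m/s)
LAMBDA_L1 = C / 1575.42e6  # L1 carrier wavelength, ~0.19 m

def predicted_doppler(sat_vel, user_vel, unit_user_to_sat,
                      user_clk_drift, sat_clk_drift):
    """Doppler shift (Hz) of user u relative to satellite i.

    Velocities are 3-vectors (m/s); clock drifts are dimensionless (s/s).
    """
    rel_vel = np.asarray(sat_vel) - np.asarray(user_vel)       # v^(i) - v_u
    range_rate = float(rel_vel @ np.asarray(unit_user_to_sat))
    return (range_rate + C * (user_clk_drift - sat_clk_drift)) / LAMBDA_L1
```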
(4) Tightly couple the data inputs of the stereo vision image processing module, the inertial navigation module and the satellite navigation module. The data comprise the inertial navigation module measurement data, the image data of the stereo vision module and the raw satellite navigation measurement data; the raw satellite navigation measurement data are combined with the image data of the stereo vision module to correct the drift error of the inertial navigation module, thereby helping the inertial navigation module to bound its drift. Fig. 2 is a detailed flowchart of the system tight-coupling module. In that figure, the inertial sensor acquires the 3-axis acceleration and 3-axis angular velocity of the vehicle in a fixed coordinate system and feeds them into the inertial navigation module; the binocular camera captures images as the input to the direct method; and the GPS receiver acquires the raw satellite measurement data. The reprojection error of the binocular-camera direct method, the satellite navigation error obtained from the raw satellite measurements and the temporal error of inertial navigation are combined in an overall optimal estimation that corrects the drift error of the inertial navigation module and finally outputs the optimal pose.
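As a structural illustration of the three data streams entering the tight coupling (the type names are ours, not the patent's), one epoch of inputs could be bundled as:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ImuSample:                 # inertial navigation module input
    t: float
    accel: np.ndarray            # 3-axis acceleration (m/s^2)
    gyro: np.ndarray             # 3-axis angular velocity (rad/s)

@dataclass
class StereoFrame:               # stereo vision module input (direct method)
    t: float
    left: np.ndarray
    right: np.ndarray

@dataclass
class GnssRaw:                   # satellite navigation module input
    t: float
    pseudorange: np.ndarray      # one entry per visible satellite
    doppler: np.ndarray
    carrier_phase: np.ndarray

def collect_epoch(imu_batch, frame, gnss):
    """Bundle time-aligned raw measurements for one optimization step."""
    return {"imu": imu_batch, "frame": frame, "gnss": gnss}
```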
In the tight coupling, a cost function is formed from the weighted reprojection error of stereo vision, the satellite navigation error and the temporal error from inertial navigation, and is expressed as:

J(x) = Σ_i Σ_k Σ_{j∈J(i,k)} e_r^{i,j,k,T} · W_r^{i,j,k} · e_r^{i,j,k} + Σ_k e_s^{k,T} · W_s^k · e_s^k + Σ_t Σ_s e_g^{s,t,T} · W_g^{s,t} · e_g^{s,t}
where e_r is the weighted reprojection error of stereo vision, e_g the satellite navigation error and e_s the temporal error term of inertial navigation; i denotes the camera index, k the camera frame index and j the landmark index. The set of landmarks visible in the k-th frame of the i-th camera is written J(i, k). Furthermore, W_r^{i,j,k} denotes the information matrix of the corresponding measurement, W_s^k the inertial navigation error information matrix associated with the k-th image frame, and W_g^{s,t} the error information matrix of the s-th satellite at time t.
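Under the definitions above, evaluating J(x) for a given state is just a sum of weighted quadratic forms; a minimal sketch (names ours) is:

```python
import numpy as np

def cost_J(reproj_terms, inertial_terms, gnss_terms):
    """J(x) as the sum of e^T W e over the three groups of residuals.

    Each argument is an iterable of (e, W) pairs corresponding to
    (e_r, W_r), (e_s, W_s) and (e_g, W_g) respectively.
    """
    J = 0.0
    for group in (reproj_terms, inertial_terms, gnss_terms):
        for e, W in group:
            J += float(e @ W @ e)   # quadratic form e^T W e
    return J
```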
Furthermore, e_r, the weighted reprojection error of stereo vision, takes the form:

e_r^{i,j,k} = z^{i,j,k} − h_i(T_{C_i S} · T_{S W} · l^j)

where h_i denotes the projection model of camera i and z^{i,j,k} the image coordinates of the feature; T_{S W} represents the pose of the system being optimized, T_{C_i S} the extrinsic transformation between the inertial unit and the camera, and l^j the feature (landmark) coordinates.
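For concreteness, a sketch of this residual under an assumed pinhole projection model (the patent does not fix a particular camera model) is:

```python
import numpy as np

def reprojection_residual(z, K, T_CS, T_SW, landmark_w):
    """e_r = z - h(T_CS · T_SW · l) for a pinhole camera h.

    z          : observed pixel coordinates, shape (2,)
    K          : 3x3 intrinsic matrix of the camera
    T_CS, T_SW : 4x4 homogeneous transforms (camera<-sensor, sensor<-world)
    landmark_w : landmark position in the world frame, shape (3,)
    """
    l_h = np.append(np.asarray(landmark_w, dtype=float), 1.0)  # homogeneous
    p_c = (T_CS @ T_SW @ l_h)[:3]       # landmark in the camera frame
    uvw = K @ p_c
    z_hat = uvw[:2] / uvw[2]            # pinhole projection h(·)
    return np.asarray(z, dtype=float) - z_hat
```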
Further, e_g is the satellite navigation error; for each navigation satellite s at each time t the error comprises three components and takes the form:

e_g^{s,t} = [ e_p^{s,t}, e_d^{s,t}, e_c^{s,t} ]^T

where e_p denotes the pseudorange error, e_d the Doppler error and e_c the carrier phase error.
Accordingly, the corresponding error information matrix W_g takes the form:

W_g^{s,t} = diag( W_p^{s,t}, W_d^{s,t}, W_c^{s,t} )

with one weight for each of the pseudorange, Doppler and carrier phase measurements.
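A sketch of assembling e_g and W_g for one satellite at one epoch follows; it assumes, as is common but not stated in the patent, that the weights are the inverse variances of the three measurement types.

```python
import numpy as np

def gnss_residual_and_information(meas, pred, sigmas):
    """Stack pseudorange/Doppler/carrier-phase residuals into e_g and
    build W_g = diag(1/sigma^2) for the three measurement types.

    meas, pred : dicts with keys 'pseudorange', 'doppler', 'carrier'
    sigmas     : [sigma_p, sigma_d, sigma_c] measurement std deviations
    """
    e_g = np.array([meas["pseudorange"] - pred["pseudorange"],
                    meas["doppler"]     - pred["doppler"],
                    meas["carrier"]     - pred["carrier"]])
    W_g = np.diag(1.0 / np.square(np.asarray(sigmas, dtype=float)))
    return e_g, W_g
```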
The value of the cost function J(x) is minimized by linear or nonlinear optimization, which completes the tight coupling among the stereo vision image processing module, the inertial navigation module and the satellite navigation module; the raw satellite navigation measurement data, combined with the image data of the stereo vision module, correct the drift error of the inertial navigation module. When the number of visible satellites falls below the required number and the receiver cannot be positioned on its own, the raw satellite navigation measurement data combined with the image data of the stereo vision module still correct the systematic error of the inertial navigation module, which improves the navigation accuracy and greatly enhances the robustness of the navigation system. This tightly coupled autonomous driving perception system can provide high-precision measurement of the environment and support map building without resorting to expensive laser scanning radar, reducing the cost of autonomous vehicles.
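As one concrete and deliberately minimal realization of the nonlinear optimization (production systems would typically use a robust, sparse incremental solver instead), a Gauss-Newton loop over the stacked residuals could look like this:

```python
import numpy as np

def gauss_newton(x0, residual_fn, jacobian_fn, iters=10):
    """Minimize J(x) = sum_i e_i^T W_i e_i by Gauss-Newton iteration.

    residual_fn(x) -> list of (e_i, W_i) pairs
    jacobian_fn(x) -> list of Jacobians J_i = d e_i / d x, same order
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        H = np.zeros((x.size, x.size))      # approximate Hessian
        b = np.zeros(x.size)                # gradient
        for (e, W), Je in zip(residual_fn(x), jacobian_fn(x)):
            H += Je.T @ W @ Je
            b += Je.T @ W @ e
        x = x - np.linalg.solve(H, b)       # Gauss-Newton update
    return x
```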
Based on the above tightly coupled autonomous driving perception method, the present invention further provides a tightly coupled autonomous driving perception system 100. Fig. 3 is a schematic structural diagram of the tightly coupled autonomous driving perception system 100 according to an embodiment of the present invention. As shown in Fig. 3, the system 100 comprises a stereo vision image processing module 10 using the binocular direct method, a satellite navigation module 20, an inertial navigation module 30 and a system tight-coupling module 40.
Specifically, the inertial navigation module 30 acquires the inertial navigation module measurement data with an inertial sensor, as described above. The stereo vision image processing module 10 acquires the image data of the stereo vision module with the binocular direct method, as described above. The satellite navigation module 20 acquires the raw satellite navigation measurement data of the navigation satellites through a receiver, as described above. The system tight-coupling module 40 performs the tight-coupling processing of the inertial navigation module measurement data, the image data of the stereo vision module and the raw satellite navigation measurement data, as described above. The modules are connected as follows: the stereo vision image processing module 10, the satellite navigation module 20 and the inertial navigation module 30 are all connected to the system tight-coupling module 40.
The stereo vision image processing module 10 comprises a binocular camera mounted on the autonomous vehicle, as described above. Furthermore, the inertial navigation module 30 comprises an inertial sensor fixed to the autonomous vehicle, as described above.
Compared with the prior art, the present invention has the following beneficial effects: the inertial navigation module measurement data, the image data of the stereo vision module and the raw satellite navigation measurement data are tightly coupled, and the error of the inertial navigation module measurement data is corrected, which improves the positioning accuracy and removes the need for expensive laser scanning radar, thereby reducing the cost of autonomous vehicles.
The foregoing shows and describes the basic principles, main features and advantages of the present invention. Those skilled in the art should understand that the present invention is not limited by the above embodiments, which, together with the description, merely illustrate its principles; various changes and improvements may be made without departing from the spirit and scope of the present invention, and all such changes and improvements fall within the scope of the claimed invention. The scope of protection claimed for the present invention is defined by the appended claims and their equivalents.

Claims (10)

  1. A tightly coupled autonomous driving perception method, characterized in that the method comprises:
    Step S1: acquiring image data of a stereo vision module, measurement data of an inertial navigation module and raw satellite navigation measurement data;
    Step S2: tightly coupling the image data of the stereo vision module, the raw satellite navigation measurement data and the measurement data of the inertial navigation module, and correcting the drift error of the inertial navigation module.
  2. The tightly coupled autonomous driving perception method according to claim 1, characterized in that the stereo vision module performs its processing with the binocular-camera direct method, and the image data comprise static measurement data and dynamic measurement data of the binocular camera.
  3. The tightly coupled autonomous driving perception method according to claim 1, characterized in that the tight coupling forms a cost function from the weighted reprojection error of stereo vision, the satellite navigation error and the temporal error from inertial navigation.
  4. The tightly coupled autonomous driving perception method according to claim 1, characterized in that step S1 specifically comprises:
    Step S21: acquiring the image data of the stereo vision module with a binocular camera;
    Step S22: acquiring the measurement data of the inertial navigation module with an inertial sensor;
    Step S23: acquiring the raw satellite navigation measurement data through a receiver;
    Step S24: tightly coupling the inertial navigation module measurement data, the image data and the raw satellite navigation measurement data.
  5. The tightly coupled autonomous driving perception method according to claim 4, characterized in that step S22 specifically comprises:
    Step S221: measuring, with the inertial sensor, the 3-axis acceleration and 3-axis angular velocity of the autonomous vehicle in a fixed coordinate system;
    Step S222: rotating the acceleration and the angular velocity into the navigation coordinate system, solving the inertial navigation mechanization equations and computing the position and attitude angles of the autonomous vehicle.
  6. The tightly coupled autonomous driving perception method according to claim 4, characterized in that step S23 specifically comprises:
    Step S231: receiving the signals of the navigation satellites with a receiver;
    Step S232: decoding the ephemeris information of each satellite and computing the satellite position and satellite velocity of each satellite from the ephemeris information;
    Step S233: computing the pseudorange, Doppler shift and carrier phase of each satellite.
  7. The tightly coupled autonomous driving perception method according to claim 4, characterized in that step S24 specifically comprises: correcting the drift error of the inertial navigation module with the raw satellite navigation measurement data combined with the image data of the stereo vision module.
  8. A tightly coupled autonomous driving perception system, characterized in that the system comprises: a stereo vision image processing module using the binocular direct method, a satellite navigation module, an inertial navigation module and a system tight-coupling module; the inertial navigation module is configured to acquire the measurement data of the inertial navigation module with an inertial sensor; the stereo vision image processing module acquires the image data of the stereo vision module with a binocular camera; the satellite navigation module is configured to acquire the raw satellite navigation measurement data through a receiver; and the system tight-coupling module is configured to tightly couple the inertial navigation module measurement data, the image data of the stereo vision module and the raw satellite navigation measurement data;
    the stereo vision image processing module, the satellite navigation module and the inertial navigation module are all connected to the system tight-coupling module.
  9. The tightly coupled autonomous driving perception system according to claim 8, characterized in that the stereo vision image processing module comprises a binocular camera.
  10. The tightly coupled autonomous driving perception system according to claim 8, characterized in that the inertial navigation module comprises an inertial sensor, and the inertial sensor is fixed to the autonomous vehicle.