WO2018066754A1 - Method for estimating attitude of vehicle by using lidar sensor - Google Patents


Info

Publication number
WO2018066754A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
transformation matrix
attitude
calculating
occupancy
Application number
PCT/KR2016/013670
Other languages
French (fr)
Korean (ko)
Inventor
박태형
조준형
Original Assignee
충북대학교 산학협력단
Application filed by 충북대학교 산학협력단
Publication of WO2018066754A1

Classifications

    • B60W30/02 Control of vehicle driving stability (under B60W30/00, purposes of road vehicle drive control systems not related to the control of a particular sub-unit)
    • B60W40/06 Road conditions (under B60W40/02, estimation of driving parameters related to ambient conditions, within B60W40/00, estimation of non-directly measurable driving parameters)
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves (under G01S17/00, systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems)
    • G05D1/02 Control of position or course in two dimensions (under G05D1/00, control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots)

Definitions

  • The present invention relates to a method of estimating the current attitude of a vehicle, and more particularly, to a method of estimating the current attitude of a vehicle using a rear-mounted, downward-looking 2D LiDAR (Light Detection And Ranging) sensor.
  • FIG. 1 is a diagram defining the attitude of a vehicle.
  • Referring to FIG. 1, a three-dimensional coordinate system is fixed to the vehicle. Denoting rotation about the x-axis by α, rotation about the y-axis by β, and rotation about the z-axis by γ, α and β are defined as the attitude (pose) of the vehicle, and γ is defined as its heading.
  • A device commonly used to estimate a vehicle's attitude is the IMU (Inertial Measurement Unit), which measures the three-axis angular velocities, that is, the changes in α, β, and γ, and the three-axis acceleration. Because the values of α, β, and γ are not measured directly but only their changes, the changes must be accumulated over time. However, noise components accumulate as well, so an algorithm that fuses the result with other measurements is essential.
  • the vehicle attitude estimation system is generally configured by adding another sensor to the inertial measuring device.
  • The present invention has been devised to solve the above problems, and its purpose is to provide a method for estimating the attitude of a vehicle using a 2D LiDAR (Light Detection And Ranging) sensor mounted at the rear of an autonomous vehicle.
  • In a method for estimating the attitude of a vehicle that has a built-in inertial measurement unit and a LiDAR (Light Detection And Ranging) sensor mounted at the rear, the method comprises: acquiring measurement point data by measuring distances with the lidar sensor; projecting the measurement point data onto two-dimensional rectangular coordinates to find a road line; calculating the attitude change (α', β') of the vehicle from the road line; and calculating the attitude (α, β) of the vehicle by fusing the attitude change (α', β') with the measurements of the inertial measurement unit.
  • Finding the road line may include: converting the point set P, the set of measurement point data expressed in a polar coordinate system, into a rectangular coordinate system; preparing an occupancy grid map, a map in which each fixed-size cell holds an occupancy value, with the road reference line drawn in advance; obtaining a transformation matrix by matching the point set P to the occupancy grid map; continually updating the transformation matrix to optimize it; and deriving the slope and intercept of the road line from the optimized transformation matrix.
  • Optimizing the transformation matrix may consist of: calculating linearly interpolated occupancy and occupancy-gradient values; updating the transformation matrix using the linearly interpolated occupancy and gradient values; if the error of the updated transformation matrix is at or above a first predetermined threshold and the number of update iterations is at or below a second predetermined threshold, recalculating the linearly interpolated occupancy and gradient values and updating the transformation matrix again; and if the error of the updated transformation matrix is below the first threshold, or the number of update iterations is at or above the second threshold, terminating the optimization process.
  • The attitude of the vehicle can thus be estimated quickly by using a lidar sensor already attached to the vehicle, without adding an expensive inertial measurement device or GPS equipment.
  • the vehicle attitude estimation function can be implemented at no additional cost.
  • FIG. 1 is a diagram defining the attitude of a vehicle.
  • FIG. 2 shows an example of the measurement range and measured values of a 2D lidar sensor.
  • FIG. 3 is an overall flowchart of a method of estimating a vehicle's attitude using a lidar sensor according to an embodiment of the present invention.
  • FIG. 4 shows the measured values of a lidar sensor looking at the road, expressed in a Cartesian coordinate system.
  • FIG. 5 is an example in which a transformation is applied to an existing point set.
  • FIG. 6 illustrates the road reference line and a road line extracted while driving.
  • FIG. 7 illustrates the occupancy grid map used for optimization according to an embodiment of the present invention.
  • FIG. 8 is a flowchart of a transformation matrix optimization method according to an embodiment of the present invention.
  • the present invention relates to a method of estimating a vehicle's attitude using a LiDAR (Light Detection And Ranging) sensor.
  • The entity performing the vehicle attitude estimation method using the lidar sensor may be an electronic system embedded in the vehicle.
  • For example, the controller (processor) of an electronic control module built into the vehicle, or an electronic control system mounted on an autonomous vehicle, may perform the method.
  • A lidar sensor measures distance directly by emitting light; because it obtains accurate distances at high speed, it is widely used in autonomous vehicles for localization, obstacle detection, and moving-object tracking.
  • A 2D lidar sensor emits light at varying angles and so obtains multiple distance readings in a single scan.
  • FIG. 2 shows an example of the measurement range and measured values of a 2D lidar sensor.
  • The values measured by reflection off a vehicle lying within the measurement range, shown as the arc-shaped area in the left figure, appear as points in two-dimensional coordinates in the right figure.
  • Thus a 2D lidar sensor produces distance and phase data, called measurement points, that all lie in a single plane.
  • autonomous vehicles may be equipped with a 2D lidar sensor at the rear for parking or rear obstacle detection, and thus, there is no additional cost when using them for estimating the attitude of the vehicle.
  • FIG. 3 is an overall flowchart illustrating a method of estimating a vehicle attitude using a lidar sensor according to an exemplary embodiment of the present invention.
  • the vehicle attitude estimation method according to an embodiment of the present invention is as follows.
  • First, measurement point data, distances measured by the lidar sensor, are acquired (S310).
  • Next, the measurement point data are projected onto two-dimensional rectangular coordinates, and the road line is found on those coordinates (S320).
  • The attitude changes α' and β' of α and β are then calculated from the road line (S330).
  • Finally, the attitude values α and β are calculated by fusing these changes with the measurements of the inertial measurement unit (S340).
  • every step of S310 to S340 is executed once every measurement cycle of the lidar sensor.
  • each step will be described in detail.
  • In step S310, a point set consisting of distances and phases, the raw data measured by the 2D lidar sensor, is obtained. Each of these points is called a measurement point.
  • FIG. 4 is a diagram illustrating measured values of a lidar sensor viewed from a road in a Cartesian coordinate system.
  • Each measurement point may be represented as p_i = (r_i, θ_i), where r_i is the measured distance and θ_i the measurement angle.
  • One scan yields n measurement points; the number n depends on the performance of the lidar.
  • Measurement points with excessively large or small distance values are excluded from the point set beforehand.
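A minimal sketch of this acquisition step follows; the function name and the range cutoffs are illustrative assumptions, not values from the patent. It converts polar measurement points (r_i, θ_i) to Cartesian coordinates while discarding out-of-range points:

```python
import math

def scan_to_cartesian(scan, r_min=0.1, r_max=40.0):
    """Convert raw 2D-lidar points (r_i, theta_i) to Cartesian (x, y),
    dropping points whose distance is too small or too large.
    The cutoff values here are illustrative assumptions."""
    points = []
    for r, theta in scan:
        if r_min <= r <= r_max:
            points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Example scan: the 80 m return is treated as an outlier and removed.
raw = [(5.0, 0.0), (5.1, 0.05), (80.0, 0.1)]
filtered = scan_to_cartesian(raw)
```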
  • Step S320 is the process of finding the road line by projecting the measurement point data into a Cartesian coordinate system.
  • The road line is the straight line formed by the measurement points produced when the lidar's light reflects off the road surface, that is, the points carrying the road's distance and phase.
  • In FIG. 4, the measurement points obtained by a lidar sensor looking at the road are illustrated.
  • the measuring points representing the road can be approximated by a straight line.
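To illustrate that the road-surface points are well modeled by a straight line, a plain least-squares fit is sketched below. This is only a stand-in for intuition; the patent itself extracts the road line via the transformation-matrix matching described next, not by direct regression:

```python
def fit_line(points):
    """Ordinary least-squares fit of y = a*x + b through 2D points,
    returning the slope a and intercept b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Points lying exactly on y = 0.02*x - 1.5 recover that slope and intercept.
slope, intercept = fit_line([(x, 0.02 * x - 1.5) for x in range(10)])
```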
  • the present invention proposes a method of finding a road line by matching a predefined road reference line for every measurement.
  • the range is arbitrarily reduced when the viewing angle of the lidar sensor is wide.
  • the point set P is then multiplied by the two-dimensional transformation matrix.
  • FIG. 5 is an exemplary diagram in which a transform is applied to an existing point set.
  • The transformation matrix is the two-dimensional rigid transform T = [cos φ, -sin φ, t_x; sin φ, cos φ, t_y; 0, 0, 1], where t_x denotes x-axis translation, t_y denotes y-axis translation, and φ denotes the rotation angle. Once this transformation matrix is found, the road line can be obtained by simple calculation.
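Under the usual homogeneous-coordinate convention (an assumption consistent with the parameters t_x, t_y, and φ named above), building the transform and applying it to a point set can be sketched as:

```python
import math

def make_transform(tx, ty, phi):
    """2D rigid-body transform: rotate by phi, then translate by (tx, ty),
    as a 3x3 homogeneous matrix."""
    c, s = math.cos(phi), math.sin(phi)
    return [[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]]

def apply_transform(T, points):
    """Apply homogeneous transform T to each Cartesian point (x, y)."""
    return [(T[0][0] * x + T[0][1] * y + T[0][2],
             T[1][0] * x + T[1][1] * y + T[1][2]) for x, y in points]

# A pure translation shifts every point by (1, 2).
shifted = apply_transform(make_transform(1.0, 2.0, 0.0),
                          [(0.0, 0.0), (3.0, 4.0)])
```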
  • The road reference line refers to the portion of the lidar measurements that falls on flat road.
  • FIG. 6 is a diagram illustrating a road reference line and a road line extracted while driving.
  • the dashed line in FIG. 6 is the road reference line, which depends on the mounting height and downward angle of the lidar sensor.
  • An occupancy grid map is one way of representing a map: each cell of fixed size holds an occupancy value.
  • In the present invention, an occupancy grid map on which the road reference line is drawn in advance is used.
  • FIG. 7 illustrates the occupancy grid map used for optimization according to an embodiment of the present invention.
  • The occupancy grid map is prepared with the reference line drawn as a straight line whose profile follows a Gaussian distribution, and it is blanked line by line to prevent numerical divergence due to a zero y-axis gradient.
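A minimal sketch of such a reference map follows. The dimensions, σ, and the blanking pattern are illustrative assumptions: each cell's occupancy is a Gaussian of its row's vertical distance from the reference line, and alternate columns are zeroed as one guess at the patent's "blanked line by line" device:

```python
import math

def build_reference_map(rows, cols, d, cell, sigma):
    """Occupancy grid whose occupancy decays as a Gaussian of each cell's
    vertical distance to the road reference line at y = -d (metres).
    cell is the map resolution in metres per cell; every other column is
    blanked (an assumption about the patent's blanking scheme)."""
    grid = []
    for r in range(rows):
        y = (r - rows // 2) * cell          # cell-centre y in metres
        occ = math.exp(-((y + d) ** 2) / (2.0 * sigma ** 2))
        grid.append([occ if c % 2 == 0 else 0.0 for c in range(cols)])
    return grid

grid = build_reference_map(rows=41, cols=8, d=1.0, cell=0.1, sigma=0.2)
```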
  • The distance between the road reference line and the origin of the map is calculated from the height and downward angle of the lidar sensor, where d is the distance from the origin of the occupancy grid map to the road reference line, h is the mounting height of the lidar sensor, and θ is the downward angle of the sensor.
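The formula itself is not reproduced in this text. One geometrically plausible reading, stated here purely as an assumption, is that a scan plane tilted down by the angle θ from a sensor at height h meets flat ground at slant range h / sin θ:

```python
import math

def reference_line_distance(h, theta):
    """Assumed geometry (the patent's formula is not given in this text):
    the distance, measured inside the tilted scan plane, from the sensor
    origin to the ground line is d = h / sin(theta) for mounting height h
    and downward angle theta (radians)."""
    return h / math.sin(theta)

# Sensor 1 m high, pointed straight down (theta = 90 deg): d equals h.
d_down = reference_line_distance(1.0, math.pi / 2)
# Shallower 30-degree tilt doubles the slant distance.
d_tilt = reference_line_distance(1.0, math.pi / 6)
```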
  • The size of each cell of the grid map, that is, its resolution, depends on the performance of the LiDAR sensor: the more measurement points per unit of viewing angle, the finer the occupancy grid map's resolution can be. In the present invention, the resolution of the occupancy grid map must be high to increase the precision of the transformation matrix.
  • the road line can be found by deriving the slope and the intercept of the road line from the optimized transformation matrix.
  • The vehicle's attitude changes α' and β' can be calculated from the transformation matrix.
  • The Levenberg-Marquardt algorithm, a nonlinear function optimization technique, is applied to find the optimal transform T*.
  • This algorithm converges quickly and, when the initial value lies within a certain range, has a high probability of finding the global solution.
  • FIG. 8 is a flowchart illustrating a transform matrix optimization method according to an embodiment of the present invention.
  • First, linearly interpolated occupancy and occupancy-gradient values are calculated (S810).
  • The transformation matrix is then updated using the linearly interpolated occupancy and gradient values (S820).
  • Here p_{k+1} denotes the parameters t_x, t_y, φ of the transformation matrix T* being optimized, and p_k those of the previous transformation matrix T.
  • R(p_k) is the error when the current transformation matrix is applied, J_r is the first derivative of the error function, and H_r is the second derivative of the error function.
  • The J_r matrix is calculated using the occupancy values and the first derivatives at each point, and the transformation matrix is updated.
  • J r is calculated as follows.
  • x and y are the coordinates of each point in the point set, and φ is the rotation angle of the road line.
  • H_r is obtained from the calculated J_r, and the updated transformation matrix T* is computed.
  • If the error has decreased, the damping coefficient λ_k is reduced; otherwise λ_k is increased for use in the next iteration.
  • When the error falls below the first threshold or the number of iterations reaches the second threshold, the optimization process is terminated and the updated transformation matrix T* is stored.
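The control flow of FIG. 8 (compute the error and its derivatives, take a damped update step, accept or reject it by adjusting the damping factor λ_k, stop on an error threshold or an iteration cap) can be sketched on a toy one-parameter problem. The cost function below is illustrative only, not the patent's occupancy-matching cost:

```python
def levenberg_marquardt(residual, jacobian, p0, err_tol=1e-8, max_iter=100):
    """Scalar Levenberg-Marquardt loop mirroring the flow of FIG. 8:
    stop when the squared error drops below a first threshold or the
    iteration count reaches a second threshold; shrink the damping
    factor lam after an accepted step, grow it after a rejected one."""
    p, lam = p0, 1e-3
    err = residual(p) ** 2
    for _ in range(max_iter):
        if err < err_tol:                 # first-threshold exit
            break
        r, J = residual(p), jacobian(p)
        H = J * J                         # Gauss-Newton approximation of H_r
        p_new = p - J * r / (H + lam)     # damped update step
        err_new = residual(p_new) ** 2
        if err_new < err:
            p, err, lam = p_new, err_new, lam * 0.5   # accept: reduce lam
        else:
            lam *= 2.0                                # reject: increase lam
    return p

# Toy cost (p - 3)^2: residual r(p) = p - 3, with dr/dp = 1.
p_star = levenberg_marquardt(lambda p: p - 3.0, lambda p: 1.0, p0=0.0)
```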
  • The final step, S340, compensates by fusing the changes in α and β obtained in step S330 with the α and β estimated by the vehicle's internal inertial measurement sensor.
  • The vehicle attitude is estimated using the vehicle speed, the measurement cycle of the lidar sensor, and the attitude change acquired by the lidar sensor.
  • A Kalman filter is then used to fuse this estimate with the attitude estimated by the inertial measurement sensor.
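As a minimal illustration of the fusion step (scalar state, with assumed noise variances Q and R; the patent does not specify the filter's dimensions or tuning), one Kalman update fusing a lidar-predicted attitude with the IMU-derived estimate might look like:

```python
def kalman_fuse(x_pred, P, z_imu, Q, R):
    """One scalar Kalman-filter step: x_pred is the attitude propagated
    using the lidar-derived change, z_imu the IMU-based estimate.
    Q and R are assumed process/measurement noise variances."""
    P = P + Q                             # predict: inflate covariance
    K = P / (P + R)                       # Kalman gain
    x = x_pred + K * (z_imu - x_pred)     # fuse the two estimates
    P = (1.0 - K) * P                     # updated covariance
    return x, P

# With P + Q equal to R, the fused attitude lies midway between the two.
x, P = kalman_fuse(x_pred=0.10, P=0.5, z_imu=0.20, Q=0.5, R=1.0)
```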


Abstract

The present invention provides a method for estimating the attitude of a vehicle having a built-in inertial measurement unit and having a light detection and ranging (LiDAR) sensor mounted in the rear thereof, the method comprising the steps of: acquiring measurement point data corresponding to a distance measured using the LiDAR sensor; finding a road line by projecting the measurement point data on two-dimensional rectangular coordinates; calculating an attitude variation (α', β') of the vehicle from the road line; and calculating the attitude (α, β) of the vehicle by combining the attitude variation (α', β') with a measurement value measured by the inertial measurement unit. Using a LiDAR sensor which is conventionally mounted to a vehicle, the present invention can quickly estimate the attitude of a vehicle without adding any expensive inertial measurement unit or GPS device.

Description

Method for estimating the attitude of a vehicle using a lidar sensor
The present invention relates to a method of estimating the current attitude of a vehicle, and more particularly, to a method of estimating the current attitude of a vehicle using a rear-mounted, downward-looking 2D LiDAR (Light Detection And Ranging) sensor.
To control an autonomous vehicle traveling at high speed, it is essential to estimate the vehicle's attitude.
FIG. 1 is a diagram defining the attitude of a vehicle.
Referring to FIG. 1, a three-dimensional coordinate system is fixed to the vehicle. Denoting rotation about the x-axis by α, rotation about the y-axis by β, and rotation about the z-axis by γ, α and β are defined as the attitude (pose) of the vehicle, and γ is defined as its heading.
A device commonly used to estimate a vehicle's attitude is the IMU (Inertial Measurement Unit, hereinafter "inertial measurement unit"), which measures the three-axis angular velocities, that is, the changes in α, β, and γ, and the three-axis acceleration. Because the values of α, β, and γ are not measured directly but only their changes, the changes must be accumulated over time. However, noise components accumulate as well, so an algorithm that fuses the result with other measurements is essential.
When estimation relies only on fusing the inertial measurement unit's three-axis acceleration values, without adding a separate sensor, precision is low and the delay introduced by filtering is also a problem. Using an expensive unit mitigates this to some extent, but since inertial measurement units become sharply more expensive as precision rises, a vehicle attitude estimation system is generally built by adding another sensor to the inertial measurement unit.
When a GPS (Global Positioning System) receiver is added, the vehicle's position information is fused with the inertial measurement unit's measurements. Fusion yields adequate positional accuracy, but the attitude that is actually needed is unreliable because a typical GPS receiver updates very slowly, at about 1 Hz. With an expensive receiver the position information is relatively accurate, but it remains difficult to estimate, in real time, the attitude of a vehicle traveling at high speed.
When a geomagnetic sensor is added, the value of γ itself, rather than its change, can be measured directly, so both the attitude and the heading of the vehicle can be estimated. However, precision does not improve significantly, and the delay caused by filtering is still not reduced.
The present invention has been devised to solve the above problems, and its purpose is to provide a method for estimating the attitude of a vehicle using a 2D LiDAR (Light Detection And Ranging) sensor mounted at the rear of an autonomous vehicle.
The objects of the present invention are not limited to those mentioned above; other objects not mentioned will be clearly understood by those skilled in the art from the description below.
To achieve this object, in a method for estimating the attitude of a vehicle that has a built-in inertial measurement unit and a LiDAR (Light Detection And Ranging) sensor mounted at the rear, the method comprises: acquiring measurement point data by measuring distances with the lidar sensor; projecting the measurement point data onto two-dimensional rectangular coordinates to find a road line; calculating the attitude change (α', β') of the vehicle from the road line; and calculating the attitude (α, β) of the vehicle by fusing the attitude change (α', β') with the measurements of the inertial measurement unit.
Finding the road line may include: converting the point set P, the set of measurement point data expressed in a polar coordinate system, into a rectangular coordinate system; preparing an occupancy grid map, a map in which each fixed-size cell holds an occupancy value, with the road reference line drawn as a straight line following a Gaussian distribution; obtaining a transformation matrix by matching the point set P to the occupancy grid map; continually updating the transformation matrix to optimize it; and deriving the slope and intercept of the road line from the optimized transformation matrix.
The attitude change (α', β') of the vehicle can be calculated from the transformation matrix.
Optimizing the transformation matrix may consist of: calculating linearly interpolated occupancy and occupancy-gradient values; updating the transformation matrix using the linearly interpolated occupancy and gradient values; if the error of the updated transformation matrix is at or above a first predetermined threshold and the number of update iterations is at or below a second predetermined threshold, recalculating the linearly interpolated occupancy and gradient values and updating the transformation matrix again; and if the error of the updated transformation matrix is below the first threshold, or the number of update iterations is at or above the second threshold, terminating the optimization process.
According to the present invention, the attitude of the vehicle can be estimated quickly by using a lidar sensor already attached to the vehicle, without adding an expensive inertial measurement device or GPS equipment.
In addition, since the present invention uses a lidar sensor already mounted on the vehicle, the vehicle attitude estimation function can be implemented at no additional cost.
FIG. 1 is a diagram defining the attitude of a vehicle.
FIG. 2 shows an example of the measurement range and measured values of a 2D lidar sensor.
FIG. 3 is an overall flowchart of a method of estimating a vehicle's attitude using a lidar sensor according to an embodiment of the present invention.
FIG. 4 shows the measured values of a lidar sensor looking at the road, expressed in a Cartesian coordinate system.
FIG. 5 is an example in which a transformation is applied to an existing point set.
FIG. 6 illustrates the road reference line and a road line extracted while driving.
FIG. 7 illustrates the occupancy grid map used for optimization according to an embodiment of the present invention.
FIG. 8 is a flowchart of a transformation matrix optimization method according to an embodiment of the present invention.
In a method for estimating the attitude of a vehicle that has a built-in inertial measurement unit and a LiDAR (Light Detection And Ranging) sensor mounted at the rear, the present invention comprises: acquiring measurement point data by measuring distances with the lidar sensor; projecting the measurement point data onto two-dimensional rectangular coordinates to find a road line; calculating the attitude change (α', β') of the vehicle from the road line; and calculating the attitude (α, β) of the vehicle by fusing the attitude change (α', β') with the measurements of the inertial measurement unit.
As the present invention allows various changes and may have numerous embodiments, particular embodiments are illustrated in the drawings and described in detail in the written description. This is not, however, intended to limit the present invention to specific embodiments; the invention should be understood to include all modifications, equivalents, and substitutes falling within its spirit and technical scope.
The terminology used in this application is for describing particular embodiments only and is not intended to limit the present invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this application, terms such as "comprise" or "have" are intended to indicate the presence of the features, numbers, steps, operations, components, parts, or combinations thereof described in the specification, and should not be understood to exclude in advance the presence or possible addition of one or more other features, numbers, steps, operations, components, parts, or combinations thereof.
Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meanings as commonly understood by one of ordinary skill in the art to which the present invention belongs. Terms defined in commonly used dictionaries should be interpreted as having meanings consistent with their meanings in the context of the related art, and are not to be interpreted in an idealized or overly formal sense unless expressly so defined in this application.
In the description with reference to the accompanying drawings, identical components are given the same reference numerals regardless of the figure number, and duplicate descriptions are omitted. Detailed descriptions of related known technology are omitted where they could unnecessarily obscure the subject matter of the present invention.
The present invention relates to a method of estimating the attitude of a vehicle using a LiDAR (Light Detection And Ranging) sensor.
In the present invention, the entity performing the vehicle attitude estimation method using the lidar sensor may be an electronic system embedded in the vehicle, for example, the controller (processor) of an electronic control module built into the vehicle, or an electronic control system mounted on an autonomous vehicle.
A lidar sensor measures distance directly by emitting light; because it obtains accurate distances at high speed, it is widely used in autonomous vehicles for localization, obstacle detection, and moving-object tracking. A 2D lidar sensor emits light at varying angles and so obtains multiple distance readings in a single scan.
FIG. 2 shows an example of the measurement range and measured values of a 2D lidar sensor.
Referring to FIG. 2, the values measured by reflection off a vehicle lying within the measurement range, shown as the arc-shaped area in the left figure, appear as points in two-dimensional coordinates in the right figure. Thus a 2D lidar sensor produces distance and phase data, called measurement points, that all appear on a single plane.
In general, autonomous vehicles are often equipped with a 2D lidar sensor at the rear for parking or rear obstacle detection, so using it for vehicle attitude estimation incurs no additional cost.
FIG. 3 is an overall flowchart illustrating a method of estimating vehicle attitude using a LiDAR sensor according to an embodiment of the present invention.

Referring to FIG. 3, the vehicle attitude estimation method according to an embodiment of the present invention is as follows.
First, measurement point data in which distances are measured is obtained from the LiDAR sensor (S310).

Next, the measurement point data is expressed in two-dimensional Cartesian coordinates, and the road line is found in those coordinates (S320).

Then, the attitude change amounts (α', β') of α and β are calculated from the road line (S330).

Finally, the final attitude values α and β are calculated by fusing the change amounts with the measurements of an inertial measurement unit (S340).

In the present invention, all of steps S310 to S340 are executed once per measurement cycle of the LiDAR sensor. Each step is described in detail below.
In step S310, a set of points consisting of distance and phase — the raw data measured by the 2D LiDAR sensor — is obtained. Each of these points is called a measurement point.

FIG. 4 shows the measured values of a LiDAR sensor facing the road, expressed in a Cartesian coordinate system.

In FIG. 4, each point can be expressed by the following equation.
$$P = \{\, p_i = (r_i, \theta_i) \mid i = 1, \dots, n \,\}$$
Here, r denotes distance, θ denotes angle, and n measurement points are obtained. The number of measurements n depends on the performance of the LiDAR. To reduce computational complexity and improve accuracy, measurement points with excessively large or small distance values are excluded from the point set in advance.
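As an illustration of this pre-filtering, the sketch below drops returns whose range falls outside an assumed plausible band; the bounds `r_min` and `r_max` are illustrative values, not taken from the specification:

```python
def filter_measurements(points, r_min=0.5, r_max=30.0):
    """Discard measurement points whose range is implausibly small or large.

    `points` is a list of (r, theta) pairs as produced by a 2D LiDAR scan.
    """
    return [(r, theta) for (r, theta) in points if r_min <= r <= r_max]

# a 0.1 m return (sensor housing) and an 80 m return (out of range) are dropped
scan = [(0.1, -0.3), (4.2, -0.1), (4.5, 0.0), (4.8, 0.1), (80.0, 0.3)]
kept = filter_measurements(scan)
```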
Step S320 projects the measurement point data into a Cartesian coordinate system and finds the road line. Here, the road line is a straight line formed by those measurement points whose light was reflected from the road — that is, the points representing the distance and phase to the road.

FIG. 4 illustrates the measurement points obtained by a LiDAR sensor facing the road. In FIG. 4, the measurement points representing the road can be approximated by a straight line.

Finding this road line requires a fast and accurate line extraction method. Several existing line extraction methods are fast and accurate, but they lack robustness when applied to a road environment that changes in real time. To solve this, the present invention proposes finding the road line by matching every measurement against a predefined road reference line.
First, the conversion from the polar coordinate system to the Cartesian coordinate system is as follows.
$$x_i = r_i \cos\theta_i, \qquad y_i = r_i \sin\theta_i$$
Since the present invention requires only data reflected from the road, the range is arbitrarily reduced when the LiDAR sensor's field of view is wide. The point set P is then multiplied by a two-dimensional transformation matrix.

FIG. 5 illustrates a transformation applied to an existing point set. The transformation matrix is as follows.
$$T = \begin{bmatrix} \cos\varphi & -\sin\varphi & t_x \\ \sin\varphi & \cos\varphi & t_y \\ 0 & 0 & 1 \end{bmatrix}$$
Here, t<sub>x</sub> denotes translation along the x axis, t<sub>y</sub> denotes translation along the y axis, and φ is the rotation angle. Once this transformation matrix is found, the road line can easily be obtained by a simple calculation.
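The polar-to-Cartesian conversion and the application of the 2D transformation matrix can be sketched as follows. Homogeneous coordinates are used so that rotation and translation combine in a single matrix; the numerical example is purely illustrative:

```python
import numpy as np

def to_cartesian(points):
    """Convert (r, theta) measurement points to homogeneous Cartesian columns."""
    r = np.array([p[0] for p in points])
    th = np.array([p[1] for p in points])
    return np.vstack([r * np.cos(th), r * np.sin(th), np.ones_like(r)])

def transform(tx, ty, phi):
    """2D homogeneous rigid transform built from (tx, ty, phi)."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0,  0,  1]])

P = to_cartesian([(5.0, 0.0), (5.0, np.pi / 2)])   # points at (5,0) and ~(0,5)
Q = transform(1.0, 0.0, np.pi / 2) @ P             # rotate 90 deg, shift +1 in x
```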
Finding the transformation matrix requires a road reference line to compare against. The road reference line is the portion of the LiDAR measurements, acquired on flat ground, that corresponds to the road.

FIG. 6 illustrates the road reference line and a road line extracted while driving.

The dotted line in FIG. 6 is the road reference line, which depends on the mounting height and downward angle of the LiDAR sensor.
There are many ways to find a transformation matrix, but few can be used in irregular situations such as driving on a road: there is no guarantee that the point set has the same number of points each time, and the transformation matrix cannot be updated by a simple matrix operation. Previous approaches have therefore relied mainly on extracting and matching features such as straight lines, or on refined point-to-point matching; the former is fast but loses robustness badly when the vehicle turns, while the latter is accurate but difficult to run in real time. The present invention therefore uses function optimization within an occupancy grid map.

Briefly, to find the transformation matrix, the point set P is matched against a predefined occupancy grid map, and the transformation matrix with the smallest error is sought. An occupancy grid map is one way of representing a map, in which each fixed-size cell holds an occupancy value. The present invention uses an occupancy grid map on which the road reference line has been drawn in advance.
FIG. 7 illustrates the occupancy grid map used for optimization according to an embodiment of the present invention.

Referring to FIG. 7, the occupancy grid map is drawn as a straight line with a Gaussian cross-section, with every other row left empty to keep the gradient along the y axis from becoming zero and causing numerical divergence. The distance between the road reference line and the origin of the map is calculated from the height and downward angle of the LiDAR sensor as follows.
$$\rho = \frac{h}{\sin\omega}$$
Here, ρ is the distance from the origin of the occupancy grid map to the road reference line, h is the height of the LiDAR sensor, and ω is the downward angle of the sensor.

In the present invention, the size of each cell of the occupancy grid map — its resolution — depends on the performance of the LiDAR sensor: the more points per field of view, the higher the map resolution can be. A higher-resolution occupancy grid map yields a more precise transformation matrix.
In the present invention, the road line is found by deriving its slope and intercept from the optimized transformation matrix.

The vehicle attitude change amounts (α', β') can then be calculated from the transformation matrix.

The matching process of the present invention is now described in detail.

To find the optimal transformation T*, the present invention applies the Levenberg-Marquardt algorithm, a nonlinear function optimization technique. When the initial value lies within a certain range, this algorithm has a high probability of finding the global solution and converges quickly.
FIG. 8 is a flowchart showing a transformation matrix optimization method according to an embodiment of the present invention.

Referring to FIG. 8, optimizing the transformation matrix according to an embodiment of the present invention proceeds as follows. First, linearly interpolated occupancy and occupancy gradient values are calculated (S810).

The transformation matrix is then updated using the linearly interpolated occupancy and gradient values (S820).

If the error of the updated transformation matrix is greater than or equal to a predetermined first threshold (S830) and the number of update iterations is less than or equal to a predetermined second threshold (S840), the linearly interpolated occupancy and occupancy gradient values are calculated again and the transformation matrix is updated once more (S810, S820).

If the error of the updated transformation matrix is less than the first threshold (S830), or the number of update iterations reaches the second threshold (S840), the optimization process ends.
This optimization process is described in more detail below.

First, the transformation matrix T* is updated until it converges sufficiently, according to the following equation.
$$p_{k+1} = p_k - \left(H_r + \mu_k I\right)^{-1} J_r^{\top}\, r(p_k)$$
Here, p<sub>k+1</sub> contains the variables t<sub>x</sub>, t<sub>y</sub>, and φ of the transformation matrix T* being optimized, and p<sub>k</sub> corresponds to the previous transformation matrix T. r(p<sub>k</sub>) is the error when the current transformation matrix is applied, J<sub>r</sub> is the first-derivative (Jacobian) matrix of the error function, and H<sub>r</sub> is the second-derivative (Hessian) matrix of the error function.

H<sub>r</sub> can be obtained from J<sub>r</sub> as in the following equation.
$$H_r \approx J_r^{\top} J_r$$
Next, substituting the previous transformation matrix T, the linearly interpolated occupancy and occupancy gradient values are obtained as in the following equations, where the four cells surrounding the transformed point lie at $(x_0, y_0)$ through $(x_1, y_1)$:

$$v \approx \frac{y - y_0}{y_1 - y_0}\!\left(\frac{x - x_0}{x_1 - x_0}\, v_{11} + \frac{x_1 - x}{x_1 - x_0}\, v_{01}\right) + \frac{y_1 - y}{y_1 - y_0}\!\left(\frac{x - x_0}{x_1 - x_0}\, v_{10} + \frac{x_1 - x}{x_1 - x_0}\, v_{00}\right)$$

$$\frac{\partial v}{\partial x} \approx \frac{y - y_0}{y_1 - y_0}\cdot\frac{v_{11} - v_{01}}{x_1 - x_0} + \frac{y_1 - y}{y_1 - y_0}\cdot\frac{v_{10} - v_{00}}{x_1 - x_0}$$

$$\frac{\partial v}{\partial y} \approx \frac{x - x_0}{x_1 - x_0}\cdot\frac{v_{11} - v_{10}}{y_1 - y_0} + \frac{x_1 - x}{x_1 - x_0}\cdot\frac{v_{01} - v_{00}}{y_1 - y_0}$$

Here, v is the interpolated occupancy, $v_{00}, v_{01}, v_{10}, v_{11}$ are the occupancies of the four neighboring cells, $\partial v / \partial x$ is the gradient in the x direction, and $\partial v / \partial y$ is the gradient in the y direction.
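The interpolation step can be sketched as below, assuming unit cell spacing and a map stored as a row-major grid (both implementation choices, not fixed by the specification):

```python
def interpolate(grid, x, y):
    """Bilinear interpolation of occupancy and its x/y gradients.

    `grid[iy][ix]` holds the occupancy of the cell at integer coordinates
    (ix, iy); cell spacing is taken as 1, so (x1 - x0) = (y1 - y0) = 1.
    """
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    v00 = grid[y0][x0]          # occupancy at (x0, y0)
    v10 = grid[y0][x0 + 1]      # occupancy at (x1, y0)
    v01 = grid[y0 + 1][x0]      # occupancy at (x0, y1)
    v11 = grid[y0 + 1][x0 + 1]  # occupancy at (x1, y1)
    v = (1 - fy) * ((1 - fx) * v00 + fx * v10) + fy * ((1 - fx) * v01 + fx * v11)
    dv_dx = (1 - fy) * (v10 - v00) + fy * (v11 - v01)
    dv_dy = (1 - fx) * (v01 - v00) + fx * (v11 - v10)
    return v, dv_dx, dv_dy
```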
Then, the J<sub>r</sub> matrix is computed using the occupancy and first derivative at each point, and the transformation matrix is updated. J<sub>r</sub> is computed as follows.
$$J_r = \begin{bmatrix} \dfrac{\partial v}{\partial x} & \dfrac{\partial v}{\partial y} \end{bmatrix} \begin{bmatrix} 1 & 0 & -x \sin\varphi - y \cos\varphi \\ 0 & 1 & \phantom{-}x \cos\varphi - y \sin\varphi \end{bmatrix}$$
Here, x and y are the coordinates of each point in the point set, and φ is the rotation angle of the road line.

H<sub>r</sub> is then obtained from the computed J<sub>r</sub>, and the updated transformation matrix T* is calculated.

In the present invention, if the error decreases, the damping factor μ<sub>k</sub> is reduced; otherwise μ<sub>k</sub> is increased for the next iteration. When the error falls below the threshold or the set number of iterations is reached, the optimization ends and the updated transformation matrix T* is stored.
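The full optimization loop, including the damping schedule, can be sketched as follows. The residual convention (a point's error taken as 1 minus its interpolated occupancy) and the damping update factors are assumptions made for illustration; only the overall Levenberg-Marquardt structure follows the description above:

```python
import numpy as np

def lm_match(points, occ_and_grad, p0, mu0=0.01, max_iter=30, tol=1e-6):
    """Levenberg-Marquardt sketch of the map-matching step.

    `points` are scan points (x, y); `occ_and_grad(x, y)` returns the
    interpolated occupancy v and its gradients (dv/dx, dv/dy) from the
    reference map; `p0 = (tx, ty, phi)` is the initial transform.
    """
    p, mu = np.asarray(p0, float), mu0

    def residual_and_jac(p):
        tx, ty, phi = p
        c, s = np.cos(phi), np.sin(phi)
        r = np.empty(len(points))
        J = np.empty((len(points), 3))
        for i, (x, y) in enumerate(points):
            wx, wy = c * x - s * y + tx, s * x + c * y + ty
            v, gx, gy = occ_and_grad(wx, wy)
            r[i] = 1.0 - v  # assumed convention: fully occupied cell = zero error
            # chain rule: d(wx)/d(tx,ty,phi) = (1, 0, -x*s - y*c),
            #             d(wy)/d(tx,ty,phi) = (0, 1,  x*c - y*s)
            J[i] = -np.array([gx, gy, gx * (-x * s - y * c) + gy * (x * c - y * s)])
        return r, J

    r, J = residual_and_jac(p)
    err = r @ r
    for _ in range(max_iter):
        H = J.T @ J + mu * np.eye(3)             # Gauss-Newton Hessian + damping
        p_new = p - np.linalg.solve(H, J.T @ r)
        r_new, J_new = residual_and_jac(p_new)
        err_new = r_new @ r_new
        if err_new < err:                        # accept step, relax damping
            p, r, J, err, mu = p_new, r_new, J_new, err_new, mu * 0.5
            if err < tol:
                break
        else:                                    # reject step, stiffen damping
            mu *= 2.0
    return p
```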
Step S330 obtains the changes in the vehicle's α and β from the transformation matrix.

Referring to FIG. 6, applying three-dimensional rotations about the x and y axes to the straight line obtained from the transformation matrix (solid blue line) yields the flat-ground straight line (green dotted line). The vehicle's α and β are then calculated by the following equation.
$$p = R_x(\alpha)\, R_y(\beta)\, p'$$
Here, p is a point on the road reference line on flat ground, taking into account the mounting position and downward angle of the LiDAR sensor; p' is a point on the calculated line; R<sub>x</sub> is a 4×4 rotation matrix about the x axis; and R<sub>y</sub> is a 4×4 rotation matrix about the y axis. The calculation reduces to a simultaneous quadratic system in α and β that is easily solved, and because the comparison is made against the flat-ground road reference line, the result is genuinely the amount of change.

The final step, S340, compensates by combining the α and β change amounts obtained through step S330 with the α and β estimated by the inertial measurement sensor inside the vehicle. The vehicle's attitude is estimated from the vehicle speed, the measurement cycle of the LiDAR sensor, and the attitude change amounts acquired by the LiDAR sensor, and a Kalman filter is then used to fuse this with the attitude estimated by the inertial measurement sensor.
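A minimal sketch of the fusion step is a scalar Kalman update per attitude angle. The variance values are tuning assumptions; the specification does not give the filter's dimensions or noise parameters:

```python
def kalman_fuse(alpha_pred, p_pred, alpha_imu, r_imu):
    """One scalar Kalman update: correct the LiDAR-predicted angle with the
    IMU estimate. p_pred is the prediction variance, r_imu the IMU variance
    (both assumed tuning parameters)."""
    k = p_pred / (p_pred + r_imu)             # Kalman gain
    alpha = alpha_pred + k * (alpha_imu - alpha_pred)
    p = (1.0 - k) * p_pred                    # updated variance
    return alpha, p

# equal confidence in both sources -> the fused angle is the midpoint
alpha, p = kalman_fuse(2.0, 1.0, 3.0, 1.0)
```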
While the invention has been described using several preferred embodiments, these embodiments are illustrative and not restrictive. Those skilled in the art will appreciate that various changes and modifications can be made without departing from the spirit of the invention and the scope of the rights set forth in the appended claims.

Claims (4)

  1. A method of estimating the attitude of a vehicle having a built-in inertial measurement unit and a LiDAR (Light Detection And Ranging) sensor mounted at the rear, the method comprising:
    acquiring, using the LiDAR sensor, measurement point data in which distances are measured;
    projecting the measurement point data onto two-dimensional Cartesian coordinates to find a road line;
    calculating attitude change amounts (α', β') of the vehicle from the road line; and
    calculating the attitude (α, β) of the vehicle by fusing the attitude change amounts (α', β') with measurements of the inertial measurement unit.
  2. The method according to claim 1, wherein finding the road line comprises:
    converting a point set P, the set of measurement point data expressed in a polar coordinate system, into a Cartesian coordinate system;
    creating an occupancy grid map, a map in which each fixed-size cell holds an occupancy value, in the form of a straight line following a Gaussian distribution;
    obtaining a transformation matrix by matching the point set P to the occupancy grid map;
    continually updating the transformation matrix to optimize it; and
    deriving the slope and intercept of the road line from the optimized transformation matrix.
  3. The method according to claim 2, wherein the attitude change amounts (α', β') of the vehicle are calculated from the transformation matrix.
  4. The method according to claim 2, wherein optimizing the transformation matrix comprises:
    calculating linearly interpolated occupancy and occupancy gradient values;
    updating the transformation matrix using the linearly interpolated occupancy and gradient values;
    if the error of the updated transformation matrix is greater than or equal to a predetermined first threshold and the number of update iterations is less than or equal to a predetermined second threshold, calculating the linearly interpolated occupancy and occupancy gradient values again and updating the transformation matrix; and
    if the error of the updated transformation matrix is less than the first threshold, or the number of update iterations is greater than or equal to the second threshold, terminating the optimization process.
PCT/KR2016/013670 2016-10-06 2016-11-25 Method for estimating attitude of vehicle by using lidar sensor WO2018066754A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160128861A KR101909953B1 (en) 2016-10-06 2016-10-06 Method for vehicle pose estimation using LiDAR
KR10-2016-0128861 2016-10-06

Publications (1)

Publication Number Publication Date
WO2018066754A1 true WO2018066754A1 (en) 2018-04-12

Family

ID=61831821

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2016/013670 WO2018066754A1 (en) 2016-10-06 2016-11-25 Method for estimating attitude of vehicle by using lidar sensor

Country Status (2)

Country Link
KR (1) KR101909953B1 (en)
WO (1) WO2018066754A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11199614B1 (en) * 2020-07-08 2021-12-14 Beijing Voyager Technology Co., Ltd Lidar and image calibration for autonomous vehicles
KR102400033B1 (en) * 2020-08-31 2022-05-20 (주)오토노머스에이투지 Method and computing device for classifying sensing points of multi-layer lidar sensor installed on moving body
KR102635242B1 (en) * 2022-05-30 2024-02-08 현대모비스 주식회사 Apparatus for estimating road to vehicle pitch and method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9086481B1 (en) * 2013-01-18 2015-07-21 Google Inc. Methods and systems for estimating vehicle speed
US20150203109A1 (en) * 2014-01-21 2015-07-23 Continental Automotive Systems, Inc. Road departure protection system
US9207680B1 (en) * 2014-01-07 2015-12-08 Google Inc. Estimating multi-vehicle motion characteristics by finding stable reference points
US20160026184A1 (en) * 2014-07-24 2016-01-28 GM Global Technology Operations LLC Curb detection using lidar with sparse measurements
US20160035081A1 (en) * 2014-04-25 2016-02-04 Google Inc. Methods and Systems for Object Detection using Laser Point Clouds


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111806421A (en) * 2019-04-01 2020-10-23 通用汽车环球科技运作有限责任公司 Vehicle attitude determination system and method
CN112923933A (en) * 2019-12-06 2021-06-08 北理慧动(常熟)车辆科技有限公司 Laser radar SLAM algorithm and inertial navigation fusion positioning method
CN111897365A (en) * 2020-08-27 2020-11-06 中国人民解放军国防科技大学 Autonomous vehicle three-dimensional path planning method for contour line guide line
CN112781586A (en) * 2020-12-29 2021-05-11 上海商汤临港智能科技有限公司 Pose data determination method and device, electronic equipment and vehicle
WO2022142185A1 (en) * 2020-12-29 2022-07-07 上海商汤临港智能科技有限公司 Pose data determination method and apparatus, and electronic device and vehicle
CN112781586B (en) * 2020-12-29 2022-11-04 上海商汤临港智能科技有限公司 Pose data determination method and device, electronic equipment and vehicle
CN115407355A (en) * 2022-11-01 2022-11-29 小米汽车科技有限公司 Library position map verification method and device and terminal equipment
CN115407355B (en) * 2022-11-01 2023-01-10 小米汽车科技有限公司 Library position map verification method and device and terminal equipment

Also Published As

Publication number Publication date
KR101909953B1 (en) 2018-12-19
KR20180038154A (en) 2018-04-16


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16918375

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16918375

Country of ref document: EP

Kind code of ref document: A1