WO2022114449A1 - Method and apparatus for online calibration between a lidar sensor and a camera - Google Patents

Method and apparatus for online calibration between a lidar sensor and a camera

Info

Publication number
WO2022114449A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
lidar
image
feature
camera image
Prior art date
Application number
PCT/KR2021/010044
Other languages
English (en)
Korean (ko)
Inventor
박민규
김제우
윤주홍
Original Assignee
한국전자기술연구원
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한국전자기술연구원
Publication of WO2022114449A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating

Definitions

  • The present invention relates to sensor calibration technology, and more particularly to a method and apparatus for calibrating devices (e.g., autonomous vehicles) that use one or more cameras and one or more LiDAR sensors simultaneously, in order to correct geometric errors caused by mechanical shaking or distortion.
  • The sensor calibration process estimates position/posture information between cameras, or between a camera and another sensor, and must precede any fusion of sensor information.
  • The position of a sensor may change subtly over time depending on various factors, and such a difference can significantly affect algorithms that run on the fused sensor data.
  • The present invention has been devised to solve the above problems, and its object is to provide a method and apparatus for calculating the geometric error between the lidar and the camera online, using a feature whose shape is predetermined and known, and correcting that error.
  • A calibration method includes: a first acquisition step of acquiring a camera image; a first extraction step of extracting a feature of a predetermined shape from the acquired camera image; a second acquisition step of acquiring a lidar image; a second extraction step of extracting a feature of a predetermined shape from the acquired lidar image; and a correction step of comparing the outlines of the extracted features and correcting the error in position and posture between the camera and the lidar.
  • the feature may be a feature having a fixed shape and size.
  • the feature may be a traffic sign.
  • The camera image may be a 2D camera image or a 3D camera image, and the correction step may compare the appearance of the feature extracted from the 2D or 3D camera image with the appearance of the feature extracted from the lidar image, and correct the error in position and posture between the camera and the lidar.
  • The camera image may include both a 2D camera image and a 3D camera image; in that case, the first extraction step extracts the appearance of the feature from the 2D camera image and the 3D camera image, respectively, and the correction step corrects the error in position and posture between the camera and the lidar using a first comparison result, obtained by comparing the appearance of the feature extracted from the 2D camera image with the appearance of the feature extracted from the lidar image, and a second comparison result, obtained by comparing the appearance of the feature extracted from the 3D camera image with the appearance of the feature extracted from the lidar image.
  • a weight may be applied to one of the first comparison result and the second comparison result.
  • The correction step may include matching the contours of the extracted features, and correcting the error in position and posture between the camera and the lidar so that the matching error is minimized.
  • A calibration apparatus includes: a first input unit for acquiring a camera image; a first processing unit for extracting a feature of a predetermined shape from the acquired camera image; a second input unit for acquiring a lidar image; a second processing unit for extracting a feature of a predetermined shape from the acquired lidar image; and a calibration unit that compares the outlines of the extracted features and corrects the error in position and posture between the camera and the lidar.
  • In this way, the geometric error between the lidar and the camera is calculated and corrected online, so that calibration values that need correction can be corrected easily and automatically.
  • FIG. 1 is a diagram provided for a conceptual explanation of the process of correcting a position/posture error between a camera and a lidar sensor;
  • FIG. 2 is a flowchart provided to explain a calibration method using a 2D camera image and a lidar image;
  • FIG. 3 is a flowchart provided to explain a calibration method using a 3D camera image and a lidar image;
  • FIG. 4 is a flowchart provided to explain a calibration method using a 2D camera image, a 3D camera image, and a lidar image;
  • FIG. 5 is a block diagram of a calibration apparatus according to another embodiment of the present invention.
  • An embodiment of the present invention provides an online calibration method between a LiDAR sensor and a camera.
  • To this end, a terrain or a feature whose shape is predetermined and known (e.g., a traffic sign or a vehicle) is used.
  • FIG. 1 is a diagram provided for conceptual explanation of a process of correcting a position/posture error between a camera and a lidar sensor.
  • First, a camera image is acquired, and a terrain or feature having a fixed shape and size (in FIG. 1, a traffic sign is exemplified) is extracted from the acquired camera image.
  • the lidar sensor can separate terrain or features through a clustering process.
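The patent does not specify which clustering algorithm the lidar processing uses. As an illustrative sketch only, greedy Euclidean region growing is one standard way to separate such point groups; `radius` and `min_size` below are hypothetical parameters:

```python
import numpy as np

def euclidean_cluster(points, radius=0.5, min_size=3):
    """Greedy region growing: a point joins a cluster if it lies within
    `radius` of any point already in it; clusters smaller than `min_size`
    are marked as noise (label -1)."""
    labels = -np.ones(len(points), dtype=int)
    cluster_id = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        labels[seed] = cluster_id
        frontier, members = [seed], [seed]
        while frontier:
            i = frontier.pop()
            dist = np.linalg.norm(points - points[i], axis=1)
            for j in np.flatnonzero((dist < radius) & (labels == -1)):
                labels[j] = cluster_id
                frontier.append(j)
                members.append(j)
        if len(members) < min_size:
            labels[members] = -1   # too small: treat as noise
        else:
            cluster_id += 1
    return labels

# two well-separated point groups (e.g. two roadside signs) plus one stray point
a = np.array([[0.0, 0, 0], [0.1, 0, 0], [0.2, 0, 0], [0.3, 0, 0], [0.4, 0, 0]])
pts = np.vstack([a, a + [10, 0, 0], [[5.0, 5, 5]]])
labels = euclidean_cluster(pts)
```

Each separated cluster can then be checked against the known shape and size of the expected feature.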
  • The appearance of the features extracted from the camera image and from the lidar image constitutes correspondence information between the camera and the lidar sensor, and it is used for calibration.
  • Specifically, the contour of the feature extracted from the camera image and the contour of the feature extracted from the lidar image are matched, and the error in position and posture between the camera and the lidar is corrected so that the matching error is minimized; calibration is performed in this way.
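When the matched contour points are available in 3D on both sides, the position and posture minimizing the matching error have a closed-form solution. The sketch below uses the Kabsch (SVD) algorithm, a standard technique assumed here for illustration rather than the patent's specific procedure:

```python
import numpy as np

def align_rigid(p_lidar, p_cam):
    """Closed-form (Kabsch) estimate of rotation R and translation T that
    minimize sum ||R @ p_lidar[i] + T - p_cam[i]||^2 over matched points."""
    c_l = p_lidar.mean(axis=0)
    c_c = p_cam.mean(axis=0)
    H = (p_lidar - c_l).T @ (p_cam - c_c)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = c_c - R @ c_l
    return R, T

# synthetic check: recover a known pose from matched points
rng = np.random.default_rng(0)
pts = rng.normal(size=(30, 3))
angle = 0.1
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
T_true = np.array([0.05, -0.02, 0.1])
R_est, T_est = align_rigid(pts, pts @ R_true.T + T_true)
```

In practice the contour correspondences would come from the extracted features, and the estimated R, T update the camera-lidar extrinsics.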
  • While the lidar image is a 3D image, the camera image may be either a 3D image or a 2D image. Below, the calibration method is described in detail for each case.
  • FIG. 2 is a flowchart provided to explain a calibration method using a 2D camera image and a lidar image. This is the case where the camera is a 2D camera (one RGB camera).
  • First, a 2D camera image is acquired (S110), and a terrain or feature whose shape and size are predetermined is extracted from the acquired 2D camera image (S120).
  • Since the shape and size of the terrain or feature to be extracted in step S120 are predetermined and known, 3D information can be inversely calculated from the 2D image based on this prior knowledge.
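This inverse calculation can be sketched with the pinhole camera model. Assuming a known focal length in pixels and a feature of known physical height (all names and numbers below are illustrative assumptions, not values from the patent), depth follows from the apparent pixel height, and the 3D position from back-projection:

```python
def depth_from_known_size(f_px, real_height_m, pixel_height):
    """Pinhole model: an object of physical height `real_height_m` that
    appears `pixel_height` pixels tall at focal length `f_px` (pixels)
    lies at depth Z = f_px * real_height_m / pixel_height."""
    return f_px * real_height_m / pixel_height

def backproject(u, v, z, f_px, cx, cy):
    """Recover camera-frame 3D coordinates of pixel (u, v) at depth z,
    with principal point (cx, cy)."""
    return ((u - cx) * z / f_px, (v - cy) * z / f_px, z)

# e.g. a 0.9 m tall traffic sign imaged 90 px tall with f = 1000 px
z = depth_from_known_size(1000.0, 0.9, 90.0)
x, y, _ = backproject(1100.0, 500.0, z, 1000.0, 960.0, 540.0)
```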
  • a lidar image is acquired (S130), and the same terrain or features are extracted from the acquired lidar image (S140).
  • the error correction in step S170 is a process of obtaining R (posture) and T (position) that minimize the objective function presented in Equation 1 below.
  • Here, l(R,T) is the objective function, π_RT is a function that warps the lidar image onto the 2D camera image plane, X_LiDAR is the lidar image, and u_img is the 2D camera image.
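Equation 1 itself does not survive in this text. Given the definitions above, a plausible reconstruction (an assumption, not the patent's verbatim formula) is the 2D reprojection error over the matched contour points:

```latex
l(R, T) = \sum_{i} \left\| \pi_{RT}\!\left( X_{\mathrm{LiDAR}}^{(i)} \right) - u_{\mathrm{img}}^{(i)} \right\|^{2}
```

where the warping function $\pi_{RT}$ depends on the posture $R$ and position $T$ being estimated.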
  • FIG. 3 is a flowchart provided to explain a calibration method using a 3D camera image and a lidar image. This is the case where the camera is a 3D camera (a stereo RGB camera, or an RGB camera plus a depth camera).
  • First, a 3D camera image is acquired (S210), and a terrain or feature whose shape and size are predetermined is extracted from the acquired 3D camera image (S220).
  • a lidar image is acquired (S230), and the same terrain or features are extracted from the acquired lidar image (S240).
  • the error correction in step S270 is a process of obtaining R (posture) and T (position) that minimize the objective function presented in Equation 2 below.
  • Here, l(R,T) is the objective function, X_LiDAR is the lidar image, and X_img is the 3D camera image.
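Equation 2 itself does not survive in this text. Given the definitions above, a plausible reconstruction (an assumption, not the patent's verbatim formula) is the 3D point-to-point error:

```latex
l(R, T) = \sum_{i} \left\| \left( R\, X_{\mathrm{LiDAR}}^{(i)} + T \right) - X_{\mathrm{img}}^{(i)} \right\|^{2}
```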
  • FIG. 4 is a flowchart provided to explain a calibration method using a 2D camera image, a 3D camera image, and a lidar image. It is different from the above-described embodiments in that both 2D and 3D camera images are used.
  • steps S311 to S315 are performed, which are the same as steps S110 to S150 shown in FIG. 2 .
  • In parallel, steps S321 to S325 are performed, which are the same as steps S210 to S250 shown in FIG. 3.
  • The error correction in step S330 is a process of obtaining R (posture) and T (position) that minimize the objective function presented in Equation 3 below.
  • Here, π_RT is a function that warps the lidar image onto the 2D camera image plane, X_LiDAR is the lidar image, u_img is the 2D camera image, and X_img is the 3D camera image; a weight is set to balance the two terms.
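Equation 3 itself does not survive in this text. Given the definitions above, a plausible reconstruction (an assumption, not the patent's verbatim formula) combines a 2D reprojection term and a 3D point-to-point term, with the balancing weight written here as $\lambda$ (the original symbol was lost in extraction):

```latex
l(R, T) = \sum_{i} \left\| \pi_{RT}\!\left( X_{\mathrm{LiDAR}}^{(i)} \right) - u_{\mathrm{img}}^{(i)} \right\|^{2}
        + \lambda \sum_{i} \left\| \left( R\, X_{\mathrm{LiDAR}}^{(i)} + T \right) - X_{\mathrm{img}}^{(i)} \right\|^{2}
```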
  • As described above, the position of a sensor may change finely over time due to external factors such as unevenness of the ground, mechanical shaking, and external impact; the proposed method calibrates by estimating this changed information and correcting the position information.
  • The camera and lidar sensor subjected to online calibration in the embodiments of the present invention are merely examples mentioned for convenience of description; the technical idea of the present invention can also be applied when they are replaced with other sensors.
  • The calibration apparatus includes a camera image input unit 410, a lidar image input unit 420, a camera image processing unit 430, a lidar image processing unit 440, and a calibration unit 450.
  • The camera image input unit 410 receives an image generated by a camera (not shown) and transmits it to the camera image processing unit 430.
  • The camera image received and transmitted by the camera image input unit 410 may be a 2D camera image, a 3D camera image, or both.
  • The camera image processing unit 430 preprocesses the camera image transmitted from the camera image input unit 410 and extracts a predetermined terrain or feature from the preprocessed camera image.
  • The lidar image input unit 420 receives the image generated by the lidar sensor (not shown) and transmits it to the lidar image processing unit 440.
  • The lidar image processing unit 440 preprocesses the lidar image transmitted from the lidar image input unit 420 and extracts a predetermined terrain or feature from the preprocessed lidar image.
  • The calibration unit 450 matches the contours of the terrain or features extracted by the image processing units 430 and 440, and corrects the position and posture between the camera and the lidar so that the matching error is minimized.
  • The calibration unit 450 performs this position and posture correction between the camera and the lidar periodically.
  • the technical idea of the present invention can be applied to a computer-readable recording medium containing a computer program for performing the functions of the apparatus and method according to the present embodiment.
  • the technical ideas according to various embodiments of the present invention may be implemented in the form of computer-readable codes recorded on a computer-readable recording medium.
  • the computer-readable recording medium may be any data storage device readable by the computer and capable of storing data.
  • the computer-readable recording medium may be a ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical disk, hard disk drive, or the like.
  • the computer-readable code or program stored in the computer-readable recording medium may be transmitted through a network connected between computers.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Manufacturing & Machinery (AREA)
  • Electromagnetism (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method and apparatus for online calibration between a lidar sensor and a camera. In a calibration method according to an embodiment of the present invention, a camera image is acquired in order to extract a feature of a predetermined shape, a lidar image is acquired in order to extract a feature of a predetermined shape, and the outlines of the extracted features are compared in order to correct position and posture errors between a camera and a lidar. Consequently, by calculating geometric errors between the lidar and the camera and correcting them online using features whose shapes are predetermined and known, calibration values requiring correction can be corrected easily and automatically.
PCT/KR2021/010044 2020-11-30 2021-08-02 Method and apparatus for online calibration between a lidar sensor and a camera WO2022114449A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200163582A KR20220075481A (ko) 2020-11-30 2020-11-30 Method and apparatus for online calibration between a lidar sensor and a camera
KR10-2020-0163582 2020-11-30

Publications (1)

Publication Number Publication Date
WO2022114449A1 true WO2022114449A1 (fr) 2022-06-02

Family

ID=81755154

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/010044 WO2022114449A1 (fr) 2020-11-30 2021-08-02 Method and apparatus for online calibration between a lidar sensor and a camera

Country Status (2)

Country Link
KR (1) KR20220075481A (fr)
WO (1) WO2022114449A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20230174492A (ko) 2022-06-21 2023-12-28 주식회사 엘지에너지솔루션 Lithium secondary battery and manufacturing method thereof
KR102525563B1 (ko) * 2022-11-30 2023-04-25 주식회사 테스트웍스 Image acquisition method using multiple lidar and camera sensors, and computing device performing the same

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102012179B1 (ko) * 2018-04-30 2019-08-20 충북대학교 산학협력단 Apparatus and method for lidar-camera calibration based on three-dimensional plane detection
KR20200054370A (ko) * 2018-11-06 2020-05-20 주식회사 스프링클라우드 Apparatus and method for automatic calibration of integrated sensors of an autonomous vehicle
JP2020098151A (ja) * 2018-12-18 2020-06-25 株式会社デンソー Sensor calibration method and sensor calibration apparatus
US10719957B2 (en) * 2018-07-30 2020-07-21 Pony Ai Inc. System and method for calibrating on-board vehicle cameras
US10726579B1 (en) * 2019-11-13 2020-07-28 Honda Motor Co., Ltd. LiDAR-camera calibration


Also Published As

Publication number Publication date
KR20220075481A (ko) 2022-06-08

Similar Documents

Publication Publication Date Title
WO2022114449A1 (fr) Method and apparatus for online calibration between a lidar sensor and a camera
EP2426642B1 (fr) Method, device and system for motion detection
US4825393A Position measuring method
WO2022260386A1 (fr) Method and apparatus for composing a background and a face by using a deep learning network
WO2012023639A1 (fr) Method for counting objects and apparatus using a plurality of sensors
WO2017099510A1 (fr) Method for segmenting a static scene on the basis of image statistical information, and method therefor
WO2014065607A1 (fr) Image correction device for accelerating image correction, and method therefor
CN111639629B (zh) Pig body weight measurement method and apparatus based on image processing, and storage medium
WO2020235734A1 (fr) Method for estimating the distance to and position of an autonomous vehicle by using a monoscopic camera
CN113012224B (zh) Positioning initialization method and related apparatus, device, and storage medium
CN114639078A (zh) Vehicle model recognition method, apparatus, and system
CN108280807A (zh) Monocular depth image acquisition apparatus and system, and image processing method therefor
CN115379123A (zh) Transformer fault detection method using unmanned aerial vehicle inspection
JP3008875B2 (ja) Subject extraction device
CN113344796A (zh) Image processing method, apparatus, device, and storage medium
CN114998147A (zh) Traffic image fogging method under a digital twin
CN109754415A (zh) Vehicle-mounted panoramic stereo perception system based on multiple sets of binocular vision
WO2019098421A1 (fr) Object reconstruction device using motion information, and object reconstruction method using same
CN112396016A (zh) Face recognition system based on big data technology
WO2011136405A1 (fr) Device and method for image recognition using a three-dimensional (3D) camera
WO2017086522A1 (fr) Method for synthesizing a chroma-key image without a background screen
CN114167979B (зh) Handle tracking algorithm for an augmented-reality all-in-one device
WO2018230971A1 (fr) Method and apparatus for processing an omnidirectional image
WO2019178717A1 (fr) Binocular matching method, visual imaging device, and device with storage function
CN113408396B (зh) Bridge intelligent sensing system based on cloud computing

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21898278

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21898278

Country of ref document: EP

Kind code of ref document: A1