EP4052222A1 - Method for determining a model of a traffic barrier - Google Patents

Method for determining a model of a traffic barrier

Info

Publication number
EP4052222A1
Authority
EP
European Patent Office
Prior art keywords
vehicles
traffic barrier
model
barrier
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP20768575.1A
Other languages
German (de)
English (en)
Inventor
Christian Thiel
Changhong YANG
Yisen YU
Dongbing QUAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Continental Automotive GmbH
Original Assignee
Continental Automotive GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Automotive GmbH filed Critical Continental Automotive GmbH
Publication of EP4052222A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/55 Depth or shape recovery from multiple images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30236 Traffic on road, railway or crossing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30261 Obstacle

Definitions

  • the disclosure relates to a method for determining a model of a traffic barrier, e.g. a Jersey Wall that may be arranged on a road to separate various roadways of the road from each other.
  • a traffic barrier, e.g. a Jersey Wall, that may be arranged on a road to separate various roadways of the road from each other.
  • Road furniture includes, for example, traffic signs, poles, barriers, and traffic barriers which may include Jersey Walls, etc.
  • a traffic barrier like a Jersey Wall may be used in a construction site of a road for separating roadways of the road.
  • Road furniture may be captured and registered by special purpose vehicles using complex sensor systems such as stereo-cameras that capture images of a road and road furniture while driving.
  • Modeling of road furniture means to recover and represent the 3D spatial information of the road furniture.
  • To detect and model a traffic barrier, most of the known approaches adopt LIDAR or a stereo-camera, which can obtain 3D information directly.
  • US 20200184233 A1 discloses a method of detecting road barriers like jersey walls.
  • vertical functions and horizontal functions are used which, when taken together, map to a plurality of image features. These functions are generated by a complex multi-camera system.
  • a simple optical sensor system, for example an optical system comprising a monocular camera
  • a monocular camera cannot get 3D information of a scene directly.
  • the only known way of using a monocular camera to reconstruct 3D information requires a pixel match relation between frames.
  • most of the traffic barrier is nearly textureless. As a result, it is nearly impossible to establish a pixel match relation between frames on the traffic barrier.
  • the disparity is too small to recover its spatial information.
  • a monocular camera can recover spatial lines by multi-view geometry only when the disparity is sufficient. However, there is almost no disparity on a traffic barrier edge in consecutive frames.
  • the problem to be solved by the invention is to provide a method for determining a model of a traffic barrier by using a cost-effective and simple sensor system, for example a monocular camera. Further, the method should provide improved recognition accuracy.
  • a method for determining a model of a traffic barrier by a plurality of vehicles, each having at least one camera and a processor for computer vision processing, is provided.
  • a respective image of a scene is captured by the respective at least one camera of each of the plurality of vehicles.
  • the respective image is evaluated, and a respective preliminary model of the traffic barrier is generated by the respective processor of each of the vehicles.
  • the respective preliminary model of the traffic barrier is transmitted from each of the vehicles to a server.
  • the respective preliminary model received from each of the vehicles is evaluated, and the model of the traffic barrier is determined by an image processor of the server.
  • a single camera may be sufficient.
  • a plurality of cameras may be provided.
  • a series of successive images is usually taken by means of an optical sensor system, such as a stereo-camera system. Then, pixels belonging to the same position on the object are compared in the successive images to create a model of the object.
  • the stereo-camera system must provide information about the object from different spatial positions.
  • the problem with modeling the surface of a traffic barrier, e.g. a Jersey Wall, however, is that the surface of the traffic barrier usually has very little texture or is nearly textureless. This makes it difficult to identify pixels in successive images that correspond to the same position on the surface of the traffic barrier to be modeled.
  • the method for determining a model of a traffic barrier allows images of a traffic barrier to be captured using simple cameras located at different locations.
  • the individual cameras can, for example, be designed as simple monocular cameras, wherein a respective monocular camera is located in each of a plurality of vehicles.
  • the plurality of vehicles is located at different positions in a scene so that each camera takes a picture of the traffic barrier from an individual position.
  • a respective processor in each of the vehicles may evaluate the captured image information of the traffic barrier so that an individual preliminary/virtual model of the traffic barrier is generated in each vehicle.
  • the individual model information of the traffic barrier, i.e. the individual preliminary/virtual model of the traffic barrier
  • the preliminary/virtual models of the traffic barrier received from the different vehicles are evaluated by the image processor of the server.
  • in the method for determining a model of a traffic barrier, the stereo vision thus takes place on the server, which can generate a precise model of the traffic barrier by evaluating and comparing the individual preliminary/virtual models generated by and received from each of the various vehicles in the scene.
  • a traffic barrier may include at least one of a Jersey wall, a Jersey barrier, a K-rail or a median barrier.
  • Figure 1 illustrates a system to perform a method for determining a model of a traffic barrier
  • Figure 2 shows a flowchart illustrating method steps of a method for determining a model of a traffic barrier
  • Figure 3 illustrates a simplified example of the method for determining a model of a traffic barrier on a server by evaluating individual preliminary/virtual models of the traffic barrier generated by individual vehicles.
  • Figure 1 shows a system comprising a plurality of vehicles 100, 101 and 102, wherein each of the vehicles includes at least one camera 10 for capturing an image/frame of an environmental scene of the respective vehicle, a processor 20 for computer vision processing, and a storage device 30 to store a road database.
  • the various vehicles 100, 101 and 102 may be in communication with a server 200 including an image processor 201. In a very simple embodiment, a single camera may be sufficient.
  • An embodiment to model a traffic barrier in a road database system includes method steps V1, V2 and V3 performed by each of a plurality of vehicles and a method step S performed by the server in a sequence of V1, V2, V3, S.
  • the method steps are illustrated in the flowchart of Figure 2.
  • a plurality of vehicles 100, 101 and 102 is provided.
  • Each of the vehicles includes a respective camera 10 and a respective processor 20 for computer vi sion processing.
  • in step V1, performed by each of the plurality of vehicles, a respective image of the environmental scene of the vehicle is captured by the respective camera 10 of each of the plurality of vehicles 100, 101 and 102.
  • the camera 10 may be embodied as a monocular camera installed in each of the vehicles.
  • the camera may be configured to capture a video of the environmental scene in which the respective vehicle is located in real time.
  • in step V2, the respective image captured in step V1 is evaluated, and a respective preliminary/virtual model of the traffic barrier is generated by the respective processor 20 of each of the vehicles 100, 101 and 102.
  • a road model may be provided in a respective storage device 30 of each of the vehicles 100, 101 and 102.
  • a planar road surface may be assumed for the road model ([0,0,1]). Given the road model, an image pixel of the captured image may be projected to a 3D point on the road model.
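  • purely as an illustrative sketch of this projection step, assuming a calibrated pinhole camera with intrinsics K, camera-to-vehicle rotation R and camera centre t in the vehicle frame (hypothetical placeholders, not prescribed values), the back-projection of a pixel onto a flat road plane with normal [0,0,1] might look like this in Python:

      import numpy as np

      def pixel_to_road_point(u, v, K, R, t,
                              plane_normal=np.array([0.0, 0.0, 1.0]),
                              plane_d=0.0):
          """Back-project pixel (u, v) onto the road plane n.X + d = 0 (vehicle frame)."""
          # Viewing ray in camera coordinates for the homogeneous pixel (u, v, 1)
          ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
          # Rotate the ray into the vehicle frame
          ray_veh = R @ ray_cam
          # Intersect the ray X = t + s * ray_veh with the road plane
          denom = plane_normal @ ray_veh
          if abs(denom) < 1e-9:
              return None  # ray is (almost) parallel to the road plane
          s = -(plane_normal @ t + plane_d) / denom
          if s <= 0:
              return None  # intersection would lie behind the camera
          return t + s * ray_veh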
  • the respective captured image is evaluated by the respective processor 20 of each of the vehicles 100, 101 and 102 by computer vision processing to extract the pixels in the respective captured image representing an edge of the traffic barrier.
  • the extracted pixels in the respective captured image may represent an upper edge of the traffic barrier.
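  • the description does not prescribe a particular edge detector; purely as a hypothetical sketch, a generic Canny-based extraction of candidate edge pixels could look as follows (OpenCV is assumed to be available, and the region-of-interest mask is an illustrative parameter):

      import cv2
      import numpy as np

      def extract_barrier_edge_pixels(image_bgr, roi_mask=None, low=50, high=150):
          """Return (u, v) coordinates of candidate barrier-edge pixels."""
          gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
          edges = cv2.Canny(gray, low, high)
          if roi_mask is not None:
              # Keep only edges inside the region where the barrier is expected
              edges = cv2.bitwise_and(edges, edges, mask=roi_mask)
          v, u = np.nonzero(edges)  # rows are image y (v), columns are image x (u)
          return np.stack([u, v], axis=1)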
  • the computer vision processing may be used to detect the whole area of the traffic barrier.
  • the boundary between wall and ground in general may not be clear.
  • in the approach to model a traffic barrier, only the position of the traffic barrier edge, particularly the upper wall edge, is considered, by extracting the pixels representing the edge of the traffic barrier from each of the captured images.
  • the respective processor 20 of each of the vehicles 100, 101 and 102 generates the respective preliminary/virtual model of the traffic barrier by projecting the respective extracted pixels representing the edge of the traffic barrier onto the road model to generate respective 3D points of the edge of the traffic barrier.
  • from these 3D points, the respective processor 20 of each of the vehicles 100, 101 and 102 generates a respective spline curve of the edge of the traffic barrier.
  • the respective spline curve represents the respective preliminary/virtual model of the traffic barrier generated in each of the vehicles.
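  • as a minimal illustrative sketch of this step (SciPy is assumed to be available; the function name and parameters are hypothetical), the projected 3D edge points could be condensed into such a spline curve roughly as follows:

      import numpy as np
      from scipy.interpolate import splprep, splev

      def fit_edge_spline(points_3d, smoothing=0.5, n_samples=20):
          """Fit a smoothing spline through the 3D edge points and return sampled points."""
          pts = np.asarray(points_3d, dtype=float)
          # Fit a parametric cubic smoothing spline to x(t), y(t), z(t)
          tck, _ = splprep(pts.T, s=smoothing)
          # Sample a compact set of points representing the preliminary/virtual model
          u = np.linspace(0.0, 1.0, n_samples)
          x, y, z = splev(u, tck)
          return np.stack([x, y, z], axis=1)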
  • in a step V3 of the method for determining a model of a traffic barrier, the respective preliminary/virtual model of the traffic barrier is transmitted from each of the vehicles 100, 101 and 102 to the server 200.
  • the respective image is captured in step V1 in an individual pose of the respective camera 10 of each of the vehicles 100, 101 and 102.
  • the different preliminary/virtual models generated by the respective processor 20 of the different vehicles are generated from different viewpoints. Since the traffic barrier preliminary/virtual models are generated from different viewpoints, the camera position/pose is also important for determining the model of the traffic barrier by the server.
  • the pose of the respective camera 10 of each of the vehicles 100, 101 and 102 is transmitted together with the respective preliminary/virtual model of the traffic barrier from each of the vehicles to the server 200.
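  • the description does not prescribe a message format; purely as a hypothetical illustration, a report transmitted from a vehicle to the server might bundle the camera pose and the sampled spline points roughly as follows:

      from dataclasses import dataclass, field
      from typing import List, Tuple

      @dataclass
      class BarrierReport:
          """Hypothetical per-vehicle report; field names are illustrative only."""
          vehicle_id: str
          camera_position: Tuple[float, float, float]             # camera centre in a common frame
          camera_orientation: Tuple[float, float, float, float]   # pose as a quaternion (w, x, y, z)
          spline_points: List[Tuple[float, float, float]] = field(default_factory=list)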
  • in step S, the respective preliminary/virtual model received from each of the vehicles 100, 101 and 102 is evaluated, and the model of the traffic barrier is determined by the image processor 201 of the server.
  • after having collected the information, i.e. the reported individual traffic barrier preliminary/virtual models and the individual camera poses from each of the vehicles, the image processor 201 of the server 200 will recover the real spatial information of the traffic barrier.
  • the model of the traffic barrier, determined by the image processor 201 of the server may include at least information about the position and the height of the traffic barrier.
  • the image processor 201 of the server 200 evaluates at least two of the respective preliminary/virtual models of the traffic barrier received from at least two of the plurality of vehicles 101 and 102 to determine the model of the traffic barrier.
  • Figure 3 illustrates how to recover a model of the traffic barrier, i.e. the real position of the traffic barrier, assuming that two reports of a respective individual preliminary/virtual model and a respective individual camera pose of the vehicles 101 and 102 passing in different lanes have been received by the server 200.
  • Reference sign 104 represents the position of the traffic barrier preliminary/virtual model generated from the camera of the vehicle 101.
  • a line 105 passes through the camera position of the vehicle 101 and the preliminary/virtual position 104 of the traffic barrier.
  • the true position of the edge of the traffic barrier is on line 105.
  • the reference sign 103 corresponds to the preliminary/virtual model generated by the camera of the vehicle 102.
  • the true position of the traffic barrier is located on the line 106 connecting the position of the vehicle 102 and the preliminary/virtual model 103.
  • the image processor 201 of the server 200 determines the intersection 107 as the true position of the traffic barrier. All the intersection points are fitted by a spline curve in order to model the traffic barrier.
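  • to make the geometry of Figure 3 concrete, the following illustrative sketch (a simplification working only in the 2D ground plane, with hypothetical coordinates) intersects the sight line of the vehicle 101 through its preliminary barrier position 104 with the sight line of the vehicle 102 through its preliminary position 103; the result plays the role of the intersection 107:

      import numpy as np

      def intersect_sight_lines(cam_a, wall_a, cam_b, wall_b):
          """Intersect two ground-plane lines, each given by a camera position and a preliminary wall position."""
          cam_a, wall_a = np.asarray(cam_a, float), np.asarray(wall_a, float)
          cam_b, wall_b = np.asarray(cam_b, float), np.asarray(wall_b, float)
          d_a = wall_a - cam_a  # direction of line 105
          d_b = wall_b - cam_b  # direction of line 106
          A = np.column_stack([d_a, -d_b])
          if abs(np.linalg.det(A)) < 1e-9:
              return None  # sight lines are (nearly) parallel, no unique intersection
          # Solve cam_a + s*d_a == cam_b + t*d_b for (s, t)
          s, _ = np.linalg.solve(A, cam_b - cam_a)
          return cam_a + s * d_a

      # Hypothetical example: vehicle 101 at (0, 0), vehicle 102 at (3, -1)
      point_107 = intersect_sight_lines((0.0, 0.0), (2.0, 4.0), (3.0, -1.0), (2.5, 3.0))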
  • the method for determining a model of a traffic barrier has several advantages.
  • the method only needs the generated individual preliminary/virtual wall models and the respective camera pose of each of the vehicles.
  • the method saves communication capacity.
  • the method to model a traffic barrier can be used for generating a road database used for autonomously driving cars, but can also be used in a plurality of other fields of machine vision and machine orientation, for example robot orientation outdoors or under water.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

According to a method for determining a model of a traffic barrier, a plurality of vehicles each having at least one camera and a processor for computer vision processing is provided. A respective image of a scene is captured by the respective at least one camera of each of the plurality of vehicles. The respective image is evaluated, and a respective preliminary model of the traffic barrier is generated by the respective processor of each of the vehicles. The respective preliminary model of the traffic barrier is transmitted from each of the vehicles to a server. The respective preliminary model received from each of the vehicles is evaluated, and the model of the traffic barrier is determined by an image processor of the server.
EP20768575.1A 2019-09-20 2020-09-08 Method for determining a model of a traffic barrier Pending EP4052222A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019214397 2019-09-20
PCT/EP2020/075025 WO2021052810A1 (fr) 2019-09-20 2020-09-08 Procédé de détermination d'un modèle d'une barrière de circulation

Publications (1)

Publication Number Publication Date
EP4052222A1 true EP4052222A1 (fr) 2022-09-07

Family

ID=72432908

Family Applications (1)

Application Number Title Priority Date Filing Date
EP20768575.1A Pending EP4052222A1 (fr) 2019-09-20 2020-09-08 Procédé de détermination d'un modèle d'une barrière de circulation

Country Status (3)

Country Link
EP (1) EP4052222A1 (fr)
CN (1) CN114730468A (fr)
WO (1) WO2021052810A1 (fr)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100972041B1 (ko) * 2009-01-15 2010-07-22 한민홍 카메라를 이용한 장애물 인식방법
US9280711B2 (en) * 2010-09-21 2016-03-08 Mobileye Vision Technologies Ltd. Barrier and guardrail detection using a single camera
CN102508246B (zh) * 2011-10-13 2013-04-17 吉林大学 车辆前方障碍物检测跟踪方法
CN103176185B (zh) * 2011-12-26 2015-01-21 上海汽车集团股份有限公司 用于检测道路障碍物的方法及系统
US9972096B2 (en) * 2016-06-14 2018-05-15 International Business Machines Corporation Detection of obstructions
US10558222B2 (en) * 2016-07-21 2020-02-11 Mobileye Vision Technologies Ltd. Navigating a vehicle using a crowdsourced sparse map
CN109804223A (zh) * 2016-10-11 2019-05-24 御眼视觉技术有限公司 基于检测到的障碍物导航车辆
EP3619643A1 (fr) 2017-05-03 2020-03-11 Mobileye Vision Technologies Ltd. Systèmes et procédés de détection et de classification pour navigation de véhicule autonome
FR3067999B1 (fr) * 2017-06-23 2019-08-02 Renault S.A.S. Procede d'aide a la conduite d'un vehicule automobile

Also Published As

Publication number Publication date
WO2021052810A1 (fr) 2021-03-25
CN114730468A (zh) 2022-07-08

Similar Documents

Publication Publication Date Title
CN110062871B (zh) 用于基于视频的定位及映射的方法及系统
EP3598874B1 (fr) Systèmes et procédés de mise à jour d'une carte haute résolution sur la base d'images binoculaires
AU2017302833B2 (en) Database construction system for machine-learning
EP3007099B1 (fr) Système de reconnaissance d'image pour véhicule et procédé correspondant
US9121717B1 (en) Collision avoidance for vehicle control
CN107273788B (zh) 在车辆中执行车道检测的成像系统与车辆成像系统
JP2019096072A (ja) 物体検出装置、物体検出方法およびプログラム
CN112967283A (zh) 基于双目摄像头的目标识别方法、系统、设备及存储介质
KR101748780B1 (ko) 스테레오 카메라를 이용한 도로객체 인식방법 및 장치
KR102167835B1 (ko) 영상 처리 방법 및 장치
US20230138487A1 (en) An Environment Model Using Cross-Sensor Feature Point Referencing
CN114419098A (zh) 基于视觉变换的运动目标轨迹预测方法及装置
CN102555905A (zh) 产生车辆周围环境中至少一个物体的影像的方法和设备
CN113838060A (zh) 用于自主车辆的感知系统
Yiruo et al. Complex ground plane detection based on v-disparity map in off-road environment
CN113516711A (zh) 相机位姿估计技术
Geiger et al. Object flow: A descriptor for classifying traffic motion
WO2020118619A1 (fr) Procédé de détection et de modélisation d'un objet sur la surface d'une route
CN113569812A (zh) 未知障碍物的识别方法、装置和电子设备
JP4696925B2 (ja) 画像処理装置
EP4052222A1 (fr) Procédé de détermination d'un modèle d'une barrière de circulation
CN110827340B (zh) 地图的更新方法、装置及存储介质
CN115937436A (zh) 道路场景的三维模型重建方法、装置及驾驶员辅助系统
JP2000099896A (ja) 走行路検出装置、車両走行制御装置および記録媒体
Li et al. Automatic Surround Camera Calibration Method in Road Scene for Self-driving Car

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20220712

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
111L Licence recorded

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

Free format text: EXCLUSIVE LICENSE

Name of requester: QUALCOMM TECHNOLOGIES, INC., US

Effective date: 20231103