WO2016177371A1 - Method and device for detecting and evaluating road reflections - Google Patents

Method and device for detecting and evaluating road reflections

Info

Publication number
WO2016177371A1
WO2016177371A1 (PCT/DE2016/200207)
Authority
WO
WIPO (PCT)
Prior art keywords
roadway
vehicle
road
camera
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/DE2016/200207
Other languages
German (de)
English (en)
French (fr)
Inventor
Stefan Fritz
Bernd Hartmann
Manuel AMTHOR
Joachim Denzler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Friedrich Schiller Universitaet Jena FSU
Continental Teves AG and Co OHG
Original Assignee
Friedrich Schiller Universitaet Jena FSU
Continental Teves AG and Co OHG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Friedrich Schiller Universitaet Jena FSU and Continental Teves AG and Co OHG
Priority to JP2017557347A priority Critical patent/JP6453490B2/ja
Priority to CN201680026128.2A priority patent/CN107667378B/zh
Priority to EP16726768.1A priority patent/EP3292510B1/de
Priority to US15/572,010 priority patent/US10442438B2/en
Priority to DE112016002050.3T priority patent/DE112016002050A5/de
Priority to KR1020177031347A priority patent/KR101891460B1/ko
Publication of WO2016177371A1 publication Critical patent/WO2016177371A1/de
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B60W40/06Road conditions
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462Salient features, e.g. scale invariant feature transforms [SIFT]
    • G06V10/464Salient features, e.g. scale invariant feature transforms [SIFT] using a plurality of salient features, e.g. bag-of-words [BoW] representations
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/60Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2552/00Input parameters relating to infrastructure
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/24Aligning, centring, orientation detection or correction of the image
    • G06V10/247Aligning, centring, orientation detection or correction of the image by affine transforms, e.g. correction due to perspective effects; Quadrilaterals, e.g. trapezoids

Definitions

  • The invention relates to a method for detecting and evaluating reflections on a roadway. The invention further relates to a device for carrying out the aforementioned method and to a vehicle with such a device.
  • Camera-based driver assistance systems, placed behind the windshield, capture the area in front of the vehicle in a way that corresponds to the visual perception of the driver.
  • The functional scope of these systems ranges from automatic high-beam control through the detection and display of speed limits to warnings of lane departure or imminent collision.
  • The task of digital image processing, as a stand-alone function or in fusion with radar or lidar sensors, is primarily to recognize objects, to classify them and to track them within the image region.
  • Classic objects typically include a variety of vehicles such as cars, trucks and two-wheelers, as well as pedestrians.
  • Cameras additionally take over the detection of traffic signs.
  • Inertial sensors provide a good picture of the current driving state of the vehicle and of the overall driving situation, from which the criticality of driving situations can be derived.
  • The design of the warning and intervention times, however, is fundamentally based on a dry roadway with high traction potential between tire and road.
  • The object of the present invention is therefore to provide a method and a device of the type mentioned at the outset with which the road condition, or even the available friction coefficient of the roadway, can be determined, so that driver warnings and system interventions can be made in a more targeted manner and the effectiveness of accident-avoiding driver assistance systems can be increased.
  • This object is achieved by the subject matter of the independent claims. Preferred embodiments are the subject of the dependent claims.
  • The method according to the invention as claimed in claim 1 serves to detect and evaluate reflections of at least one point on a roadway.
  • A camera is provided, by means of which at least two digital images of the at least one roadway point are generated, the images being recorded from different perspectives.
  • From the images, road condition information is determined, in particular road condition information which describes the friction coefficient of the roadway or states whether the roadway is dry, wet or icy.
  • The invention makes use of the fact that reflections can generally be divided into three categories, each of which behaves differently under a change of perspective. Digital image processing is applied with the aim of robustly detecting road reflections, in particular for the detection of moisture and ice.
  • From a roadway point which represents the roadway, a conclusion about the current road condition is drawn: the images of the roadway point generated by the camera from two different perspectives are searched, using digital image processing algorithms, for specific features that allow an inference about the current road condition.
  • The method is preferably performed on a sufficiently lit scene that enables the recording of usable images.
  • A prerequisite for the method is a change of perspective over a sequence of at least two images. In the case of diffuse reflections (an indicator of a dry roadway), changing the viewing angle to a fixed point on the roadway has no visual effect, since the light is reflected equally in all directions. In the case of specular reflections, by contrast, changing the perspective changes the appearance of the roadway point.
  • The method according to the invention is preferably used in a vehicle. The camera can be deployed in particular inside the vehicle, preferably behind the windshield, so that the area in front of the vehicle is captured in a way that corresponds to the visual perception of a driver of the vehicle.
  • The generation of the images from two different perspectives can be effected in particular by the driving movement of the vehicle.
  • A digital camera is provided, with which the at least two images can be generated directly in digital form.
  • A mono camera or a stereo camera can be used to generate the images; with a stereo camera, depth information from the image can additionally be used by the algorithm, depending on the variant.
  • In one embodiment, the at least two digital images of the at least one roadway point are generated by means of the camera, the images being recorded from different recording perspectives with a stereo camera.
  • A particular advantage of the method according to the invention is that specular reflections can be reliably distinguished from shadows (diffuse reflections).
  • In a preferred embodiment, the method includes additional method steps in which the road condition information serves as an input for an accident-avoiding driver assistance system (ADAS, Advanced Driver Assistance System) of a vehicle, in order to be able to adapt the warning and intervention times of the driver assistance system particularly effectively.
  • The road condition information is used as important information about the driving environment in automated driving and is preferably supplied to a corresponding system controller for autonomous driving. In this sense, a further advantageous embodiment provides that the road condition information is included in the driving strategy of an automated vehicle, as well as in the determination of handover times between the automated system and the driver.
  • In the images of the camera, a region comprising a plurality of roadway points and representing the roadway is used.
  • The region can also be a segmented section.
  • A region in the form of a trapezoid is particularly preferred, the trapezoidal region being transformed into a rectangular bird's-eye view by means of an estimated homography.
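The patent does not spell out how the homography is estimated. As an illustrative sketch, the mapping from a trapezoidal road region to a rectangular bird's-eye view can be estimated from the four corner correspondences with the direct linear transform (DLT); all corner coordinates below are hypothetical examples, not values from the patent.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography mapping src points to dst points (DLT).

    src, dst: arrays of shape (4, 2) holding corresponding corner coordinates.
    """
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp_point(H, pt):
    """Apply homography H to a single 2-D point (homogeneous coordinates)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# Hypothetical trapezoidal road region in the camera image (perspective foreshortening)
trapezoid = np.array([[200.0, 300.0], [440.0, 300.0], [620.0, 470.0], [20.0, 470.0]])
# Target rectangle in the bird's-eye view
rectangle = np.array([[0.0, 0.0], [200.0, 0.0], [200.0, 400.0], [0.0, 400.0]])

H = estimate_homography(trapezoid, rectangle)
```

In a real system the warp would then be applied to every pixel of the region (e.g. with an image-warping routine) to obtain the rectangular top view.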
  • The camera is provided in a vehicle. The first image is generated in a first position of the vehicle from a first recording perspective. The vehicle is then moved to a second position, the second position being different from the first position, i.e. the first and second positions do not coincide.
  • This is followed by generating the at least one second image in the at least one second position of the vehicle from an at least one second recording perspective.
  • The at least two images from the at least two different recording perspectives are then each transformed into a top view.
  • This is followed by registering the at least two generated top views by means of digital image processing, including vehicle dynamics parameters of the vehicle, and by comparing the appearances of the at least one roadway point in the at least two registered top views.
  • In this embodiment, the registration is realized by a simple translation and rotation, since the scene has been transformed into a top view.
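A minimal sketch of such a registration step, assuming a flat ground plane and using vehicle speed and yaw rate as the vehicle dynamics parameters; the coordinate conventions, parameter names and the simple 2-D motion model are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def register_top_view(pt_prev, speed_mps, yaw_rate_radps, dt, px_per_meter):
    """Predict where a ground point from the previous top view appears in the
    current top view, given vehicle speed and yaw rate (illustrative 2-D model).

    pt_prev: (x, y) in the previous top-view frame, x lateral, y ahead, in pixels.
    """
    # Distance driven and heading change between the two frames
    ds = speed_mps * dt * px_per_meter
    dpsi = yaw_rate_radps * dt
    # The vehicle moved forward, so the point slides toward the camera ...
    x, y = pt_prev[0], pt_prev[1] - ds
    # ... and the scene rotates opposite to the vehicle's yaw change.
    c, s = np.cos(-dpsi), np.sin(-dpsi)
    return np.array([c * x - s * y, s * x + c * y])
```

With the top views registered this way, the appearance of the same roadway point in consecutive frames can be compared pixel by pixel.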
  • The compensation may preferably be carried out or supported by incorporating individual vehicle dynamics parameters, e.g. vehicle speed and steering angle, or entire models, e.g. a ground-plane model and vehicle dynamics models.
  • The individual features form a feature vector, which is then assigned to at least one class by a classification system (classifier).
  • A classifier is a mapping from a feature descriptor to a discrete number representing the classes to be recognized.
  • The classifier is preferably a random decision forest.
  • Decision trees are hierarchically arranged classifiers that split the classification problem iteratively. Starting at the root, a path to a leaf node, in which the final classification decision takes place, is followed on the basis of the decisions made. Owing to the learning complexity, very simple classifiers, the so-called decision stumps, which separate the input space orthogonally to a coordinate axis, are preferably used for the inner nodes.
  • Decision forests are collections of decision trees that contain randomized elements in the training of the trees, preferably at two points: first, each tree is trained with a random selection of the training data and, second, only a random selection of the permitted dimensions is used for each binary decision. Class histograms are stored in the leaf nodes; they record the frequency with which a feature vector of a particular road condition reached the leaf node during training.
  • Each decision of the decision trees can preferably be assigned a probability that is calculated from the class histograms.
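As a rough illustration of this classifier, and not the patent's actual implementation, a tiny random decision forest of decision stumps with class histograms in the leaves might look as follows; the 2-D feature vectors and the class labels (0 = dry, 1 = wet) are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(42)

def train_stump(X, y, n_classes):
    """Train one randomized decision stump: bootstrap sample of the training
    data, a randomly chosen feature dimension, and a split orthogonal to it."""
    boot = rng.integers(0, len(X), len(X))       # random selection of training data
    Xb, yb = X[boot], y[boot]
    dim = int(rng.integers(X.shape[1]))          # random selection of dimensions
    thr = float(np.median(Xb[:, dim]))           # axis-orthogonal split
    def class_hist(labels):
        # Leaf nodes store class histograms (frequency of each class).
        h = np.bincount(labels, minlength=n_classes).astype(float) + 1e-9
        return h / h.sum()
    return dim, thr, class_hist(yb[Xb[:, dim] <= thr]), class_hist(yb[Xb[:, dim] > thr])

def predict_proba(forest, x):
    """Average the class histograms of the reached leaves -> class probabilities."""
    leaves = [h_lo if x[dim] <= thr else h_hi for dim, thr, h_lo, h_hi in forest]
    return np.mean(leaves, axis=0)

# Hypothetical feature vectors for two road conditions; real features would come
# from the reflection descriptors described above.
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(4.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
forest = [train_stump(X, y, n_classes=2) for _ in range(25)]

p_dry = predict_proba(forest, np.array([0.0, 0.0]))
p_wet = predict_proba(forest, np.array([4.0, 4.0]))
```

Averaging leaf histograms over all trees yields the per-class probability mentioned in the text.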
  • This decision per input image is preferably followed by an optimization.
  • This optimization may take into account temporal context or other information provided by the vehicle.
  • Temporal context is preferably taken into account by using the most common class from a previous period or by using a so-called hysteresis threshold method.
  • In the hysteresis threshold method, the change from one road condition to another is regulated by means of threshold values: a change takes place only when the decision for the new road condition is sufficiently certain, i.e. when the corresponding threshold value is exceeded.
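The patent leaves the exact threshold scheme open. One plausible reading of such a hysteresis threshold method is sketched below, where the reported class changes only after the new class has exceeded a probability threshold for several consecutive frames; the threshold value and frame count are assumed values.

```python
class HysteresisRoadState:
    """Keep the reported road condition stable: only switch to a new class when
    its probability has exceeded a threshold for several consecutive frames.
    (Illustrative sketch; thresholds and frame counts are assumptions.)"""

    def __init__(self, initial="dry", enter_thr=0.8, hold_frames=3):
        self.state = initial
        self.enter_thr = enter_thr
        self.hold_frames = hold_frames
        self._candidate, self._count = None, 0

    def update(self, class_probs):
        """class_probs: dict mapping condition name -> probability for one frame."""
        best = max(class_probs, key=class_probs.get)
        if best == self.state or class_probs[best] < self.enter_thr:
            self._candidate, self._count = None, 0   # no convincing change
        elif best == self._candidate:
            self._count += 1
            if self._count >= self.hold_frames:       # change confirmed
                self.state, self._candidate, self._count = best, None, 0
        else:
            self._candidate, self._count = best, 1    # new candidate condition
        return self.state
```

A single noisy "wet" frame therefore cannot flip the output; the condition must persist.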
  • In a further embodiment, the at least two generated images, particularly preferably the generated top views, are averaged to form an average image, and the associated column means are computed.
  • A basic assumption of this embodiment is that a region moves through the entire image area; it is not a particular region in itself that is considered, but its path. Thus, more preferably, more than two images are generated. It is further assumed that the change of perspective is rectilinear and continuous, preferably a uniform rectilinear movement of the vehicle. This assumption can preferably be confirmed by vehicle motion parameters as contextual knowledge. Under this condition, the frames of the sequence, preferably the transformed frames, are averaged to obtain an average image. In order to minimize storage space or to weight more recent events more highly, the calculation of a moving average is also possible.
  • The absolute difference or the squared difference is then formed between each pixel in the average image and the associated column mean.
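These two steps, averaging the frames and comparing each pixel of the average image with its column mean, can be sketched in a few lines; this is an illustration of the idea, not the patented implementation.

```python
import numpy as np

def reflection_response(frames):
    """Average a sequence of (transformed) frames, then measure how strongly
    each pixel of the average image deviates from its column mean.

    frames: iterable of equally sized 2-D grayscale arrays (the top views).
    Returns the per-pixel absolute difference from the column means.
    """
    avg = np.mean(np.stack(list(frames)), axis=0)   # average image
    col_mean = avg.mean(axis=0, keepdims=True)      # one mean per column
    return np.abs(avg - col_mean)                   # large values -> specular cue
```

On a diffusely reflecting (dry) roadway a traversed region leaves nearly constant columns, so the response stays near zero; a specular highlight visible only from some perspectives produces large deviations.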
  • The extraction of features from the average image can be done considering the column means, preferably using a "bag-of-visual-words" approach, in which the occurrence of certain prototypical values or value tuples is recorded in a histogram.
  • The resulting difference image can be used to evaluate the presence of specular reflections, for example by means of statistical moments or, in a particularly advantageous form, by means of local features (preferably "local binary patterns") in a "bag-of-visual-words" approach.
  • When the vehicle moves in a straight line over a diffusely reflecting roadway, the values in a column are very similar to the column mean, whereas with specular reflections the appearance of the traversed regions deviates strongly from the column mean. As mentioned above, this method is based on the assumption of a rectilinear, continuous change of perspective.
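As an illustration of the kind of descriptor meant here, a minimal local-binary-pattern variant combined with a histogram of pattern occurrences (a simple stand-in for a learned bag-of-visual-words codebook) could look like this; the simplifications relative to the patent are assumptions.

```python
import numpy as np

def lbp_codes(img):
    """Compute 8-neighbour local binary pattern codes for the inner pixels of a
    2-D grayscale array (minimal LBP variant, no rotation invariance)."""
    c = img[1:-1, 1:-1]
    neighbours = [img[0:-2, 0:-2], img[0:-2, 1:-1], img[0:-2, 2:],
                  img[1:-1, 2:],   img[2:, 2:],     img[2:, 1:-1],
                  img[2:, 0:-2],   img[1:-1, 0:-2]]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, n in enumerate(neighbours):
        # Each neighbour >= centre contributes one bit of the 8-bit pattern.
        code |= (n >= c).astype(np.uint8) << bit
    return code

def bovw_histogram(img):
    """Bag-of-visual-words style descriptor: a normalized histogram counting
    how often each prototypical local pattern (here: each LBP code) occurs."""
    codes = lbp_codes(img.astype(float))
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()
```

The resulting 256-bin histogram would then serve as (part of) the feature vector handed to the classifier.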
  • A further advantage is the required computing time, which, in contrast to the former method, is greatly reduced: the calculations are limited to averaging and a few subtractions.
  • Preferably, the continuously changing exposure time of the camera is taken into account, since it causes changes in the appearance of the regions in the sequence (for example brightness changes) and can negatively affect the detection of reflections.
  • The device according to the invention for detecting and evaluating reflections of at least one point on a roadway according to claim 13 comprises a camera which is adapted to generate at least two digital images of the at least one roadway point from different recording perspectives.
  • The device is set up to determine road condition information using algorithms of digital image processing.
  • FIGS. 1a and 1b show a schematic representation of the device according to the invention.
  • The device 1 according to the invention shown in FIGS. 1a and 1b comprises a digital camera 2, which is set up to record at least two digital images of a roadway point 3 from different recording perspectives, the different recording perspectives being represented in each case by the two different positions A and B of the camera 2.
  • The camera 2 is arranged in a vehicle (not shown) behind its windshield, so that the area in front of the vehicle is captured in a way that corresponds to the visual perception of a driver of the vehicle.
  • By a driving movement of the vehicle, the camera is moved from a first position to a second position. In the first position, in which the camera 2 covers the recording perspective A shown on the right in FIGS. 1a and 1b, a first image of the roadway point 3 is recorded in each case.
  • The vehicle is then moved to the second position, in which the recording perspective B is covered, and from this position a second image of the roadway point 3 is recorded in each case.
  • In FIG. 1a, the image of the roadway point 3 does not change when the recording perspective changes from A to B, because an incoming light beam 4 is reflected equally in all directions by the dry road surface 5. This corresponds to a diffuse reflection, which is an indicator of a dry road surface.
  • The device 1 compares the first and second images with each other. Using digital image processing algorithms, the device 1 recognizes that the first and second images do not differ from each other, or differ only to such a small extent, that a diffuse reflection must be present.
  • On the basis of the detected diffuse reflection, the device 1 determines road condition information which includes that the road surface 5 is dry. This value is transmitted to a driver assistance system (not shown).
  • In FIG. 1b, the image of the roadway point 3 changes when the recording perspective changes from A to B, because an incident light beam 6 is reflected in only one direction by the iced or wet road surface 7. This corresponds to a specular reflection, which is an indicator of a wet or icy road surface.
  • The device 1 compares the first and second images with each other. Using digital image processing algorithms, the device recognizes that the first and second images differ from each other so strongly that a specular reflection must be present. On the basis of the detected specular reflection, the device determines road condition information which includes that the road surface is wet or iced. This value is transmitted to a driver assistance system (not shown).
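The comparison performed by device 1 can be caricatured as a threshold on the appearance change of the registered roadway patch between perspectives A and B; the threshold value and the mean-absolute-difference measure are assumptions for illustration only.

```python
import numpy as np

def classify_reflection(patch_a, patch_b, threshold=0.1):
    """Compare the registered appearances of the same roadway point from two
    recording perspectives A and B (illustrative; threshold is an assumption).

    Returns "diffuse (dry)" when the appearances barely differ and
    "specular (wet/icy)" when they differ strongly.
    """
    a = patch_a.astype(float) / 255.0
    b = patch_b.astype(float) / 255.0
    diff = np.mean(np.abs(a - b))    # mean absolute appearance change
    return "specular (wet/icy)" if diff > threshold else "diffuse (dry)"
```

A near-zero difference corresponds to the diffuse case of FIG. 1a; a large difference corresponds to the specular case of FIG. 1b.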

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Mathematical Physics (AREA)
  • Automation & Control Theory (AREA)
  • Image Analysis (AREA)
  • Traffic Control Systems (AREA)
  • Image Processing (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
PCT/DE2016/200207 2015-05-06 2016-05-04 Verfahren und vorrichtung zur erkennung und bewertung von fahrbahnreflexionen Ceased WO2016177371A1 (de)

Priority Applications (6)

Application Number Priority Date Filing Date Title
JP2017557347A JP6453490B2 (ja) 2015-05-06 2016-05-04 路面反射を認識し評価するための方法及び装置
CN201680026128.2A CN107667378B (zh) 2015-05-06 2016-05-04 用于识别和评估路面反射的方法和装置
EP16726768.1A EP3292510B1 (de) 2015-05-06 2016-05-04 Verfahren und vorrichtung zur erkennung und bewertung von fahrbahnreflexionen
US15/572,010 US10442438B2 (en) 2015-05-06 2016-05-04 Method and apparatus for detecting and assessing road reflections
DE112016002050.3T DE112016002050A5 (de) 2015-05-06 2016-05-04 Verfahren und vorrichtung zur erkennung und bewertung von fahrbahnreflexionen
KR1020177031347A KR101891460B1 (ko) 2015-05-06 2016-05-04 차도 위의 반사체를 인식하고 평가하기 위한 방법 및 장치

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102015208429.9 2015-05-06
DE102015208429.9A DE102015208429A1 (de) 2015-05-06 2015-05-06 Verfahren und Vorrichtung zur Erkennung und Bewertung von Fahrbahnreflexionen

Publications (1)

Publication Number Publication Date
WO2016177371A1 true WO2016177371A1 (de) 2016-11-10

Family

ID=56097956

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/DE2016/200207 Ceased WO2016177371A1 (de) 2015-05-06 2016-05-04 Verfahren und vorrichtung zur erkennung und bewertung von fahrbahnreflexionen

Country Status (7)

Country Link
US (1) US10442438B2 (en)
EP (1) EP3292510B1 (de)
JP (1) JP6453490B2 (ja)
KR (1) KR101891460B1 (ko)
CN (1) CN107667378B (zh)
DE (2) DE102015208429A1 (de)
WO (1) WO2016177371A1 (de)


Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114612877B (zh) * 2016-01-05 2025-08-19 御眼视觉技术有限公司 用于估计未来路径的系统和方法
GB201711409D0 (en) * 2016-12-30 2017-08-30 Maxu Tech Inc Early entry
DE102018218733B4 (de) * 2018-10-31 2025-03-13 Robert Bosch Gmbh Verfahren zur Unterstützung einer kamerabasierten Umfelderkennung eines Fortbewegungsmittels mittels einer Strassennässeinformation eines ersten Ultraschallsensors
FI128495B (en) * 2019-05-21 2020-06-15 Vaisala Oyj Method for calibrating optical surface monitoring system, arrangement, device and computer readable memory
FR3103303B1 (fr) * 2019-11-14 2022-07-22 Continental Automotive Détermination d’un coefficient de friction pour un véhicule sur une route
EP3866055B1 (en) * 2020-02-12 2025-09-24 Aptiv Technologies AG System and method for displaying spatial information in the field of view of a driver of a vehicle
US20220198200A1 (en) * 2020-12-22 2022-06-23 Continental Automotive Systems, Inc. Road lane condition detection with lane assist for a vehicle using infrared detecting device
CN112597666B (zh) * 2021-01-08 2022-05-24 北京深睿博联科技有限责任公司 一种基于表面材质建模的路面状态分析方法及装置
JP6955295B1 (ja) * 2021-02-16 2021-10-27 株式会社アーバンエックステクノロジーズ 識別装置、識別プログラム、および識別方法
CN115201218A (zh) * 2022-07-13 2022-10-18 鲁朗软件(北京)有限公司 一种车载路面病害智能检测方法及系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020191837A1 (en) * 2001-05-23 2002-12-19 Kabushiki Kaisha Toshiba System and method for detecting obstacle
JP2003057168A (ja) * 2001-08-20 2003-02-26 Omron Corp 路面判別装置及び同装置の設置調整方法
WO2004081897A2 (en) * 2003-03-14 2004-09-23 Liwas Aps A device for detection of road surface condition
EP2551794A2 (en) * 2011-07-28 2013-01-30 Hitachi Ltd. Onboard environment recognition system

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE3023444C2 (de) * 1979-06-29 1985-07-11 Omron Tateisi Electronics Co., Kyoto Einrichtung zur Ermittlung des witterungsbedingten Straßenzustandes
JPS6015015B2 (ja) 1979-06-29 1985-04-17 株式会社 レオ技研 路面水分検知装置
DE3738221A1 (de) * 1987-11-11 1989-06-08 Bayerische Motoren Werke Ag Verfahren und einrichtung zum erkennen des zustandes einer strasse
DE4235104A1 (de) * 1992-10-17 1994-04-21 Sel Alcatel Ag Straßenzustandsdetektor
AU3201101A (en) * 2000-02-07 2001-08-14 Intelligent Security Limited Smoke and flame detection
JP3626905B2 (ja) * 2000-11-24 2005-03-09 富士重工業株式会社 車外監視装置
US8184194B2 (en) 2008-06-26 2012-05-22 Panasonic Corporation Image processing apparatus, image division program and image synthesising method
DE112010005669B4 (de) * 2010-06-18 2017-10-26 Honda Motor Co., Ltd. System zur Straßenoberflächenreflektivitätsklassifizierung
JP5761601B2 (ja) * 2010-07-01 2015-08-12 株式会社リコー 物体識別装置
TWI467498B (zh) * 2011-12-19 2015-01-01 Ind Tech Res Inst 影像識別方法及系統
EP2947866B1 (en) * 2013-01-21 2019-06-19 Kowa Company Ltd. Image processing device, image processing method, image processing program, and recording medium storing said program
US10188036B2 (en) * 2015-10-23 2019-01-29 Carnegie Mellon University System for evaluating agricultural material


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
AMTHOR MANUEL ET AL: "Road Condition Estimation Based on Spatio-Temporal Reflection Models", 3 November 2015, CORRECT SYSTEM DESIGN; [LECTURE NOTES IN COMPUTER SCIENCE; LECT.NOTES COMPUTER], SPRINGER INTERNATIONAL PUBLISHING, CHAM, PAGE(S) 3 - 15, ISBN: 978-3-642-36616-1, ISSN: 0302-9743, XP047333260 *
FUJIMURA K ET AL: "Road surface sensor", no. 1, 1 February 1988 (1988-02-01), pages 64 - 72, XP002688511, ISSN: 0289-3789, Retrieved from the Internet <URL:http://www.fujitsu-ten.com/business/technicaljournal/pdf/1-6E.pdf> [retrieved on 20121204] *
SHOHEI KAWAI ET AL: "A method to distinguish road surface conditions for car-mounted camera images at night-time", ITS TELECOMMUNICATIONS (ITST), 2012 12TH INTERNATIONAL CONFERENCE ON, IEEE, 5 November 2012 (2012-11-05), pages 668 - 672, XP032327884, ISBN: 978-1-4673-3071-8, DOI: 10.1109/ITST.2012.6425265 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106686165A (zh) * 2016-12-30 2017-05-17 维沃移动通信有限公司 一种路况检测的方法及移动终端
DE102018203807A1 (de) 2018-03-13 2019-09-19 Continental Teves Ag & Co. Ohg Verfahren und Vorrichtung zur Erkennung und Bewertung von Fahrbahnzuständen und witterungsbedingten Umwelteinflüssen
WO2019174682A1 (de) 2018-03-13 2019-09-19 Continental Teves Ag & Co. Ohg Verfahren und vorrichtung zur erkennung und bewertung von fahrbahnzuständen und witterungsbedingten umwelteinflüssen
US12249153B2 (en) 2018-03-13 2025-03-11 Continental Autonomous Mobility Germany GmbH Method and device for recognizing and evaluating roadway conditions and weather-related environmental influences
DE102021101788A1 (de) 2021-01-27 2022-07-28 Zf Cv Systems Global Gmbh Verfahren zum ortsaufgelösten Ermitteln einer Oberflächeneigenschaft eines Untergrundes, Verarbeitungseinheit und Fahrzeug
WO2022161750A1 (de) * 2021-01-27 2022-08-04 Zf Cv Systems Global Gmbh Verfahren zum ortsaufgelösten ermitteln einer oberflächeneigenschaft eines untergrundes, verarbeitungseinheit und fahrzeug

Also Published As

Publication number Publication date
EP3292510B1 (de) 2021-07-07
US10442438B2 (en) 2019-10-15
EP3292510A1 (de) 2018-03-14
US20180141561A1 (en) 2018-05-24
JP6453490B2 (ja) 2019-01-16
CN107667378B (zh) 2021-04-27
CN107667378A (zh) 2018-02-06
DE112016002050A5 (de) 2018-01-11
JP2018516799A (ja) 2018-06-28
KR20170127036A (ko) 2017-11-20
DE102015208429A1 (de) 2016-11-10
KR101891460B1 (ko) 2018-08-24

Similar Documents

Publication Publication Date Title
EP3292510B1 (de) Verfahren und vorrichtung zur erkennung und bewertung von fahrbahnreflexionen
EP3044727B1 (de) Verfahren und vorrichtung zur objekterkennung aus tiefenaufgelösten bilddaten
DE102015208428A1 (de) Verfahren und Vorrichtung zur Erkennung und Bewertung von Umwelteinflüssen und Fahrbahnzustandsinformationen im Fahrzeugumfeld
WO2019174682A1 (de) Verfahren und vorrichtung zur erkennung und bewertung von fahrbahnzuständen und witterungsbedingten umwelteinflüssen
DE102014207802B3 (de) Verfahren und System zum proaktiven Erkennen einer Aktion eines Verkehrsteilnehmers
DE102014222617B4 (de) Fahrzeugerfassungsverfahren und Fahrzeugerfassungssytem
WO2014127777A2 (de) Verfahren und vorrichtung zur bestimmung eines fahrbahnzustands
DE102021002798A1 (de) Verfahren zur kamerabasierten Umgebungserfassung
DE102014112797A1 (de) Fahrzeugaußenumgebungerkennungsvorrichtung
DE102018120405A1 (de) Fusion von radar- und bildsensorsystemen
EP3721370A1 (de) Trainieren und betreiben eines maschinen-lern-systems
EP3631677A1 (de) Verfahren zur erkennung von objekten in einem bild einer kamera
DE102014106506A1 (de) Verfahren zum Durchführen einer Diagnose eines Kamerasystems eines Kraftfahrzeugs, Kamerasystem und Kraftfahrzeug
DE102017209496A1 (de) Verfahren und Vorrichtung zum Klassifizieren eines Objekts für ein Fahrzeug
DE102020201939A1 (de) Verfahren und Vorrichtung zur Bewertung eines Bildklassifikators
DE102015206546A1 (de) Fahrbahnmarkierungserkennungsvorrichtung
DE102013021840A1 (de) Verfahren zum Erzeugen eines Umgebungsmodells eines Kraftfahrzeugs, Fahrerassistenzsystem und Kraftfahrzeug
WO2022042903A1 (de) Verfahren zur erkennung dreidimensionaler objekte, computerprogramm, maschinenlesbares speichermedium, steuergerät, fahrzeug und videoüberwachungssystem
DE102018114231A1 (de) Verfahren und System zum Erfassen von Objekten unter Verwendung mindestens eines Bildes eines Bereichs von Interesse (ROI)
DE102022206131A1 (de) Klassifikator und Verfahren für die Erkennung von Objekten aus Sensordaten auf der Basis einer vorgegebenen Klassenhierarchie
EP2696310B1 (de) Verfahren zum Identifizieren eines Straßenrands
DE102022211463A1 (de) Computerimplementiertes Verfahren und Vorrichtung zur Bestimmung einer Klassifikation für ein Objekt
DE102021133032A1 (de) Kamerabasierte Fahrzeugkonturerfassung für Fahrzeugbehandlungsanlagen
DE102023209525B3 (de) Verfahren sowie Vorrichtung zum Koordinieren von Hypothesenberechnungen für eine Erzeugung eines Umfeldmodells für ein Fahrzeug
DE102024135598A1 (de) Verfahren zur Reflexerkennung für ein Kraftfahrzeug

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16726768

Country of ref document: EP

Kind code of ref document: A1

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
REEP Request for entry into the european phase

Ref document number: 2016726768

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20177031347

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2017557347

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 15572010

Country of ref document: US

Ref document number: 112016002050

Country of ref document: DE

REG Reference to national code

Ref country code: DE

Ref legal event code: R225

Ref document number: 112016002050

Country of ref document: DE