CN110161523B - Method for estimating wall position and activating active triangulation of matrix headlight system of motor vehicle - Google Patents

Method for estimating wall position and activating active triangulation of matrix headlight system of motor vehicle

Info

Publication number
CN110161523B
CN110161523B (application CN201910083363.9A)
Authority
CN
China
Prior art keywords
light distribution
light
active triangulation
motor vehicle
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910083363.9A
Other languages
Chinese (zh)
Other versions
CN110161523A (en
Inventor
C·施奈德
S·泽纳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dr Ing HCF Porsche AG
Original Assignee
Dr Ing HCF Porsche AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dr Ing HCF Porsche AG filed Critical Dr Ing HCF Porsche AG
Publication of CN110161523A publication Critical patent/CN110161523A/en
Application granted granted Critical
Publication of CN110161523B publication Critical patent/CN110161523B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • G01S17/48Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2504Calibration devices
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254Projection of a pattern, viewing through a pattern, e.g. moiré
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Lighting Device Outwards From Vehicle And Optical Signal (AREA)

Abstract

The invention relates to a method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle, comprising the steps of: a) checking whether the basic switch-on conditions for activating active triangulation are met, and if so, b) projecting a light distribution with typical features into the scene in front of the vehicle, by means of which a light-based artificial field of view can be provided, and recording it with a camera, c) extracting the typical features of the light distribution generated in the previous step from a camera image by means of image processing software, d) recording and mathematically describing the isolation regions in the light distribution, e) determining the position and continuity of the isolation regions in order to provide the light-based artificial field of view, and f) checking whether the switch-on conditions for activating the active triangulation function are met, and if so, switching on the active triangulation function.

Description

Method for estimating wall position and activating active triangulation of matrix headlight system of motor vehicle
Technical Field
The invention relates to a method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle.
Background
With the ongoing development of motor vehicles, so-called matrix headlight systems, which generally comprise two matrix headlights arranged in the front region of the vehicle, are of increasing importance. A matrix headlight comprises a pixel matrix with individually activatable and deactivatable, preferably also dimmable, pixel units. Depending on the technology used, future matrix headlight systems are expected to achieve resolutions of tens or hundreds of thousands of pixel units. Entirely different illumination functions can be realized with such a pixel matrix. One possible lighting function is a glare-free high beam, i.e. oncoming traffic participants are not dazzled while the high beam is active. For this purpose, an onboard camera (driver assistance camera) continuously records oncoming and overtaking traffic participants. The camera images are processed by image processing software, and a corresponding electronic control device drives the individual pixel units of the matrix headlights in a targeted manner so that the glare-free effect is achieved.
Distance measurements can be made by active triangulation using the camera together with the matrix headlights of the matrix headlight system, in order to determine the distance between the vehicle equipped with the system and objects in front of it. After calibrating the camera and the matrix headlight system, a defined pattern with typical features is projected by the matrix headlights into the area in front of the motor vehicle. The projected pattern is deformed according to the geometry of the scene. The camera then records a camera image, from which the typical features are extracted by image processing software. The detected features are associated with the matrix headlights of the matrix headlight system, and a so-called depth map is calculated by the image processing software on the basis of the vehicle-specific light distribution.
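The triangulation geometry described here can be sketched in a few lines. This is an illustrative sketch only, assuming a rectified headlight-camera pair with a known horizontal baseline and a pinhole camera model; the function and parameter names are not from the patent:

```python
def triangulate_depth(baseline_m, focal_px, x_proj_px, x_cam_px):
    """Distance to a projected feature from the disparity between its
    expected position (headlight/projector frame) and its observed
    position in the camera image. Assumes a rectified setup."""
    disparity = x_proj_px - x_cam_px
    if disparity <= 0:
        # No measurable shift: feature is effectively at infinity.
        return float("inf")
    return baseline_m * focal_px / disparity
```

A depth map as mentioned above would repeat this computation for every extracted feature of the light distribution.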
In order to implement active triangulation in motor vehicles, a position estimate of the projection wall is necessary for defining the switch-on conditions and the working range. Since the active triangulation function is not switched on by the driver and cannot be permanently active due to certain constraints (e.g. sunlight or the absence of a projection wall), a position estimate of the projection wall is required so that active triangulation can be switched on automatically.
Currently known methods for wall detection or position estimation are based on detecting and tracking typical features in the scene in front of the motor vehicle. By analyzing the locations of these features, the presence and position of a wall can be inferred. However, this approach is problematic in scenes with few structures and little lateral movement, limitations that occur especially in parking scenarios.
In a development of the method set forth above, salient points of the projected light distribution (preferably so-called HOVO points) are continuously detected and tracked. Wall detection can be performed by determining the locations of these points. However, because the method is limited to a single salient point, a position estimate of the wall cannot be obtained.
In another approach, a light distribution in the form of a checkerboard pattern is generated in the scene in front of the vehicle. Image processing is performed globally (i.e. on the entire camera image), so that a plurality of salient points can be detected. Distance values are then calculated from these points and the environment is modeled. In this way wall detection is possible in principle, but at the cost of a relatively high time and computing-power outlay for the corresponding mapping and triangulation algorithms. Since wall detection serves in particular to check the switch-on conditions, continuous distance triangulation is not the primary goal here.
Disclosure of Invention
It is therefore an object of the present invention to provide a method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle which is robust and can be carried out with relatively low computing-power outlay.
This object is achieved by a method having the features of the preferred embodiment of the invention. The alternative embodiments relate to advantageous refinements of the invention.
A method according to the invention for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle comprises the following steps:
a) It is checked whether the basic switch-on conditions for activating active triangulation are fulfilled, and if so,
b) A light distribution with typical features is projected into the scene in front of the vehicle, by means of which a light-based artificial field of view can be provided, and is recorded by a camera,
c) The typical features are extracted by image processing software from the camera image of the light distribution produced in the previous step,
d) The isolation regions in the light distribution are recorded and described mathematically,
e) The position and continuity of the isolation regions for providing the light-based artificial field of view are determined,
f) It is checked whether the switch-on conditions for activating the active triangulation function are fulfilled, and if so, the active triangulation function is switched on.
In order to avoid computationally complex algorithms, in particular mapping and triangulation algorithms, while still estimating the presence and position of a wall as effectively as possible, the method according to the invention bases the wall position estimate on a light-based artificial field of view. Typical features are extracted by image processing software from the camera image of a matching light distribution (e.g. a checkerboard pattern). Together, these features form a plurality of isolation regions (four, for example, in the case of a checkerboard pattern). The isolation regions are assigned to mathematical functions by a corresponding association method, and their positions are analyzed in the image of the camera, which may in particular be a driver assistance camera. If the wall is tilted with respect to the reference system of the camera and matrix headlight assembly, this tilt is reflected in the shifted positions of the isolation regions. Furthermore, a discontinuity in an isolation region indicates, in some cases, a discontinuity of the wall in the illuminated scene.
In an advantageous embodiment of the invention, after the active triangulation function has been switched on, a mapping and triangulation algorithm for 3D reconstruction of the scene in front of the motor vehicle is performed in a further step g). Distance measurement is then possible in particular by active triangulation.
In a particularly advantageous embodiment of the invention, the check of the basic switch-on conditions in method step a) is carried out by evaluating the sensor signal of an ambient light sensor of the motor vehicle and/or by evaluating the kinematic data of the motor vehicle and/or by evaluating the objects recorded by the light-assist system of the motor vehicle. The ambient light sensor allows a dim ambient scene, such as an underground parking garage, to be recognized, so that active triangulation is not activated in scenes with strong daylight. The kinematic data allow scenes in which the motor vehicle is moving at low speed (e.g. during parking) to be recognized, so that the active triangulation function is not activated at high speeds (i.e. in dynamic driving scenes, when driving on highways, etc.). Evaluating the objects recorded by the light-assist system serves to identify scenes with low object density, such as a uniform wall, and prevents the active triangulation function from being switched on in scenes with objects in the foreground, such as urban traffic at night.
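The three checks of method step a) can be combined into a single predicate. The sketch below is purely illustrative; the threshold values and parameter names are assumptions, not values from the patent:

```python
def basic_on_conditions_met(ambient_lux, speed_kmh, object_count,
                            lux_max=10.0, speed_max=15.0, objects_max=2):
    """Basic switch-on conditions for active triangulation:
    dim ambient light, low vehicle speed, sparse scene."""
    dark_enough = ambient_lux < lux_max          # ambient light sensor
    slow_enough = speed_kmh < speed_max          # kinematic data
    sparse_scene = object_count <= objects_max   # objects from the light-assist system
    return dark_enough and slow_enough and sparse_scene
```

With these assumed thresholds, an underground parking garage at walking speed would pass the check, while a sunlit highway scene would not.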
In a preferred embodiment, in method step b) either an independent light distribution, in particular a checkerboard pattern, is projected, or a low-beam or high-beam distribution in which the typical features are embedded is projected. Using a low-beam or high-beam distribution has the advantage that an ECE-compliant light distribution can be used to carry out the method. Preferably, an image processing cascade is used to extract the typical features from the camera image.
In a particularly advantageous embodiment of the invention, outlier detection is performed when extracting the typical features from the camera image. Preferably, the outlier detection is implemented with a mathematical estimation method, such as the RANSAC algorithm, in order to minimize the influence of false detections (outliers) on the overall result.
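The RANSAC idea mentioned here can be sketched for a line-shaped isolation region as follows. This is a minimal illustration, not the patent's implementation; the tolerance and iteration count are assumptions:

```python
import random

def ransac_line(points, n_iters=200, inlier_tol=1.0, seed=0):
    """RANSAC sketch: fit y = m*x + c to 2D feature points while
    rejecting outliers (false feature detections)."""
    rng = random.Random(seed)
    best = (0.0, 0.0, [])  # (slope, intercept, inlier list)
    for _ in range(n_iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # vertical sample pair, skip this hypothesis
        m = (y2 - y1) / (x2 - x1)
        c = y1 - m * x1
        inliers = [(x, y) for x, y in points
                   if abs(y - (m * x + c)) < inlier_tol]
        if len(inliers) > len(best[2]):
            best = (m, c, inliers)
    return best
```

Feature detections that do not fit the consensus line are simply left out of the inlier set, so a few false detections do not corrupt the region description.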
In a preferred embodiment, the mathematical description of the isolation regions is performed by an approximation algorithm, such as a least-mean-squares (LMS) algorithm or a function-fitting algorithm. The position and orientation of the scene or wall in front of the vehicle can be deduced from the positions and characteristics of the isolation regions. The position and continuity of the isolation regions for providing the light-based artificial field of view can be determined, preferably by analyzing the mathematical functions describing them.
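As one possible approximation algorithm, a closed-form least-squares fit describing an isolation region by the straight line y = m*x + c could look like this (an illustrative sketch; the patent does not prescribe a specific formulation):

```python
def fit_line_lsq(points):
    """Least-squares fit of y = m*x + c to the feature points
    of one isolation region."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # normal equations
    c = (sy - m * sx) / n
    return m, c
```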
Drawings
Other features and advantages of the present invention will become apparent from the following description of the preferred embodiments, which refers to the accompanying drawings. The drawings show:
fig. 1 is a schematic diagram showing the method flow for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle according to a preferred embodiment of the present invention,
fig. 2 is a schematic illustration of a light distribution with typical features and a plurality of isolation regions.
Detailed Description
The basic functional flow of the method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle shall be elucidated with reference to fig. 1.
In a first step a), a basic initialization is performed by checking whether the basic switch-on conditions for activating active triangulation are fulfilled. In particular, the sensor signal of an ambient light sensor of the motor vehicle is evaluated in this method step. The aim is to recognize a dim ambient scene, such as an underground parking garage, in which active triangulation can be performed; activation in scenes with strong daylight is thereby avoided. The kinematic data of the motor vehicle can also be evaluated in this step, in order to recognize scenes in which the vehicle is moving at low speed (for example during parking) and to avoid activating the active triangulation function at high speeds (i.e. in dynamic driving scenes, when driving on highways, etc.). Finally, the objects recorded by the light-assist system of the motor vehicle can be evaluated in order to identify scenes with low object density, such as a uniform wall, and to avoid switching on the active triangulation function in scenes with objects in the foreground (e.g. in urban traffic at night).
When the check of the basic switch-on conditions is successful, in a next step b) a light distribution embedded with the typical features 11, 21, 31, 41, by means of which a light-based artificial field of view can be provided, is projected into the scene in front of the vehicle. For example, an independent light distribution, e.g. a checkerboard pattern, may be projected. In an alternative embodiment, a low-beam or high-beam distribution can be projected with structures embedded in it that are imperceptible to the vehicle occupants, in particular the driver. The light distribution generated in step b) is recorded by a camera (driver assistance camera).
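For the checkerboard variant of step b), the 0/1 activation mask for the pixel matrix can be sketched as follows. This is a simplified sketch; the real headlight resolution, tile size, and dimming levels are assumptions:

```python
def checkerboard_mask(rows, cols, tile=1):
    """0/1 activation mask for the pixel matrix: a checkerboard
    light distribution whose edges serve as the typical features."""
    return [[((r // tile) + (c // tile)) % 2 for c in range(cols)]
            for r in range(rows)]
```

Each 0/1 transition in the projected pattern produces a brightness edge in the scene that the image processing software can later extract as a typical feature.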
In a subsequent step c), the typical features 11, 21, 31, 41, by means of which a light-based artificial field of view can be provided, are extracted by image processing software from the camera image of the light distribution produced in step b). A corresponding image processing algorithm, such as an image processing cascade, is used for the extraction. Outlier detection is preferably performed here, for which mathematical estimation methods such as the RANSAC algorithm can be used, in order to minimize the influence of false detections (outliers) on the overall result.
In a next step d), a plurality of isolation regions 10, 20, 30, 40 are recorded and described mathematically in a suitable manner so that their position and continuity can be analyzed later. The isolation regions 10, 20, 30, 40 are preferably described mathematically by an approximation algorithm, such as a least-mean-squares (LMS) algorithm or a function-fitting algorithm.
For example, four isolation regions 10, 20, 30, 40 may be extracted when a checkerboard pattern is projected into the scene in front of the vehicle, as shown in fig. 2. In the light distribution shown in simplified form in fig. 2, different brightnesses are indicated by hatching of different intensities: strongly hatched areas represent darker structures of the light distribution, while lightly or barely hatched areas represent brighter structures. It should be noted that the actual light distribution has no abrupt brightness jumps between the brightness ranges; rather, a gradient with gradually changing brightness is observed.
The upper isolation region 10 is defined by typical features 11 generated by an upper pixel section of the matrix headlight system. In the middle region there are two isolation regions: a middle-upper isolation region 20 defined by typical features 21 of the pixel section provided by the middle-upper region of the matrix headlight system, and a middle-lower isolation region 30 defined by typical features 31 of the pixel section provided by the middle-lower region. In addition, the lower isolation region 40 is defined by typical features 41 generated by the lower pixel section of the matrix headlight system.
In a next step e), the position and continuity of the isolation regions 10, 20, 30, 40 are determined, preferably by analyzing the mathematical functions describing them. The position and orientation of the scene or wall in front of the vehicle can be inferred from the positions and characteristics of the isolation regions.
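One way step e) could analyze a fitted isolation region is to derive the apparent wall tilt from the slope of the fitted line and to judge continuity from the fit residuals. This is an illustrative sketch under assumed names and a hypothetical residual threshold:

```python
import math

def wall_tilt_and_continuity(features, m, c, tol=2.0):
    """Given a line y = m*x + c fitted to one isolation region,
    return the apparent tilt angle (degrees) and whether the region
    is continuous (all features close to the fitted line)."""
    residuals = [abs(y - (m * x + c)) for x, y in features]
    continuous = max(residuals) < tol  # a large residual hints at a wall edge
    tilt_deg = math.degrees(math.atan(m))
    return tilt_deg, continuous
```

A feature far off the fitted line would then be interpreted as a possible discontinuity of the wall in the illuminated scene.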
In a next step f), it is checked on the basis of the calculations performed in the previous step whether the switch-on conditions for activating the active triangulation function are fulfilled. If so, a corresponding switch-on signal is provided to the matrix headlight system so that active triangulation can be activated.
In a subsequent step g), the mapping and triangulation algorithms for 3D reconstruction of the scene in front of the motor vehicle are executed, so that in particular distance measurement by active triangulation becomes possible.
The principle of the light-based artificial field of view described above can be used in further embodiments beyond wall detection and checking the switch-on conditions of the active triangulation function. These further embodiments include, for example, feature association, trajectory-based switching-off of sections of the matrix headlights, and a light-based camera calibration method, which are briefly described below.
In feature association (solving the so-called correspondence problem), the typical features 11, 21, 31, 41 are detected by image processing software within the active triangulation range. In a further step, these features must be associated with the corresponding headlight segments of the matrix headlight system. This is achieved by forming an elliptical region along each trajectory (pixel path): if a typical feature 11, 21, 31, 41 is detected along the trajectory within this ellipse, the point is assigned to the headlight segment belonging to that trajectory. If the projection wall is rotated relative to the vehicle, the ellipse must also be rotated in order to detect the typical features 11, 21, 31, 41. The rotation is determined from the aforementioned artificial field of view.
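The elliptical gating used for feature association can be sketched as a rotated-ellipse membership test. A minimal illustration; the parameterization (center, semi-axes, rotation angle) is an assumption about how such a gate might be defined:

```python
import math

def inside_ellipse(px, py, cx, cy, a, b, theta):
    """Check whether a detected feature (px, py) lies inside an
    ellipse centred at (cx, cy) with semi-axes a and b, rotated
    by theta radians (e.g. to follow a rotated projection wall)."""
    dx, dy = px - cx, py - cy
    # Rotate the point into the ellipse's own axes.
    xr = dx * math.cos(theta) + dy * math.sin(theta)
    yr = -dx * math.sin(theta) + dy * math.cos(theta)
    return (xr / a) ** 2 + (yr / b) ** 2 <= 1.0
```

A feature passing this test would be assigned to the headlight segment whose trajectory the ellipse follows.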
Active triangulation uses headlight projections, which are recorded by a camera from whose images the typical features 11, 21, 31, 41 can be extracted. These projections are generated by the two matrix headlights of the matrix headlight system and can therefore produce overlap regions in which the features appear blurred and cannot be extracted by the image processing software, or only erroneously. Accordingly, when an overlap has been detected, the corresponding headlight section of the matrix headlight system must be switched off. An overlap can be detected by determining the intersection of the left and right light-based artificial fields of view in addition to the trajectory intersection. In this way, pixel sections of the matrix headlights can be switched off continuously, ensuring error-free feature extraction.
During ongoing driving operation, the camera must be calibrated with respect to the vehicle. This can be done, for example, by so-called vanishing-point calibration, in which straight lines that run parallel to each other in the world are tracked in the camera image. Owing to the projective camera geometry, these parallel straight lines intersect in the camera image; the intersections are called vanishing points. The individual vanishing points lie on a so-called vanishing line, from which the geometric pose of the camera with respect to the motor vehicle can be determined in terms of roll, pitch and yaw angles. For this purpose, straight lines that are substantially parallel to each other can be provided by the headlight projection (e.g. by the features of a checkerboard pattern), so that the driver assistance camera can be extrinsically calibrated by means of the headlight projection features.
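The core computation of vanishing-point calibration, intersecting the images of two world-parallel lines, can be sketched as follows. An illustrative sketch using slope-intercept line form; real implementations would handle vertical lines and combine many line pairs:

```python
def line_intersection(l1, l2):
    """Intersect two image lines given as (m, c) in y = m*x + c form.
    For the images of two world-parallel lines, the intersection is
    their vanishing point; returns None if the image lines are also
    parallel (vanishing point at infinity)."""
    m1, c1 = l1
    m2, c2 = l2
    if m1 == m2:
        return None
    x = (c2 - c1) / (m1 - m2)
    return x, m1 * x + c1
```

Several such vanishing points, collected from the headlight projection, would then be fitted with a vanishing line from which the camera's roll, pitch and yaw can be recovered.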

Claims (10)

1. A method for estimating a wall position and for activating active triangulation of a matrix headlamp system of a motor vehicle, the method comprising the steps of:
a) It is checked whether the basic switch-on conditions for activating active triangulation are fulfilled, and if so,
b) A light distribution having typical features (11, 21, 31, 41) is projected into the scene in front of the vehicle, by means of which light distribution a light-based artificial field of view can be provided, and the light distribution is recorded by a camera,
c) The typical features (11, 21, 31, 41) are extracted by means of image processing software from the camera image of the light distribution produced in the previous step,
d) The isolation regions (10, 20, 30, 40) in the light distribution are recorded and described mathematically,
e) The position and continuity of said isolation regions (10, 20, 30, 40) for providing a light-based artificial field of view are determined,
f) It is checked whether the switch-on conditions for activating the active triangulation function are fulfilled and, if so, the active triangulation function is switched on.
2. Method according to claim 1, characterized in that after switching on the active triangulation function, a mapping and triangulation algorithm for 3D reconstruction of the scene in front of the motor vehicle is performed in a further step g).
3. Method according to claim 1, characterized in that the checking of the basic switch-on conditions in method step a) is performed by evaluating sensor signals of an ambient light sensor of the motor vehicle and/or by evaluating kinematic data of the motor vehicle and/or by evaluating objects recorded by a light-assist system.
4. A method according to one of claims 1 to 3, characterized in that in method step b) an independent light distribution is projected, or a low-beam distribution or a high-beam distribution is projected, in which the typical features (11, 21, 31, 41) are embedded.
5. A method according to one of claims 1 to 3, characterized in that an image processing cascade is used for extracting the typical features (11, 21, 31, 41) from the camera image.
6. A method according to one of claims 1 to 3, characterized in that outlier detection is performed when extracting the typical features (11, 21, 31, 41) from the camera image.
7. The method of claim 6, wherein the outlier detection is performed by implementing a mathematical estimation method.
8. A method according to one of claims 1 to 3, characterized in that the mathematical description of the isolation region (10, 20, 30, 40) is performed by means of an approximation algorithm.
9. A method according to one of claims 1 to 3, characterized in that the position and continuity of the isolation region (10, 20, 30, 40) for providing the artificial field of view based on light is determined by analysis of a mathematical function describing the isolation region (10, 20, 30, 40).
10. The method of claim 4, wherein the independent light distribution is a checkerboard pattern.
CN201910083363.9A 2018-02-12 2019-01-28 Method for estimating wall position and activating active triangulation of matrix headlight system of motor vehicle Active CN110161523B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018103060.6 2018-02-12
DE102018103060.6A DE102018103060B3 (en) 2018-02-12 2018-02-12 Method for estimating a wall position and for activating active triangulation of a matrix headlight system of a motor vehicle

Publications (2)

Publication Number Publication Date
CN110161523A CN110161523A (en) 2019-08-23
CN110161523B true CN110161523B (en) 2023-06-30

Family

ID=64951566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910083363.9A Active CN110161523B (en) 2018-02-12 2019-01-28 Method for estimating wall position and activating active triangulation of matrix headlight system of motor vehicle

Country Status (2)

Country Link
CN (1) CN110161523B (en)
DE (1) DE102018103060B3 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116698377B (en) * 2023-07-31 2023-10-03 常州星宇车灯股份有限公司 ADB function test method and system for automobile LED matrix headlight

Citations (3)

Publication number Priority date Publication date Assignee Title
US6285778B1 (en) * 1991-09-19 2001-09-04 Yazaki Corporation Vehicle surroundings monitor with obstacle avoidance lighting
CN104442541A (en) * 2013-09-23 2015-03-25 海拉胡克双合有限公司 Method For Controlling A Light Distribution Of A Headlamp And Headlamp Therefor
CN107607960A (en) * 2017-10-19 2018-01-19 深圳市欢创科技有限公司 A kind of anallatic method and device

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US20070019181A1 (en) * 2003-04-17 2007-01-25 Sinclair Kenneth H Object detection system
US20050099821A1 (en) * 2004-11-24 2005-05-12 Valeo Sylvania Llc. System for visually aiding a vehicle driver's depth perception
US7872764B2 (en) 2007-10-16 2011-01-18 Magna Electronics Inc. Machine vision for predictive suspension
JP5587137B2 (en) * 2010-10-29 2014-09-10 キヤノン株式会社 Measuring apparatus and measuring method
WO2014070448A1 (en) 2012-10-31 2014-05-08 Tk Holdings, Inc. Vehicular path sensing system and method
DE102016109027A1 (en) * 2016-05-17 2017-11-23 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Method for checking the position of characteristic points in light distributions


Also Published As

Publication number Publication date
CN110161523A (en) 2019-08-23
DE102018103060B3 (en) 2019-01-24

Similar Documents

Publication Publication Date Title
US10286834B2 (en) Vehicle exterior environment recognition apparatus
US10442343B2 (en) Vehicle exterior environment recognition apparatus
US9371031B2 (en) Method for controlling a headlamp system for a vehicle, and headlamp system
EP2993654B1 (en) Method and system for forward collision warning
US10552688B2 (en) Method and device for detecting objects in the surroundings of a vehicle
US8477999B2 (en) Road division line detector
US20160107595A1 (en) Pedestrian collision warning system
JP6085522B2 (en) Image processing device
KR101510189B1 (en) Automatic exposure control apparatus and automatic exposure control method
CN110073410B (en) Method for tracking objects in a scene
CN110161523B (en) Method for estimating wall position and activating active triangulation of matrix headlight system of motor vehicle
US11170517B2 (en) Method for distance measurement using trajectory-based triangulation
CN109478233B (en) Method for identifying cause of occlusion in image sequence, computer readable recording medium, driving assistance system
KR101511586B1 (en) Apparatus and method for controlling vehicle by detection of tunnel
US8965142B2 (en) Method and device for classifying a light object located ahead of a vehicle
CN112926476B (en) Vehicle identification method, device and storage medium
JP7084223B2 (en) Image processing equipment and vehicle lighting equipment
KR101095023B1 (en) High beam assistance system and method thereof
JP3532896B2 (en) Smear detection method and image processing apparatus using the smear detection method
CN113632450A (en) Imaging system and image processing apparatus
US11100653B2 (en) Image recognition apparatus
WO2024127633A1 (en) Light distribution control system and light distribution control method
JP7241839B1 (en) Self-localization device
US20230386224A1 (en) Stop line recognition device
JP2022161700A (en) Traffic light recognition device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant