EP2715666A1 - Procédé de détermination d'un mouvement de tangage d'une caméra montée dans un véhicule et procédé de commande d'une émission lumineuse d'au moins un phare avant d'un véhicule - Google Patents
Info
- Publication number
- EP2715666A1 (application EP12719384.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- camera
- image
- gradient data
- vehicle
- camera image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q1/00—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
- B60Q1/02—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments
- B60Q1/04—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights
- B60Q1/06—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle
- B60Q1/08—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically
- B60Q1/10—Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to illuminate the way ahead or to illuminate other areas of way or environments the devices being headlights adjustable, e.g. remotely-controlled from inside vehicle automatically due to vehicle inclination, e.g. due to load distribution
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/269—Analysis of motion using gradient-based methods
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/002—Diagnosis, testing or measuring for television systems or their details for television cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q2300/00—Indexing codes for automatically adjustable headlamps or automatically dimmable headlamps
- B60Q2300/10—Indexing codes relating to particular vehicle conditions
- B60Q2300/13—Attitude of the vehicle body
- B60Q2300/132—Pitch
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20068—Projection on vertical or horizontal image axis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the present invention relates to a method for determining a pitching motion of a camera installed in a vehicle, to a method for controlling a light emission of at least one headlight of a vehicle, to a corresponding apparatus and to a corresponding computer program product.
- the present invention provides an improved method for determining a pitching motion of a vehicle-mounted camera, an improved method for controlling a light emission of at least one headlight of a vehicle, an improved apparatus configured to perform the steps of such a method, and an improved computer program product with program code, stored on a machine-readable carrier, for carrying out such a method when the program is executed on a device, according to the independent claims.
- Advantageous embodiments emerge from the respective subclaims and the following description.
- the present invention provides a method for determining a pitching movement of a camera installed in a vehicle, the method comprising the following steps:
- generating first image gradient data from a first camera image, the first image gradient data representing a change in brightness of adjacent pixels of the first camera image along a vertical axis of the first camera image
- generating second image gradient data from a second camera image captured after the first camera image, the second image gradient data representing a change in brightness of adjacent pixels of the second camera image along a vertical axis of the second camera image
- the vehicle may be a motor vehicle, for example a passenger car, truck or other commercial vehicle.
- the camera is mounted in the vehicle such that a viewing angle of the camera is directed in the forward or reverse direction of the vehicle.
- a first camera with a viewing angle in the forward direction of travel and a second camera with a viewing angle in the reverse direction of the vehicle may be provided.
- an area which lies in the forward direction of travel in front of the vehicle can be recorded.
- the camera can be used, for example, for monitoring and/or tracking vehicles driving ahead.
- the camera may be aligned with its optical axis along a longitudinal axis of the vehicle.
- the pitching movement, also referred to as pitch motion, relates to a rotational movement or pivoting of the camera about a transverse axis of the vehicle.
- the pitching movement causes the optical axis of the camera to be pivoted about the transverse axis with respect to the longitudinal axis of the vehicle. Since the camera is mechanically connected to the vehicle, the pitching motion of the camera results from a corresponding movement of the vehicle. Thus, conclusions about a movement behavior of the vehicle can be drawn from the pitching motion of the camera.
- the present invention further provides a method for controlling a light emission of at least one headlight of a vehicle, wherein a camera is installed in the vehicle, the method comprising the following steps:
- the above-mentioned method of determining can be advantageously used.
- the pitching movement of the camera determined by means of the determination method, which is caused by a corresponding movement of the vehicle, can be used in the control method in order to set the illumination angle.
- in this way, the illumination angle can be corrected for the pitching motion.
- dazzling of vehicles in front or oncoming traffic by the at least one headlight of the vehicle can be reduced or avoided.
- the present invention further provides an apparatus configured to perform the steps of any of the above methods.
- for this purpose, the device may comprise units, each of which is designed to execute one step of the method.
- by this embodiment of the invention in the form of a device, the object underlying the invention can be solved advantageously and efficiently.
- a device can be understood to mean an electrical device which processes sensor signals and outputs control or data signals in dependence thereon.
- the device may have an interface which may be implemented in hardware and/or software.
- in a hardware implementation, the interfaces can, for example, be part of a so-called system ASIC which contains a wide variety of functions of the device.
- however, it is also possible for the interfaces to be separate integrated circuits or to consist at least partly of discrete components.
- in a software implementation, the interfaces may be software modules that are present, for example, on a microcontroller alongside other software modules.
- Also of advantage is a computer program product with program code which is stored on a machine-readable carrier such as a semiconductor memory, a hard disk memory or an optical memory and is used to carry out one of the above-mentioned methods when the program is executed on a device or a control device.
- the invention is based on the recognition that a determination of the pitching motion of a camera installed in a vehicle can be based on camera images. If, for example, there is a pitching movement of the camera between recording times of two camera images, the pitching motion results in a shift of pixels. In turn, this shift of pixels, for example, can be determined or estimated precisely according to embodiments of the present invention.
- a pure camera pitch can be determined or estimated, which is not the case with sensor-based determination or estimation.
- pitch angular velocity sensors for determining the pitching motion of the camera can be dispensed with. This saves parts, costs and weight and avoids a situation where the camera and pitch angle sensors are mounted at different locations and relative to different coordinate systems. Therefore, according to the present invention, a susceptibility to error due to electromagnetic interference or temperature, especially drift or offset, can be eliminated or significantly reduced. Since the fault-prone sensor is no longer needed, the determination of the pitch angle according to the present invention is free from such disturbances. Furthermore, camera image recording and pitch angle determination are inherently synchronized, since the determination is based on the existing images.
- the image gradient data are generated by means of a Radon transformation, in particular a Radon transformation in the horizontal direction with respect to the relevant camera image.
- the Radon transformation is an integral transformation which here integrates image values, or changes in image values, along horizontal lines of the camera image.
- the Radon transformation can take into account a change in brightness of adjacent pixels from a plurality of columns of pixels.
- the brightness changes in the several columns of pixels can be integrated successively, the columns being processed one after the other in the horizontal direction.
- this embodiment offers the advantage that, by means of the Radon transformation, meaningful image gradient data can be generated from more than just one column of pixels at a favorable resource cost. On the basis of the image gradient data generated by the Radon transformation, an image shift value can be generated efficiently, as the sketch below illustrates.
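- As a purely illustrative sketch of this dimensional reduction (not code from the patent): assuming a grey-scale camera image or subsection is available as a NumPy array, the vertical brightness gradient can be projected row by row onto the vertical axis, which corresponds to a horizontal Radon-style projection. The function name and the use of NumPy are assumptions for illustration.

```python
import numpy as np

def vertical_gradient_profile(image: np.ndarray) -> np.ndarray:
    """Reduce a 2-D camera image (or subsection) to a 1-D gradient signal.

    1. Vertical brightness gradient: difference of adjacent pixels along the
       vertical image axis (emphasizes horizontal edges).
    2. Horizontal projection: integrate the gradient row by row over all
       columns (a Radon-style projection in the horizontal direction),
       yielding one value per image row.
    """
    img = image.astype(np.float64)
    grad_y = np.diff(img, axis=0)   # brightness change between vertically adjacent pixels
    profile = grad_y.sum(axis=1)    # integrate over the columns of the subsection
    return profile
```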
- the first image gradient data can be generated from a portion of the first camera image and the second image gradient data from a corresponding portion of the second camera image.
- this embodiment of the invention offers the advantage of a significantly reduced data-processing capacity required for generating the first and second image gradient data, since only a small part of the first and second camera image needs to be evaluated.
- the at least one image shift value may be generated by cross-correlation from the first image gradient data and the second image gradient data.
- the step of generating may include an estimation, in particular a subpixel-accurate estimation, of the at least one image shift value.
- the estimation by cross-correlation is highly accurate, with a resolution of, for example, less than one pixel (subpixel); a sketch of such an estimation is given below.
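- A minimal sketch of such a sub-pixel shift estimation, assuming the two 1-D gradient signals are available as NumPy arrays; the parabolic interpolation around the correlation peak is one common refinement technique, used here for illustration rather than as the estimator prescribed by the patent.

```python
import numpy as np

def estimate_shift_subpixel(profile_t0: np.ndarray, profile_t1: np.ndarray) -> float:
    """Estimate the vertical shift between two 1-D gradient profiles.

    Cross-correlates the two signals and refines the integer peak position
    with a parabolic fit, giving sub-pixel resolution. Positive values mean
    the second profile is shifted towards larger row indices (downwards).
    """
    a = profile_t0 - profile_t0.mean()
    b = profile_t1 - profile_t1.mean()
    corr = np.correlate(b, a, mode="full")            # cross-correlation over all lags
    peak = int(np.argmax(corr))
    offset = 0.0
    if 0 < peak < len(corr) - 1:                      # parabolic sub-pixel refinement
        c_m, c_0, c_p = corr[peak - 1], corr[peak], corr[peak + 1]
        denom = c_m - 2.0 * c_0 + c_p
        if denom != 0.0:
            offset = 0.5 * (c_m - c_p) / denom
    return (peak + offset) - (len(a) - 1)             # convert index to signed lag
```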
- in the step of determining, a pitch angular velocity can be determined in order to determine the pitching motion of the camera.
- this embodiment offers the advantage that the pitching movement, in the form of the pitch angular velocity, can be determined in an uncomplicated way.
- a pitch angular velocity may be determined based on the at least one image shift value, a time difference between the first camera image and the second camera image, and a focal length of the camera to determine the pitching motion of the camera.
- ω may denote the pitch angular velocity as the time derivative of a pitch angle change Δφ
- Δy may denote the at least one image shift value
- Δt may denote the time difference between the first camera image and the second camera image
- f_y may denote a focal length of the camera; a sketch relating these quantities is given below
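- A short sketch relating these quantities, assuming a pinhole camera model in which the pitch angle change is Δφ = arctan(Δy / f_y) and the pitch angular velocity is ω = Δφ / Δt; the arctangent form is an assumption consistent with the symbols defined above, not a formula quoted from the patent.

```python
import math

def pitch_angular_velocity(delta_y_px: float, focal_length_px: float, delta_t_s: float) -> float:
    """Pitch angular velocity from the image shift (illustrative pinhole model).

    delta_y_px      : image shift value Δy between the two camera images, in pixels
    focal_length_px : camera focal length f_y along the vertical axis, in pixels
    delta_t_s       : time difference Δt between the two camera images, in seconds
    returns         : pitch angular velocity ω in radians per second
    """
    delta_phi = math.atan2(delta_y_px, focal_length_px)  # pitch angle change Δφ
    return delta_phi / delta_t_s                         # ω = Δφ / Δt
```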
- the subsection of the first camera image and the subsection of the second camera image can be based on a single subarea of a camera sensor.
- the row positions and column positions of the subsections may be the same with respect to a fixed pixel raster in the camera images.
- the row positions and column positions of the sections do not change with respect to the fixed pixel matrix from the first camera image to the second camera image.
- the subsections can be adaptable in image width and in image height. This embodiment offers the advantage that the resource cost for determining the pitching motion of the camera is reduced, because the amount of input data is reduced by using only partial sections of the camera images, rather than entire camera images, in the step of generating.
- the subsections of the camera images can be selected such that the subsections have meaningfully evaluable regions of the camera images with regard to the pitching motion of the camera.
- the method may also include a step of selecting a subsection of the first camera image and a subsection of the second camera image.
- the step of selecting may be based on a route travelled by the vehicle and, additionally or alternatively, on the subsection being influenced as little as possible by the movement of the vehicle.
- the method may also include a step of performing lane and/or object detection, a step of performing lane and/or object tracking and positioning using the determined camera pitching motion, and a step of driving an actuator in order, for example, to issue information to a driver of the vehicle or to intervene actively and correctively.
- FIG. 2 shows camera images and a partial section of a camera image according to an embodiment of the present invention
- FIGS. 5 and 6 are flowcharts of methods in accordance with embodiments of the present invention.
- FIG. 1 shows a vehicle with a control device according to an embodiment of the present invention. Shown are a vehicle 100, a camera 110, a control device 120, a generation device 130, a generation device 140 and a determination device 150.
- the control device 120 has the generation device 130, the generation device 140 and the determination device 150.
- the camera 110 and the controller 120 are disposed in the vehicle 100.
- the camera 110 is communicatively connected to the controller 120.
- the generation device 130 is communicatively connected to the generation device 140 of the control device 120.
- the generation device 140 is communicatively connected to the determination device 150 of the control device 120.
- the camera 110 is arranged in the vehicle 100 in such a way that camera images in the forward direction of travel of the vehicle 100 can be captured by means of the optics of the camera 110, even if this arrangement of the camera 110 in the vehicle 100 is not explicitly apparent from FIG. 1.
- the camera images will be discussed further with reference to FIG. 2.
- the camera 110 is connected, for example, via a signal line or the like to the control unit 120.
- the camera 110 is designed to transmit to the control unit 120 image data representing the camera images.
- the control device 120 receives the camera images in the form of the image data from the camera 110.
- the control device 120 is designed to determine a pitching movement of the camera 110 installed in a vehicle.
- pairs of successive camera images are processed by the devices 130, 140 and 150 of the control device 120.
- at least one pair of subsequent or successive camera images is processed in the controller 120.
- a flow of processing in the controller 120 will be explained only for a few such camera images. However, it is evident that the sequence may be repeated for other pairs of such camera images.
- the generation device 130 is designed to generate first image gradient data from a first camera image.
- the first image gradient data represent a change in brightness of adjacent pixels of the first camera image along a vertical axis of the first camera image.
- the generation device 130 is also designed to generate second image gradient data from a second camera image recorded subsequently with respect to the first camera image.
- the second image gradient data represent a brightness change of adjacent pixels of the second camera image along a vertical axis of the second camera image.
- the first image gradient data and the second image gradient data are transmitted from the generation device 130 to the generation device 140.
- the first image gradient data can hereby be transmitted as a first image gradient signal.
- the second image gradient data may be transmitted as a second image gradient signal.
- the generator 140 receives the first image gradient data and the second image gradient data from the generator 130.
- the generator 140 is configured to generate an image shift value using the first image gradient data and the second image gradient data.
- the image shift value represents a shift of a pixel of the second camera image relative to a corresponding pixel of the first camera image.
- the generator 140 analyzes the first and second image gradient signals representing the first and second image gradient data to generate the image displacement value.
- the image shift value is transmitted from the generation device 140 to the determination device 150.
- the determination device 150 receives the image shift value from the generation device 140.
- the determination device 150 is configured to determine a pitching movement based on the image shift value in order to determine the pitching movement of the camera.
- the determination device 150 can calculate a pitch angular velocity from the image shift value and further data, as will be explained below.
- FIG. 2 shows camera images and a partial section of a camera image according to an exemplary embodiment of the present invention. Shown are a first camera image 212, a second camera image 214, and a subsection 215.
- the camera images 212, 214 may be captured by a camera such as the camera 110 of FIG. 1.
- the camera with which the camera images 212, 214 are recorded may be installed in a vehicle such as the vehicle 100 of FIG. 1.
- the second camera image 214 is shown partially obscuring the first camera image 212.
- the first camera image 212 and the second camera image 214 show a similar scene.
- of the second camera image 214, the scenery can be seen completely.
- the second camera image 214 shows a road scene from the perspective of a vehicle interior through a windshield of the vehicle in the direction of travel forward.
- On display are a road with lane markings, a vehicle in front, a bridge spanning the carriageway, as well as buildings and vegetation.
- the first camera image 212 is recorded, for example, temporally before the second camera image 214.
- the vehicle in which the camera is mounted may have traveled a certain distance and there may have been a pitching movement of the vehicle and / or the camera. Therefore, image data of the camera images 212, 214 and thus also the objects visible in the camera images 212, 214 may differ due to a travel distance of the vehicle and additionally or alternatively a pitching movement of the vehicle and / or the camera.
- the partial section 215 comprises a partial area of the second camera image 214; specifically, a partial area in which the preceding vehicle is depicted. According to the exemplary embodiment illustrated in FIG. 2, the subsection 215 extends from an upper image edge to a lower image edge of the second camera image 214. A height of the subsection 215 thus corresponds here to a height of the second camera image 214. A width of the subsection 215 can, for example, be a fraction of a width of the second camera image 214, as shown in FIG. 2, or can be equal to the width of the second camera image 214, depending on the requirements of a particular application.
- likewise, a height of the subsection 215 can be a fraction of a height of the second camera image 214 or, as shown in FIG. 2, can be equal to the height of the second camera image 214, depending on the requirements of a particular application.
- the subsection 215 can in this case be determined by means of a device of a control device, such as the generation device of the control device of FIG. 1 or a device upstream of the generation device; a minimal cropping sketch is given below.
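- A minimal cropping sketch for such a subsection, assuming the camera image is a NumPy array and the strip keeps the full image height from the upper to the lower image edge; the column parameters are illustrative and would be chosen per application.

```python
import numpy as np

def crop_subsection(image: np.ndarray, col_start: int, col_width: int) -> np.ndarray:
    """Cut a vertical strip out of a camera image, analogous to subsection 215.

    The strip keeps the full image height and a configurable width; the row and
    column positions stay fixed from one camera image to the next, so both
    subsections refer to the same area of the camera sensor.
    """
    return image[:, col_start:col_start + col_width]
```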
- FIG. 3 shows a partial section of a camera image and image gradient data according to an embodiment of the present invention. Shown are a subsection 215 of a camera image and image gradient data 330.
- the subsection 215 may be the subsection of the second camera image of FIG. 2.
- the subsection 215 in FIG. 3 may be modified relative to the subsection of the second camera image of FIG. 2, for example with respect to image contrast and the like, so that the image gradient data 330 can be generated advantageously.
- the image gradient data 330 can be generated from the subsection 215 by means of a suitable device, such as, for example, the generation device of the control device from FIG. 1.
- image gradient data 330 is shown to the right of subsection 215.
- the image gradient data 330 represents brightness changes of one or more columns of pixels of the sub-portion 215 from an upper to a lower edge of the sub-portion 215.
- the image gradient data 330 are shown in FIG. 3 as a graph of brightness values running vertically adjacent to the subsection 215.
- deflections of the graph to the left and to the right represent brightness changes between pixels of the subsection 215.
- the image gradient data 330, or the graph of brightness values, may be present in the form of an image gradient signal.
- FIG. 4 shows a diagram 400 of a course 410 of a pitch angular velocity ω of a vehicle-mounted camera over time t, and of a course 420 of a pitch angular velocity ω of a camera installed in a vehicle, determined in accordance with exemplary embodiments of the present invention, over time t.
- the course 410, obtained in a conventional manner, is generated from ground-truth data measured by a high-resolution pitch angular velocity sensor or the like.
- the course 420 of the pitch angular velocity ω determined according to exemplary embodiments of the present invention can be determined with the control device from FIG. 1.
- the course 420, determined according to exemplary embodiments of the present invention, follows the course 410, obtained in a conventional way by means of the high-resolution sensor, almost exactly.
- the method 500 has a step of generating 510 first image gradient data from a first camera image and second image gradient data from a second camera image subsequently recorded relative to the first camera image.
- the first image gradient data represent a brightness change of adjacent pixels of the first camera image along a vertical axis of the first camera image.
- the second image gradient data represents a change in brightness of adjacent pixels of the second camera image along a vertical axis of the second camera image.
- the method 500 also includes a step of generating 520 at least one image shift value representing a shift of a pixel of the second camera image relative to a corresponding pixel of the first camera image. The step of generating 520 is carried out using the first image gradient data and the second image gradient data.
- the method 500 also includes a step of determining 530 a pitching motion based on the at least one image shift value to determine the pitching motion of the camera. Steps 510, 520 and 530 of method 500 may be repeatedly performed to continuously determine pitching motion of the camera based on a plurality of first camera images and second camera images.
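- Purely for illustration, the sketches introduced above can be combined into one processing pass per image pair; the helper names are the illustrative ones defined earlier and are not identifiers from the patent.

```python
def determine_pitch_rate(img_t0, img_t1, focal_length_px, delta_t_s,
                         col_start=300, col_width=200):
    """End-to-end sketch of steps 510, 520 and 530 for one pair of camera images."""
    p0 = vertical_gradient_profile(crop_subsection(img_t0, col_start, col_width))  # step 510
    p1 = vertical_gradient_profile(crop_subsection(img_t1, col_start, col_width))  # step 510
    dy = estimate_shift_subpixel(p0, p1)                                           # step 520
    return pitch_angular_velocity(dy, focal_length_px, delta_t_s)                  # step 530
```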
- the method 600 includes a step of determining 610 a pitching motion of a vehicle-mounted camera according to the method for determining a pitching motion of a vehicle-mounted camera according to the embodiment of the present invention illustrated in FIG. 5.
- the step of determining 610 may include sub-steps which correspond to the steps of the method for determining a pitching movement of a camera installed in a vehicle according to the exemplary embodiment of the present invention shown in FIG. 5.
- the method 600 also includes a step of adjusting 620 an illumination angle of the at least one headlight in response to the pitching motion of the camera, in order to control the light emission of the at least one headlight; a minimal compensation sketch is given below.
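- A minimal sketch of such a compensation, assuming the pitch angle determined from the camera is simply subtracted from the nominal beam angle; the sign convention and the 1:1 compensation factor are assumptions for illustration, not values taken from the patent.

```python
def corrected_beam_angle(nominal_angle_deg: float, pitch_angle_deg: float) -> float:
    """Compensate the headlight illumination angle for the determined pitch angle.

    A nose-up pitch would raise the beam and risk dazzling oncoming traffic,
    so the determined pitch angle is subtracted from the nominal beam angle.
    """
    return nominal_angle_deg - pitch_angle_deg
```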
- FIG. 2 shows a typical street scene with a preceding vehicle.
- two such consecutive camera images 212, 214 are considered.
- the image area defined by the box is taken into account, for example the subsection 215.
- a further reduction in redundant information takes place via the one-dimensional gradient in the vertical direction of the respective camera image. It reinforces the horizontal edge information and filters out the vertical information.
- the dimensional reduction takes place via the one-dimensional Radon transformation in the horizontal direction of the subsection of the camera image.
- the result is a one-dimensional or 1 D signal, as shown in FIG. 3 in the form of the image gradient data 330.
- This process is performed for the two consecutive camera images 212, 214 at times (t-1) and (t).
- this shift can be estimated, for example, by the cross-correlation method with sub-pixel (decimal) precision. In this way, the shift value can be generated. The procedure is as follows:
- for the pitch angular velocity, the shift of the image estimated or generated by the cross-correlation method is Δy, the camera focal length is f_y, the pitch angle change is Δφ, and the image time difference between two consecutive images is Δt.
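- As a purely illustrative calculation with assumed values (not taken from the patent): for an estimated image shift Δy = 2 pixels, a focal length f_y = 1000 pixels and an image time difference Δt = 40 ms, the pitch angle change is Δφ = arctan(2 / 1000) ≈ 0.002 rad ≈ 0.11°, and the pitch angular velocity is ω = Δφ / Δt ≈ 0.05 rad/s ≈ 2.9°/s.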
- the proposed method 500 for determining the pitching movement of the camera 110 is applicable to video-based driver assistance functions which use, for example, monocular and stereo vision algorithms.
- the determination of the pitch angular velocity takes place in this case, for example, at very high resolution with high computational efficiency.
- as a result of a pitching movement, the entire camera image 212, 214 is moved vertically up or down. When two consecutive camera images 212, 214 are considered, this appears as a shift between the camera images 212, 214. If, for example, the image shift is estimated with subpixel accuracy, the pitch angle change can be calculated with the same accuracy.
- the redundant information from the camera images 212, 214 is prefiltered, and the two-dimensional shift is thus reduced to a one-dimensional shift of the image in order to allow real-time computation.
- a real-time estimation of the pitch movement or pitch angular velocity of the camera 110 and finally a compensation of this movement are made possible.
- tracking accuracy and estimation of the distance to an object can also be improved.
- the determination of the pitch angle velocity is carried out, for example, on the basis of visual features and without sensor assistance.
- moving-object tracking can be achieved by compensating for the camera angular velocity. By compensating for the camera pitching motion, the object tracking accuracy can be improved.
Abstract
The invention relates to a method (500) for determining a pitching motion of a camera mounted in a vehicle. The method (500) comprises the following steps: generation (510) of first image gradient data from a first camera image, the first image gradient data representing a change in brightness of adjacent pixels of the first camera image along a vertical axis of said first camera image; and generation (510) of second image gradient data from a second camera image captured after said first camera image, the second image gradient data representing a change in brightness of adjacent pixels of the second camera image along a vertical axis of said second camera image. The method (500) also comprises a step of generating (520) at least one image shift value which represents a shift of a pixel of the second camera image relative to a corresponding pixel of the first camera image. This generation (520) is carried out using the first image gradient data and the second image gradient data. The method (500) further comprises a step of determining (530) a pitching motion on the basis of the image shift value or values, in order to determine the pitching motion of the camera.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102011076795A DE102011076795A1 (de) | 2011-05-31 | 2011-05-31 | Verfahren zum Bestimmen einer Nickbewegung einer in einem Fahrzeug verbauten Kamera und Verfahren zur Steuerung einer Lichtaussendung zumindest eines Frontscheinwerfers eines Fahrzeugs |
PCT/EP2012/058454 WO2012163631A1 (fr) | 2011-05-31 | 2012-05-08 | Procédé de détermination d'un mouvement de tangage d'une caméra montée dans un véhicule et procédé de commande d'une émission lumineuse d'au moins un phare avant d'un véhicule |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2715666A1 true EP2715666A1 (fr) | 2014-04-09 |
Family
ID=46044705
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12719384.5A Withdrawn EP2715666A1 (fr) | 2011-05-31 | 2012-05-08 | Procédé de détermination d'un mouvement de tangage d'une caméra montée dans un véhicule et procédé de commande d'une émission lumineuse d'au moins un phare avant d'un véhicule |
Country Status (6)
Country | Link |
---|---|
US (1) | US20140218525A1 (fr) |
EP (1) | EP2715666A1 (fr) |
JP (1) | JP5792378B2 (fr) |
CN (1) | CN103765476A (fr) |
DE (1) | DE102011076795A1 (fr) |
WO (1) | WO2012163631A1 (fr) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9619894B2 (en) * | 2014-05-16 | 2017-04-11 | GM Global Technology Operations LLC | System and method for estimating vehicle dynamics using feature points in images from multiple cameras |
CN105730321B (zh) * | 2016-02-15 | 2018-03-06 | 长沙科讯智能科技有限公司 | 一种车辆远光灯智能控制系统及控制方法 |
US9535423B1 (en) * | 2016-03-29 | 2017-01-03 | Adasworks Kft. | Autonomous vehicle with improved visual detection ability |
WO2018110389A1 (fr) * | 2016-12-15 | 2018-06-21 | 株式会社小糸製作所 | Système d'éclairage de véhicule et véhicule |
JP6901386B2 (ja) | 2017-12-08 | 2021-07-14 | 株式会社東芝 | 勾配推定装置、勾配推定方法、プログラムおよび制御システム |
US10997737B2 (en) * | 2019-05-02 | 2021-05-04 | GM Global Technology Operations LLC | Method and system for aligning image data from a vehicle camera |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06202217A (ja) * | 1992-12-28 | 1994-07-22 | Canon Inc | 手振れ表示機能付カメラ |
JP2000194864A (ja) * | 1998-12-28 | 2000-07-14 | Jeol Ltd | パタ―ン欠陥分類装置 |
JP2002214659A (ja) * | 2001-01-16 | 2002-07-31 | Canon Inc | ぶれ補正機能付きカメラ |
US7253835B2 (en) * | 2002-12-23 | 2007-08-07 | Hrl Laboratories, Llc | Method and apparatus for estimating a camera reference horizon |
JP2006041604A (ja) * | 2004-07-22 | 2006-02-09 | Seiko Epson Corp | 画像処理装置、画像処理方法、および画像処理プログラム |
FR2874300B1 (fr) * | 2004-08-11 | 2006-11-24 | Renault Sas | Procede de calibration automatique d'un systeme de stereovision |
US7443443B2 (en) * | 2005-07-28 | 2008-10-28 | Mitsubishi Electric Research Laboratories, Inc. | Method and apparatus for enhancing flash and ambient images |
JP4697101B2 (ja) | 2006-09-07 | 2011-06-08 | 株式会社デンソー | 車両検出装置、およびライト制御装置 |
JP2008151659A (ja) * | 2006-12-18 | 2008-07-03 | Fuji Heavy Ind Ltd | 物体検出装置 |
US8452123B2 (en) * | 2008-01-18 | 2013-05-28 | California Institute Of Technology | Distortion calibration for optical sensors |
DE102008025948A1 (de) * | 2008-05-30 | 2009-12-03 | Hella Kgaa Hueck & Co. | Verfahren und Vorrichtung zum Steuern der Lichtabgabe minestens eines Frontscheinwerfers eines Fahrzeugs |
JP2010006270A (ja) * | 2008-06-27 | 2010-01-14 | Toyota Motor Corp | 車両挙動検出装置 |
JP5015097B2 (ja) * | 2008-08-29 | 2012-08-29 | シャープ株式会社 | 画像処理装置、画像処理プログラム、コンピュータ読み取り可能な記録媒体、電子機器及び画像処理方法 |
JP2010143506A (ja) * | 2008-12-22 | 2010-07-01 | Koito Mfg Co Ltd | 車両用ランプのオートレベリングシステム |
DE102008063328A1 (de) * | 2008-12-30 | 2010-07-01 | Hella Kgaa Hueck & Co. | Verfahren und Vorrichtung zum Ermitteln einer Änderung des Nickwinkels einer Kamera eines Fahrzeugs |
JP5251544B2 (ja) * | 2009-01-27 | 2013-07-31 | 日本電気株式会社 | 画像処理装置、画像処理方法および画像処理プログラム |
US8259174B2 (en) * | 2009-02-06 | 2012-09-04 | GM Global Technology Operations LLC | Camera auto-calibration by horizon estimation |
JP2011084150A (ja) * | 2009-10-15 | 2011-04-28 | Koito Mfg Co Ltd | 車両用ヘッドランプのオートレベリング装置 |
JP2011098578A (ja) * | 2009-11-04 | 2011-05-19 | Koito Mfg Co Ltd | 二輪車用ヘッドランプシステム |
-
2011
- 2011-05-31 DE DE102011076795A patent/DE102011076795A1/de not_active Ceased
-
2012
- 2012-05-08 US US14/122,183 patent/US20140218525A1/en not_active Abandoned
- 2012-05-08 EP EP12719384.5A patent/EP2715666A1/fr not_active Withdrawn
- 2012-05-08 JP JP2014511806A patent/JP5792378B2/ja not_active Expired - Fee Related
- 2012-05-08 WO PCT/EP2012/058454 patent/WO2012163631A1/fr active Application Filing
- 2012-05-08 CN CN201280026814.1A patent/CN103765476A/zh active Pending
Non-Patent Citations (1)
Title |
---|
See references of WO2012163631A1 * |
Also Published As
Publication number | Publication date |
---|---|
US20140218525A1 (en) | 2014-08-07 |
CN103765476A (zh) | 2014-04-30 |
JP5792378B2 (ja) | 2015-10-14 |
JP2014522589A (ja) | 2014-09-04 |
WO2012163631A1 (fr) | 2012-12-06 |
DE102011076795A1 (de) | 2012-09-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102012104766B4 (de) | Spurermittlung mittels Spurmarkierungsidentifikation für Spurzentrierung/-haltung | |
DE102009005505B4 (de) | Verfahren und Vorrichtung zur Erzeugung eines Abbildes der Umgebung eines Kraftfahrzeugs | |
EP2888604B1 (fr) | Procédé de détermination du tracé de la voie d'un véhicule | |
DE102009044269B4 (de) | Lenkassistentenvorrichtung | |
EP2504209B1 (fr) | Procédé d'estimation de l'angle de roulis dans une voiture | |
DE102013221696A1 (de) | Verfahren und Vorrichtung zum Ermitteln eines Höhenverlaufes einer vor einem Fahrzeug liegenden Straße | |
DE102009005184A1 (de) | Fahrzeugabtastsystem | |
EP2795537A1 (fr) | Détermination d'un profil de hauteur d'un environnement de véhicule au moyen d'un appareil de prise de vues 3d | |
DE102013205882A1 (de) | Verfahren und Vorrichtung zum Führen eines Fahrzeugs im Umfeld eines Objekts | |
DE102011117554A1 (de) | Fahrumgebungserfassungsvorrichtung | |
DE102016212326A1 (de) | Verfahren zur Verarbeitung von Sensordaten für eine Position und/oder Orientierung eines Fahrzeugs | |
WO2012163631A1 (fr) | Procédé de détermination d'un mouvement de tangage d'une caméra montée dans un véhicule et procédé de commande d'une émission lumineuse d'au moins un phare avant d'un véhicule | |
WO2011138164A1 (fr) | Procédé de fonctionnement d'un système d'assistance au conducteur d'un véhicule, système d'assistance au conducteur et véhicule | |
DE102012206211A1 (de) | Verfahren und Vorrichtung zum Bestimmen eines Spuranpassungsparameters für ein Spurhaltesystem eines Fahrzeugs sowie Verfahren und Vorrichtung zur Spurführung eines Fahrzeugs | |
DE112018005415T5 (de) | Steuergerät und Steuerungsverfahren zum Steuern des Verhaltens eines Motorrades | |
DE102015116542A1 (de) | Verfahren zum Bestimmen einer Parkfläche zum Parken eines Kraftfahrzeugs, Fahrerassistenzsystem sowie Kraftfahrzeug | |
EP2131598A2 (fr) | Système de caméra stéréo et procédé de détermination d'au moins une erreur de calibrage d'un système de caméra stéréo | |
DE102013103952B4 (de) | Spurerkennung bei voller Fahrt mit einem Rundumsichtsystem | |
DE102013103953B4 (de) | Spurerkennung bei voller Fahrt unter Verwendung mehrerer Kameras | |
WO2018023143A1 (fr) | Procédé et dispositif de mesure d'une distance entre un premier véhicule et un deuxième véhicule circulant immédiatement devant le premier véhicule | |
DE102008025773A1 (de) | Verfahren zur Schätzung eines Orts- und Bewegungszustands eines beobachteten Objekts | |
WO2009077445A1 (fr) | Procédé et dispositif de détection optique de l'environnement d'un véhicule | |
EP3677019B1 (fr) | Procédé et dispositif de commande prévisible d'exposition d'au moins une première caméra de véhicule | |
EP3655299B1 (fr) | Procédé et dispositif de détermination d'un flux optique à l'aide d'une séquence d'images enregistrée par une caméra d'un véhicule | |
EP1588910B1 (fr) | Procédé pour l'analyse et le réglage de la dynamique de marche d'un véhicule automobile et véhicule pour la mise en oeuvre de ce procédé |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20140102 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAX | Request for extension of the european patent (deleted) | ||
17Q | First examination report despatched |
Effective date: 20160412 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN |
|
18W | Application withdrawn |
Effective date: 20160803 |