WO2009077445A1 - Method and device for optically detecting the surroundings of a vehicle - Google Patents

Method and device for optically detecting the surroundings of a vehicle

Info

Publication number
WO2009077445A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
area
cameras
camera
camera detection
Prior art date
Application number
PCT/EP2008/067397
Other languages
German (de)
English (en)
Inventor
Mattias Strauss
Original Assignee
Continental Teves AG & Co. OHG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Continental Teves AG & Co. OHG
Publication of WO2009077445A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G06T7/593 Depth or shape recovery from multiple images from stereo images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/243 Image signal generators using stereoscopic image cameras using three or more 2D image sensors

Definitions

  • The invention relates to a method and a device for optically detecting the surroundings of a vehicle. It further relates to a vehicle, in particular a motor vehicle, comprising such a device.
  • Modern motor vehicles often have environment sensors that can be used to detect objects located in the surroundings of the vehicle.
  • The environment sensors may, for example, be part of safety systems of a vehicle, which use them to detect objects or obstacles in the vehicle surroundings with which the vehicle could collide.
  • Such safety systems carry out measures for avoiding a collision and/or for mitigating its consequences.
  • Cameras can be used as environment sensors; their images allow detected objects in the surroundings of the vehicle to be recognized and classified.
  • With stereo cameras it is also possible to determine the spatial position of objects relative to the vehicle.
  • However, the acquisition and evaluation of image data covering a large surrounding area is usually expensive and resource-intensive. This is especially true if a stereoscopic evaluation of the image data is provided.
  • The invention provides a method for optically detecting the surroundings of a vehicle, in which an area of the vehicle surroundings to be detected optically is recorded with a camera.
  • The method is characterized in that the surrounding area to be detected optically is recorded by means of more than one camera, each having its own camera detection area. An overlapping area is formed by overlapping partial areas of several camera detection areas; this overlapping area is evaluated at least partially by means of a stereoscopic evaluation method, while the remaining partial areas of the camera detection areas are evaluated by means of a further evaluation method.
  • The invention also provides an apparatus for optically detecting the surroundings of a vehicle, with which an area of the vehicle surroundings to be detected optically can be recorded with a camera.
  • The device comprises a plurality of cameras, each having a camera detection area, the cameras being arranged such that an overlapping area is formed by overlapping partial areas of a plurality of camera detection areas.
  • The device further has an evaluation device adapted to evaluate the overlapping area at least partially by means of a stereoscopic evaluation method and the remaining parts of the camera detection areas by means of a further evaluation method.
  • Since the surrounding area within the overlapping area is recorded with a plurality of cameras, stereo images of the overlapping area can advantageously be captured, so that a three-dimensional model of the vehicle surroundings can be generated by means of a stereoscopic evaluation method.
  • This makes it possible to determine the relative position of detected objects with respect to the vehicle, in particular the distance between the vehicle and the objects, with high accuracy.
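  • As an illustration only (not part of the patent text), a minimal Python/OpenCV sketch of such a stereoscopic evaluation is given below; it recovers a per-pixel depth map from a rectified stereo pair via the relation Z = f · B / d. The focal length and camera baseline are assumed example values, and semi-global block matching is one possible stereo algorithm among many.

```python
# Illustrative sketch, not the patented method: depth from a rectified stereo pair.
import cv2
import numpy as np

FOCAL_LENGTH_PX = 800.0  # assumed focal length in pixels
BASELINE_M = 0.30        # assumed distance between two cameras in metres

def depth_map(left_gray: np.ndarray, right_gray: np.ndarray) -> np.ndarray:
    """Compute a per-pixel depth map in metres from a rectified stereo pair."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=9)
    # StereoSGBM returns fixed-point disparities scaled by 16.
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan        # mark unmatched pixels as invalid
    return FOCAL_LENGTH_PX * BASELINE_M / disparity  # Z = f * B / d
```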
  • The overlapping area forms only a partial area of the entire optically detected surrounding area.
  • In the remaining partial areas, the image evaluation is carried out on the basis of a further evaluation method. This can be carried out in a simpler and, in particular, more resource-saving manner than the stereoscopic evaluation method, in which several individual images of the same area are taken into account. As a result, a large surrounding area of the vehicle can be recorded and evaluated very efficiently.
  • In one embodiment, three cameras are provided, the overlapping area being formed by an overlap of the camera detection areas of the outer cameras with the camera detection area of a middle camera and/or by an overlap of the camera detection areas of the outer cameras with one another.
  • An advantage of a three-camera configuration is that two cameras can cover a large part of the surrounding area, while a suitable arrangement and orientation of the third camera can be used to selectively create an overlapping area without having to limit the total detected surrounding area.
  • A further embodiment of the method and the device provides that the partial areas in which the camera detection areas do not overlap adjoin the overlapping area laterally and/or to the front with respect to the vehicle longitudinal direction.
  • In this way, the optically detected surrounding area beyond the overlapping area comprises further particularly important regions of the vehicle surroundings, in particular regions laterally of and/or in front of the overlapping area.
  • In these regions, objects may be present that constitute a potential source of danger to the vehicle when it moves forward in the vehicle longitudinal direction.
  • A further development of the method and the device provides that the remaining subareas are evaluated on the basis of the optical flow.
  • The optical flow comprises the direction and speed of movement of image points associated with detected objects.
  • Objects in the lateral surrounding area of the vehicle often have a particularly high relative speed with respect to the vehicle and can therefore be recognized particularly well by means of the optical flow.
  • Examples of such objects are pedestrians moving near the lane of the vehicle, vehicles approaching from the side, or objects standing at the side of the lane.
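  • As an illustration of such a flow-based evaluation (again an assumption-laden sketch, not taken from the patent), the following snippet flags pixels in a single-camera subarea whose apparent motion between two frames exceeds a threshold; the threshold is an assumed value.

```python
# Illustrative sketch: detecting fast-moving image regions via dense optical flow.
import cv2
import numpy as np

FLOW_THRESHOLD_PX = 4.0  # assumed per-frame displacement threshold in pixels

def moving_object_mask(prev_gray: np.ndarray, curr_gray: np.ndarray) -> np.ndarray:
    """Return a boolean mask of pixels whose apparent motion exceeds the threshold."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    magnitude = np.hypot(flow[..., 0], flow[..., 1])  # per-pixel speed
    return magnitude > FLOW_THRESHOLD_PX
```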
  • An embodiment of the method and the apparatus is characterized in that the part of the overlapping area evaluated by means of the stereoscopic evaluation method is at least partially additionally evaluated by means of the further evaluation method. In this way, a particularly high information density can be generated for those surrounding regions that are evaluated both with the stereoscopic and with the further evaluation method.
  • A further embodiment of the method and the device provides that the part of the overlapping area evaluated by means of the stereoscopic evaluation method is changed dynamically.
  • Likewise, the region of this part that is additionally evaluated by means of the further evaluation method can be changed dynamically.
  • An embodiment of the method and the device is characterized in that the dynamic change is made when the vehicle is cornering and takes place in such a way that the part or region is expanded and/or displaced essentially toward the inside of the curve.
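  • Purely as an illustration of such a dynamic adaptation, the following sketch shifts an image region toward the inside of the curve as a function of the steering angle; the linear gain between steering angle and pixel shift is an assumed value.

```python
# Illustrative sketch: shifting the stereoscopically evaluated region when cornering.
def shifted_stereo_roi(base_roi: tuple[int, int, int, int],
                       steering_angle_deg: float,
                       image_width: int,
                       gain_px_per_deg: float = 10.0) -> tuple[int, int, int, int]:
    """Shift an (x, y, w, h) region horizontally; positive angle means a right turn."""
    x, y, w, h = base_roi
    shift = int(gain_px_per_deg * steering_angle_deg)
    x = max(0, min(image_width - w, x + shift))  # clamp to the image bounds
    return (x, y, w, h)
```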
  • The invention additionally provides a vehicle, in particular a motor vehicle, which comprises a device of the type described above.
  • The cameras of the device are located within the upper third of the windshield area, preferably within the wiped area of the windshield.
  • The cameras are arranged inside the vehicle.
  • Such an arrangement of the cameras ensures that the environment in front of the vehicle can be detected.
  • At the same time, the cameras are largely protected against external influences.
  • An arrangement in the upper windshield area provides a good overview of the optically detectable surrounding area without critically restricting the field of vision of the driver of the vehicle and/or a passenger.
  • A refinement of the invention provides that the vehicle comprises a decalibration safeguard, the cameras being mounted on it in such a way that movements of the cameras relative to one another are minimized.
  • The decalibration safeguard can be, for example, a torsion-resistant carrier on which the cameras are mounted together.
  • FIG. 1 is a schematic block diagram showing components of a system for optically detecting a vehicle environment
  • FIG. 2 shows a schematic illustration of the camera detection areas of the cameras contained in the system in an embodiment
  • FIG. 3 shows a schematic illustration of the camera detection areas of the cameras contained in the system in a further embodiment
  • FIG. 4 shows a schematic illustration of the camera detection areas of the cameras contained in the system in a still further embodiment
  • FIG. 5 shows a schematic illustration of the camera detection areas of the cameras contained in the system, as well as of a partial area of an overlapping area of the camera detection areas for which a stereoscopic image evaluation is performed, and
  • FIG. 6 shows a schematic illustration of the camera detection areas of the cameras contained in the system and of a partial area, adapted for cornering, of an overlapping area of the camera detection areas for which a stereoscopic image evaluation is undertaken.
  • FIG. 1 shows a schematic representation of an optical detection system of a vehicle 201, with which objects in the surroundings of the vehicle 201 can be detected.
  • In the illustrated embodiment, the system comprises three cameras 101, 102, 103, which record images of a surrounding area of the vehicle 201 that is to be detected.
  • The cameras 101, 102, 103 are preferably designed as video cameras which provide video data comprising images of the surrounding area taken continuously in succession.
  • In one embodiment, the cameras are arranged in the interior of the passenger compartment of the vehicle 201.
  • The cameras 101, 102, 103 may be mounted in front of the windshield at an upper cross-connection of the A-pillars.
  • The attachment of the cameras 101, 102, 103 is designed so that a decalibration relative to one another, that is, a change in the relative position and/or orientation of the cameras 101, 102, 103, is avoided. In one embodiment, the cameras 101, 102, 103 are therefore not attached to the windshield itself, because distortion or temperature-dependent expansion of the pane could decalibrate the cameras 101, 102, 103 relative to one another.
  • Instead, the cameras 101, 102, 103 may be mounted on a special carrier. Likewise, the cameras 101, 102, 103 can be mounted directly on the A-pillar cross-connection, provided this connection is designed so that it does not warp along with the body structure and is insensitive to temperature variations.
  • The video data captured by means of the cameras 101, 102, 103 are fed to an evaluation device 104, which carries out an evaluation of the video data.
  • The evaluation device 104 is preferably part of a control unit of the vehicle 201. In one embodiment, the control unit has a microprocessor for executing programs and a non-volatile memory in which programs executable on the microprocessor and other data can be stored.
  • The evaluation device 104 can be implemented in the form of one or more programs in the control unit.
  • On the basis of the video data, the relative position of the objects with respect to the vehicle 201 and/or the relative speed of the objects with respect to the vehicle 201 is determined.
  • A classification of the detected objects is preferably also carried out on the basis of the video data. This can be done in a manner known per se to the person skilled in the art, for example by means of a neural network, a support vector machine, or a related method, as sketched below.
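  • As one hedged example of such a classification (the patent names method families but no concrete implementation), detected object patches could be described by HOG features and classified with a linear support vector machine; the HOG configuration and class labels are illustrative assumptions.

```python
# Illustrative sketch: classifying object patches with HOG features and a linear SVM.
import cv2
import numpy as np
from sklearn.svm import LinearSVC

hog = cv2.HOGDescriptor()  # default 64x128 HOG configuration

def hog_features(patch: np.ndarray) -> np.ndarray:
    """Resize a patch to the HOG window and return its feature vector."""
    patch = cv2.resize(patch, (64, 128))
    return hog.compute(patch).ravel()

# Hypothetical training on labelled patches (e.g. 0 = vehicle, 1 = pedestrian):
# X = np.stack([hog_features(p) for p in training_patches])
# clf = LinearSVC().fit(X, labels)
# predicted = clf.predict([hog_features(new_patch)])
```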
  • The objects may in particular be other vehicles or obstacles which represent a potential source of danger for the vehicle 201 or should be taken into account in guiding the vehicle.
  • The results of the evaluation of the video data performed in the evaluation device 104 can be provided to further systems of the vehicle 201, which are not shown in the figures.
  • Such a system may, for example, comprise a safety device which, in the event of an imminent collision of the vehicle 201 with an object detected by means of the cameras 101, 102, 103, carries out measures for avoiding the collision or for mitigating its consequences.
  • The cameras 101, 102, 103 are arranged next to each other and preferably capture a surrounding area which is formed in front of the vehicle 201.
  • The middle camera 101 is assigned a central camera detection area 202, the right camera 102 a right camera detection area 203, and the left camera 103 a left camera detection area 204.
  • In a further configuration, the cameras 101, 102, 103 are aligned such that the optical axes of the outer cameras 102, 103 intersect, so that, as shown in FIG. 3, the right camera 102 forms a camera detection area 204' on the left side of the vehicle and the left camera 103 forms a camera detection area 203' on the right side of the vehicle.
  • The evaluation of the video data can be carried out in this case in the same way.
  • Depending on the arrangement, differently shaped surrounding areas in front of the vehicle 201 can be monitored.
  • On the one hand, a rather wide surrounding area with a smaller longitudinal extent can be monitored, whereby in particular the area lateral to the lane of the vehicle 201 can be detected.
  • On the other hand, a narrower but farther-reaching surrounding area can be detected, which is particularly advantageous at high speeds of the vehicle 201.
  • The cameras 101, 102, 103 are aligned and arranged such that a plurality of the camera detection areas 202, 203, 204 at least partially overlap to form an overlapping area 205.
  • For this overlapping area, the video data comprise stereo images, that is, mutually shifted images of the same partial area of the vehicle surroundings.
  • The stereo images enable a stereoscopic image evaluation by means of a stereo algorithm, which is performed in the evaluation device 104.
  • In this way, a three-dimensional environment model can be determined. In particular, the distance of a recognized object from the vehicle 201 can be determined.
  • From the change of this distance over time, the relative speed of the object with respect to the vehicle 201 can also be calculated.
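  • Under the assumption of a fixed frame rate, such a relative speed could be derived from two successive stereo distance measurements as in the following illustrative sketch.

```python
# Illustrative sketch: relative speed from successive stereo distance measurements.
FRAME_INTERVAL_S = 1.0 / 25.0  # assumed camera frame rate of 25 Hz

def relative_speed(dist_prev_m: float, dist_curr_m: float,
                   dt_s: float = FRAME_INTERVAL_S) -> float:
    """Return the relative speed in m/s; negative means the object is approaching."""
    return (dist_curr_m - dist_prev_m) / dt_s
```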
  • For the partial areas 206, 207 of the camera detection areas 202, 203, 204 which are not part of the overlapping area 205, no stereo images can be captured. The evaluation device 104 therefore performs no stereoscopic image evaluation for these subareas 206, 207; instead, the image evaluation in the evaluation device 104 is based on a further evaluation method.
  • Preferably, this evaluation is based on the optical flow, which can be determined in a manner known to the person skilled in the art. The optical flow comprises, in particular, the direction of movement and the speed of the picture elements of an image sequence and allows objects within the images to be identified and their movement to be estimated.
  • The arrangement of the cameras 101, 102, 103 illustrated with reference to FIG. 2 thus makes it possible to determine, on the basis of the stereoscopic image evaluation, the relative position of objects located in the overlapping area 205 with respect to the vehicle 201 with high accuracy. Objects located in the lateral subareas 206, 207 of the camera detection areas 203, 204 that do not belong to the overlapping area 205 can also be detected; however, only the optical flow can be determined for these objects, so that their relative position with respect to the vehicle 201 can at most be estimated roughly.
  • The evaluation based on the optical flow is, however, more resource-conserving and, owing to the lateral position of the relevant subareas 206, 207 of the camera detection areas 203, 204, permits very early detection of objects which move toward the vehicle 201 from outside the overlapping area 205.
  • The subareas 206, 207 of the outer camera detection areas 203, 204 that do not belong to the overlapping area 205 thus represent, as it were, an extension of the overlapping area in which a limited evaluation can be performed.
  • The overlapping area 205 can be composed of a plurality of partial areas (not explicitly numbered for the sake of clarity).
  • In the configuration shown in FIG. 2, for example, a right partial area results from an overlap of the central camera detection area 202 and the right camera detection area 203, and a left partial area results from an overlap of the central camera detection area 202 and the left camera detection area 204.
  • In addition, a right subarea 206 and a left subarea 207 are formed, in which the vehicle surroundings can be detected only by means of one of the outer cameras 102, 103, so that no stereo images can be captured in these subareas 206, 207.
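  • To make this geometry concrete, the following sketch (with assumed camera positions and opening angles, not taken from the patent) counts how many camera detection areas contain a given ground-plane point; points seen by at least two cameras can be evaluated stereoscopically, points seen by only one camera only via the optical flow.

```python
# Illustrative sketch: stereo vs. flow-only coverage from assumed camera geometry.
import math

# Per camera: (lateral offset in m, yaw in rad, half opening angle in rad).
CAMERAS = [(-0.3, 0.0, math.radians(25)),  # left camera 103
           ( 0.0, 0.0, math.radians(25)),  # middle camera 101
           ( 0.3, 0.0, math.radians(25))]  # right camera 102

def cameras_seeing(px: float, py: float) -> int:
    """Count the camera detection areas containing the point (px lateral, py forward)."""
    count = 0
    for cx, yaw, half_angle in CAMERAS:
        bearing = math.atan2(px - cx, py)  # angle off the forward axis
        if py > 0 and abs(bearing - yaw) <= half_angle:
            count += 1
    return count

# cameras_seeing(x, y) >= 2 -> stereo evaluation; == 1 -> optical flow only.
```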
  • FIGS. 2 and 3 correspond to arrangements in which the overlapping area 205 is laterally extended by the subareas 206, 207. This allows particularly early detection of objects which approach the vehicle 201 from the side. By evaluating the video data of the subareas 206, 207, possible collisions with such objects can in particular be detected early. The evaluation based on the optical flow makes it easy to detect objects in the subareas 206, 207 which have a high relative speed with respect to the vehicle 201. Examples of such objects are pedestrians, vehicles approaching from the side, or objects standing at the edge of the driving corridor.
  • FIG. 4 shows a further configuration in which the outer cameras 102, 103 have a greater range than the middle camera 101 and their camera detection areas 203, 204 overlap to a greater extent than in the configurations described above.
  • The overlapping area 205 here comprises a first overlapping portion 401, which consists of a right partial area resulting from an overlap of the central camera detection area 202 and the right camera detection area 203, and a left partial area resulting from an overlap of the central camera detection area 202 and the left camera detection area 204.
  • At a greater distance from the vehicle 201, a further overlapping portion 402 adjoins the overlapping portion 401.
  • The overlapping portion 402 results from the overlap of the right camera detection area 203 and the left camera detection area 204 and has a smaller transverse extent than the overlapping portion 401.
  • The configuration shown in FIG. 4 has the particular advantage that stereo images of a particularly large surrounding area are captured, i.e. the camera detection areas 202, 203, 204 are used particularly efficiently for acquiring stereo images.
  • In one embodiment, the overlapping area 205 is evaluated both on the basis of a stereo algorithm and additionally on the basis of the optical flow. As a result, a particularly high information density is available for the overlapping area 205.
  • The subareas 206, 207 of the camera detection areas 203, 204 which are not part of the overlapping area 205 are evaluated only on the basis of the optical flow, since no stereo data are available for these areas.
  • Here, the evaluation device 104 can first merge the determined data and then extract the relevant objects from the information thus obtained. Alternatively, it is also possible to first detect the objects separately on the basis of the stereo data and on the basis of the data obtained from the optical flow, and then to merge the object data determined in this way (see the sketch below).
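  • A minimal sketch of the second strategy, object-level fusion, is given below; the detections are reduced to (x, y) positions and the association gate is an assumed value.

```python
# Illustrative sketch: merging object lists from stereo and optical-flow evaluation.
import math

GATE_M = 1.5  # assumed maximum distance for two detections to count as one object

def merge_object_lists(stereo_objects: list[tuple[float, float]],
                       flow_objects: list[tuple[float, float]]) -> list[tuple[float, float]]:
    """Merge (x, y) detections; flow objects near a stereo object are absorbed."""
    merged = list(stereo_objects)
    for fx, fy in flow_objects:
        if all(math.hypot(fx - sx, fy - sy) > GATE_M for sx, sy in merged):
            merged.append((fx, fy))
    return merged
```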
  • Another embodiment differs from the aforementioned embodiment in that the overlapping area 205 is evaluated only on the basis of the stereo algorithm and not on the basis of the optical flow. In this case, not all information that could in principle be obtained for the overlapping area 205 is available. However, since optical flow data generally arise wherever something moves in the recorded image, and relevant objects in the central camera detection area 202 hardly move relative to the vehicle during normal driving, the evaluation based on the stereo algorithm usually suffices to evaluate the overlapping area 205 with sufficient accuracy.
  • The evaluation of the video data by means of the stereo algorithm can also be limited to a partial area 501 of the entire overlapping area of the camera detection areas 202, 203, 204.
  • The partial area 501 can extend over a smaller angular range than the entire overlapping area 205.
  • Within the overlapping area 205, the partial area 501 can in principle be selected freely.
  • For the parts of the overlapping area 205 that are not evaluated stereoscopically, an evaluation based on the optical flow preferably takes place, as it does for the subregions 206, 207 of the outer camera detection areas 203, 204 which are not part of the overlapping area 205.
  • In addition to the stereoscopic evaluation, an evaluation based on the optical flow can be performed within the partial area 501.
  • The parts of the partial area 501 evaluated in this way are likewise basically freely selectable.
  • In this manner, areas can be configured in which an additional evaluation based on the optical flow takes place.
  • These areas are also referred to below as multiple evaluation areas.
  • In one embodiment, a single contiguous multiple evaluation area is provided. Likewise, however, several separate multiple evaluation areas may be configured.
  • In one embodiment, the partial area 501 of the overlapping area 205 in which a stereoscopic evaluation of the video data is undertaken, and/or the multiple evaluation areas, can be changed dynamically by the evaluation device 104.
  • The adaptation is made in particular while the vehicle 201 is cornering.
  • In this case, the partial area 501 or a multiple evaluation area is expanded and/or shifted toward the inside of the curve.
  • In this way, objects that are positioned relative to the vehicle 201 in such a way that they could be relevant when cornering can be detected better.
  • As shown in FIG. 6, in the case of a right turn of the vehicle 201, the partial area 501 or a multiple evaluation area is shifted to the right, so that a partial area 601 is formed in which the stereoscopic image evaluation is performed.
  • Alternatively or additionally, an extension to the right can be provided.
  • In the case of a left turn, a shift or extension to the left occurs accordingly.
  • Multiple evaluation areas can be adjusted in an analogous way.
  • Cornering of the vehicle 201 can be determined by the evaluation device 104, for example on the basis of signals of a steering angle sensor, with which the wheel steering angle of the steerable vehicle wheels can be determined, or by means of a position determination device, with which the position and direction of movement of the vehicle 201 can be determined.
  • The latter can be a satellite-supported position determination device, which can be part of a navigation system of the vehicle.
  • In one embodiment, the extent of the expansion and/or displacement is determined by the evaluation device 104 as a function of the wheel steering angle of the steerable wheels of the vehicle 201 or as a function of the curve radius.
  • The wheel steering angle is determined by means of the steering angle sensor.
  • The curve radius can be determined by the evaluation device 104 on the basis of the wheel steering angle or with the aid of the position determination device.
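  • One common way to derive the curve radius from the wheel steering angle, offered here only as an illustrative assumption, is the single-track (bicycle) model with an assumed wheelbase.

```python
# Illustrative sketch: curve radius from the wheel steering angle (single-track model).
import math

WHEELBASE_M = 2.7  # assumed wheelbase of vehicle 201

def curve_radius(wheel_angle_rad: float) -> float:
    """Approximate curve radius in metres; returns infinity for straight driving."""
    if abs(wheel_angle_rad) < 1e-6:
        return math.inf
    return WHEELBASE_M / math.tan(abs(wheel_angle_rad))
```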
  • In a further embodiment, the vehicle 201 has, in addition to the cameras 101, 102, 103, one or more beam sensors for detecting objects in the vehicle surroundings. These are sensors which emit radiation into a surrounding area; the radiation is partially reflected by objects located in that area, and the reflected radiation is detected by the beam sensor. From the detected radiation, the relative position of an object with respect to the vehicle 201, and in particular the distance of the object from the vehicle 201, can be determined. Furthermore, the relative speed of the object with respect to the vehicle can be determined. Examples of such beam sensors are radar and lidar sensors.
  • In this embodiment, the data obtained by the evaluation device 104 by means of the cameras 101, 102, 103 about objects located in the surroundings of the vehicle 201 can be combined with data obtained by means of the beam sensors.
  • In particular, the distance between the vehicle 201 and those objects can be determined which are recognized in the video data only by an evaluation of the optical flow.
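  • Illustratively, an object detected by the cameras only via the optical flow (known bearing, poorly known distance) could be assigned a range by associating it with the beam sensor detection of the most similar bearing; the matching tolerance below is an assumed value.

```python
# Illustrative sketch: assigning a radar/lidar range to a flow-only camera detection.
import math

BEARING_TOLERANCE_RAD = math.radians(2.0)  # assumed association tolerance

def range_from_beam_sensor(camera_bearing_rad: float,
                           detections: list[tuple[float, float]]) -> float | None:
    """detections: (bearing in rad, range in m); return the best-matching range."""
    best = None
    for bearing, rng in detections:
        err = abs(bearing - camera_bearing_rad)
        if err <= BEARING_TOLERANCE_RAD and (best is None or err < best[0]):
            best = (err, rng)
    return best[1] if best is not None else None
```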
  • The additional detection of the objects by means of the cameras 101, 102, 103 has the particular advantage that a classification of the detected objects can be made on the basis of the video data.
  • The fusion of the data acquired by means of the cameras 101, 102, 103 on the one hand and the beam sensors on the other hand can be performed both at the measurement data level and at the object level.

Abstract

The invention relates to a method for optically detecting the surroundings of a vehicle (201), in which an area (202, 203, 204) of the vehicle surroundings to be detected optically is recorded with a camera (101, 102, 103). The method is characterized in that the surrounding area to be detected optically is recorded by means of more than one camera (101, 102, 103), each having a camera detection area (202, 203, 204). According to the invention, an overlapping area (205) is formed by the overlapping partial areas of several camera detection areas (202, 203, 204); the overlapping area (205) is evaluated at least partially by means of a stereoscopic evaluation method, and the remaining partial areas of the camera detection areas (202, 203, 204) are evaluated by means of a further evaluation method. The invention also relates to a device suitable for carrying out the method and to a vehicle comprising the device.
PCT/EP2008/067397 2007-12-17 2008-12-12 Method and device for optically detecting the surroundings of a vehicle WO2009077445A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE102007060702 2007-12-17
DE102007060702.6 2007-12-17
DE102008061749A DE102008061749A1 (de) 2007-12-17 2008-12-12 Verfahren und Vorrichtung zum optischen Erfassen einer Fahrzeugumgebung
DE102008061749.0 2008-12-12

Publications (1)

Publication Number Publication Date
WO2009077445A1 (fr)

Family

ID=40512372

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2008/067397 WO2009077445A1 (fr) 2007-12-17 2008-12-12 Procédé et dispositif de détection optique de l'environnement d'un véhicule

Country Status (2)

Country Link
DE (1) DE102008061749A1 (de)
WO (1) WO2009077445A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10209073B2 (en) 2010-09-23 2019-02-19 Continental Teves Ag & Co. Ohg Location-determining device in a motor vehicle and information merging method
DE102011080702B3 (de) 2011-08-09 2012-12-13 3Vi Gmbh Objekterfassungsvorrichtung für ein Fahrzeug, Fahrzeug mit einer derartigen Objekterfassungsvorrichtung
DE102011116169A1 (de) * 2011-10-14 2013-04-18 Continental Teves Ag & Co. Ohg Vorrichtung zur Unterstützung eines Fahrers beim Fahren eines Fahrzeugs oder zum autonomen Fahren eines Fahrzeugs
DE102012201441A1 (de) * 2012-02-01 2013-08-01 Rheinmetall Defence Electronics Gmbh Verfahren und Vorrichtung zum Führen eines Fahrzeugs
DE102012215026A1 (de) 2012-08-23 2014-05-28 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Vorrichtung zum Betreiben eines Fahrzeugs
DE102014220558A1 (de) * 2014-10-10 2016-04-14 Conti Temic Microelectronic Gmbh Bilderfassungsvorrichtung für ein fahrzeug und verfahren
DE102016220079B4 (de) 2016-10-14 2023-04-06 Audi Ag Verfahren zur Ermittlung von Entfernungsdaten
DE102021209840A1 (de) 2021-09-07 2023-03-09 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren und Vorrichtung zum Betreiben eines Zugfahrzeugs mit einem Anhänger

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5530420A (en) * 1993-12-27 1996-06-25 Fuji Jukogyo Kabushiki Kaisha Running guide apparatus for vehicle capable of keeping safety at passing through narrow path and the method thereof
WO2003002375A1 (fr) * 2001-06-28 2003-01-09 Robert Bosch Gmbh Dispositif pour la detection par imagerie d'objets, de personnes ou autres a la peripherie d'un vehicule
WO2006021171A1 (fr) * 2004-08-21 2006-03-02 Adc Automotive Distance Control Systems Gmbh Systeme optique d'identification d'objets

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PELLKOFER M ET AL: "EMS-vision: gaze control in autonomous vehicles", INTELLIGENT VEHICLES SYMPOSIUM, 2000. IV 2000. PROCEEDINGS OF THE IEEE, DEARBORN, MI, USA, 3-5 OCT. 2000, PISCATAWAY, NJ, USA, IEEE, US, 3 October 2000 (2000-10-03), pages 296-301, XP010528953, ISBN: 978-0-7803-6363-2 *
WILLIAMSON T ET AL: "A trinocular stereo system for highway obstacle detection", ROBOTICS AND AUTOMATION, 1999. PROCEEDINGS. 1999 IEEE INTERNATIONAL CONFERENCE ON, DETROIT, MI, USA, 10-15 MAY 1999, PISCATAWAY, NJ, USA, IEEE, US, vol. 3, 10 May 1999 (1999-05-10), pages 2267-2273, XP010336578, ISBN: 978-0-7803-5180-6, Retrieved from the Internet <URL:http://ieeexplore.ieee.org/iel5/6243/16696/00770443.pdf?tp=&arnumber=770443&isnumber=16696> *

Also Published As

Publication number Publication date
DE102008061749A1 (de) 2009-06-25

Similar Documents

Publication Publication Date Title
EP1928687B1 Procédé et système d'aide à la conduite pour la commande de démarrage d'un véhicule automobile basée sur un capteur
EP3084466B1 Procédé de détection d'un marquage placé sur une route, dispositif d'assistance au conducteur et véhicule automobile
WO2009077445A1 Procédé et dispositif de détection optique de l'environnement d'un véhicule
DE102011014699B4 Verfahren zum Betrieb eines Fahrerassistenzsystems zum Schutz eines Kraftfahrzeuges vor Beschädigungen und Kraftfahrzeug
DE102013205882A1 Verfahren und Vorrichtung zum Führen eines Fahrzeugs im Umfeld eines Objekts
EP3437929A1 Système de vision à champs de vision / effet d'incrustation de la zone de vision en fonction de la situation de conduite
DE102011014081A1 Verfahren zum Erkennen eines Abbiegemanövers
EP3665502B1 Procédé de surveillance d'une zone environnante d'un véhicule automobile, appareil de commande de capteurs, système d'aide à la conduite ainsi que véhicule automobile
EP3044727B1 Procédé et dispositif de détection d'objets d'après des données d'image ayant une résolution de profondeur
EP2982572B1 Procédé d'assistance d'un conducteur de véhicule automobile lors du stationnement, système d'assistance de conducteur et véhicule automobile
DE102014224762B4 Verfahren und Vorrichtung zur Informationsgewinnung über ein Objekt in einem nicht einsehbaren, vorausliegenden Umfeldbereich eines Kraftfahrzeugs
DE102009032024A1 Verfahren zum Bestimmen einer Position eines an einem Fahrzeug angehängten Anhängers relativ zum Fahrzeug sowie Fahrerassistenzsystem für ein Fahrzeug
DE102017207960A1 Verfahren und Vorrichtung zur ortsaufgelösten Detektion von einem fahrzeugexternen Objekt mithilfe eines in einem Fahrzeug verbauten Sensors
DE102015116542A1 Verfahren zum Bestimmen einer Parkfläche zum Parken eines Kraftfahrzeugs, Fahrerassistenzsystem sowie Kraftfahrzeug
WO2012163631A1 Procédé de détermination d'un mouvement de tangage d'une caméra montée dans un véhicule et procédé de commande d'une émission lumineuse d'au moins un phare avant d'un véhicule
EP2131598A2 Système de caméra stéréo et procédé de détermination d'au moins une erreur de calibrage d'un système de caméra stéréo
DE102006037600B4 Verfahren zur auflösungsabhängigen Darstellung der Umgebung eines Kraftfahrzeugs
DE102006044864A1 Verfahren zur rechnergestützten Bildverarbeitung in einem Nachtsichtsystem eines Verkehrsmittels
DE102012220191A1 Verfahren zur Unterstützung eines Fahrers bei der Querführung eines Fahrzeugs
EP3032517B1 Procédé et dispositif d'assistance à un conducteur d'un véhicule automobile, en particulier un véhicule utilitaire
EP3048557B1 Procédé de détermination d'une position d'une caractéristique de véhicule
DE102019102672A1 Intersensorisches Lernen
WO2014202496A1 Procédé d'affichage d'informations relatives à l'environnement dans un véhicule
DE102005024052B4 Verfahren und Vorrichtung zum gesteuerten Auswählen von vorausschauenden Sensoren für ein Fußgängerschutzsystem eines Kraftfahrzeugs
EP3428682B1 Procédé de détection d'un obstacle dans une zone environnante d'un véhicule automobile, dispositif d'évaluation, système d'aide à la conduite et véhicule automobile

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08862182

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 08862182

Country of ref document: EP

Kind code of ref document: A1