WO2010103061A1 - Device and method for detecting at least one object - Google Patents

Device and method for detecting at least one object

Info

Publication number
WO2010103061A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
stereo camera
camera
aid
evaluation unit
Prior art date
Application number
PCT/EP2010/053074
Other languages
German (de)
English (en)
Inventor
Matthias Hoffmeier
Original Assignee
Hella Kgaa Hueck & Co.
Priority date
Filing date
Publication date
Application filed by Hella Kgaa Hueck & Co.
Publication of WO2010103061A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261Obstacle

Definitions

  • the invention relates to a device for detecting at least one object.
  • the device comprises a camera comprising a first image detection sensor, a stereo camera comprising a second and a third image detection sensor, and an evaluation unit.
  • the evaluation unit detects in an image acquired with the aid of the camera the image of the object in this image and determines the position of the image of the object in the image.
  • the invention relates to a method for detecting at least one object.
  • the device and the method are used in particular in driver assistance systems.
  • In driver assistance systems, stereo cameras or mono cameras are used to detect objects; with their aid, images of a detection area are acquired. In these images, the images of objects are detected, and the detected objects are classified into object classes.
  • For this purpose, preset detection algorithms are processed. In order to achieve a sufficiently high detection and classification accuracy, detection algorithms of high quality must be used. Such high-quality detection algorithms require relatively much time and much storage space to detect and classify the objects. Using lower-accuracy detection algorithms reduces both the time and the storage space required. However, there is then a risk that objects are not detected or are classified incorrectly, so that the driver assistance system may provide incorrect information to the driver or to other units controlled by the driver assistance system. Road safety can thereby be endangered.
  • a device for a motor vehicle with a stereo camera system having a first camera and a second camera is known from the document DE 10 2004 061 998 A1, wherein the first camera and the second camera differ in at least one camera property.
  • the photosensitivity of the first camera and the photosensitivity of the second camera are different.
  • Also known is a stereo camera having a single objective lens, a single imaging lens and a twin prism between these two lenses.
  • the imaging lens is a compound lens.
  • The document DE 44 10 358 A1 discloses a stereo camera with electromechanically actuated shutters and electromechanically actuated diaphragms. A single control electronics with two outputs is provided: both shutter stepping motors are connected to one output, and the diaphragm stepping motors are connected to the other.
  • In this way, units of the vehicle to be activated can be controlled more quickly.
  • The determination of the image area of the image pair captured with the aid of the stereo camera is carried out in particular such that that image area is selected in which, owing to the position of the image of the object in the image captured with the aid of the camera, the object is imaged with high probability.
  • image data are generated for the images captured by the image acquisition sensors.
  • the image data are processed or further processed, in particular with the aid of the evaluation unit.
  • It is advantageous if the stereo camera is a second stereo camera, and if the camera comprises a fourth image capture sensor and is a first stereo camera.
  • In this way, the distance between the device and an object detected in the image can be determined with the aid of each stereo camera from only a single image pair captured by that stereo camera, as sketched below.
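  • For a rectified stereo image pair, this distance determination follows from triangulation, Z = f · B / d, with focal length f (in pixels), baseline B and disparity d. The following is a minimal sketch; the focal length, baseline and disparity values are invented for the example and are not taken from the patent:

```python
def depth_from_disparity(disparity_px: float,
                         focal_length_px: float,
                         baseline_m: float) -> float:
    """Distance Z of a point from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_length_px * baseline_m / disparity_px

# Assumed values: focal length 1000 px, baseline 0.12 m. A feature whose
# images lie 2 px apart is then 60 m away, roughly the near/far boundary
# mentioned later in the description.
print(depth_from_disparity(2.0, 1000.0, 0.12))  # -> 60.0
```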
  • the evaluation unit determines at least one property of the object, in particular an object class to which the object belongs, based on the image of the object in the defined image area. It is particularly advantageous if the evaluation unit generates an output signal with information about the object as a function of the property of the object, and a control unit is provided for controlling at least one actuator as a function of the output signal of the evaluation unit.
  • the light emission of the headlights, a belt tensioner, a brake assist, a light assistant, a speed assistant, a lane change assistant and / or a headrest can be controlled in a simple manner in a driver assistance system.
  • The output signal, or information corresponding to the output signal, can be output to an operator of the device via a man-machine interface, in particular via a display unit.
  • The evaluation unit processes a first algorithm when detecting the image of the object in the image acquired with the aid of the camera, and a second algorithm when detecting the image of the object in the defined image area of the image pair acquired with the aid of the stereo camera.
  • the second algorithm has a higher detection accuracy than the first algorithm.
  • the first algorithm used is in particular an algorithm which has only a low detection accuracy, but is fast and requires only relatively few resources, in particular relatively little working memory.
  • As the second algorithm, it is preferable to use an algorithm having a high detection accuracy, which requires many resources compared with the first algorithm and is thus relatively slow. In particular, the second algorithm requires more memory than the first algorithm.
  • With the aid of the first algorithm, only a coarse detection of objects takes place in the image acquired with the aid of the camera.
  • With the aid of the second algorithm, the objects are detected in the image area in which the images of the objects detected in the camera image appear with high probability.
  • The classification of the objects takes place, in particular, only on the basis of the images of the objects detected in the defined image area. In this way, a reliable classification of the detected objects is ensured. Since only a relatively small image area of the image pair captured by the stereo camera has to be evaluated with the aid of the second algorithm, the required storage space is considerably reduced compared with evaluating the entire image pair with the second algorithm (see the sketch below).
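  • A minimal sketch of this two-stage strategy; coarse_detect and accurate_detect are hypothetical stand-ins for the first and second algorithms, and the identity mapping between the camera image and the stereo image pair coordinates is a simplifying assumption (a real system would transform the box between the two camera frames):

```python
from typing import Callable, List, Tuple

BBox = Tuple[int, int, int, int]  # x, y, width, height

def detect_two_stage(far_image, near_left, near_right,
                     coarse_detect: Callable, accurate_detect: Callable,
                     margin: int = 32) -> List[BBox]:
    """Run the fast, low-accuracy detector on the whole far-range image,
    then run the slow, high-accuracy detector only inside the image areas
    where those detections are expected to reappear in the near-range pair."""
    results: List[BBox] = []
    for (x, y, w, h) in coarse_detect(far_image):
        # Expand the coarse box into the "defined image area" to be searched.
        x0, y0 = max(0, x - margin), max(0, y - margin)
        x1, y1 = x + w + margin, y + h + margin
        roi_left = near_left[y0:y1, x0:x1]
        roi_right = near_right[y0:y1, x0:x1]
        # Only this small area is evaluated with the expensive algorithm.
        for (rx, ry, rw, rh) in accurate_detect(roi_left, roi_right):
            results.append((x0 + rx, y0 + ry, rw, rh))
    return results
```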
  • It is advantageous if the evaluation unit detects the image of the object in each image of an image sequence acquired with the aid of the camera and determines the position of the image of the object in each of these images, and if the evaluation unit determines an image area of an image pair captured with the aid of the second stereo camera as a function of at least two determined positions of the object in at least two images of the image sequence.
  • The driving movement of the vehicle, in particular cornering, can thereby be determined when the device is used in a driver assistance system.
  • In this way it can be established at an early stage, for example, that an object will not appear in the image pair captured with the aid of the stereo camera because the vehicle is cornering, so that the determination of an image area to be searched can be dispensed with (a sketch follows below).
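  • One plausible reading of this is a constant-velocity extrapolation from two earlier image positions, sketched below; the image dimensions and the size of the search window are assumed values:

```python
from typing import Optional, Tuple

def predict_search_area(p_prev: Tuple[float, float],
                        p_curr: Tuple[float, float],
                        img_w: int = 1024, img_h: int = 512,
                        half: int = 48) -> Optional[Tuple[int, int, int, int]]:
    """Extrapolate the object's image position from two earlier frames and
    return the image area to search, or None if the object is predicted to
    leave the image (e.g. because the vehicle is cornering)."""
    px = 2 * p_curr[0] - p_prev[0]  # constant-velocity prediction
    py = 2 * p_curr[1] - p_prev[1]
    if not (0 <= px < img_w and 0 <= py < img_h):
        return None  # no image area needs to be determined or searched
    return (max(0, int(px) - half), max(0, int(py) - half),
            min(img_w, int(px) + half), min(img_h, int(py) + half))
```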
  • the image acquisition sensors and / or the evaluation unit are mounted on a printed circuit board, in particular a printed circuit board made of epoxy resin.
  • The image acquisition sensors and the evaluation unit are firmly connected to the board, so that the positions of the image acquisition sensors relative to one another are fixed and unchangeable.
  • The image acquisition sensors and the evaluation unit are electrically conductively connected to the circuit board. Owing to the fixed position of the image acquisition sensors relative to one another, the calibration effort is reduced: only a one-time calibration during assembly of the image acquisition sensors is necessary, and a recalibration during operation of the device is not required.
  • the image sensing sensors are preferably semiconductor optical sensors, CCD image sensing sensors and / or CMOS image sensing sensors. Such sensors can be produced easily and inexpensively. Furthermore, the overall manufacturing cost of the device is reduced.
  • It is advantageous if a first and a second objective are arranged in such a way that an image of the detection area of the first stereo camera is imaged and/or focused onto one image capture area of each of the image acquisition sensors of the first stereo camera, and if a third and a fourth objective are arranged such that an image of the detection area of the second stereo camera is imaged and/or focused onto one image capture area of each of the image acquisition sensors of the second stereo camera.
  • The lenses may be fixedly connected to the circuit board and/or arranged in a housing in which the device is received.
  • the first and second lenses are preferably identical.
  • The third and fourth lenses are preferably identical. It is particularly advantageous if the first and the second lens have a smaller opening angle than the third and the fourth lens.
  • The opening angle of the first and second lenses is in particular in the range of 15° to 25°, the opening angle of the third and fourth lenses in the range of 45° to 65°. This ensures that the first stereo camera has a larger focal length than the second stereo camera. With the aid of the first stereo camera a far range is thus detected, while with the aid of the second stereo camera a near range in front of the device is detected.
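  • The connection between opening angle and focal length follows from f = (w / 2) / tan(θ / 2) for a sensor of width w. In the sketch below the sensor width is an assumed value; the angles are taken from the ranges just mentioned:

```python
import math

def focal_length_mm(sensor_width_mm: float, opening_angle_deg: float) -> float:
    """Focal length for a given horizontal opening angle: f = (w/2) / tan(theta/2)."""
    return (sensor_width_mm / 2) / math.tan(math.radians(opening_angle_deg) / 2)

w = 6.0  # assumed sensor width in mm
print(focal_length_mm(w, 20.0))  # first stereo camera:  ~17.0 mm (far range)
print(focal_length_mm(w, 55.0))  # second stereo camera: ~5.8 mm (near range)
```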
  • the image pair of the first stereo camera is recorded in time before the image pair of the second stereo camera.
  • At least one infrared light source, in particular with at least one infrared light-emitting diode, can be provided for illuminating the coverage area of the first stereo camera and/or the coverage area of the second stereo camera. In this way, objects can also be detected and classified at night.
  • the evaluation unit comprises a first processing module, a second processing module and a central module.
  • the first processing module detects in an image captured with the aid of the camera the images of objects in this image and transmits a first object list with information about the detected objects to the central module. The information is transmitted using data.
  • the second processing module detects the images of the objects in an image area of the image pair acquired with the aid of the stereo camera and transmits a second object list with information about the detected objects to the central module.
  • The central module then generates an output signal with output information depending on the information of the first object list and the information of the second object list.
  • The output signal is used in particular for controlling units which are controlled by the driver assistance system. With the help of the information of the first and the second object list, relevant objects, in particular lane and road boundary elements, traffic signs, vehicles and/or persons, can be reliably detected.
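  • A sketch of how the two object lists and the central module's fusion step could be represented; the field names, the overlap test and the merge rule are illustrative assumptions, not details disclosed by the patent:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DetectedObject:
    x: float                             # image position and size
    y: float
    w: float
    h: float
    object_class: Optional[str] = None   # e.g. "traffic_sign", "vehicle"
    distance_m: Optional[float] = None   # from stereo triangulation

def fuse(first_list: List[DetectedObject],
         second_list: List[DetectedObject]) -> List[DetectedObject]:
    """Central-module fusion: prefer entries confirmed by the second,
    high-accuracy list; keep unconfirmed first-list entries as candidates."""
    def overlaps(a: DetectedObject, b: DetectedObject) -> bool:
        return abs(a.x - b.x) < max(a.w, b.w) and abs(a.y - b.y) < max(a.h, b.h)
    fused = list(second_list)
    fused += [o for o in first_list
              if not any(overlaps(o, s) for s in second_list)]
    return fused
```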
  • The invention relates to a method for detecting at least one object, in which an image is captured with the aid of a camera, the image of the object is detected in this image, and the position of the image of the object is determined. Furthermore, a pair of images is captured with the aid of a stereo camera, and an image area of the image pair is determined as a function of the determined position of the object in the image captured with the aid of the camera. The determined image area is then searched for an image of the object.
  • the method specified by the independent method claim can be developed in the same way as the device according to claim 1.
  • The method can be further developed with the features indicated in the dependent claims referring to the device, or with corresponding method features.
  • The figures show: Fig. 1 is a plan view of a schematically illustrated device for detecting at least one object according to a first embodiment of the invention
  • Fig. 2 is a schematic perspective view of an apparatus for detecting at least one object according to a second embodiment of the invention
  • FIG. 3 shows a schematic representation of a vehicle with a device for detecting at least one object according to FIG. 2 in a first traffic situation
  • FIG. 4 shows a schematic illustration of the vehicle according to FIG. 3 in a second traffic situation
  • FIG. 5 is a schematic representation of an apparatus for detecting at least one object according to a third embodiment of the invention.
  • FIG. 1 shows a plan view of a schematic representation of a device 10 for detecting at least one object.
  • the device comprises a circuit board 1 and further components 2 to 7 connected to the circuit board 1.
  • The circuit board 1 in turn comprises a carrier of insulating material with electrically conductive connections which interconnect the electrical connections of the components 2 to 7 connected to the circuit board 1.
  • the electrically conductive connections are preferably designed as printed conductors in the form of a printed circuit.
  • The circuit board 1 is preferably made of epoxy resin with conductor tracks made of copper and is also referred to as a printed circuit board.
  • a first image sensing sensor 4, a second image sensing sensor 5, a third image sensing sensor 2 and a fourth image sensing sensor 3 are arranged on the circuit board 1.
  • the image detection sensors 2 to 5 are electrically connected via electrical connections to the printed conductors, not shown, of the circuit board 1.
  • The image sensing sensors 2 to 5 may be either color image sensing sensors or gray-level image sensing sensors. Preferably, either all image sensing sensors 2 to 5 are color image sensing sensors or all image sensing sensors 2 to 5 are grayscale image sensing sensors.
  • the first image detection sensor 4 and the second image detection sensor 5 constitute a first sensor group, and the third image detection sensor 2 and the fourth image detection sensor 3 form a second sensor group.
  • The image acquisition sensors 2 to 5 of each sensor group are each of the same type. It is advantageous if all four image acquisition sensors 2 to 5 are of the same type.
  • the image sensing sensors 2 to 5 are connected to the board 1 via solder joints in each case.
  • the image detection sensors 2 to 5 may be plugged into a respective base connected to the circuit board 1 by means of soldered connections.
  • the image acquisition sensors 2 to 5 are applied in particular in the form of opto-arrays.
  • the image acquisition sensors 2 to 5 are preferably optical semiconductor sensors, in particular CCD image acquisition sensors and / or CMOS image acquisition sensors.
  • The third image-capturing sensor 2 and the fourth image-capturing sensor 3 are in particular arranged such that the centers of their image-capturing regions lie on a first straight line that runs parallel to one of the long side edges 102 of the circuit board 1.
  • the first image-capturing sensor 4 and the second image-capturing sensor 5 are preferably arranged such that the centers of their image-capturing regions lie on a second straight line which orthogonally intersects the first straight line.
  • the image sensing sensors 2 to 5 are particularly arranged such that the distance of the intersection of the two straight lines to the center of the image sensing area of the first image sensing sensor 4 and the distance of the intersection to the center of the image sensing area of the second image sensing sensor 5 are identical. Also, the distance of the intersection point to the center of the image capturing area of the third image sensing sensor 2 and the distance of the intersection to the center of the image capturing area of the fourth image capturing sensor 3 are identical.
  • The image sensing sensors 2 to 5 are arranged on the board 1 such that, except for small manufacturing tolerances, at least the image sensing areas of the first image sensing sensor 4 and the second image sensing sensor 5 are arranged in a first plane and the image sensing areas of the third image sensing sensor 2 and the fourth image sensing sensor 3 are arranged in a second plane. Preferably, all image-capturing areas of the image-capturing sensors 2 to 5 are arranged in one plane. Alternatively, other arrangements of the image sensing sensors 2 to 5 are possible. In particular, the image sensing sensors 2 to 5 may be arranged such that the centers of their image sensing areas lie on a straight line which runs parallel to the edge 102. Further, the image sensing sensors 2 to 5 may have different sizes.
  • the device 10 comprises an evaluation unit.
  • the components 6, 7 are in particular field programmable gate arrays (FPGAs).
  • the components 6, 7 may alternatively also be arranged on the rear side of the circuit board 1, which is opposite to the front side of the circuit board 1, on which the image detection sensors 2 to 5 are applied.
  • the image data generated by the image acquisition sensors 2 to 5 are transmitted to the evaluation unit via electrical connections provided by the circuit board 1.
  • the processing of the image data in the evaluation unit takes place with the aid of image processing algorithms implemented in the evaluation unit. For this purpose, at least one image processing program can be stored in the evaluation unit.
  • FIG. 2 shows a schematic perspective view of a device 20 for detecting at least one object according to a second embodiment of the invention. Elements with the same structure or the same function have the same reference numerals.
  • the distances 80, 83 between the image detection sensors 2 to 5 are predetermined by the printed circuit of the board 1.
  • the device 20 comprises four objectives 60 to 63.
  • the objectives 60 to 63 are arranged such that the optical axis of the first objective 63 through the center of the image sensing area of the first image sensing sensor 4, the optical axis of the second objective 62 through the center of the image sensing area of the second image sensing sensor 5, the optical axis of the third objective 60 through the center of the image sensing area of the third image sensing sensor 2, and the optical axis of the fourth Objective 61 passes through the center of the image sensing area of the fourth image sensing sensor 3.
  • the objectives 60 to 63 are particularly arranged such that the optical axes of the lenses 60 to 63 are orthogonal to the respective image sensing areas of the respective image sensing sensors 2 to 5.
  • the objectives 60 to 63 may be arranged such that their optical axes intersect the image sensing areas near the respective center points.
  • the lenses 60 to 63 are in particular connected to the board 1, preferably glued to the board 1, bolted to the board 1 or connected via suitable clamping or snap-in connections with the board 1.
  • the lenses 60 to 63 may also be integrated in a housing, not shown.
  • the housing can serve, for example, for permanent protection of the device 20 from environmental influences, in particular against water and / or vandalism.
  • The lenses are advantageously secured to the board 1 by means of holders, two pins of each holder engaging in holes of the board 1.
  • the first image sensing sensor 4, the second image sensing sensor 5, the first lens 63 and the second lens 62 together form a first stereo camera.
  • the third image sensing sensor 2, the fourth image sensing sensor 3, the third lens 60, and the fourth lens 61 form a second stereo camera.
  • a monocamera comprising an image detection sensor and a lens may be provided.
  • the objectives 60 to 63 each have one or more lenses and / or further optical elements and are used in particular for focusing images on the image capture areas of the image acquisition sensors 2 to 5.
  • the lenses 62, 63 of the first stereo camera and the lenses 60, 61 of the second stereo camera are preferably identical in each case.
  • the lenses 62, 63 of the first stereo camera have a larger focal length than the lenses 60, 61 of the second stereo camera.
  • The lenses 62, 63 have an opening angle of 15° to 25° and the lenses 60, 61 an opening angle of 45° to 65°.
  • By this choice of the lenses 60 to 63, it is achieved that images showing a far range are captured with the aid of the first stereo camera and images showing a near range are captured with the aid of the second stereo camera.
  • The area in front of the device 20 up to a distance of about 60 m is referred to as the near range. In the near range, the edge regions of a roadway 16 are detected. Accordingly, the far range refers to distances beyond about 60 m.
  • components 70 of an infrared illuminating device are arranged on the circuit board 1.
  • the components 70 are in particular infrared diodes.
  • the device 20 can be used for image acquisition even in the dark, since the detection areas of the stereo cameras are at least partially illuminated with the aid of the infrared illumination device.
  • the device 20 for detecting at least one object is used in particular in connection with driver assistance systems in vehicles, in particular in road vehicles. It can also be used in conjunction with traffic control systems.
  • the device 20 is then preferably stationary, for example attached to a mast, such as a mast of a beacon.
  • FIG. 3 shows a schematic representation of a vehicle 15 with a device for detecting at least one object according to FIG. 2 in a first traffic situation.
  • The vehicle 15 drives on the roadway 16 in the direction of travel P1.
  • the detection range of the first stereo camera is indicated by the solid lines 17a, 17b and designated by the reference numeral 11.
  • the detection range of the second stereo camera is indicated by the solid lines 18a, 18b and designated by the reference numeral 12.
  • An object 21 arranged next to the roadway 16, in particular a traffic sign, lies at least partially in the detection area 11 of the first stereo camera. The object 21 is so far away from the vehicle 15 that it is not yet in the detection area 12 of the second stereo camera.
  • FIG. 4 shows a schematic illustration of the vehicle 15 according to FIG. 3 in a second traffic situation.
  • The second traffic situation occurred shortly after the first traffic situation shown in FIG. 3.
  • the vehicle 15 has moved toward the object 21 in the direction of travel P1 so far that the object 21 is arranged outside the detection area 11 of the first stereo camera and within the detection area 12 of the second stereo camera.
  • a first image pair with an image of the detection area 11 is first detected at a first time with the aid of the first stereo camera.
  • With the aid of the evaluation unit, an image of the object 21 in the first image pair captured with the aid of the first stereo camera is detected, and the position of the image of the object 21 is determined.
  • the evaluation unit processes a first detection algorithm.
  • At a second time, a second image pair with an image of the detection area 12 of the second stereo camera is captured with the aid of the second stereo camera.
  • an image area of the second image pair is determined.
  • the image area is determined such that the image of the detected object 21 in the second image pair at the second time is shown with a high probability in the defined image area.
  • the evaluation unit processes the defined image area of the second image pair to detect the image of the object 21 in the second image pair. For this purpose, the evaluation unit processes a second detection algorithm.
  • the first and second detection algorithms differ in that the first detection algorithm has a significantly lower detection accuracy than the second detection algorithm.
  • The higher the detection accuracy of a detection algorithm, the more reliably objects 21 to be detected are actually detected.
  • The second detection algorithm requires a much longer time and likewise many more resources, in particular more storage space, for the detection of an object 21 than the first detection algorithm.
  • The first detection algorithm is also called feature stereo, and the second detection algorithm dense stereo.
  • In this way, an object 21 is already roughly detected with the aid of the first image pair recorded by the first stereo camera while the object 21 is still located in the far range.
  • With the first detection algorithm, only little time and relatively little memory are required for detecting the object 21 in the far range. If the object 21 has moved so far toward the device 20, or the device 20 so far toward the object 21, that the object 21 is in the near range and thus in the detection area 12 of the second stereo camera, it is reliably detected in the second image pair with the aid of the second detection algorithm. Since the position of the object 21 in the first image pair has been determined and is thus known, an image area of the second image pair can be defined in which the object 21 is imaged with high probability (compare the sketch below).
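  • The terms feature stereo and dense stereo correspond to well-known technique families. Below is a minimal sketch using OpenCV: ORB keypoint matching for the sparse, fast variant and semi-global block matching for the dense, accurate variant. Rectified grayscale input is assumed, the parameter values are illustrative, and the patent does not disclose concrete implementations:

```python
import cv2
import numpy as np

def feature_stereo(left: np.ndarray, right: np.ndarray):
    """Sparse 'feature stereo': match a few keypoints -- fast, little memory."""
    orb = cv2.ORB_create(nfeatures=500)
    kp_l, des_l = orb.detectAndCompute(left, None)
    kp_r, des_r = orb.detectAndCompute(right, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    return matcher.match(des_l, des_r)  # disparities at keypoints only

def dense_stereo(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Dense stereo: a disparity for every pixel -- slow and memory-hungry,
    but accurate enough for reliable detection and classification."""
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)
    return sgbm.compute(left, right).astype(np.float32) / 16.0  # fixed point -> px
```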
  • At least one property of the object 21 is determined on the basis of the image of the object 21 in the second image pair.
  • An object class is determined to which the object 21 can be assigned. In driver assistance systems it is determined in this way, in particular, whether the object 21 is a traffic sign, a moving self-illuminated object, a static self-illuminated object and/or a person or an animal.
  • the object class is determined in particular by comparison with objects stored in a database. In order to ensure reliable classification of objects 21, detection using only the first detection algorithm with the lower detection accuracy would not be sufficient. Thus, when using only one stereo camera system, the entire image pair would have to be searched using the time- and memory-intensive second detection algorithm.
  • an image sequence of successively recorded images with images of the detection area 11 is detected with the aid of the first stereo camera.
  • In the images of the image sequence, the images of the object 21 are determined with the aid of a tracking function stored in the evaluation unit; the image of the object 21 is thus followed across the images of the image sequence (a sketch follows below).
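  • A minimal sketch of such a tracking function, here a simple nearest-neighbour association between the previously determined position and the detections of the current image; the association threshold is an assumed value:

```python
from typing import List, Optional, Tuple

Point = Tuple[float, float]

def track(prev_position: Point, detections: List[Point],
          max_jump_px: float = 40.0) -> Optional[Point]:
    """Follow an object's image across a sequence: among the current image's
    detections, pick the one closest to the previous position."""
    best: Optional[Point] = None
    best_d = max_jump_px
    for (x, y) in detections:
        d = ((x - prev_position[0]) ** 2 + (y - prev_position[1]) ** 2) ** 0.5
        if d < best_d:
            best, best_d = (x, y), d
    return best  # None if the object was lost
```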
  • FIG. 5 shows a schematic representation of a device 90 for detecting at least one object 21.
  • the image pair acquired with the aid of the first stereo camera is stored in a first memory element 29.
  • The image of the object 21 in the image pair captured with the aid of the first stereo camera is detected, and the position of the image of the object 21 in the image pair is determined.
  • a first object list with information about the objects 21 detected in the first image pair is generated and transmitted to a fusion unit 40.
  • the second image pair acquired with the aid of the second stereo camera system is stored in a second memory element 28.
  • A second object list with information about the objects 21 detected in the second image pair is generated and likewise transmitted to the fusion unit 40.
  • Depending on the information of the first object list and the information of the second object list, the fusion unit 40 generates an output signal, which is transmitted to a human-machine interface 50 and/or a control unit 52.
  • The human-machine interface 50 is a display unit via which information about relevant objects 21 detected and classified by means of the device 90 is output to the operator.
  • the control unit 52 is used to control actuators 54 to 58.
  • the actuators 54 to 58 are in particular a control module for controlling the headlights of a vehicle 15, a brake assist of a vehicle 15, a unit for controlling a drive unit of a headrest of the vehicle 15 and / or a unit for controlling a belt tensioner.
  • The described method and the described apparatus 10, 20, 90 can also be used to acquire, record, evaluate and count static and moving objects, to track and analyze the movement behavior of persons and/or objects, and to measure objects and/or persons.
  • the device 10, 20, 90 can be used for object recognition in motor vehicles, for motor vehicle systems, in road transport, in manufacturing facilities, in event facilities, in long-distance and local traffic, on farms, in public facilities, for controlling mass flows and in industrial processes.
  • The device 10, 20, 90 is also referred to as a combination camera.
  • a microphone may additionally be provided for recording noises.
  • the sounds are stored and / or evaluated together with the associated images.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Measurement Of Optical Distance (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to a device (10, 20, 90) for detecting at least one object (21). The device is equipped with a camera comprising a first image sensor (4), a stereo camera comprising a second image sensor (2) and a third image sensor (3), and an evaluation unit. The evaluation unit detects, in an image captured with the aid of the camera, the image of the object (21) in this image and determines the position of the image of the object (21). The evaluation unit further determines an image area in a pair of images captured with the aid of the stereo camera as a function of the determined position of the object (21) in the image captured with the aid of the camera, and processes the determined image area in order to detect an image of the object (21).
PCT/EP2010/053074 2009-03-12 2010-03-11 Dispositif et procédé de détection d'au moins un objet WO2010103061A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102009012758.5 2009-03-12
DE102009012758A DE102009012758A1 (de) 2009-03-12 2009-03-12 Vorrichtung und Verfahren zur Detektion mindestens eines Objektes

Publications (1)

Publication Number Publication Date
WO2010103061A1 (fr)

Family

ID=42226678

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2010/053074 WO2010103061A1 (fr) 2009-03-12 2010-03-11 Dispositif et procédé de détection d'au moins un objet

Country Status (2)

Country Link
DE (1) DE102009012758A1 (fr)
WO (1) WO2010103061A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011006564A1 (de) 2011-03-31 2012-10-04 Robert Bosch Gmbh Verfahren zur Auswertung eines von einer Kamera eines Fahrzeugs aufgenommenen Bildes und Bildaufbereitungsvorrichtung
DE102011080702B3 (de) * 2011-08-09 2012-12-13 3Vi Gmbh Objekterfassungsvorrichtung für ein Fahrzeug, Fahrzeug mit einer derartigen Objekterfassungsvorrichtung
DE102013102207A1 (de) * 2013-03-06 2014-09-11 Conti Temic Microelectronic Gmbh Kamerasystem für ein Fahrzeug
DE102014206677A1 (de) * 2014-04-07 2015-10-08 Robert Bosch Gmbh Kamerasystem und Verfahren zum Erfassen eines Umfeldes eines Fahrzeuges
DE102014215843A1 (de) * 2014-08-11 2016-02-11 Robert Bosch Gmbh Fahrerbeobachtungssystem für ein Kraftfahrzeug
DE102016217450A1 (de) 2016-09-13 2018-03-15 Conti Temic Microelectronic Gmbh Stereokameraanordnung für ein Kraftfahrzeug sowie Kraftfahrzeug mit einer solchen Stereokameraanordnung
JP7027844B2 (ja) 2017-11-29 2022-03-02 株式会社デンソー カメラモジュール
DE102019211967B3 (de) * 2019-08-09 2020-08-20 Audi Ag Vorrichtung zum Anzeigen eines Anzeigeinhalts und zum Erfassen eines Fahrzeuglenkers sowie ein Kraftfahrzeug hierzu

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4219851A1 (de) 1991-06-21 1992-12-24 Atomic Energy Authority Uk Stereokamera
DE4410358A1 (de) 1994-03-25 1995-09-28 Rbt Raumbildtechnik Gmbh Stereokamera
DE10025593A1 (de) 2000-05-24 2001-12-06 Minoru Inaba Stereokamera
DE10033355A1 (de) 2000-07-08 2002-01-17 Zeiss Carl Kamera, insbesondere Stereokamera
US20050004762A1 (en) * 2003-07-01 2005-01-06 Nissan Motor Co., Ltd. Obstacle detection apparatus and method for automotive vehicle
US20050102070A1 (en) * 2003-11-11 2005-05-12 Nissan Motor Co., Ltd. Vehicle image processing device
US20050232463A1 (en) * 2004-03-02 2005-10-20 David Hirvonen Method and apparatus for detecting a presence prior to collision
DE102004061998A1 (de) 2004-12-23 2006-07-06 Robert Bosch Gmbh Stereokamera für ein Kraftfahrzeug
WO2008081323A2 (fr) * 2006-12-27 2008-07-10 Toyota Jidosha Kabushiki Kaisha Système et procédé de détection d'obstacle installé sur un véhicule

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10005835A1 (de) * 2000-02-10 2001-08-16 Bundesdruckerei Gmbh Automatische Authentifikation von mit Sicherheitsmerkmalen geschützten Dokumenten
DE10154861A1 (de) * 2001-11-08 2003-05-22 Ibeo Automobile Sensor Gmbh Verfahren zur Bereitstellung von Bildinformationen
US7406216B2 (en) * 2004-12-15 2008-07-29 Micron Technology, Inc. Method and apparatus for distributed analyses of images

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9451237B2 (en) 2011-01-12 2016-09-20 Myestro Interactive Gmbh Remote control device for controlling a mechanism with the aid of a movable object and an interface module based on movement and distance of the movable object with respect to a camera
US20160214534A1 (en) * 2014-09-02 2016-07-28 FLIR Belgium BVBA Watercraft thermal monitoring systems and methods
US10931934B2 (en) * 2014-09-02 2021-02-23 FLIR Belgium BVBA Watercraft thermal monitoring systems and methods

Also Published As

Publication number Publication date
DE102009012758A1 (de) 2010-09-16

Similar Documents

Publication Publication Date Title
WO2010103061A1 (fr) Dispositif et procédé de détection d'au moins un objet
EP2715446B1 (fr) Système de caméra pour un véhicule
DE102004010197B4 (de) Verfahren zur Funktionskontrolle einer Positionsermittlungs- oder Umgebungserfassungseinrichtung eines Fahrzeugs oder zur Kontrolle einer digitalen Karte
DE102006010295B4 (de) Kamerasystem mit zumindest zwei Bildaufnehmern
DE102008006375B4 (de) Überwachungssystem
DE102008047413A1 (de) Vorrichtung und Verfahren zur Detektion von Objekten
EP1589484A1 (fr) Procédé pour la détection et/ou le suivi d'objets
DE102019133642A1 (de) Digitales bildgebungssystem einschliesslich optischer plenoptik-vorrichtung und bilddaten-verarbeitungsverfahren zur erfassung von fahrzeughindernissen und gesten
WO2008135604A1 (fr) Procédé et dispositif pour déterminer la position d'un panneau de signalisation
DE102012106834A1 (de) Verfahren zur Ausrichtung zweier Bildaufnahmeelemente eines Stereokamerasystems
DE102012105436B4 (de) Fahrzeugkamera zur Entfernungsmessung
DE102018005969A1 (de) Verfahren zum Betreiben eines Fahrerassistenzsvstems mit zwei Erfassungseinrichtungen
DE102013019226A1 (de) Vorrichtung zur kamerabasierten Umgebungserfassung für ein Fahrzeug
DE102018004782A1 (de) Verfahren und Vorrichtung zum Erfassen einer Umgebung, sowie Fahrzeug mit einer solchen Vorrichtung
DE102006044615A1 (de) Verfahren zur Kalibrierung von Bilderfassungseinrichtungen in Fahrzeugen
WO2006015906A1 (fr) Module optique pour un systeme d'assistance au conducteur operant une detection dans l'espace exterieur avant, dans le sens de roulement, d'un vehicule a moteur
DE10146786A1 (de) Einrichtung zur automatischen Schaltung von Beleuchtungseinrichtungen eines Fahrzeugs
EP2562685B1 (fr) Procédé et dispositif de classification d'un objet lumineux situé à l'avant d'un véhicule
EP2768701A1 (fr) Dispositif optique pour un véhicule
DE10022454A1 (de) Bildaufnehmer und Bildaufnahmeverfahren, insbesondere zur dreidimensionalen Erfassung von Objekten und Szenen
DE102019200099A1 (de) Sensorvorrichtung für ein Ego-Fahrzeug, Fahrerassistenzvorrichtung und Fahrzeug mit einer solchen Sensorvorrichtung
DE102012105435B4 (de) Fahrzeugkamera zur Entfernungsmessung
DE102020117060B4 (de) Vorrichtung und Verfahren zur Umfelderfassung eines Fahrzeugs
EP1962245B1 (fr) Procédé et dispositif de détermination de l'état de déplacement d'objets
EP2630632A1 (fr) Détermination de la largeur de base d'un système de détection d'image stéréoscopique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10717068

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10717068

Country of ref document: EP

Kind code of ref document: A1