DE102009012758A1 - Device and method for detecting at least one object - Google Patents

Device and method for detecting at least one object

Info

Publication number
DE102009012758A1
Authority
DE
Germany
Prior art keywords
image
object
stereo camera
camera
evaluation unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
DE102009012758A
Other languages
German (de)
Inventor
Dipl.-Ing. Matthias Hoffmeier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hella GmbH & Co. KGaA
Original Assignee
Hella KGaA Hueck & Co.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hella KGaA Hueck & Co.
Priority to DE102009012758A
Publication of DE102009012758A1
Application status: Pending

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10004: Still image; Photographic image
    • G06T 2207/10012: Stereo images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30248: Vehicle exterior or interior
    • G06T 2207/30252: Vehicle exterior; Vicinity of vehicle
    • G06T 2207/30261: Obstacle

Abstract

The invention relates to a device (10, 20, 90) for detecting at least one object (21), with a camera comprising a first image acquisition sensor (4), a stereo camera comprising a second image acquisition sensor (2) and a third image acquisition sensor (3), and an evaluation unit. The evaluation unit detects the image of the object (21) in an image acquired with the aid of the camera and determines the position of the image of the object (21) in this image. Furthermore, the evaluation unit determines an image area of an image pair acquired with the aid of the stereo camera as a function of the determined position of the object (21) in the image acquired with the aid of the camera and processes the defined image area in order to detect an image of the object (21).

Description

  • The invention relates to a device for detecting at least one object. The device comprises a camera comprising a first image acquisition sensor, a stereo camera comprising a second and a third image acquisition sensor, and an evaluation unit. The evaluation unit detects, in an image captured with the aid of the camera, the image of the object in this image and determines the position of the image of the object in the image. Furthermore, the invention relates to a method for detecting at least one object.
  • The device and the method are used in particular in driver assistance systems. In driver assistance systems, stereo cameras or mono cameras are used to detect objects; with their aid, images of a detection area are captured. In the images, images of objects are detected and the detected objects are classified into object classes. For this purpose, preset processing algorithms are used. In order to achieve a sufficiently high detection and classification accuracy, detection algorithms of high quality must be used. Such high-quality detection algorithms require relatively much time to detect and classify the objects and require much memory. When detection algorithms with a lower accuracy are used, both the required detection time and the required memory are reduced. However, there is then a danger that objects are not detected and/or are misclassified, so that the driver assistance system may supply incorrect information to the driver or to other units to be controlled by the driver assistance system. Road safety can thereby be endangered.
  • Document DE 100 33 355 A1 discloses a stereo camera with two image sensors which are read out pixel by pixel under the control of a read-out clock signal generator.
  • Document DE 10 2004 061 998 A1 discloses a device for a motor vehicle with a stereo camera system with a first camera and a second camera, the first camera and the second camera differing in at least one camera characteristic. In particular, the photosensitivity of the first camera differs from the photosensitivity of the second camera.
  • Document DE 42 19 851 A1 discloses a stereo camera which has a single objective lens, a single imaging lens and a twin prism between these two lenses. Preferably, at least the imaging lens is a compound lens.
  • Document DE 44 10 358 A1 discloses a stereo camera with electromechanically actuable shutters and electromechanically actuable apertures. A single control electronics unit with two outputs is provided; both shutter stepping motors are connected to one output, and the aperture stepping motors are connected to the other output.
  • Document DE 100 25 593 A1 discloses a stereo camera in which the distance between the optical axes of the taking objectives is set narrower than the conventional distance between the optical axes.
  • It is the object of the invention to specify a device and a method for detecting at least one object with which the object can be detected quickly and reliably with a low memory requirement.
  • The object is achieved by a device having the features of claim 1 and by a method having the features of the independent method claim. Advantageous developments of the invention are specified in the dependent claims.
  • By processing, for the detection of an image of the object, only an image area of the image pair acquired by the stereo camera, selected as a function of the position of the object determined from the camera image, a considerably more resource-conserving processing of the image data is achieved compared with processing the entire image pair. As a result, a higher detection speed is achieved with a lower memory requirement. The detection rate, which indicates the number of detected objects per unit time, is also increased. Owing to the faster detection of the object in the image pair acquired with the aid of the stereo camera, the object can also be classified more quickly and/or one or more properties of the object can be determined more quickly. In the further processing of the properties of the object and/or of the classified object, for example in a driver assistance system, units of the vehicle to be activated can be controlled more quickly. The image area of the image pair acquired with the aid of the stereo camera is determined as a function of the determined position of the object in the image acquired with the aid of the camera, in particular such that the image area is selected in which, on the basis of the position of the image of the object in the camera image, the object is imaged with high probability.
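  • Purely as an illustration of this principle, the following minimal Python sketch shows how a position determined in the camera image could be turned into an image area of the stereo image pair that is then searched. All names, the rectangle format (x, y, width, height) and the fixed margin are assumptions made for this sketch and are not taken from the patent; the images are assumed to be NumPy-style arrays.

```python
# Illustrative sketch only (assumed data formats, not part of the patent).

def select_image_area(position, image_shape, margin=0.25):
    """Derive a rectangular image area around a previously determined
    object position given as (x, y, width, height) in pixels."""
    x, y, w, h = position
    rows, cols = image_shape
    dx, dy = int(w * margin), int(h * margin)
    left, top = max(0, x - dx), max(0, y - dy)
    right, bottom = min(cols, x + w + dx), min(rows, y + h + dy)
    return left, top, right, bottom

def detect_in_stereo_pair(stereo_pair, image_area, detector):
    """Run the expensive detector only on the defined image area of
    both images of the stereo pair instead of on the full images."""
    left, top, right, bottom = image_area
    left_image, right_image = stereo_pair
    return detector(left_image[top:bottom, left:right],
                    right_image[top:bottom, left:right])
```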
  • It is advantageous if image data corresponding to the images captured by the image acquisition sensors are generated. The image data are processed or further processed in particular with the aid of the evaluation unit.
  • It is advantageous if the stereo camera is a second stereo camera and if the camera comprises a fourth image acquisition sensor and is a first stereo camera. By using two stereo cameras, the distance of an object detected in an image to the device can be determined with the aid of each stereo camera from only a single image pair acquired by that stereo camera.
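  • The distance determination from a single image pair mentioned above follows the usual stereo triangulation relation Z = f · B / d (focal length f in pixels, baseline B, disparity d). A short worked example with purely illustrative values, not values from the patent:

```python
def distance_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Classic stereo relation: distance Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: no valid correspondence")
    return focal_length_px * baseline_m / disparity_px

# Illustrative values only: 1200 px focal length, 0.12 m baseline, 8 px disparity
print(distance_from_disparity(1200.0, 0.12, 8.0))  # -> 18.0 (metres)
```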
  • It is advantageous if the evaluation unit determines, on the basis of the image of the object in the defined image area, at least one property of the object, in particular an object class to which the object belongs. It is especially advantageous if the evaluation unit generates, as a function of the property of the object, an output signal with information about the object, and if a control unit is provided for controlling at least one actuator as a function of the output signal of the evaluation unit. In a driver assistance system, for example, the light emission of the headlights, a belt tensioner, a brake assistant, a light assistant, a speed assistant, a lane change assistant and/or a headrest can thereby be controlled in a simple manner. Alternatively or additionally, the output signal or information corresponding to the output signal can be output to an operator of the device via a man-machine interface, in particular via a display unit.
  • In a preferred embodiment of the invention, the evaluation unit processes a first algorithm when detecting the image of the object in the image captured with the aid of the camera and a second algorithm when detecting the image of the object in the defined image area of the image pair captured with the aid of the stereo camera. The second algorithm has a higher detection accuracy than the first algorithm. As the first algorithm, in particular an algorithm is used which has only a low detection accuracy but is fast and requires relatively few resources, in particular relatively little memory. The second algorithm is preferably an algorithm with a high detection accuracy which, compared with the first algorithm, requires many resources and is therefore relatively slow. In particular, the second algorithm requires more memory than the first algorithm. With the aid of the first algorithm, only a rough detection of the objects in the image captured by the camera takes place. With the aid of the second algorithm, the objects are detected in the image area in which the images of the objects detected in the camera image are shown with high probability. The classification of the objects takes place in particular only for the images of the objects detected in the defined image area. In this way, a reliable classification of the detected objects is ensured. Since only a relatively small image area of the image pair captured by the stereo camera has to be evaluated with the aid of the second algorithm, the required memory is significantly reduced compared with evaluating the entire image pair captured with the aid of the stereo camera using the second algorithm.
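  • A minimal sketch of this two-stage strategy, re-using the helper functions from the sketch above; the coarse and fine detectors are placeholders and do not stand for the concrete algorithms of the patent:

```python
def two_stage_detection(camera_image, stereo_pair, coarse_detector, fine_detector):
    """Stage 1: fast, low-accuracy detection over the full camera image.
    Stage 2: accurate detection restricted to the derived image area.
    For simplicity the camera image and the stereo pair are assumed to
    share the same pixel coordinate system."""
    detections = []
    for position in coarse_detector(camera_image):           # cheap full-image pass
        area = select_image_area(position, camera_image.shape[:2])
        detections.extend(detect_in_stereo_pair(stereo_pair, area, fine_detector))
    return detections
```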
  • It is advantageous if the evaluation unit detects, in the images of an image sequence captured with the aid of the camera, the image of the object in the respective image and determines the position of the image of the object in the images, and if the evaluation unit determines an image area of an image pair captured with the aid of the second stereo camera as a function of at least two determined positions of the object in at least two images of the captured image sequence. By capturing an image sequence with the aid of the camera and detecting the images of the object in several images of this image sequence, the object can be tracked (so-called tracking). As a result, the image area of the image pair captured with the aid of the stereo camera in which the image of the object is searched for can be set such that the object is imaged in it with very high probability, and/or a smaller image area can be selected.
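  • The use of at least two determined positions can be illustrated by a simple constant-velocity prediction; the patent does not prescribe a particular tracking model, so the following is only an assumed example:

```python
def predict_next_position(positions):
    """Predict the next object position (x, y, w, h) from the last two
    observed positions under a constant-velocity assumption."""
    (x1, y1, w1, h1), (x2, y2, w2, h2) = positions[-2], positions[-1]
    return (2 * x2 - x1, 2 * y2 - y1, w2, h2)

# The predicted position can then be passed to select_image_area()
# to choose a smaller image area of the stereo image pair.
```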
  • Alternatively or additionally, when the device is used in a driver assistance system, the driving movement of the vehicle, in particular a cornering manoeuvre, can thereby be determined. In this way it can, for example, be detected early that an object will not be imaged in the image pair captured with the aid of the stereo camera owing to a cornering manoeuvre of the vehicle, so that the determination of an image area to be searched can be dispensed with.
  • In a preferred embodiment of the invention, the image acquisition sensors and/or the evaluation unit are mounted on a printed circuit board, in particular a printed circuit board made of epoxy resin. The image acquisition sensors and the evaluation unit are firmly connected to the circuit board, so that the position of the image acquisition sensors relative to one another is fixed and unchangeable. Furthermore, the image acquisition sensors and the evaluation unit are connected to the board in an electrically conductive manner. Owing to the fixed position of the image acquisition sensors relative to one another, the calibration effort is reduced. Only a one-time calibration during assembly of the image acquisition sensors is necessary; a recalibration during operation of the device is not necessary.
  • The image acquisition sensors are preferably optical semiconductor sensors, CCD image acquisition sensors and/or CMOS image acquisition sensors. Such sensors can be produced easily and inexpensively, so that the overall manufacturing cost of the device is reduced.
  • Furthermore, it is advantageous if a first and a second objective are arranged such that in each case an image of a detection area of the first stereo camera is imaged onto and/or focused onto the respective image capture area of the image acquisition sensors of the first stereo camera, and if a third and a fourth objective are arranged such that in each case an image of a detection area of the second stereo camera is imaged onto and/or focused onto the respective image capture area of the image acquisition sensors of the second stereo camera. The objectives can be firmly connected to the board and/or arranged in a housing in which the device is accommodated.
  • The first and second objectives are preferably identical, as are the third and fourth objectives. It is particularly advantageous if the first and the second objective have a smaller opening angle than the third and the fourth objective. The opening angle of the first and second objective has in particular a value in the range of 15° to 25°, the opening angle of the third and fourth objective a value in the range of 45° to 65°. This ensures that the first stereo camera has a larger focal length than the second stereo camera. With the aid of the first stereo camera, a far range is thus detected, while with the aid of the second stereo camera a near range in front of the device is detected. In this way, objects can already be detected at a great distance from the device with the aid of the first stereo camera and can later be classified in the image pair captured with the second stereo camera as soon as the object is located in the near range. For this purpose, the image pair of the first stereo camera is captured temporally before the image pair of the second stereo camera.
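  • The relation between opening angle and focal length stated above follows from f = (s / 2) / tan(α / 2) for a sensor of width s. A short calculation with an assumed sensor width of 6.4 mm (the sensor width is not specified in the patent):

```python
import math

def focal_length_mm(sensor_width_mm, opening_angle_deg):
    """Focal length for a given horizontal opening angle: f = (s/2) / tan(alpha/2)."""
    return (sensor_width_mm / 2.0) / math.tan(math.radians(opening_angle_deg) / 2.0)

print(focal_length_mm(6.4, 20.0))  # ~18.1 mm for the far-range objectives (15° to 25°)
print(focal_length_mm(6.4, 55.0))  # ~6.1 mm for the near-range objectives (45° to 65°)
```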
  • Furthermore, it is advantageous to provide at least one infrared light source, in particular at least one infrared light emitting diode, for illuminating the detection area of the first stereo camera and/or the detection area of the second stereo camera. In this way, objects can also be detected and classified at night.
  • In a preferred embodiment of the invention, the evaluation unit comprises a first processing module, a second processing module and a central module. The first processing module detects, in an image captured with the aid of the camera, the images of objects in this image and transmits a first object list with information about the detected objects to the central module. The information is transmitted in the form of data. Likewise, the second processing module detects the images of the objects in an image area of the image pair captured with the aid of the stereo camera and transmits a second object list with information about the detected objects to the central module. The central module then generates, as a function of the information of the first object list and the information of the second object list, an output signal with output information. The output signal is used in particular for controlling units which are controlled by the driver assistance system. With the aid of the information of the first and the second object list, relevant objects, in particular lane and road boundary elements, traffic signs, vehicles and/or persons, can be detected reliably.
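  • How the central module could merge the two object lists into one output can be sketched as follows; the data structure and the simple precedence rule are assumptions for illustration, not the fusion logic of the patent:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DetectedObject:
    position: tuple                     # (x, y, w, h) in image coordinates
    object_class: Optional[str] = None  # set only after the accurate second stage

def fuse_object_lists(first_list: List[DetectedObject],
                      second_list: List[DetectedObject]) -> List[DetectedObject]:
    """Classified objects from the second (high-accuracy) list take precedence;
    far-range objects that are not yet classified are kept as announced objects."""
    return list(second_list) + [obj for obj in first_list if obj.object_class is None]
```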
  • Furthermore, the invention relates to a method for detecting at least one object in which an image is captured with the aid of a camera, the image of the object is detected in this image and the position of the image of the object is determined. Further, with the aid of a stereo camera, an image pair is captured, and an image area of the image pair is determined as a function of the determined position of the object in the image captured by the camera. The defined image area is then searched for an image of the object.
  • The method specified by the independent method claim can be developed in the same way as the device according to claim 1. In particular, the method can be developed with the features specified in the dependent device claims or with corresponding method features.
  • Further features and advantages of the invention emerge from the following description, which explains the invention in more detail in conjunction with the accompanying figures with reference to exemplary embodiments.
  • The figures show:
  • FIG. 1 a plan view of a schematically illustrated device for detecting at least one object according to a first embodiment of the invention;
  • FIG. 2 a schematic perspective view of a device for detecting at least one object according to a second embodiment of the invention;
  • FIG. 3 a schematic representation of a vehicle with a device for detecting at least one object according to FIG. 2 in a first traffic situation;
  • FIG. 4 a schematic representation of the vehicle according to FIG. 3 in a second traffic situation; and
  • FIG. 5 a schematic representation of a device for detecting at least one object according to a third embodiment of the invention.
  • FIG. 1 shows a plan view of a schematic representation of a device 10 for detecting at least one object. The device comprises a circuit board 1 and further components 2 to 7 connected to the board 1. The board 1 in turn comprises a carrier of insulating material with electrically conductive connections which connect the electrical terminals of the components 2 to 7 connected to the board 1 with one another. The electrically conductive connections are preferably designed as conductor tracks in the form of a printed circuit. The board 1 is preferably made of epoxy resin with conductor tracks of copper and is also referred to as a printed circuit board.
  • On the board 1, a first image acquisition sensor 4, a second image acquisition sensor 5, a third image acquisition sensor 2 and a fourth image acquisition sensor 3 are arranged. The image acquisition sensors 2 to 5 are electrically connected via electrical terminals to the conductor tracks (not shown) of the board 1. The image acquisition sensors 2 to 5 may be either colour image acquisition sensors or grey-scale image acquisition sensors. Preferably, either all image acquisition sensors 2 to 5 are colour image acquisition sensors or all image acquisition sensors 2 to 5 are grey-scale image acquisition sensors. The first image acquisition sensor 4 and the second image acquisition sensor 5 form a first sensor group, and the third image acquisition sensor 2 and the fourth image acquisition sensor 3 form a second sensor group. Preferably, the image acquisition sensors 2 to 5 of each sensor group are identical. It is advantageous if all four image acquisition sensors 2 to 5 are of the same type.
  • The image acquisition sensors 2 to 5 are each connected to the board 1 via solder joints. Alternatively, the image acquisition sensors 2 to 5 may each be connected to a socket which is in turn connected to the board 1 by means of solder joints. The image acquisition sensors 2 to 5 are applied in particular in the form of opto-arrays. Owing to the resulting fixed position of the image acquisition sensors 2 to 5 relative to one another, in particular the fixed position of the first image acquisition sensor 4 relative to the second image acquisition sensor 5 and the fixed position of the third image acquisition sensor 2 relative to the fourth image acquisition sensor 3, the calibration effort for calibrating the device 10 is reduced. Only a one-time calibration when mounting the image acquisition sensors 2 to 5 is necessary; a recalibration during operation of the device 10 is not necessary.
  • The image acquisition sensors 2 to 5 are preferably optical semiconductor sensors, in particular CCD image acquisition sensors and/or CMOS image acquisition sensors. The third image acquisition sensor 2 and the fourth image acquisition sensor 3 are arranged in particular such that the centres of their image capture areas lie on a first straight line which runs parallel to one of the long side edges 102 of the board 1. The first image acquisition sensor 4 and the second image acquisition sensor 5 are preferably arranged such that the centres of their image capture areas lie on a second straight line which orthogonally intersects the first straight line. Further, the image acquisition sensors 2 to 5 are in particular arranged such that the distance of the intersection of the two straight lines from the centre of the image capture area of the first image acquisition sensor 4 and the distance of the intersection from the centre of the image capture area of the second image acquisition sensor 5 are identical. Likewise, the distance of the intersection from the centre of the image capture area of the third image acquisition sensor 2 and the distance of the intersection from the centre of the image capture area of the fourth image acquisition sensor 3 are identical. The image acquisition sensors 2 to 5 are arranged on the board 1 such that, apart from small manufacturing tolerances, at least the image capture areas of the first image acquisition sensor 4 and of the second image acquisition sensor 5 are arranged in a first plane and the image capture areas of the third image acquisition sensor 2 and of the fourth image acquisition sensor 3 are arranged in a second plane. Preferably, all image capture areas of the image acquisition sensors 2 to 5 are arranged in one plane.
  • Alternatively, other arrangements of the image acquisition sensors 2 to 5 are also possible. In particular, the image acquisition sensors 2 to 5 can be arranged such that the centres of their image capture areas lie on a straight line which runs parallel to the edge 102. Furthermore, the image acquisition sensors 2 to 5 can have different sizes.
  • Furthermore, the device 10 comprises an evaluation unit. In FIG. 1, two components belonging to this evaluation unit are shown and designated by the reference numerals 6 and 7. The components 6, 7 are in particular field programmable gate arrays (FPGAs). Alternatively, the components 6, 7 can also be arranged on the rear side of the board 1, which is opposite the front side of the board 1 on which the image acquisition sensors 2 to 5 are applied. The image data generated by the image acquisition sensors 2 to 5 are transmitted to the evaluation unit via the electrical connections provided by the board 1. The processing of the image data in the evaluation unit takes place with the aid of image processing algorithms implemented in the evaluation unit. For this purpose, at least one image processing program can be stored in the evaluation unit.
  • FIG. 2 shows a schematic perspective view of a device 20 for detecting at least one object according to a second embodiment of the invention. Elements with the same structure or the same function have the same reference numerals. The distances 80, 83 between the image acquisition sensors 2 to 5 are specified by the printed circuit of the board 1.
  • Furthermore, the device 20 comprises four objectives 60 to 63. The objectives 60 to 63 are arranged such that the optical axis of the first objective 63 runs through the centre of the image capture area of the first image acquisition sensor 4, the optical axis of the second objective 62 through the centre of the image capture area of the second image acquisition sensor 5, the optical axis of the third objective 60 through the centre of the image capture area of the third image acquisition sensor 2 and the optical axis of the fourth objective 61 through the centre of the image capture area of the fourth image acquisition sensor 3. The objectives 60 to 63 are in particular arranged such that their optical axes run orthogonally to the respective image capture areas of the respective image acquisition sensors 2 to 5. Alternatively, the objectives 60 to 63 can be arranged such that their optical axes intersect the image capture areas near the respective centres.
  • The objectives 60 to 63 are in particular connected to the board 1, preferably glued to the board 1, bolted to the board 1 or connected to the board 1 via suitable clamping or snap connections. Alternatively, the objectives 60 to 63 can be integrated into a housing (not shown). The housing can serve, for example, for permanent protection of the device 20 against environmental influences, in particular against water and/or vandalism. The objectives are advantageously attached to the board 1 by means of holders, each of which engages with two pins of a stop in holes of the board 1.
  • The first image acquisition sensor 4, the second image acquisition sensor 5, the first objective 63 and the second objective 62 together form a first stereo camera. Likewise, the third image acquisition sensor 2, the fourth image acquisition sensor 3, the third objective 60 and the fourth objective 61 form a second stereo camera. In an alternative embodiment of the invention, instead of the first stereo camera, a mono camera comprising one image acquisition sensor and one objective can be provided.
  • The objectives 60 to 63 each have one or more lenses and/or other optical elements and serve in particular to focus images onto the image capture areas of the image acquisition sensors 2 to 5. The objectives 62, 63 of the first stereo camera and the objectives 60, 61 of the second stereo camera are preferably identical in each case. The objectives 62, 63 of the first stereo camera have a larger focal length than the objectives 60, 61 of the second stereo camera. For this purpose, the objectives 62, 63 have an opening angle of 15° to 25° and the objectives 60, 61 an opening angle of 45° to 65°. Through this choice of objectives 60 to 63, images with an image of the far range are captured with the aid of the first stereo camera and images with an image of the near range are captured with the aid of the second stereo camera. The near range is in particular the area in front of the device 20 up to a distance of about 60 m. In the near range, the edge areas of a roadway 16 are also detected. Accordingly, the far range is the distance range beyond a distance of 60 m.
  • Further, components 70 of an infrared illuminating device are arranged on the board 1. The components 70 are in particular infrared diodes. By using infrared light, the device 20 can also be used for image capture in the dark, since the detection areas of the stereo cameras are at least partially illuminated by means of the infrared illuminating device.
  • The device 20 for detecting at least one object is used in particular in connection with driver assistance systems in vehicles, in particular in road vehicles. It can also be used in conjunction with traffic control systems. The device 20 is then preferably arranged in a stationary manner, for example attached to a mast, such as a mast of a traffic light signalling installation.
  • FIG. 3 shows a schematic representation of a vehicle 15 with a device for detecting at least one object according to FIG. 2 in a first traffic situation. The vehicle 15 drives on the roadway 16 in the direction of travel P1.
  • The detection range of the first stereo camera is indicated by the solid lines 17a, 17b and designated by the reference numeral 11. The detection range of the second stereo camera is indicated by the solid lines 18a, 18b and designated by the reference numeral 12. An object 21 arranged beside the roadway 16, in particular a traffic sign, is located at least partially within the detection range 11 of the first stereo camera. The object 21 is so far away from the vehicle 15 that it is not yet located in the detection range 12 of the second stereo camera.
  • FIG. 4 shows a schematic representation of the vehicle 15 according to FIG. 3 in a second traffic situation. The second traffic situation occurs temporally shortly after the first traffic situation shown in FIG. 3. The vehicle 15 has in the meantime moved so far in the direction P1 towards the object 21 that the object 21 is arranged outside the detection range 11 of the first stereo camera and within the detection range 12 of the second stereo camera.
  • When detecting an object 21, a first image pair with an image of the detection range 11 is first captured at a first time with the aid of the first stereo camera. With the aid of the evaluation unit, an image of the object 21 is detected in the first image pair captured with the aid of the first stereo camera and the position of the image of the object 21 is determined. For this purpose, the evaluation unit processes a first detection algorithm. At a second time, which follows the first time, a second image pair with an image of the detection range 12 is captured with the aid of the second stereo camera. Depending on the determined position of the image of the object 21 in the first image pair, an image area of the second image pair is determined. The image area is in particular determined such that the image of the detected object 21 is shown with high probability in the defined image area of the second image pair at the second time. Subsequently, the evaluation unit processes the defined image area of the second image pair in order to detect the image of the object 21 in the second image pair. For this purpose, the evaluation unit processes a second detection algorithm.
  • The first and second detection algorithms differ in that the first detection algorithm has a significantly lower detection accuracy than the second detection algorithm. The higher the detection accuracy of a detection algorithm, the more reliably objects 21 to be detected are actually detected. The second detection algorithm requires a considerably longer time for the detection of an object 21 than the first detection algorithm and also considerably more resources, in particular more memory. The first detection algorithm is also referred to as a feature-stereo algorithm and the second detection algorithm as a dense-stereo algorithm.
  • It is thereby achieved that an object 21 is already roughly detected in the first image pair recorded with the aid of the first stereo camera while the object 21 is still in the far range. Because the first detection algorithm is used to detect the object 21 in the far range, this requires little time and relatively little memory. Once the object 21 has moved so far towards the device 20, or the device 20 so far towards the object 21, that the object 21 is located in the near range and thus in the detection range 12 of the second stereo camera, it is reliably detected in the second image pair with the aid of the second detection algorithm. Because the position of the object 21 was determined in the first image pair and is thus known, an image area of the second image pair in which the object 21 is imaged can be determined. Thus, not the entire second image pair but only the defined image area needs to be searched for an image of the object 21 with the aid of the second detection algorithm. As a result, the detection of the image of the object 21 in the second image pair requires only little time and little memory despite the high detection accuracy of the second detection algorithm. Thus, compared with a device for detecting objects with only one stereo camera, the detection range, the detection rate, the detection speed and the detection quality are all improved.
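  • The resource saving can be made concrete with a small, purely illustrative calculation of the fraction of the image pair that actually has to be processed with the dense-stereo algorithm; the image and area sizes are assumed values, not values from the patent:

```python
def roi_processing_fraction(image_shape, image_area):
    """Fraction of the full image that the expensive second algorithm must process."""
    rows, cols = image_shape
    left, top, right, bottom = image_area
    return ((right - left) * (bottom - top)) / float(rows * cols)

# Assumed example: a 200 x 150 px image area in a 1024 x 512 px image
print(roi_processing_fraction((512, 1024), (400, 180, 600, 330)))  # ~0.057
```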
  • In a preferred embodiment of the invention, at least one property of the object 21 is determined on the basis of the image of the object 21 in the second image pair. In particular, an object class is determined to which the object 21 can be assigned. In driver assistance systems, it is in particular determined in this way whether the object 21 is a traffic sign, a moving self-illuminated object, a static self-illuminated object, and/or a person or an animal. The object class is determined in particular by comparison with objects stored in a database. To ensure a reliable classification of objects 21, detection with the aid of the first detection algorithm with the lower detection accuracy alone would be insufficient. Thus, when using only one stereo camera system, the entire image would have to be searched using the time- and memory-intensive second detection algorithm.
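  • The comparison with objects stored in a database can be illustrated by a simple nearest-neighbour match over feature vectors; this stand-in classifier is an assumption for illustration and not the classification method of the patent:

```python
def classify_object(object_features, object_database):
    """Assign the object class whose stored reference features are closest
    (smallest squared distance) to the detected object's features."""
    best_class, best_score = None, float("inf")
    for object_class, reference_features in object_database.items():
        score = sum((a - b) ** 2 for a, b in zip(object_features, reference_features))
        if score < best_score:
            best_class, best_score = object_class, score
    return best_class

# Illustrative call with made-up feature vectors
print(classify_object((0.9, 0.1), {"traffic sign": (1.0, 0.0), "person": (0.0, 1.0)}))
```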
  • In a further preferred embodiment of the invention, an image sequence of successively recorded images with images of the detection range 11 is captured with the aid of the first stereo camera. In the images of the image sequence, the images of the object 21 are determined using a tracking function stored in the evaluation unit, and the image of the object 21 is thus tracked over the images of the image sequence. By tracking the position of the image of the object 21, a more accurate prediction can be made as to in which image area of the image pair of the second stereo camera the object 21 will be imaged when it is located in the near range. In particular, it can thus be determined early in a driver assistance system that an object 21 will not enter the near range at all, in particular owing to a cornering manoeuvre of a vehicle 15, and therefore does not need to be considered further.
  • FIG. 5 shows a schematic representation of a device 90 for detecting at least one object 21. The image pair acquired with the aid of the first stereo camera is stored in a first memory element 29. With the aid of a first processing unit 31, the image of the object 21 is detected in the image pair captured with the aid of the first stereo camera and the position of the image of the object 21 in the image pair is determined. Further, with the aid of the processing unit 31, a first object list with information about the objects 21 detected in the first image pair is generated and transmitted to a fusion unit 40.
  • The second image pair captured with the aid of the second stereo camera system is stored in a second memory element 28. With the aid of a second processing unit 30, a second object list with information about the objects 21 detected in the second image pair is generated and likewise transmitted to the fusion unit 40. The fusion unit 40 generates, as a function of the information of the first object list and the information of the second object list, an output signal which is transmitted to a man-machine interface 50 and/or a control unit 52. The man-machine interface 50 is in particular a display unit via which information about relevant objects 21 detected and classified with the aid of the device 90 is transmitted to the operator. The control unit 52 serves to control actuators 54 to 58. The actuators 54 to 58 are in particular a control module for controlling the headlights of a vehicle 15, a brake assistant of a vehicle 15, a unit for controlling a drive unit of a headrest of the vehicle 15 and/or a unit for controlling a belt tensioner.
  • The method described and the device described 10 . 20 . 90 They can also be used to acquire, record, evaluate and count static and moving objects, and to track and analyze the movement behavior of people and / or objects and to measure objects and / or people. The device 10 . 20 . 90 can be used for object recognition in motor vehicles, for motor vehicle assistance systems, in road traffic, in production facilities, in event facilities, in long-distance and local traffic, on farms, in public facilities, for controlling mass flows and in industrial processes. The device 10 . 20 . 90 is also referred to as combi camera.
  • In another embodiment of the invention, a microphone for recording noises can additionally be provided. The noises are stored and/or evaluated together with the corresponding images.
  • 1 circuit board
    2 to 5 image acquisition sensor
    6, 7 module
    10, 20, 90 device
    11, 12 detection range
    15 vehicle
    16 roadway
    17a, 17b, 18a, 18b solid line
    21 object
    28, 29 memory element
    30, 31 processing unit
    40 fusion unit
    50 man-machine interface
    52 control unit
    54, 56, 58 actuator
    60 to 63 objective
    70 infrared diode
    80, 83 distance
    102 edge
    P1 direction of travel
  • CITATIONS INCLUDED IN THE DESCRIPTION
  • This list of the documents cited by the applicant has been generated automatically and is included solely for the better information of the reader. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited patent literature
    • DE 10033355 A1 [0003]
    • DE 102004061998 A1 [0004]
    • DE 4219851 A1 [0005]
    • DE 4410358 A1 [0006]
    • DE 10025593 A1 [0007]

Claims (15)

  1. Device for detecting at least one object, comprising a camera comprising a first image acquisition sensor (4), a stereo camera comprising a second image acquisition sensor (2) and a third image acquisition sensor (3), and an evaluation unit, wherein the evaluation unit detects, in an image captured with the aid of the camera, the image of the object (21) in this image and determines the position of the image of the object (21), the evaluation unit determines an image area of an image pair acquired with the aid of the stereo camera as a function of the determined position of the object (21) in the image acquired with the aid of the camera, and the evaluation unit processes the defined image area for detecting an image of the object (21).
  2. Device (10, 20, 90) according to claim 1, characterized in that the stereo camera is a second stereo camera and that the camera comprises a fourth image acquisition sensor (5) and is a first stereo camera.
  3. Device (10, 20, 90) according to one of the preceding claims, characterized in that the evaluation unit determines, on the basis of the image of the object (21) in the defined image area, at least one property of the object (21), in particular an object class to which the object (21) belongs.
  4. Device (10, 20, 90) according to claim 3, characterized in that the evaluation unit generates, as a function of the property of the object (21), an output signal with information about the object (21).
  5. Device (90) according to claim 4, characterized in that a control unit (52) is provided for controlling at least one actuator (54, 56, 58) as a function of the output signal of the evaluation unit.
  6. Device (10, 20, 90) according to one of the preceding claims, characterized in that the evaluation unit processes a first algorithm in the detection of the image of the object (21) in the image acquired with the aid of the camera and a second algorithm in the detection of the image of the object (21) in the defined image area of the image pair acquired with the aid of the stereo camera, the second algorithm having a higher detection accuracy than the first algorithm.
  7. Device (10, 20, 90) according to one of the preceding claims, characterized in that the evaluation unit detects, in the images of an image sequence acquired with the aid of the camera, in each case the image of the object (21) in the respective image and determines the positions of the images of the object (21), and in that the evaluation unit determines an image area of an image pair acquired with the aid of the second stereo camera as a function of at least two determined positions of the object (21) in at least two images of the image sequence acquired with the aid of the camera.
  8. Device (10, 20, 90) according to one of the preceding claims, characterized in that the camera captures the image or the images of the image sequence before the stereo camera captures the image pair.
  9. Device (10, 20, 90) according to one of the preceding claims, characterized in that the image acquisition sensors (2 to 5) and/or the evaluation unit are applied to a circuit board (1), in particular a printed circuit board made of epoxy resin, and are connected to it in an electrically conductive manner.
  10. Device (10, 20, 90) according to one of the preceding claims, characterized in that the image acquisition sensors (2 to 5) are optical semiconductor sensors, CCD image acquisition sensors and/or CMOS image acquisition sensors.
  11. Device (20, 90) according to one of claims 2 to 10, characterized in that a first objective (63) and a second objective (62) are arranged such that in each case an image of a detection range of the first stereo camera is imaged and/or focused onto in each case one image capture area of the image acquisition sensors (4, 5) of the first stereo camera, and that a third objective (60) and a fourth objective (61) are arranged such that in each case an image of a detection range of the second stereo camera is imaged and/or focused onto in each case one image capture area of the image acquisition sensors (2, 3) of the second stereo camera.
  12. Device (20, 90) according to claim 11, characterized in that the first objective (63) and the second objective (62) have a smaller opening angle than the third objective (60) and the fourth objective (61).
  13. Device (20, 90) according to one of claims 2 to 12, characterized in that at least one infrared diode (70) is provided for illuminating a part of the detection range of the first stereo camera and/or a part of the detection range of the second stereo camera.
  14. Device (90) according to one of the preceding claims, characterized in that the evaluation unit comprises a first processing module (30), a second processing module (31) and a central module (40), the first processing module (30) detects, in an image captured with the aid of the camera, the images of objects (21) in this image and transmits a first object list with information about the detected objects (21) to the central module (40), the second processing module (31) detects, in a partial area of the image pair captured with the aid of the stereo camera, the images of the objects (21) and transmits a second object list with information about the detected objects (21) to the central module (40), and the central module (40) generates an output signal as a function of the information of the first object list and the information of the second object list.
  15. Method for detecting at least one object, in which an image is captured with the aid of a camera, the image of the object (21) is detected in this image and the position of the image of the object (21) is determined, in which an image pair is captured with the aid of a stereo camera and an image area of the image pair is determined as a function of the determined position of the object (21), and in which the defined image area is searched for an image of the object (21).
DE102009012758A 2009-03-12 2009-03-12 Device and method for detecting at least one object Pending DE102009012758A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102009012758A DE102009012758A1 (en) 2009-03-12 2009-03-12 Device and method for detecting at least one object

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102009012758A DE102009012758A1 (en) 2009-03-12 2009-03-12 Device and method for detecting at least one object
PCT/EP2010/053074 WO2010103061A1 (en) 2009-03-12 2010-03-11 Apparatus and method for detection of at least one object

Publications (1)

Publication Number Publication Date
DE102009012758A1 true DE102009012758A1 (en) 2010-09-16

Family

ID=42226678

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102009012758A Pending DE102009012758A1 (en) 2009-03-12 2009-03-12 Device and method for detecting at least one object

Country Status (2)

Country Link
DE (1) DE102009012758A1 (en)
WO (1) WO2010103061A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102011080702B3 (en) * 2011-08-09 2012-12-13 3Vi Gmbh Object detection device for a vehicle, vehicle having such an object detection device
DE102013102207A1 (en) * 2013-03-06 2014-09-11 Conti Temic Microelectronic Gmbh Camera system for a vehicle
DE102014206677A1 (en) * 2014-04-07 2015-10-08 Robert Bosch Gmbh Camera system and method for detecting an environment of a vehicle
DE102014215843A1 (en) * 2014-08-11 2016-02-11 Robert Bosch Gmbh Driver observation system for a motor vehicle
US9451237B2 (en) 2011-01-12 2016-09-20 Myestro Interactive Gmbh Remote control device for controlling a mechanism with the aid of a movable object and an interface module based on movement and distance of the movable object with respect to a camera
US9563807B2 (en) 2011-03-31 2017-02-07 Robert Bosch Gmbh Method for analyzing an image recorded by a camera of a vehicle and image processing device
DE102016217450A1 (en) 2016-09-13 2018-03-15 Conti Temic Microelectronic Gmbh Stereo camera arrangement for a motor vehicle and motor vehicle with such a stereo camera arrangement

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4219851A1 (en) 1991-06-21 1992-12-24 Atomic Energy Authority Uk stereo camera
DE4410358A1 (en) 1994-03-25 1995-09-28 Rbt Raumbildtechnik Gmbh Stereo camera with twin shutters and aperture settings
DE10005835A1 (en) * 2000-02-10 2001-08-16 Bundesdruckerei Gmbh Authenticating of documents containing security features having different spectral responses
DE10025593A1 (en) 2000-05-24 2001-12-06 Minoru Inaba Set up of distance between object lenses of stereo camera for close-up photography to provide natural impression of distance when viewing images in stereo image frame
DE10033355A1 (en) 2000-07-08 2002-01-17 Zeiss Carl Stereo camera includes marker pixels in left and right image sensors, enabling correction of image read-out and of image signal timing
DE10154861A1 (en) * 2001-11-08 2003-05-22 Ibeo Automobile Sensor Gmbh Localizing system for objects uses transmitter for pulsed emission of laser beams and receiver with sensor to pick up reflected beam pulses and to analyze them regarding their execution time
DE102005045077A1 (en) * 2004-12-15 2006-06-29 Agilent Technologies, Inc. (n.d.Ges.d.Staates Delaware), Palo Alto A method and apparatus for distributed analysis of images
DE102004061998A1 (en) 2004-12-23 2006-07-06 Robert Bosch Gmbh Stereo camera for a motor vehicle

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4433887B2 (en) * 2003-07-01 2010-03-17 日産自動車株式会社 Vehicle external recognition device
JP3925488B2 (en) * 2003-11-11 2007-06-06 日産自動車株式会社 Image processing apparatus for vehicle
US20050232463A1 (en) * 2004-03-02 2005-10-20 David Hirvonen Method and apparatus for detecting a presence prior to collision
JP4360403B2 (en) * 2006-12-27 2009-11-11 トヨタ自動車株式会社 In-vehicle obstacle detection device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4219851A1 (en) 1991-06-21 1992-12-24 Atomic Energy Authority Uk stereo camera
DE4410358A1 (en) 1994-03-25 1995-09-28 Rbt Raumbildtechnik Gmbh Stereo camera with twin shutters and aperture settings
DE10005835A1 (en) * 2000-02-10 2001-08-16 Bundesdruckerei Gmbh Authenticating of documents containing security features having different spectral responses
DE10025593A1 (en) 2000-05-24 2001-12-06 Minoru Inaba Set up of distance between object lenses of stereo camera for close-up photography to provide natural impression of distance when viewing images in stereo image frame
DE10033355A1 (en) 2000-07-08 2002-01-17 Zeiss Carl Stereo camera includes marker pixels in left and right image sensors, enabling correction of image read-out and of image signal timing
DE10154861A1 (en) * 2001-11-08 2003-05-22 Ibeo Automobile Sensor Gmbh Localizing system for objects uses transmitter for pulsed emission of laser beams and receiver with sensor to pick up reflected beam pulses and to analyze them regarding their execution time
DE102005045077A1 (en) * 2004-12-15 2006-06-29 Agilent Technologies, Inc. (n.d.Ges.d.Staates Delaware), Palo Alto A method and apparatus for distributed analysis of images
DE102004061998A1 (en) 2004-12-23 2006-07-06 Robert Bosch Gmbh Stereo camera for a motor vehicle

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9451237B2 (en) 2011-01-12 2016-09-20 Myestro Interactive Gmbh Remote control device for controlling a mechanism with the aid of a movable object and an interface module based on movement and distance of the movable object with respect to a camera
US9563807B2 (en) 2011-03-31 2017-02-07 Robert Bosch Gmbh Method for analyzing an image recorded by a camera of a vehicle and image processing device
WO2013020872A1 (en) 2011-08-09 2013-02-14 3Vi Gmbh Object detection device for a vehicle, vehicle with such an object detection device and method for determining a relative positional relationship of stereo cameras with respect to one another
DE102011080702B3 (en) * 2011-08-09 2012-12-13 3Vi Gmbh Object detection device for a vehicle, vehicle having such an object detection device
DE102013102207A1 (en) * 2013-03-06 2014-09-11 Conti Temic Microelectronic Gmbh Camera system for a vehicle
DE102014206677A1 (en) * 2014-04-07 2015-10-08 Robert Bosch Gmbh Camera system and method for detecting an environment of a vehicle
DE102014215843A1 (en) * 2014-08-11 2016-02-11 Robert Bosch Gmbh Driver observation system for a motor vehicle
DE102016217450A1 (en) 2016-09-13 2018-03-15 Conti Temic Microelectronic Gmbh Stereo camera arrangement for a motor vehicle and motor vehicle with such a stereo camera arrangement

Also Published As

Publication number Publication date
WO2010103061A1 (en) 2010-09-16

Similar Documents

Publication Publication Date Title
DE102007002419B4 (en) Vehicle environment monitoring device, method and program
EP1671216B1 (en) Moving object detection using low illumination depth capable computer vision
US20110063446A1 (en) Saccadic dual-resolution video analytics camera
EP1592985B1 (en) Device for a motor vehicle used for the three-dimensional detection of a scene inside or outside said motor vehicle
US8743202B2 (en) Stereo camera for a motor vehicle
JP2007263563A (en) Camera module
US9177211B2 (en) Method and apparatus for identifying motor vehicles for monitoring traffic
US7684590B2 (en) Method of recognizing and/or tracking objects
JP2014523359A (en) Automotive camera systems
US20110234749A1 (en) System and method for detecting and recording traffic law violation events
US20100085474A1 (en) Camera module
WO2003029046A1 (en) Apparatus and method for sensing the occupancy status of parking spaces in a parking lot
ES2352300T3 (en) System to detect vehicles.
JP2009040270A (en) In-vehicle camera
US8836786B2 (en) Camera for a vehicle
DE60208894T2 (en) Rain sensor with stereoscopic picture system
DE10228638A1 (en) Vehicle zone monitoring apparatus for detecting physical body present in vicinity of vehicle, has CPU that identifies whether object indicated in image obtained from cameras is artificial structure
TW200702633A (en) Imaging position analyzing method
DE102009023326A1 (en) Apparatus and method for detecting a critical area and pedestrian detection apparatus using the same
NL1031867C1 (en) Method for determining vehicle data.
US6606458B2 (en) Automatic framing camera
WO2010050012A1 (en) Camera module mounted on a car
DE102004010197B4 (en) Method for checking the function of a position-finding or environment detection device of a vehicle or for controlling a digital map
EP2384292B1 (en) Camera arrangement for a motor vehicle, and motor vehicle comprising a camera arrangement
EP2517175A1 (en) Object identifying apparatus, moving body control apparatus, and information providing apparatus

Legal Events

Date Code Title Description
OM8 Search report available as to paragraph 43 lit. 1 sentence 1 patent law
R082 Change of representative

Representative=s name: SCHAUMBURG & PARTNER PATENTANWAELTE GBR, DE

Representative=s name: SCHAUMBURG & PARTNER PATENTANWAELTE MBB, DE

Representative=s name: SCHAUMBURG UND PARTNER PATENTANWAELTE MBB, DE

R012 Request for examination validly filed
R081 Change of applicant/patentee

Owner name: HELLA GMBH & CO. KGAA, DE

Free format text: FORMER OWNER: HELLA KGAA HUECK & CO., 59557 LIPPSTADT, DE

R082 Change of representative

Representative=s name: SCHAUMBURG UND PARTNER PATENTANWAELTE MBB, DE