EP1595391A1 - Device and method for image generation (Vorrichtung und Verfahren zur Bilderzeugung) - Google Patents

Device and method for image generation

Info

Publication number
EP1595391A1
Authority
EP
European Patent Office
Prior art keywords
image
partial
area
infrared
areas
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP03799438A
Other languages
German (de)
English (en)
French (fr)
Inventor
Ulrich Seger
Jens Schick
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP1595391A1 publication Critical patent/EP1595391A1/de
Withdrawn legal-status Critical Current

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/188Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/74Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/30Transforming light or analogous information into electric information
    • H04N5/33Transforming infrared radiation

Definitions

  • Passive and active night vision systems are known. Passive night vision systems consist of a thermal imaging camera. The disadvantage of passive night vision systems is that it is difficult to produce realistic images with a thermal imager.
  • Active night vision systems consist of an infrared-emitting lighting unit, for example a halogen lamp with a filter, and one or more infrared-sensitive cameras. The lighting unit illuminates the area in front of the vehicle in the high-beam region; the cameras record the reflected infrared high beam, and the image is displayed on a monitor or a head-up display. The headlights used for the visible low beam and high beam can also be used to emit infrared light.
  • German patent DE 42 43 200 C2 describes a device for friend-foe identification of land vehicles for military applications. A thermal imager is coupled with a CO2 laser to make a hidden signature for friend-foe identification visible. An observer sends out a single light pulse, and the infrared camera receives the reflected signal in a synchronized manner. A disadvantage of this device is that thermal imaging cameras do not provide realistic images. DE 42 43 200 C2 contains no reference to a device or a method for generating realistic images suitable for use in motor vehicles.
  • The device described below and the method for image generation in a motor vehicle, in which images of the surroundings of the motor vehicle are generated with at least one infrared-sensitive camera, have the advantage that realistic images with high image quality are produced in good as well as in poor visibility and/or weather conditions.
  • the device and the method for image generation can contribute in a particularly advantageous manner to a reduction in the number of accidents in poor visibility conditions, in particular at night, due to the high image quality of the images generated.
  • the generation of realistic images with high image quality in poor visibility and / or weather conditions is also achieved because the illumination of the image capture area with a radiation source that radiates at least in the near infrared spectral range is not significantly disturbed by rain or snow.
  • the lifespan of the at least one radiation source that radiates at least in the near infrared spectral range is increased by the pulsed light emission.
  • the thermal output of the radiation source is lower than in continuous operation with the same output, since the radiation source can regenerate during the pause in the pulse. This leads directly to an increase in the service life.
  • A longer service life of the radiation source and, in connection with this, longer replacement intervals advantageously contribute to a reduction in operating costs of the motor vehicle.
  • At least one infrared-sensitive CMOS camera for generating at least one image of the surroundings of the motor vehicle is particularly advantageous. Compared to other types of cameras the blooming effect is reduced in CMOS cameras. Blooming is the overexposure of the generated image through glare from strong light sources. It is also possible to use CMOS cameras with a linear or logarithmic characteristic of exposure sensitivity. The logarithmic characteristic enables the generation of images with high image quality even with a large dynamic range of brightness within the image capture area. A large dynamic range of brightness occurs in a motor vehicle, for example, when the motor vehicle enters a tunnel from a sunlit street.
  • The maximum average radiant intensity or luminous intensity of radiation sources in motor vehicles is defined by statutory provisions.
  • The radiant intensity is the radiant flux per solid angle, with the unit watt per steradian (W/sr).
  • The luminous intensity is the corresponding physiological quantity, the luminous flux per solid angle.
  • The unit of luminous intensity is the candela (cd).
  • the at least one laser diode array that radiates at least in the near infrared radiation range and / or at least one laser array that radiates in the near infrared spectral range enables the pulsed illumination of partial regions of the image capture range of the at least one infrared-sensitive camera in a simple manner.
  • laser diodes enable a short response to generate short light pulses with high radiation intensity during the light pulse duration.
  • Infrared-emitting laser diodes and/or infrared-emitting lasers have the advantage that the laser light has a small spectral bandwidth. With an appropriate band-pass filter in front of the at least one infrared-sensitive camera, it is possible to filter out other spectral ranges.
  • the use of at least one laser array and / or at least one laser diode array thus contributes to the generation of images with high image quality.
  • infrared-radiating lasers and / or infrared-radiating laser diodes have the advantage that they are highly efficient.
  • the duty cycle of the laser diodes and / or the lasers is reduced by the pulsed illumination of partial areas of the image area of the infrared-sensitive camera.
  • the duty cycle is the time of the light emission in relation to the total operating time of a lighting device.
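The duty-cycle definition above can be illustrated with a small sketch (the 8 ms / 40 ms and 80 ns / 40 ms figures are taken from the exemplary embodiments later in this description; the function name is illustrative):

```python
def duty_cycle(pulse_duration_s: float, period_s: float) -> float:
    """Duty cycle: light-emission time relative to total operating time."""
    return pulse_duration_s / period_s

# Partial-area illumination of the preferred embodiment: 8 ms pulse, 40 ms period.
print(duty_cycle(8e-3, 40e-3))   # ≈ 0.2
# Per-line illumination of the 480-line variant: 80 ns pulse, 40 ms period.
print(duty_cycle(80e-9, 40e-3))  # ≈ 2e-6
```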
  • the detection of image lines and / or image columns of the infrared-sensitive camera as partial image areas is advantageous.
  • This advantageously makes it possible to increase the radiation intensity as explained above, in compliance with the statutory provisions.
  • This advantage is achieved, on the one hand, by only illuminating individual sub-areas, while the remaining sub-areas are not illuminated.
  • modern cameras only need a few nanoseconds to sample an image line and / or image column.
  • This short pulsed and spatially restricted illumination makes it possible to further increase the radiation intensity, that is to say the radiation flow per solid angle. Both effects contribute to a substantial increase in the radiation intensity of the illuminated partial area when the partial image area is recorded as an image line and / or image column. This in turn contributes to an improvement in the overall image quality of the generated image.
  • the partial image areas are captured, essentially the corresponding partial area of the image capture area of the at least one infrared-sensitive camera is illuminated in a pulsed manner. This avoids unnecessary illumination of areas of the image capture area that are not captured. This has two important advantages. On the one hand, this contributes to energy savings in the motor vehicle, since areas which are not covered are not illuminated. On the other hand, it is possible to increase the radiation intensity of a partial area with the same average radiation intensity.
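The two effects can be quantified in a short sketch. Assuming, for illustration only, five equal partial areas each lit back-to-back for 8 ms of a 40 ms period at the same average radiated power as continuous full-field illumination, the achievable peak radiant intensity gain is the product of the spatial and temporal restriction factors:

```python
def peak_gain(num_areas: int, pulse_ms: int, period_ms: int) -> float:
    """Peak radiant intensity gain over continuous full-field illumination
    at equal average power: spatial factor (only 1/num_areas of the field
    is lit at a time) times temporal factor (period / pulse duration)."""
    return num_areas * period_ms / pulse_ms

# Five partial areas, each lit 8 ms out of every 40 ms:
print(peak_gain(5, 8, 40))  # 25.0
```

So under these assumptions the illuminated partial area can receive up to 25 times the radiant intensity of continuous full-field lighting without raising the average.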
  • a variant of the device and the method is advantageous in which at least one partial area of the image capture area of the at least one infrared-sensitive camera comprises at least two partial image areas. The combination of partial image areas into an illuminated partial area contributes to a simple and inexpensive construction of the radiation source.
  • a variant of the device and the method described below has particular advantages, in which the acquisition of the partial image areas can be shifted in time compared to the pulsed illumination. This makes it possible, for example, to compensate for the time-of-flight effects of the radiation from the emission of light pulses to the detection by the camera. This contributes to images with high image quality.
  • An infrared-sensitive camera, in particular an infrared-sensitive CMOS camera, with means for carrying out all or at least the essential steps of the described method for image generation in a motor vehicle is particularly advantageous.
  • The advantages of such an infrared-sensitive camera are the described advantages of the method and the device for image generation in a motor vehicle.
  • FIG. 1 shows an overview drawing of the preferred exemplary embodiment
  • FIG. 2 shows a block diagram of the preferred exemplary embodiment
  • FIG. 3 shows a time diagram of the preferred exemplary embodiment.
  • An infrared-sensitive camera creates images of the surroundings of the motor vehicle.
  • a radiation source that radiates in the near infrared spectral range is used for the pulsed illumination of partial areas of the image acquisition area of the camera.
  • the camera generates the images by capturing partial image areas.
  • a time synchronization is carried out with the pulsed illumination of the partial area comprising at least the partial image area.
  • the acquisition of partial image areas of an infrared-sensitive CMOS camera is time-synchronized with a laser diode array which radiates in the near infrared spectral range.
  • FIG. 1 shows an overview drawing of the device for image generation in a motor vehicle of the preferred exemplary embodiment, consisting of an infrared-sensitive camera 10 with a control unit / processing unit 16 and a radiation source 12 that radiates in the near infrared spectral range with a control unit 14.
  • the control unit / processing unit 16 of the infrared-sensitive camera 10 and the control unit 14 of the radiation source 12 are connected to one another via a signal line 18.
  • the radiation source 12 generates infrared radiation 20 in the near infrared spectral range for the pulsed illumination of the surroundings 24 of the motor vehicle.
  • the radiation source 12 is installed in the front area of the motor vehicle between the headlights for the low beam / high beam.
  • a laser diode array is used as the radiation source 12 which radiates in the near infrared spectral range and emits infrared radiation 20 in the near infrared spectral range.
  • the radiation source 12 is controlled and monitored by the control unit 14.
  • the infrared-sensitive camera 10 generates images of the surroundings 24 of the motor vehicle from the backscattered infrared radiation 22.
  • the infrared-sensitive camera 10 is mounted behind the windshield of the motor vehicle in the area of the interior rear-view mirror.
  • the infrared-sensitive camera 10 is an infrared-sensitive CMOS camera 10.
  • the CMOS camera 10 is controlled via the control unit / processing unit 16.
  • the CMOS camera 10 transmits the generated images of the surroundings 24 of the motor vehicle to the control unit / processing unit 16 for further processing.
  • The control unit / processing unit 16 of the CMOS camera 10 and the control unit 14 of the radiation source 12 each contain at least one microprocessor and programs for control and for processing.
  • FIG. 2 shows a block diagram of the device for image generation in a motor vehicle of the preferred exemplary embodiment. The additional components compared to FIG. 1 and the mode of operation of the device are explained below.
  • the radiation source 12 which radiates in the near infrared spectral range consists of a laser diode matrix with five rows, each with four laser diodes 28.
  • the laser diodes 28 of the radiation source 12 generate infrared radiation at least in the near infrared spectral range, preferably in the wavelength range between 850 nm and 900 nm.
  • Four laser diodes 28 of a line are controlled by the control unit 14, line by line, pulsed together.
  • the line-by-line infrared radiation is then used via optics 26 for pulsed illumination of a partial area 32, 34, 36, 38, 40 of the image capture area of the CMOS camera 10.
  • FIG. 2 shows the first illuminated partial area 32, the second illuminated partial area 34, the third illuminated partial area 36, the fourth illuminated partial area 38 and the fifth illuminated partial area 40 of the image acquisition area.
  • the first partial area 32 is illuminated by the first line of the radiation source 12, while the further partial areas 34, 36, 38, 40 are illuminated by lines 2 to 5 of the radiation source 12.
  • The optics 26 ensure a partial overlap of the partial areas 32, 34, 36, 38, 40, in particular for their complete illumination, so that the entire image capture area of the infrared-sensitive camera is covered by the partial areas 32, 34, 36, 38, 40.
  • the optics 26 also serves to generate a homogeneously illuminated partial area 32, 34, 36, 38, 40 from the four point-shaped laser diodes 28 as light sources.
  • the lines of the radiation source 12 are controlled in succession by the control unit 14 in such a way that the subregions 32, 34, 36, 38, 40 are sequentially illuminated in pulsed fashion.
  • a partial area 32, 34, 36, 38, 40 is in the preferred exemplary embodiment an approximately rectangular area around the motor vehicle in the direction of travel.
  • the surroundings of the motor vehicle are, for example, the roadway 30 shown in FIG. 2.
  • the reflected infrared radiation is captured by the infrared-sensitive CMOS camera 10.
  • the infrared-sensitive CMOS camera 10 generates images of the surroundings of the motor vehicle from the retroreflected infrared radiation and transmits the generated images to the control unit / processing unit 16.
  • The CMOS camera 10 is oriented such that it captures the surroundings of the motor vehicle in the direction of travel, for example the roadway 30 shown in FIG. 2.
  • The infrared-sensitive CMOS camera 10 consists of individual pixels, which in the first exemplary embodiment are arranged in a matrix comprising 640x480 pixels.
  • Sub-image areas 42, 44, 46, 48, 50 are captured to generate an image.
  • The corresponding partial image areas 42, 44, 46, 48, 50 are shown in FIG. 2.
  • the first partial image area 42 comprises the first partial area 32, while the further partial areas 44, 46, 48, 50 comprise the corresponding partial areas 34, 36, 38, 40.
  • the partial image areas 42, 44, 46, 48, 50 are dimensioned such that limits of the associated partial areas 32, 34, 36, 38, 40 are not exceeded. This ensures complete illumination of the partial image areas 42, 44, 46, 48, 50.
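This dimensioning rule can be expressed as a small consistency check (the line ranges and the 4-line overlap margin are hypothetical, chosen only to mirror the 96-line division of the preferred embodiment):

```python
# Hypothetical bounds: each illuminated partial area spans its 96 image
# lines plus a small overlap margin provided by the optics 26.
partial_image_areas = [(1, 96), (97, 192), (193, 288), (289, 384), (385, 480)]
partial_areas = [(max(1, s - 4), min(480, e + 4)) for s, e in partial_image_areas]

def fully_illuminated(image_area, lit_area):
    """True if the captured line range lies inside the illuminated range,
    i.e. the limits of the partial area are not exceeded."""
    return lit_area[0] <= image_area[0] and image_area[1] <= lit_area[1]

assert all(fully_illuminated(img, lit)
           for img, lit in zip(partial_image_areas, partial_areas))
```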
  • The control unit / processing unit 16 controls the time synchronization, explained below, between the capture of a partial image area 42, 44, 46, 48, 50 by the CMOS camera 10 and the pulsed illumination of the partial area 32, 34, 36, 38, 40 corresponding to that partial image area.
  • the control unit / processing unit 16 controls the control unit 14 via the signal line 18 in such a way that the laser diodes 28 of the first line of the radiation source 12 are activated.
  • Sub-area 32 is thereby illuminated with near infrared radiation.
  • The control unit / processing unit 16 controls the infrared-sensitive CMOS camera 10 in such a way that, during the illumination of the partial area 32, the image lines of the CMOS camera 10 that lie within the corresponding partial image area 42 are read sequentially, image line by image line, by a sample-and-hold circuit. In this exemplary embodiment, these are the image lines 1 to 96.
  • the control unit / processing unit 16 controls the control unit 14 via the signal line 18 such that the laser diodes 28 of the first line of the radiation source 12 are switched off.
  • steps 1 to 3 for lines 2, 3, 4 and 5 of radiation source 12 and image lines 97 to 192, 193 to 288, 289 to 384 and 385 to 480 of infrared-sensitive CMOS camera 10 are repeated accordingly.
  • the generated image is finally composed of the captured image lines 1 to 480 of the partial image areas.
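The repeat-and-compose sequence above can be sketched as a simulation (a non-normative sketch; function and variable names are illustrative):

```python
def capture_frame(num_rows=5, lines_per_area=96):
    """Simulate one frame: for each laser-diode row, switch it on, read the
    image lines of the corresponding partial image area, switch it off."""
    frame = []
    for row in range(num_rows):
        # Step 1: control unit 14 activates the laser diodes of this row.
        first = row * lines_per_area + 1
        # Step 2: the image lines of the partial image area are read sequentially.
        frame.extend(range(first, first + lines_per_area))
        # Step 3: the laser diodes of this row are switched off again.
    return frame

lines = capture_frame()
assert lines == list(range(1, 481))  # image composed of image lines 1 to 480
```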
  • the process described above is repeated to generate further images.
  • 25 images are generated per second.
  • the four laser diodes 28 of each line are operated accordingly with a period of 40 ms and a pulse duration of 8 ms.
  • the control unit 14 is controlled by the control unit / processing unit 16 by unidirectional or bidirectional transmission of analog and / or digital signals.
  • The signal line 18 is a communication bus in the motor vehicle, for example a CAN bus.
  • FIG. 3 shows a time diagram of the irradiance of the pulse-illuminated partial areas 32, 34, 36, 38, 40 in the surroundings of the motor vehicle for the preferred exemplary embodiment.
  • FIG. 3 shows the basic course of the irradiance of the partial areas 32, 34, 36, 38, 40 which are illuminated in a pulsed manner. The time t is plotted on the abscissa of FIGS. 3a, 3b, 3c, 3d and 3e.
  • the irradiance is defined as the radiant power per area.
  • FIG. 3 a shows the time course of the irradiance of the pulsed-illuminated partial area 32 in accordance with the block diagram according to FIG. 2.
  • The partial area 32 is illuminated for approximately 8 ms with a period of 40 ms.
  • FIGS. 3b, 3c, 3d, 3e correspondingly show the time course of the irradiance of the further partial areas 34, 36, 38, 40 according to FIG. 2.
  • the pulsed illumination of these partial areas 34, 36, 38, 40 also takes place with a period of 40 ms and a lighting duration of 8 ms.
  • the pulsed illumination takes place sequentially starting from the partial area 32 via the partial areas 34, 36, 38 up to the partial area 40. After the pulsed illumination of the partial area 40, the pulsed illumination starts again at the partial area 32.
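The sequential pulse schedule of FIGS. 3a-3e can be written out as start times (an illustrative sketch; back-to-back 8 ms pulses are assumed, so each partial area recurs with the 40 ms period stated above):

```python
def pulse_starts(num_areas=5, pulse_ms=8.0, num_frames=2):
    """Start time (ms) of each partial area's light pulse, areas lit
    back-to-back: 32, 34, 36, 38, 40, then 32 again, and so on."""
    return [(frame * num_areas + i) * pulse_ms
            for frame in range(num_frames) for i in range(num_areas)]

starts = pulse_starts()
# Partial area 32 is lit at t = 0 ms and again one period (40 ms) later.
assert starts[0] == 0.0 and starts[5] == 40.0
```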
  • In a variant of the preferred exemplary embodiment, the image lines of the CMOS camera 10 are not read sequentially, image line by image line; instead, all image lines of a partial image area are read in parallel. This means that, for example, in the first step image lines 1 to 96 are read in simultaneously in one parallel sampling process.
  • the partial regions 32, 34, 36, 38, 40 according to FIG. 2 are also pulsedly illuminated by the laser diode array used in the preferred exemplary embodiment.
  • In a further variant, the corresponding partial area 32, 34, 36, 38, 40 is illuminated in a pulsed manner each time an image line is sampled. As a result, to capture the partial image area 42, for example, the partial area 32 is illuminated with a total of 96 successive light pulses; with each light pulse illuminating the partial area 32, one of the image lines 1 to 96 is read in a sampling process.
  • the pulsed illumination of the partial area 32 is synchronized in time with the sampling process of all 96 image lines of the first partial image area 42 and accordingly for all further partial areas 34, 36, 38, 40.
  • the acquisition of a partial image area 42, 44, 46, 48, 50 according to FIG. 2 takes approximately 8 ms with a period of 40 ms.
  • individual light pulses with a light pulse duration of approximately 120 ns are generated with a period of the light pulses of approximately 83 ⁇ s.
  • These light pulses are synchronized in time with the sampling process of a corresponding image line.
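The roughly 83 µs pulse period quoted above follows from the acquisition time per partial image area (an arithmetic sketch using the values given in this description):

```python
acquisition_ms = 8.0    # time to capture one partial image area (period 40 ms)
lines_per_area = 96     # one ~120 ns light pulse per sampled image line

# Time between successive light pulses: 8 ms spread over 96 line samples.
pulse_period_us = acquisition_ms * 1000 / lines_per_area
print(round(pulse_period_us, 1))  # 83.3 (µs), matching the ~83 µs stated above
```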
  • In a further variant, a modified laser diode array is used instead of the one in the preferred exemplary embodiment.
  • The laser diode array used in this variant consists of 480 lines, each with about 100 laser diodes per line.
  • The laser diodes used are miniaturized. The number of lines of the laser diode array is identical to the number of image lines of the CMOS image sensor used.
  • the optics used are such that a vertical illumination range of 1 ° is achieved per line. The illumination area is extended in the horizontal direction, so that overall a linear illumination of the surroundings of the motor vehicle takes place.
  • 480 subregions are generated for pulsed illumination. When a partial area is pulsed illuminated by a line of the laser diode array, the partial image area corresponding to this partial area is recorded for this purpose.
  • a partial image area is an image line of the CMOS image sensor.
  • Each partial area is illuminated with a light pulse duration of approximately 80 ns and a period of 40 ms. This gives a duty cycle of pulse duration to pulse period of 2·10⁻⁶.
  • the sampling process for one image line takes about 120 ns.
  • the light pulse is synchronized in time with the sampling process of the image line so that the 80 ns light pulse is generated approximately in the middle of the 120 ns sample process of the image line.
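Centering the 80 ns pulse in the 120 ns sampling window fixes the required delay from the start of the line sample (a small arithmetic check of the figures above):

```python
sample_ns = 120   # sampling time of one image line
pulse_ns = 80     # light pulse duration

# Delay from sample start so the pulse sits in the middle of the window:
offset_ns = (sample_ns - pulse_ns) / 2
assert offset_ns == 20.0
# The pulse then runs from t = 20 ns to t = 100 ns within the 120 ns sample.
```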
  • The lines of the laser diode array are controlled in such a way that the generated illumination sweep runs synchronously with the scan sweep of the CMOS image sensor.
  • At least one infrared-sensitive camera is generally used, which has means for capturing partial image areas, in particular image lines.
  • at least one further infrared-sensitive camera in particular at least one infrared-sensitive CMOS camera, is used in one variant.
  • infrared-sensitive cameras with a matrix size of, for example, 1024x768 pixels and / or 1280x1024 pixels are used in further variants.
  • image columns are captured in other variants of the exemplary embodiments described above instead of image lines. Accordingly, the partial image areas and the partial areas are correspondingly spatial in the vertical direction instead of in the horizontal direction extended.
  • In this case, the laser diode array used is controlled column by column for pulsed illumination.
  • Further variants for the division and number of partial areas and partial image areas are possible. The only requirement is that the division and the number of partial areas and partial image areas are selected such that at least one partial area can be assigned to each partial image area, this at least one partial area comprising at least that partial image area. It must also be possible to carry out a temporal synchronization between the capture of a partial image area and the pulsed illumination of the partial area comprising at least that partial image area. In addition to the temporal synchronization, this also enables a kind of spatial synchronization between the pulsed illumination of at least one partial area and the at least one corresponding partial image area.
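The stated requirement, that every partial image area must be assignable to a partial area comprising it, can be expressed as a small validity check (the data structures and names are hypothetical; areas are given as (first_line, last_line) ranges):

```python
def valid_division(image_areas, lit_areas):
    """Assign each partial image area a partial area that comprises it;
    return the assignment, or None if the division is invalid."""
    assignment = {}
    for img in image_areas:
        partners = [lit for lit in lit_areas
                    if lit[0] <= img[0] and img[1] <= lit[1]]
        if not partners:
            return None          # some partial image area is not covered
        assignment[img] = partners[0]
    return assignment

# Two partial image areas combined into one illuminated partial area:
a = valid_division([(1, 96), (97, 192)], [(1, 192)])
assert a == {(1, 96): (1, 192), (97, 192): (1, 192)}
```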
  • In one variant, the control unit / processing unit and the camera are one unit.
  • the at least one radiation source that radiates at least in the near infrared spectral range and the at least one control unit of the radiation source are one unit.
  • a variant of the device and the method described above uses, in addition to the laser diode array which radiates in the near infrared spectral range, at least one further radiation source which radiates in the near infrared spectral range.
  • at least one laser array emitting in the near infrared spectral range is used.
  • at least one radiation source which radiates in the near infrared spectral range is suitable for pulsed illumination of partial areas and for synchronizing the time with the detection of partial image areas of a camera.
  • The temporal synchronization of the capture of a partial image area with the pulsed illumination of the partial area comprising at least that partial image area takes place, in different variants of the exemplary embodiments described, line by line and/or partial image area by partial image area and/or partial area by partial area and/or image by image and/or image sequence by image sequence.
  • For line-by-line temporal synchronization, synchronization signals are generated for each image line.
  • For partial-image-area-wise temporal synchronization, corresponding synchronization signals are generated for each partial image area.
  • the time synchronization takes place frame by frame. This type of synchronization takes place by means of image synchronization pulses.
  • In a further variant, the time synchronization is carried out image sequence by image sequence. This means that an image sequence synchronization pulse is generated, for example, after every ten images.
  • the use of clock generators in the control unit of the radiation source and / or in the control unit / processing unit of the camera is possible.
  • Quartz crystals are used as clock generators.
  • Possible phase shifts between the pulsed illumination of partial areas of the image capture area of the camera and the capture of the image lines to be synchronized in time are compensated for by an adjustable time offset to one another.
  • The causes of the phase shift are time delays in the generation of the light pulses and propagation delays of the light pulses.
  • This shift can be set in a type-specific manner, that is to say depending on the individual components used, or it is determined by means of an image quality measure.
  • The image quality measure is determined via an image evaluation in the control unit / processing unit of the infrared-sensitive camera, using the brightness of the image and/or the brightness gradient across the image, that is to say in the direction from the first image line to the last image line.
  • the time shift is optimally set as a function of the determined image quality measure by means of a corresponding control.
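The described control, adjusting the time offset until the image quality measure is optimal, could be sketched, for illustration only, as a simple hill-climb; the quality function here is a stand-in for the brightness / brightness-gradient evaluation in the control unit / processing unit, and all names and values are hypothetical:

```python
def tune_offset(quality, start_ns=0.0, step_ns=5.0, iters=20):
    """Adjust the illumination-to-sampling time offset in the direction
    that improves the image quality measure (e.g. image brightness)."""
    offset = start_ns
    for _ in range(iters):
        if quality(offset + step_ns) > quality(offset):
            offset += step_ns
        elif quality(offset - step_ns) > quality(offset):
            offset -= step_ns
        else:
            break                # local optimum of the quality measure
    return offset

# Stand-in quality measure: brightness peaks when the offset cancels a
# hypothetical 40 ns generation/propagation delay of the light pulse.
best = tune_offset(lambda t: -(t - 40.0) ** 2)
assert best == 40.0
```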

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Radiation Pyrometers (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
EP03799438A 2003-02-07 2003-12-19 Vorrichtung und verfahren zur bilderzeugung Withdrawn EP1595391A1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE10305010 2003-02-07
DE10305010A DE10305010B4 (de) 2003-02-07 2003-02-07 Vorrichtung und Verfahren zur Bilderzeugung
PCT/DE2003/004249 WO2004071074A1 (de) 2003-02-07 2003-12-19 Vorrichtung und verfahren zur bilderzeugung

Publications (1)

Publication Number Publication Date
EP1595391A1 true EP1595391A1 (de) 2005-11-16

Family

ID=32797327

Family Applications (1)

Application Number Title Priority Date Filing Date
EP03799438A Withdrawn EP1595391A1 (de) 2003-02-07 2003-12-19 Vorrichtung und verfahren zur bilderzeugung

Country Status (5)

Country Link
US (1) US7935928B2 (ja)
EP (1) EP1595391A1 (ja)
JP (1) JP4204558B2 (ja)
DE (1) DE10305010B4 (ja)
WO (1) WO2004071074A1 (ja)

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7983802B2 (en) * 1997-10-22 2011-07-19 Intelligent Technologies International, Inc. Vehicular environment scanning techniques
DE102005041535A1 (de) * 2005-08-31 2007-03-01 Siemens Ag Night-Vision-System, bei dem die Umgebung mittels einer Aufnahmevorrichtung aufgenommen und mittels einer Anzeige zumindest teilweise dargestellt wird
DE102006031580A1 (de) 2006-07-03 2008-01-17 Faro Technologies, Inc., Lake Mary Verfahren und Vorrichtung zum dreidimensionalen Erfassen eines Raumbereichs
US7745771B2 (en) * 2007-04-03 2010-06-29 Delphi Technologies, Inc. Synchronous imaging using segmented illumination
DE102007022523A1 (de) * 2007-05-14 2008-11-20 Bayerische Motoren Werke Aktiengesellschaft Kraftfahrzeug
JP4702426B2 (ja) 2008-10-10 2011-06-15 株式会社デンソー 車両検出装置、車両検出プログラム、およびライト制御装置
JP4666049B2 (ja) 2008-10-17 2011-04-06 株式会社デンソー 光源識別装置、光源識別プログラム、車両検出装置、およびライト制御装置
DE102009010465B3 (de) * 2009-02-13 2010-05-27 Faro Technologies, Inc., Lake Mary Laserscanner
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
DE102009015920B4 (de) 2009-03-25 2014-11-20 Faro Technologies, Inc. Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
FR2944363B1 (fr) 2009-04-14 2011-11-18 Onera (Off Nat Aerospatiale) Procede et systeme d'imagerie active a champ large
DE102009035337A1 (de) 2009-07-22 2011-01-27 Faro Technologies, Inc., Lake Mary Verfahren zum optischen Abtasten und Vermessen eines Objekts
DE102009035336B3 (de) 2009-07-22 2010-11-18 Faro Technologies, Inc., Lake Mary Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
DE102009055988B3 (de) 2009-11-20 2011-03-17 Faro Technologies, Inc., Lake Mary Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
DE102009057101A1 (de) 2009-11-20 2011-05-26 Faro Technologies, Inc., Lake Mary Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
DE102009055989B4 (de) 2009-11-20 2017-02-16 Faro Technologies, Inc. Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
JP2013517502A (ja) 2010-01-20 2013-05-16 ファロ テクノロジーズ インコーポレーテッド 複数の通信チャネルを有する可搬型の関節アーム座標測定機
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9879976B2 (en) 2010-01-20 2018-01-30 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
US8692198B2 (en) * 2010-04-21 2014-04-08 Sionyx, Inc. Photosensitive imaging devices and associated methods
DE102010020925B4 (de) 2010-05-10 2014-02-27 Faro Technologies, Inc. Method for optically scanning and measuring an environment
DE102010032726B3 (de) 2010-07-26 2011-11-24 Faro Technologies, Inc. Device for optically scanning and measuring an environment
DE102010032725B4 (de) 2010-07-26 2012-04-26 Faro Technologies, Inc. Device for optically scanning and measuring an environment
DE102010032723B3 (de) 2010-07-26 2011-11-24 Faro Technologies, Inc. Device for optically scanning and measuring an environment
DE102010033561B3 (de) 2010-07-29 2011-12-15 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
DE102012100609A1 (de) 2012-01-25 2013-07-25 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus
EP2690483A1 (en) * 2012-07-25 2014-01-29 Johnson Controls Automotive Electronics SAS Head-up display and method for operating it
DE102012107544B3 (de) 2012-08-17 2013-05-23 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
DE102012109481A1 (de) 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
DE102013013330A1 (de) 2013-08-09 2014-03-06 Daimler Ag Vehicle with at least one device for detecting the vehicle surroundings
JP6064151B1 (ja) 2015-06-12 2017-01-25 Panasonic IP Management Co., Ltd. Illumination device, imaging system, and illumination method
EP3159711A1 (en) 2015-10-23 2017-04-26 Xenomatix NV System and method for determining a distance to an object
KR102372088B1 (ko) 2015-10-29 2022-03-08 Samsung Electronics Co., Ltd. Image acquisition apparatus and image acquisition method
DE102015122844A1 (de) 2015-12-27 2017-06-29 Faro Technologies, Inc. 3D measuring device with battery pack
DE102016119343A1 (de) * 2016-10-11 2018-04-12 Bircher Reglomat Ag Object monitoring with infrared image recording and infrared pulse illumination
KR20220112322A (ko) * 2021-02-03 2022-08-11 Samsung Electronics Co., Ltd. Interface circuit capable of adjusting the output impedance of a transmission circuit, and image sensor including the same

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3727061A (en) * 1970-07-06 1973-04-10 Us Army Pulse laser communication system
US3947119A (en) * 1974-02-04 1976-03-30 Ball Brothers Research Corporation Active sensor automatic range sweep technique
US4294544A (en) * 1979-08-03 1981-10-13 Altschuler Bruce R Topographic comparator
US4501961A (en) * 1982-09-01 1985-02-26 Honeywell Inc. Vision illumination system for range finder
DE3404396A1 (de) * 1984-02-08 1985-08-14 Dornier Gmbh, 7990 Friedrichshafen Device and method for recording range images
DE3642051A1 (de) * 1985-12-10 1987-06-11 Canon Kk Method for three-dimensional information processing and device for obtaining three-dimensional information about an object
DE8903591U1 (de) 1989-03-22 1990-04-19 Schucht, Conrad, Dipl.-Ing., 3300 Braunschweig, De
US5003166A (en) * 1989-11-07 1991-03-26 Massachusetts Institute Of Technology Multidimensional range mapping with pattern projection and cross correlation
DE3942770A1 (de) * 1989-12-23 1991-07-11 Dornier Luftfahrt Range image camera
DE4007646C2 (de) * 1990-03-10 2003-08-07 Daimler Chrysler Ag Arrangement for improving visibility in vehicles
DE4107850B4 (de) * 1990-03-10 2006-06-29 Daimlerchrysler Ag Arrangement for improving visibility, particularly in vehicles
US5029967A (en) * 1990-04-09 1991-07-09 The Boeing Company Optical source for optical sensing system
JP2923331B2 (ja) * 1990-06-12 1999-07-26 Olympus Optical Co., Ltd. Optical recording medium and reproduction device therefor
US5175601A (en) * 1991-10-15 1992-12-29 Electro-Optical Information Systems High-speed 3-D surface measurement surface inspection and reverse-CAD system
DE4243200C1 (de) 1992-12-19 1996-01-18 Dornier Gmbh Device for friend-or-foe identification of land vehicles
JPH06301304A (ja) * 1993-02-19 1994-10-28 Minolta Camera Co Ltd Fixing device
US5669174A (en) * 1993-06-08 1997-09-23 Teetzel; James W. Laser range finding apparatus
DE69413761T2 (de) * 1993-07-29 1999-07-08 Omron Tateisi Electronics Co Transmitter for electromagnetic waves and distance measuring device
US5675407A (en) * 1995-03-02 1997-10-07 Zheng Jason Geng Color ranging method for high speed low-cost three dimensional surface profile measurement
US5852672A (en) * 1995-07-10 1998-12-22 The Regents Of The University Of California Image system for three dimensional, 360 DEGREE, time sequence surface mapping of moving objects
US5726443A (en) * 1996-01-18 1998-03-10 Chapman Glenn H Vision system and proximity detector
SE9601742L (sv) * 1996-05-07 1997-11-08 Besam Ab Method of determining the distance and position of an object
US6154279A (en) * 1998-04-09 2000-11-28 John W. Newman Method and apparatus for determining shapes of countersunk holes
JP3690157B2 (ja) 1998-12-28 2005-08-31 Suzuki Motor Corp. Surface defect inspection apparatus
US6377353B1 (en) * 2000-03-07 2002-04-23 Pheno Imaging, Inc. Three-dimensional measuring system for animals using structured light
AU2001261160A1 (en) * 2000-05-03 2001-11-12 Stephen T Flock Prosthesis and method of making
US6429429B1 (en) * 2000-06-22 2002-08-06 Ford Global Technologies, Inc. Night vision system utilizing a diode laser illumination module and a method related thereto
US6420704B1 (en) * 2000-12-07 2002-07-16 Trw Inc. Method and system for improving camera infrared sensitivity using digital zoom
JP4688196B2 (ja) 2001-03-23 2011-05-25 Stanley Electric Co., Ltd. Night vision system for automobiles
US6690017B2 (en) * 2002-02-21 2004-02-10 Ford Global Technologies, Llc GPS-based anti-blinding system for active night vision
DE10226278A1 (de) * 2002-06-13 2003-12-24 Peter Lux Reversing aid for vehicles

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2004071074A1 *

Also Published As

Publication number Publication date
US20070023660A1 (en) 2007-02-01
DE10305010B4 (de) 2012-06-28
JP4204558B2 (ja) 2009-01-07
WO2004071074A1 (de) 2004-08-19
JP2006515130A (ja) 2006-05-18
US7935928B2 (en) 2011-05-03
DE10305010A1 (de) 2004-09-02

Similar Documents

Publication Publication Date Title
DE10305010B4 (de) Device and method for image generation
DE60319238T2 (de) Night vision imaging system and method for mounting in a vehicle
DE102004050181B4 (de) Active night vision system with adaptive imaging
DE60316074T2 (de) Active night vision system for a motor vehicle
DE602005000337T2 (de) Night vision system
EP3465534B1 (de) Lighting device for a motor vehicle for increasing the detectability of an obstacle
DE102005033863A1 (de) Image recording system
WO2004071095A1 (de) Device and method for image generation
EP1668386B1 (de) Method for improving visibility in a motor vehicle
DE102011081384B4 (de) Method and device for distance determination for a vehicle
EP1298481A2 (de) Night vision device for vehicles
DE102007004349A1 (de) Night vision system, particularly for a vehicle, and method for producing a night vision image
DE102017222708A1 (de) 3D surround sensing by means of a projector and camera modules
WO2010136380A1 (de) Method and device for vehicle-based illumination in insufficiently lit traffic environments
WO2010057697A1 (de) Lighting unit for a vehicle, vehicle and method therefor
DE102012011847A1 (de) Night vision system for a motor vehicle
EP3833161B1 (de) Lighting device for a motor vehicle headlight
DE102005020950A1 (de) Active night vision system with fully synchronized light source and receiver
EP1632791A2 (de) Night vision system for a motor vehicle
DE10033103A1 (de) Infrared vision system
EP1262369B1 (de) Night vision device for vehicles
DE10062783A1 (de) Infrared vision system
EP3213963B1 (de) Vehicle with camera device and exterior lighting system
DE102012018118A1 (de) Method for operating a front camera of a motor vehicle taking into account the light of the headlight, corresponding device and motor vehicle
EP1437615A1 (de) Night vision system for motor vehicles

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20050907

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LI LU MC NL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL LT LV MK

DAX Request for extension of the european patent (deleted)
RBV Designated contracting states (corrected)

Designated state(s): DE FR GB SE

17Q First examination report despatched

Effective date: 20101013

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20110224