WO2005119175A1 - Camera Module - Google Patents
Camera Module
- Publication number: WO2005119175A1 (application PCT/JP2005/007039)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- camera module
- wavelength selection
- image
- imaging
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C3/00—Measuring distances in line of sight; Optical rangefinders
- G01C3/02—Details
- G01C3/06—Use of electric means to obtain final indication
- G01C3/08—Use of electric radiation detectors
- G01C3/085—Use of electric radiation detectors with electronic parallax measurement
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/20—Filters
- G02B5/201—Filters in the form of arrays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/13—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths with multiple sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/106—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using night vision cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/107—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using stereoscopic cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/108—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using 'non-standard' camera systems, e.g. camera sensor used for additional purposes i.a. rain sensor, camera sensor split in multiple image areas
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8033—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for pedestrian protection
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/8093—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B3/00—Simple or compound lenses
- G02B3/0006—Arrays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
- H04N23/843—Demosaicing, e.g. interpolating colour pixel values
Definitions
- the present invention relates to a small and thin camera module capable of measuring a distance to a subject.
- A stereo camera device mountable on an automobile is known, in which a subject is imaged by a stereo camera having two cameras and the distance to the subject is calculated by processing the obtained image information.
- The distance calculation in this stereo camera device is performed as follows. Using triangulation, one of the three-dimensional measurement techniques, the pixel block that correlates with a certain pixel block in the image obtained by one camera is searched for and identified in the image obtained by the other camera. The distance to the subject is then calculated based on the parallax between the two pixel blocks, that is, their relative displacement between the two images (stereo images).
- Japanese Patent Laying-Open No. 2003-83742 discloses the following method for correcting a horizontal mounting error of a stereo camera.
- In the image obtained by one camera, a plurality of approximate straight lines that are spatially parallel to each other and extend in the distance direction are specified, and a first vanishing point is calculated from their intersection.
- In the image obtained by the other camera, a plurality of approximate straight lines extending in the distance direction and spatially parallel to each other are likewise specified, and a second vanishing point is calculated from their intersection. Then, based on the amount of deviation between the first vanishing point and the second vanishing point, the error in the measured distance due to the horizontal mounting error is corrected.
- A method of calculating a vanishing point will be described with reference to FIG. 15. For example, consider a case where the camera images a pair of broken white lines 901 and 902 drawn on a road, as shown in FIG. 15. In this case, two approximate straight lines 901a and 902a extending in the distance direction are specified using one edge of each of the white lines 901 and 902. The intersection 903 of the two approximate straight lines 901a and 902a is then determined as the vanishing point.
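The vanishing-point construction above reduces to intersecting two fitted lines. A minimal sketch in Python (the function names and the line-through-two-points representation are illustrative, not from the patent):

```python
def line_coeffs(p1, p2):
    # Line through p1 and p2 expressed as a*x + b*y = c
    (x1, y1), (x2, y2) = p1, p2
    a, b = y2 - y1, x1 - x2
    return a, b, a * x1 + b * y1

def vanishing_point(line_a, line_b):
    # Intersection of two approximate straight lines (e.g. fits to the
    # lane-edge points of white lines 901 and 902), solved as a 2x2 system.
    a1, b1, c1 = line_coeffs(*line_a)
    a2, b2, c2 = line_coeffs(*line_b)
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # lines parallel in the image: no finite vanishing point
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

In practice each line would first be fitted (e.g. by least squares) to edge points extracted from the white lines; the intersection then plays the role of point 903.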
- Japanese Patent Application Laid-Open No. H10-255019 discloses the following method.
- Japanese Patent Laying-Open No. 2004-12863 discloses a camera device provided with a distance measuring optical system and an imaging optical system independently of each other.
- This camera device includes a lens module in which two lenses constituting the distance measuring optical system and one lens constituting a photometric optical system are integrally arranged on the same plane. This camera device is further provided with an imaging lens constituting the imaging optical system, separate from the lens module. Automatic focusing is achieved by calculating the amount of movement of the imaging lens using the two distance-measuring lenses. In this camera device, imaging of the subject is performed only by the imaging lens. An independent optical system is therefore required for each of distance measurement and imaging, which increases the number of parts and assembly steps, resulting in higher cost and a larger apparatus.
- A camera module according to the present invention includes a lens module having a plurality of lenses arranged on the same plane, a plurality of wavelength selection regions each having at least one optical filter that selectively transmits light of a specific wavelength band out of the light from the subject, and a plurality of imaging regions each having a large number of pixels and outputting image information according to incident light.
- The plurality of lenses, the plurality of wavelength selection regions, and the plurality of imaging regions are arranged in one-to-one correspondence with one another.
- At least two of the plurality of wavelength selection regions are first wavelength selection regions transmitting light in at least one wavelength band out of infrared light, red light, green light, and blue light. At least two imaging regions respectively corresponding to the at least two first wavelength selection regions are first imaging regions having sensitivity to light in the wavelength band transmitted by the first wavelength selection regions.
- the camera module further includes a distance calculation circuit that calculates a distance to a subject based on at least two pieces of image information output from the at least two first imaging regions.
- the camera module outputs an image signal based on image information output by at least one of the plurality of imaging regions.
- a distance to a subject can be measured by a triangulation method, and a subject image can be captured. Therefore, it is possible to simultaneously recognize the shape of the subject and measure the distance to the subject.
- Because the plurality of lenses are integrated as a lens module, a plurality of independent cameras are not required. Therefore, a compact and low-cost camera module can be realized with a simple configuration.
- FIG. 1 is an exploded perspective view of a camera module according to Embodiment 1 of the present invention.
- FIG. 2 is a cross-sectional view of the camera module according to Embodiment 1 of the present invention.
- FIG. 3 is a diagram for explaining the principle of measuring the distance to a subject by the camera lens module according to Embodiment 1 of the present invention.
- FIG. 4 is a conceptual diagram of an optical filter module used for a camera module according to Embodiment 2 of the present invention.
- FIG. 5 is a schematic perspective view of an example of an optical filter module used for a camera module according to Embodiment 2 of the present invention.
- FIG. 6 is a perspective view of an optical filter module used for a camera module according to Embodiment 3 of the present invention.
- FIG. 7 is an enlarged front view of a part of a light incident surface of an image sensor used in the camera module according to the present invention.
- FIG. 8 is a conceptual diagram illustrating a mechanism in which a pixel value of an image sensor is determined in a camera module according to the present invention.
- FIG. 9 is a diagram showing a concept of a sub-pixel of an image sensor in a camera module according to Embodiment 4 of the present invention.
- FIG. 10 is a diagram for explaining a method of obtaining the pixel value of one sub-pixel of an image sensor by interpolation in the camera module according to Embodiment 4 of the present invention.
- FIG. 11 is a top view of a running automobile equipped with a camera module according to Embodiment 5 of the present invention.
- FIG. 12 is a perspective view showing an appearance of a camera module according to Embodiment 6 of the present invention.
- FIG. 13 is a diagram showing definitions of an X axis, a Y axis, and a Z axis in FIG.
- FIG. 14 is a diagram showing another rotation drive mechanism of the camera module according to Embodiment 6 of the present invention.
- FIG. 15 is a diagram illustrating a method of calculating vanishing points in a conventional stereo camera device.
- the at least two first wavelength selection regions may selectively transmit infrared light.
- the distance to the subject can be measured even in a dark environment such as at night.
- distance can be measured with high accuracy even using a single lens.
- the at least two first wavelength selection regions may selectively transmit light in any one wavelength band of red light, green light, and blue light.
- the distance to the subject can be measured in a bright environment such as the daytime.
- distance can be measured with high accuracy even with a single lens.
- The plurality of wavelength selection regions may include at least two infrared light wavelength selection regions that selectively transmit infrared light, and at least two visible light wavelength selection regions that selectively transmit light in any one wavelength band of red light, green light, and blue light.
- the distance calculation circuit calculates a distance to the subject based on at least two pieces of image information output from at least two infrared light imaging areas corresponding to the at least two infrared light wavelength selection areas.
- Alternatively, the distance to the subject is calculated based on at least two pieces of image information output from at least two visible light imaging regions corresponding to the at least two visible light wavelength selection regions.
- It is preferable that the at least two first wavelength selection regions are wavelength selection regions in which a red optical filter, a green optical filter, and a blue optical filter that transmit light in the wavelength bands of red light, green light, and blue light, respectively, are arranged in a Bayer arrangement according to the arrangement of the plurality of pixels. The distance can then be calculated based on color image information. Since color image information carries a larger amount of information than monochrome image information, the distance measurement accuracy is significantly improved.
- At least one of the plurality of wavelength selection regions other than the first wavelength selection regions may be a wavelength selection region in which a red optical filter, a green optical filter, and a blue optical filter that transmit light in the wavelength bands of red light, green light, and blue light, respectively, are arranged in a Bayer arrangement according to the arrangement of the plurality of pixels.
- It is preferable that at least one imaging region corresponding to the at least one wavelength selection region arranged in the Bayer arrangement has sensitivity to visible light. This makes it possible to realize a camera module that can perform distance measurement even in a dark environment such as at night and can also capture a color image.
- When the first wavelength selection regions selectively transmit infrared light, at least one of the plurality of wavelength selection regions other than the first wavelength selection regions may be a second wavelength selection region that transmits light in any one wavelength band of red light, green light, and blue light.
- the second imaging region corresponding to the second wavelength selection region has sensitivity to light in a wavelength band transmitted by the second wavelength selection region.
- It is preferable that the camera module further includes an image synthesis circuit that calculates one synthesized image based on the image information output from the first imaging region and the image information output from the second imaging region, and that the camera module outputs a signal related to the synthesized image as the image signal.
- When the camera module has at least two imaging regions respectively corresponding to at least two wavelength selection regions that transmit light in different wavelength bands, it is preferable that the camera module further includes an image synthesis circuit that calculates one synthesized image based on the at least two pieces of output image information, and outputs a signal related to the synthesized image as the image signal. As a result, the amount of information in the output image signal increases, so the accuracy of shape recognition and the like using the image signal improves.
- It is preferable that the distance calculation circuit interpolates between the plurality of pixels based on the image information output from the first imaging regions and calculates the distance to the subject using the information obtained by the interpolation. Accordingly, the distance measurement accuracy can be improved without increasing the distance between the plurality of lenses, and distance measurement can be performed with high accuracy not only at short distances but also at long distances.
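One common way to interpolate between pixels when refining a displacement estimate is a three-point parabolic fit around the best integer shift of the matching cost. The patent does not specify the interpolation method; this is a hedged sketch of one standard choice:

```python
def subpixel_offset(c_left, c_min, c_right):
    # Fit a parabola through the matching costs at shifts (z-1, z, z+1),
    # where c_min is the cost at the best integer shift z. Returns a
    # fractional correction, in pixel pitches, to add to z.
    denom = c_left - 2.0 * c_min + c_right
    if denom == 0.0:
        return 0.0  # degenerate (flat) cost curve: keep the integer shift
    return 0.5 * (c_left - c_right) / denom
```

With such a correction, the effective displacement z in the distance formula can take non-integer values, which improves accuracy especially at long distances, where the parallax shrinks toward a fraction of a pixel pitch.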
- the camera module of the present invention may be mounted on an automobile and used to obtain information on surrounding conditions. Thereby, the preventive safety of the vehicle is improved.
- It is preferable that the camera module further includes a drive mechanism that changes the direction of the camera module in accordance with the vehicle speed signal and/or the steering angle signal from the vehicle.
- FIG. 1 is an exploded perspective view of the camera module according to the first embodiment.
- the camera module of the present embodiment includes a lens module 1, an optical filter module 2, and a substrate 3 in this order from the subject side.
- a signal processing unit 5 including image sensors 4a to 4d and a digital signal processor (DSP) is mounted on the substrate 3.
- Arrow 10 indicates a ray from the subject.
- FIG. 2 is a cross-sectional view of the camera module of the present embodiment along a direction parallel to the optical axis of the lens module 1.
- reference numeral 6 denotes a fixing base for supporting and fixing the lens module 1 and the optical filter module 2 to the substrate 3.
- the optical filter module 2 includes four wavelength selection regions 2a to 2d.
- The wavelength selection regions 2a and 2d, which are arranged diagonally to each other, each consist of an infrared optical filter that selectively transmits infrared light out of the light 10 from the subject.
- The wavelength selection regions 2b and 2c, which are arranged at the other diagonal positions, each consist of a green optical filter that selectively transmits green light out of the light 10 from the subject.
- The lenses 1a and 1d corresponding to the wavelength selection regions 2a and 2d that transmit infrared light are designed to satisfy the required optical specifications, such as MTF, in the wavelength band of infrared light.
- the imaging devices 4a and 4d corresponding to the wavelength selection regions 2a and 2d have sensitivity in the wavelength band of infrared light.
- The lenses 1b and 1c corresponding to the wavelength selection regions 2b and 2c that transmit green light are designed to satisfy the required optical specifications, such as MTF, in the wavelength band of green light.
- the imaging devices 4b and 4c corresponding to the selection regions 2b and 2c have sensitivity at least in the wavelength band of green light. In the present embodiment, the imaging devices 4b and 4c have sensitivity over the wavelength band of visible light including green light.
- The lens module 1 including the four lenses 1a to 1d is made of a material such as glass or plastic.
- The four lenses 1a to 1d are integrally formed on the same plane.
- Each lens 1a to 1d passes light 10 from the subject through the corresponding wavelength selection region 2a to 2d and then forms an image on the corresponding imaging element 4a to 4d.
- As the imaging elements 4a to 4d, image sensors such as CCDs can be used. After passing through the lenses 1a to 1d, only light in the wavelength band of infrared light or green light is selectively transmitted through each of the wavelength selection regions 2a to 2d and forms an image on each of the imaging elements 4a to 4d. That is, out of the light 10 from the subject, only light in the infrared wavelength band forms an image on the imaging elements 4a and 4d, and only light in the green wavelength band forms an image on the imaging elements 4b and 4c. In this manner, the subject light 10 is separated into light in the infrared wavelength band and light in the green wavelength band, which form images on separate imaging elements.
- Each of the imaging elements 4a to 4d includes a large number of pixels that perform photoelectric conversion. Light of a specific wavelength band is incident on each pixel, and each of the imaging elements 4a to 4d outputs an electric signal corresponding to the light intensity as image information.
- Video processing including various signal processing is performed on the image information.
- The distance to the subject is measured using the two pieces of image information output from the two imaging elements on which light in the same wavelength band is incident.
- a plurality of pieces of image information output from a plurality of image sensors may be combined.
- These signal processes are performed by a signal processing unit 5 including a DSP.
- Since the light beam 11b from the subject 11 that the lens 1b images onto the imaging element 4b and the light beam 11c from the subject 11 that the lens 1c images onto the imaging element 4c differ from each other, the imaging position of the light beam 11b on the imaging element 4b differs from the imaging position of the light beam 11c on the imaging element 4c.
- Consider a virtual equivalent ray 11c′ that has, with respect to the lens 1c, a relation equivalent to that of the ray 11b to the lens 1b.
- The imaging position of the light beam 11c and the imaging position of the equivalent light beam 11c′ are shifted by the shift amount Z based on the parallax.
- An image obtained by one of the imaging devices 4b and 4c corresponding to green light is set as a reference image.
- the light receiving area of the image sensor 4b is divided into a plurality of blocks (for example, one block is composed of a total of 64 pixels of 8 rows ⁇ 8 columns).
- an image having a correlation with an image captured by a certain block of the image sensor 4b is searched for and specified in the image obtained by the other image sensor 4c.
- the difference between the position of the certain block in one image sensor 4b and the position of the image correlated with this block in the other image sensor 4c is determined as the shift amount Z.
- the shift amount Z is obtained using the pixel pitch of the imaging elements 4b and 4c as a unit.
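The correlation search described above can be sketched as a block match that minimizes the sum of absolute differences (SAD) along the baseline direction. The patent does not fix the matching criterion; SAD is one common choice, and the function below is an illustrative sketch with 2D lists standing in for the sensor images:

```python
def find_shift(ref, tgt, bx, by, bs=8, max_shift=16):
    # For the bs x bs block of `ref` (imaging element 4b) whose top-left
    # corner is (bx, by), find the horizontal shift at which the
    # corresponding block of `tgt` (imaging element 4c) matches best,
    # i.e. has minimum sum of absolute differences.
    def sad(shift):
        total = 0
        for r in range(by, by + bs):
            for c in range(bx, bx + bs):
                total += abs(ref[r][c] - tgt[r][c + shift])
        return total
    return min(range(max_shift + 1), key=sad)
```

The returned value is the shift amount Z in units of the pixel pitch, exactly the quantity fed into the triangulation formula.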
- The distance from the lenses 1b and 1c to the subject 11 is D [mm].
- The lenses 1b and 1c are identical lenses.
- The focal length of each lens is f [mm].
- The distance between the optical axis 1b′ of the lens 1b and the optical axis 1c′ of the lens 1c is L [mm].
- The pixel pitch of the imaging elements 4b and 4c is p [mm].
- The relative displacement Z between the mutually correlated images is z [pixels].
- The distance D to the subject can be obtained using the following equation (1): D = (f × L) / (p × z) … (1)
- For example, in a camera module where the distance L between the optical axis 1b′ and the optical axis 1c′ is 3 [mm] and the pixel pitch p is 2.8 × 10⁻³ [mm], if the displacement Z is 3 [pixel pitches], the distance D to the subject is about 1 m.
- That is, the distance to the subject can be calculated as about 1 m by detecting the shift amount Z as three pixel pitches.
- the above calculation can be performed by the distance calculation circuit in the signal processing unit 5 including the DSP.
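Equation (1) is a direct similar-triangles relation and is trivial to evaluate. A sketch of the distance calculation, using the example figures from the text; the focal length value does not survive in this extraction, so the 2.8 mm used below is a hypothetical value chosen only because it reproduces the stated result of about 1 m:

```python
def distance_mm(f_mm, L_mm, p_mm, z_pixels):
    # Equation (1): D = f * L / (p * z), from the similar triangles formed
    # by the two imaging paths (triangulation).
    return (f_mm * L_mm) / (p_mm * z_pixels)

# Example figures from the text: L = 3 mm, p = 2.8e-3 mm, Z = 3 pixel pitches.
# f = 2.8 mm is a hypothetical assumption, not taken from the source.
d = distance_mm(2.8, 3.0, 2.8e-3, 3)  # about 1000 mm, i.e. about 1 m
```

Note how the measurable range scales: halving the pixel pitch p, or doubling the baseline L, doubles the distance at which a given shift Z is produced.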
- In the camera module of the present embodiment, the lens 1b and the lens 1c are integrally formed as the lens module 1. Compared with a conventional distance measuring device assembled using two cameras, the distance L between the optical axis 1b′ and the optical axis 1c′ can therefore be reduced, and the accuracy of the distance L and of the relative inclination of the two optical axes 1b′ and 1c′ can be improved. A small, low-cost camera module that can measure the distance to the subject with high accuracy can thus be realized.
- Since the lenses 1a to 1d are each designed to satisfy the desired optical specifications for light in a specific wavelength band, a single lens can be used for each of the lenses 1a to 1d. As a result, the camera module can be further reduced in size and cost.
- In the above description, the distance to the subject was measured using the pair of optical systems corresponding to green light, consisting of the wavelength selection regions 2b and 2c that transmit green light and the corresponding lenses 1b and 1c and imaging elements 4b and 4c. However, the present invention is not limited to this.
- The distance to the subject may instead be measured using the pair of optical systems corresponding to infrared light, consisting of the wavelength selection regions 2a and 2d that transmit infrared light and the corresponding lenses 1a and 1d and imaging elements 4a and 4d.
- the distance to the subject can be measured even in a dark environment such as at night.
- Since the lenses 1a and 1d are integrally formed as the lens module 1, a small, low-cost camera module that can measure the distance to the subject with high accuracy can be realized.
- the distance to the subject may be measured using both a pair of optical systems corresponding to green light and a pair of optical systems corresponding to infrared light. As a result, a compact and low-cost camera module that can measure the distance to the subject with high accuracy regardless of day or night can be realized.
- the camera module of the present embodiment may output at least one of the image information output by each of the four imaging elements 4a to 4d as an image signal.
- The image information used as the image signal may be the image information used for distance measurement or other image information. For example, if the image information output from the imaging element 4b (or 4c), on which monochromatic light in the visible band is incident, is output as the image signal, a monochrome image captured in a bright environment such as daylight can be obtained.
- If the image information output from the imaging element 4a (or 4d), on which infrared light is incident, is output as the image signal, an image captured by infrared light in a dark environment such as at night can be obtained.
- If the image information output from the imaging element 4b (or 4c) and the image information output from the imaging element 4a (or 4d) are combined and output as one image signal, a sharp image can be obtained both day and night; the effective sensitivity of the imaging device is thus increased.
- the synthesis of a plurality of pieces of image information can be performed by an image synthesis circuit in the signal processing unit 5 including the DSP.
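The image synthesis circuit is not specified beyond "combining" the visible and infrared information. A minimal sketch of one possible combination, a pixel-wise weighted blend (the function name, the weight parameter, and the blend rule are all illustrative assumptions, not the patent's method):

```python
def synthesize(visible, infrared, w_ir=0.5):
    # Pixel-wise weighted blend of visible-band image information
    # (from imaging element 4b or 4c) and infrared-band information
    # (from imaging element 4a or 4d); images are same-sized 2D lists.
    return [[(1.0 - w_ir) * v + w_ir * ir
             for v, ir in zip(vrow, irow)]
            for vrow, irow in zip(visible, infrared)]
```

In a real DSP pipeline the weight would typically vary with scene brightness, so the infrared contribution dominates at night and the visible contribution dominates in daylight.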
- FIG. 4 is a conceptual diagram of the optical filter module 21 used for the camera module according to the second embodiment.
- The wavelength selection regions 21a and 21d, which are arranged diagonally to each other, consist of infrared optical filters that selectively transmit infrared light out of the light from the subject.
- The wavelength selection regions 21b and 21c, which are arranged at the other diagonal positions, are wavelength selection regions in which a red optical filter 21R, a green optical filter 21G, and a blue optical filter 21B that transmit red light, green light, and blue light, respectively, are arranged in a Bayer arrangement according to the arrangement of the pixels of the imaging elements 4b and 4c.
- The lenses 1a and 1d corresponding to the wavelength selection regions 21a and 21d that transmit infrared light are designed to satisfy the required optical specifications, such as MTF, in the wavelength band of infrared light.
- the imaging devices 4a and 4d corresponding to the wavelength selection regions 21a and 21d have sensitivity in a wavelength band of infrared light.
- the lenses 1b and 1c corresponding to the wavelength selection regions 21b and 21c, in which the red optical filter 21R, the green optical filter 21G, and the blue optical filter 21B are arranged in a Bayer array, are designed to satisfy the required optical specifications, such as MTF, in the visible wavelength band.
- the imaging devices 4b and 4c corresponding to the wavelength selection regions 21b and 21c have sensitivity in the wavelength band of visible light.
- in FIG. 4, the wavelength selection regions 21a and 21d and the wavelength selection regions 21b and 21c are drawn as if they form a common optical filter module 21, in order to make the difference from the other embodiments easier to understand. In reality, however, the red optical filter 21R, the green optical filter 21G, and the blue optical filter 21B arranged in the Bayer arrangement must be accurately aligned with the light receiving sections of the individual pixels of the imaging devices 4b and 4c, so these filters must be formed in a Bayer arrangement directly on the pixels of the imaging devices 4b and 4c. Therefore, for example, as shown in FIG.
- the wavelength selection regions 21b and 21c, in which the red optical filter 21R, the green optical filter 21G, and the blue optical filter 21B are arranged in a Bayer array, are formed on the imaging elements 4b and 4c separately from the optical filter module 21, which carries the wavelength selection regions 21a and 21d that transmit infrared light.
- the second embodiment differs from the first in the following respect: in Embodiment 1, only light of one wavelength band among red, green, and blue enters the imaging elements 4b and 4c, whereas in Embodiment 2 all light in the visible band enters the imaging elements 4b and 4c. That is, the imaging elements 4b and 4c output monochromatic image information in Embodiment 1, whereas they output color image information in Embodiment 2.
- the distance to the subject can be measured in the same manner as in Embodiment 1, based on the two pieces of color image information obtained from the imaging elements 4b and 4c. From the red, green, and blue wavelength components contained in the color image information, three components can be extracted: luminance, hue, and saturation. Since geometrically connected regions in a subject often have similar colors, this color information can be used to search the two color images for correlated images, extract them, and determine the shift amount Z. The accuracy of searching for and extracting a specific image is therefore markedly higher than when the distance is measured using monochromatic images, which significantly improves the accuracy of distance measurement.
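The extraction of luminance, hue, and saturation from RGB components can be sketched with Python's standard `colorsys` module. HSV is used here as one common choice; the patent names the three components but not a specific color model, so treating V as a luminance stand-in is an assumption of this sketch.

```python
import colorsys

def rgb_to_lhs(r, g, b):
    """Convert 8-bit RGB to (luminance, hue, saturation).

    colorsys works on 0..1 floats; HSV's V channel is used as a simple
    luminance stand-in here.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return v, h, s

# Two pixels from a geometrically connected region of a subject tend to
# share hue and saturation even when their brightness differs:
lum1, hue1, sat1 = rgb_to_lhs(200, 60, 60)   # bright red
lum2, hue2, sat2 = rgb_to_lhs(120, 36, 36)   # darker red of the same hue
```

Matching on hue and saturation rather than raw intensity is one way a correlation search could tolerate the brightness differences between the two color images.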
- the distance to the subject may also be measured using the pair of optical systems corresponding to infrared light, consisting of the wavelength selection regions 21a and 21d that transmit infrared light, the corresponding lenses 1a and 1d, and the imaging devices 4a and 4d. If infrared light is used, the distance to the subject can be measured even in a dark environment such as at night.
- alternatively, the distance to the subject may be measured using both the pair of optical systems corresponding to visible light, consisting of the wavelength selection regions 21b and 21c, the corresponding lenses 1b and 1c, and the image pickup devices 4b and 4c, and the pair of optical systems corresponding to infrared light. This makes it possible to realize a small, low-cost camera module that can measure the distance to the subject with high accuracy regardless of day or night.
- the camera module of the present embodiment may also output, as an image signal, at least one piece of the image information output by the four imaging elements 4a to 4d. As a result, a compact, low-cost camera module that outputs both distance information to the subject and subject image information can be realized.
- the image information used as the image signal may be the image information used for distance measurement, or it may be other image information. For example, if the image information output from the image sensor 4b (or 4c), on which light in the visible band is incident, is output as the image signal, an image captured in a bright environment such as daytime can be obtained.
- if the image information output from the image sensor 4a (or 4d), on which infrared light is incident, is output as the image signal, an image captured by infrared light in a dark environment such as nighttime can be obtained.
- if the image information output from the image sensor 4b (or 4c) and the image information output from the image sensor 4a (or 4d) are combined and output as one image signal, light in the entire wavelength band from visible light to infrared light is utilized, so clear images can be captured regardless of day or night and the apparent sensitivity of the imaging device can be increased.
- the synthesis of a plurality of pieces of image information can be performed by an image synthesis circuit in the signal processing unit 5 including the DSP.
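The patent leaves the synthesis method to the image synthesis circuit in the signal processing unit 5; the sketch below assumes a simple per-pixel weighted blend of the visible and infrared images, with `ir_weight` as a hypothetical tuning parameter not taken from the patent.

```python
import numpy as np

def fuse_visible_ir(visible, infrared, ir_weight=0.5):
    """Blend a visible-light image with an infrared image pixel by pixel.

    This is only one conceivable realization of the synthesis step; the
    patent states that the two images are combined into one signal but
    does not specify the algorithm.
    """
    visible = visible.astype(np.float64)
    infrared = infrared.astype(np.float64)
    fused = (1.0 - ir_weight) * visible + ir_weight * infrared
    return np.clip(fused, 0, 255).astype(np.uint8)

# Example: detail from the visible sensor plus a subject visible only in IR.
vis = np.full((4, 4), 200, dtype=np.uint8)   # bright visible scene
ir = np.zeros((4, 4), dtype=np.uint8)
ir[1, 1] = 255                               # warm subject seen only in IR
out = fuse_visible_ir(vis, ir, ir_weight=0.5)
```

A real circuit might instead switch between the two sources based on ambient brightness; the weighted blend simply illustrates how one output signal can draw on the whole visible-to-infrared band.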
- if the image signal output by the camera module contains color image information, the following effects can be obtained. From the red, green, and blue wavelength components contained in the color image information, three components can be extracted: luminance, hue, and saturation. Using this color information, characteristic portions of the subject can be extracted easily and with high accuracy. Therefore, the shape recognition accuracy can be significantly improved compared with the case where shape recognition is performed using monochromatic image information.
- the overall configuration of the camera module according to the third embodiment is almost the same as that of the camera module according to the first embodiment shown in FIGS. 1 and 2, but some of its components differ from those of Embodiment 1.
- the third embodiment will be described focusing on the differences from the first embodiment.
- FIG. 6 is a perspective view of the optical filter module 31 used in the camera module according to the third embodiment.
- the wavelength selection regions 31b and 31c, arranged diagonally to each other, are composed of green optical filters that selectively transmit the green component of the light from the subject. The wavelength selection region 31a is a red optical filter that selectively transmits the red component of the light from the subject, and the wavelength selection region 31d is a blue optical filter that selectively transmits the blue component.
- the lenses 1b and 1c corresponding to the wavelength selection regions 31b and 31c, which transmit green light, are designed to satisfy the required optical specifications, such as MTF, in the green wavelength band.
- the lens 1a corresponding to the wavelength selection region 31a, which transmits red light, is designed to satisfy the required optical specifications, such as MTF, in the red wavelength band.
- the lens 1d corresponding to the wavelength selection region 31d, which transmits blue light, is designed to satisfy the required optical specifications, such as MTF, in the blue wavelength band.
- the imaging devices 4a to 4d have sensitivity at least in the wavelength band of the light transmitted by the corresponding wavelength selection regions 31a to 31d. In the present embodiment, each of the imaging elements 4a to 4d has sensitivity over the entire visible wavelength band.
- after the light from the subject passes through the lenses 1a to 1d, light of a specific wavelength band is selected and transmitted by each of the wavelength selection regions 31a to 31d and is focused on the corresponding imaging element 4a to 4d.
- Each of the imaging devices 4a to 4d outputs an electric signal corresponding to the intensity of the incident light as image information.
- in the present embodiment, the distance to the subject is measured using the pair of optical systems corresponding to green light, consisting of the wavelength selection regions 31b and 31c that transmit green light, the corresponding lenses 1b and 1c, and the imaging devices 4b and 4c. The outline is as follows.
- An image obtained by one of the imaging elements 4b and 4c corresponding to green light is set as a reference image.
- the light receiving area of the image sensor 4b is divided into a plurality of blocks (for example, one block is composed of a total of 64 pixels of 8 rows ⁇ 8 columns).
- an image having a correlation with an image captured by a certain block of the image sensor 4b is searched for and specified in the image obtained by the other image sensor 4c.
- the difference between the position of the above-mentioned block in one image sensor 4b and the position of the correlated image in the other image sensor 4c is obtained as the shift amount Z. Based on this shift amount Z, the distance D to the subject is calculated in the same manner as described in the first embodiment.
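The block search described above can be sketched as follows: a block of the reference image (sensor 4b) is slid along the other image (sensor 4c) and the offset with the smallest sum of absolute differences (SAD) is taken as the shift amount Z. The block size, search range, and SAD measure are illustrative choices, not taken from the patent.

```python
import numpy as np

def find_shift(reference, other, row, col, block=8, max_shift=16):
    """Return the horizontal shift Z (in pixels) at which the block of
    `reference` starting at (row, col) best matches `other`, using the
    sum of absolute differences as the correlation measure."""
    ref_block = reference[row:row + block, col:col + block].astype(np.int64)
    best_shift, best_sad = 0, None
    for z in range(0, max_shift + 1):
        if col + z + block > other.shape[1]:
            break
        cand = other[row:row + block, col + z:col + z + block].astype(np.int64)
        sad = np.abs(ref_block - cand).sum()
        if best_sad is None or sad < best_sad:
            best_sad, best_shift = sad, z
    return best_shift

# Synthetic pair: the second image is the first shifted by 3 pixels, as
# would happen for a subject seen through two laterally displaced lenses.
rng = np.random.default_rng(0)
img_b = rng.integers(0, 256, size=(32, 64), dtype=np.uint8)
img_c = np.roll(img_b, 3, axis=1)
z = find_shift(img_b, img_c, row=8, col=8)
```

Given Z, the distance would follow from the relation of Embodiment 1 (not reproduced in this excerpt); in textbook stereo terms it is approximately D = f·B / (Z·p) for focal length f, lens baseline B, and pixel pitch p.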
- in the present embodiment, the three pieces of monochromatic image information output by the image sensor 4a corresponding to red light, the image sensor 4b (or 4c) corresponding to green light, and the image sensor 4d corresponding to blue light are synthesized as follows. Based on the shift amount Z obtained from the image information output by the image sensors 4b and 4c corresponding to green light, the shift of each of the red and blue images obtained by the image sensors 4a and 4d, relative to the green image (reference image) obtained by the image sensor 4b, is calculated. The shifts of the red and blue images are then corrected, and the red and blue images are combined with the green reference image. A color image without color shift can thereby be obtained.
- the synthesis of a plurality of pieces of image information as described above can be performed by the image synthesis circuit in the signal processing unit 5 including the DSP.
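The color synthesis step above can be sketched as follows: the red and blue images are aligned to the green reference by undoing their computed shifts, then stacked into an RGB image. A plain horizontal `np.roll` stands in for the real per-block correction, which the patent leaves to the image synthesis circuit.

```python
import numpy as np

def compose_color(green_ref, red_img, blue_img, shift_r, shift_b):
    """Align the red and blue images to the green reference image by
    undoing their horizontal parallax shifts, then stack into RGB.

    shift_r and shift_b are the shift amounts of the red and blue images
    relative to the green reference, e.g. from block matching."""
    r = np.roll(red_img, -shift_r, axis=1)
    b = np.roll(blue_img, -shift_b, axis=1)
    return np.stack([r, green_ref, b], axis=-1)

g = np.zeros((4, 8), dtype=np.uint8)
g[:, 4] = 255                       # bright feature at column 4
r = np.roll(g, 2, axis=1)           # red image displaced by 2 pixels
b = np.roll(g, 1, axis=1)           # blue image displaced by 1 pixel
rgb = compose_color(g, r, b, shift_r=2, shift_b=1)
```

After correction, the feature lines up in all three channels, i.e. no color shift remains at the feature column.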
- the camera module of the present embodiment may output, as an image signal, at least one piece of the image information output by the four imaging elements 4a to 4d and/or the color image information obtained as described above.
- if the output image signal is color image information, the following effects can be obtained. From the red, green, and blue wavelength components contained in the color image information, three components can be extracted: luminance, hue, and saturation. When the shape of a subject is recognized using this image information, characteristic portions of the subject can be extracted easily and accurately, because geometrically connected regions in a subject often have similar colors. Therefore, the shape recognition accuracy can be significantly improved compared with the case where shape recognition is performed using monochromatic image information.
- Each pixel of an imaging sensor such as a CCD photoelectrically converts incident light and outputs an electric signal (hereinafter, referred to as "pixel value") according to the intensity of the incident light. Since the light receiving surface of each pixel has a predetermined area, the pixel value is an integrated value of the light intensity at each point in the light receiving surface of the pixel.
- FIG. 7 is an enlarged front view of a part of the light incident surface of the image sensor.
- Each area separated by solid lines 41H and 41V indicates one pixel 40.
- FIG. 7 shows a total of 4 pixels in 2 rows ⁇ 2 columns. The shaded area indicates the light receiving surface of one pixel 40.
- Dotted lines 42H and 42V are the center lines of the pixel rows and pixel columns, each passing through the center of the light receiving surface of a pixel 40. The intervals between adjacent center lines 42H and between adjacent center lines 42V are called the pixel pitches.
- As shown in FIG. 8, consider a case where an image of a subject 70 is formed on the light receiving surface of a certain pixel 40.
- one side of the center line 75 of the subject 70 is a uniform black area 70a, and the other side is a uniform white area 70b.
- the black area 70a and the white area 70b have the same area.
- Points 71 and 72 in the black area 70a form images 41 and 42 in the light receiving surface of the pixel 40, and points 73 and 74 in the white area 70b form images 43 and 44 in the light receiving surface.
- the dashed line 45 corresponds to the center line 75 of the subject 70. In the region 40a of the light receiving surface of the pixel 40 corresponding to the black region 70a, the light intensity is weak, while in the region 40b corresponding to the white region 70b, the light intensity is strong.
- the pixel value of the pixel 40 is equivalent to a case where light having an intermediate intensity between the light intensity in the region 40a and the light intensity in the region 40b is incident on the entire light receiving surface of the pixel 40.
- the shift amount Z is calculated from the positions of the mutually correlated images existing in the two pieces of image information.
- the image information used at this time consists of the pixel values output from the pixels of the image sensor. As shown in FIG. 8, even when the intensity of the light incident on the light receiving surface of a pixel 40 is not uniform, the light intensity distribution within the light receiving surface is averaged when the pixel value is output. The detection accuracy of the shift amount Z is therefore determined by the pixel pitches.
- it can thus be seen that if the pixel pitches are reduced, the detection accuracy of the shift amount Z can be improved, and as a result the accuracy of measuring the distance to the subject can be improved.
- in addition, as the detection accuracy of the shift amount Z improves, it becomes possible to accurately measure the distance to a farther subject.
- in the present embodiment, the light intensity distribution within the light receiving surface of each pixel is detected by interpolating between pixels based on the image information composed of the pixel values output from the pixels. That is, the pixel pitches are reduced in a pseudo manner. This is explained below.
- each pixel 40 is divided into a plurality of smaller areas (hereinafter referred to as “sub-pixels”) 46.
- one pixel 40 is composed of a set of a total of 25 sub-pixels in 5 rows ⁇ 5 columns.
- solid lines 41H and 41V indicate boundaries of the pixel 40
- two-dot chain lines 46H and 46V indicate boundaries of the sub-pixel 46.
- the distance to the subject is measured as follows. As described in the first embodiment, an image obtained by one of the imaging elements 4b and 4c corresponding to green light is used as a reference image. Then, an image having a correlation with an image captured by a certain block of the image sensor 4b is searched for and specified in the image obtained by the other image sensor 4c. At this time, in the first embodiment, a plurality of pixels constitute one block. In the present embodiment, a plurality of subpixels constitute one block. That is, in the present embodiment, the correlation between two images is investigated in subpixel units instead of pixel units. Thereby, the accuracy of measuring the distance to the subject can be improved. In addition, it is possible to accurately measure a distance from a near subject to a farther subject.
- in Embodiment 1, the shift amount Z is detected with the pixel pitch as the unit. For example, if the shift amount Z is 3 pixel pitches, the distance D to the subject is calculated to be about 1 m.
- in the present embodiment, the shift amount Z can be detected in units of 0.1 pixel pitch (that is, in units of one sub-pixel pitch). If the shift amount Z is 0.3 pixel pitch (that is, 3 sub-pixel pitches), the distance D to the subject can be calculated to be about 10 m, so distance measurement can be performed with high accuracy even for a distant subject.
- since the shift amount Z can be detected in units of 0.1 pixel pitch (that is, in units of one sub-pixel pitch) even for a subject at a distance of about 1 m, the measurement accuracy is improved by a factor of 10.
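The numbers above imply that distance is inversely proportional to the shift amount: Z = 3 pixel pitches corresponds to about 1 m, so Z = 0.3 pixel pitch corresponds to about 10 m. The sketch below uses D = k / Z with the constant k calibrated from the 1 m example; a real module would derive k from its focal length, lens baseline, and pixel pitch (given in the patent's Embodiment 1, outside this excerpt).

```python
def distance_from_shift(z_pixels, k=3.0):
    """Distance D (in metres) from shift amount Z (in pixel pitches),
    using the inverse-proportional relation D = k / Z implied by the text.
    k = 3.0 is calibrated from the stated example "Z = 3 pixel pitches
    corresponds to D of about 1 m"; it is not a value from the patent."""
    return k / z_pixels

coarse = distance_from_shift(3.0)    # whole-pixel matching: about 1 m
fine = distance_from_shift(0.3)      # sub-pixel matching: about 10 m
```

The example makes the trade-off concrete: a tenfold finer unit for Z extends the measurable range tenfold at the same shift resolution.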
- the value of each sub-pixel (hereinafter referred to as the "sub-pixel value") can be determined, for example, by interpolating the pixel values output from the pixels of the image sensor by a linear interpolation method. This will be described with reference to FIG. 10, which shows a case where each pixel 40 is divided into a total of four sub-pixels 46 in 2 rows × 2 columns.
- assume that the pixel value output from each pixel 40 is located at the center of that pixel, and that the sub-pixel value of each sub-pixel 46 is likewise located at the center of that sub-pixel.
- let the pixel values of four pixels arranged in 2 rows × 2 columns be P(i, j), P(i, j+1), P(i+1, j), and P(i+1, j+1), and let P be the sub-pixel value of the sub-pixel to be obtained. The sub-pixel value P of this sub-pixel can then be obtained by linearly interpolating these four pixel values.
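The interpolation equation itself does not survive in this excerpt, so the sketch below assumes standard bilinear weighting between pixel centers, which matches the linear interpolation the text describes; the patent's exact coefficients may differ.

```python
import numpy as np

def subpixel_values(pixels, n=2):
    """Interpolate an image of pixel values to an n-times-finer grid of
    sub-pixel values by bilinear interpolation between pixel centers.

    Pixel values are treated as samples at pixel centers (as assumed in
    the text); edge sub-pixels are clamped to the nearest pixel center.
    """
    rows, cols = pixels.shape
    # Coordinates of sub-pixel centers, in units of pixel index.
    ys = np.clip((np.arange(rows * n) + 0.5) / n - 0.5, 0, rows - 1)
    xs = np.clip((np.arange(cols * n) + 0.5) / n - 0.5, 0, cols - 1)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, rows - 1)
    x1 = np.minimum(x0 + 1, cols - 1)
    wy = (ys - y0)[:, None]           # vertical interpolation weights
    wx = (xs - x0)[None, :]           # horizontal interpolation weights
    p00 = pixels[np.ix_(y0, x0)]
    p01 = pixels[np.ix_(y0, x1)]
    p10 = pixels[np.ix_(y1, x0)]
    p11 = pixels[np.ix_(y1, x1)]
    return ((1 - wy) * (1 - wx) * p00 + (1 - wy) * wx * p01
            + wy * (1 - wx) * p10 + wy * wx * p11)

pix = np.array([[0.0, 100.0],
                [100.0, 200.0]])
sub = subpixel_values(pix, n=2)       # 4x4 grid of sub-pixel values
```

Block matching on the interpolated grid then proceeds exactly as before, but with the shift amount Z expressed in sub-pixel pitches.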
- alternatively, the presence or absence of an edge (contour) of the subject imaged on the light receiving surface of a certain sub-pixel may be determined using the sub-pixel values of the surrounding sub-pixels, and if it is determined that an edge exists, the sub-pixel value may be obtained by interpolating along the edge within the sub-pixel. This edge detection method may also be combined with the linear interpolation of the sub-pixel values of the sub-pixels 46 described above.
- Fig. 11 is a view of the automobile 100 traveling on the road as viewed from above.
- the automobile 100 is equipped with the camera modules 101 to 104 of the present invention.
- 101a to 104a indicate the angles of view of the camera modules 101 to 104.
- 105 is the preceding vehicle
- 106 is the oncoming vehicle
- 107 is the center line (white line) of the road
- 108 is the pedestrian.
- it is desirable that the camera modules 101 to 104 be thin and small.
- in the camera module of the present invention, since the wavelength band of the light imaged on each imaging element is limited, an image with little aberration can be obtained even if a single lens is used as each lens. Therefore, a very thin and small camera module can be realized.
- by using the camera module of the present invention, the preventive safety of the vehicle 100 can be improved.
- even in an environment with bright light from oncoming vehicles, such as headlights, it is possible to recognize a subject such as a pedestrian and obtain the distance to the subject.
- if the camera module captures color image information, the three components of luminance, hue, and saturation can be extracted from the red, green, and blue wavelength components included in the color image information. Since this color information can be used to extract characteristic portions of a subject, the shape recognition accuracy is significantly improved compared with the case of using monochromatic image information.
- although FIG. 11 shows the case where the surrounding situation information acquired by the camera modules is information outside the cabin of the car 100, the information may also be information inside the cabin.
- the image signals obtained from each of the camera modules 101 to 104 may be displayed using a display device (not shown), combined with the distance information to the subject obtained from the camera modules 101 to 104. In this way, an obstacle can be displayed on the display device moment by moment, together with the distance to it.
- normally, the imaging range of a camera module depends on the orientation of the automobile. For example, if the camera module is fixed so as to capture situation information ahead of the vehicle in the traveling direction, it can capture situation information ahead in a new direction only after the traveling direction of the vehicle has actually changed.
- in order to obtain situation information more quickly, the camera module 103 of the present embodiment is mounted on a turntable 200 that can rotate about the Z axis, as shown in FIG. 12.
- the X, Y, and Z axes in FIG. 12 are set as follows: the positive direction of the X axis is forward in the traveling direction of the vehicle, the Y axis direction is toward both sides of the vehicle, and the Z axis direction is the vertical direction.
- the rotation angle and rotation speed (angular velocity) of the turntable 200 about the Z axis are determined according to the change in the traveling direction of the vehicle, estimated based on the vehicle speed signal and/or the steering angle signal of the vehicle on which it is mounted. As the turntable 200 rotates, the direction of the camera module 103 changes within the range of arrows 201a to 201b, in a plane parallel to the XY plane, relative to the direction 201 parallel to the X axis.
- the turntable 200 is used as a drive mechanism for changing the direction of the camera module 103, but the present invention is not limited to this.
- alternatively, the camera module 103 may be supported by a torsion bar 300 extending parallel to the Z axis, and rotated about the Z axis by applying a force F1 (or F2), substantially parallel to the X axis, to a side end of the camera module 103.
- a drive mechanism for changing the direction of the camera module 103 can be appropriately selected in consideration of a space in which the camera module 103 is mounted.
- each of the camera modules according to Embodiments 1 to 6 includes four lenses, but the present invention is not limited to this. If at least two lenses are provided, the highly accurate distance measurement described above can be performed.
- the arrangement of the lenses used to measure the distance to the subject is not limited to the examples shown in Embodiments 1 to 6, and may be changed as appropriate in consideration of parallax, measurement accuracy, and the like.
- the optical path from the lens to the image sensor is linear, but the present invention is not limited to this.
- the same effect as described above can be obtained with a reflection type in which the optical path is bent.
- a plurality of imaging elements are used as the plurality of imaging regions.
- the present invention is not limited to this.
- a plurality of imaging regions may instead be provided on a single imaging element. This eliminates the need to align a plurality of imaging devices, facilitates mounting, and reduces cost.
- in Embodiments 1 to 6, camera modules have been described in which an optical system, including the lens module, the optical filter module, and the imaging areas, is integrated with a signal processing unit including a distance calculation circuit and/or an image synthesis circuit. However, the camera module of the present invention is not limited to this.
- for example, the optical system and the signal processing unit may be connected only by a flexible cable or the like.
- alternatively, the signal processing unit mounted on the substrate shared with the imaging areas may omit the distance calculation circuit and/or the image synthesis circuit, and the distance calculation circuit and/or the image synthesis circuit may be connected to this signal processing unit only by a flexible cable or the like. With such a configuration, the optical system and the signal processing unit (or the distance calculation circuit and/or the image synthesis circuit) can be freely arranged at different positions.
- the field of application of the camera module of the present invention is not particularly limited, but is useful for, for example, small-sized and thin cameras mounted on mobile phones, digital still cameras, surveillance cameras, vehicle-mounted cameras, and the like.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Optics & Photonics (AREA)
- Measurement Of Optical Distance (AREA)
- Automatic Focus Adjustment (AREA)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/569,723 US7385680B2 (en) | 2004-06-03 | 2005-04-11 | Camera module |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004165269A JP2007263563A (ja) | 2004-06-03 | 2004-06-03 | カメラモジュール |
JP2004-165269 | 2004-06-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005119175A1 true WO2005119175A1 (ja) | 2005-12-15 |
Family
ID=35463003
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/007039 WO2005119175A1 (ja) | 2004-06-03 | 2005-04-11 | カメラモジュール |
Country Status (3)
Country | Link |
---|---|
US (1) | US7385680B2 (ja) |
JP (1) | JP2007263563A (ja) |
WO (1) | WO2005119175A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007129563A1 (ja) * | 2006-05-09 | 2007-11-15 | Panasonic Corporation | 測距用画像選択機能を有する測距装置 |
WO2009001563A1 (ja) * | 2007-06-28 | 2008-12-31 | Panasonic Corporation | 撮像装置及び半導体回路素子 |
JP2009040270A (ja) * | 2007-08-09 | 2009-02-26 | Hitachi Ltd | 車載用カメラ |
US8390703B2 (en) | 2008-07-23 | 2013-03-05 | Panasonic Corporation | Image pickup apparatus and semiconductor circuit element |
CN106488092A (zh) * | 2015-09-01 | 2017-03-08 | 德尔福技术有限公司 | 集成的摄像机、环境光检测及雨传感器组件 |
Families Citing this family (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7560679B1 (en) * | 2005-05-10 | 2009-07-14 | Siimpel, Inc. | 3D camera |
JP2007139756A (ja) * | 2005-10-17 | 2007-06-07 | Ricoh Co Ltd | 相対位置検出装置、回転体走行検出装置及び画像形成装置 |
US8199370B2 (en) * | 2007-08-29 | 2012-06-12 | Scientific Games International, Inc. | Enhanced scanner design |
US8395795B2 (en) * | 2007-09-09 | 2013-03-12 | Xpedite Systems, Llc | Systems and methods for communicating documents |
US8325245B2 (en) * | 2007-09-25 | 2012-12-04 | Rockwell Automation Technologies, Inc. | Apparatus and methods for use of infra-red light in camera applications |
US10003701B2 (en) * | 2008-01-30 | 2018-06-19 | Xpedite Systems, Llc | Systems and methods for generating and communicating enhanced portable document format files |
JP4510928B2 (ja) * | 2008-02-05 | 2010-07-28 | パナソニック株式会社 | 測距装置 |
CN102124392A (zh) | 2008-08-19 | 2011-07-13 | 罗姆股份有限公司 | 照相机 |
US20100061593A1 (en) * | 2008-09-05 | 2010-03-11 | Macdonald Willard S | Extrapolation system for solar access determination |
US9843742B2 (en) | 2009-03-02 | 2017-12-12 | Flir Systems, Inc. | Thermal image frame capture using de-aligned sensor array |
US9208542B2 (en) | 2009-03-02 | 2015-12-08 | Flir Systems, Inc. | Pixel-wise noise reduction in thermal images |
WO2012170949A2 (en) | 2011-06-10 | 2012-12-13 | Flir Systems, Inc. | Non-uniformity correction techniques for infrared imaging devices |
US9235876B2 (en) | 2009-03-02 | 2016-01-12 | Flir Systems, Inc. | Row and column noise reduction in thermal images |
US10091439B2 (en) | 2009-06-03 | 2018-10-02 | Flir Systems, Inc. | Imager with array of multiple infrared imaging modules |
JP2013537661A (ja) | 2010-06-30 | 2013-10-03 | タタ コンサルタンシー サービシズ リミテッド | ステレオビジョン技術を使用することによる移動物体の自動検出 |
US9007604B2 (en) | 2010-06-30 | 2015-04-14 | Xpedite Systems, Llc | System, method, and apparatus for an interactive virtual fax machine |
TWM397531U (en) * | 2010-07-29 | 2011-02-01 | Agait Technology Corp | Detect apparatus and the of detect apparatus from removing apparatus the same |
KR101208215B1 (ko) * | 2011-01-14 | 2012-12-04 | 삼성전기주식회사 | 카메라 모듈 및 이의 제조방법 |
KR101808375B1 (ko) | 2011-06-10 | 2017-12-12 | 플리어 시스템즈, 인크. | 저전력 소형 폼 팩터 적외선 이미징 |
US9143703B2 (en) | 2011-06-10 | 2015-09-22 | Flir Systems, Inc. | Infrared camera calibration techniques |
US20130120621A1 (en) * | 2011-11-10 | 2013-05-16 | Research In Motion Limited | Apparatus and associated method for forming color camera image |
JP5681954B2 (ja) * | 2011-11-30 | 2015-03-11 | パナソニックIpマネジメント株式会社 | 撮像装置及び撮像システム |
DE102012014994B4 (de) * | 2012-07-28 | 2024-02-22 | Volkswagen Aktiengesellschaft | Bildverarbeitungsverfahren für eine digitale Stereokameraanordnung |
US9511708B2 (en) | 2012-08-16 | 2016-12-06 | Gentex Corporation | Method and system for imaging an external scene by employing a custom image sensor |
WO2014100783A1 (en) * | 2012-12-21 | 2014-06-26 | Flir Systems, Inc. | Time spaced infrared image enhancement |
JP6541119B2 (ja) * | 2016-12-05 | 2019-07-10 | パナソニックIpマネジメント株式会社 | 撮像装置 |
US11113791B2 (en) | 2017-01-03 | 2021-09-07 | Flir Systems, Inc. | Image noise reduction using spectral transforms |
EP3634807B1 (en) * | 2017-06-26 | 2021-08-04 | Gentex Corporation | Dynamic calibration of optical properties of a dimming element |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08159754A (ja) * | 1994-12-09 | 1996-06-21 | Fuji Electric Co Ltd | 車間距離検出装置 |
JP2000171246A (ja) * | 1998-09-28 | 2000-06-23 | Asahi Optical Co Ltd | 測距装置 |
JP2001016621A (ja) * | 1999-07-02 | 2001-01-19 | Minolta Co Ltd | 多眼式データ入力装置 |
JP2001174223A (ja) * | 1999-12-20 | 2001-06-29 | Mitsubishi Electric Corp | 熱物体の位置と形状の検出装置 |
JP2002366953A (ja) * | 2001-06-11 | 2002-12-20 | Central Res Inst Of Electric Power Ind | 画像抽出方法および装置ならびに画像抽出プログラム、画像抽出方法を利用した配電柱の柱上機材の異常検出方法および装置ならびに異常検出プログラム |
JP2003150936A (ja) * | 2001-11-08 | 2003-05-23 | Fuji Heavy Ind Ltd | 画像処理装置および画像処理方法 |
JP2004132836A (ja) * | 2002-10-10 | 2004-04-30 | Seiko Precision Inc | 測光機能付き位相差検出装置、測光機能付き測距装置および測光機能付き撮像装置 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05265547A (ja) | 1992-03-23 | 1993-10-15 | Fuji Heavy Ind Ltd | 車輌用車外監視装置 |
JPH10255019A (ja) | 1997-03-07 | 1998-09-25 | Toyota Motor Corp | 車両認識装置 |
US20030048493A1 (en) * | 2001-09-10 | 2003-03-13 | Pontifex Brian Decoursey | Two sensor quantitative low-light color camera |
JP4803927B2 (ja) | 2001-09-13 | 2011-10-26 | 富士重工業株式会社 | 監視システムの距離補正装置および距離補正方法 |
TW593978B (en) * | 2002-02-25 | 2004-06-21 | Mitsubishi Electric Corp | Video picture processing method |
JP2004012863A (ja) | 2002-06-07 | 2004-01-15 | Canon Inc | 測距・測光装置およびカメラ |
- 2004
  - 2004-06-03 JP JP2004165269A patent/JP2007263563A/ja active Pending
- 2005
  - 2005-04-11 US US11/569,723 patent/US7385680B2/en active Active
  - 2005-04-11 WO PCT/JP2005/007039 patent/WO2005119175A1/ja active Application Filing
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08159754A (ja) * | 1994-12-09 | 1996-06-21 | Fuji Electric Co Ltd | 車間距離検出装置 |
JP2000171246A (ja) * | 1998-09-28 | 2000-06-23 | Asahi Optical Co Ltd | 測距装置 |
JP2001016621A (ja) * | 1999-07-02 | 2001-01-19 | Minolta Co Ltd | 多眼式データ入力装置 |
JP2001174223A (ja) * | 1999-12-20 | 2001-06-29 | Mitsubishi Electric Corp | 熱物体の位置と形状の検出装置 |
JP2002366953A (ja) * | 2001-06-11 | 2002-12-20 | Central Res Inst Of Electric Power Ind | 画像抽出方法および装置ならびに画像抽出プログラム、画像抽出方法を利用した配電柱の柱上機材の異常検出方法および装置ならびに異常検出プログラム |
JP2003150936A (ja) * | 2001-11-08 | 2003-05-23 | Fuji Heavy Ind Ltd | 画像処理装置および画像処理方法 |
JP2004132836A (ja) * | 2002-10-10 | 2004-04-30 | Seiko Precision Inc | 測光機能付き位相差検出装置、測光機能付き測距装置および測光機能付き撮像装置 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007129563A1 (ja) * | 2006-05-09 | 2007-11-15 | Panasonic Corporation | Distance measuring device with image selection function for distance measurement |
WO2009001563A1 (ja) * | 2007-06-28 | 2008-12-31 | Panasonic Corporation | Imaging device and semiconductor circuit element |
US8395693B2 (en) | 2007-06-28 | 2013-03-12 | Panasonic Corporation | Image pickup apparatus and semiconductor circuit element |
JP2009040270A (ja) * | 2007-08-09 | 2009-02-26 | Hitachi Ltd | In-vehicle camera |
JP4667430B2 (ja) * | 2007-08-09 | 2011-04-13 | Hitachi Automotive Systems, Ltd. | In-vehicle camera |
US8390703B2 (en) | 2008-07-23 | 2013-03-05 | Panasonic Corporation | Image pickup apparatus and semiconductor circuit element |
CN106488092A (zh) * | 2015-09-01 | 2017-03-08 | Delphi Technologies, Inc. | Integrated camera, ambient-light detection, and rain sensor assembly |
CN106488092B (zh) * | 2015-09-01 | 2021-01-29 | Aptiv Technologies Limited | Integrated camera, ambient-light detection, and rain sensor assembly |
Also Published As
Publication number | Publication date |
---|---|
US7385680B2 (en) | 2008-06-10 |
JP2007263563A (ja) | 2007-10-11 |
US20070247611A1 (en) | 2007-10-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2005119175A1 (ja) | Camera module | |
US20190351842A1 (en) | In-vehicle camera and vehicle control system | |
CN104512411B (zh) | Vehicle control system and image sensor | |
JP6156724B2 (ja) | Stereo camera | |
US7227515B2 (en) | System and method for forming images for display in a vehicle | |
US20190072673A1 (en) | Distance measurement system and solid-state imaging sensor used therefor | |
US7518099B2 (en) | Multifunctional optical sensor comprising a photodetectors matrix coupled to a microlenses matrix | |
KR100804719B1 (ko) | Imaging module | |
JP5012495B2 (ja) | Image sensor, focus detection device, focus adjustment device, and imaging device | |
CN102211549B (zh) | Camera for a vehicle | |
CN105723239A (zh) | Ranging imaging system | |
CN103370224A (zh) | Vehicle having a device for monitoring the vehicle surroundings | |
WO2012137696A1 (ja) | Vehicle image processing device | |
CN102566212A (zh) | Camera device and method for operating a camera device of a motor vehicle | |
JP2004258266A (ja) | Stereo adapter and range image input device using the same | |
JP2008157851A (ja) | Camera module | |
CN104136910A (zh) | Imaging unit and mounting method thereof | |
JP2007295113A (ja) | Imaging device | |
JP2009055553A (ja) | Imaging device equipped with a plurality of image sensors | |
JP2002237969A (ja) | In-vehicle camera and image processing system | |
US20220368873A1 (en) | Image sensor, imaging apparatus, and image processing method | |
JP6202364B2 (ja) | Stereo camera and moving body | |
Kidono et al. | Visibility estimation under night-time conditions using a multiband camera | |
JP7207889B2 (ja) | Distance measuring device and in-vehicle camera system | |
JP4604816B2 (ja) | In-vehicle imaging module |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AK | Designated states | Kind code of ref document: A1. Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
| AL | Designated countries for regional patents | Kind code of ref document: A1. Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| DPEN | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101) | |
| WWE | Wipo information: entry into national phase | Ref document number: 11569723. Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWW | Wipo information: withdrawn in national office | Country of ref document: DE |
| 122 | Ep: pct application non-entry in european phase | |
| NENP | Non-entry into the national phase | Ref country code: JP |
| WWP | Wipo information: published in national office | Ref document number: 11569723. Country of ref document: US |