JP5899957B2 - Image processing system and vehicle equipped with the same - Google Patents

Image processing system and vehicle equipped with the same

Info

Publication number
JP5899957B2
JP5899957B2 (application JP2012010010A)
Authority
JP
Japan
Prior art keywords
image
light
polarization component
processing system
imaging
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2012010010A
Other languages
Japanese (ja)
Other versions
JP2013148504A (en)
Inventor
平井 秀明 (Hideaki Hirai)
Original Assignee
株式会社リコー (Ricoh Company, Ltd.)
Priority date
Filing date
Publication date
Application filed by 株式会社リコー (Ricoh Company, Ltd.)
Priority to JP2012010010A
Publication of JP2013148504A
Application granted
Publication of JP5899957B2
Application status: Active


Description

  The present invention relates to an image processing system that detects the road surface state (e.g., dry or frozen) for the purpose of alerting the driver, and to a vehicle equipped with such an image processing system.

  Conventionally, a method has been proposed for detecting whether the road surface is in a dry or frozen state based on a polarized image of the road surface (see, for example, Patent Document 1).

  The road surface state detection device disclosed in Patent Document 1 includes an imaging unit that images the road surface and a polarization unit, disposed in front of the imaging unit, whose polarization plane can be switched between the horizontal and vertical directions. A power spectrum image is generated by Fourier-transforming the horizontally polarized image and the vertically polarized image, a feature value is calculated from the distribution of frequency components in the power spectrum image, and whether the road surface is frozen, wet, or dry is determined and output according to whether the calculated feature value exceeds a set value.

  However, the device disclosed in Patent Document 1 must sequentially switch between a vertical polarization filter and a horizontal polarization filter using a filter conversion unit when capturing the horizontally and vertically polarized images, and it is therefore not suited to real-time imaging.

  Although Patent Document 1 does not describe the filter conversion unit in detail, it is presumably a mechanical mechanism such as a motor or a switching mechanism such as a liquid crystal device. When such mechanisms are mounted on a vehicle such as an automobile, heat resistance and vibration resistance become problems.

  In view of this, an imaging apparatus that addresses these problems has been proposed (see, for example, Patent Document 2). According to Patent Document 2, a horizontally polarized image and a vertically polarized image can be captured with a single image sensor.

  In recent years, there has also been demand for an imaging apparatus that captures images of road signs and the like in order to alert the driver. Road signs present various kinds of information, but they may be small in size. Detecting the information on a sign therefore requires sufficient resolution in the imaging apparatus.

  However, in a conventional imaging device such as that of Patent Document 2, polarizers for horizontal polarization transmission and for vertical polarization transmission are formed over the imaging region, so the number of pixels available for recognizing signs falls below half the total number of pixels in the imaging region.

  The present invention was made to solve these conventional problems. Its object is to provide an image processing system that detects the road surface state using a polarization image while securing the resolution of the luminance image for highly accurate sign detection, and a vehicle equipped with such a system.

In order to solve the above problems, an image processing system according to the present invention includes imaging means, having an imaging device with a pixel array composed of a plurality of pixels and an optical filter disposed in front of the imaging device, for imaging peripheral information including the road surface; and image analysis means for analyzing the imaging result of the imaging means. The optical filter includes a substrate that transmits incident light and a first polarizing filter layer, disposed in a predetermined area of the effective imaging area, in which polarizers are formed so that each of a plurality of polarization components of the incident light, including at least a horizontal polarization component and a vertical polarization component, is incident on a corresponding pixel of the pixel array. The image analysis means generates a horizontal polarization component image and a vertical polarization component image from the pixel values of the pixels in the predetermined area, generates a polarization image by multiplying a value based on the difference between the pixel values of the horizontal polarization component image and the vertical polarization component image by a gain value, determines the road surface state based on the polarization image, and performs sign recognition based on the imaging result of the light transmitted through the effective imaging area other than the predetermined area. The gain value in the peripheral portion of the effective imaging area is larger than the gain value in the central portion of the effective imaging area.

  With this configuration, since the polarizers are arranged only in a predetermined area of the effective imaging area, the road surface state can be detected using the polarization image while the resolution of the luminance image is preserved for highly accurate sign detection.

  The present invention thus provides an image processing system capable of detecting the road surface state using a polarization image while securing the resolution of the luminance image for highly accurate sign detection, and a vehicle including the image processing system.

Schematic diagram showing the schematic configuration of an in-vehicle device control system including the image processing system according to the present invention
Schematic diagram showing the schematic configuration of the imaging unit included in the image processing system according to the present invention
Schematic cross-sectional view, along the light transmission direction, of the optical filter, imaging device, and sensor substrate
Schematic front views of the optical filter viewed from the sensor substrate side and of the image sensor viewed from the imaging lens side
Schematic diagram illustrating the correspondence between the polarizing filter layer of the optical filter and the pixels of the image sensor
Explanatory drawing of the change in reflected light when the road surface is wet and when it is dry
Graph showing the incident-angle dependence of the horizontal and vertical polarization components of the reflected light for incident light of intensity I
Luminance image and polarization degree image of a concrete surface on which an ice sheet is placed
Luminance image and polarization degree image of an obstacle in shadow
Schematic diagram showing sign detection performed by the image processing system mounted in the vehicle
Flow chart showing the procedure of exposure control
Graph showing the spectral characteristics of the first spectral filter layer
Enlarged view of the wire grid structure constituting the polarizing filter layer
Schematic showing the polarizer pattern in the second embodiment
Simplified explanatory drawing of the correspondence between the regions of the polarizer array and the pixels of the imaging element
Explanatory drawing of light of a specific polarization entering the polarizer array and the image sensor
Simplified schematic of the correspondence among the polarization separation means, the polarizing filter layer, and the pixels in the third embodiment
Explanatory drawing of the difference in the polarization degree image with and without the polarization separation means
Schematic cross-sectional view, along the light transmission direction, showing the structure of the optical filter in the fourth embodiment
Schematic front view of the optical filter of the fourth embodiment viewed from the sensor substrate side
Schematic diagram illustrating the correspondence between the polarizing filter layer of the optical filter and the pixels of the image sensor in the fourth embodiment
Schematic cross-sectional view, along the light transmission direction, showing the structure of the imaging element and optical filter in the fifth embodiment
Schematic front view of the optical filter in the fifth embodiment viewed from the sensor substrate side
Schematic diagram illustrating the correspondence between the polarizing filter layer of the optical filter and the pixels of the image sensor in the fifth embodiment
Graph showing the spectral characteristics of the color filter
Graph showing the spectral characteristics of the second spectral filter layer
Schematic cross-sectional view, along the light transmission direction, showing the structure of the optical filter in the sixth embodiment
Schematic front view of the optical filter of the sixth embodiment viewed from the sensor substrate side
Schematic diagram showing a configuration in which a light source for road surface illumination is arranged near the imaging unit
Perspective view showing the structure of the diffraction grating arranged in the output optical path of the light source
Explanatory drawing showing the reference pattern of a plurality of outgoing light beams and the imaging pattern of their reflected light

  Hereinafter, an image processing system according to the present invention and a vehicle including the image processing system will be described with reference to the drawings. Note that the dimensional ratios of the structures in the drawings do not necessarily match the actual dimensional ratios.

(First embodiment)
FIG. 1 is a schematic diagram showing a schematic configuration of an in-vehicle device control system including an image processing system 110 according to the present invention. The in-vehicle device control system uses captured image data of the area ahead of or behind the vehicle 100 in the traveling direction, captured by the imaging unit 101 mounted on the vehicle 100 (e.g., an automobile), to control the light distribution of the headlamp 104, drive the wiper 107 that removes foreign matter adhering to the windshield (transparent member) 105, and control other in-vehicle devices.

  The in-vehicle device control system shown in FIG. 1 mainly includes an imaging unit 101, an image analysis unit (image analysis means) 102, a headlamp control unit 103, a wiper control unit 106, and a vehicle travel control unit 108.

  The image processing system 110 according to the present embodiment includes an imaging unit 101 and an image analysis unit 102. The image analysis unit 102 has a function of controlling the imaging unit 101 and a function of analyzing captured image data transmitted from the imaging unit 101.

  The image analysis unit 102 analyzes the captured image data transmitted from the imaging unit 101 to detect foreign matter such as raindrops adhering to the windshield 105, to calculate the position, direction, and distance of other vehicles present in front of the vehicle 100, and to detect objects such as white lines (lane markings) and signs on the road surface within the imaging region.

  Hereinafter, information such as the position, direction, and distance of other vehicles ahead of or behind the vehicle 100, white lines (lane markings) on the road surface, the road surface state, and signs is also referred to as vehicle peripheral information. In detecting another vehicle, the image analysis unit 102 detects a preceding vehicle traveling in the same direction as the vehicle 100 by identifying its tail lamps in the captured image data, and detects an oncoming vehicle traveling in the opposite direction by identifying its headlamps.

  The calculation result of the image analysis unit 102 is sent to the headlamp control unit 103. For example, the headlamp control unit 103 generates a control signal for controlling the headlamp 104, an in-vehicle device of the vehicle 100, from the distance data calculated by the image analysis unit 102. Specifically, the switching between the high beam and the low beam of the headlamp 104, or partial light-shielding of the headlamp 104, is controlled so that the driver of the vehicle 100 can secure a field of view while preventing the strong light of the headlamp from entering the eyes of the drivers of preceding or oncoming vehicles and dazzling them.

  The calculation result of the image analysis unit 102 is also sent to the wiper control unit 106, which controls the wiper 107 to remove deposits such as raindrops from the windshield 105 of the vehicle 100. The wiper control unit 106 receives the foreign matter detection result from the image analysis unit 102 and generates a control signal for controlling the wiper 107. When this control signal is sent to the wiper 107, the wiper 107 operates to secure the visibility of the driver of the vehicle 100.

  The calculation result of the image analysis unit 102 is also sent to the vehicle travel control unit 108. Based on the white line detection result from the image analysis unit 102, the vehicle travel control unit 108, for example, warns the driver when the vehicle 100 is departing from the lane area defined by the white lines, and performs driving support control such as controlling the steering wheel and brakes of the host vehicle.

  In addition, based on the sign detection result (described later) from the image analysis unit 102 and the vehicle traveling state, the vehicle travel control unit 108 alerts the driver when, for example, the vehicle 100 is traveling at a speed close to the speed limit, or controls the speed of the vehicle 100 when it is traveling beyond the speed limit.

  FIG. 2 is a schematic diagram illustrating a schematic configuration of the imaging unit 101. The imaging unit 101 includes an imaging lens 204 that collects light transmitted through the windshield 105 from outside the vehicle 100, an imaging element 206 that performs imaging by photoelectrically converting the light collected by the imaging lens 204 for each pixel, an optical filter 205 disposed between the imaging lens 204 and the imaging element 206, a sensor substrate 207 on which the imaging element 206 is mounted, and a signal processing unit 208 that converts the analog electrical signal output from the sensor substrate 207 (the amount of light received by each light-receiving element on the imaging element 206) into a digital electrical signal and outputs the result as captured image data.

  The imaging lens 204, the optical filter 205, the imaging element 206, and the sensor substrate 207 are arranged in this order from the windshield 105 side. The signal processing unit 208 is electrically connected to the image analysis unit 102. FIG. 2 shows an example in which the image sensor 206 and the signal processing unit 208 are provided independently, but the configuration of the imaging unit 101 is not limited to this. For example, when an image sensor 206 having an A/D converter for each pixel is used, the A/D converter serves as the signal processing unit 208; in that case, the signal processing unit 208 is built into the image sensor 206.

  The imaging lens 204 is composed of, for example, a plurality of lenses, and the focal position is set at infinity or between the infinity and the outer wall surface of the windshield 105.

  Incident light from the imaging region containing the subject (detection target) passes through the imaging lens 204 and the optical filter 205 and is photoelectrically converted by the imaging element 206 into an electrical signal corresponding to the light intensity. The electrical signal (analog signal) output from the image sensor 206 via the sensor substrate 207 is input to the signal processing unit 208. The signal processing unit 208 then outputs to the subsequent image analysis unit 102 a digital signal (captured image data) containing the brightness (luminance information) and color information of each pixel on the image sensor 206, together with the horizontal/vertical synchronization signals of the image.

  As already described, in this embodiment, the focal position of the imaging lens 204 is set to infinity or between infinity and the outer wall surface of the windshield 105. Thereby, when detecting a preceding vehicle or an oncoming vehicle, or detecting a white line, appropriate information can be acquired from the captured image data of the imaging unit 101.

  However, when the lens is focused at infinity and the tail lamp of a preceding vehicle traveling far away is to be identified, the tail lamp light may fall on only about one light-receiving element of the image sensor 206. In that case, the tail lamp light may not be received by a red light-receiving element, which receives the tail lamp color (red), so the tail lamp cannot be recognized and the preceding vehicle cannot be detected. To avoid this problem, it is preferable to focus the imaging lens 204 short of infinity. The tail lamp of a distant preceding vehicle is then out of focus, so the number of light-receiving elements receiving its light can be increased, the recognition accuracy of the tail lamp rises, and the detection accuracy of the preceding vehicle improves.

  FIG. 3 is a schematic cross-sectional view along the light transmission direction of the optical filter 205A, the image sensor 206, and the sensor substrate 207 in the present embodiment. FIG. 4A is a schematic front view of the optical filter 205A viewed from the sensor substrate 207 side, and FIG. 4B is a schematic front view of the image sensor 206 viewed from the imaging lens 204 side. In FIGS. 3 and 4, the pixels of the image sensor 206 are illustrated in simplified form, but the image sensor 206 is actually composed of several hundred thousand pixels arranged two-dimensionally.

  As shown in FIG. 3, the optical filter 205A includes a substrate 220 that is transparent to incident light in the use band (the visible and infrared regions in this embodiment); a spectral filter layer (first spectral filter layer) 221, formed over the entire effective imaging region (the region corresponding to all the pixels of the imaging device 206) on the imaging lens 204 side of the substrate 220, that selectively transmits light with wavelength components in the ranges λ1 to λ2 and λ3 to λ4 (λ1 < λ2 < λ3 < λ4); a polarizing filter layer (first polarizing filter layer) 223 formed on the image sensor 206 side of the substrate 220; and a filler 224 covering the polarizing filter layer 223. The surface of the filler 224 on the image sensor 206 side is disposed close to the image sensor 206.

  Of the light incident on the optical filter 205A, the light transmitted through both the spectral filter layer 221 and the polarizing filter layer 223 enters a predetermined region (the lower region 211 in FIG. 4) of the effective imaging region of the image sensor 206. Light that passes through the spectral filter layer 221 but not through the polarizing filter layer 223 enters the effective imaging region other than the lower region 211 (the upper region 212 in FIG. 4).

  The image sensor 206 is an image sensor using a CCD (Charge Coupled Device), a CMOS (Complementary Metal Oxide Semiconductor), or the like, and has a pixel array composed of a plurality of pixels arranged two-dimensionally.

  In each pixel of the image sensor 206, a light-receiving element such as a photodiode receives the light incident from the optical filter 205A side. To increase the light collection efficiency of the light-receiving elements, a microlens (not shown) may be provided for each pixel on the incident side of the image sensor 206. The image sensor 206 configured in this manner is bonded to a PWB (printed wiring board) by a technique such as wire bonding and mounted on the sensor substrate 207.

  FIG. 5 is a schematic view illustrating the correspondence between the polarizing filter layer 223 of the optical filter 205A and the pixels of the image sensor 206. FIG. 5 shows an example in which polarizers that transmit the vertical polarization component S of incident light and polarizers that transmit the horizontal polarization component P are arranged in a checkered pattern in units of one pixel. However, the arrangement pattern of the polarizers is not limited to this; it may be a pattern with units of two or more pixels, such as the stripe pattern described later.

  In the polarizing filter layer 223, the polarizers are divided into regions corresponding to the pixels of the image sensor 206 within the lower region 211 (see FIG. 4). The polarizers are configured so that a plurality of polarization components of the incident light, including at least the horizontal polarization component P and the vertical polarization component S, enter the respective pixels of the image sensor 206.

  Note that the pixels in the lower region 211, where the polarizing filter layer 223 is arranged, capture images whose transmitted light amount is uneven according to the polarizer arrangement pattern; these images are converted into a polarization degree image (described later) and used for various kinds of information detection. On the other hand, the pixels in the upper region 212, where the polarizing filter layer 223 is not disposed, can form a high-resolution image without unevenness.

  The image analysis unit 102 acquires the horizontal polarization component P and the vertical polarization component S from the pixels of the lower region 211 and, using known image interpolation processing (for example, averaging the pixel values of adjacent pixels), generates two images: a horizontal polarization component image and a vertical polarization component image.
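As a rough sketch of this interpolation step, the checkered mosaic can be split into full-resolution P and S images by filling each missing site with the average of its four neighbors. The assignment of P to even (row + col) sites is an assumption for illustration; the actual layout follows FIG. 5.

```python
import numpy as np

def split_polarization_images(raw):
    """Separate a checkerboard-patterned polarization mosaic into
    full-resolution P (horizontal) and S (vertical) component images by
    averaging the four nearest neighbors at the missing sites, as the
    text suggests. Assumes P polarizers sit where (row + col) is even
    and S where it is odd; the real layout in FIG. 5 may differ."""
    h, w = raw.shape
    rows, cols = np.indices((h, w))
    p_mask = (rows + cols) % 2 == 0   # pixels that measured P
    s_mask = ~p_mask                  # pixels that measured S

    # Pad with edge values so every pixel has four neighbors.
    padded = np.pad(raw.astype(np.float64), 1, mode="edge")
    neighbor_mean = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                     padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0

    p_img = np.where(p_mask, raw, neighbor_mean)  # fill S sites from P neighbors
    s_img = np.where(s_mask, raw, neighbor_mean)  # fill P sites from S neighbors
    return p_img, s_img
```

In the checkerboard every interior missing site is surrounded by four pixels of the other component, so the averaging is exact for smooth scenes; only the border pixels inherit some bias from the edge padding.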

Further, the image analysis unit 102 obtains the degree of polarization for each pixel from the following equation (1), where I(P) is the pixel value of the horizontal polarization component image and I(S) is the pixel value of the vertical polarization component image, thereby generating a polarization degree image that does not depend on luminance information.
Polarization degree = (I (P) −I (S)) / (I (P) + I (S)) (1)
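Equation (1) applies pixel-wise to the two component images; a minimal sketch (the small eps guard against all-dark pixels is our addition, not part of the patent):

```python
import numpy as np

def polarization_degree_image(p_img, s_img, eps=1e-6):
    """Per-pixel degree of polarization from Eq. (1):
    (I(P) - I(S)) / (I(P) + I(S)).
    eps guards against division by zero in dark pixels (an assumption
    added here; the patent does not specify this)."""
    p = np.asarray(p_img, dtype=np.float64)
    s = np.asarray(s_img, dtype=np.float64)
    return (p - s) / (p + s + eps)
```

The result is dimensionless and independent of the overall luminance, which is what makes it usable under varying illumination.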

  The polarization degree image is used to detect road surface states such as dry, wet, and frozen in order to prevent the vehicle 100 from skidding. The change in reflected light according to the road surface state is described below.

  In the road surface state determination process performed by the image analysis unit 102 in this embodiment, of the information obtainable from the imaging unit 101, polarization information obtained by comparing the horizontal polarization component P and the vertical polarization component S of the white (non-spectral) component incident on the lower region 211 is used.

  FIGS. 6A and 6B are explanatory diagrams for explaining changes in reflected light when the road surface state is a wet state and when the road surface state is a dry state.

As shown in FIG. 6(a), a wet road surface approaches a mirror surface because water accumulates in the recesses of the road surface. The reflected light from a wet road surface therefore exhibits the following polarization characteristics. When the reflectances for the horizontal polarization component P and the vertical polarization component S of the reflected light are Rp and Rs, respectively, the horizontal polarization component Ip and the vertical polarization component Is of the reflected light for incident light of intensity I can be calculated from the following equations (2) and (3), and their incident-angle dependence is as shown in FIG. 7.
Ip = Rp × I (2)
Is = Rs × I (3)

  As can be seen from FIG. 7, the reflectance Rp of the horizontal polarization component of the reflected light at a mirror surface becomes zero when the incident angle equals the Brewster angle (53.1 degrees), so the reflected light intensity of the horizontal polarization component Ip also becomes zero. The reflectance Rs of the vertical polarization component increases gradually as the incident angle increases, so the reflected light intensity of the vertical polarization component Is also increases gradually with the incident angle.
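The behavior in FIG. 7 follows from the Fresnel equations. The sketch below computes Rp and Rs for an air-to-water interface (n2 = 1.33, which reproduces the Brewster angle of about 53.1 degrees cited above); the patent itself only states the resulting angle, so the refractive index is an assumption for illustration:

```python
import math

def fresnel_reflectances(theta_i_deg, n1=1.0, n2=1.33):
    """Fresnel power reflectances Rp (parallel, horizontal component)
    and Rs (perpendicular, vertical component) for light passing from
    medium n1 into medium n2. n2 = 1.33 models a water film on a wet
    road; this value is an illustrative assumption."""
    ti = math.radians(theta_i_deg)
    tt = math.asin(math.sin(ti) * n1 / n2)  # Snell's law
    rp = (n2 * math.cos(ti) - n1 * math.cos(tt)) / (n2 * math.cos(ti) + n1 * math.cos(tt))
    rs = (n1 * math.cos(ti) - n2 * math.cos(tt)) / (n1 * math.cos(ti) + n2 * math.cos(tt))
    return rp * rp, rs * rs

# Brewster angle for n2/n1 = 1.33: atan(1.33) ~ 53.06 degrees, where Rp -> 0
brewster_deg = math.degrees(math.atan(1.33))
```

Evaluating at the Brewster angle gives Rp essentially zero while Rs remains finite, matching the curves in FIG. 7.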

  On the other hand, as shown in FIG. 6(b), a dry road surface is rough, so diffuse reflection is dominant, the reflected light exhibits little polarization, and the difference between the reflectances Rp and Rs of the polarization components becomes small.

Whether the road surface is wet or dry can be determined from this difference in the polarization characteristics of the reflected light. Specifically, the image analysis unit 102 uses the polarization ratio H given by the following equation (4) to determine the wet/dry state of the road surface. For example, the polarization ratio H can be obtained as the average, over an image region showing the road surface, of the ratio (S/P) between the vertical polarization component S and the horizontal polarization component P of white (non-spectral) light. Since the polarization ratio H does not depend on the incident light intensity I, as equation (4) shows, it can be used stably for the wet/dry determination without being affected by luminance fluctuations in the imaging region.
H = Is / Ip = Rs / Rp (4)

Equation (4) can also be expressed as the following equation (5), as a function of the degree of polarization defined in equation (1). Here, Is is regarded as the pixel value I(S) and Ip as the pixel value I(P) of the polarization degree image.
H = (1−degree of polarization) / (1 + degree of polarization) (5)

  When the polarization ratio H obtained in this way exceeds a predetermined threshold, the image analysis unit 102 determines that the road surface is wet; when H is at or below the threshold, the road surface is determined to be dry. When the road surface is dry, the vertical polarization component S and the horizontal polarization component P are approximately equal, so the polarization ratio H is about 1. When the road surface is completely wet, S is much larger than P, so H becomes large. When the road surface is slightly wet, H takes an intermediate value.
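Combining equations (1), (4), and (5), the decision can be sketched as follows. The threshold value of 1.3 is an illustrative assumption; the patent only says a "predetermined threshold" is used:

```python
import numpy as np

def classify_road_surface(degree_img, road_mask, wet_threshold=1.3):
    """Wet/dry decision following Eqs. (4) and (5): the mean degree of
    polarization d over the road region is converted to the polarization
    ratio H = (1 - d) / (1 + d) and compared with a threshold. Both the
    use of a mean over a masked road region and the threshold 1.3 are
    assumptions for illustration."""
    d = float(np.mean(np.asarray(degree_img)[np.asarray(road_mask)]))
    h = (1.0 - d) / (1.0 + d)   # Eq. (5), equivalent to Is / Ip
    return ("wet" if h > wet_threshold else "dry"), h
```

On a dry surface d is near 0 and H is near 1; on a wet surface the S component dominates, d goes negative, and H grows well above 1.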

  In this embodiment, the result of the wet/dry determination described above is used for driving support control such as warning the driver of the vehicle 100 and controlling the steering wheel and brakes of the vehicle 100. Specifically, when the road surface is determined to be wet, the determination result is sent to the vehicle travel control unit 108 and used, for example, to control the automatic brake system of the vehicle 100, which can be expected to reduce traffic accidents. Alternatively, for example, a warning that the road surface is slippery may be displayed on the screen of the car navigation system of the vehicle 100 to alert the driver.

  Incidentally, on a frozen road surface, particularly when the surface layer of the asphalt is frozen into so-called black ice, it is difficult to distinguish the frozen region in a normal luminance image. In a polarization degree image, by contrast, the frozen region appears clearly. FIG. 8 shows a luminance image and a polarization degree image of an ice sheet on a concrete surface.

  As shown in FIG. 8(a), it is difficult to tell where the ice is in the luminance image, whereas in the polarization degree image of FIG. 8(b) the location of the ice sheet (the portion enclosed by an ellipse in the figure) can be identified. In the concrete portion without ice, the light is scattered (diffusely reflected), so the degree of polarization is almost zero. In the portion covered by the ice sheet, the light is scattered inside the ice, but the P polarization component dominates the light transmitted from inside the ice through the interface with the air. For this reason, in FIG. 8(b) the ice sheet portion and the dry concrete portion can be clearly distinguished.

  By using polarization information, information other than the road surface state can also be detected, for example an obstacle in shadow that is difficult to detect in a luminance image. FIG. 9 is an explanatory diagram showing the imaging result for an obstacle in shadow. For convenience, the images in FIG. 9 were captured with an imaging device in which a polarizing filter layer covers the entire effective imaging region.

  Compared with the luminance image in FIG. 9(a), the vehicle parked beside the road can be recognized in the polarization degree image in FIG. 9(b). The polarization degree image is generated by the calculation of expression (1) shown earlier; it is independent of the luminance information and can express angle information and material differences of an object, which makes it suitable for detecting obstacles in shadows. In other words, in the present embodiment, the image analysis unit 102 can detect obstacles with high accuracy by using such a polarization degree image.

  In the present embodiment, the imaging unit 101 may capture not only the vehicle surrounding information in front of the vehicle 100 but also, for example, the vehicle surrounding information behind it. By arranging the imaging unit 101 as a rear view camera, which in recent years has been mounted on various vehicles, and capturing a polarization degree image, an obstacle or the like hidden in the shadow of the host vehicle can be detected.

  When the imaging unit 101 is used as a rear view camera, it is desirable to use a wide-angle lens having an angle of view of 90 degrees or more as the imaging lens 204. In general, when a wide-angle lens is used, the incident angle at the image periphery is larger than at the image center (for example, 20 degrees). As the incident angle increases, the transmittance of incident light decreases in the region where the polarizer of the polarizing filter layer 223 is formed.

Therefore, when a wide-angle lens is used, the image analysis unit 102 may multiply the degree of polarization of expression (1) by a predetermined gain α corresponding to the position in the imaging region. That is, instead of expression (1), the following expression (6) may be used.
Polarization degree = α × (I(P) − I(S)) / (I(P) + I(S)) (6)

  For example, the gain α may be set to α = 1 at the central portion of the effective imaging region and α = 1.1 at the peripheral portion, according to the transmittance deterioration. As a result, an image having no difference in polarization degree between the image center and the image periphery can be captured.
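As a rough numerical sketch of this correction, the degree of polarization of expression (6) could be computed with a position-dependent gain as below. The linear interpolation of α between the stated center value (1.0) and peripheral value (1.1), and all function names, are illustrative assumptions; the text only fixes the two endpoint values.

```python
import numpy as np

def polarization_degree(i_p, i_s, gain):
    # Expression (6): gain-corrected degree of polarization per pixel.
    diff = i_p - i_s
    denom = i_p + i_s
    out = np.zeros_like(denom, dtype=float)
    np.divide(diff, denom, out=out, where=denom > 0)  # leave 0 where I(P)+I(S)=0
    return gain * out

def radial_gain(height, width, center_gain=1.0, edge_gain=1.1):
    # Hypothetical linear ramp of alpha from image center to the corners,
    # compensating the transmittance drop at large incident angles.
    y, x = np.mgrid[0:height, 0:width]
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
    r = np.hypot((y - cy) / cy, (x - cx) / cx) / np.sqrt(2.0)  # 0 center, 1 corner
    return center_gain + (edge_gain - center_gain) * np.clip(r, 0.0, 1.0)
```

Any monotone profile between the two stated values would serve equally; the ramp above is only the simplest choice.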

  Further, by performing known edge extraction on the luminance image and the polarization degree image and superimposing the two edge-extracted images, the image analysis unit 102 can improve the obstacle detection performance compared with using the luminance image alone.
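The edge superimposition could be sketched as follows. The gradient-magnitude operator and the pixelwise-maximum combination are illustrative choices; the text only requires some known edge extraction and a superposition of the two edge images.

```python
import numpy as np

def edge_magnitude(img):
    # Finite-difference gradient magnitude as a stand-in for any known
    # edge extractor (Sobel, Canny, ...).
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def fused_edges(luminance, polarization_degree):
    # Pixelwise maximum: an edge visible in either image is kept, so
    # material boundaries that only the polarization image shows survive.
    return np.maximum(edge_magnitude(luminance),
                      edge_magnitude(polarization_degree))
```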

  Note that the image processing system 110 according to the present embodiment can capture not only a polarization degree image containing information such as the road surface state in the lower region 211, but also a luminance image, free of polarization limitation, in the upper region 212. The upper region 212 corresponds to, for example, the region above the road surface and can be used for detecting signs and the like. The upper region 212 provides higher resolution and a larger amount of light than the lower region 211 where the polarizer is disposed.

  FIG. 10 is a schematic diagram illustrating sign detection performed by the image processing system 110 (only the imaging unit 101 is illustrated) mounted inside the vehicle 100. The imaging unit 101 captures an image of a road sign (for example, a restriction sign prohibiting entry of vehicles) 300 located in front of the vehicle 100 and, via the image analysis unit 102, alerts the driver by voice or by an image on the display.

  There may be a plurality of types of road signs to be detected. For example, the image processing system 110 may store various road sign information in the memory of the image analysis unit 102 in advance and compare the road sign information acquired using the imaging unit 101 with the road sign information stored in the memory. Further, the image processing system 110 may recognize the acquired road sign information, convert it into a voice synthesis signal, and notify the driver of the road sign information by voice.

  Incidentally, the lower region 211, where the polarizer is arranged, receives about half the amount of transmitted light compared to the upper region 212, which is not subject to polarization limitation. Therefore, it is preferable to use different exposure amounts at the time of imaging for the polarization degree image generated from the light transmitted through the lower region 211 and the luminance image generated from the light transmitted through the upper region 212.

  Specifically, through the automatic exposure adjustment of the image analysis unit 102, the imaging unit 101 may be configured to capture an image frame for the polarization degree image with a first exposure amount (exposure time) in the lower region 211, and an image frame for the luminance image with a second exposure amount (exposure time) in the upper region 212. For example, the image analysis unit 102 may change the exposure amount (exposure time) by controlling the time during which each pixel of the image sensor 206 converts incident light into an electrical signal.

  That is, for the lower region 211, the image analysis unit 102 performs automatic exposure adjustment while detecting the amount of light transmitted through the lower region 211, so that the imaging unit 101 images the road surface state with the first exposure amount (exposure time). For the upper region 212, the image analysis unit 102 performs automatic exposure adjustment while detecting the amount of light transmitted through the upper region 212, so that the imaging unit 101 images the sign with the second exposure amount (exposure time). Thereby, each image can be captured with an exposure amount optimal for it.

  The upper region 212 is subject to large changes in the amount of light. Specifically, since the illuminance around the vehicle changes from tens of thousands of lux in the daytime to 1 lux or less at night, the exposure time must be adjusted according to the imaging scene. For this, the image analysis unit 102 may perform known automatic exposure control.

  On the other hand, the lower region 211 can be imaged with a fixed exposure time set to compensate for its roughly halved transmitted light, for example twice the exposure time of the upper region 212 (that is, with a fixed exposure amount). Thereby, shortening of the exposure control time, simplification of the exposure control, and the like can be realized.

  FIG. 11 is a flowchart showing the exposure control procedure in the present embodiment. First, the image analysis unit 102 performs exposure adjustment for the upper region 212, where no polarizer is arranged. The imaging unit 101 then captures an image frame for the luminance image with the second exposure amount set by this exposure adjustment (step S120).

  Next, the image analysis unit 102 analyzes the image frame for the luminance image captured in step S120 (step S121) and, based on the analysis result, sends an instruction signal for causing the vehicle travel control unit 108 and the like to perform various controls (step S122).

  Next, the image analysis unit 102 performs exposure adjustment for the lower region 211, where the polarizer is disposed. The imaging unit 101 then captures an image frame for the polarization degree image with the first exposure amount set by this exposure adjustment (step S123).

  Next, the image analysis unit 102 analyzes the image frame for the polarization degree image captured in step S123 (step S124). Then, the image analysis unit 102 sends an instruction signal for causing the vehicle travel control unit 108 to perform various controls based on the analysis result in step S124 (step S125).

  Then, the image analysis unit 102 repeatedly executes the processes of steps S120 to S125 until there is a predetermined end instruction (such as an end instruction by the driver of the vehicle 100) (step S126).
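The loop of FIG. 11 could be sketched as below. The camera/analyzer/controller interfaces and all method names are hypothetical; the text only specifies the ordering of steps S120 to S126.

```python
def exposure_control_loop(camera, analyzer, controller, should_stop):
    # Sketch of the FIG. 11 flow: the two regions are imaged alternately,
    # each with its own exposure setting.
    while not should_stop():                        # S126: loop until end instruction
        t2 = analyzer.adjust_exposure("upper")      # no polarizer: auto exposure
        frame = camera.capture("upper", t2)         # S120: luminance image frame
        result = analyzer.analyze(frame)            # S121
        controller.apply(result)                    # S122

        t1 = analyzer.adjust_exposure("lower")      # polarizer region
        frame = camera.capture("lower", t1)         # S123: polarization image frame
        result = analyzer.analyze(frame)            # S124
        controller.apply(result)                    # S125
```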

  Hereinafter, the spectral characteristics of the spectral filter layer 221 included in the optical filter 205A will be described. FIG. 12 is a graph showing the spectral characteristics of the spectral filter layer 221. As shown in FIG. 12, the spectral filter layer 221 has transmittance characteristics that pass incident light in the so-called visible light region of 400 nm to 670 nm (here, λ1 = 400 nm, λ2 = 670 nm) and in the infrared light region of 940 nm to 1000 nm (here, λ3 = 940 nm, λ4 = 1000 nm), and that cut incident light in the wavelength range of 670 nm to 940 nm. The transmittance in the wavelength ranges of 400 nm to 670 nm and 940 nm to 1000 nm is preferably 30% or more, and more preferably 90% or more. The transmittance in the wavelength range of 670 nm to 940 nm is preferably 20% or less, and more preferably 5% or less.

  Incident light in the visible light region is used for detecting vehicle periphery information, and incident light in the infrared light region is used for detecting road surface conditions and obstacles at night. The reason why incident light in the wavelength range of 670 nm to 940 nm is not transmitted is that, if the image sensor 206 took in incident light in this range, the captured image data would become reddish overall, and it could become difficult to extract the red image portions corresponding to red signs and tail lamps.

  Therefore, if the spectral filter layer 221, which has the characteristic of cutting most of the infrared wavelength range (670 nm to 940 nm) as shown in FIG. 12, is formed over the entire effective imaging region, disturbance light can be removed. For example, the detection accuracy of red signs, such as a stop sign in Japan, and the identification accuracy of tail lamps can be improved. Note that the wavelength ranges of 940 nm to 1000 nm and 400 nm to 670 nm are representative examples of the wavelength ranges according to the present invention.
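As a toy model of the FIG. 12 characteristic, a wavelength could be classified as passed or cut as below. The step-function band edges are an idealization; a real filter rolls off gradually.

```python
def spectral_filter_passes(wavelength_nm):
    # Step-function model of the FIG. 12 characteristic: pass the visible
    # band (400-670 nm) and the near-infrared band (940-1000 nm), cut
    # everything in between.
    return 400 <= wavelength_nm <= 670 or 940 <= wavelength_nm <= 1000
```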

  Incidentally, when the imaging direction of the imaging unit 101 is tilted downward, the hood of the host vehicle may enter the lower part of the imaging area. In this case, sunlight reflected by the hood of the host vehicle or light from the tail lamp of a preceding vehicle becomes disturbance light, and its inclusion in the captured image data may cause misidentification of the headlamps of oncoming vehicles, the tail lamps of preceding vehicles, and white lines. Even in such a case, in the present embodiment, the spectral filter layer 221 that blocks light in the wavelength range of 670 nm to 940 nm is formed over the entire effective imaging region, so that disturbance light such as sunlight reflected by the hood is removed. Therefore, the identification accuracy for the headlamps of oncoming vehicles, the tail lamps of preceding vehicles, and white lines is improved.

  As already described, the optical filter 205A is disposed close to the surface of the image sensor 206 on the imaging lens 204 side. This is because optical crosstalk between adjacent pixels becomes more likely as the optical filter 205A and the image sensor 206 are separated from each other. It is therefore desirable that the optical filter 205A and the image sensor 206 be closely bonded, for example by adhesion, so that the gap between them is 2 μm or less. This also makes it easy to match the boundary between the lower region 211 and the upper region 212 of the optical filter 205A with the pixel boundaries of the image sensor 206.

  For example, the optical filter 205A and the image sensor 206 may be bonded with a UV adhesive, or, while being supported by a spacer outside the effective imaging area, may be UV-bonded or thermo-compression bonded along the four side areas outside the effective imaging area.

  Hereinafter, each part of the optical filter 205A will be described in detail. The substrate 220 is made of a transparent material that transmits light in the wavelength bands used (the visible light region and the infrared light region in this embodiment), such as glass, sapphire, or quartz. In the present embodiment, glass, in particular quartz glass (refractive index 1.46) or Tempax glass (refractive index 1.51), which is inexpensive and durable, can be suitably used.

  The polarizing filter layer 223 formed on the substrate 220 is composed of a polarizer with a wire grid structure as shown in FIG. 13, and its surface on the image sensor 206 side is an uneven surface. The wire grid structure is a structure in which metal wires (conductor lines) made of a metal such as aluminum, extending in a specific direction, are arranged at a specific pitch. When the wire pitch is made sufficiently small compared to the wavelength band of the incident light (for example, 400 nm to 800 nm), such as one half of the wavelength or less, the structure reflects most of the light whose electric field vector component oscillates parallel to the longitudinal direction of the metal wires and transmits most of the light whose electric field vector component oscillates perpendicular to it. It can therefore be used as a polarizer that produces a single polarization.
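The pitch condition above can be stated as a one-line check; the factor of one half is the rule of thumb given in the text, and the function name is illustrative.

```python
def acts_as_wire_grid_polarizer(pitch_nm, shortest_wavelength_nm=400):
    # The grid pitch should be sufficiently small compared with the incident
    # wavelength band (e.g. one half of the shortest wavelength or less)
    # for the structure to behave as a polarizer.
    return pitch_nm <= shortest_wavelength_nm / 2
```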

  In a wire grid polarizer, the extinction ratio generally increases as the cross-sectional area of the metal wire increases, while the transmittance decreases for metal wires whose width exceeds a certain fraction of the period. Further, when the cross-sectional shape perpendicular to the longitudinal direction of the metal wire is tapered, the wavelength dispersion of the transmittance and the degree of polarization is small over a wide band, and a high extinction ratio is exhibited.

  In the present embodiment, forming the polarizing filter layer 223 as a wire grid structure has the following advantages. The wire grid structure can be formed using well-known semiconductor manufacturing processes: after depositing an aluminum thin film on the substrate 220, the film is patterned, and the sub-wavelength uneven structure of the wire grid is formed by a technique such as metal etching. Such a manufacturing process makes it possible to set the longitudinal direction of the metal wires, that is, the polarization direction (polarization axis), at the scale of the imaging pixel size of the image sensor 206 (a few μm). Therefore, as in the present embodiment, it is possible to create a polarizing filter layer 223 in which the longitudinal direction of the metal wires, that is, the polarization direction (polarization axis), differs for each imaging pixel.

  Further, since the wire grid structure is made of a metal material such as aluminum, it is excellent in heat resistance and can be suitably used even in a high-temperature environment such as a vehicle interior, which easily becomes hot.

  As described above, the polarizer of the polarizing filter layer 223 has a wire grid structure of sub-wavelength size, so its mechanical strength is low and the metal wires can be damaged by a slight external force. Since the optical filter 205A of the present embodiment is to be disposed in close contact with the image sensor 206, the optical filter 205A and the image sensor 206 may come into contact during manufacturing. The optical filter 205A and the image sensor 206 are therefore preferably arranged in parallel, with a planarization layer formed between them.

  The filler 224 is used to flatten the upper surface of the polarizing filter layer 223 in the stacking direction, and fills the recesses between the metal wires of the polarizing filter layer 223. As the filler 224, an inorganic material having a refractive index lower than or equal to that of the substrate 220 can be suitably used. In the present embodiment, the filler 224 is also formed so as to cover the upper surface, in the stacking direction, of the metal wire portions of the polarizing filter layer 223.

The filler 224 is preferably made of a low refractive index material whose refractive index is as close as possible to that of air (refractive index = 1) so as not to degrade the polarization characteristics of the polarizing filter layer 223. As a specific material, a porous ceramic material formed by dispersing fine pores in a ceramic is preferable; examples include porous silica (SiO2), porous magnesium fluoride (MgF2), and porous alumina (Al2O3). How low the refractive index becomes is determined by the number and size of the pores in the ceramic (the porosity). When the main component of the substrate 220 is silica crystal or glass, porous silica (n = 1.2 to 1.26) can be suitably used.
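How the porosity sets the effective index can be sketched with a simple volume-average mixing rule. This is a rough assumption for illustration; real porous films call for a proper effective-medium model such as Bruggeman's.

```python
def porous_index(n_solid, porosity):
    # Volume-weighted average of the solid index and that of air (n = 1).
    return (1.0 - porosity) * n_solid + porosity * 1.0

# With silica (n ~ 1.46), porosities of roughly 0.43-0.57 land in the
# n = 1.2-1.26 range quoted for porous silica.
```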

As a method for forming the filler 224, for example, application of an inorganic coating film (SOG: Spin On Glass) can be suitably used. Specifically, a solvent in which silanol (Si(OH)4) is dissolved in alcohol is spin-coated onto the polarizing filter layer 223 formed on the substrate 220, and the solvent component is then volatilized by heat treatment, so that the silanol itself undergoes a dehydration polymerization reaction to form the filler 224.

  As described above, in this embodiment, the upper surface of the polarizing filter layer 223 in the stacking direction, that is, the surface on the image sensor 206 side, is covered with the filler 224, so that damage to the wire grid structure upon contact with the image sensor 206 is suppressed.

  Further, by filling the filler 224 into the recesses between the metal wires in the wire grid structure of the polarizing filter layer 223 as in the present embodiment, it is possible to prevent foreign matter from entering the recesses.

  The spectral filter layer 221 of the present embodiment has a multilayer film structure in which high refractive index thin films and low refractive index thin films are alternately stacked. With such a multilayer film structure, light interference provides a high degree of freedom in setting the spectral transmittance, and by stacking many thin films it is also possible to realize a reflectance close to 100% for a specific wavelength band (for example, wavelengths other than red).
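For a quarter-wave stack, each layer's physical thickness follows directly from the design wavelength and its refractive index (d = λ/4n). The TiO2/SiO2 index values below are illustrative assumptions; the text does not name the film materials.

```python
def quarter_wave_thickness_nm(design_wavelength_nm, n):
    # Each layer of a quarter-wave multilayer mirror has optical thickness
    # lambda/4, i.e. physical thickness d = lambda / (4 n); alternating
    # high/low index layers then reflect strongly around the design wavelength.
    return design_wavelength_nm / (4.0 * n)

# Example: a stack designed to reflect around 800 nm (inside the cut band).
d_high = quarter_wave_thickness_nm(800.0, 2.35)   # e.g. TiO2 (assumed index)
d_low = quarter_wave_thickness_nm(800.0, 1.46)    # e.g. SiO2 (assumed index)
```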

  As described above, in the image processing system of the present embodiment, the polarizing filter layer is disposed only in the lower part of the effective imaging region. By limiting the arrangement of the polarizing filter layer in this way, the image processing system of the present embodiment and a vehicle equipped with it can obtain the polarization information of the light from the road surface, which corresponds to the lower part of the effective imaging region, while imaging the region above the road surface, that is, road signs and the like, without losing resolution or light amount.

(Second Embodiment)
An image processing system according to a second embodiment of the present invention and a vehicle including the image processing system will be described with reference to the drawings. Descriptions of structures and operations similar to those of the embodiments already described are omitted as appropriate.

  The present embodiment shows an example in which the region division method of the polarizing filter layer is changed. In the present embodiment, the polarizing filter layer is formed as a stripe pattern in which polarizers that transmit the horizontal polarization component P of incident light and polarizers that transmit the vertical polarization component S of incident light alternate.

  FIG. 14 is a schematic diagram illustrating the correspondence between the polarizing filter layer 223 of the optical filter 205A shown in FIG. 5 of the first embodiment and the pixels of the image sensor 206, and shows another example of a polarizer pattern. That is, FIG. 14 shows an example of a stripe-shaped polarizer pattern in which columns of polarizers transmitting the horizontal polarization component P of incident light, arranged in the vertical direction (Z direction) with a width of one pixel, alternate with columns of polarizers transmitting the vertical polarization component S of incident light, likewise one pixel wide and arranged in the vertical direction.

  In the present embodiment, forming the polarizing filter layer 223 as a stripe pattern relaxes the required positional alignment accuracy between the pixels of the image sensor 206 and the portion of the optical filter 205A where the polarizing filter layer 223 is formed, compared with the checkered pattern of the first embodiment.

  That is, with a checkered pattern as in the first embodiment, matching each pixel of the image sensor 206 with the portion where the polarizing filter layer 223 of the optical filter 205A is formed requires precise position adjustment in both the Y direction and the Z direction. With a stripe pattern as in the present embodiment, on the other hand, only the position adjustment in the Y direction needs to be precise. Thereby, in the process of bonding the optical filter 205A and the image sensor 206, the assembly time can be shortened and the assembly apparatus can be simplified.

  In addition, for the purpose of detecting the position and distance at which the road surface state changes while the vehicle travels, sufficient image resolution in the traveling direction is desirable, so the pitch direction of the stripe pattern is preferably the horizontal direction (Y direction).

(Third embodiment)
An image processing system according to a third embodiment of the present invention and a vehicle including the image processing system will be described with reference to the drawings. Descriptions of structures and operations similar to those of the embodiments already described are omitted as appropriate.

  In the present embodiment, a polarization separation unit 230 is disposed between the imaging lens 204 and the optical filter 205A (in front of the optical filter 205A). The polarization separation unit 230 separates incident light into a horizontal polarization component P and a vertical polarization component S. Further, the polarization separation unit 230 has the function of letting one of the horizontal polarization component P and the vertical polarization component S of the incident light travel straight while shifting the optical path of the other by a predetermined shift amount.

  Hereinafter, with reference to FIGS. 15 and 16, the accuracy problem of the polarization component image information in an imaging apparatus without such a polarization separation unit will be described. FIG. 15 is an explanatory diagram showing, in simplified form, the correspondence between the regions (f0, f1, f2, f3, f4, f5) of the polarizer array and the pixels (c0, c1, c2, c3, c4, c5) of the image sensor. As shown in FIG. 15, first polarization component image information (corresponding to one pixel of the polarization component image) is generated from the horizontal polarization component P at position A and the vertical polarization component S at position B, and second polarization component image information (corresponding to one pixel of the polarization component image) is generated from the horizontal polarization component P at position C and the vertical polarization component S at position D. In this way, a polarization component image is acquired from a plurality of pieces of polarization component image information.

  Next, with reference to FIG. 16, consider a situation in which light of a specific polarization enters the polarizer array and the image sensor. FIG. 16(a) shows two pixels (light receiving elements) c1 and c2, and polarizers f1 and f2 that transmit mutually orthogonal polarization components corresponding to them. Incident light passes through the polarizer array, enters each pixel, and is converted into an electrical signal (pixel value) at each pixel. Here, for simplicity of explanation, assume that the polarizers f1 and f2 completely cover the two pixels c1 and c2, respectively, completely transmit one of the orthogonal polarization components, and completely block the other. In addition, assume that light of the same intensity is incident at both positions A and B.

  FIG. 16(b) shows the pixel values output by each pixel in the configuration of FIG. 16(a). The horizontal axis indicates position, the vertical axis indicates the pixel value relative to the intensity of the incident light, and the polarization component image information is generated from these pixel values. In the examples of FIGS. 16(a) and 16(b), since the pixel values at positions A and B are the same, the polarization component image information is output as non-polarized light (degree of polarization of zero).

  FIGS. 16(c) and 16(d) show a case where light with different polarization characteristics is incident on the same image sensor. FIG. 16(c) shows an example in which non-polarized light is incident at position A and S-polarized light is incident at position B. Here, the intensity of the non-polarized light before passing through the polarizer array is twice the intensity of the S-polarized light. Although S-polarized light is incident at position B, the pixel values at positions A and B are equal as shown in FIG. 16(d), so the polarization component image information is again output as non-polarized light (degree of polarization of zero). That is, polarization component image information that differs from the polarization characteristics of the light before passing through the polarizer array is output.

  The important point here is that the distribution of pixel values has exactly the same shape even though light with different polarization characteristics is incident. When the pixel values output from the image sensor are the same, the subsequent signal processing that derives the polarization component image information cannot distinguish whether the original incident light was that of FIG. 16(a) or that of FIG. 16(c).

  In practice, since light in the same polarization state is highly likely to be incident on neighboring regions, the processing is in many cases limited to assuming one of the two cases in FIG. 16. As a result, the accuracy of the polarization component image information deteriorates at edge portions, such as horizontal and vertical contours, where the polarization state easily switches. This phenomenon is a fundamental problem in any imaging device with a region division type filter that sends the components of the incident light to pixels (light receiving elements) at different positions.
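The ambiguity of FIG. 16 can be reproduced numerically under the same idealizations (perfect polarizers, the intensities stated above, and the FIG. 16(a) light taken as non-polarized for concreteness):

```python
def sensor_output(light_a, light_b):
    # light_* = (P intensity, S intensity) before the polarizer array.
    # Pixel c1 at position A sits behind an ideal P polarizer, pixel c2
    # at position B behind an ideal S polarizer.
    return (light_a[0], light_b[1])

# FIG. 16(a): non-polarized light of equal intensity at both positions.
case_a = sensor_output((1.0, 1.0), (1.0, 1.0))
# FIG. 16(c): non-polarized light of twice the intensity at position A,
# pure S-polarized light at position B.
case_c = sensor_output((1.0, 1.0), (0.0, 1.0))
# Identical pixel values: the two scenes cannot be told apart downstream.
```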

  Thus, if there is a mismatch between the pixel resolution of the polarization component image and the spatial frequency of the pattern of the subject, moire occurs in the polarization component image. In the present embodiment, as a method of reducing such moire, a polarization separation unit 230 acting as an optical low-pass filter is provided in front of the optical filter 205A.

  FIG. 17 is a schematic diagram showing, in simplified form, the correspondence between the polarization separation unit 230, the polarizing filter layer 223 of the optical filter 205A behind it, and the pixels of the image sensor 206. In the example shown in FIG. 17, the polarization separation unit 230 has the function of letting the horizontal polarization component P of the incident light travel straight while shifting the vertical polarization component S by a predetermined shift amount δ with respect to the straight optical path of the horizontal polarization component P.

  As the material of the polarization separation unit 230, a birefringent plate of calcite or quartz can be used. The shift amount δ may be made to coincide with the pitch of the stripe pattern of the polarizing filter layer 223 described in the second embodiment. The shift amount δ can be adjusted by the direction of the optic axis and the thickness t of the material.

  FIG. 18 is an explanatory diagram illustrating the difference in the polarization degree image depending on the presence or absence of the polarization separation unit. In the imaging result obtained by an imaging apparatus without the polarization separation unit (FIG. 18(a)), moire occurs in the portion enclosed by an ellipse in the figure. In contrast, it can be seen that moire is suppressed in the imaging result obtained by the image processing system of the present embodiment having the polarization separation unit 230 (FIG. 18(b)).

  The polarization separation unit 230 may be provided only over the lower region 211 (see FIG. 4), where the polarizer of the polarizing filter layer 223 is formed, or over the entire effective imaging region. However, since the thickness t of the polarization separation unit 230 is about 1 mm, if it is disposed only over the lower region 211, the focus position of the image shifts between the upper region 212 and the lower region 211. Therefore, the polarization separation unit 230 is desirably formed so as to cover the entire effective imaging region. In that case, there is also the advantage that high accuracy is not required for the position adjustment between the polarization separation unit 230 and the polarizing filter layer 223.

  Note that instead of disposing the polarization separation unit 230 between the imaging lens 204 and the optical filter 205A, the substrate 220 may be given the function of the polarization separation unit 230. That is, the substrate 220 may be made of a birefringent material that separates the incident light into the horizontal polarization component P and the vertical polarization component S, lets one of them travel straight, and shifts the optical path of the other by a predetermined shift amount. The predetermined shift amount corresponds to the pitch of the stripe pattern.

  As described above, by providing the substrate 220 with the polarization separation function, the material can be reduced as compared with the configuration in which the polarization separation unit 230 is separately provided.

  As described above, the image processing system according to the present embodiment and the vehicle equipped with the image processing system have the polarization separation function, so that the polarization component image information can be detected even in edge portions such as horizontal and vertical shapes in which the polarization state easily changes. Accuracy can be improved.

(Fourth embodiment)
An image processing system according to a fourth embodiment of the present invention and a vehicle including the image processing system will be described with reference to the drawings. Descriptions of configurations and operations similar to those of the embodiments already described are omitted as appropriate.

  In the first to third embodiments, the case where the polarizing filter layer is formed in the region for imaging the road surface has been described. However, the present invention is not limited to this; a polarizing filter layer may also be formed in the region used for sign detection.

  FIG. 19 is a schematic cross-sectional view along the light transmission direction showing another example of the structure of the optical filter. FIG. 20 is a schematic front view of the optical filter 205B in this embodiment as viewed from the sensor substrate 207 side.

  That is, as shown in FIGS. 19 and 20, the optical filter 205B further includes a polarizing filter layer 225 (second polarizing filter layer) disposed in the upper region 212 (see FIG. 4). Of the light incident on the optical filter 205B, the light transmitted through the polarizing filter layer 223 enters the lower region 211 of the image sensor 206, while the light transmitted through the polarizing filter layer 225 enters the upper region 212 of the image sensor 206.

  FIG. 21 is a schematic view illustrating the correspondence between the polarization filter layer 225 of the optical filter 205B and the pixels of the image sensor 206. That is, FIG. 21 shows an example in which a polarizer for making only the horizontal polarization component P of incident light enter each pixel of the image sensor 206 is formed on the entire polarizing filter layer 225.

  In general, when an imaging device is installed in a vehicle to capture images outside the vehicle, unnecessary light reflected by the dashboard or the hood may be superimposed on the captured image data. Such light acts as noise and lowers the recognition rate of sign detection and the like, but most of the polarization component of such unnecessary light is the S polarization component. These unnecessary light components can therefore be reduced by suppressing the S polarization component.

  In consideration of this point, the image processing system of this embodiment and the vehicle including it form, as the polarizing filter layer 225, a polarizer region that transmits only the horizontal polarization component P of the incident light, so that signs can be detected without being affected by such unnecessary light.
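The effect of such a P-transmitting polarizer region on S-polarized stray light can be sketched numerically as follows. This is an illustrative model only: the function name, the extinction ratio, and the sample values are assumptions, not figures from this specification.

```python
import numpy as np

def p_only_view(i_p, i_s, extinction_ratio=0.02):
    """Image seen through the P-transmitting polarizer region: the scene's
    P component passes, while S-polarized stray light (dashboard or hood
    reflections) is attenuated to the polarizer's extinction ratio."""
    i_p = np.asarray(i_p, dtype=float)
    i_s = np.asarray(i_s, dtype=float)
    return i_p + extinction_ratio * i_s

scene_p = np.full((2, 2), 100.0)                 # P component of the scene
stray_s = np.array([[50.0, 0.0], [0.0, 80.0]])   # S-polarized reflections
print(p_only_view(scene_p, stray_s))
```

Even strong S-polarized reflections contribute only a few percent of their intensity to the resulting image under this assumed extinction ratio.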

  As shown in FIG. 19, forming the two polarizing filter layers 223 and 225 over the entire effective imaging region makes it easier to obtain the flatness of the filler 224. It also becomes possible to bond the two polarizing filter layers 223 and 225 directly to the image sensor without using the filler 224. That is, when the polarizing filter layer is formed only in the lower region 211 (first to third embodiments), it is difficult to bond the optical filter 205A and the image sensor 206 while maintaining parallelism without the filler 224; with the configuration of the present embodiment, in which the polarizing filter layer is provided over the entire effective imaging region, the optical filter 205B and the image sensor 206 can be bonded while maintaining parallelism even without the filler 224, and the step of applying the filler 224 can be omitted.

(Fifth embodiment)
An image processing system according to a fifth embodiment of the present invention and a vehicle including the image processing system will be described with reference to the drawings. Descriptions of configurations and operations similar to those of the embodiments already described are omitted as appropriate.

  In the first to fourth embodiments, the case where the image sensor is a so-called monochrome sensor has been described. However, the present invention is not limited to this, and the image sensor may be a color sensor. This embodiment shows another configuration example of the spectral filter layer in the optical filter.

  FIG. 22 is a schematic cross-sectional view along the light transmission direction showing another example of the structure of the imaging element and the optical filter. FIG. 23 is a schematic front view of the optical filter 205C according to this embodiment as viewed from the sensor substrate 207 side.

  The image sensor 206′ in the present embodiment is a color sensor in which a color filter 206b is formed on the surface of a pixel array 206a composed of a plurality of two-dimensionally arranged pixels. The color filter 206b has a Bayer-array configuration in which a filter that mainly transmits light in the red wavelength band (λ = 550 to 650 nm), a filter that mainly transmits light in the green wavelength band (λ = 500 to 550 nm), and a filter that mainly transmits light in the blue wavelength band (λ = 450 to 500 nm) are disposed corresponding to the respective pixels of the pixel array 206a.

  FIG. 24 is a schematic view illustrating the correspondence between the polarizing filter layer 223 of the optical filter 205C and the pixels of the image sensor 206′ in the present embodiment. Hereinafter, a pixel of the image sensor 206′ corresponding to a red-wavelength-band filter is referred to as an R pixel, a pixel corresponding to a green-wavelength-band filter as a G pixel, and a pixel corresponding to a blue-wavelength-band filter as a B pixel. Four vertically and horizontally adjacent pixels (one R pixel, two G pixels, and one B pixel) form one pixel group.

  In the polarizing filter layer 223, a polarizer that causes the horizontal polarization component P of the incident light to enter one of the two G pixels included in each pixel group and a polarizer that causes the vertical polarization component S of the incident light to enter the other may be formed; for example, a striped polarizer pattern as shown in FIG. 24 may be used.
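The extraction of the two polarization component images from such a Bayer pixel group can be sketched as follows. This is a simplified model: the exact polarizer-to-G-pixel assignment within the 2 × 2 group and the function names are illustrative assumptions, not details taken from this specification.

```python
import numpy as np

def split_polarization_from_bayer(raw):
    """Split an RGGB Bayer mosaic, whose two G pixels per 2x2 group carry
    P- and S-polarizers, into half-resolution P and S images.

    Assumed layout per 2x2 group (an illustrative choice):
        R      G(P)
        G(S)   B
    """
    g_p = raw[0::2, 1::2].astype(float)  # G pixel behind the P polarizer
    g_s = raw[1::2, 0::2].astype(float)  # G pixel behind the S polarizer
    return g_p, g_s

def polarization_degree(i_p, i_s, gain=1.0, eps=1e-9):
    # Degree-of-polarization image: gain * (P - S) / (P + S)
    return gain * (i_p - i_s) / (i_p + i_s + eps)

raw = np.arange(16).reshape(4, 4)        # toy 4x4 mosaic
p_img, s_img = split_polarization_from_bayer(raw)
print(polarization_degree(p_img, s_img))
```

Because both polarization samples come from G pixels of the same group, the two half-resolution images are spatially registered to within one pixel pitch.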

  In addition to the configuration of the optical filter 205A illustrated in FIG. 3 and elsewhere, the optical filter 205C includes a spectral filter layer 222 (second spectral filter layer) that selectively transmits light having wavelength components in the range λ3 to λ4. It is formed in the lower region 211 on the image-sensor-side surface of the substrate 220, over the polarizing filter layer 223 and the filler 224, and the surface of the spectral filter layer 222 on the image sensor 206′ side is arranged close to the image sensor 206′.

  That is, the optical filter 205C has a structure in which the spectral filter layer 221 and the spectral filter layer 222 are overlapped in the light transmission direction.

  Here, the spectral characteristics of the color filters used in color sensors generally show the characteristics illustrated in FIG. 25. In FIG. 25, the spectral characteristic of the R pixel is indicated by a solid line, that of the G pixel by a broken line, and that of the B pixel by a one-dot chain line. That is, at wavelengths shorter than about 800 nm, the spectral characteristics of the R, G, and B pixels differ from one another.

  On the other hand, the spectral characteristics of the spectral filter layer 222 included in the optical filter 205C may be as shown in FIG. 26. That is, the spectral filter layer 222 may have, for example, a transmission band in the near-infrared region from 940 nm to 1000 nm (cutting light on the shorter-wavelength side of 940 nm).

  The near-infrared region from 940 nm to 1000 nm is a wavelength band in which, in the color-filter spectral characteristics shown in FIG. 25, the spectral characteristics of the pixels of all three colors (R, G, and B pixels) are substantially the same.

  Therefore, in the region where the spectral filter layer 222 is formed, the image processing system according to the present embodiment and the vehicle including it can use the pixels of all three colors, whose luminance information hardly differs, to generate a polarization degree image stably and with high accuracy. Furthermore, the use of a color sensor in this embodiment makes it possible to increase the recognition accuracy of signs containing various color information such as red, yellow, and blue.

  Next, the structure and manufacturing method of the spectral filter layer 222 will be described. Similar to the spectral filter layer 221, the spectral filter layer 222 can be manufactured as a multilayer film structure in which high refractive index thin films and low refractive index thin films are alternately stacked.

  In the present embodiment, since the wavelength range used for the captured image data extends roughly from the visible band to the infrared band, an image sensor 206′ having sensitivity over this range is employed. Since the spectral filter layer 222 only needs to transmit a part of the infrared light, a cut filter (see FIG. 26) whose multilayer portion has a transmission wavelength range of, for example, 900 nm or more and which reflects the other wavelength bands may be formed.

Such a cut filter can be obtained by forming a multilayer film having the configuration "substrate / (0.125L 0.25H 0.125L)^p / medium A" in order from the lower side in the stacking direction of the optical filter 205C. The "substrate" here means the filler 224 described above. "0.125L" denotes a film of the low refractive index material (for example, SiO2) whose film thickness gives an optical path length nd of 1/8 of the wavelength λ, where n is the refractive index, d is the film thickness, and λ is the cutoff wavelength.

Similarly, "0.25H" denotes a film of the high refractive index material (for example, TiO2) whose film thickness gives an optical path length of 1/4 of the wavelength. "p" indicates the number of times the film combination in parentheses is repeated (laminated); the larger p is, the more ripple and similar effects can be suppressed. Medium A is air, or a resin or adhesive for tight bonding with the image sensor 206′.

Further, the spectral filter layer 222 may be a band-pass filter having a transmission wavelength range of 940 to 970 nm, realized as a multilayer structure that transmits only the infrared wavelength band. Such a band-pass filter can be obtained by producing, for example, a multilayer film having the configuration "substrate / (0.125L 0.5M 0.125L)^p (0.125L 0.5H 0.125L)^q (0.125L 0.5M 0.125L)^r / medium A". If titanium dioxide (TiO2) is used as the high refractive index material and silicon dioxide (SiO2) as the low refractive index material, a spectral filter layer 222 with high weather resistance can be realized.
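The film-thickness notation above can be checked with a short calculation: a coefficient such as 0.125 or 0.25 times the wavelength λ gives the required optical path length nd, so the physical thickness is d = coeff × λ / n. The refractive indices below (SiO2 ≈ 1.46, TiO2 ≈ 2.35) are typical literature values assumed for illustration, not values stated in this specification.

```python
def film_thickness_nm(coeff, wavelength_nm, n):
    """Physical thickness d (nm) of a layer written as e.g. 0.125L or 0.25H:
    coeff * wavelength is the required optical path length n * d."""
    return coeff * wavelength_nm / n

# Assumed typical indices: SiO2 ~ 1.46 (L layers), TiO2 ~ 2.35 (H layers)
N_LOW, N_HIGH = 1.46, 2.35
CUTOFF_NM = 940.0

print(round(film_thickness_nm(0.125, CUTOFF_NM, N_LOW), 1))   # 1/8-wave SiO2 layer
print(round(film_thickness_nm(0.25, CUTOFF_NM, N_HIGH), 1))   # 1/4-wave TiO2 layer
```

Under these assumed indices the individual layers come out to roughly 80 to 100 nm, i.e. well below the used wavelength, which is consistent with the stack heights discussed later in this section.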

  An example of a method for manufacturing the spectral filter layer 222 will be described. First, the multilayer film described above is formed on the filler 224 formed on the substrate 220 and the polarizing filter layer 223. As a method for forming such a multilayer film, a known method such as vapor deposition may be used. Subsequently, the multilayer film is removed from a portion corresponding to the non-spectral region (for example, the upper region of the image sensor 206 ′).

  As this removal method, a general lift-off processing method may be used. In the lift-off method, a pattern that is the inverse of the target pattern is formed in advance on the layer of the filler 224 with a metal, a photoresist, or the like; the multilayer film is formed on top; and the portion of the multilayer film corresponding to the non-spectral region is then removed together with the metal or photoresist.

  In this embodiment, since the multilayer structure is adopted as the spectral filter layer 222, there is an advantage that the degree of freedom in setting the spectral characteristics is high.

  In the present embodiment, the spectral filter layer 222 laminated on the filler 224 is not itself covered with a protective layer such as the filler 224. According to experiments by the present inventors, no damage affecting the captured image occurred even when the spectral filter layer 222 was brought into contact with the image sensor 206′, so the protective layer is omitted.

  The height of the metal wires (convex portions) of the polarizing filter layer 223 is generally as low as half the used wavelength or less. In contrast, since increasing the height (thickness) of the spectral filter layer 222 sharpens its transmittance characteristics at the cutoff wavelength, its height ranges from about the used wavelength to several times that.

  As the thickness of the filler 224 increases, it becomes more difficult to ensure the flatness of its upper surface, which affects the characteristics of the optical filter 205C; there is thus a limit to increasing the thickness of the filler 224. Therefore, in the present embodiment, the spectral filter layer 222 is not covered with filler.

  That is, in this embodiment, since the spectral filter layer 222 is formed after the polarizing filter layer 223 is covered with the filler 224, the layer of the filler 224 can be stably formed. In addition, the spectral filter layer 222 formed on the upper surface of the layer of the filler 224 can be optimally formed.

(Sixth embodiment)
An image processing system according to a sixth embodiment of the present invention and a vehicle including the image processing system will be described with reference to the drawings. Descriptions of configurations and operations similar to those of the embodiments already described are omitted as appropriate. This embodiment shows another configuration example of the spectral filter layer in the optical filter.

  FIG. 27 is a schematic cross-sectional view along the light transmission direction showing another example of the structure of the optical filter. FIG. 28 is a schematic front view of the optical filter 205D in the present embodiment as viewed from the sensor substrate 207 side.

  In addition to the configuration of the optical filter 205C shown in FIG. 22 and elsewhere in the fifth embodiment, the optical filter 205D includes a spectral filter layer 226 formed, via the filler 224, outside the effective imaging region on the surface of the substrate 220 on the image sensor 206 (or 206′) side; the surfaces of the spectral filter layer 222 and the spectral filter layer 226 on the image sensor side are arranged close to the image sensor 206 (or 206′).

  With this configuration, the image processing system of this embodiment and the vehicle including it have the effect that the optical filter and the image sensor can be joined stably and in parallel.

(Seventh embodiment)
An image processing system according to a seventh embodiment of the present invention and a vehicle including the image processing system will be described with reference to the drawings. Descriptions of configurations and operations similar to those of the embodiments already described are omitted as appropriate.

  In the present embodiment, the headlamp 104 (see FIG. 1) is a halogen lamp or a light-emitting diode (LED) whose emission band includes the near-infrared band, particularly one having an emission wavelength within the transmission band of the spectral filter layer 222. This makes it possible to detect the road surface state even at night by capturing an image of light that has passed through the polarizing filter layer 223 and the spectral filter layer 222.

  Note that the light source is not limited to the headlamp 104; for example, as shown in FIG. 29, a dedicated light source 202 may be installed. FIG. 29 shows an example in which the light source 202, which irradiates light toward the windshield 105 from the inner wall surface 105a side of the windshield 105 of the vehicle 100, is housed in a cover 210 fixed inside the vehicle 100.

  As the projection wavelength of the headlamp 104 and the emission wavelength of the light source 202, a wavelength whose center wavelength and bandwidth fall within the wavelength range λ3 to λ4 of the spectral filter layer 222 may be selected; for example, a semiconductor laser of 950 nm ± 5 nm may be used. In general, a semiconductor laser has the characteristic that its emission wavelength shifts to the longer-wavelength side as temperature rises, but if a cut filter such as that shown in FIG. 26 is used, this fluctuation of the emission wavelength can be covered.
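Whether an emission of 950 nm ± 5 nm stays inside the passband under temperature drift can be sketched as follows. The drift coefficient of 0.3 nm/°C is an assumed typical figure for semiconductor lasers, and the 940 to 1000 nm band is the example passband given earlier; neither is a guaranteed value from this specification.

```python
def wavelength_range_ok(center_nm, tol_nm, delta_t, coeff_nm_per_c,
                        band=(940.0, 1000.0)):
    """Check that laser emission (center +/- tolerance, shifted by a
    temperature change delta_t) stays within the spectral filter passband."""
    lo = center_nm - tol_nm + min(0.0, delta_t) * coeff_nm_per_c
    hi = center_nm + tol_nm + max(0.0, delta_t) * coeff_nm_per_c
    return band[0] <= lo and hi <= band[1]

# 950 +/- 5 nm laser warming up by 40 C with an assumed 0.3 nm/C drift:
# the emission spans roughly 945 to 967 nm, inside the 940-1000 nm band.
print(wavelength_range_ok(950.0, 5.0, 40.0, 0.3))
```

Note that with the long-pass cut filter of FIG. 26 only the lower band edge matters, so a red shift with temperature moves the emission further into the transmitted range rather than out of it.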

  By the way, Patent Document 3 discloses a system that detects the road surface state from a plurality of pieces of polarization information using a laser device that irradiates the road surface with laser light, a camera that captures the light reflected from the road surface, and a polarizing plate that is disposed in front of the camera and whose polarization plane can be rotated.

  On the other hand, by using the optical filter 205C of the fifth embodiment, the present invention can acquire the horizontal polarization component image and the vertical polarization component image in real time without rotating a polarizing plate, which makes it possible to detect the road surface state while the vehicle 100 is traveling.

  Further, by providing the polarizing filter layer 223 with a polarizer (not shown) oriented obliquely at 45 degrees, information on three polarization states (polarization angles of 0, 45, and 90 degrees), as disclosed in Patent Document 3, can also be detected. In this case, the image analysis unit 102 generates a polarization component image with a polarization angle of 45 degrees in addition to the horizontal and vertical polarization component images, and determines the road surface state based on these three types of images.

  Even if the optical filter has the spectral filter layer 222, disturbance light that passes through the band-pass region of the spectral filter layer 222 still exists, so the influence of disturbance light cannot be completely removed. For example, at night, an infrared wavelength component contained in the headlamps of an oncoming vehicle may enter the imaging unit 101 as disturbance light.

  In the present embodiment, in order to prevent such erroneous detection, the image analysis unit 102 performs the image processing and light source control described below. The image analysis unit 102 is electrically connected to the headlamp 104 (or the light source 202) and controls its lighting in synchronization with the exposure timing of the imaging unit 101.

  Here, at least two frames are used as image frames for detecting the road surface state. In applications in which the vehicle 100 is controlled based on sign-detection information, it is common to perform automatic exposure control (AEC: Auto Exposure Control) in accordance with the luminance value of the upper region 212; for these two frames, however, exposure control optimal for detecting the road surface state is performed.

  For example, the imaging unit 101 is controlled so as to capture an image with the headlamp 104 (or light source 202) turned on in the first of the two frames and with it turned off in the second frame. Note that power consumption can be reduced if the headlamp 104 (or the light source 202) is kept off except at the exposure timing of the first frame for detecting the road surface state.

Let Ya be the luminance value of a frame captured with the headlamp 104 (or light source 202) turned on, and Yb the luminance value of a frame captured with it turned off. Ya includes both the light reflected from the road surface and the disturbance light, whereas Yb includes only the disturbance light. The image analysis unit 102 then calculates, for each pixel in the lower region 211 for detecting the road surface state, the luminance value Yr from which the influence of disturbance light has been removed by the following equation (7):
Yr = Ya - Yb (7)

  That is, the difference image formed by the luminance values Yr captures only the light reflected from the road surface. When there is no disturbance light, the luminance value of the frame captured with the headlamp 104 (or the light source 202) turned off is substantially zero in the lower region 211; in that case, the frame captured with the light source on and the difference image are almost identical.

  Even in the presence of disturbance light, only the light reflected from the road surface is obtained as an image, just as in the case without disturbance light. By performing the image processing for detecting the road surface state on this difference image, the image analysis unit 102 can detect the road surface state while eliminating the influence of disturbance light.
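The per-pixel subtraction of equation (7) can be sketched as follows. The clipping of negative values, which can arise from sensor noise when the disturbance estimate slightly exceeds the lit frame, is an added safeguard not stated in this specification.

```python
import numpy as np

def disturbance_free_image(frame_on, frame_off):
    """Eq. (7): Yr = Ya - Yb, computed per pixel over the road-surface
    (lower) region. frame_on: light source lit; frame_off: light source off.
    Negative differences are clipped to zero before restoring the dtype."""
    ya = frame_on.astype(np.int32)
    yb = frame_off.astype(np.int32)
    return np.clip(ya - yb, 0, None).astype(frame_on.dtype)

on = np.array([[120, 80], [60, 200]], dtype=np.uint8)   # road + disturbance
off = np.array([[20, 90], [10, 50]], dtype=np.uint8)    # disturbance only
print(disturbance_free_image(on, off))
```

Widening to a signed type before subtracting avoids the wrap-around that direct uint8 arithmetic would produce at pixels where the unlit frame is brighter.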

  The light source 202 is not limited to one that emits a single light beam; it may have a plurality of light emitting units. The plurality of light emitting units may consist of a plurality of separate light sources, or may be configured by dividing the light emitted from a single light source. One dividing method is to place, in the outgoing light path of the light source 202, a diffraction grating whose pitch is approximately equal to the emission wavelength of the light source 202; the emitted light can thereby be divided into a plurality of beams of zeroth order, ± first order, and so on.

  In addition, as shown in FIG. 30, if two diffraction gratings 231 and 232 whose pitch directions are orthogonal are arranged in the outgoing light path of the light source 202, the emitted light can be divided into N × N beams.

  FIG. 31 is an explanatory drawing showing a reference pattern (FIG. 31(a)) of the plurality of light beams output from the light source 202, and the imaging pattern (FIG. 31(b)) obtained when the emitted light is applied to an object (subject) and the light reflected from the object is imaged by the imaging unit 101.

  As shown in FIG. 31(b), when the plurality of light beams emitted from the light source 202 are reflected by an object, the spacing of the spots (black circles) of the imaging pattern at the location corresponding to the position of the object is disturbed compared with the spots of the reference pattern (white circles). By detecting such deviations of the imaging pattern from the reference pattern, the image analysis unit 102 can acquire shape information of the object and the like. In other words, this configuration enables obstacle detection on the road surface.
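The detection of imaging-pattern deviations from the reference pattern can be sketched as a nearest-spot comparison. This is an illustrative simplification: spot matching in a real system would be more involved, and the function name, threshold, and sample coordinates are assumptions, not details from this specification.

```python
import numpy as np

def spot_deviation(reference_spots, imaged_spots):
    """Distance from each reference-pattern spot to the nearest spot of the
    imaging pattern. Large values flag locations where an object disturbed
    the projected pattern. Inputs: sequences of (x, y) positions."""
    ref = np.asarray(reference_spots, dtype=float)
    img = np.asarray(imaged_spots, dtype=float)
    # Pairwise distance matrix of shape (n_ref, n_img)
    d = np.linalg.norm(ref[:, None, :] - img[None, :, :], axis=2)
    return d.min(axis=1)

ref = [(0, 0), (10, 0), (0, 10), (10, 10)]
img = [(0, 0), (10, 0), (0, 10), (13, 14)]  # one spot displaced by an object
print(spot_deviation(ref, img) > 2.0)       # flags only the displaced spot
```

Thresholding the deviation map, as in the last line, gives a binary mask of pattern locations disturbed by an object, which is the raw material for the obstacle detection described above.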

100 Vehicle
101 Imaging unit (imaging means)
102 Image analysis unit (image analysis means)
103 Headlamp control unit
104 Headlamp
105 Windshield (transparent member)
105a Inner wall surface
106 Wiper control unit
107 Wiper
108 Vehicle traveling control unit
110 Image processing system
202 Light source
204 Imaging lens
205, 205A to 205D Optical filter
206, 206′ Image sensor
206a Pixel array
206b Color filter
207 Sensor substrate
208 Signal processing unit
210 Cover
211 Lower region
212 Upper region
220 Substrate
221 Spectral filter layer (first spectral filter layer)
222 Spectral filter layer (second spectral filter layer)
223 Polarizing filter layer (first polarizing filter layer)
224 Filler
225 Polarizing filter layer (second polarizing filter layer)
226 Spectral filter layer (third spectral filter layer)
230 Polarization separation means
231, 232 Diffraction grating

Patent Document 1: Japanese Patent No. 2707426
Patent Document 2: JP 2007-86720 A
Patent Document 3: JP 2007-316049 A

Claims (14)

  1. An image processing system comprising:
    imaging means that has an imaging element including a pixel array composed of a plurality of pixels and an optical filter disposed in a preceding stage of the imaging element, and that images vehicle-periphery information including a road surface; and
    image analysis means for analyzing an imaging result of the imaging means,
    wherein the optical filter includes:
    a substrate that transmits incident light; and
    a first polarizing filter layer that is disposed in a predetermined area of an effective imaging region and in which polarizers are formed so as to cause each of a plurality of polarization components of the incident light, including at least a horizontal polarization component and a vertical polarization component, to enter respective pixels of the pixel array,
    wherein the image analysis means generates a horizontal polarization component image and a vertical polarization component image from the pixel values of the pixels in the predetermined area, generates a polarization degree image by multiplying, by a gain value, a value based on the difference between the pixel value of each pixel of the horizontal polarization component image and the pixel value of each pixel of the vertical polarization component image, determines a road surface state based on the polarization degree image, and performs sign recognition based on the imaging result of incident light that has passed through the effective imaging region other than the predetermined area, and
    wherein the gain value in a peripheral portion of the effective imaging region is larger than the gain value in a central portion of the effective imaging region.
  2. The image processing system according to claim 1, wherein the first polarizing filter layer is formed by alternately arranging, as a stripe pattern, polarizers that transmit the horizontal polarization component of the incident light and polarizers that transmit the vertical polarization component of the incident light.
  3. The image processing system according to claim 2, further comprising polarization separation means disposed in a preceding stage of the optical filter to separate the incident light into a horizontal polarization component and a vertical polarization component,
    wherein the polarization separation means passes one of the horizontal polarization component and the vertical polarization component straight through and shifts the optical path of the other by a predetermined shift amount, and
    wherein the predetermined shift amount corresponds to a pitch of the stripe pattern.
  4. The image processing system according to claim 2, wherein the substrate of the optical filter is made of a birefringent material having a polarization separation function that separates the incident light into a horizontal polarization component and a vertical polarization component, passes one of them straight through, and shifts the optical path of the other by a predetermined shift amount, and
    wherein the predetermined shift amount corresponds to a pitch of the stripe pattern.
  5. The image processing system according to claim 1, wherein the optical filter further includes a second polarizing filter layer that is disposed in the effective imaging region other than the predetermined area and in which a polarizer is formed so as to cause only the horizontal polarization component of the incident light to enter each pixel of the pixel array.
  6. The image processing system according to any one of claims 1 to 5, wherein the optical filter further includes a first spectral filter layer that is formed over the entire effective imaging region on the surface of the substrate opposite to the imaging element and that selectively transmits light having wavelength components in the ranges of wavelengths λ1 to λ2 and λ3 to λ4 (λ1 < λ2 < λ3 < λ4).
  7. The image processing system according to claim 6, wherein the optical filter further includes a second spectral filter layer that is formed in the predetermined area of the effective imaging region on the imaging-element-side surface of the substrate and that selectively transmits light having wavelength components in the range of wavelengths λ3 to λ4.
  8. The image processing system according to claim 1, wherein an exposure amount when the imaging means images a frame for the predetermined area of the effective imaging region is different from an exposure amount when the imaging means images a frame for the effective imaging region other than the predetermined area.
  9. The image processing system according to claim 7 or 8, further comprising a light source including one or more light emitting units that irradiate light toward a subject,
    wherein an emission wavelength of each light emitting unit is included in the range of the wavelengths λ3 to λ4, and
    wherein the image analysis means acquires shape information of the subject based on light that is emitted from the light source and reflected by the subject.
  10. The image processing system according to claim 1, wherein, when the pixel value of each pixel of the horizontal polarization component image is I(P) and the pixel value of each pixel of the vertical polarization component image is I(S), the image analysis means generates the polarization degree image as α × (I(P) − I(S)) / (I(P) + I(S)) (where α is a constant), and α in the peripheral portion of the effective imaging region is larger than α in the central portion of the effective imaging region.
  11. The image processing system according to any one of claims 1 to 10, wherein the polarizers of the first polarizing filter layer further transmit a polarization component forming an angle of 45 degrees with respect to the horizontal polarization component and the vertical polarization component of the incident light, and
    wherein the image analysis means determines the road surface state based on the horizontal polarization component, the vertical polarization component, and the polarization component forming the 45-degree angle.
  12. A vehicle comprising:
    the image processing system according to any one of claims 1 to 11; and
    vehicle traveling control means for performing driving support control that supports driving of the vehicle based on the polarization degree image generated by the image analysis means provided in the image processing system.
  13. The vehicle according to claim 12, wherein the range of transmission wavelengths λ3 to λ4 of the first and second spectral filter layers included in the image processing system includes at least a part of a projection wavelength band of a headlamp of the vehicle.
  14. The vehicle according to claim 13, wherein the image analysis means turns on the headlamp or the light source when the imaging means images a first frame for the predetermined area of the effective imaging region, turns it off when imaging a second frame, and detects the road surface state based on difference information between the first frame and the second frame.
JP2012010010A 2012-01-20 2012-01-20 Image processing system and vehicle equipped with the same Active JP5899957B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012010010A JP5899957B2 (en) 2012-01-20 2012-01-20 Image processing system and vehicle equipped with the same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2012010010A JP5899957B2 (en) 2012-01-20 2012-01-20 Image processing system and vehicle equipped with the same

Publications (2)

Publication Number Publication Date
JP2013148504A JP2013148504A (en) 2013-08-01
JP5899957B2 true JP5899957B2 (en) 2016-04-06

Family

ID=49046123

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012010010A Active JP5899957B2 (en) 2012-01-20 2012-01-20 Image processing system and vehicle equipped with the same

Country Status (1)

Country Link
JP (1) JP5899957B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101394244B1 (en) * 2013-12-12 2014-05-14 한국건설기술연구원 Multi image acquisition apparatus, and probe car-based system for sensing road surface condition automatically using the same
JP6340795B2 (en) * 2013-12-27 2018-06-13 株式会社リコー Image processing apparatus, image processing system, image processing method, image processing program, and moving body control apparatus

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5438414A (en) * 1993-01-22 1995-08-01 The Johns Hopkins University Integrated dual imaging detector
JP3341664B2 (en) * 1997-12-15 2002-11-05 トヨタ自動車株式会社 Vehicle line detection device and the road line detection method and medium recording a program
JP4974543B2 (en) * 2005-08-23 2012-07-11 株式会社フォトニックラティス Polarization imaging device
JP5610254B2 (en) * 2008-06-18 2014-10-22 株式会社リコー Imaging apparatus and road surface state determination method
JP5572954B2 (en) * 2009-01-26 2014-08-20 株式会社リコー Image pickup device and image pickup apparatus including the image pickup device
US8411146B2 (en) * 2009-09-04 2013-04-02 Lockheed Martin Corporation Single camera color and infrared polarimetric imaging
BR112012017199A2 (en) * 2009-12-25 2018-07-31 Ricoh Co Ltd object identification apparatus, moving body monitoring apparatus and apparatus for providing information
JP5696927B2 (en) * 2009-12-25 2015-04-08 株式会社リコー Object identification device, and moving body control device and information providing device provided with the same
US8796798B2 (en) * 2010-01-27 2014-08-05 Ricoh Company, Ltd. Imaging module, fabricating method therefor, and imaging device
JP5637448B2 (en) * 2011-01-27 2014-12-10 株式会社リコー Polarized imaging device

Also Published As

Publication number Publication date
JP2013148504A (en) 2013-08-01

Similar Documents

Publication Publication Date Title
CA2487409C (en) Light source detection and categorization system for automatic vehicle exterior light control and method of manufacturing
US8446470B2 (en) Combined RGB and IR imaging sensor
US6573490B2 (en) Interleaved mosaic imaging rain sensor
US20170221949A1 (en) Two-dimensional solid-state image capture device and polarization-light data processing method therefor
US7518099B2 (en) Multifunctional optical sensor comprising a photodetectors matrix coupled to a microlenses matrix
US20080129541A1 (en) Black ice detection and warning system
EP1919199A2 (en) Multiband camera system
US7579593B2 (en) Night-vision imaging apparatus, control method of the same, and headlight module
DE102011103302A1 (en) Camera system for a vehicle
EP1418089B1 (en) Multifunctional integrated optical system with CMOS or CCD technology matrix
EP2026591A2 (en) Solid-state imaging device, camera, vehicle and surveillance device
CN101861542B (en) The imaging system multizone
US7385680B2 (en) Camera module
KR20060123353A (en) Method and system for wavelength-dependent imaging and detection using a hybrid filter
US20100208060A1 (en) Liquid droplet recognition apparatus, raindrop recognition apparatus, and on-vehicle monitoring apparatus
JP5278165B2 (en) Focus detection device, imaging device, and electronic camera
JP2004531740A (en) Sensors for dual wavelength band
EP2439716A2 (en) Object identification device, moving object controlling apparatus having object identification device and information presenting apparatus having object identification device
CN103221805B (en) By means of a raindrop on the detection camera and an illumination apparatus glazing
US9666620B2 (en) Stacked filter and image sensor containing the same
CN1902522A (en) Night vision system for motor vehicles, comprising a partial optical filter
JP6399496B2 (en) Polarized image processing device
CN101828403B (en) Color mask for an image sensor of a vehicle camera
US9187063B2 (en) Detection apparatus and method
US8830324B2 (en) Vehicle monitoring camera and vehicle monitoring camera system

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20141217

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20151026

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20151110

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20160108

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20160209

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20160222