WO2015015718A1 - Sensor assembly - Google Patents
Sensor assembly
- Publication number
- WO2015015718A1 (PCT/JP2014/003611)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor assembly
- lens
- sensor
- pixel
- optical system
- Prior art date
Classifications
- G—PHYSICS; G01—MEASURING; TESTING; G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J5/00—Radiation pyrometry, e.g. infrared or optical thermometry
- G01J5/02—Constructional details; G01J5/08—Optical arrangements; G01J5/0806—Focusing or collimating elements, e.g. lenses or concave mirrors; G01J5/0808—Convex mirrors
- G01J5/0022—Radiation pyrometry for sensing the radiation of moving bodies; G01J5/0025—Living bodies
- G01J5/10—Radiation pyrometry using electric radiation detectors; G01J2005/106—Arrays
- G01J1/00—Photometry, e.g. photographic exposure meter; G01J1/02—Details; G01J1/04—Optical or mechanical part supplementary adjustable parts
- G—PHYSICS; G02—OPTICS; G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B3/00—Simple or compound lenses; G02B3/02—Simple or compound lenses with non-spherical faces; G02B3/06—with cylindrical or toric faces; G02B3/08—with discontinuous faces, e.g. Fresnel lens
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements; G02B7/02—for lenses
- G02B13/00—Optical objectives specially designed for the purposes specified below
- G02B17/00—Systems with reflecting surfaces, with or without refracting elements
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems; H04N5/30—Transforming light or analogous information into electric information; H04N5/33—Transforming infrared radiation
Definitions
- the present invention relates mainly to a sensor assembly used in measuring far-infrared distribution.
- a sensor assembly having a two-dimensional array sensor and imaging optics is used to obtain a two-dimensional image of infrared light.
- cost reduction of the sensor assembly has been achieved.
- Patent Document 1 discloses a technique for obtaining a two-dimensional image by scanning a line sensor in order to reduce the cost of the sensor assembly.
- Patent Document 2 discloses an imaging optical system using a silicon or germanium lens.
- the present invention provides a sensor assembly that can be manufactured inexpensively while maintaining detection sensitivity.
- a sensor assembly includes a line sensor having a plurality of pixels, arranged in a line in a predetermined direction, for detecting an electromagnetic wave, and an imaging optical system that forms an image by the electromagnetic wave on the plurality of pixels.
- the sensor assembly of the present invention can be manufactured inexpensively while maintaining detection sensitivity.
- FIG. 1A is a side view (a) and a top view (b) of a sensor assembly according to Embodiment 1.
- FIG. 1B is a front view of the sensor assembly in Embodiment 1.
- FIG. 2A is a side view of the sensor assembly in Embodiment 1, and is a schematic view of the relationship between the width of the incident light of the imaging optical system and the light collection diameter.
- FIG. 2B is an explanatory view of an image forming method in the first embodiment.
- FIG. 3 is a schematic view (a) of a sensor used for the sensor assembly in Embodiment 1, a side view (b) and a top view (c) of the sensor assembly.
- FIG. 4 is a side view (a) and a top view (b) of the sensor assembly in the first embodiment.
- FIG. 5 is an explanatory view of the shape of the lens.
- FIGS. 6A and 6B are an explanatory view (a) of the Fresnel shape in the X direction and an explanatory view (b) of the condensing position on each pixel for obliquely incident light in the sensor assembly according to Embodiment 1.
- FIG. 7 is a side view (a) and a top view (b) of the sensor assembly in the first embodiment.
- FIG. 8 is a schematic view (a) of a sensor used for the sensor assembly in Embodiment 1, a side view (b) and a top view (c) of the sensor assembly.
- FIG. 9 is a side view (a) and a top view (b) of the sensor assembly in the second embodiment.
- FIG. 10 is an explanatory diagram of an arrangement that makes it difficult to receive stray light in the sensor assembly according to the second embodiment.
- FIG. 11 is an explanatory view (a) relating to an arrangement in which it is difficult to receive stray light in the sensor assembly according to the second embodiment, and an explanatory view (b) relating to an arrangement in which it is further difficult to receive stray light.
- FIG. 12 is a cross-sectional view (a) and a top view (b) of the automobile when the sensor assembly is mounted on the automobile in the third embodiment.
- FIG. 13 is an explanatory view of a configuration for preventing the influence of extraneous light when the sensor assembly is mounted on a car in the third embodiment.
- FIG. 14 is an explanatory view of a configuration for preventing the influence of extraneous light when the sensor assembly is mounted on a car in the third embodiment.
- FIG. 15 is an explanatory view (a) of acquiring a temperature distribution in a vehicle by scanning a sensor assembly in the third embodiment, and an explanatory view (b) regarding a sampling period at the time of scanning.
- FIG. 16 is a side view (a) and a top view (b) of the sensor assembly in the fourth embodiment.
- FIG. 17 is a front view of the sensor assembly in the fourth embodiment.
- FIG. 18 is a schematic view showing a measurement range of the sensor assembly in the fourth embodiment.
- Infrared rays in the near-infrared region, with wavelengths of 0.7 to 2.5 micrometers, are used for security applications such as night-vision cameras, or in remote controls for televisions and the like.
- Infrared rays in the mid-infrared region of 2.5 to 4.0 micrometers are often used to identify a substance by measuring its transmission spectrum and matching it against the absorption spectrum specific to that substance.
- In the far-infrared region of 4.0 to 10 micrometers lies the peak of the blackbody radiation spectrum of objects near normal temperature. Far-infrared rays emitted from a substance are therefore detected to measure its surface temperature. This usage is commonly applied to capture the two-dimensional surface-temperature distribution of a substance as thermography.
- a sensor assembly used for thermography or the like includes a two-dimensional array sensor and an imaging optical system for forming an image on the two-dimensional array sensor, as with a camera or the like in a visible region.
- a bolometer etc. are conventionally used for the two-dimensional array sensor for far infrared rays.
- A bolometer is a sensor that is warmed by incident far-infrared radiation and detects the resulting temperature rise as a change in resistance value.
- Although the bolometer provides high image quality, it requires a complicated and expensive readout mechanism, such as a circuit for supplying a current.
- In recent years, in order to mount sensor assemblies in household appliances such as air conditioners, cost reduction of the sensor assembly has become necessary. Cost is therefore reduced by using an inexpensive sensor such as a thermopile, or by obtaining a two-dimensional image by scanning a line sensor (Patent Document 1).
- A silicon or germanium lens or the like is also used in the imaging optical system that forms part of a sensor assembly (Patent Document 2).
- the present invention provides a sensor assembly that can be manufactured inexpensively while maintaining detection sensitivity.
- A sensor assembly includes a line sensor having a plurality of pixels, arranged in a line in a predetermined direction, for detecting an electromagnetic wave, and an imaging optical system that forms an image by the electromagnetic wave on a detection surface of the plurality of pixels, wherein the F-number of the imaging optical system in a first direction, orthogonal to the predetermined direction in a plane parallel to the detection surface, differs from the F-number of the imaging optical system in the second direction, which is the predetermined direction.
- Compared with an imaging optical system whose F-number in both the first and second directions equals the larger of the two F-numbers, this sensor assembly can increase the amount of electromagnetic radiation detected by the pixels through the imaging optical system.
- the sensor assembly can improve detection sensitivity over the above case.
- the sensor assembly can maintain or increase the dose of the electromagnetic wave transmitted through the entire imaging optical system.
- When the transmittance of the material used for the imaging optical system is relatively low, the amount of electromagnetic radiation transmitted through the imaging optical system is reduced.
- By reducing the F-number, the amount of electromagnetic radiation passing through the imaging optical system is increased. The total transmitted amount can therefore be maintained or increased by making this increase equal to or larger than the decrease due to the lower transmittance.
- the sensor assembly can be manufactured inexpensively while maintaining detection sensitivity.
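The transmittance-versus-aperture trade-off above can be sketched numerically. In this rough model (all transmittance and diameter values are hypothetical illustrations, not figures from the patent), the radiation reaching a pixel scales with material transmittance times aperture area, so a lower-transmittance resin can be offset by widening the lens in one direction:

```python
# Sketch: relative radiation dose through the imaging optics, modeled
# roughly as transmittance * aperture area (D1 * D2). All numbers are
# hypothetical illustrations, not values from the patent.
def relative_dose(transmittance, d1_mm, d2_mm):
    return transmittance * d1_mm * d2_mm

# Symmetric lens of a higher-transmittance material (hypothetical).
dose_symmetric = relative_dose(0.9, 5.0, 5.0)

# Cheaper lens with lower transmittance, widened in the first
# direction (D2 > D1) to lower its F-number there.
dose_elongated = relative_dose(0.5, 5.0, 10.0)

# The gain from the larger aperture offsets the transmittance loss.
assert dose_elongated >= dose_symmetric
```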
- the f-number of the imaging optical system in the first direction is smaller than the f-number of the imaging optical system in the second direction.
- This configuration allows the sensor assembly to increase the amount of electromagnetic radiation transmitted through the imaging optics while keeping the line-sensor pixel spacing the same as in the prior art.
- the sensor assembly can be manufactured inexpensively while maintaining detection sensitivity using a line sensor of the same configuration as the conventional one.
- the imaging optical system has a lens, and the f-number of the lens in the first direction is smaller than the f-number of the lens in the second direction.
- The sensor assembly is specifically realized by using, as the imaging optical system, a lens having different F-numbers in the first and second directions.
- the width of the lens in the first direction is greater than the width of the lens in the second direction.
- the sensor assembly is specifically realized using lenses having different widths in the first and second directions as an imaging optical system.
- the cross-sectional shape in the plane perpendicular to the first direction of the lens is different from the cross-sectional shape in the plane perpendicular to the second direction of the lens.
- the sensor assembly is specifically realized using lenses having different cross-sectional shapes in the first and second directions as an imaging optical system.
- the cross-sectional shape in a plane perpendicular to the second direction of the lens includes a Fresnel shape.
- A Fresnel shape is used for the cross-section in the plane perpendicular to the second direction, the direction of the two for which relatively low accuracy is acceptable. In this way, the thickness of the imaging optical system can be reduced while suppressing the effect on the electromagnetic-wave detection accuracy of the line sensor.
- the cross-sectional shape in the plane perpendicular to the first direction of the lens does not include the Fresnel shape.
- the sensor assembly can maintain the detection accuracy of the electromagnetic wave.
- A Fresnel shape lowers the detection accuracy of the electromagnetic wave. By not using a Fresnel shape for the cross-section in the plane perpendicular to the first direction, the direction of the two that requires relatively high accuracy, the detection accuracy of the imaging optical system can be maintained.
- the imaging optical system has a mirror, and the f-number of the mirror in the first direction is larger than the f-number of the mirror in the second direction.
- The sensor assembly is specifically realized by using, as the imaging optical system, a mirror having different F-numbers in the first and second directions. That is, the sensor assembly can be realized with a reflective imaging optical system instead of a transmissive one.
- the mirror is an off-axis parabolic mirror.
- the sensor assembly further comprises a flat mirror.
- The sensor assembly can be specifically configured so that the line sensor is unlikely to receive stray light entering the sensor assembly. This allows the sensor assembly to focus the electromagnetic waves on the pixels of the line sensor with higher accuracy.
- The width of each of the plurality of pixels in the first direction is larger than the width of each of the plurality of pixels in the second direction.
- the electromagnetic wave includes far infrared rays having a wavelength of 8 to 10 micrometers.
- the material of the lens is polyethylene.
- the sensor assembly can be manufactured inexpensively by using a lens made of polyethylene as an imaging optical system.
- each of the plurality of pixels is an infrared detector using a thermopile or a bolometer.
- the sensor assembly is specifically realized by an infrared detector using a thermopile or bolometer.
- Embodiment 1: In the present embodiment, a sensor assembly that is manufactured inexpensively and has improved detection sensitivity will be described.
- The sensor assembly in the present embodiment uses an imaging optical system adapted to a line sensor, and can be manufactured inexpensively while maintaining detection sensitivity.
- the sensor assembly 100 according to the present embodiment will be described with reference to FIGS. 1A and 1B.
- FIG. 1A (a) is a side view of the sensor assembly 100
- FIG. 1A (b) is a top view of the sensor assembly 100 as viewed in the direction of arrow A in FIG. 1A (a).
- FIG. 1B is a front view of sensor assembly 100.
- Here the X-axis direction points from the front of the sheet to the back, the Y-axis direction from the bottom of the sheet to the top, and the Z-axis direction from left to right.
- In the other view, the X-axis direction points from the bottom of the sheet to the top, the Y-axis direction from the back of the sheet to the front, and the Z-axis direction from left to right.
- the X, Y and Z directions are directions as indicated by the coordinate axes.
- the sensor assembly 100 comprises a sensor 101 and a lens 104.
- the sensor 101 is a line sensor having a sensor substrate 103 and a pixel array 102.
- The pixel array 102 is provided on the sensor substrate 103, and includes pixels 102a, 102b, 102c, 102d and 102e. Each of the pixels 102a and the like detects an electromagnetic wave.
- Here, each of the pixels 102a and the like will be described as detecting far infrared rays in particular.
- The peak wavelength of infrared radiation emitted from a blackbody at normal temperature is in the vicinity of 8 to 10 micrometers. This wavelength range corresponds to far-infrared radiation.
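The 8-to-10-micrometer figure is consistent with Wien's displacement law, a standard physics relation not stated in the patent itself. A quick check:

```python
# Wien's displacement law: peak wavelength of blackbody emission.
WIEN_B_UM_K = 2897.8  # Wien's displacement constant, in um*K

def wien_peak_um(temperature_k):
    return WIEN_B_UM_K / temperature_k

# Near normal (room) temperature, about 300 K, the peak lies in the
# far-infrared region.
print(round(wien_peak_um(300.0), 2))  # about 9.66 um
```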
- Each of the pixels 102a and the like is realized by, for example, an infrared detector using a thermopile or a bolometer.
- Since the surface of the sensor 101 facing the lens 104 detects the electromagnetic wave, this surface is also referred to as the detection surface.
- the detection surface can also be referred to as the surface of the sensor 101 opposite to the sensor substrate 103.
- the lens 104 focuses an electromagnetic wave on a detection surface on the sensor 101.
- the lens 104 corresponds to an imaging optical system.
- the material of the lens 104 is, for example, polyethylene. Polyethylene is transparent to infrared radiation of 8 to 10 micrometers.
- the material of the lens 104 is not limited to polyethylene, and any material of resin that can transmit infrared rays of the above-mentioned wavelength and can be molded can be used.
- the lens 104 is, for example, a spherical lens, and its shape will be described later.
- Each of the pixels 102a, 102b, 102c, 102d and 102e has a length L both in width (length in the X-axis direction) and height (length in the Y-axis direction), and has a square shape in the XY plane.
- The pixel 102c of the sensor 101 and the lens 104 are provided on the optical axis 105.
- the pixels 102a, 102b, 102c, 102d and 102e included in the pixel column 102 are arranged in the Y-axis direction.
- a plurality of the respective pixels are arranged in the Y-axis direction, and only one pixel is provided in the X-axis direction which is a direction orthogonal to the direction of the arrangement. That is, the pixel array 102 is a so-called line sensor in which pixels are arranged in a line.
- the X-axis direction may be expressed as a first direction
- the Y-axis direction may be expressed as a second direction. That is, the present line sensor is a line sensor in which one pixel is arranged in the first direction and a plurality of pixels are arranged in the second direction, whereby the pixels are arranged in one row.
- the mechanism of the operation of the sensor assembly 100 according to the present embodiment will be described.
- The operation of the sensor assembly 100 in the tangential plane (the plane including the optical axis and the chief ray of each electromagnetic wave incident at each angle) will be described with reference to (a) of FIG. 1A.
- the vertically incident light 106 is an electromagnetic wave emitted from a point not shown on the object plane and on the optical axis 105.
- This electromagnetic wave enters the lens 104 perpendicularly as the vertically incident light 106.
- the vertically incident light 106 is focused on the pixel 102 c on the optical axis 105.
- an electromagnetic wave emitted from a not-shown point on the object surface and out of the optical axis 105 and obliquely incident on the lens 104 is referred to as obliquely incident light.
- The obliquely incident light 107 is incident at the largest incident angle θ1 among the obliquely incident light.
- The obliquely incident light 107 is condensed on the pixel 102e, farthest from the optical axis 105.
- Electromagnetic waves emitted from the object plane along the Y axis and incident on the lens 104 at angles within θ1 are all condensed at some position on the pixel row 102 in the tangential plane.
- the operation of the sensor assembly 100 in the sagittal plane (the plane including the optical axis and perpendicular to the tangential plane) will be described with reference to FIG. 1A (b). Since the pixels 102a, 102b, 102c, 102d and 102e are arranged in the Y-axis direction, only the width of one pixel of the pixel column 102 can be seen in (b) of FIG. 1A.
- the vertically incident light 106 is vertically incident on the lens 104.
- the vertically incident light 106 is focused at the center of the pixel array 102 on the optical axis 105.
- In (b) of FIG. 1A, the optical path of the obliquely incident light 107 shown in (a) of FIG. 1A projects onto substantially the same path as the vertically incident light 106, and the light is collected at the center of the pixel row 102 in the X-axis direction.
- the condensing position on the pixel row 102 differs between the vertically incident light 106 and the obliquely incident light 107 in the Y-axis direction, as described with reference to FIG. 1A.
- the vertically incident light 106 is condensed on the pixel 102c on the optical axis 105 in each of the tangential plane and the sagittal plane.
- The focal length of the lens 104 is equal in the tangential and sagittal planes; that is, the lens 104 is rotationally symmetric about the optical axis. In such a configuration, the image of the electromagnetic wave emitted from a linear region on the object surface can be formed on the pixel row 102 by the lens 104.
- FIG. 2A is a side view of the sensor assembly according to the present embodiment, and is a schematic view of the relationship between the width of the incident light of the imaging optical system and the diameter of the collected light.
- (a) of FIG. 2A shows this relationship at the same scale as (a) of FIG. 1A, and (b) of FIG. 2A is an enlarged view showing only this relationship.
- It is desirable that the diameter (width) of the lens 104 be as large as possible, because the detection sensitivity increases as the amount of electromagnetic radiation incident on each of the pixels 102a, 102b, 102c, 102d and 102e increases.
- increasing the diameter of the lens leads to an increase in spherical aberration.
- When vertically incident light of width W1 is collected by the lens 104, the condensed-spot diameter at the collection position (here, on the pixel 102c) is R1.
- When vertically incident light 108 of width W3, larger than W1, is collected by the lens 104, the condensed-spot diameter at the collection position is R3.
- Owing to spherical aberration, R3 is larger than R1 under certain conditions; this is generally the case when the lens 104 is a spherical lens.
- The effect of coma aberration is also significant for obliquely incident light, so if the width of the electromagnetic wave incident on the lens 104 is large, the diameter of the spot collected on the pixel for obliquely incident light also becomes larger.
- A large luminous-flux width or collection diameter is also described below as "thick".
- the term “ideal” refers to the case where the effects of spherical aberration and coma are not considered.
- “realistic” refers to the case where the effects of spherical aberration and coma are taken into consideration.
- the image obtained from the pixel row 102 is a blurred image.
- In the present embodiment, therefore, the shape of the lens 104 is distinctive.
- Specifically, the lens diameter D2 in the X-axis direction (first direction), in which the pixel row 102 is only one pixel wide, is made larger than the lens diameter D1 in the Y-axis direction (second direction), in which the pixels are arranged.
- the lens diameter in the X-axis direction and the like can also be referred to as the width of the lens in the X-axis direction and the like.
- As a result, the F-number in the X-axis direction (first direction) is smaller than the F-number in the Y-axis direction (second direction).
- Examples of the shape of the lens 104 are shown in (a) and (b) of FIG. 1B.
- (A) of FIG. 1B corresponds to a spherical lens cut out with an ellipse having a major axis (diameter in the X axis direction) of D2 and a minor axis (diameter in the Y axis direction) of D1.
- (B) of FIG. 1B corresponds to a spherical lens cut out in a rectangle having a long side (a side parallel to the X axis) D2 and a short side (a side parallel to the Y axis) D1.
- the width in the X-axis direction is D2
- the width in the Y-axis direction is D1.
- the above is merely an example, and the shape of the lens 104 is not limited to these.
- the shape of the lens 104 may be a shape obtained by cutting out a shape having a length in the X-axis direction longer than a length in the Y-axis direction from the spherical lens.
- Any cutout outline may be used: besides an ellipse ((a) of FIG. 1B) and a rectangle ((b) of FIG. 1B), the outline may combine a triangle, a pentagon, or other straight lines or curves.
- (a) and (b) of FIG. 1A are described as the side view and the top view of the sensor assembly 100, respectively. They can also be regarded as showing the cross-sectional shapes of the sensor assembly 100 in the planes perpendicular to the first and second directions, respectively. Viewed this way, the cross-sectional shape of the lens 104 in the plane perpendicular to the first direction differs from its cross-sectional shape in the plane perpendicular to the second direction.
- the sensor assembly 100 has the following effects.
- the size of the lens 104 in the X-axis direction which is a direction orthogonal to the arrangement direction of the pixel columns 102 of the sensor assembly 100, is greater than the size of the lens 104 in the Y-axis direction, which is the alignment direction of the pixel columns 102.
- Although an imaging optical system composed of one lens has been described, the present invention is also effective for an imaging optical system composed of a plurality of lenses.
- When the imaging optical system is composed of a plurality of lenses, it suffices to consider the F-number of the imaging optical system instead of the lens size. That is, by making the F-number of the imaging optical system in the X-axis direction, orthogonal to the arrangement direction of the pixel row 102 of the sensor assembly 100, smaller than its F-number in the Y-axis direction, the arrangement direction of the pixel row 102, a sensor assembly that simultaneously achieves resolution and high detection sensitivity can be realized.
- since the lens size (effective diameter) and the F-number are related by (Equation 1), the argument does not depend on the number of lenses: one can reason in the same way by attending to the F-number of the imaging optical system as a whole.
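(Equation 1) is not reproduced in this excerpt; assuming it is the standard relation F = f/D between focal length and effective diameter, the per-direction reasoning above can be sketched as follows. All numbers are hypothetical and purely illustrative.

```python
# Illustrative sketch only (values are hypothetical, not from the patent):
# assuming the standard relation F = f / D, an anamorphic imaging optical
# system has a separate F-number in each direction.

def f_number(focal_length_mm: float, effective_diameter_mm: float) -> float:
    """F-number for one direction of an imaging optical system."""
    return focal_length_mm / effective_diameter_mm

f = 10.0     # hypothetical focal length
D1_y = 5.0   # effective diameter in Y (pixel-column direction)
D2_x = 10.0  # effective diameter in X (orthogonal direction)

F_y = f_number(f, D1_y)
F_x = f_number(f, D2_x)

# A smaller F-number in X admits more radiation per pixel (higher detection
# sensitivity), while the larger F-number in Y preserves resolution along
# the pixel column.
assert F_x < F_y
```

The same comparison applies whether the system is one lens or several, since only the overall F-number per direction matters.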
- the lens 104 described above may be a spherical lens or an aspheric lens; it is not limited here.
- FIG. 2B is an explanatory diagram of an image forming method in the present embodiment.
- an image can be formed with the sensor assembly 100 by, for example, acquiring readings while scanning the base 101a, on which the sensor 101 and the lens 104 are integrally mounted, in the direction of arrow B as shown in (b) of FIG. 2B. If, for example, an image is captured each time the base 101a moves by a distance corresponding to half the width of the pixel row 102, processing the resulting images yields an image whose resolution exceeds the width L of the pixel row 102. Obtaining a higher-resolution image in this way is also called super resolution.
- super resolution can thus be achieved by rotating the base 101a in the direction of arrow B in steps of half the pixel-column width L, as described above.
- the center of rotation when rotating the base 101a may be anywhere, but can be, for example, the center E of the pixel row 102.
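The half-pitch stepping scheme above can be sketched in code. This is a minimal illustration, with a stand-in intensity function, not an implementation of the patent's signal processing.

```python
# Illustrative sketch of the super-resolution scheme: capture one reading
# per half-pixel step and interleave the readings, doubling the sample
# density relative to the native pixel width L. The scene function and
# all numbers are hypothetical stand-ins.

def scan_super_resolution(scene, n_pixels, pixel_width, steps_per_pixel=2):
    """Sample `scene` at pixel_width / steps_per_pixel spacing by shifting
    the whole pixel row between captures."""
    samples = []
    step = pixel_width / steps_per_pixel
    for i in range(n_pixels * steps_per_pixel):
        position = i * step
        samples.append(scene(position))  # one pixel reading per shift
    return samples

scene = lambda x: x * x                        # stand-in incident intensity
coarse = [scene(i * 1.0) for i in range(5)]    # native pitch L = 1.0
fine = scan_super_resolution(scene, 5, 1.0)    # half-pitch sampling

assert len(fine) == 2 * len(coarse)
```

Combining the interleaved readings in this way is what yields resolution finer than the pixel width L.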
- FIG. 3 is a schematic view (a) of the sensor used for the sensor assembly in the present embodiment, a side view (b) and a top view (c) of the sensor assembly.
- Sensor assembly 200 is similar to sensor assembly 100 shown in FIG. 1A, except that it includes sensor 201 instead of sensor 101.
- the sensor 201 includes a sensor substrate 203 and a pixel array 202.
- the pixel column 202 is provided on the sensor substrate 203 and includes pixels 202a, 202b, 202c, 202d, and 202e. Each pixel is not square: its length L2 in the X-axis direction, orthogonal to the pixel arrangement direction (Y-axis direction), is longer than its length L1 in the arrangement direction.
- the sensor assembly 200 further improves the detection sensitivity.
- the detection sensitivity is improved by the width of each of the pixels in the first direction being larger than the width of each of the plurality of pixels in the second direction.
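The sensitivity gain from elongating each pixel can be illustrated with a simple area calculation. This sketch assumes (plausibly for a thermal pixel, though the text does not state it explicitly) that sensitivity scales with collecting area; the dimensions are hypothetical.

```python
# Sketch: assuming detection sensitivity of a thermal pixel scales with its
# collecting area, stretching each pixel to L2 > L1 in the X direction
# (orthogonal to the column) gains sensitivity without coarsening the
# Y-direction sampling pitch. Dimensions below are hypothetical.

def area_gain(l1_um: float, l2_um: float) -> float:
    """Area of an L1 x L2 rectangular pixel relative to an L1 x L1 square."""
    return (l1_um * l2_um) / (l1_um * l1_um)

assert area_gain(100.0, 300.0) == 3.0  # 3x collecting area, same Y pitch
```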
- FIG. 4 is a side view (a) and a top view (b) of the sensor assembly in the present embodiment.
- the sensor assembly 210 of FIG. 4 is similar to the sensor assembly 100 of FIG. 1A, except that it includes a lens 109 instead of the lens 104 in FIG. 1A.
- the right side surface 109 a of the lens 109 is a continuous surface (or a continuous curved surface) similar to the lens 104 in the Y-axis direction.
- the right side surface 109a of the lens 109 is a Fresnel surface in the X-axis direction, and differs from the lens 104 in this point.
- the cross-sectional shape in a plane perpendicular to the second direction of the lens 109 includes a Fresnel shape.
- the cross-sectional shape in a plane perpendicular to the first direction of the lens 109 does not include the Fresnel shape.
- lenses used in the far infrared region have conventionally been made of germanium or silicon. These materials are not only expensive but also hard and high-melting, and are therefore difficult to mold. Polishing is used instead, but the shapes it can produce are limited, so the only shape suited to mass production is spherical.
- the only resin material that transmits light in the infrared region and can be molded is polyethylene (including high-density polyethylene), but because polyethylene absorbs internally, it is not used for applications where the lens becomes thick.
- the thickness T of this lens is 1.1 mm.
- a Fresnel surface is made at least in the X axis direction which is a direction orthogonal to the arrangement direction of the pixel array 102 of the sensor assembly.
- the transmittance does not easily decrease even if the lens diameter is increased.
- since the lens 109 can be manufactured at the required thickness, it is possible to construct a bright lens that is inexpensive and has a high transmittance. This makes it possible to construct a sensor assembly that is both inexpensive and more sensitive.
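The benefit of keeping polyethylene thin can be sketched with the Beer-Lambert law for internal transmittance, T = exp(-αt). The absorption coefficient below is a placeholder, not a measured value for polyethylene; only the trend matters.

```python
import math

# Sketch of why a thin Fresnel profile matters for an absorbing resin:
# internal transmittance follows Beer-Lambert, T = exp(-alpha * t), so
# absorption grows with thickness t. alpha below is a hypothetical
# placeholder, not a measured value for polyethylene.

def internal_transmittance(alpha_per_mm: float, thickness_mm: float) -> float:
    return math.exp(-alpha_per_mm * thickness_mm)

alpha = 0.5      # hypothetical absorption coefficient, 1/mm
t_fresnel = 1.1  # thin Fresnel lens, as in the text (T = 1.1 mm)
t_bulk = 5.0     # hypothetical bulk plano-convex equivalent

# The thin Fresnel profile transmits substantially more.
assert internal_transmittance(alpha, t_fresnel) > internal_transmittance(alpha, t_bulk)
```

This is why the Fresnel surface lets the lens diameter grow without the transmittance loss a thick bulk lens would incur.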
- the Fresnel shape of the right side surface 109a in the X-axis direction may be formed by turning the same spherical (or aspherical) profile as in the Y-axis direction into a Fresnel surface, as shown in the drawings, or by turning a profile different from the spherical (or aspherical) profile in the Y-axis direction into a Fresnel surface. Since there are a plurality of pixels in the Y-axis direction, the Y-axis profiles of the lenses 104 and 109 must take obliquely incident light into account.
- although the resulting aspheric shape differs from that in the Y-axis direction, a moldable resin such as polyethylene can be processed into it easily.
- in this way, the sensor assembly 210 can be configured.
- the far infrared rays transmitted through the lens 109 and collected on the pixels 102c, 102d and 102e are referred to as vertical incident light 106 and oblique incident light 110 and 107, respectively.
- the shape of the Fresnel surface of the lens 109 in the X-axis direction may be determined so that the obliquely incident light 110 collected on the pixel 102d, located between the pixel 102c on the optical axis 105 and the pixel 102e at the end of the pixel array 102, is brought into focus on the pixel 102d.
- because the distance from the lens 109 to each pixel differs, only two positions (or one, on the optical axis) on the pixel array 102 can be exactly in focus; focus cannot be achieved at every pixel position. Therefore, the obliquely incident light 107 entering the pixel 102e is made to focus before reaching the pixel 102e, and the vertically incident light 106 entering the pixel 102c is made to focus virtually behind the pixel 102c. In this case the total defocus over the pixels 102c, 102d and 102e is minimized, so the beam diameter in the X-axis direction is minimized, and a sensor assembly with less pixel-to-pixel variation in detection sensitivity can be configured.
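The defocus-balancing idea above can be sketched as a small optimization: with differing lens-to-pixel distances, placing the focal plane midway between the nearest and farthest ideal focus positions minimizes the worst-case defocus. The distances below are hypothetical.

```python
# Sketch of the defocus-balancing idea: the lens-to-pixel distance differs
# across the row, so a single focus setting cannot be exact everywhere.
# Placing the focal plane midway between the nearest and farthest ideal
# focus positions minimises the worst-case defocus, and hence the variation
# of the X-direction beam diameter. Distances are hypothetical (mm).

def best_focus(ideal_focus_positions):
    """Focus position minimising the maximum defocus over all pixels."""
    return (min(ideal_focus_positions) + max(ideal_focus_positions)) / 2.0

# Ideal focus distance for the centre pixel (102c), a middle pixel (102d)
# and an end pixel (102e):
ideal = [10.0, 10.4, 11.0]
focus = best_focus(ideal)
worst = max(abs(p - focus) for p in ideal)

# Any other focus choice (swept over 9.0..12.0 mm) does at least as badly:
assert all(worst <= max(abs(p - f / 10) for p in ideal)
           for f in range(90, 121))
```

With this choice the end pixel focuses slightly before the sensor and the centre pixel slightly behind it, as the text describes.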
- when the temperature to be detected is around room temperature, the peak wavelength of the infrared radiation emitted from a black body lies in the vicinity of 8 to 10 micrometers.
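The stated 8 to 10 micrometer peak can be checked with Wien's displacement law, λ_peak ≈ 2898/T (λ in micrometers, T in kelvin), which the text does not cite but which underlies the figure.

```python
# Sketch checking the stated peak wavelength with Wien's displacement law,
# lambda_peak ~= 2898 / T (micrometres, with T in kelvin). At ordinary
# ambient temperatures the black-body peak indeed falls near 8-10 um.

def peak_wavelength_um(temperature_k: float) -> float:
    return 2898.0 / temperature_k

lam = peak_wavelength_um(300.0)  # ~27 C, a "normal temperature"
assert 8.0 < lam < 10.0
```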
- polyethylene, which transmits infrared light of 8 to 10 micrometers, is taken as the lens example here, but if another moldable resin that transmits far infrared light is available, that resin may of course be used instead.
- a thermopile having sensitivity in the far infrared region can be used as the detector, and other materials capable of detecting far infrared radiation may also be used.
- FIG. 7 is a side view (a) and a top view (b) of the sensor assembly in the present embodiment.
- the sensor assembly 220 of FIG. 7 is similar to the sensor assembly 210 of FIG. 4 except that it comprises a lens 111 instead of the lens 109 in FIG.
- the lens 111 differs from the lens 109 in that the left side surface 109b of the lens 109 is a flat surface, whereas the left side surface 111b of the lens 111 is a curved surface.
- the lens 111 forms a meniscus lens in the Y-axis direction. By so doing, the thickness of the lens 111 can be reduced also in the YZ cross section.
- the lens 111 can be configured to be thinner. By doing this, the amount of far infrared radiation incident on each pixel can be increased, so that it is possible to configure a sensor assembly with higher detection sensitivity.
- the shapes of the lenses 104, 109 and 111 in the Y-axis direction are not discontinuous surfaces such as Fresnel surfaces but continuous surfaces.
- a Fresnel surface could be used there as well, but making the Y-axis profile a continuous surface has the effect of preventing, for example, the degradation of light-collecting performance caused by light scattering at the facets of a Fresnel surface.
- FIG. 8 is a schematic view (a) of a sensor used for the sensor assembly in the present embodiment, a side view (b) and a top view (c) of the sensor assembly.
- a sensor assembly 230 having a sensor 231 with three pixel rows will be described.
- the sensor 231 has three pixel columns 234, 235 and 236.
- the pixel column 234 includes pixels 232a, 232b, 232c, 232d and 232e.
- the pixel column 235 includes pixels 232f, 232g, 232h, 232i and 232j.
- the pixel column 236 has pixels 232k, 232l, 232m, 232n and 232o.
- each pixel is arranged in the Y-axis direction, and has a square shape of L1 in both height and width.
- the pixel columns are arranged in parallel at an interval L3.
- a sensor assembly 230 using the sensor 231 is shown in FIGS. 8B and 8C.
- (b) of FIG. 8 is a side view of the sensor assembly 230
- (c) of FIG. 8 is a top view as viewed from the direction of arrow A of (a) of FIG. 8.
- sensor assembly 230 is similar to sensor assembly 100 shown in FIG. 1A, but differs in that, instead of the single pixel row 102 of the sensor assembly 100, three pixel rows 234, 235 and 236 are provided, spaced apart by L3.
- the width D2 of the lens 104 in the X-axis direction is larger than its width D1 in the Y-axis direction, so that light still falls on the pixel column 234 even when the light-collecting diameter in the X-axis direction on each pixel is large.
- the interval L3 is determined so that part or all of the electromagnetic wave to be incident on the pixel column 234 does not enter the pixel columns 235 and 236.
- the distance L3 between the pixel columns is determined so that part or all of the electromagnetic wave to be incident on the pixel column 235 or 236 does not enter the pixel column 234.
- although each pixel in FIG. 8 is a square with side L1, it may instead be a rectangle elongated in the X-axis direction, as in the sensor 201. Then, even when the light-collecting diameter in the X-axis direction on each pixel increases, the received light amount increases, so a sensor assembly with higher detection sensitivity can be configured.
- compared with an imaging optical system that uses the larger of the F-numbers in the first and second directions as the common F-number for both directions, it is possible to increase the amount of electromagnetic radiation that passes through the imaging optical system and is detected at the pixels.
- the sensor assembly can improve detection sensitivity over the above case.
- the sensor assembly can maintain or increase the amount of electromagnetic radiation transmitted through the imaging optical system as a whole.
- when the transmittance of the material used for the imaging optical system is relatively low, the amount of the electromagnetic wave transmitted through the imaging optical system decreases.
- on the other hand, lowering the F-number increases the amount of electromagnetic waves passing through the imaging optical system. Therefore, by making this increase equal to or larger than the decrease, the amount transmitted through the imaging optical system as a whole can be maintained or increased.
- the sensor assembly can be manufactured inexpensively while maintaining detection sensitivity.
- FIG. 9 is a side view (a) and a top view (b) of the sensor assembly 300 in the present embodiment.
- the sensor assembly 300 is similar to the sensor assembly 200 of the first embodiment, but differs in using a reflective imaging optical system based on the off-axis parabolic mirror 304 instead of a transmissive imaging optical system based on the lens 104. Accordingly, the sensor 301 is moved to an appropriate position.
- (a) of FIG. 9 is a side view of the sensor assembly 300.
- (b) of FIG. 9 is a top view of the sensor assembly 300 as viewed in the direction of arrow A of (a) of FIG. 9.
- the off-axis parabolic mirror 304 in FIG. 9 (b) is drawn semi-transparent for ease of understanding, but it is not actually transparent.
- in (a) of FIG. 9, the X-axis direction runs into the page, the Y-axis direction runs from bottom to top, and the Z-axis direction runs from left to right.
- in (b) of FIG. 9, the X-axis direction runs from bottom to top of the page, the Y-axis direction runs from the back to the front of the page, and the Z-axis direction runs from left to right.
- Sensor assembly 300 comprises a sensor 301 and an off-axis parabolic mirror 304.
- the sensor 301 includes a sensor substrate 303 and a pixel array 302.
- the pixel column 302 is provided on the sensor substrate 303, and includes pixels 302a, 302b, 302c, 302d and 302e.
- the off-axis parabolic mirror 304 focuses an electromagnetic wave on the detection surface on the sensor 301.
- the off-axis parabolic mirror 304 corresponds to an imaging optical system.
- the vertically incident light 305 is an electromagnetic wave emitted from a point not shown on the object plane and on the optical axis 308.
- the vertically incident light 305 is vertically incident along the optical axis 308 on the off-axis parabolic mirror 304 of the sensor assembly 300.
- the vertically incident light 305 is reflected by the off-axis parabolic mirror 304 and is incident on the pixel 302c on the sensor 301.
- the focal point of the paraboloid constituting the off-axis parabolic mirror 304 is located on the pixel column 302, and the vertically incident light 305 is arranged to be collected at this focal position.
- an electromagnetic wave emitted from an unshown point on the object surface off the optical axis 308 and obliquely incident on the off-axis parabolic mirror 304 is referred to as obliquely incident light.
- the obliquely incident lights 306 and 307 are, for example, the obliquely incident light that enters at the largest incident angle θ1.
- the obliquely incident lights 306 and 307 are condensed on the pixels 302a and 302e, respectively, which are farthest from the central pixel 302c on the sensor.
- any electromagnetic wave emitted parallel to the Y-axis from the object plane and incident on the off-axis parabolic mirror 304 at an angle within the angle θ1 is collected somewhere on the pixel row 302.
- the pixels 302a, 302b, 302c, 302d, and 302e are slightly rotated counterclockwise around the Y-axis direction in the present embodiment.
- vertically incident light 305 is directed to the off-axis parabolic mirror 304 along the optical axis 308 of the sensor assembly 300.
- the vertically incident light 305 is reflected by the off-axis parabolic mirror 304 and condensed at the center of the pixel 302c in the X-axis direction.
- each of the obliquely incident lights 306 and 307 shown in FIG. 9A is reflected by the off-axis parabolic mirror 304 in FIG. 9B and then collected in the vicinity of the pixels 302a and 302e, respectively. In this way, the electromagnetic wave emitted along the Y-axis from the object plane can be imaged on the pixel row 302 by the off-axis parabolic mirror 304.
- sagittal light of the vertically incident light 305 is reflected at the off-axis parabolic mirror 304 in a plane including the optical axis.
- sagittal light of the obliquely incident light 306 and 307 is reflected not by a flat surface but by a curved surface.
- the curvatures in the sagittal plane that the obliquely incident light beams 306 and 307 receive at the off-axis parabolic mirror 304 are different from each other.
- sagittal light means light propagating through a sagittal plane.
- in contrast, tangential light of both the normally incident light 305 and the obliquely incident lights 306 and 307 is reflected in a plane that includes the optical axis 308 and the Y-axis. Therefore, even at the same F-number, the condensed diameters of the obliquely incident lights 306 and 307 on the pixel column 302 in the X-axis direction differ from their condensed diameters in the pixel-column direction.
- tangential light refers to light propagating in a tangential plane.
- the focusing diameter in the X-axis direction may be different from the focusing diameter in the Y-axis direction.
- the F-number in the X-axis direction may be different from the F-number in the Y-axis direction.
- by increasing the F number in the X direction it is possible to make the condensed diameter in the X direction on the pixel column 302 substantially equal to the condensed diameter in the Y axis direction.
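The per-direction matching above can be illustrated under a diffraction-limited approximation (an assumption of this sketch, not a statement in the text), in which the spot diameter scales with the F-number as d ≈ 2.44·λ·F.

```python
# Sketch (diffraction-limited approximation, an assumption not made in the
# text): the condensed spot diameter scales with the F-number as
# d ~= 2.44 * lambda * F, so choosing a different F-number per direction
# lets the X-direction spot be matched to the Y-direction one.

def spot_diameter_um(wavelength_um: float, f_number: float) -> float:
    """Diffraction-limited (Airy) spot diameter."""
    return 2.44 * wavelength_um * f_number

lam = 10.0                               # far-infrared wavelength, um
d_y = spot_diameter_um(lam, 1.2)         # Y (pixel-column) direction
d_x_before = spot_diameter_um(lam, 0.8)  # X direction, smaller F-number
d_x_after = spot_diameter_um(lam, 1.2)   # X direction after raising its F

assert d_x_before != d_y                 # mismatched condensed diameters
assert abs(d_x_after - d_y) < 1e-9       # matched after adjusting F in X
```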
- when the pixel rows are arranged in the direction orthogonal to the X-axis direction, as in the present embodiment, the condensing diameter in the pixel-column direction (the direction in which the pixels are arranged) must be kept narrow (small) to prevent resolution loss caused by electromagnetic waves entering a pixel adjacent to the intended pixel. The sensor 301 and the off-axis parabolic mirror 304 are therefore disposed so that the light-collection diameter in the pixel-column direction on each pixel is small.
- to that end, the length L5 is made longer than the length L4.
- the arrangement of the sensor 301 and the off-axis parabolic mirror 304 is determined so that the collection diameter in the pixel-column direction becomes small even though the collection diameter in the X-axis direction on each pixel becomes large; even then, the light intended for each pixel can be received without loss. Thus, while maintaining the resolution in the pixel-column direction, the amount of electromagnetic radiation incident on each pixel is increased, simultaneously achieving high resolution and high detection sensitivity, and further reducing the variation in incident light amount between pixels.
- the F number in the tangential direction may be reduced within the range in which the resolution in the pixel column direction does not deteriorate.
- the amount of electromagnetic waves incident on each pixel can be increased while maintaining the resolution in the pixel column direction, so high resolution and high detection sensitivity can be achieved simultaneously.
- FIG. 10 is an explanatory view of an arrangement that makes it difficult to receive stray light in the sensor assembly in the present embodiment.
- the sensor 301 may be rotated counterclockwise in FIG. 10 about the X-axis by a predetermined angle (angle θ1) such that the focal point 310 of the off-axis parabolic mirror 304 remains at a position on the pixel column 302. In this state, for example, providing the shield 309 makes it easy to prevent image-quality degradation due to stray light.
- the electromagnetic wave incident on the sensor 301 without being reflected by the off-axis parabolic mirror 304 is stray light, corresponding to the stray light 311a or 311b shown in FIG. 10.
- when the sensor 301 is tilted counterclockwise about the X-axis as in the present embodiment, the range of angles over which stray light, incident mainly from the upper right of the drawing, can reach the sensor is reduced, and the shield 309 can then block it.
- the sensor assembly 300 can reduce the effects of noise.
- this effect is not an effect limited to the line sensor in which the sensor 301 is configured by the pixel array 302, and the same effect can be obtained even with a two-dimensional area sensor.
- FIG. 11 is an explanatory view (a) relating to an arrangement in which it is difficult to receive stray light in the sensor assembly according to the second embodiment, and an explanatory view (b) relating to an arrangement in which it is further difficult to receive stray light.
- the sensor assembly 320 is similar to the sensor assembly 300 except that the electromagnetic wave reflected by the off-axis parabolic mirror 304 is reflected by the flat mirror 321 before it is received by the pixel column 302 of the sensor 301.
- although the obliquely incident lights 306 and 307 are illustrated in FIG. 9 and FIG. 10, they are omitted in FIG. 11 for clarity.
- with this arrangement, the pixel-row 302 side of the sensor 301 inevitably faces the flat mirror, so the configuration prevents stray lights 311c and 311d, incident mainly from the upper right of the drawing, from entering the pixel row 302 directly.
- the emissivity of the mirror is as low as 0.1 or less, and even if the plane mirror 321 or the off-axis parabolic mirror 304 is disposed at a position close to the pixel column 302, it is unlikely to become a noise source.
- the sensor assembly 320 can further reduce the influence of noise due to stray light. It is to be noted that stray light from other directions can be easily prevented from entering by enclosing the sensor assembly 320 by the housing 322 or the like, and therefore will not be described further here.
- the sensor assembly 320 shown in FIG. 11 (b) has the same components as the sensor assembly 320 of FIG. 11 (a) described above, but adds the placement requirements described below.
- the chief ray of the vertically incident light 305 traveling from the off-axis parabolic mirror 304 toward the plane mirror 321 is assumed to be incident at an angle of −θb with respect to the normal of the plane mirror 321, where θb is a positive value and angles measured counterclockwise from the normal of the plane mirror 321 are taken as negative. That is, the vertically incident light 305 is incident on the plane mirror 321 at a negative angle.
- let the line along which stray light travels toward the plane mirror 321 be a straight line B; as shown in FIG. 11B, the angle it forms with the normal of the plane mirror 321 is +θa, where θa is a positive value.
- because the straight line B lies on the opposite side of the normal of the plane mirror 321 from the vertically incident light 305, stray light incident along the straight line B is reflected to the other side of the normal, that is, away from the sensor 301, and therefore does not enter the pixel row 302 on the sensor 301.
- the sensor assembly 320 can further reduce the influence of noise due to stray light.
- stray light from other directions can be easily prevented from entering by surrounding the sensor assembly 320 with the housing 322 or the like, and therefore will not be described further here.
- the stray-light prevention effect described with reference to FIG. 11 is not limited to a line sensor in which the sensor 301 is configured by the pixel array 302; the same effect is obtained with a two-dimensional area sensor.
- the sensor assembly in the present embodiment is thus realized using a mirror having different F-numbers in the first and second directions as the imaging optical system. That is, the sensor assembly can be realized with a reflective imaging optical system instead of a transmissive imaging optical system.
- in the present embodiment, an example in which the sensor assembly described in Embodiment 1 or 2 is used inside an automobile will be described.
- FIG. 12 is a cross-sectional view (a) and a top view (b) of the automobile when the sensor assembly is mounted on the automobile in the third embodiment.
- FIG. 12 (a) is a side view of an example in which the sensor assembly 502 is mounted inside the automobile 500
- FIG. 12 (b) is a top view thereof.
- the sensor assembly 502 is one of the sensor assemblies described in any of the first and second embodiments, and here is a sensor assembly of a temperature sensor which has sensitivity in the far infrared region and can measure a temperature distribution.
- a driver 501 is seated in and driving the automobile 500, and the driver 501 is captured within the field of view 506 of the sensor assembly 502.
- a human's thermal sensation is the feeling of being hot or cold.
- regions close to the trunk with large blood flow, such as the forehead, fluctuate little with the ambient temperature and are usually maintained at around 33 °C.
- on the other hand, the temperature of the limbs and of peripheral facial regions such as the cheeks, nose, and ears is easily influenced by the ambient temperature and is considered to correlate with thermal sensation to some extent. Therefore, if the temperatures of the limbs or these facial regions can be measured accurately, the person's thermal sensation can be estimated from the measurement. For example, by controlling the air-conditioning device based on the estimated thermal sensation, comfortable air conditioning in the automobile 500 can be maintained.
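A crude estimator along these lines can be sketched as follows. The function, its thresholds, and the temperature values are purely illustrative assumptions; the patent does not specify an estimation formula.

```python
# Illustrative sketch only: the text says thermal sensation correlates with
# the temperature of peripheral regions (cheeks, nose, ears), while the
# forehead stays near 33 C. The estimator and thresholds below are
# arbitrary illustrative assumptions, not the patent's method.

def estimate_thermal_sensation(cheek_c: float, nose_c: float, ear_c: float) -> str:
    """Return 'cold', 'comfortable' or 'hot' from peripheral skin
    temperatures in degrees Celsius."""
    peripheral = (cheek_c + nose_c + ear_c) / 3.0
    if peripheral < 30.0:
        return "cold"
    if peripheral > 34.0:
        return "hot"
    return "comfortable"

assert estimate_thermal_sensation(28.0, 27.0, 28.5) == "cold"
assert estimate_thermal_sensation(32.0, 31.5, 32.5) == "comfortable"
```

An air-conditioning controller could then warm the cabin on "cold" and cool it on "hot", which is the kind of feedback the control unit 508 performs.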
- the temperature distribution of the face of the driver 501 is detected by the sensor assembly 502.
- the sensor assembly 502 is rotatably disposed about the center of rotation 505, as shown in FIG. 12 (b).
- the sensor assembly 502 can detect a temperature distribution while scanning a scanning range 507 including the face of the driver 501 by rotating about the rotation center 505.
- the face of the driver 501 is placed within the field of view 506 of the sensor assembly 502. In this way, the facial temperature of the driver 501 can be measured in real time while driving.
- the temperature of the face of the driver 501 measured by the sensor assembly 502 is transmitted to the control unit 508 by a wire not shown.
- the control unit 508 analyzes the measured temperature of the face, estimates the thermal sensation of the driver 501 mainly from the temperature of the cheek, nose, ears, etc., and controls the air conditioner 509 based on the estimation result.
- the temperature, direction, wind power and the like of the air blown out from the air conditioner 509 are adjusted. By so doing, the driver 501 can always be kept in a comfortable state, and a stress-reduced driving environment can be provided.
- the sensor assembly 502 is particularly susceptible to extraneous light incident from the rear window 504.
- Sunlight has far infrared rays in its spectrum, and the amount of far infrared light is relatively large. Therefore, when sunlight directly enters the sensor assembly 502, it becomes noise and it becomes difficult to accurately measure the temperature distribution of the driver's 501 face.
- the range of the field of view 506 of the sensor assembly 502 is therefore arranged to lie entirely below the horizontal with respect to the sensor assembly 502.
- the sun is usually at or above the horizon and rarely appears below it. Therefore, if the field of view 506 of the sensor assembly 502 lies below the horizontal with respect to the sensor assembly 502, sunlight essentially cannot enter the sensor assembly 502 directly. This suppresses noise caused by the far infrared component of sunlight entering the sensor assembly 502, so the thermal sensation of the driver 501 can be estimated reliably.
- the arrangement of the sensor assembly 502 may be configured such that the rear window 504 does not fall within the field of view 506. This can prevent not only the sunlight 503 but also the extraneous light 510 from entering the sensor assembly 502.
- alternatively, a filter that cuts far infrared rays may be inserted into the rear window 504. Even this measure prevents extraneous light, including sunlight, from entering the sensor assembly 502 directly, suppressing noise caused by far infrared rays entering the sensor assembly 502 and allowing the thermal sensation of the driver 501 to be estimated with certainty.
- the control unit 508 may also control the air conditioning for each occupant: by locally adjusting the temperature, direction, or volume of the blown air, optimal local air conditioning can be provided for each person riding in the automobile 500.
- although the thermal sensation of the driver 501 is estimated above mainly from the facial temperature distribution (cheeks, nose, ears, etc.), the hands of the driver 501 may of course be measured as well. Measuring peripheral body-surface temperatures in addition to the face allows the thermal sensation of the driver 501 to be estimated more accurately, providing a more comfortable driving environment.
- a filter for cutting far infrared rays may be inserted not only into the rear window 504 but also into the side windows of the automobile 500.
- when a mirror such as an off-axis parabolic mirror is used as the imaging optical system, as in the sensor assembly 300, extraneous light may be incident on the sensor 301.
- the sensor assembly 300 described in the second embodiment is attached to the vicinity of the ceiling of the automobile 500 as shown in FIG.
- the sensor assembly 300 can prevent external light from being directly incident on the sensor 301 from the front of the automobile 500 by being surrounded by the housing 511 except where the far infrared rays are incident.
- the temperature sensor 512 is attached near the ceiling of the automobile 500; the amount of stray light is calculated from the temperature and emissivity of the ceiling of the automobile 500, and the temperature distribution obtained at the sensor 301 is corrected accordingly.
- in this way, the temperature distribution of the driver 501 can be obtained with the influence of this error reduced.
- a material having a low emissivity may be attached to the ceiling of the automobile 500. By doing so, the extraneous light itself can be reduced, and the influence of an error can be reduced when the temperature distribution of the face of the driver 501 is determined.
- although the sensor assembly 300 has been described above as an example, this applies generally whenever a mirror is included in the sensor assembly and is not limited to the form of the sensor assembly 300.
- although a sensor assembly 502 using a line sensor has been described, the sensor incorporated in the sensor assembly 502 is not limited to a line sensor in this embodiment; similar effects can be obtained with a two-dimensional area sensor.
- FIG. 15 is an explanatory view (a) of acquiring a temperature distribution in a vehicle by scanning a sensor assembly in the present embodiment, and an explanatory view (b) regarding a sampling cycle at the time of scanning.
- a method of measuring the temperature distribution of a person in the vehicle 500 by scanning the sensor assembly 502 will be described with reference to FIG.
- the sensor assembly 502 detects the temperature distribution of the people in the automobile 500 while scanning the scanning range 507; to resolve the temperature distribution well enough to distinguish, for example, the nose from the cheeks, it is necessary to scan at a sampling pitch of about 1°.
- if the sensor assembly 502 takes one second per sample and the scanning range is 160°, scanning from end to end takes 160 seconds. This is impractical, especially when the cabin temperature needs to be adjusted soon after boarding. Therefore, in the present embodiment, scanning by the sensor assembly 502 is performed as follows.
- Sampling is performed every 10° at the start of operation, such as when boarding.
- By first scanning across the entire scanning range in this way, the position of a person can be detected in 16 seconds (160/10).
- The position where the driver 501 was detected is then sampled every 1°, which is the required accuracy.
- Likewise, the position of the passenger 513 is sampled every 1°, which is the required accuracy.
- The sampling pitch and the scanning range described above are merely examples, and the present invention is not limited to these values.
- The above-mentioned coarse sampling may be triggered at times other than the above-mentioned boarding time (the timing when the engine is turned on).
- For example, coarse sampling may be triggered when a door of the automobile 500 is opened and closed, to cover the possibility of a person getting on or off, or when a person is no longer detected at a place where a person was previously present, or at any other timing.
- Alternatively, the existence range of the driver 501 may first be scanned to detect the temperature distribution of the driver. Normally, at the timing when the engine is turned on, there is a high possibility that the driver is in the driver's seat. On the other hand, whether a passenger is in a seat other than the driver's seat at this timing is unknown. Therefore, at the timing when the engine is turned on, the existence range of the driver 501 is scanned, the thermal sensation of the driver 501 is estimated from the temperature distribution of the driver 501, and control of the air conditioner 509 is started under conditions (temperature, air flow, and the like) based on the thermal sensation of the driver 501, starting around the driver 501 and including the other seats.
- Next, the scanning range 507 is scanned at intervals of 10° to identify the presence or absence of a passenger and, if a passenger is present, the passenger's location, so that the wind direction of the air conditioner 509 can be made comfortable at that position.
- If no passenger is detected, the air conditioner 509 is operated in consideration of only the surroundings of the driver 501.
- If a passenger is detected, the identified location is sampled every 1°, the thermal sensation of the passenger is estimated from the temperature distribution of the passenger, and the air conditioner 509 is controlled accordingly. In this way, even when the sampling speed is slow, comfortable in-vehicle air conditioning can be realized promptly after boarding.
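The coarse-then-fine scan strategy above can be sketched as follows. This is an illustrative sketch, not the patented control logic: the angles and pitches follow the example values in the text, while `read_temperature` and the simple temperature threshold for "person present" are assumptions.

```python
# Illustrative sketch of the coarse scan (every 10 degrees) followed
# by a fine scan (every 1 degree) around detected occupants.
# read_temperature(angle) stands in for one sensor sample.

COARSE_PITCH = 10.0   # degrees, quick occupant search
FINE_PITCH = 1.0      # degrees, nose/cheek-resolving scan
SCAN_RANGE = 160.0    # degrees, full cabin sweep
BODY_TEMP_THRESHOLD = 30.0  # deg C, crude "person present" test

def scan(read_temperature):
    # Coarse pass: find angles where a person seems to be present.
    occupied = []
    angle = 0.0
    while angle <= SCAN_RANGE:
        if read_temperature(angle) > BODY_TEMP_THRESHOLD:
            occupied.append(angle)
        angle += COARSE_PITCH
    # Fine pass: rescan only the neighborhood of each detection.
    profile = {}
    for center in occupied:
        a = max(0.0, center - COARSE_PITCH / 2)
        stop = min(SCAN_RANGE, center + COARSE_PITCH / 2)
        while a <= stop:
            profile[round(a, 1)] = read_temperature(a)
            a += FINE_PITCH
    return occupied, profile
```

At one second per sample, the coarse pass costs 17 samples over 160° instead of 161, which is why the text's 16-second occupant search is feasible.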
- (Embodiment 4) A sensor assembly 700 in the present embodiment will be described with reference to FIG. 16.
- FIG. 16 is a side view (a) and a top view (b) of a sensor assembly 700 according to the present embodiment.
- FIG. 17 is a front view of a sensor assembly in the present embodiment.
- the sensor assembly 700 is similar to the sensor assembly 100 according to the first embodiment, except that a lens 704 is provided instead of the lens 104 and a sensor 701 is provided instead of the sensor 101.
- FIG. 16 (a) is a side view of the sensor assembly 700
- FIG. 16 (b) is a top view of the sensor assembly 700 as viewed from the direction of arrow A in FIG. 16 (a).
- The lens 704 focuses electromagnetic waves on the detection surface of the sensor 701.
- the lens 704 corresponds to an imaging optical system.
- In the lens 704, the lens diameter D1 in the X-axis direction (first direction), which is the direction in which the pixels of the sensor 701 are arranged in the pixel row 702, is smaller than the lens diameter D2 in the Y-axis direction (second direction). It can also be said that the F-number in the X-axis direction (first direction) is larger than the F-number in the Y-axis direction (second direction).
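The relation between the per-axis widths D1 and D2 and the per-axis F-numbers can be checked with the standard definition F-number = focal length / aperture width. The numeric values below are illustrative only, not taken from the patent.

```python
# Per-axis F-number of a lens cut to different widths in X and Y,
# as with lens 704 (D1 < D2). F-number = focal length / aperture width.

def f_number(focal_length_mm: float, aperture_mm: float) -> float:
    return focal_length_mm / aperture_mm

focal = 10.0   # mm, example focal length
d1 = 5.0       # mm, example lens width in X (pixel-row direction)
d2 = 10.0      # mm, example lens width in Y

fn_x = f_number(focal, d1)  # larger F-number along X
fn_y = f_number(focal, d2)  # smaller F-number along Y
assert fn_x > fn_y  # smaller aperture width gives the larger F-number
```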
- The sensor 701 includes a sensor substrate 703 and a pixel row 702.
- The pixel row 702 is provided on the sensor substrate 703 and includes pixels 702a, 702b, 702c, 702d, and 702e.
- The surface of the sensor 701 facing the lens 704 is also referred to as a detection surface.
- Although an example in which the sensor 701 has five pixels will be described, the number of pixels is not limited to this.
- The lens 704 shown in (a) of FIG. 17 corresponds to a spherical lens cut out along an ellipse having a major axis (diameter in the Y-axis direction) of D2 and a minor axis (diameter in the X-axis direction) of D1.
- The lens 704 shown in (b) of FIG. 17 corresponds to a spherical lens cut out along a rectangle having a long side (parallel to the Y axis) of D2 and a short side (parallel to the X axis) of D1.
- In either case, the width in the X-axis direction is D1 and the width in the Y-axis direction is D2.
- the shape of the lens 704 is not limited to these.
- The shape of the cut-out figure may be any shape; that is, in addition to an ellipse ((a) of FIG. 17) and a rectangle ((b) of FIG. 17), the figure may be one obtained by combining a triangle, a pentagon, and other straight lines or curves.
- FIG. 18 is a schematic view showing a measurement range of the sensor assembly in the present embodiment.
- For the sake of explanation, only the pixels 702a, 702c, and 702e among the pixels included in the sensor 701 are illustrated.
- Similarly, only the pixels 102a, 102c, and 102e among the pixels included in the sensor 101 are drawn.
- The position of the pixel 702c, which is the central pixel of the sensor 701, and the position of the pixel 102c, which is the central pixel of the sensor 101, are drawn to coincide with each other.
- FIG. 18 shows the range of the electromagnetic waves collected on the sensor 701 or the sensor 101 by the lens 704.
- The electromagnetic waves emitted from the area included in the range 712 are collected on the sensor 701 (pixels 702a to 702e).
- The electromagnetic waves emitted from the area included in the range 711 are collected on the sensor 101 (pixels 102a to 102e). Since the sensor 701 is longer than the sensor 101, the range 712 covers a wider angular range than the range 711. That is, although the sensor 701 and the sensor 101 have the same number of pixels, the sensor 701 can receive electromagnetic waves from a wider angular range.
- Because the sensor assembly in the present embodiment can receive light from a wider angular range, it can receive electromagnetic waves coming from a wider area.
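The effect of sensor length on angular coverage can be illustrated with a rough pinhole-style estimate. The focal length and sensor lengths below are example values, not figures from the patent.

```python
import math

# Rough estimate of why the longer sensor 701 covers a wider angular
# range (range 712) than the shorter sensor 101 (range 711) behind the
# same imaging optics: half field of view grows with sensor length.

def half_angle_deg(sensor_length_mm: float, focal_mm: float) -> float:
    """Half field of view for a sensor centered on the optical axis."""
    return math.degrees(math.atan((sensor_length_mm / 2) / focal_mm))

focal = 10.0  # mm, example focal length
fov_701 = 2 * half_angle_deg(8.0, focal)  # longer sensor, wider range
fov_101 = 2 * half_angle_deg(4.0, focal)  # shorter sensor
assert fov_701 > fov_101
```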
- The sensor assembly according to the present invention can achieve high sensitivity with an inexpensive configuration and is resistant to stray light, and is therefore useful, for example, in automobiles.
Abstract
Description
The inventor found that the following problems arise with the sensor assembly described in the "Background Art" section.
In the present embodiment, a sensor assembly that is inexpensive to manufacture and has improved detection sensitivity will be described. The sensor assembly in the present embodiment is an imaging optical system used for a line sensor, and can be manufactured inexpensively while maintaining detection sensitivity.
A sensor assembly 300 in the present embodiment will be described with reference to FIG. 9.
In the present embodiment, an example in which the sensor assembly described in Embodiment 1 or 2 is used inside an automobile will be described.
A sensor assembly 700 in the present embodiment will be described with reference to FIG. 16.
101, 201, 231, 301, 701 Sensor
101a Base
102, 202, 234, 235, 236, 302, 702 Pixel row
102a, 102b, 102c, 102d, 102e, 202a, 202b, 202c, 202d, 202e, 232a, 232b, 232c, 232d, 232e, 232f, 232g, 232h, 232i, 232j, 232k, 232l, 232m, 232n, 232o, 302a, 302b, 302c, 302d, 302e, 702a, 702b, 702c, 702d, 702e Pixel
103, 203, 233, 303, 703 Sensor substrate
104, 109, 111, 704 Lens
105, 308 Optical axis
106, 108, 305 Perpendicularly incident light
107, 110, 306, 307 Obliquely incident light
109a, 111a Right side surface
109b, 111b Left side surface
304 Off-axis parabolic mirror
309 Shield
310 Focal point
311a, 311b, 311c, 311d Stray light
321 Plane mirror
322, 511 Housing
301a, 304a Point
500 Automobile
501 Driver
503 Sunlight
504 Rear window
505 Center of rotation
506 Field of view
507 Scanning range
508 Control unit
509 Air conditioner
510 Extraneous light
512 Temperature sensor
513 Passenger
Claims (15)
- A sensor assembly comprising: a line sensor having a plurality of pixels for detecting electromagnetic waves, the plurality of pixels being arranged in a line in a predetermined direction; and an imaging optical system that forms an image of the electromagnetic waves on a detection surface on the plurality of pixels, wherein an F-number of the imaging optical system in a first direction orthogonal to the predetermined direction in a plane parallel to the detection surface differs from an F-number of the imaging optical system in a second direction that is the predetermined direction.
- The sensor assembly according to claim 1, wherein the F-number of the imaging optical system in the first direction is smaller than the F-number of the imaging optical system in the second direction.
- The sensor assembly according to claim 1 or 2, wherein the imaging optical system includes a lens, and an F-number of the lens in the first direction is smaller than an F-number of the lens in the second direction.
- The sensor assembly according to claim 3, wherein a width of the lens in the first direction is greater than a width of the lens in the second direction.
- The sensor assembly according to claim 3 or 4, wherein a cross-sectional shape of the lens in a plane perpendicular to the first direction differs from a cross-sectional shape of the lens in a plane perpendicular to the second direction.
- The sensor assembly according to any one of claims 3 to 5, wherein the cross-sectional shape of the lens in the plane perpendicular to the second direction includes a Fresnel shape.
- The sensor assembly according to any one of claims 3 to 6, wherein the cross-sectional shape of the lens in the plane perpendicular to the first direction does not include a Fresnel shape.
- The sensor assembly according to claim 1 or 2, wherein the imaging optical system includes a mirror, and an F-number of the mirror in the first direction is greater than an F-number of the mirror in the second direction.
- The sensor assembly according to claim 8, wherein the mirror is an off-axis parabolic mirror.
- The sensor assembly according to claim 9, further comprising a plane mirror.
- The sensor assembly according to claim 10, wherein (i) an angle formed between the plane mirror and a straight line connecting the end of the off-axis parabolic mirror closer to the sensor and the end of the sensor closer to the off-axis parabolic mirror differs from (ii) an angle formed between the plane mirror and an electromagnetic wave incident on the plane mirror from the off-axis parabolic mirror.
- The sensor assembly according to any one of claims 1 to 11, wherein a width of each of the plurality of pixels in the first direction is greater than a width of each of the plurality of pixels in the second direction.
- The sensor assembly according to any one of claims 1 to 12, wherein the electromagnetic waves include far infrared rays having a wavelength of 8 to 10 micrometers.
- The sensor assembly according to any one of claims 3 to 7, wherein a material of the lens is polyethylene.
- The sensor assembly according to any one of claims 1 to 14, wherein each of the plurality of pixels is an infrared detector using a thermopile or a bolometer.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015504453A JP6338567B2 (ja) | 2013-07-31 | 2014-07-08 | センサーアセンブリ |
CN201480002521.9A CN104662397B (zh) | 2013-07-31 | 2014-07-08 | 传感器组件 |
US14/431,114 US9448118B2 (en) | 2013-07-31 | 2014-07-08 | Sensor assembly |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013159787 | 2013-07-31 | ||
JP2013-159787 | 2013-07-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015015718A1 true WO2015015718A1 (ja) | 2015-02-05 |
Family
ID=52431281
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/003611 WO2015015718A1 (ja) | 2013-07-31 | 2014-07-08 | センサーアセンブリ |
Country Status (4)
Country | Link |
---|---|
US (1) | US9448118B2 (ja) |
JP (1) | JP6338567B2 (ja) |
CN (1) | CN104662397B (ja) |
WO (1) | WO2015015718A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017002346A1 (ja) * | 2015-07-01 | 2017-01-05 | パナソニックIpマネジメント株式会社 | 空調制御装置 |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7313469B2 (ja) * | 2019-03-20 | 2023-07-24 | ウォード,マシュー,イー. | マイクロledアレイを搭載したmems駆動光学パッケージ |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6079128U (ja) * | 1983-11-08 | 1985-06-01 | 株式会社チノー | 放射エネルギ−収集装置の光学系 |
JPS60147671A (ja) * | 1984-01-13 | 1985-08-03 | Mitsubishi Electric Corp | 侵入監視方法および装置 |
JPH02115725A (ja) * | 1988-10-25 | 1990-04-27 | Nec Corp | 光センサ校正装置 |
JPH04372828A (ja) * | 1991-06-24 | 1992-12-25 | Matsushita Electric Ind Co Ltd | 熱画像検出装置 |
JPH07296269A (ja) * | 1994-04-28 | 1995-11-10 | Matsushita Electric Works Ltd | 熱線式検知器 |
JPH09101204A (ja) * | 1995-10-06 | 1997-04-15 | Matsushita Electric Ind Co Ltd | 焦電型赤外線検出装置 |
JPH09105668A (ja) * | 1995-10-13 | 1997-04-22 | Matsushita Electric Ind Co Ltd | 焦電型赤外線センサ |
JPH09145477A (ja) * | 1995-11-20 | 1997-06-06 | Tokyo Instr:Kk | 分光器 |
JP2002228578A (ja) * | 2001-01-30 | 2002-08-14 | Anritsu Corp | ガス検出装置及び該装置の焦点合わせ方法 |
JP2012026917A (ja) * | 2010-07-26 | 2012-02-09 | Mitsubishi Electric Corp | 赤外線センサ及び空気調和機 |
JP2012198191A (ja) * | 2011-03-07 | 2012-10-18 | Ricoh Co Ltd | 遠赤外線検出装置 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6079128A (ja) | 1983-10-05 | 1985-05-04 | Hokuetsu Kogyo Co Ltd | エンジン駆動発電機制御方法 |
JPH0694991A (ja) | 1992-09-10 | 1994-04-08 | Matsushita Electric Ind Co Ltd | 赤外広角単レンズ |
GB2286042B (en) * | 1994-01-27 | 1998-07-29 | Security Enclosures Ltd | Wide-angle infra-red detection apparatus |
US6856407B2 (en) * | 2000-09-13 | 2005-02-15 | Nextengine, Inc. | Method for depth detection in 3D imaging providing a depth measurement for each unitary group of pixels |
US6876450B2 (en) * | 2001-01-30 | 2005-04-05 | Anritsu Corporation | Laser absorption spectral diffraction type gas detector and method for gas detection using laser absorption spectral diffraction |
CN1317568C (zh) * | 2002-05-28 | 2007-05-23 | 佳能株式会社 | 光检测装置,x射线摄影方法和装置,以及光电变换元件 |
US7684094B2 (en) * | 2004-09-13 | 2010-03-23 | Canon Kabushiki Kaisha | Image optical system and image reading apparatus equipped with same |
JP4356026B2 (ja) * | 2006-10-10 | 2009-11-04 | ソニー株式会社 | 表示装置、受光方法、および情報処理装置 |
JP5300602B2 (ja) | 2008-10-31 | 2013-09-25 | 三菱電機株式会社 | 空気調和機 |
CN105681633B (zh) * | 2009-03-19 | 2019-01-18 | 数字光学公司 | 双传感器照相机及其方法 |
JP5789098B2 (ja) * | 2010-12-16 | 2015-10-07 | キヤノン株式会社 | 焦点検出装置およびその制御方法 |
JP5898481B2 (ja) * | 2011-12-13 | 2016-04-06 | キヤノン株式会社 | 撮像装置及び焦点検出方法 |
-
2014
- 2014-07-08 CN CN201480002521.9A patent/CN104662397B/zh active Active
- 2014-07-08 WO PCT/JP2014/003611 patent/WO2015015718A1/ja active Application Filing
- 2014-07-08 JP JP2015504453A patent/JP6338567B2/ja active Active
- 2014-07-08 US US14/431,114 patent/US9448118B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP6338567B2 (ja) | 2018-06-06 |
US9448118B2 (en) | 2016-09-20 |
JPWO2015015718A1 (ja) | 2017-03-02 |
CN104662397A (zh) | 2015-05-27 |
US20150276490A1 (en) | 2015-10-01 |
CN104662397B (zh) | 2017-06-27 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 2015504453 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14832825 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14431114 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14832825 Country of ref document: EP Kind code of ref document: A1 |