WO2024135306A1 - Lens optical system and imaging device - Google Patents
- Publication number
- WO2024135306A1 (PCT/JP2023/043217)
- Authority
- WO
- WIPO (PCT)
Classifications
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B13/00—Optical objectives specially designed for the purposes specified below
- G02B13/18—Optical objectives specially designed for the purposes specified below with lenses having one or more non-spherical faces, e.g. for reducing geometrical aberration
Definitions
- This technology relates to a lens optical system and an imaging device, and in particular to a lens optical system and an imaging device capable of achieving high optical performance while being compatible with a large, high-pixel-count solid-state imaging element.
- Imaging devices equipped with solid-state imaging elements such as CCD (Charge-Coupled Device) sensors and CMOS (Complementary Metal Oxide Semiconductor) image sensors include compact cameras mounted on mobile phones and smartphones, as well as digital still cameras. In order to further reduce the size of such imaging devices, there is a demand for smaller lens optical systems with shorter overall optical length.
- As a lens optical system consisting of seven lenses, a lens optical system has been devised that includes, in order from the object side to the image side, a first lens having positive refractive power, a second lens having positive refractive power, a third lens having negative refractive power, a fourth lens having positive refractive power, a fifth lens having positive refractive power, a sixth lens, and a seventh lens having negative refractive power (see, for example, Patent Document 1).
- This technology was developed in light of these circumstances, and makes it possible to achieve high optical performance in a lens optical system compatible with large, high-pixel-count solid-state imaging elements.
- The lens optical system of the first aspect of the present technology includes, in order from the object side to the image side, a first lens having positive refractive power near the optical axis and a meniscus shape convex to the object side, a second lens having negative refractive power near the optical axis and a meniscus shape convex to the object side, a third lens having refractive power near the optical axis, a fourth lens having positive refractive power near the optical axis and a meniscus shape convex to the image side, a fifth lens having refractive power near the optical axis and a meniscus shape convex to the image side, a sixth lens having refractive power near the optical axis and a meniscus shape convex to the object side, and a seventh lens having negative refractive power near the optical axis and a meniscus shape convex to the object side. It is configured such that, among the air gaps on the optical axis between the opposing surfaces of each pair of two adjacent lenses among the first lens to the seventh lens, the gap between the pair of the fourth lens and the fifth lens is the longest.
- a first lens having positive refractive power near the optical axis and a meniscus shape convex to the object side, and a second lens having negative refractive power near the optical axis and a meniscus shape convex to the object side are provided in that order from the object side to the image side.
- a third lens having refractive power near the optical axis, a fourth lens having positive refractive power near the optical axis and a meniscus shape convex to the image side, and a fifth lens having refractive power near the optical axis and a meniscus shape convex to the image side are provided.
- a sixth lens having refractive power near the optical axis and a meniscus shape convex to the object side is provided.
- a seventh lens having negative refractive power near the optical axis and a meniscus shape convex to the object side is provided.
- the gap between the pair of the fourth lens and the fifth lens is the longest.
- The imaging device of the second aspect of the present technology includes a lens optical system and an imaging element. The lens optical system includes, in order from the object side to the image side, a first lens having positive refractive power near the optical axis and a meniscus shape convex toward the object side, a second lens having negative refractive power near the optical axis and a meniscus shape convex toward the object side, a third lens having refractive power near the optical axis, a fourth lens having positive refractive power near the optical axis and a meniscus shape convex toward the image side, a fifth lens having refractive power near the optical axis and a meniscus shape convex toward the image side, a sixth lens having refractive power near the optical axis and a meniscus shape convex toward the object side, and a seventh lens having negative refractive power near the optical axis and a meniscus shape convex toward the object side, and is configured such that, among the air gaps on the optical axis between the opposing surfaces of each pair of adjacent lenses among the first lens to the seventh lens, the gap between the pair of the fourth lens and the fifth lens is the longest. The imaging element converts the optical image formed by the lens optical system into an electrical signal.
- a lens optical system and an imaging element that converts an optical image formed by the lens optical system into an electrical signal are provided.
- the lens optical system includes, in order from the object side to the image side, a first lens having positive refractive power near the optical axis and a meniscus shape convex to the object side, and a second lens having negative refractive power near the optical axis and a meniscus shape convex to the object side.
- a third lens having refractive power near the optical axis, a fourth lens having positive refractive power near the optical axis and a meniscus shape convex to the image side, and a fifth lens having refractive power near the optical axis and a meniscus shape convex to the image side are provided.
- a sixth lens having refractive power near the optical axis and a meniscus shape convex to the object side, and a seventh lens having negative refractive power near the optical axis and a meniscus shape convex to the object side are provided.
- the gap between the pair of the fourth lens and the fifth lens is the longest.
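The longest-air-gap condition stated above can be sketched as a simple check. This is an illustrative snippet, not part of the patent; the gap values are loosely based on the spacings quoted later for the first embodiment, and the lens-1-to-lens-2 gap is an assumed placeholder since it is not quoted in this excerpt.

```python
# Check the claimed configuration condition: among the on-axis air gaps
# between each pair of adjacent lenses L1..L7, the gap between the fourth
# and fifth lenses must be the longest.
def longest_gap_pair(air_gaps):
    """air_gaps maps a (front_lens, rear_lens) pair to its on-axis air gap (mm)."""
    return max(air_gaps, key=air_gaps.get)

# Gaps loosely based on the first embodiment; (1, 2) is an assumed placeholder.
gaps = {(1, 2): 0.05, (2, 3): 0.50, (3, 4): 0.36,
        (4, 5): 1.11, (5, 6): 0.23, (6, 7): 1.07}
assert longest_gap_pair(gaps) == (4, 5)  # condition satisfied
```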
- FIG. 1 is a cross-sectional view showing a configuration example of a first embodiment of an imaging device to which the present technology is applied.
- FIG. 2 is a cross-sectional view showing a first configuration example of a lens optical system.
- FIG. 3 is a table showing lens data for the lenses of FIG. 2.
- FIG. 4 is a table showing aspheric data for the surfaces of FIG. 2.
- FIG. 5 is a diagram illustrating the effect of the surface spacing.
- FIG. 6 is a graph showing spherical aberration, field curvature, and distortion in the lens optical system of FIG. 2.
- FIG. 7 is a cross-sectional view showing a second configuration example of the lens optical system.
- FIG. 8 is a table showing lens data for the lenses of FIG. 7.
- FIG. 9 is a table showing aspheric data for the surfaces of FIG. 7.
- FIG. 11 is a cross-sectional view showing a third configuration example of the lens optical system.
- FIG. 12 is a table showing lens data for the lenses of FIG. 11.
- FIG. 13 is a table showing aspheric data for the surfaces of FIG. 11.
- FIG. 14 is a graph showing spherical aberration, field curvature, and distortion in the lens optical system of FIG. 11.
- FIG. 15 is a cross-sectional view showing a fourth configuration example of the lens optical system.
- FIG. 16 is a table showing lens data for the lenses of FIG. 15.
- FIG. 17 is a table showing aspheric data for the surfaces of FIG. 15.
- FIG. 18 is a graph showing spherical aberration, field curvature, and distortion in the lens optical system of FIG. 15.
- FIG. 19 is a cross-sectional view showing a fifth configuration example of the lens optical system.
- FIG. 20 is a table showing lens data for the lenses of FIG. 19.
- FIG. 21 is a table showing aspheric data for the surfaces of FIG. 19.
- FIG. 22 is a graph showing spherical aberration, field curvature, and distortion in the lens optical system of FIG. 19.
- FIG. 23 is a table showing values of parameters or expressions in the lens optical systems.
- FIG. 24 is a block diagram showing an example of the hardware configuration of a smartphone as an electronic device to which the present technology is applied.
- FIG. 25 is a diagram illustrating an example of use of an imaging device.
- FIG. 26 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system.
- FIG. 27 is a block diagram showing an example of the functional configuration of a camera head and a CCU.
- FIG. 28 is a block diagram showing an example of a schematic configuration of a vehicle control system.
- FIG. 29 is an explanatory diagram showing an example of an installation position of an imaging unit.
- FIG. 1 is a cross-sectional view showing an example of the configuration of an embodiment of an imaging device to which the present technology is applied.
- the imaging device 10 in FIG. 1 is composed of a thin circuit board 14 on which a solid-state imaging device 13 is mounted, a circuit board 15, and a spacer 16.
- the solid-state imaging device 13 has a CSP (Chip Size Package) structure.
- the CSP structure is one of the structures of solid-state imaging devices that realizes a high pixel count, compact size, and low height, and is an extremely small package structure that is realized with a size similar to that of a single chip.
- the solid-state imaging device 13 is composed of a solid-state imaging element 21, adhesive 22, glass substrate 23, black resin 24, lens optical system 25, and fixing agent 26.
- the solid-state imaging element 21 is a CCD sensor or a CMOS image sensor, and includes a semiconductor substrate 31 and an on-chip lens 32.
- the lower surface of the semiconductor substrate 31 in FIG. 1 is connected to the circuit board 14.
- A pixel array 41 and the like, made up of light-receiving elements corresponding to each of a plurality of pixels arranged in a two-dimensional lattice pattern, are formed on an imaging surface 31a, which is a partial area of the upper surface of the semiconductor substrate 31 in FIG. 1.
- the on-chip lens 32 is formed at a position on the pixel array 41 that corresponds to each pixel.
- The adhesive 22 is a transparent adhesive that is applied to the upper surface of the solid-state imaging element 21 in FIG. 1, including the imaging surface 31a.
- the glass substrate 23 is adhered to the solid-state imaging element 21 via the adhesive 22 for the purposes of fixing the solid-state imaging element 21 and protecting the imaging surface 31a.
- the black resin 24 is formed on the surface of the glass substrate 23 opposite the adhesive surface to which the adhesive 22 is applied, and functions as a spacer.
- An IR (Infrared) cut filter (not shown) of the lens optical system 25 is placed on top of the glass substrate 23 via this black resin 24 so that it is parallel to the glass substrate 23. This positions the glass substrate 23 between the lens optical system 25 and the imaging surface 31a.
- The black resin 24 also functions as a black mask, blocking light that is incident via the lens optical system 25 outside the imaging surface 31a.
- the lens optical system 25 is a lens optical system that collects light from a subject and forms an optical image on the imaging surface 31a.
- the configuration of the lens optical system 25 will be described in detail with reference to Figures 2, 7, 11, 15, and 19, which will be described later.
- the fixing agent 26 is applied to the sides of the solid-state imaging element 21, adhesive 22, glass substrate 23, black resin 24, and lens optical system 25, and to the periphery of the object side (light incident side) surface (top surface in FIG. 1) of the lens optical system 25.
- the fixing agent 26 fixes the solid-state imaging element 21, adhesive 22, glass substrate 23, black resin 24, and lens optical system 25.
- This fixing agent 26 can reduce light that is incident from the side of the solid-state imaging device 13 and is refracted or reflected.
- the fixing agent 26 can also block light that is incident on the solid-state imaging device 13 from outside the area corresponding to the imaging surface 31a.
- Each light-receiving element of the pixel array 41 captures the image by converting the optical image into an electrical signal.
- the lens optical system 25 is included within the CSP structure of the solid-state imaging device 13, so the imaging device 10 can be made smaller than when the lens optical system 25 is provided separately.
- the circuit board 14 is connected to the lower surface of the semiconductor substrate 31 in FIG. 1, and outputs a camera signal corresponding to the electrical signal generated by each light receiving element to the spacer 16.
- Circuit board 15 is a circuit board for outputting the camera signal output from circuit board 14 via spacer 16 to the outside, and electronic components and the like are mounted on it.
- Circuit board 15 has connector 15a for connecting to an external device, and outputs the camera signal to the external device.
- Spacer 16 is a spacer with a built-in circuit for fixing an actuator (not shown) that drives lens optical system 25 and circuit board 15.
- Semiconductor components 16a and 16b, etc. are mounted on spacer 16.
- Semiconductor components 16a and 16b are semiconductor components that constitute a capacitor and an LSI (Large Scale Integration) that controls an actuator (not shown) that drives lens optical system 25.
- Spacer 16 outputs a camera signal output from circuit board 14 to circuit board 15.
- FIG. 2 is a cross-sectional view showing a first configuration example of the lens optical system 25.
- the lens optical system 25 includes, in order from the object side toward the image side (light exit side), an aperture stop 70, a lens 71 (first lens), a lens 72 (second lens), a lens 73 (third lens), a lens 74 (fourth lens), a lens 75 (fifth lens), a lens 76 (sixth lens), a lens 77 (seventh lens), and an IR cut filter 78.
- the aperture stop 70 limits the light entering the lens optical system 25.
- Lens 71 has surface 71a on the object side (left side in FIG. 2) and surface 71b on the image side (right side in FIG. 2).
- Lens 71 has positive refractive power near the optical axis and has a meniscus shape convex toward the object side.
- Lens 72 has surface 72a on the object side and surface 72b on the image side.
- Lens 72 has negative refractive power near the optical axis and has a meniscus shape convex toward the object side.
- Lens 73 has an object-side surface 73a and an image-side surface 73b.
- Lens 73 has positive refractive power near the optical axis and has a meniscus shape convex toward the object side.
- Lens 73 functions, for example, as a relay lens, and appropriately corrects the angle of incidence of light rays. In order for lens 73 to function as a relay lens, it is desirable that the shape of lens 73 is determined so that the distance from the optical axis of the light rays entering lens 73 is shorter than the distance from the optical axis of the light rays exiting lens 73.
- Lens 74 has an object-side surface 74a and an image-side surface 74b. Lens 74 has positive refractive power near the optical axis and has a meniscus shape convex toward the image side.
- In the lens optical system 25, lens 73 has refractive power near the optical axis, and lens 74 has positive refractive power near the optical axis and a meniscus shape convex toward the image side. Therefore, coma aberration and astigmatism can be corrected well. Furthermore, the exit angle of the light rays can be suitably corrected, the total length TTL can be shortened, and the angle between the chief ray of the light incident on the imaging surface 31a and the imaging surface 31a can be suppressed.
- Lens 75 has an object-side surface 75a and an image-side surface 75b.
- Lens 75 has negative refractive power near the optical axis and has a meniscus shape convex toward the image side.
- Lens 75 effectively corrects coma aberration and field curvature aberration in the peripheral area while correcting light rays near the optical axis.
- It is desirable that the peripheral area of surface 75a be located closer to the object side than its apex (center point).
- Lens 76 has an object-side surface 76a and an image-side surface 76b.
- Lens 76 has positive refractive power near the optical axis and has a meniscus shape convex toward the object side.
- Lens 77 has an object-side surface 77a and an image-side surface 77b.
- Lens 77 has negative refractive power near the optical axis and has a meniscus shape convex toward the object side.
- Lenses 76 and 77 appropriately correct coma aberration, spherical aberration, field curvature aberration, and chromatic aberration of magnification in the peripheral areas while correcting light rays near the optical axis. It is desirable for lenses 76 and 77 to have one or more inflection points in order to appropriately correct the paths of light rays that reach the periphery of imaging surface 31a.
- In the lens optical system 25, lens 75 has refractive power near the optical axis and a meniscus shape convex toward the image side, and lenses 76 and 77 have refractive power near the optical axis and a meniscus shape convex toward the object side. Therefore, it is possible to appropriately correct marginal rays near the optical axis while also appropriately correcting field curvature aberration and distortion aberration. In addition, it is possible to appropriately correct the angle between the light rays reaching the periphery of imaging surface 31a and imaging surface 31a, thereby suppressing a decrease in the amount of light received by solid-state imaging element 21.
- Of the light incident on its object-side surface 78a, the IR cut filter 78 transmits light other than infrared light and emits it from its image-side surface 78b. Note that the IR cut filter 78 does not necessarily have to be provided.
- Light incident on the lens optical system 25 from the subject (object) passes through surfaces 71a, 71b, 72a, 72b, 73a, 73b, 74a, 74b, 75a, 75b, 76a, 76b, 77a, 77b, 78a, and 78b in that order and is emitted.
- the light emitted from the lens optical system 25 in this manner is focused on the imaging surface 31a via the glass substrate 23, adhesive 22, and on-chip lens 32.
- In FIG. 2, in order to simplify the drawing, only the imaging surface 31a is shown, but in reality the glass substrate 23, adhesive 22, and on-chip lens 32 are present between the lens optical system 25 and the imaging surface 31a. The same applies to FIGS. 7, 11, 15, and 19, which will be described later.
- FIG. 3 is a table showing lens data for lenses 71 to 77.
- The rows in the table in FIG. 3 correspond to surfaces 71a to 78a and surfaces 71b to 78b, respectively. From the left, the columns correspond to the surface number i, the radius of curvature Ri, the surface spacing Di (the distance on the optical axis to the closest surface on the image side, or to the imaging surface 31a), the refractive index Ndi for the d-line (wavelength 587.6 nm), and the Abbe number Vdi for the d-line.
- the surface number is a number assigned to each surface of the lens optical system 25.
- the surface numbers 101 to 116 are assigned to surfaces 71a, 71b, 72a, 72b, 73a, 73b, 74a, 74b, 75a, 75b, 76a, 76b, 77a, 77b, 78a, and 78b, in that order.
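The Abbe number column in this table quantifies the dispersion of each lens material. As a hedged aside (the definition is standard optics, not stated in this excerpt), it can be computed from the refractive indices at the d, F, and C spectral lines:

```python
# Abbe number at the d-line: Vd = (n_d - 1) / (n_F - n_C), where n_d, n_F,
# and n_C are the refractive indices at 587.6 nm, 486.1 nm, and 656.3 nm.
def abbe_number(n_d, n_F, n_C):
    return (n_d - 1.0) / (n_F - n_C)

# Illustrative indices for a crown-like glass (assumed values, not taken
# from the patent's tables); a high Vd means low dispersion.
v = abbe_number(1.5168, 1.5224, 1.5143)  # roughly 64
```

A higher Vd (such as the 71.685 quoted below for lens 71) indicates a low-dispersion material, which helps control chromatic aberration.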
- the radius of curvature R101 of surface 71a whose surface number i is 101, is 2.83
- the surface spacing D101 which is the distance on the optical axis between surface 71a and surface 71b, which is the closest surface to the image side of surface 71a
- the refractive index Nd101 is 1.5533.
- the Abbe number Vd101 of surface 71a i.e., the Abbe number V101 of lens 71, is 71.685.
- the radius of curvature R103 of surface 72a whose surface number i is 103, is 4.06
- the surface spacing D103 which is the distance on the optical axis between surface 72a and surface 72b, which is the closest surface to the image side of surface 72a
- the refractive index Nd103 is 2.0018.
- the Abbe number Vd103 of surface 72a i.e., the Abbe number V102 of lens 72
- the radius of curvature R104 of surface 72b, whose surface number i is 104, is 3.16
- the surface spacing D104 which is the distance through air on the optical axis between surface 72b and surface 73a, which is the closest surface to the image side of surface 72b, is 0.496.
- the radius of curvature R105 of the surface 73a having the surface number i of 105 is 2.07 × 10
- the surface spacing D105 which is the distance on the optical axis between the surface 73a and the closest surface 73b on the image side
- the refractive index Nd105 is 1.5445.
- the Abbe number Vd105 of the surface 73a i.e., the Abbe number V103 of the lens 73
- the radius of curvature R106 of the surface 73b having the surface number i of 106 is 1.81 × 10²
- the surface spacing D106 which is the distance through air on the optical axis between the surface 73b and the closest surface 74a on the image side, is 0.361.
- the radius of curvature R107 of surface 74a whose surface number i is 107, is -2.09 × 10
- the surface spacing D107 which is the distance on the optical axis between surface 74a and surface 74b, which is the closest surface to the image side of surface 74a, is 0.524
- the refractive index Nd107 is 1.6614.
- the Abbe number Vd107 of surface 74a i.e., the Abbe number V104 of lens 74, is 20.410.
- the radius of curvature R108 of surface 74b, whose surface number i is 108, is -9.94
- the surface spacing D108 which is the distance through air on the optical axis between surface 74b and surface 75a, which is the closest surface to the image side of surface 74b, is 1.107.
- the radius of curvature R109 of surface 75a whose surface number i is 109, is -9.28
- the surface spacing D109 which is the distance on the optical axis between surface 75a and surface 75b, which is the closest surface to the image side of surface 75a
- the refractive index Nd109 is 1.7020.
- the Abbe number Vd109 of surface 75a i.e., the Abbe number V105 of lens 75, is 15.500.
- the radius of curvature R110 of surface 75b, whose surface number i is 110, is -2.47 × 10
- the surface spacing D110 which is the distance through air on the optical axis between surface 75b and surface 76a, which is the closest surface to the image side of surface 75b, is 0.225.
- the radius of curvature R111 of surface 76a whose surface number i is 111, is 3.46
- the surface spacing D111 which is the distance on the optical axis between surface 76a and surface 76b, which is the closest surface to the image side of surface 76a
- the refractive index Nd111 is 1.5445.
- the Abbe number Vd111 of surface 76a i.e., the Abbe number V106 of lens 76, is 55.987.
- the radius of curvature R112 of surface 76b, whose surface number i is 112, is 4.96
- the surface spacing D112, which is the distance through air on the optical axis between surface 76b and surface 77a, which is the closest surface to the image side of surface 76b, is 1.072.
- the radius of curvature R113 of surface 77a whose surface number i is 113, is 9.37
- the surface spacing D113 which is the distance on the optical axis between surface 77a and the closest surface 77b on the image side
- the refractive index Nd113 is 1.5445.
- the Abbe number Vd113 of surface 77a i.e., the Abbe number V107 of lens 77
- the radius of curvature R114 of surface 77b, whose surface number i is 114, is 3.05
- the surface spacing D114, which is the distance through air on the optical axis between surface 77b and surface 78a, which is the closest surface to the image side of surface 77b, is 0.370.
- the radius of curvature R115 of surface 78a whose surface number i is 115, is infinity
- the surface spacing D115 which is the distance on the optical axis between surface 78a and the closest surface 78b on the image side
- the refractive index Nd115 is 1.5168.
- the Abbe number Vd115 of surface 78a i.e., the Abbe number V108 of the IR cut filter 78
- the surface spacing D116, which is the distance on the optical axis between surface 78b and the imaging surface 31a, is 0.600.
- Of these surface spacings, the air gap D108 between lens 74 and lens 75 is the longest.
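A quick sanity check on the spacings actually quoted above confirms this. The values below are the inter-lens air gaps listed for this first configuration example (D102, between lens 71 and lens 72, is not quoted in this excerpt and is omitted):

```python
# Air gaps between adjacent lenses, as quoted for the first configuration
# example (mm). D102 is not quoted in the text and is left out.
air_gaps = {"D104": 0.496, "D106": 0.361, "D108": 1.107,
            "D110": 0.225, "D112": 1.072, "D114": 0.370}

longest = max(air_gaps, key=air_gaps.get)
assert longest == "D108"  # the lens-74-to-lens-75 gap is the longest
```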
- FIG. 4 is a table showing the aspheric data of the surfaces 71a to 77a and the surfaces 71b to 77b.
- the rows in the table in FIG. 4 correspond to surfaces 71a to 77a and surfaces 71b to 77b, respectively.
- the columns correspond, from the left, to the surface number i, the conic coefficient K, and the 4th-, 6th-, 8th-, 10th-, 12th-, 14th-, 16th-, 18th-, and 20th-order aspheric coefficients.
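Data of this kind (a conic coefficient K plus even-order coefficients up to the 20th order) is conventionally plugged into the standard even-asphere sag equation. The excerpt does not print the formula, so the form below is an assumption based on that convention, with c = 1/R the vertex curvature and h the radial height:

```python
import math

# Standard even-asphere sag (assumed form, not quoted in the patent):
#   z(h) = c*h^2 / (1 + sqrt(1 - (1+K)*c^2*h^2)) + sum_n A_n * h^n
def asphere_sag(h, R, K, coeffs):
    """coeffs maps even order n (4, 6, ..., 20) to its coefficient A_n."""
    c = 1.0 / R
    conic = c * h * h / (1.0 + math.sqrt(1.0 - (1.0 + K) * c * c * h * h))
    return conic + sum(A * h ** n for n, A in coeffs.items())

# With K = 0 and all aspheric terms zero, the sag reduces to a sphere's
# sag R - sqrt(R^2 - h^2); R = 2.83 matches the quoted radius of surface 71a.
s = asphere_sag(0.5, R=2.83, K=0.0, coeffs={})
```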
- the conic coefficient K of the surface 71a whose surface number i is 101 is -2.42803×10⁻¹.
- the 4th-, 6th-, 8th-, 10th-, 12th-, 14th-, and 16th-order aspheric coefficients are 3.62272×10⁻³, 3.65444×10⁻³, -3.36391×10⁻³, 2.41910×10⁻³, -9.31617×10⁻⁴, 1.91619×10⁻⁴, and -1.63364×10⁻⁵, respectively.
- the conic coefficient K of the surface 71b whose surface number i is 102 is 2.97650.
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -2.83049×10⁻³, 6.81582×10⁻³, -6.86480×10⁻³, 3.25922×10⁻³, and -3.50239×10⁻⁴, respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are -1.63020×10⁻⁴, -4.82494×10⁻⁵, 5.67182×10⁻⁵, and -1.00277×10⁻⁵, respectively.
- the conic coefficient K of the surface 72a whose surface number i is 103 is 2.61919.
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -2.21612×10⁻², 9.91095×10⁻³, -1.76651×10⁻², 1.61833×10⁻², and -9.24306×10⁻³, respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are 3.51387×10⁻³, -1.11137×10⁻³, 2.83264×10⁻⁴, and -3.85854×10⁻⁵, respectively.
- the conic coefficient K of the surface 72b whose surface number i is 104 is -1.94467×10⁻¹.
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -1.64596×10⁻³, 1.61079×10⁻², -2.84858×10⁻², 4.03268×10⁻², and -2.98809×10⁻², respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are 1.16734×10⁻², -1.18767×10⁻³, 6.24837×10⁻⁴, and 1.67035×10⁻⁴, respectively.
- the conic coefficient K of the surface 73a whose surface number i is 105 is -1.07370×10⁻².
- the 4th-, 6th-, 8th-, 10th-, 12th-, and 14th-order aspheric coefficients are -6.22065×10⁻³, 1.45641×10⁻³, -3.00430×10⁻³, 3.45145×10⁻³, -5.07824×10⁻⁴, and 2.27740×10⁻⁵, respectively.
- the conic coefficient K of the surface 73b whose surface number i is 106 is 1.01266×10⁻².
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -8.15811×10⁻³, -9.45658×10⁻³, 2.15704×10⁻², -3.18190×10⁻², and 3.53709×10⁻², respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are -2.33788×10⁻², 8.88652×10⁻³, 1.58650×10⁻³, and 5.25725×10⁻⁵, respectively.
- the conic coefficient K of the surface 74a whose surface number i is 107 is -8.54794×10.
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -1.05190×10, -1.79704×10, 2.92681×10, -3.16587×10, and 2.14221×10, respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are -7.63719×10⁻³, 1.03705×10⁻³, 1.54916×10⁻⁴, and -4.93285×10⁻⁵, respectively.
- the conic coefficient K of the surface 74b whose surface number i is 108 is 1.66484×10.
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -8.80471×10, -1.04782×10, -5.21460×10, 6.19538×10, and -4.35003×10, respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are 1.92361×10⁻³, -4.56399×10⁻⁴, 4.04193×10⁻⁵, and 2.88880×10⁻⁶, respectively.
- the conic coefficient K of the surface 75a whose surface number i is 109 is 1.37649×10.
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are 2.14908×10, -2.14981×10, 9.35939×10, -4.01620×10, and 1.02870×10, respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are -2.09912×10⁻⁴, -1.19385×10⁻⁵, 2.16807×10⁻⁵, and -3.86333×10⁻⁶, respectively.
- the conic coefficient K of the surface 75b whose surface number i is 110 is -3.16667×10.
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -1.79352×10, -4.62014×10, 6.22425×10, -3.48550×10, and 8.54536×10, respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are -9.92036×10⁻⁵, 4.79949×10⁻⁶, 9.85895×10⁻⁹, and -6.55434×10⁻⁹, respectively.
- the conic coefficient K of the surface 76a whose surface number i is 111 is -3.04052×10⁻¹.
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -5.33124×10⁻², 7.90951×10⁻³, -3.49156×10⁻³, 9.13596×10⁻⁴, and -1.36795×10⁻⁴, respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are 1.19822×10⁻⁵, -5.67900×10⁻⁷, 1.08192×10⁻⁸, and -4.01201×10⁻¹¹, respectively.
- the conic coefficient K of the surface 76b whose surface number i is 112 is -1.86569×10.
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are 3.15385×10, -1.87448×10, 4.44523×10, -6.72679×10, and 6.83777×10, respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are -4.49314×10⁻⁶, 1.65027×10⁻⁷, -1.95722×10⁻⁹, and -3.67513×10⁻¹¹, respectively.
- the conic coefficient K of the surface 77a whose surface number i is 113 is -7.89633.
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -8.88581×10⁻², 2.11957×10⁻², -2.73125×10⁻³, 2.00134×10⁻⁴, and -7.85631×10⁻⁶, respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are 1.09565×10⁻⁷, 2.89958×10⁻⁹, -1.28783×10⁻¹⁰, and 1.38563×10⁻¹², respectively.
- the conic coefficient K of the surface 77b whose surface number i is 114 is -7.39089.
- the 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -3.44999×10⁻², 6.33229×10⁻³, -8.14233×10⁻⁴, 6.58014×10⁻⁵, and -3.15114×10⁻⁶, respectively.
- the 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are 7.02794×10⁻⁸, 2.614142×10⁻¹⁰, -4.01785×10⁻¹¹, and 4.99232×10⁻¹³, respectively.
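The coefficients above define each surface through an aspheric sag equation; the equation itself and the radii of curvature are given elsewhere in the specification (the aspheric formula and the lens-data table of FIG. 3), so the sketch below assumes the common even-asphere form and uses an illustrative radius. It is only a rough illustration of how the quoted surface 71a data would be evaluated, not the specification's own computation.

```python
import math

def asphere_sag(h, R, K, coeffs):
    """Sag z(h) of an even asphere at radial height h.

    Conic base term with radius of curvature R and conic coefficient K,
    plus the even-order polynomial terms A_n * h**n (n = 4, 6, ..., 20).
    """
    c = 1.0 / R  # paraxial curvature
    z = c * h * h / (1.0 + math.sqrt(1.0 - (1.0 + K) * c * c * h * h))
    for n, a in coeffs.items():
        z += a * h ** n
    return z

# Surface 71a (surface number i = 101), with the conic coefficient and
# 4th- to 16th-order aspheric coefficients quoted above from FIG. 4.
K_101 = -2.42803e-1
A_101 = {4: 3.62272e-3, 6: 3.65444e-3, 8: -3.36391e-3, 10: 2.41910e-3,
         12: -9.31617e-4, 14: 1.91619e-4, 16: -1.63364e-5}

# R = 2.0 mm is an illustrative placeholder; the real radius of curvature
# R101 is listed in the lens-data table (FIG. 3), not in this excerpt.
sag_at_half_mm = asphere_sag(0.5, 2.0, K_101, A_101)
```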
- FIG. 5 is a diagram for explaining the effect of the surface distance D108 between the surface 74b and the surface 75a.
- the lens 76, the lens 77, and the IR cut filter 78 between the lens 75 and the imaging surface 31a are not shown.
- the light rays reaching the high image height on the imaging surface 31a can be spread by Δh in the direction away from the optical axis.
- the lens optical system 25 is therefore designed so that among the surface distances D102, D104, D106, D108, D110, and D112 between the opposing surfaces of each pair of adjacent lenses among lenses 71 to 77, surface distance D108 is the longest. This makes it possible to separate light rays that reach high image heights on the imaging surface 31a. As a result, even if the solid-state imaging element 21 is enlarged, coma aberration, field curvature aberration, and distortion aberration on the high image height side can be well corrected.
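The design rule just stated can be illustrated as a simple check that D108 is the largest of the six air gaps. The numeric gap values come from the lens-data table (FIG. 3), which is not reproduced in this excerpt, so the numbers below are illustrative placeholders only.

```python
# Air gaps between the opposing surfaces of adjacent lenses among lenses
# 71 to 77. Values are illustrative placeholders, not the FIG. 3 data.
gaps = {"D102": 0.05, "D104": 0.30, "D106": 0.25,
        "D108": 0.60, "D110": 0.10, "D112": 0.20}

# Design rule from the text: D108, the gap between surface 74b and
# surface 75a, must be the longest of these surface distances.
longest = max(gaps, key=gaps.get)
```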
- the lens optical system 25 also makes it possible to correct the paraxial light rays by giving the lenses 75 to 77 the shapes described above.
- FIG. 6 is a graph showing spherical aberration, field curvature, and distortion occurring in the lens optical system 25 of FIG. 2.
- A in FIG. 6 is a graph showing the longitudinal spherical aberration for each wavelength of light having wavelengths of 443.58 nm, 486.13 nm, 546.07 nm, 587.56 nm, and 656.27 nm that occurs in the lens optical system 25 in FIG. 2.
- the horizontal axis represents the spherical aberration [mm]
- the vertical axis represents the normalized pupil coordinate. This also applies to A in FIG. 10, A in FIG. 14, A in FIG. 18, and A in FIG. 22, which will be described later.
- FIG. 6B is a graph showing the field curvature that occurs in the lens optical system 25 of FIG. 2.
- the horizontal axis shows the field curvature [mm]
- the vertical axis shows the angle [degree] corresponding to the incident position of the light ray in the sagittal or tangential direction.
- the solid line shows the relationship between the incident position in the tangential direction and the field curvature
- the dotted line shows the relationship between the incident position in the sagittal direction and the field curvature.
- This also applies to FIG. 10B, FIG. 14B, FIG. 18B, and FIG. 22B, which will be described later.
- the difference between the field curvature in the sagittal direction and the tangential direction is astigmatism.
- FIG. 6C is a graph showing the distortion aberration of light with a wavelength of 587.56 nm that occurs in the lens optical system 25 of FIG. 2.
- the horizontal axis shows the distortion aberration [%]
- the vertical axis shows the angle of incidence of the light ray [degrees]. This also applies to FIG. 10C, FIG. 14C, FIG. 18C, and FIG. 22C, which will be described later.
- FIG. 7 is a cross-sectional view showing a second configuration example of the lens optical system 25.
- lens optical system 25 of FIG. 7 differs from the lens optical system 25 of FIG. 2 in that lenses 71 to 77 are replaced by lenses 171 to 177, and is otherwise configured in the same way as the lens optical system 25 of FIG. 2.
- Lenses 171 to 177 differ from lenses 71 to 77 in their lens data, the aspheric data of each surface, and in that lenses 173 and 176 have negative refractive power near the optical axis, but are otherwise configured in the same manner as lenses 71 to 77. Therefore, the following describes the lens data of lenses 171 to 177 and the aspheric data of object-side surfaces 171a to 177a and image-side surfaces 171b to 177b.
- FIG. 8 is a table showing lens data for lenses 171 to 177.
- the rows in the table in FIG. 8 correspond to surfaces 171a to 177a, surfaces 171b to 177b, and surfaces 78a and 78b. From the left, the columns correspond to the surface number i, the radius of curvature Ri, the surface spacing Di, the refractive index Ndi for the d line, and the Abbe number Vdi for the d line.
- the surfaces 171a, 171b, 172a, 172b, 173a, 173b, 174a, 174b, 175a, 175b, 176a, 176b, 177a, and 177b are assigned surface numbers from 201 to 214, in that order. See FIG. 8 for the values in each column of the table in FIG. 8.
- among the surface distances between the opposing surfaces of adjacent lenses, the surface distance D208 is the longest.
- FIG. 9 is a table showing the aspheric surface data of the surfaces 171a to 177a and the surfaces 171b to 177b.
- Each row in the table in FIG. 9 corresponds to surfaces 171a to 177a and surfaces 171b to 177b. From the left, each column corresponds to the surface number i, the conic coefficient K, the 4th order aspheric coefficient, the 6th order aspheric coefficient, the 8th order aspheric coefficient, the 10th order aspheric coefficient, the 12th order aspheric coefficient, the 14th order aspheric coefficient, the 16th order aspheric coefficient, the 18th order aspheric coefficient, and the 20th order aspheric coefficient.
- FIG. 10 is a graph showing the spherical aberration, the field curvature, and the distortion that occur in the lens optical system 25 of FIG. 7.
- A in FIG. 10 is a graph showing the longitudinal spherical aberration for each wavelength of light having wavelengths of 443.58 nm, 486.13 nm, 546.07 nm, 587.56 nm, and 656.27 nm that occurs in the lens optical system 25 in FIG. 7.
- B in FIG. 10 is a graph showing the field curvature that occurs in the lens optical system 25 in FIG. 7.
- C in FIG. 10 is a graph showing the distortion aberration for light having a wavelength of 587.56 nm that occurs in the lens optical system 25 in FIG. 7.
- FIG. 11 is a cross-sectional view showing a third configuration example of the lens optical system 25.
- lens optical system 25 of FIG. 11 differs from the lens optical system 25 of FIG. 2 in that lenses 71 to 77 are replaced by lenses 271 to 277, and is otherwise configured in the same way as the lens optical system 25 of FIG. 2.
- Lenses 271 to 277 differ from lenses 71 to 77 in their lens data, the aspheric data of each surface, and in that lens 275 has positive refractive power near the optical axis and lens 276 has negative refractive power near the optical axis, but are otherwise configured in the same manner as lenses 71 to 77. Therefore, the following describes the lens data of lenses 271 to 277 and the aspheric data of object-side surfaces 271a to 277a and image-side surfaces 271b to 277b.
- FIG. 12 is a table showing lens data for lenses 271 to 277.
- the rows in the table in FIG. 12 correspond to surfaces 271a to 277a, surfaces 271b to 277b, and surfaces 78a and 78b. From the left, the columns correspond to the surface number i, the radius of curvature Ri, the surface spacing Di, the refractive index Ndi for the d line, and the Abbe number Vdi for the d line.
- the surfaces 271a, 271b, 272a, 272b, 273a, 273b, 274a, 274b, 275a, 275b, 276a, 276b, 277a, and 277b are assigned surface numbers from 301 to 314, in that order. See FIG. 12 for the values in each column of the table in FIG. 12.
- among the surface distances between the opposing surfaces of adjacent lenses, the surface distance D308 is the longest.
- FIG. 13 is a table showing the aspheric surface data of the surfaces 271a to 277a and the surfaces 271b to 277b.
- Each row in the table in FIG. 13 corresponds to surfaces 271a to 277a and surfaces 271b to 277b. From the left, each column corresponds to the surface number i, conic coefficient K, 4th order aspheric coefficient, 6th order aspheric coefficient, 8th order aspheric coefficient, 10th order aspheric coefficient, 12th order aspheric coefficient, 14th order aspheric coefficient, 16th order aspheric coefficient, 18th order aspheric coefficient, and 20th order aspheric coefficient.
- FIG. 14 is a graph showing the spherical aberration, the field curvature, and the distortion that occur in the lens optical system 25 of FIG. 11.
- A in FIG. 14 is a graph showing the longitudinal spherical aberration for each wavelength of light having wavelengths of 443.58 nm, 486.13 nm, 546.07 nm, 587.56 nm, and 656.27 nm that occurs in the lens optical system 25 in FIG. 11.
- B in FIG. 14 is a graph showing the field curvature that occurs in the lens optical system 25 in FIG. 11.
- C in FIG. 14 is a graph showing the distortion aberration for light having a wavelength of 587.56 nm that occurs in the lens optical system 25 in FIG. 11.
- FIG. 15 is a cross-sectional view showing a fourth configuration example of the lens optical system 25.
- lens optical system 25 of FIG. 15 differs from the lens optical system 25 of FIG. 2 in that lenses 71 to 77 are replaced by lenses 371 to 377, and is otherwise configured in the same way as the lens optical system 25 of FIG. 2.
- Lenses 371 to 377 differ from lenses 71 to 77 in their lens data, the aspheric data of each surface, and in that lens 376 has negative refractive power near the optical axis, but are otherwise configured in the same manner as lenses 71 to 77. Therefore, the following describes the lens data of lenses 371 to 377 and the aspheric data of object-side surfaces 371a to 377a and image-side surfaces 371b to 377b.
- FIG. 16 is a table showing lens data for lenses 371 to 377.
- the rows in the table in FIG. 16 correspond to surfaces 371a to 377a, surfaces 371b to 377b, and surfaces 78a and 78b. From the left, the columns correspond to the surface number i, the radius of curvature Ri, the surface spacing Di, the refractive index Ndi for the d line, and the Abbe number Vdi for the d line.
- surface numbers from 401 to 414 are assigned to surfaces 371a, 371b, 372a, 372b, 373a, 373b, 374a, 374b, 375a, 375b, 376a, 376b, 377a, and 377b in order. See FIG. 16 for the values in each column of the table in FIG. 16.
- among the surface distances between the opposing surfaces of adjacent lenses, the surface distance D408 is the longest.
- FIG. 17 is a table showing the aspheric surface data of the surfaces 371a to 377a and the surfaces 371b to 377b.
- Each row in the table in FIG. 17 corresponds to surfaces 371a to 377a and surfaces 371b to 377b. From the left, each column corresponds to the surface number i, conic coefficient K, 4th order aspheric coefficient, 6th order aspheric coefficient, 8th order aspheric coefficient, 10th order aspheric coefficient, 12th order aspheric coefficient, 14th order aspheric coefficient, 16th order aspheric coefficient, 18th order aspheric coefficient, and 20th order aspheric coefficient.
- FIG. 18 is a graph showing spherical aberration, field curvature, and distortion occurring in the lens optical system 25 of FIG. 15.
- A in FIG. 18 is a graph showing the longitudinal spherical aberration for each wavelength of light having wavelengths of 443.58 nm, 486.13 nm, 546.07 nm, 587.56 nm, and 656.27 nm that occurs in the lens optical system 25 in FIG. 15.
- B in FIG. 18 is a graph showing the field curvature that occurs in the lens optical system 25 in FIG. 15.
- C in FIG. 18 is a graph showing the distortion aberration for light having a wavelength of 587.56 nm that occurs in the lens optical system 25 in FIG. 15.
- FIG. 19 is a cross-sectional view showing a fifth configuration example of the lens optical system 25.
- lens optical system 25 of FIG. 19 differs from the lens optical system 25 of FIG. 2 in that lenses 71 to 77 are replaced by lenses 471 to 477, and is otherwise configured in the same way as the lens optical system 25 of FIG. 2.
- Lenses 471 to 477 differ from lenses 71 to 77 in their lens data and the aspheric data of each surface, but are otherwise configured in the same manner as lenses 71 to 77. Therefore, the following describes the lens data of lenses 471 to 477 and the aspheric data of object-side surfaces 471a to 477a and image-side surfaces 471b to 477b.
- FIG. 20 is a table showing lens data for lenses 471 to 477.
- the rows in the table in FIG. 20 correspond to surfaces 471a to 477a, surfaces 471b to 477b, and surfaces 78a and 78b. From the left, the columns correspond to the surface number i, the radius of curvature Ri, the surface spacing Di, the refractive index Ndi for the d line, and the Abbe number Vdi for the d line.
- surface numbers from 501 to 514 are assigned to surfaces 471a, 471b, 472a, 472b, 473a, 473b, 474a, 474b, 475a, 475b, 476a, 476b, 477a, and 477b in order. See FIG. 20 for the values in each column of the table in FIG. 20.
- among the surface distances between the opposing surfaces of adjacent lenses, the surface distance D508 is the longest.
- FIG. 21 is a table showing the aspheric surface data of the surfaces 471a to 477a and the surfaces 471b to 477b.
- Each row in the table in FIG. 21 corresponds to surfaces 471a to 477a and surfaces 471b to 477b. From the left, each column corresponds to the surface number i, conic coefficient K, 4th order aspheric coefficient, 6th order aspheric coefficient, 8th order aspheric coefficient, 10th order aspheric coefficient, 12th order aspheric coefficient, 14th order aspheric coefficient, 16th order aspheric coefficient, 18th order aspheric coefficient, and 20th order aspheric coefficient.
- FIG. 22 is a graph showing the spherical aberration, the field curvature, and the distortion that occur in the lens optical system 25 of FIG. 19.
- A in FIG. 22 is a graph showing the longitudinal spherical aberration for each wavelength of light having wavelengths of 443.58 nm, 486.13 nm, 546.07 nm, 587.56 nm, and 656.27 nm that occurs in the lens optical system 25 of FIG. 19.
- B in FIG. 22 is a graph showing the field curvature that occurs in the lens optical system 25 of FIG. 19.
- C in FIG. 22 is a graph showing the distortion aberration for light having a wavelength of 587.56 nm that occurs in the lens optical system 25 of FIG. 19.
- FIG. 23 is a table showing values of parameters or conditional expressions for the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19.
- the rows in the table in FIG. 23 correspond, from top to bottom, to TTL/(D6+D8), (R2+R1)/(R2-R1), (R4+R3)/(R4-R3), (R8+R7)/(R8-R7), V1, V2,
- the columns correspond, from left to right, to the lens optical systems 25 in FIGS. 2, 7, 11, 15, and 19.
- R1 to R4 are general terms for radii of curvature Rj01 to Rj04, respectively.
- R7 and R8 are general terms for radii of curvature Rj07 and Rj08, respectively.
- V1 and V2 are general terms for Abbe numbers Vj01 and Vj02, respectively.
- f is the focal length of the entire lens optical system 25.
- f1 is a general term for the focal lengths of lenses 71, 171, 271, 371, and 471.
- f3 is a general term for the focal lengths of lenses 73, 173, 273, 373, and 473.
- CRA is the maximum angle between the chief ray of light incident on the imaging surface 31a from the lens optical system 25 and the imaging surface 31a.
- IH is the distance from the center of the imaging surface 31a to the position where the chief ray of the light with the maximum angle of view reaches, i.e., the maximum image height in the lens optical system 25.
- TTL/(D6+D8) of the lens optical systems 25 of FIG. 2, FIG. 7, FIG. 11, FIG. 15, and FIG. 19 are 5.73, 6.33, 6.19, 5.79, and 6.46, respectively. Therefore, the lens optical systems 25 of FIG. 2, FIG. 7, FIG. 11, FIG. 15, and FIG. 19 satisfy the following conditional expression (1).
- If the upper limit of conditional expression (1) is exceeded, the total length TTL becomes long relative to the distance between lens 73 (173, 273, 373, 473) and lens 75 (175, 275, 375, 475), making it difficult to miniaturize the lens optical system 25.
- If the lower limit of conditional expression (1) is exceeded, the total length TTL becomes short relative to the distance between lens 73 (173, 273, 373, 473) and lens 75 (175, 275, 375, 475). This makes it difficult to appropriately correct the exit angle of the light beam with lens 74 (174, 274, 374, 474). As a result, it becomes difficult to increase the size of the solid-state imaging element 21.
- If the upper limit of conditional expression (2) is exceeded, the power (refractive power) of lens 71 (171, 271, 371, 471) becomes weak and the total length TTL becomes long, making it difficult to miniaturize the lens optical system 25. If the lower limit of conditional expression (2) is exceeded, the power of lens 71 (171, 271, 371, 471) becomes strong and spherical aberration worsens.
- If the upper limit of conditional expression (4) is exceeded, the power of lens 72 (172, 272, 372, 472) becomes strong, making it difficult to correct coma and astigmatism. If the lower limit of conditional expression (4) is exceeded, the power of lens 72 (172, 272, 372, 472) becomes weak, making the exit angle of the light ray relative to the optical axis gentle (small) and making it difficult to increase the size of the solid-state imaging element 21.
- the Abbe numbers V1 of the lens optical systems 25 of Figures 2, 7, 11, 15, and 19 are 71.68, 71.68, 64.92, 81.00, and 71.68, respectively. Therefore, the lens optical systems 25 of Figures 2, 7, 11, 15, and 19 satisfy the following conditional expression (5).
- If the upper limit of conditional expression (5) is exceeded, the refractive index of lens 71 (171, 271, 371, 471) becomes low, making it difficult to correct spherical aberration. If the lower limit of conditional expression (5) is exceeded, it becomes difficult to correct axial chromatic aberration.
- the Abbe numbers V2 of the lens optical systems 25 of FIG. 2, FIG. 7, FIG. 11, FIG. 15, and FIG. 19 are 19.32, 19.32, 17.67, 19.32, and 19.32, respectively. Therefore, the lens optical systems 25 of FIG. 2, FIG. 7, FIG. 11, FIG. 15, and FIG. 19 satisfy the following conditional expression (6).
- If the upper limit of conditional expression (6) is exceeded, the refractive index of lens 72 (172, 272, 372, 472) becomes low, making it difficult to correct spherical aberration. If the lower limit of conditional expression (6) is exceeded, it becomes difficult to correct axial chromatic aberration.
- If the upper limit of conditional expression (7) is exceeded, the focal length f3 becomes long, making it difficult to correct the exit angle of the light beam. If the lower limit of conditional expression (7) is exceeded, the focal length f3 becomes short, making it difficult to correct various aberrations.
- f1/f of the lens optical systems 25 of FIG. 2, FIG. 7, FIG. 11, FIG. 15, and FIG. 19 is 0.92, 0.87, 0.90, 1.03, and 0.89, respectively. Therefore, the lens optical systems 25 of FIG. 2, FIG. 7, FIG. 11, FIG. 15, and FIG. 19 satisfy the following conditional expression (8).
- the CRA of the lens optical systems 25 in FIG. 2, FIG. 7, FIG. 11, FIG. 15, and FIG. 19 is 39.30 in every case. Therefore, the lens optical systems 25 in FIG. 2, FIG. 7, FIG. 11, FIG. 15, and FIG. 19 satisfy the following conditional expression (9).
- TTL/IH of the lens optical systems 25 of FIG. 2, FIG. 7, FIG. 11, FIG. 15, and FIG. 19 is 1.02, 1.03, 1.03, 1.03, and 1.11, respectively. Therefore, the lens optical systems 25 of FIG. 2, FIG. 7, FIG. 11, FIG. 15, and FIG. 19 satisfy the following conditional expression (10).
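The per-configuration values quoted above for conditional expressions (1), (5), (6), (8), (9), and (10) can be collected and range-checked as sketched below. The actual upper and lower limits of each conditional expression are stated in the full specification, not in this excerpt, so the bounds passed to `satisfies` here are placeholders for illustration only.

```python
# Values quoted above for the five configuration examples
# (FIGS. 2, 7, 11, 15, and 19), in that order.
values = {
    "TTL/(D6+D8)": [5.73, 6.33, 6.19, 5.79, 6.46],       # expression (1)
    "V1":          [71.68, 71.68, 64.92, 81.00, 71.68],  # expression (5)
    "V2":          [19.32, 19.32, 17.67, 19.32, 19.32],  # expression (6)
    "f1/f":        [0.92, 0.87, 0.90, 1.03, 0.89],       # expression (8)
    "CRA":         [39.30, 39.30, 39.30, 39.30, 39.30],  # expression (9)
    "TTL/IH":      [1.02, 1.03, 1.03, 1.03, 1.11],       # expression (10)
}

def satisfies(vals, lower, upper):
    """True when every configuration's value lies strictly between the limits."""
    return all(lower < v < upper for v in vals)

# Placeholder bounds, not the specification's actual limits.
ok = satisfies(values["TTL/(D6+D8)"], 5.0, 7.0)
```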
- the lens optical system 25 includes, in order from the object side to the image side, lenses 71 (171, 271, 371, 471) to 77 (177, 277, 377, 477).
- Lens 71 (171, 271, 371, 471) has positive refractive power near the optical axis and has a convex meniscus shape toward the object side.
- Lens 72 (172, 272, 372, 472) has negative refractive power near the optical axis and has a convex meniscus shape toward the object side.
- Lens 73 (173, 273, 373, 473) has refractive power near the optical axis.
- Lens 74 (174, 274, 374, 474) has positive refractive power near the optical axis and has a convex meniscus shape toward the image side.
- Lens 75 (175, 275, 375, 475) has refractive power near the optical axis and has a meniscus shape convex toward the image side.
- Lens 76 (176, 276, 376, 476) has refractive power near the optical axis and has a meniscus shape convex toward the object side.
- Lens 77 (177, 277, 377, 477) has negative refractive power near the optical axis and has a meniscus shape convex toward the object side.
- among the surface distances between the opposing surfaces of adjacent lenses, the surface distance D108 (D208, D308, D408, D508) is the longest.
- the values of the lens data and aspheric data in the lens optical system 25 are not limited to the values described above.
- the imaging device 10 described above can be applied to various electronic devices, such as digital still cameras, digital video cameras, and mobile devices such as mobile phones and smartphones equipped with an imaging function.
- FIG. 24 is a block diagram showing an example of the hardware configuration of a smartphone as an electronic device to which this technology is applied.
- a CPU (Central Processing Unit), a ROM (Read Only Memory), and a RAM (Random Access Memory) are connected to one another via a bus 1004.
- An input/output interface 1005 is further connected to the bus 1004.
- An imaging unit 1006, an input unit 1007, an output unit 1008, and a communication unit 1009 are connected to the input/output interface 1005.
- the imaging unit 1006 is composed of the imaging device 10 described above, etc.
- the imaging unit 1006 captures an image of a subject and obtains an image. This image is stored in the RAM 1003 and/or displayed on the output unit 1008.
- the input unit 1007 is composed of a touchpad, which is a position input device constituting a touch panel, a microphone, etc.
- the output unit 1008 is composed of a liquid crystal panel constituting a touch panel, a speaker, etc.
- the communication unit 1009 is composed of a network interface, etc.
- the imaging device 10 described above is used as the imaging unit 1006.
- FIG. 25 is a diagram showing an example of using the imaging device 10 described above.
- the imaging device 10 described above can be used in various cases to sense light, such as visible light, infrared light, ultraviolet light, and X-rays, for example, as follows:
- - Devices that take images for viewing such as digital cameras and mobile devices with camera functions
- - Devices for traffic purposes such as in-vehicle sensors that take images of the front and rear of a car, the surroundings, and the interior of the car for safe driving such as automatic stopping and for recognizing the driver's state, surveillance cameras that monitor moving vehicles and roads, and distance measuring sensors that measure the distance between vehicles, etc.
- - Devices for home appliances such as TVs, refrigerators, and air conditioners that take images of users' gestures and operate devices in accordance with those gestures
- - Devices for medical and healthcare purposes such as endoscopes and devices that take images of blood vessels by receiving infrared light
- - Devices for security purposes such as surveillance cameras for crime prevention and cameras for person authentication
- - Devices for beauty purposes such as skin measuring devices that take images of the skin and microscopes that take images of the scalp
- - Devices for sports purposes such as action cameras and wearable cameras for sports purposes, etc.
- - Devices for agricultural purposes such as cameras for monitoring the condition of fields and crops
- the technology according to the present disclosure (the present technology) can be applied to various products.
- the technology according to the present disclosure may be applied to an endoscopic surgery system.
- FIG. 26 is a diagram showing an example of the general configuration of an endoscopic surgery system to which the technology disclosed herein (the present technology) can be applied.
- an operator (doctor) 11131 is shown using an endoscopic surgery system 11000 to perform surgery on a patient 11132 on a patient bed 11133.
- the endoscopic surgery system 11000 is composed of an endoscope 11100, other surgical tools 11110 such as an insufflation tube 11111 and an energy treatment tool 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
- the endoscope 11100 is composed of a lens barrel 11101, the tip of which is inserted into the body cavity of the patient 11132 at a predetermined length, and a camera head 11102 connected to the base end of the lens barrel 11101.
- the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may also be configured as a so-called flexible scope having a flexible lens barrel.
- the tip of the lens barrel 11101 has an opening into which an objective lens is fitted.
- a light source device 11203 is connected to the endoscope 11100, and light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101, and is irradiated via the objective lens toward an object to be observed inside the body cavity of the patient 11132.
- the endoscope 11100 may be a direct-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
- An optical system and an image sensor are provided inside the camera head 11102, and the reflected light (observation light) from the object of observation is focused on the image sensor by the optical system.
- the observation light is photoelectrically converted by the image sensor to generate an electrical signal corresponding to the observation light, i.e., an image signal corresponding to the observed image.
- the image signal is sent to the camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
- the CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), etc., and controls the overall operation of the endoscope 11100 and the display device 11202. Furthermore, the CCU 11201 receives an image signal from the camera head 11102, and performs various image processing on the image signal, such as development processing (demosaic processing), in order to display an image based on the image signal.
- the display device 11202, under the control of the CCU 11201, displays an image based on the image signal that has been subjected to image processing by the CCU 11201.
- the light source device 11203 is composed of a light source such as an LED (Light Emitting Diode) and supplies irradiation light to the endoscope 11100 when photographing the surgical site, etc.
- the input device 11204 is an input interface for the endoscopic surgery system 11000.
- a user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
- for example, the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) of the endoscope 11100.
- the treatment tool control device 11205 controls the operation of the energy treatment tool 11112 for cauterizing tissue, incising, sealing blood vessels, etc.
- the insufflation device 11206 sends gas into the body cavity of the patient 11132 via the insufflation tube 11111 to inflate the body cavity in order to ensure a clear field of view for the endoscope 11100 and to ensure a working space for the surgeon.
- the recorder 11207 is a device capable of recording various types of information related to the surgery.
- the printer 11208 is a device capable of printing various types of information related to the surgery in various formats such as text, images, or graphs.
- the light source device 11203 that supplies illumination light to the endoscope 11100 when photographing the surgical site can be composed of a white light source composed of, for example, an LED, a laser light source, or a combination of these.
- when the white light source is configured as a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high precision, so the white balance of the captured image can be adjusted in the light source device 11203.
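As a rough sketch of such white balance adjustment, per-color gains that neutralize a gray reference patch can be derived from the captured channel averages; normalizing to the green channel is an assumed convention for illustration, not a detail taken from this disclosure.

```python
def white_balance_gains(r_avg, g_avg, b_avg):
    """Derive per-color laser output gains that neutralize a gray patch.

    Using the green channel as the reference (an assumed convention), the
    red and blue laser intensities are scaled so that an achromatic
    subject yields equal RGB averages in the captured image.
    """
    return g_avg / r_avg, 1.0, g_avg / b_avg
```

For example, averages of (200, 100, 50) on a gray patch would call for halving the red output and doubling the blue output.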
- the light source device 11203 may be controlled to change the intensity of the light it outputs at predetermined time intervals.
- the image sensor of the camera head 11102 may be controlled to acquire images in a time-division manner in synchronization with the timing of the change in the light intensity, and the images may be synthesized to generate an image with a high dynamic range that is free of so-called blackout and whiteout.
- the light source device 11203 may be configured to supply light of a predetermined wavelength band corresponding to special light observation.
- special light observation for example, by utilizing the wavelength dependency of light absorption in body tissue, a narrow band of light is irradiated compared to the light irradiated during normal observation (i.e., white light), and a specific tissue such as blood vessels on the surface of the mucosa is photographed with high contrast, so-called narrow band imaging is performed.
- fluorescence observation may be performed in which an image is obtained by fluorescence generated by irradiating excitation light.
- excitation light is irradiated to body tissue and fluorescence from the body tissue is observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) is locally injected into the body tissue and excitation light corresponding to the fluorescence wavelength of the reagent is irradiated to the body tissue to obtain a fluorescent image.
- the light source device 11203 may be configured to supply narrow band light and/or excitation light corresponding to such special light observation.
- FIG. 27 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG. 26.
- the camera head 11102 has a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405.
- the CCU 11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413.
- the camera head 11102 and the CCU 11201 are connected to each other via a transmission cable 11400 so that they can communicate with each other.
- the lens unit 11401 is an optical system provided at the connection with the lens barrel 11101. Observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
- the lens unit 11401 is composed of a combination of multiple lenses including a zoom lens and a focus lens.
- the imaging unit 11402 is composed of an imaging element.
- the imaging element constituting the imaging unit 11402 may be one (so-called single-plate type) or multiple (so-called multi-plate type).
- each imaging element may generate an image signal corresponding to each of RGB, and a color image may be obtained by combining these.
- the imaging unit 11402 may be configured to have a pair of imaging elements for acquiring image signals for the right eye and the left eye corresponding to 3D (three-dimensional) display. By performing 3D display, the surgeon 11131 can more accurately grasp the depth of the biological tissue in the surgical site.
- the imaging unit 11402 does not necessarily have to be provided in the camera head 11102.
- the imaging unit 11402 may be provided inside the lens barrel 11101, immediately after the objective lens.
- the driving unit 11403 is composed of an actuator, and moves the zoom lens and focus lens of the lens unit 11401 a predetermined distance along the optical axis under the control of the camera head control unit 11405. This allows the magnification and focus of the image captured by the imaging unit 11402 to be adjusted appropriately.
- the communication unit 11404 is configured with a communication device for transmitting and receiving various information to and from the CCU 11201.
- the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
- the communication unit 11404 also receives control signals for controlling the operation of the camera head 11102 from the CCU 11201, and supplies them to the camera head control unit 11405.
- the control signals include information on the imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value during imaging, and/or information specifying the magnification and focus of the captured image.
- the above-mentioned frame rate, exposure value, magnification, focus, and other imaging conditions may be appropriately specified by the user, or may be automatically set by the control unit 11413 of the CCU 11201 based on the acquired image signal.
- the endoscope 11100 is thereby equipped with so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions.
- the camera head control unit 11405 controls the operation of the camera head 11102 based on a control signal from the CCU 11201 received via the communication unit 11404.
- the communication unit 11411 is configured with a communication device for transmitting and receiving various information to and from the camera head 11102.
- the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
- the communication unit 11411 also transmits to the camera head 11102 a control signal for controlling the operation of the camera head 11102.
- the image signal and the control signal can be transmitted by electrical communication, optical communication, etc.
- the image processing unit 11412 performs various image processing operations on the image signal, which is the RAW data transmitted from the camera head 11102.
- the control unit 11413 performs various controls related to the imaging of the surgical site, etc. by the endoscope 11100, and the display of the captured images obtained by imaging the surgical site, etc. For example, the control unit 11413 generates a control signal for controlling the driving of the camera head 11102.
- the control unit 11413 also causes the display device 11202 to display the captured image showing the surgical site, etc., based on the image signal that has been image-processed by the image processing unit 11412. At this time, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist generated when the energy treatment tool 11112 is used, etc., by detecting the shape and color of the edges of objects included in the captured image. When the control unit 11413 causes the display device 11202 to display the captured image, it may use the recognition result to superimpose various types of surgical support information on the image of the surgical site. By superimposing the surgical support information and presenting it to the surgeon 11131, the burden on the surgeon 11131 can be reduced and the surgeon 11131 can proceed with the surgery reliably.
- the transmission cable 11400 that connects the camera head 11102 and the CCU 11201 is an electrical signal cable that supports electrical signal communication, an optical fiber that supports optical communication, or a composite cable of these.
- communication is performed wired using a transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may also be performed wirelessly.
- the above describes an example of an endoscopic surgery system to which the technology disclosed herein can be applied.
- the technology disclosed herein can be applied to the lens unit 11401, the imaging unit 11402, and the like, among the configurations described above.
- the imaging device 10 described above can be applied to the lens unit 11401, the imaging unit 11402, and the drive unit 11403.
- by applying the technology disclosed herein to the lens unit 11401 and the imaging unit 11402, it is possible to achieve, in the lens optical system 25, high optical performance corresponding to a large, high-pixel solid-state imaging element 21.
- a high-pixel, high-quality image of the surgical site allows, for example, the surgeon to reliably confirm the surgical site.
- the technology according to the present disclosure can be applied to various products.
- the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
- FIG. 28 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile object control system to which the technology disclosed herein can be applied.
- the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
- the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an inside vehicle information detection unit 12040, and an integrated control unit 12050.
- Also shown as functional components of the integrated control unit 12050 are a microcomputer 12051, an audio/video output unit 12052, and an in-vehicle network I/F (interface) 12053.
- the drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs.
- the drive system control unit 12010 functions as a control device for a drive force generating device for generating the drive force of the vehicle, such as an internal combustion engine or a drive motor, a drive force transmission mechanism for transmitting the drive force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating a braking force for the vehicle.
- the body system control unit 12020 controls the operation of various devices installed in the vehicle body according to various programs.
- the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, tail lamps, brake lamps, turn signals, and fog lamps.
- for example, radio waves transmitted from a portable device that replaces a key, or signals from various switches, can be input to the body system control unit 12020.
- the body system control unit 12020 accepts the input of these radio waves or signals and controls the vehicle's door lock device, power window device, lamps, etc.
- the outside-vehicle information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000.
- the imaging unit 12031 is connected to the outside-vehicle information detection unit 12030.
- the outside-vehicle information detection unit 12030 causes the imaging unit 12031 to capture images outside the vehicle and receives the captured images.
- the outside-vehicle information detection unit 12030 may perform object detection processing or distance detection processing for people, cars, obstacles, signs, or characters on the road surface based on the received images.
- the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of light received.
- the imaging unit 12031 can output the electrical signal as an image, or as distance measurement information.
- the light received by the imaging unit 12031 may be visible light, or may be invisible light such as infrared light.
- the in-vehicle information detection unit 12040 detects information inside the vehicle.
- for example, a driver state detection unit 12041 that detects the state of the driver is connected to the in-vehicle information detection unit 12040.
- the driver state detection unit 12041 includes, for example, a camera that captures an image of the driver, and the in-vehicle information detection unit 12040 may calculate the driver's degree of fatigue or concentration based on the detection information input from the driver state detection unit 12041, or may determine whether the driver is dozing off.
- the microcomputer 12051 can calculate the control target values of the driving force generating device, steering mechanism, or braking device based on the information inside and outside the vehicle acquired by the outside vehicle information detection unit 12030 or the inside vehicle information detection unit 12040, and output a control command to the drive system control unit 12010.
- the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, following driving based on the distance between vehicles, maintaining vehicle speed, vehicle collision warning, or vehicle lane departure warning.
- the microcomputer 12051 can also control the driving force generating device, steering mechanism, braking device, etc. based on information about the surroundings of the vehicle acquired by the outside vehicle information detection unit 12030 or the inside vehicle information detection unit 12040, thereby performing cooperative control aimed at automatic driving, which allows the vehicle to travel autonomously without relying on the driver's operation.
- the microcomputer 12051 can also output control commands to the body system control unit 12020 based on information outside the vehicle acquired by the outside-vehicle information detection unit 12030. For example, the microcomputer 12051 can control the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the outside-vehicle information detection unit 12030, and perform cooperative control aimed at preventing glare, such as switching high beams to low beams.
- the audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the occupants of the vehicle or people outside the vehicle of information.
- an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices.
- the display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
- FIG. 29 shows an example of the installation position of the imaging unit 12031.
- the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
- the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at the front nose, side mirrors, rear bumper, back door, and the top of the windshield inside the vehicle cabin of the vehicle 12100.
- the imaging unit 12101 provided at the front nose and the imaging unit 12105 provided at the top of the windshield inside the vehicle cabin mainly acquire images of the front of the vehicle 12100.
- the imaging units 12102 and 12103 provided at the side mirrors mainly acquire images of the sides of the vehicle 12100.
- the imaging unit 12104 provided at the rear bumper or back door mainly acquires images of the rear of the vehicle 12100.
- the images of the front acquired by the imaging units 12101 and 12105 are mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, etc.
- FIG. 29 shows an example of the imaging ranges of the imaging units 12101 to 12104.
- Imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose
- imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively
- imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door.
- an overhead image of the vehicle 12100 viewed from above is obtained by superimposing the image data captured by the imaging units 12101 to 12104.
- At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
- at least one of the imaging units 12101 to 12104 may be a stereo camera consisting of multiple imaging elements, or an imaging element having pixels for detecting phase differences.
- the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the change in this distance over time (relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can extract as a preceding vehicle, in particular, the closest three-dimensional object on the path of the vehicle 12100 that is traveling in approximately the same direction as the vehicle 12100 at a predetermined speed (e.g., 0 km/h or faster). Furthermore, the microcomputer 12051 can set in advance the inter-vehicle distance to be maintained from the preceding vehicle, and perform automatic braking control (including follow-up stop control) and automatic acceleration control (including follow-up start control). In this way, cooperative control can be performed for the purpose of automatic driving, in which the vehicle travels autonomously without relying on the driver's operation.
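As an illustrative sketch of the preceding-vehicle extraction described above, the relative speed can be derived from two successive distance samples and the closest qualifying on-path object selected. The data layout, units, and threshold handling here are assumptions for illustration, not details of the microcomputer 12051.

```python
def pick_preceding_vehicle(tracks, ego_speed, dt, min_speed=0.0):
    """Select the preceding vehicle among tracked solid objects.

    tracks: list of (on_path, d_prev, d_now) tuples, where d_prev/d_now
    are distances in meters measured dt seconds apart.  The relative speed
    is the distance change over time (negative while closing in), so the
    object's own speed along the lane is ego_speed + relative speed.
    Among on-path objects moving in roughly the same direction at
    min_speed or faster, the closest is returned as (distance, speed),
    or None when there is no candidate.
    """
    best = None
    for on_path, d_prev, d_now in tracks:
        rel_speed = (d_now - d_prev) / dt    # m/s, > 0 when pulling away
        obj_speed = ego_speed + rel_speed    # m/s along the lane
        if on_path and obj_speed >= min_speed:
            if best is None or d_now < best[0]:
                best = (d_now, obj_speed)
    return best
```

An oncoming object (large negative relative speed) fails the same-direction speed test and is never selected, matching the 0 km/h-or-faster criterion above.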
- the microcomputer 12051 classifies three-dimensional object data into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, and other three-dimensional objects such as utility poles, based on the distance information obtained from the imaging units 12101 to 12104, extracts the data, and can use it to automatically avoid obstacles.
- the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see.
- the microcomputer 12051 determines the collision risk, which indicates the risk of collision with each obstacle, and when the collision risk is equal to or exceeds a set value and there is a possibility of a collision, it can provide driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by forcibly decelerating or steering the vehicle to avoid a collision via the drive system control unit 12010.
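One simple way to realize a collision-risk index of the kind described is the inverse of time-to-collision (TTC), so that a larger value means a nearer collision. Both the formula and the triggering threshold below are illustrative assumptions, not the disclosure's actual criterion.

```python
def collision_risk(distance, closing_speed, risk_threshold=1.0):
    """Grade collision risk as an inverse time-to-collision (TTC) index.

    closing_speed: rate at which the gap shrinks (m/s, positive when
    approaching).  Returns (risk, alarm): risk is 1/TTC, and alarm is True
    when risk reaches risk_threshold (here, TTC of 1 s or less; the
    threshold value is an illustrative choice).
    """
    if closing_speed <= 0:            # gap steady or widening: no risk
        return 0.0, False
    ttc = distance / closing_speed    # seconds until the gap closes
    risk = 1.0 / ttc
    return risk, risk >= risk_threshold
```

With this rule, a 10 m gap closing at 20 m/s (TTC 0.5 s) would trigger the alarm, while a 100 m gap closing at 5 m/s would not.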
- At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
- the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured image of the imaging units 12101 to 12104. The recognition of such a pedestrian is performed, for example, by a procedure of extracting feature points in the captured image of the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points that indicate the contour of an object to determine whether or not it is a pedestrian.
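The two-step procedure above (feature-point extraction, then pattern matching on the series of contour points) can be sketched with a toy matcher. The translation-normalized overlap score and the threshold are illustrative stand-ins for whatever matching the system actually performs.

```python
def match_contour(candidate, template, tolerance=0.8):
    """Decide whether a set of contour feature points matches a template.

    candidate / template: sets of (x, y) feature points.  Both sets are
    shifted so their top-left corner sits at the origin, making the score
    translation-invariant; the score is then the fraction of template
    points that have a candidate point at the same position.
    """
    def normalize(points):
        min_x = min(x for x, _ in points)
        min_y = min(y for _, y in points)
        return {(x - min_x, y - min_y) for x, y in points}

    cand, tmpl = normalize(candidate), normalize(template)
    score = len(tmpl & cand) / len(tmpl)
    return score >= tolerance, score
```

A candidate contour that reproduces the template anywhere in the image scores 1.0; one missing a quarter of the template points scores 0.75 and is rejected at the assumed 0.8 threshold.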
- the audio/image output unit 12052 controls the display unit 12062 to superimpose a rectangular contour line for emphasis on the recognized pedestrian.
- the audio/image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating a pedestrian at a desired position.
- the technology according to the present disclosure can be applied to the imaging unit 12031 and the like of the configuration described above.
- the imaging device 10 described above can be applied to the imaging unit 12031.
- high optical performance corresponding to a large, high-pixel solid-state imaging element 21 can be realized in the lens optical system 25.
- a captured image with high pixel count and high image quality can be obtained, which can reduce driver fatigue, for example.
- the embodiments of the present technology are not limited to the above-mentioned embodiments, and various modifications are possible within the scope of the gist of the present technology.
- the number and positions of inflection points of each surface, lens data of each lens, aspheric data of each surface, and tangent angles of the peripheral parts of the effective diameter of the lens are not limited to the above-mentioned examples. It is also possible to adopt a form that combines all or part of the above-mentioned embodiments.
- the present technology can take the following configurations. (1) A lens optical system including, in order from the object side to the image side: a first lens having a positive refractive power near the optical axis and a meniscus shape convex toward the object side; a second lens having a negative refractive power near the optical axis and a meniscus shape convex toward the object side; a third lens having a refractive power near the optical axis; a fourth lens having a positive refractive power near the optical axis and a meniscus shape convex toward the image side; a fifth lens having a refractive power near the optical axis and a meniscus shape convex toward the image side; a sixth lens having a refractive power near the optical axis and a meniscus shape convex toward the object side; and a seventh lens having a negative refractive power near the optical axis and a meniscus shape convex toward the object side, the lens optical system being configured such that, among the air intervals on the optical axis between the facing surfaces of each pair of two adjacent lenses among the first to seventh lenses, the interval of the pair consisting of the fourth lens and the fifth lens is the longest.
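The spacing condition in configuration (1) can be checked mechanically against a lens prescription, as in the sketch below. The sample gap values in the test are invented for illustration and are not design data from the embodiments.

```python
def gap_condition_holds(air_gaps):
    """Check the claimed spacing condition for a seven-lens system.

    air_gaps: the six on-axis air gaps between the facing surfaces of
    adjacent lens pairs (L1-L2, L2-L3, ..., L6-L7), in order.  The
    condition holds when the L4-L5 gap (index 3) is strictly the longest.
    """
    assert len(air_gaps) == 6, "seven lenses give six adjacent gaps"
    d45 = air_gaps[3]
    return all(d45 > g for i, g in enumerate(air_gaps) if i != 3)
```

A prescription whose largest adjacent air gap sits anywhere other than between the fourth and fifth lenses fails the check.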
Abstract
The present technology pertains to a lens optical system and an imaging device that enable, in the lens optical system, realization of high optical performance equivalent to that of a large-scale solid-state imaging element having a large pixel number. The lens optical system comprises, sequentially from the object side toward the image side: first and second lenses that have a convex meniscus shape on the object side; a third lens; fourth and fifth lenses that have a convex meniscus shape on the image side; and sixth and seventh lenses that have a convex meniscus shape on the object side. The first lens and the fourth lens have positive refractive power in the vicinity of the optical axis. The second lens and the seventh lens have negative refractive power in the vicinity of the optical axis. The third lens, the fifth lens, and the sixth lens have refractive power in the vicinity of the optical axis. The interval between the pair constituted by the fourth lens and the fifth lens is the longest interval among the air intervals on the optical axis between the facing surfaces of each pair constituted by two adjacent lenses from among the first to seventh lenses. The present technology can be applied to an imaging device or the like, for example.
Description
This technology relates to a lens optical system and an imaging device, and in particular to a lens optical system and an imaging device that can achieve high optical performance compatible with a large, high-pixel solid-state imaging element.
Known imaging devices equipped with solid-state imaging elements such as CCD (Charge-Coupled Device) sensors and CMOS (Complementary Metal Oxide Semiconductor) image sensors include compact cameras mounted on mobile phones and smartphones, as well as digital still cameras. In order to further reduce the size of such imaging devices, there is a demand for smaller lens optical systems with shorter overall optical length.
In recent years, the number of pixels in solid-state image sensors in compact cameras has been increasing, and models equipped with high-pixel solid-state image sensors equivalent to those in digital still cameras are becoming widespread. For this reason, the lens optical system of compact cameras is required to have high optical performance compatible with high-pixel solid-state image sensors. Also, in order to prevent deterioration of image quality due to noise when shooting in dark places and to achieve higher definition shooting, solid-state image sensors are becoming larger.
However, in a lens optical system consisting of six lenses, as the solid-state imaging element becomes larger, the refractive power of each lens increases, making it difficult to correct various aberrations. Therefore, to realize a compact lens optical system with high optical performance compatible with a large solid-state imaging element, seven or more lenses are required.
As an example of a lens optical system consisting of seven lenses, a lens optical system consisting of, in order from the object side to the image side, a first lens having positive refractive power, a second lens having positive refractive power, a third lens having negative refractive power, a fourth lens having positive refractive power, a fifth lens having positive refractive power, a sixth lens, and a seventh lens having negative refractive power has been devised (see, for example, Patent Document 1).
However, high optical performance compatible with large, high-pixel solid-state imaging devices has not yet been achieved in lens optical systems with seven or more lenses. Therefore, there is a demand for a method to achieve high optical performance compatible with large, high-pixel solid-state imaging devices in lens optical systems, but this demand has not yet been fully met.
This technology was developed in light of these circumstances, and makes it possible to achieve high optical performance in a lens optical system that is compatible with large, high-pixel solid-state imaging elements.
The lens optical system of the first aspect of the present technology includes, in order from the object side to the image side, a first lens having positive refractive power near the optical axis and a meniscus shape convex toward the object side, a second lens having negative refractive power near the optical axis and a meniscus shape convex toward the object side, a third lens having refractive power near the optical axis, a fourth lens having positive refractive power near the optical axis and a meniscus shape convex toward the image side, a fifth lens having refractive power near the optical axis and a meniscus shape convex toward the image side, a sixth lens having refractive power near the optical axis and a meniscus shape convex toward the object side, and a seventh lens having negative refractive power near the optical axis and a meniscus shape convex toward the object side, and is configured such that, among the air gaps on the optical axis between the opposing surfaces of each pair of two adjacent lenses among the first lens to the seventh lens, the gap of the pair consisting of the fourth lens and the fifth lens is the longest.
In the first aspect of the present technology, a first lens having positive refractive power near the optical axis and a meniscus shape convex to the object side, and a second lens having negative refractive power near the optical axis and a meniscus shape convex to the object side are provided in that order from the object side to the image side. Next, a third lens having refractive power near the optical axis, a fourth lens having positive refractive power near the optical axis and a meniscus shape convex to the image side, and a fifth lens having refractive power near the optical axis and a meniscus shape convex to the image side are provided. Next, a sixth lens having refractive power near the optical axis and a meniscus shape convex to the object side is provided. Next, a seventh lens having negative refractive power near the optical axis and a meniscus shape convex to the object side is provided. Among the air gaps on the optical axis between the opposing surfaces of each pair of adjacent two lenses among the first lens to the seventh lens, the gap between the pair of the fourth lens and the fifth lens is the longest.
本技術の第2の側面の撮像装置は、物体側から像側に向かって順に、光軸近傍で正の屈折力を有し、前記物体側に凸のメニスカス形状を有する第1のレンズと、光軸近傍で負の屈折力を有し、前記物体側に凸のメニスカス形状を有する第2のレンズと、光軸近傍で屈折力を有する第3のレンズと、光軸近傍で正の屈折力を有し、前記像側に凸のメニスカス形状を有する第4のレンズと、光軸近傍で屈折力を有し、前記像側に凸のメニスカス形状を有する第5のレンズと、光軸近傍で屈折力を有し、前記物体側に凸のメニスカス形状を有する第6のレンズと、光軸近傍で負の屈折力を有し、前記物体側に凸のメニスカス形状を有する第7のレンズとを備え、前記第1のレンズ乃至前記第7のレンズのうちの隣り合う2つのレンズからなる各ペアの対向する面どうしの光軸上の空気の間隔の中で、前記第4のレンズと前記第5のレンズからなる前記ペアの前記間隔が最も長いように構成されたレンズ光学系と、前記レンズ光学系により結像された光学像を電気信号に変換する撮像素子とを備える撮像装置である。
The imaging device of the second aspect of the present technology includes a lens optical system and an imaging element that converts the optical image formed by the lens optical system into an electrical signal. The lens optical system includes, in order from the object side to the image side, a first lens having positive refractive power near the optical axis and a meniscus shape convex toward the object side, a second lens having negative refractive power near the optical axis and a meniscus shape convex toward the object side, a third lens having refractive power near the optical axis, a fourth lens having positive refractive power near the optical axis and a meniscus shape convex toward the image side, a fifth lens having refractive power near the optical axis and a meniscus shape convex toward the image side, a sixth lens having refractive power near the optical axis and a meniscus shape convex toward the object side, and a seventh lens having negative refractive power near the optical axis and a meniscus shape convex toward the object side, and is configured such that, among the air gaps on the optical axis between the opposing surfaces of each pair of two adjacent lenses among the first lens to the seventh lens, the gap between the pair consisting of the fourth lens and the fifth lens is the longest.
本技術の第2の側面においては、レンズ光学系と、前記レンズ光学系により結像された光学像を電気信号に変換する撮像素子とが設けられる。前記レンズ光学系には、物体側から像側に向かって順に、光軸近傍で正の屈折力を有し、前記物体側に凸のメニスカス形状を有する第1のレンズと、光軸近傍で負の屈折力を有し、前記物体側に凸のメニスカス形状を有する第2のレンズとが設けられる。次に、光軸近傍で屈折力を有する第3のレンズと、光軸近傍で正の屈折力を有し、前記像側に凸のメニスカス形状を有する第4のレンズと、光軸近傍で屈折力を有し、前記像側に凸のメニスカス形状を有する第5のレンズとが設けられる。次に、光軸近傍で屈折力を有し、前記物体側に凸のメニスカス形状を有する第6のレンズと、光軸近傍で負の屈折力を有し、前記物体側に凸のメニスカス形状を有する第7のレンズとが設けられる。なお、前記第1のレンズ乃至前記第7のレンズのうちの隣り合う2つのレンズからなる各ペアの対向する面どうしの光軸上の空気の間隔の中で、前記第4のレンズと前記第5のレンズからなる前記ペアの前記間隔が最も長い。
In a second aspect of the present technology, a lens optical system and an imaging element that converts an optical image formed by the lens optical system into an electrical signal are provided. The lens optical system includes, in order from the object side to the image side, a first lens having positive refractive power near the optical axis and a meniscus shape convex to the object side, and a second lens having negative refractive power near the optical axis and a meniscus shape convex to the object side. Next, a third lens having refractive power near the optical axis, a fourth lens having positive refractive power near the optical axis and a meniscus shape convex to the image side, and a fifth lens having refractive power near the optical axis and a meniscus shape convex to the image side are provided. Next, a sixth lens having refractive power near the optical axis and a meniscus shape convex to the object side, and a seventh lens having negative refractive power near the optical axis and a meniscus shape convex to the object side are provided. Among the air gaps on the optical axis between the opposing surfaces of each pair of adjacent two lenses among the first lens to the seventh lens, the gap between the pair of the fourth lens and the fifth lens is the longest.
以下、本技術を実施するための形態(以下、実施の形態という)について説明する。なお、説明は以下の順序で行う。
1.一実施の形態(撮像装置)
2.電子機器への適用例
3.撮像装置の使用例
4.内視鏡手術システムへの応用例
5.移動体への応用例
Hereinafter, modes for carrying out the present technology (hereinafter referred to as embodiments) will be described in the following order.
1. One embodiment (imaging device)
2. Application examples to electronic devices
3. Use examples of imaging devices
4. Application examples to endoscopic surgery systems
5. Application examples to moving objects
なお、以下の説明で参照する図面において、同一又は類似の部分には同一又は類似の符号を付している。ただし、図面は模式的なものであり、厚みと平面寸法との関係、各層の厚みの比率等は実際のものとは異なる。また、図面相互間においても、互いの寸法の関係や比率が異なる部分が含まれている場合がある。
In the drawings referred to in the following description, the same or similar parts are given the same or similar reference numerals. However, the drawings are schematic, and the relationship between thickness and planar dimensions, the thickness ratio of each layer, etc., differ from the actual ones. Furthermore, there may be parts in which the dimensional relationships and ratios differ between the drawings.
また、以下の説明における上下等の方向の定義は、単に説明の便宜上の定義であって、本開示の技術的思想を限定するものではない。例えば、対象を90°回転して観察すれば上下は左右に変換して読まれ、180°回転して観察すれば上下は反転して読まれる。
Furthermore, the definitions of directions such as up and down in the following explanation are merely for the convenience of explanation and do not limit the technical ideas of this disclosure. For example, if an object is rotated 90 degrees and observed, up and down are converted into left and right and read, and if it is rotated 180 degrees and observed, up and down are read inverted.
<1.一実施の形態>
<撮像装置の構成例>
図1は、本技術を適用した撮像装置の一実施の形態の構成例を示す断面図である。
<1. One embodiment>
<Configuration example of imaging device>
FIG. 1 is a cross-sectional view showing an example of the configuration of an embodiment of an imaging device to which the present technology is applied.
図1の撮像装置10は、固体撮像装置13が設置される薄型の回路基板14、回路基板15、およびスペーサ16により構成される。
The imaging device 10 in FIG. 1 is composed of a thin circuit board 14 on which a solid-state imaging device 13 is mounted, a circuit board 15, and a spacer 16.
固体撮像装置13は、CSP(Chip Size Package)構造を有する。CSP構造は、多画素化、小型化、および低背化を実現する固体撮像装置の構造の1つであり、チップ単体と同程度のサイズで実現された極めて小型のパッケージ構造である。固体撮像装置13は、固体撮像素子21、接着剤22、ガラス基板23、黒樹脂24、レンズ光学系25、および固定剤26により構成される。
The solid-state imaging device 13 has a CSP (Chip Size Package) structure. The CSP structure is one of the structures of solid-state imaging devices that realizes a high pixel count, compact size, and low height, and is an extremely small package structure that is realized with a size similar to that of a single chip. The solid-state imaging device 13 is composed of a solid-state imaging element 21, adhesive 22, glass substrate 23, black resin 24, lens optical system 25, and fixing agent 26.
固体撮像素子21は、CCDセンサやCMOSイメージセンサであり、半導体基板31とオンチップレンズ32を備える。半導体基板31の図1中下側の面は回路基板14と接続する。半導体基板31の図1中上側の面の一部の領域である撮像面31aには、2次元格子状に配列された複数の各画素に対応する受光素子からなる画素アレイ41等が形成される。オンチップレンズ32は、画素アレイ41上の各画素に対応する位置に形成される。
The solid-state imaging element 21 is a CCD sensor or a CMOS image sensor, and includes a semiconductor substrate 31 and an on-chip lens 32. The lower surface of the semiconductor substrate 31 in FIG. 1 is connected to the circuit board 14. A pixel array 41 and the like, made up of light-receiving elements corresponding to a plurality of pixels arranged in a two-dimensional lattice pattern, are formed on an imaging surface 31a, which is a partial area of the upper surface of the semiconductor substrate 31 in FIG. 1. The on-chip lens 32 is formed at positions corresponding to the individual pixels on the pixel array 41.
接着剤22は、固体撮像素子21の撮像面31aを含む図1中上側の面上に設けられる透明な接着剤である。ガラス基板23は、固体撮像素子21の固定、撮像面31aの保護などの目的で、接着剤22を介して固体撮像素子21に接着される。
The adhesive 22 is a transparent adhesive that is applied to the upper surface in FIG. 1, including the imaging surface 31a of the solid-state imaging element 21. The glass substrate 23 is adhered to the solid-state imaging element 21 via the adhesive 22 for the purposes of fixing the solid-state imaging element 21 and protecting the imaging surface 31a.
黒樹脂24は、ガラス基板23の接着剤22の接着面と反対側の面に形成され、スペーサの機能を有する。この黒樹脂24を介して、ガラス基板23の上にレンズ光学系25のIR(Infrared)カットフィルタ(図示せず)がガラス基板23と平行になるように設置される。これにより、ガラス基板23は、レンズ光学系25と撮像面31aとの間に配置されることになる。黒樹脂24(ブラックマスク)は、レンズ光学系25を介して入射される光のうちの、撮像面31aの外側の光を遮光する。
The black resin 24 is formed on the surface of the glass substrate 23 opposite the adhesive surface to which the adhesive 22 is applied, and functions as a spacer. An IR (Infrared) cut filter (not shown) of the lens optical system 25 is placed on top of the glass substrate 23 via this black resin 24 so that it is parallel to the glass substrate 23. This positions the glass substrate 23 between the lens optical system 25 and the imaging surface 31a. The black resin 24 (black mask) blocks light that is incident via the lens optical system 25 and that is outside the imaging surface 31a.
レンズ光学系25は、被写体からの光を集光し、光学像を撮像面31aに結像させるレンズ光学系である。レンズ光学系25の構成については、後述する図2、図7、図11、図15、および図19を参照して詳細に説明する。
The lens optical system 25 is a lens optical system that collects light from a subject and forms an optical image on the imaging surface 31a. The configuration of the lens optical system 25 will be described in detail with reference to Figures 2, 7, 11, 15, and 19, which will be described later.
固定剤26は、固体撮像素子21、接着剤22、ガラス基板23、黒樹脂24、およびレンズ光学系25の側面と、レンズ光学系25の物体側(光の入射側)の面(図1中上面)の周囲とに塗布される。固定剤26は、固体撮像素子21、接着剤22、ガラス基板23、黒樹脂24、およびレンズ光学系25を固定する。この固定剤26により、固体撮像装置13の側面から入射され、屈折されたり反射されたりする光を軽減させることができる。また、固定剤26により、撮像面31aに対応する領域の外側から固体撮像装置13に入射される光を遮光することができる。
The fixing agent 26 is applied to the sides of the solid-state imaging element 21, adhesive 22, glass substrate 23, black resin 24, and lens optical system 25, and to the periphery of the object side (light incident side) surface (top surface in FIG. 1) of the lens optical system 25. The fixing agent 26 fixes the solid-state imaging element 21, adhesive 22, glass substrate 23, black resin 24, and lens optical system 25. This fixing agent 26 can reduce light that is incident from the side of the solid-state imaging device 13 and is refracted or reflected. The fixing agent 26 can also block light that is incident on the solid-state imaging device 13 from outside the area corresponding to the imaging surface 31a.
被写体からの光は、レンズ光学系25、ガラス基板23、接着剤22、およびオンチップレンズ32を介して撮像面31aに入射され、これにより撮像面31aに光学像が結像される。画素アレイ41の各受光素子は、その光学像を電気信号に変換することにより、撮像を行う。
Light from the subject is incident on the imaging surface 31a via the lens optical system 25, the glass substrate 23, the adhesive 22, and the on-chip lens 32, and an optical image is formed on the imaging surface 31a. Each light-receiving element of the pixel array 41 captures the image by converting the optical image into an electrical signal.
以上のように、固体撮像装置13のCSP構造内にレンズ光学系25が含まれるので、別体でレンズ光学系25が設けられる場合に比べて、撮像装置10を小型化することができる。
As described above, the lens optical system 25 is included within the CSP structure of the solid-state imaging device 13, so the imaging device 10 can be made smaller than when the lens optical system 25 is provided separately.
回路基板14は、半導体基板31の図1中下側の面と接続し、各受光素子により生成された電気信号に対応するカメラ信号をスペーサ16に出力する回路基板である。
The circuit board 14 is connected to the lower surface of the semiconductor substrate 31 in FIG. 1, and outputs a camera signal corresponding to the electrical signal generated by each light receiving element to the spacer 16.
回路基板15は、回路基板14からスペーサ16を介して出力されたカメラ信号を外部に出力するための回路基板であり、電子部品等が実装される。回路基板15は、外部の装置と接続するためのコネクタ15aを有し、カメラ信号を外部の装置に出力する。
Circuit board 15 is a circuit board for outputting the camera signal output from circuit board 14 via spacer 16 to the outside, and electronic components and the like are mounted on it. Circuit board 15 has connector 15a for connecting to an external device, and outputs the camera signal to the external device.
スペーサ16は、レンズ光学系25を駆動する図示せぬアクチュエータと回路基板15を固定するための回路内蔵のスペーサである。スペーサ16には、半導体部品16aおよび16b等が実装されている。半導体部品16aおよび16bは、コンデンサ、レンズ光学系25を駆動する図示せぬアクチュエータを制御するLSI(Large Scale Integration)を構成する半導体部品等である。スペーサ16は、回路基板14から出力されたカメラ信号を回路基板15に出力する。
Spacer 16 is a spacer with a built-in circuit for fixing an actuator (not shown) that drives lens optical system 25 and circuit board 15. Semiconductor components 16a and 16b, etc. are mounted on spacer 16. Semiconductor components 16a and 16b are semiconductor components that constitute a capacitor and an LSI (Large Scale Integration) that controls an actuator (not shown) that drives lens optical system 25. Spacer 16 outputs a camera signal output from circuit board 14 to circuit board 15.
<レンズ光学系の第1の構成例>
図2は、レンズ光学系25の第1の構成例を示す断面図である。
<First Configuration Example of Lens Optical System>
FIG. 2 is a cross-sectional view showing a first configuration example of the lens optical system 25.
図2に示すように、レンズ光学系25は、物体側から像側(光の出射側)に向かって順に、開口絞り70、レンズ71(第1のレンズ)、レンズ72(第2のレンズ)、レンズ73(第3のレンズ)、レンズ74(第4のレンズ)、レンズ75(第5のレンズ)、レンズ76(第6のレンズ)、レンズ77(第7のレンズ)、およびIRカットフィルタ78を備える。
As shown in FIG. 2, the lens optical system 25 includes, in order from the object side toward the image side (light exit side), an aperture stop 70, a lens 71 (first lens), a lens 72 (second lens), a lens 73 (third lens), a lens 74 (fourth lens), a lens 75 (fifth lens), a lens 76 (sixth lens), a lens 77 (seventh lens), and an IR cut filter 78.
開口絞り70は、レンズ光学系25に入射される光を制限する。
The aperture stop 70 limits the light entering the lens optical system 25.
レンズ71は、物体側(図2中左側)の面71aと像側(図2中右側)の面71bを有する。レンズ71は、光軸近傍で正の屈折力を有し、物体側に凸のメニスカス形状を有する。レンズ72は、物体側の面72aと像側の面72bを有する。レンズ72は、光軸近傍で負の屈折力を有し、物体側に凸のメニスカス形状を有する。以上のようにレンズ71および72が構成されることにより、球面収差および色収差を良好に補正しつつ、面71aから撮像面31aまでの距離であるレンズ光学系25の全長TTLを短縮することができる。
Lens 71 has surface 71a on the object side (left side in FIG. 2) and surface 71b on the image side (right side in FIG. 2). Lens 71 has positive refractive power near the optical axis and has a meniscus shape convex toward the object side. Lens 72 has surface 72a on the object side and surface 72b on the image side. Lens 72 has negative refractive power near the optical axis and has a meniscus shape convex toward the object side. By configuring lenses 71 and 72 as described above, it is possible to shorten the total length TTL of lens optical system 25, which is the distance from surface 71a to imaging surface 31a, while providing good correction for spherical aberration and chromatic aberration.
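As a rough plausibility check (not part of the patent's own derivation), the signs of the refractive powers stated for lenses 71 and 72 can be reproduced from the radii and indices listed later in FIG. 3 using the thin-lens lensmaker's equation, which ignores lens thickness:

```python
# Thin-lens lensmaker's approximation: 1/f = (n - 1) * (1/R1 - 1/R2).
# Radii and refractive indices are taken from the lens data table (FIG. 3);
# thickness is ignored, so f is only a rough sign/magnitude check.

def thin_lens_focal_length(n, r1, r2):
    """Thin-lens focal length for index n and surface radii r1 (object side), r2 (image side)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1 - 1.0 / r2))

f1 = thin_lens_focal_length(1.5533, 2.83, 9.12)   # lens 71 (surfaces 71a, 71b)
f2 = thin_lens_focal_length(2.0018, 4.06, 3.16)   # lens 72 (surfaces 72a, 72b)

print(f"lens 71: f = {f1:.2f} (positive -> converging)")
print(f"lens 72: f = {f2:.2f} (negative -> diverging)")
```

The positive result for lens 71 and negative result for lens 72 agree with the meniscus powers described above.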
レンズ73は、物体側の面73aと像側の面73bを有する。レンズ73は、光軸近傍で正の屈折力を有し、物体側に凸のメニスカス形状を有する。レンズ73は、例えばリレーレンズとして機能し、光線の入射角を好適に補正する。レンズ73をリレーレンズとして機能させるためには、レンズ73の形状は、レンズ73に入射する光線の光軸からの距離が、レンズ73から出射する光線の光軸からの距離より短くなるように決定されることが望ましい。レンズ74は、物体側の面74aと像側の面74bを有する。レンズ74は、光軸近傍で正の屈折力を有し、像側に凸のメニスカス形状を有する。
Lens 73 has an object-side surface 73a and an image-side surface 73b. Lens 73 has positive refractive power near the optical axis and has a meniscus shape convex toward the object side. Lens 73 functions, for example, as a relay lens, and appropriately corrects the angle of incidence of light rays. In order for lens 73 to function as a relay lens, it is desirable that the shape of lens 73 is determined so that the distance from the optical axis of the light rays entering lens 73 is shorter than the distance from the optical axis of the light rays exiting lens 73. Lens 74 has an object-side surface 74a and an image-side surface 74b. Lens 74 has positive refractive power near the optical axis and has a meniscus shape convex toward the image side.
以上のように、レンズ73が光軸近傍で屈折力を有し、レンズ74が光軸近傍で正の屈折力を有するとともに、像側に凸のメニスカス形状を有する。従って、コマ収差および非点収差を良好に補正することができる。また、光線の射出角を好適に補正し、全長TTLの短縮および撮像面31aに入射する光の主光線と撮像面31aのなす角の抑制を図ることができる。
As described above, lens 73 has refractive power near the optical axis, and lens 74 has positive refractive power near the optical axis and has a meniscus shape convex toward the image side. Therefore, coma aberration and astigmatism can be corrected well. In addition, the exit angle of the light ray can be suitably corrected, the total length TTL can be shortened, and the angle between the chief ray of the light incident on the imaging surface 31a and the imaging surface 31a can be suppressed.
レンズ75は、物体側の面75aと像側の面75bを有する。レンズ75は、光軸近傍で負の屈折力を有し、像側に凸のメニスカス形状を有する。レンズ75は、光軸近傍の光線を補正しつつ、周辺部のコマ収差および像面湾曲収差を好適に補正する。撮像面31a周辺に到達する光線の経路を好適に補正するために、面75aの周辺部は、頂点(中心点)より物体側にあることが望ましい。
Lens 75 has an object-side surface 75a and an image-side surface 75b. Lens 75 has negative refractive power near the optical axis and has a meniscus shape convex toward the image side. Lens 75 effectively corrects coma aberration and field curvature aberration in the peripheral area while correcting light rays near the optical axis. To effectively correct the path of light rays reaching the periphery of imaging surface 31a, it is desirable that the peripheral area of surface 75a be located closer to the object side than the apex (center point).
レンズ76は、物体側の面76aと像側の面76bを有する。レンズ76は、光軸近傍で正の屈折力を有し、物体側に凸のメニスカス形状を有する。レンズ77は、物体側の面77aと像側の面77bを有する。レンズ77は、光軸近傍で負の屈折力を有し、物体側に凸のメニスカス形状を有する。レンズ76および77は、光軸近傍の光線を補正しつつ、周辺部のコマ収差、球面収差、像面湾曲収差、および倍率色収差を好適に補正する。撮像面31a周辺に到達する光線の経路を好適に補正するために、レンズ76および77は変曲点を1つ以上有することが望ましい。
Lens 76 has an object-side surface 76a and an image-side surface 76b. Lens 76 has positive refractive power near the optical axis and has a meniscus shape convex toward the object side. Lens 77 has an object-side surface 77a and an image-side surface 77b. Lens 77 has negative refractive power near the optical axis and has a meniscus shape convex toward the object side. Lenses 76 and 77 appropriately correct coma aberration, spherical aberration, field curvature aberration, and chromatic aberration of magnification in the peripheral areas while correcting light rays near the optical axis. It is desirable for lenses 76 and 77 to have one or more inflection points in order to appropriately correct the paths of light rays that reach the periphery of imaging surface 31a.
以上のように、レンズ75は、光軸近傍で屈折力を有するとともに、像側に凸のメニスカス形状を有し、レンズ76と77は、光軸近傍で屈折力を有するとともに、物体側に凸のメニスカス形状を有する。従って、光軸近傍のマージナル光線を好適に補正しつつ、像面湾曲収差および歪曲収差を好適に補正することができる。また、撮像面31a周辺に到達する光線と撮像面31aのなす角を好適に補正し、固体撮像素子21における受光量の低下を抑制することができる。
As described above, lens 75 has refractive power near the optical axis and has a meniscus shape convex toward the image side, and lenses 76 and 77 have refractive power near the optical axis and have a meniscus shape convex toward the object side. Therefore, it is possible to appropriately correct marginal rays near the optical axis while also appropriately correcting field curvature aberration and distortion aberration. In addition, it is possible to appropriately correct the angle between the light rays reaching the periphery of imaging surface 31a and imaging surface 31a, thereby suppressing a decrease in the amount of light received by solid-state imaging element 21.
IRカットフィルタ78は、物体側の面78aから入射された光のうちの赤外光以外の光を透過させ、像側の面78bから出射させる。なお、IRカットフィルタ78は設けられなくてもよい。
The IR cut filter 78 transmits light other than infrared light that is incident on the object side surface 78a and emits it from the image side surface 78b. Note that the IR cut filter 78 does not necessarily have to be provided.
被写体(物体)からレンズ光学系25に入射された光は、面71a、71b、72a,72b,73a,73b,74a,74b,75a,75b,76a,76b,77a,77b,78a、および78bを介して出射される。このようにしてレンズ光学系25から出射された光は、ガラス基板23、接着剤22、およびオンチップレンズ32を介して、撮像面31aに集光される。
Light incident on the lens optical system 25 from the subject (object) is emitted through surfaces 71a, 71b, 72a, 72b, 73a, 73b, 74a, 74b, 75a, 75b, 76a, 76b, 77a, 77b, 78a, and 78b. The light emitted from the lens optical system 25 in this manner is focused on the imaging surface 31a via the glass substrate 23, adhesive 22, and on-chip lens 32.
図2では、図を簡略化するため、撮像面31aのみを図示しているが、実際には、レンズ光学系25と撮像面31aの間には、ガラス基板23、接着剤22、およびオンチップレンズ32が存在する。このことは、後述する図2、図7、図11、図15、および図19においても同様である。
In FIG. 2, in order to simplify the drawing, only the imaging surface 31a is shown, but in reality, between the lens optical system 25 and the imaging surface 31a, there are a glass substrate 23, an adhesive 22, and an on-chip lens 32. This is also true in FIGS. 2, 7, 11, 15, and 19, which will be described later.
<各レンズのレンズデータの第1の例>
図3は、レンズ71乃至77のレンズデータを示す表である。
<First example of lens data for each lens>
FIG. 3 is a table showing lens data for lenses 71 to 77.
図3の表の各行は、面71a乃至78aおよび面71b乃至78bそれぞれに対応する。各列は、左側から順に、面番号i、曲率半径Ri、像側で最も近い面または撮像面31aとの光軸上の間隔である面間隔Di、d線(波長587.6nm)に対する屈折率Ndi、およびd線に対するアッベ数Vdiに対応する。
The rows in the table in Figure 3 correspond to surfaces 71a to 78a and surfaces 71b to 78b, respectively. From the left, the columns correspond to the surface number i, the radius of curvature Ri, the surface spacing Di, which is the distance on the optical axis from the closest surface on the image side or the imaging surface 31a, the refractive index Ndi for the d-line (wavelength 587.6 nm), and the Abbe number Vdi for the d-line.
面番号とは、レンズ光学系25の各面に付与された番号である。本明細書では、面71a,71b,72a,72b,73a,73b,74a,74b,75a,75b,76a,76b,77a,77b,78a,78bに対して、順に、101から116までの面番号が付与されているものとする。
The surface number is a number assigned to each surface of the lens optical system 25. In this specification, the surface numbers 101 to 116 are assigned to surfaces 71a, 71b, 72a, 72b, 73a, 73b, 74a, 74b, 75a, 75b, 76a, 76b, 77a, 77b, 78a, and 78b, in that order.
図3に示すように、面番号iが101である面71aの曲率半径R101は、2.83であり、面71aより像側で最も近い面71bとの光軸上の間隔である面間隔D101は0.945であり、屈折率Nd101は1.5533である。面71aのアッベ数Vd101、即ちレンズ71のアッベ数V101は71.685である。面番号iが102である面71bの曲率半径R102は、9.12であり、面71bより像側で最も近い面72aとの光軸上の空気の間隔である面間隔D102は0.020である。
As shown in FIG. 3, the radius of curvature R101 of surface 71a, whose surface number i is 101, is 2.83, the surface spacing D101, which is the distance on the optical axis between surface 71a and surface 71b, which is the closest surface to the image side of surface 71a, is 0.945, and the refractive index Nd101 is 1.5533. The Abbe number Vd101 of surface 71a, i.e., the Abbe number V101 of lens 71, is 71.685. The radius of curvature R102 of surface 71b, whose surface number i is 102, is 9.12, and the surface spacing D102, which is the distance through air on the optical axis between surface 72a, which is the closest surface to surface 71b on the image side, is 0.020.
面番号iが103である面72aの曲率半径R103は、4.06であり、面72aより像側で最も近い面72bとの光軸上の間隔である面間隔D103は0.303であり、屈折率Nd103は2.0018である。面72aのアッベ数Vd103、即ちレンズ72のアッベ数V102は19.325である。面番号iが104である面72bの曲率半径R104は、3.16であり、面72bより像側で最も近い面73aとの光軸上の空気の間隔である面間隔D104は0.496である。
The radius of curvature R103 of surface 72a, whose surface number i is 103, is 4.06, the surface spacing D103, which is the distance on the optical axis between surface 72a and surface 72b, which is the closest surface to the image side of surface 72a, is 0.303, and the refractive index Nd103 is 2.0018. The Abbe number Vd103 of surface 72a, i.e., the Abbe number V102 of lens 72, is 19.325. The radius of curvature R104 of surface 72b, whose surface number i is 104, is 3.16, and the surface spacing D104, which is the distance through air on the optical axis between surface 72b and surface 73a, which is the closest surface to the image side of surface 72b, is 0.496.
面番号iが105である面73aの曲率半径R105は、2.07×10であり、面73aより像側で最も近い面73bとの光軸上の間隔である面間隔D105は0.363であり、屈折率Nd105は1.5445である。面73aのアッベ数Vd105、即ちレンズ73のアッベ数V103は55.987である。面番号iが106である面73bの曲率半径R106は、1.81×102であり、面73bより像側で最も近い面74aとの光軸上の空気の間隔である面間隔D106は0.361である。
The radius of curvature R105 of the surface 73a having the surface number i of 105 is 2.07×10, the surface spacing D105, which is the distance on the optical axis between the surface 73a and the closest surface 73b on the image side, is 0.363, and the refractive index Nd105 is 1.5445. The Abbe number Vd105 of the surface 73a, i.e., the Abbe number V103 of the lens 73, is 55.987. The radius of curvature R106 of the surface 73b having the surface number i of 106 is 1.81×10², and the surface spacing D106, which is the distance through air on the optical axis between the surface 73b and the closest surface 74a on the image side, is 0.361.
面番号iが107である面74aの曲率半径R107は、-2.09×10であり、面74aより像側で最も近い面74bとの光軸上の間隔である面間隔D107は0.524であり、屈折率Nd107は1.6614である。面74aのアッベ数Vd107、即ちレンズ74のアッベ数V104は20.410である。面番号iが108である面74bの曲率半径R108は、-9.94であり、面74bより像側で最も近い面75aとの光軸上の空気の間隔である面間隔D108は1.107である。
The radius of curvature R107 of surface 74a, whose surface number i is 107, is -2.09×10, the surface spacing D107, which is the distance on the optical axis between surface 74a and surface 74b, which is the closest surface to the image side of surface 74a, is 0.524, and the refractive index Nd107 is 1.6614. The Abbe number Vd107 of surface 74a, i.e., the Abbe number V104 of lens 74, is 20.410. The radius of curvature R108 of surface 74b, whose surface number i is 108, is -9.94, and the surface spacing D108, which is the distance through air on the optical axis between surface 74b and surface 75a, which is the closest surface to the image side of surface 74b, is 1.107.
面番号iが109である面75aの曲率半径R109は、-9.28であり、面75aより像側で最も近い面75bとの光軸上の間隔である面間隔D109は0.330であり、屈折率Nd109は1.7020である。面75aのアッベ数Vd109、即ちレンズ75のアッベ数V105は15.500である。面番号iが110である面75bの曲率半径R110は、-2.47×10であり、面75bより像側で最も近い面76aとの光軸上の空気の間隔である面間隔D110は0.225である。
The radius of curvature R109 of surface 75a, whose surface number i is 109, is -9.28, the surface spacing D109, which is the distance on the optical axis between surface 75a and surface 75b, which is the closest surface to the image side of surface 75a, is 0.330, and the refractive index Nd109 is 1.7020. The Abbe number Vd109 of surface 75a, i.e., the Abbe number V105 of lens 75, is 15.500. The radius of curvature R110 of surface 75b, whose surface number i is 110, is -2.47×10, and the surface spacing D110, which is the distance through air on the optical axis between surface 75b and surface 76a, which is the closest surface to the image side of surface 75b, is 0.225.
面番号iが111である面76aの曲率半径R111は、3.46であり、面76aより像側で最も近い面76bとの光軸上の間隔である面間隔D111は0.530であり、屈折率Nd111は1.5445である。面76aのアッベ数Vd111、即ちレンズ76のアッベ数V106は55.987である。面番号iが112である面76bの曲率半径R112は、4.96であり、面76bより像側で最も近い面77aとの光軸上の空気の間隔である面間隔D112は1.072である。
The radius of curvature R111 of surface 76a, whose surface number i is 111, is 3.46, the surface spacing D111, which is the distance on the optical axis between surface 76a and surface 76b, which is the closest surface to the image side of surface 76a, is 0.530, and the refractive index Nd111 is 1.5445. The Abbe number Vd111 of surface 76a, i.e., the Abbe number V106 of lens 76, is 55.987. The radius of curvature R112 of surface 76b, whose surface number i is 112, is 4.96, and the surface spacing D112, which is the distance through air on the optical axis between surface 77a, which is the closest surface to surface 76b on the image side, is 1.072.
面番号iが113である面77aの曲率半径R113は、9.37であり、面77aより像側で最も近い面77bとの光軸上の間隔である面間隔D113は1.093であり、屈折率Nd113は1.5445である。面77aのアッベ数Vd113、即ちレンズ77のアッベ数V107は55.987である。面番号iが114である面77bの曲率半径R114は、3.05であり、面77bより像側で最も近い面78aとの光軸上の空気の間隔である面間隔D114は0.370である。
The radius of curvature R113 of surface 77a, whose surface number i is 113, is 9.37, the surface spacing D113, which is the distance on the optical axis between surface 77a and the closest surface 77b on the image side, is 1.093, and the refractive index Nd113 is 1.5445. The Abbe number Vd113 of surface 77a, i.e., the Abbe number V107 of lens 77, is 55.987. The radius of curvature R114 of surface 77b, whose surface number i is 114, is 3.05, and the surface spacing D114, which is the distance through air on the optical axis between surface 78a, which is the closest surface 77b on the image side, is 0.370.
面番号iが115である面78aの曲率半径R115は、無限大であり、面78aより像側で最も近い面78bとの光軸上の間隔である面間隔D115は0.210であり、屈折率Nd115は1.5168である。面78aのアッベ数Vd115、即ちIRカットフィルタ78のアッベ数V108は64.167である。面番号iが116である面78bの曲率半径R116は、無限大であり、撮像面31aとの光軸上の間隔である面間隔D116は0.600である。
The radius of curvature R115 of surface 78a, whose surface number i is 115, is infinity, the surface spacing D115, which is the distance on the optical axis between surface 78a and the closest surface 78b on the image side, is 0.210, and the refractive index Nd115 is 1.5168. The Abbe number Vd115 of surface 78a, i.e., the Abbe number V108 of the IR cut filter 78, is 64.167. The radius of curvature R116 of surface 78b, whose surface number i is 116, is infinity, and the surface spacing D116, which is the distance on the optical axis between surface 78a and the imaging surface 31a, is 0.600.
以上のように、レンズ71乃至77のうちの隣り合う2つのレンズからなる各ペアの対向する面どうしの面間隔D102,D104,D106,D108,D110、およびD112の中で、面間隔D108が最も長い。
As described above, among the surface distances D102, D104, D106, D108, D110, and D112 between the opposing surfaces of each pair of adjacent lenses among lenses 71 to 77, surface distance D108 is the longest.
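The condition can be verified numerically from the spacings in FIG. 3. The short sketch below (not part of the patent text; the tabulated spacings are assumed to be in millimetres) picks the largest inter-lens air gap and also sums all on-axis spacings to obtain the total length TTL from surface 71a to the imaging surface 31a:

```python
# On-axis surface spacings D101..D116 from the lens data table (FIG. 3), assumed mm.
spacings = {
    101: 0.945, 102: 0.020, 103: 0.303, 104: 0.496,
    105: 0.363, 106: 0.361, 107: 0.524, 108: 1.107,
    109: 0.330, 110: 0.225, 111: 0.530, 112: 1.072,
    113: 1.093, 114: 0.370, 115: 0.210, 116: 0.600,
}

# Air gaps between adjacent lenses are the even-numbered spacings D102..D112.
air_gaps = {i: spacings[i] for i in (102, 104, 106, 108, 110, 112)}
widest = max(air_gaps, key=air_gaps.get)

# TTL: total on-axis distance from surface 71a to the imaging surface 31a.
ttl = sum(spacings.values())

print(f"widest air gap: D{widest} = {air_gaps[widest]} (between lens 74 and lens 75)")
print(f"TTL = {ttl:.3f}")
```

Running this confirms that D108, the gap between the fourth and fifth lenses, is the largest of the six air gaps.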
<各面の非球面データの第1の例>
図4は、面71a乃至77aおよび面71b乃至77bの非球面データを示す表である。
<First Example of Aspheric Data for Each Surface>
FIG. 4 is a table showing the aspheric data of the surfaces 71a to 77a and the surfaces 71b to 77b.
図4の表の各行は、面71a乃至77aおよび面71b乃至77bそれぞれに対応する。各列は、左側から順に、面番号i、円錐係数K、4次非球面係数、6次非球面係数、8次非球面係数、10次非球面係数、12次非球面係数、14次非球面係数、16次非球面係数、18次非球面係数、20次非球面係数に対応する。
The rows in the table in FIG. 4 correspond to surfaces 71a to 77a and surfaces 71b to 77b, respectively. The columns correspond, from the left, to the surface number i, the conic coefficient K, the 4th-order aspheric coefficient, the 6th-order aspheric coefficient, the 8th-order aspheric coefficient, the 10th-order aspheric coefficient, the 12th-order aspheric coefficient, the 14th-order aspheric coefficient, the 16th-order aspheric coefficient, the 18th-order aspheric coefficient, and the 20th-order aspheric coefficient.
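The coefficients tabulated in FIG. 4 are of the kind used in the conventional even-asphere sag equation. Assuming that standard convention (the patent's own defining expression appears elsewhere in the specification), the sag z at radial height h from the optical axis is:

```latex
z(h) = \frac{h^{2}/R}{1 + \sqrt{1 - (1 + K)\,h^{2}/R^{2}}}
       + A_{4} h^{4} + A_{6} h^{6} + \cdots + A_{20} h^{20}
```

where R is the paraxial radius of curvature from FIG. 3, K is the conic coefficient, and A₄ through A₂₀ are the 4th- to 20th-order aspheric coefficients listed in FIG. 4.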
図4に示すように、面番号iが101である面71aの円錐係数Kは、-2.42803×10-1である。4次非球面係数、6次非球面係数、8次非球面係数、10次非球面係数、12次非球面係数、14次非球面係数、および16次非球面係数は、それぞれ、3.62272×10-3, 3.65444×10-3, -3.36391×10-3, 2.41910×10-3,-9.31617×10-4, 1.91619×10-4, -1.63364×10-5である。
As shown in FIG. 4, the conic coefficient K of the surface 71a whose surface number i is 101 is -2.42803×10⁻¹. The fourth-order, sixth-order, eighth-order, tenth-order, twelfth-order, fourteenth-order, and sixteenth-order aspheric coefficients are 3.62272×10⁻³, 3.65444×10⁻³, -3.36391×10⁻³, 2.41910×10⁻³, -9.31617×10⁻⁴, 1.91619×10⁻⁴, and -1.63364×10⁻⁵, respectively.
面番号iが102である面71bの円錐係数Kは、2.97650である。4次非球面係数、6次非球面係数、8次非球面係数、10次非球面係数、および12次非球面係数は、それぞれ、-2.83049×10-3, 6.81582×10-3, -6.86480×10-3, 3.25922×10-3,-3.50239×10-4である。14次非球面係数、16次非球面係数、18次非球面係数、および20次非球面係数は、それぞれ、-1.63020×10-4, -4.82494×10-5, 5.67182×10-5,-1.00277×10-5である。
The conic coefficient K of the surface 71b whose surface number i is 102 is 2.97650. The fourth-order, sixth-order, eighth-order, tenth-order, and twelfth-order aspheric coefficients are -2.83049×10⁻³, 6.81582×10⁻³, -6.86480×10⁻³, 3.25922×10⁻³, and -3.50239×10⁻⁴, respectively. The fourteenth-order, sixteenth-order, eighteenth-order, and twentieth-order aspheric coefficients are -1.63020×10⁻⁴, -4.82494×10⁻⁵, 5.67182×10⁻⁵, and -1.00277×10⁻⁵, respectively.
面番号iが103である面72aの円錐係数Kは2.61919である。4次非球面係数、6次非球面係数、8次非球面係数、10次非球面係数、および12次非球面係数は、それぞれ、-2.21612×10-2, 9.91095×10-3, -1.76651×10-2, 1.61833×10-2, -9.24306×10-3である。14次非球面係数、16次非球面係数、18次非球面係数、および20次非球面係数は、それぞれ、3.51387×10-3,-1.11137×10-3, 2.83264×10-4,-3.85854×10-5である。
The conic coefficient K of the surface 72a whose surface number i is 103 is 2.61919. The fourth-order, sixth-order, eighth-order, tenth-order, and twelfth-order aspheric coefficients are -2.21612×10⁻², 9.91095×10⁻³, -1.76651×10⁻², 1.61833×10⁻², and -9.24306×10⁻³, respectively. The fourteenth-order, sixteenth-order, eighteenth-order, and twentieth-order aspheric coefficients are 3.51387×10⁻³, -1.11137×10⁻³, 2.83264×10⁻⁴, and -3.85854×10⁻⁵, respectively.
The conic coefficient K of the surface 72b whose surface number i is 104 is -1.94467×10⁻¹. The 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -1.64596×10⁻³, 1.61079×10⁻², -2.84858×10⁻², 4.03268×10⁻², and -2.98809×10⁻², respectively. The 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are 1.16734×10⁻², -1.18767×10⁻³, 6.24837×10⁻⁴, and 1.67035×10⁻⁴, respectively.
The conic coefficient K of the surface 73a whose surface number i is 105 is -1.07370×10². The 4th-, 6th-, 8th-, 10th-, 12th-, and 14th-order aspheric coefficients are -6.22065×10⁻³, 1.45641×10⁻³, -3.00430×10⁻³, 3.45145×10⁻³, -5.07824×10⁻⁴, and 2.27740×10⁻⁵, respectively.
The conic coefficient K of the surface 73b whose surface number i is 106 is 1.01266×10². The 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -8.15811×10⁻³, -9.45658×10⁻³, 2.15704×10⁻², -3.18190×10⁻², and 3.53709×10⁻², respectively. The 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are -2.33788×10⁻², 8.88652×10⁻³, 1.58650×10⁻³, and 5.25725×10⁻⁵, respectively.
The conic coefficient K of the surface 74a whose surface number i is 107 is -8.54794×10. The 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -1.05190×10⁻², -1.79704×10⁻², 2.92681×10⁻², -3.16587×10⁻², and 2.14221×10⁻², respectively. The 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are -7.63719×10⁻³, 1.03705×10⁻³, 1.54916×10⁻⁴, and -4.93285×10⁻⁵, respectively.
The conic coefficient K of the surface 74b whose surface number i is 108 is 1.66484×10. The 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -8.80471×10⁻³, -1.04782×10⁻³, -5.21460×10⁻³, 6.19538×10⁻³, and -4.35003×10⁻³, respectively. The 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are 1.92361×10⁻³, -4.56399×10⁻⁴, 4.04193×10⁻⁵, and 2.88880×10⁻⁶, respectively.
The conic coefficient K of the surface 75a whose surface number i is 109 is 1.37649×10. The 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are 2.14908×10⁻², -2.14981×10⁻², 9.35939×10⁻³, -4.01620×10⁻³, and 1.02870×10⁻³, respectively. The 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are -2.09912×10⁻⁴, -1.19385×10⁻⁵, 2.16807×10⁻⁵, and -3.86333×10⁻⁶, respectively.
The conic coefficient K of the surface 75b whose surface number i is 110 is -3.16667×10. The 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -1.79352×10⁻², -4.62014×10⁻³, 6.22425×10⁻³, -3.48550×10⁻³, and 8.54536×10⁻⁴, respectively. The 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are -9.92036×10⁻⁵, 4.79949×10⁻⁶, 9.85895×10⁻⁹, and -6.55434×10⁻⁹, respectively.
The conic coefficient K of the surface 76a whose surface number i is 111 is -3.04052×10⁻¹. The 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -5.33124×10⁻², 7.90951×10⁻³, -3.49156×10⁻³, 9.13596×10⁻⁴, and -1.36795×10⁻⁴, respectively. The 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are 1.19822×10⁻⁵, -5.67900×10⁻⁷, 1.08192×10⁻⁸, and -4.01201×10⁻¹¹, respectively.
The conic coefficient K of the surface 76b whose surface number i is 112 is -1.86569×10. The 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are 3.15385×10⁻², -1.87448×10⁻², 4.44523×10⁻³, -6.72679×10⁻⁴, and 6.83777×10⁻⁵, respectively. The 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are -4.49314×10⁻⁶, 1.65027×10⁻⁷, -1.95722×10⁻⁹, and -3.67513×10⁻¹¹, respectively.
The conic coefficient K of the surface 77a whose surface number i is 113 is -7.89633. The 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -8.88581×10⁻², 2.11957×10⁻², -2.73125×10⁻³, 2.00134×10⁻⁴, and -7.85631×10⁻⁶, respectively. The 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are 1.09565×10⁻⁷, 2.89958×10⁻⁹, -1.28783×10⁻¹⁰, and 1.38563×10⁻¹², respectively.
The conic coefficient K of the surface 77b whose surface number i is 114 is -7.39089. The 4th-, 6th-, 8th-, 10th-, and 12th-order aspheric coefficients are -3.44999×10⁻², 6.33229×10⁻³, -8.14233×10⁻⁴, 6.58014×10⁻⁵, and -3.15114×10⁻⁶, respectively. The 14th-, 16th-, 18th-, and 20th-order aspheric coefficients are 7.02794×10⁻⁸, 2.614142×10⁻¹⁰, -4.01785×10⁻¹¹, and 4.99232×10⁻¹³, respectively.
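The surface shapes tabulated above are fully determined by the conic coefficient K and the even-order aspheric coefficients. As a minimal sketch, assuming the standard even-asphere sag equation z(r) = cr²/(1 + √(1 − (1+K)c²r²)) + ΣAₙrⁿ with c = 1/R (the defining expression actually used is given elsewhere in the specification), the surface-101 data can be evaluated as follows; the radius of curvature R appears in the FIG. 4 table, which is not reproduced here, so the value of R below is a placeholder:

```python
import math

def asphere_sag(r, R, K, coeffs):
    """Sag z(r) of an even asphere: conic base term plus the even-order
    polynomial terms A_n * r**n (n = 4, 6, ...)."""
    c = 1.0 / R  # paraxial curvature
    z = c * r * r / (1.0 + math.sqrt(1.0 - (1.0 + K) * c * c * r * r))
    for n, a in coeffs.items():
        z += a * r ** n
    return z

# Surface 101 (surface 71a): conic coefficient and aspheric coefficients from the text.
K_101 = -2.42803e-1
A_101 = {4: 3.62272e-3, 6: 3.65444e-3, 8: -3.36391e-3, 10: 2.41910e-3,
         12: -9.31617e-4, 14: 1.91619e-4, 16: -1.63364e-5}

# R=2.0 is a placeholder value; the actual radius R101 is listed in the FIG. 4 table.
print(asphere_sag(0.5, R=2.0, K=K_101, coeffs=A_101))
```

Near the axis the polynomial terms vanish faster than r², so the sag reduces to the paraxial parabola r²/(2R), which is a quick sanity check on the implementation.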
<Effect of surface spacing>
FIG. 5 is a diagram for explaining the effect of the surface distance D108 between the surface 74b and the surface 75a.
Note that in FIG. 5, in order to simplify the drawing, the lens 76, the lens 77, and the IR cut filter 78 between the lens 75 and the imaging surface 31a are not shown.
Compared to when the surface distance D108 is short as shown in FIG. 5A, when the surface distance D108 is long as shown in FIG. 5B, the light rays reaching the high image height on the imaging surface 31a can be spread by Δh in the direction away from the optical axis.
The lens optical system 25 is therefore designed so that among the surface distances D102, D104, D106, D108, D110, and D112 between the opposing surfaces of each pair of adjacent lenses among lenses 71 to 77, surface distance D108 is the longest. This makes it possible to separate light rays that reach high image heights on the imaging surface 31a. As a result, even if the solid-state imaging element 21 is enlarged, coma aberration, field curvature aberration, and distortion aberration on the high image height side can be well corrected.
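The design condition in the preceding paragraph is easy to state programmatically. The sketch below is illustrative only: the gap labels follow the text, but the numeric values are placeholders, since the actual spacings D102 to D112 appear in the FIG. 4 table rather than in this passage.

```python
# Inter-lens air gaps for lenses 71 to 77 (labels from the text; the values
# here are placeholders -- the actual spacings appear in the FIG. 4 table).
gaps = {"D102": 0.05, "D104": 0.10, "D106": 0.08,
        "D108": 0.60, "D110": 0.12, "D112": 0.20}

# The design condition: D108 must be the longest of the six air gaps.
longest = max(gaps, key=gaps.get)
assert longest == "D108"
print(longest)
```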
In this case, it becomes difficult to correct the light rays in the paraxial region, but the lens optical system 25 makes this correction possible by giving the lenses 75 to 77 the shapes described above.
<First Example of Spherical Aberration, Field Curvature, and Distortion>
FIG. 6 is a graph showing the spherical aberration, the field curvature, and the distortion that occur in the lens optical system 25 of FIG. 2.
A in FIG. 6 is a graph showing the longitudinal spherical aberration for each wavelength of light having wavelengths of 443.58 nm, 486.13 nm, 546.07 nm, 587.56 nm, and 656.27 nm that occurs in the lens optical system 25 in FIG. 2. In the graph of A in FIG. 6, the horizontal axis represents the spherical aberration [mm], and the vertical axis represents the normalized pupil coordinate. This also applies to A in FIG. 10, A in FIG. 14, A in FIG. 18, and A in FIG. 22, which will be described later.
B of FIG. 6 is a graph showing the field curvature that occurs in the lens optical system 25 of FIG. 2. In the graph of B of FIG. 6, the horizontal axis shows the field curvature [mm], and the vertical axis shows the angle [degree] corresponding to the incident position of the light ray in the sagittal or tangential direction. In B of FIG. 6, the solid line shows the relationship between the incident position in the tangential direction and the field curvature, and the dotted line shows the relationship between the incident position in the sagittal direction and the field curvature. The same applies to B of FIG. 10, B of FIG. 14, B of FIG. 18, and B of FIG. 22, which will be described later. The difference between the field curvature in the sagittal direction and that in the tangential direction is astigmatism.
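As a small illustration of the last sentence, astigmatism at a given field angle can be computed as the difference between the two field-curvature readings at that angle; the sign convention (tangential minus sagittal) and the numeric values below are assumptions for illustration, not values from the figures.

```python
def astigmatism(tangential_mm, sagittal_mm):
    """Astigmatism at one field angle: the difference between the tangential
    and sagittal field-curvature values [mm] (sign convention assumed)."""
    return tangential_mm - sagittal_mm

# Hypothetical readings [mm] at one field angle of a plot like B of FIG. 6.
print(astigmatism(-0.020, -0.005))
```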
FIG. 6C is a graph showing the distortion aberration of light with a wavelength of 587.56 nm that occurs in the lens optical system 25 of FIG. 2. In the graph of FIG. 6C, the horizontal axis shows the distortion aberration [%], and the vertical axis shows the angle of incidence of the light ray [degrees]. This also applies to FIG. 10C, FIG. 14C, FIG. 18C, and FIG. 25C, which will be described later.
<Second Configuration Example of Lens Optical System>
FIG. 7 is a cross-sectional view showing a second configuration example of the lens optical system 25.
In the lens optical system 25 of FIG. 7, parts corresponding to those in the lens optical system 25 of FIG. 2 are given the same reference numerals. Therefore, the description of those parts will be omitted as appropriate, and the description will focus on the parts that differ from the lens optical system 25 of FIG. 2. The lens optical system 25 of FIG. 7 differs from the lens optical system 25 of FIG. 2 in that lenses 71 to 77 are replaced by lenses 171 to 177, and is otherwise configured in the same way as the lens optical system 25 of FIG. 2.
Lenses 171 to 177 differ from lenses 71 to 77 in the lens data, the aspheric data of each surface, and in that lenses 173 and 176 have negative refractive power near the optical axis, but are otherwise configured in the same manner as lenses 71 to 77. Therefore, the following describes the lens data of lenses 171 to 177 and the aspheric data of the object-side surfaces 171a to 177a and the image-side surfaces 171b to 177b.
<Second example of lens data for each lens>
FIG. 8 is a table showing lens data for lenses 171 to 177.
The rows in the table in FIG. 8 correspond to surfaces 171a to 177a, surfaces 171b to 177b, and surfaces 78a and 78b. From the left, the columns correspond to the surface number i, the radius of curvature Ri, the surface spacing Di, the refractive index Ndi for the d line, and the Abbe number Vdi for the d line.
In this specification, the surfaces 171a, 171b, 172a, 172b, 173a, 173b, 174a, 174b, 175a, 175b, 176a, 176b, 177a, and 177b are assigned surface numbers from 201 to 214, in that order. For the values in each column of the table in FIG. 8, see FIG. 8.
As shown in the table of FIG. 8, among the surface distances D202, D204, D206, D208, D210, and D212 between the opposing surfaces of each pair of adjacent lenses among lenses 171 to 177, surface distance D208 is the longest.
<Second Example of Aspheric Data for Each Surface>
FIG. 9 is a table showing the aspheric surface data of the surfaces 171a to 177a and the surfaces 171b to 177b.
Each row in the table in FIG. 9 corresponds to surfaces 171a to 177a and surfaces 171b to 177b. From the left, each column corresponds to the surface number i, the conic coefficient K, and the 4th-, 6th-, 8th-, 10th-, 12th-, 14th-, 16th-, 18th-, and 20th-order aspheric coefficients. For the numerical values in each column of the table in FIG. 9, see FIG. 9.
<Second Example of Spherical Aberration, Field Curvature, and Distortion>
FIG. 10 is a graph showing the spherical aberration, the field curvature, and the distortion that occur in the lens optical system 25 of FIG. 7.
A in FIG. 10 is a graph showing the longitudinal spherical aberration for each wavelength of light having wavelengths of 443.58 nm, 486.13 nm, 546.07 nm, 587.56 nm, and 656.27 nm that occurs in the lens optical system 25 in FIG. 7. B in FIG. 10 is a graph showing the field curvature that occurs in the lens optical system 25 in FIG. 7. C in FIG. 10 is a graph showing the distortion aberration for light having a wavelength of 587.56 nm that occurs in the lens optical system 25 in FIG. 7.
<Third Configuration Example of Lens Optical System>
FIG. 11 is a cross-sectional view showing a third configuration example of the lens optical system 25.
In the lens optical system 25 of FIG. 11, parts corresponding to those in the lens optical system 25 of FIG. 2 are given the same reference numerals. Therefore, the description of those parts will be omitted as appropriate, and the description will focus on the parts that differ from the lens optical system 25 of FIG. 2. The lens optical system 25 of FIG. 11 differs from the lens optical system 25 of FIG. 2 in that lenses 71 to 77 are replaced by lenses 271 to 277, and is otherwise configured in the same way as the lens optical system 25 of FIG. 2.
Lenses 271 to 277 differ from lenses 71 to 77 in the lens data, the aspheric data of each surface, and in that lens 275 has positive refractive power near the optical axis and lens 276 has negative refractive power near the optical axis, but are otherwise configured in the same manner as lenses 71 to 77. Therefore, the following describes the lens data of lenses 271 to 277 and the aspheric data of the object-side surfaces 271a to 277a and the image-side surfaces 271b to 277b.
<Third example of lens data for each lens>
FIG. 12 is a table showing lens data for lenses 271 to 277.
The rows in the table in FIG. 12 correspond to surfaces 271a to 277a, surfaces 271b to 277b, and surfaces 78a and 78b. From the left, the columns correspond to the surface number i, the radius of curvature Ri, the surface spacing Di, the refractive index Ndi for the d line, and the Abbe number Vdi for the d line.
In this specification, the surfaces 271a, 271b, 272a, 272b, 273a, 273b, 274a, 274b, 275a, 275b, 276a, 276b, 277a, and 277b are assigned surface numbers from 301 to 314, in that order. See FIG. 12 for the values in each column of the table in FIG. 12.
As shown in the table of FIG. 12, among the surface distances D302, D304, D306, D308, D310, and D312 between the opposing surfaces of each pair of adjacent lenses among lenses 271 to 277, surface distance D308 is the longest.
<Third example of aspheric data for each surface>
FIG. 13 is a table showing the aspheric surface data of the surfaces 271a to 277a and the surfaces 271b to 277b.
Each row in the table in FIG. 13 corresponds to surfaces 271a to 277a and surfaces 271b to 277b. From the left, each column corresponds to the surface number i, conic coefficient K, 4th order aspheric coefficient, 6th order aspheric coefficient, 8th order aspheric coefficient, 10th order aspheric coefficient, 12th order aspheric coefficient, 14th order aspheric coefficient, 16th order aspheric coefficient, 18th order aspheric coefficient, and 20th order aspheric coefficient. For the numerical values in each column of the table in FIG. 13, see FIG. 13.
<Third Example of Spherical Aberration, Field Curvature, and Distortion>
FIG. 14 is a graph showing the spherical aberration, the field curvature, and the distortion that occur in the lens optical system 25 of FIG. 11.
A in FIG. 14 is a graph showing the longitudinal spherical aberration for each wavelength of light having wavelengths of 443.58 nm, 486.13 nm, 546.07 nm, 587.56 nm, and 656.27 nm that occurs in the lens optical system 25 in FIG. 11. B in FIG. 14 is a graph showing the field curvature that occurs in the lens optical system 25 in FIG. 11. C in FIG. 14 is a graph showing the distortion aberration for light having a wavelength of 587.56 nm that occurs in the lens optical system 25 in FIG. 11.
<Fourth Configuration Example of Lens Optical System>
FIG. 15 is a cross-sectional view showing a fourth configuration example of the lens optical system 25.
In the lens optical system 25 of FIG. 15, parts corresponding to those in the lens optical system 25 of FIG. 2 are given the same reference numerals. Therefore, the description of those parts will be omitted as appropriate, and the description will focus on the parts that differ from the lens optical system 25 of FIG. 2. The lens optical system 25 of FIG. 15 differs from the lens optical system 25 of FIG. 2 in that lenses 71 to 77 are replaced by lenses 371 to 377, and is otherwise configured in the same way as the lens optical system 25 of FIG. 2.
Lenses 371 to 377 differ from lenses 71 to 77 in the lens data, the aspheric data of each surface, and in that lens 376 has negative refractive power near the optical axis, but are otherwise configured in the same manner as lenses 71 to 77. Therefore, the following describes the lens data of lenses 371 to 377 and the aspheric data of the object-side surfaces 371a to 377a and the image-side surfaces 371b to 377b.
<Fourth example of lens data for each lens>
FIG. 16 is a table showing lens data for lenses 371 to 377.
The rows in the table in FIG. 16 correspond to surfaces 371a to 377a, surfaces 371b to 377b, and surfaces 78a and 78b. From the left, the columns correspond to the surface number i, the radius of curvature Ri, the surface spacing Di, the refractive index Ndi for the d line, and the Abbe number Vdi for the d line.
In this specification, surface numbers from 401 to 414 are assigned to surfaces 371a, 371b, 372a, 372b, 373a, 373b, 374a, 374b, 375a, 375b, 376a, 376b, 377a, and 377b in order. See FIG. 16 for the values in each column of the table in FIG. 16.
As shown in the table of FIG. 16, among the surface distances D402, D404, D406, D408, D410, and D412 between the opposing surfaces of each pair of adjacent lenses among lenses 371 to 377, surface distance D408 is the longest.
<Fourth example of aspheric surface data for each surface>
FIG. 17 is a table showing the aspheric surface data of the surfaces 371a to 377a and the surfaces 371b to 377b.
Each row in the table in FIG. 17 corresponds to surfaces 371a to 377a and surfaces 371b to 377b. From the left, each column corresponds to the surface number i, conic coefficient K, 4th order aspheric coefficient, 6th order aspheric coefficient, 8th order aspheric coefficient, 10th order aspheric coefficient, 12th order aspheric coefficient, 14th order aspheric coefficient, 16th order aspheric coefficient, 18th order aspheric coefficient, and 20th order aspheric coefficient. For the numerical values in each column of the table in FIG. 17, see FIG. 17.
<Fourth Example of Spherical Aberration, Field Curvature, and Distortion>
FIG. 18 is a graph showing the spherical aberration, the field curvature, and the distortion that occur in the lens optical system 25 of FIG. 15.
A in FIG. 18 is a graph showing the longitudinal spherical aberration for each wavelength of light having wavelengths of 443.58 nm, 486.13 nm, 546.07 nm, 587.56 nm, and 656.27 nm that occurs in the lens optical system 25 in FIG. 15. B in FIG. 18 is a graph showing the field curvature that occurs in the lens optical system 25 in FIG. 15. C in FIG. 18 is a graph showing the distortion aberration for light having a wavelength of 587.56 nm that occurs in the lens optical system 25 in FIG. 15.
<Fifth Configuration Example of Lens Optical System>
FIG. 19 is a cross-sectional view showing a fifth configuration example of the lens optical system 25.
In the lens optical system 25 of FIG. 19, parts corresponding to those in the lens optical system 25 of FIG. 2 are given the same reference numerals. Therefore, the description of those parts will be omitted as appropriate, and the description will focus on the parts that differ from the lens optical system 25 of FIG. 2. The lens optical system 25 of FIG. 19 differs from the lens optical system 25 of FIG. 2 in that lenses 71 to 77 are replaced by lenses 471 to 477, and is otherwise configured in the same way as the lens optical system 25 of FIG. 2.
Lenses 471 to 477 differ from lenses 71 to 77 in their lens data and the aspheric data of each surface, but are otherwise configured in the same manner as lenses 71 to 77. The following therefore describes the lens data of lenses 471 to 477 and the aspheric data of the object-side surfaces 471a to 477a and the image-side surfaces 471b to 477b.
<Fifth Example of Lens Data for Each Lens>
FIG. 20 is a table showing the lens data for lenses 471 to 477.
Each row of the table in FIG. 20 corresponds to one of surfaces 471a to 477a, surfaces 471b to 477b, and surfaces 78a and 78b. From the left, the columns correspond to the surface number i, the radius of curvature Ri, the surface spacing Di, the refractive index Ndi for the d-line, and the Abbe number Vdi for the d-line.
In this specification, surface numbers 501 to 514 are assigned, in order, to surfaces 471a, 471b, 472a, 472b, 473a, 473b, 474a, 474b, 475a, 475b, 476a, 476b, 477a, and 477b. See FIG. 20 for the values in each column of the table.
As shown in the table of FIG. 20, among the surface spacings D502, D504, D506, D508, D510, and D512 between the opposing surfaces of each pair of adjacent lenses among lenses 471 to 477, surface spacing D508 is the longest.
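Identifying the longest air gap above amounts to taking a simple maximum over the inter-lens surface spacings. A minimal sketch with hypothetical spacing values (the actual values are listed in FIG. 20):

```python
# Hypothetical inter-lens surface spacings in mm; the actual values for
# spacings D502-D512 are listed in the table of FIG. 20.
air_gaps = {"D502": 0.03, "D504": 0.05, "D506": 0.10,
            "D508": 0.60, "D510": 0.12, "D512": 0.08}

# The pair of adjacent lenses with the largest opposing-surface spacing
longest = max(air_gaps, key=air_gaps.get)
```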
<Fifth Example of Aspheric Surface Data for Each Surface>
FIG. 21 is a table showing the aspheric surface data of surfaces 471a to 477a and surfaces 471b to 477b.
Each row of the table in FIG. 21 corresponds to one of surfaces 471a to 477a and surfaces 471b to 477b. From the left, the columns correspond to the surface number i, the conic constant K, and the 4th-, 6th-, 8th-, 10th-, 12th-, 14th-, 16th-, 18th-, and 20th-order aspheric coefficients. See FIG. 21 for the values in each column of the table.
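The conic constant K and the 4th- through 20th-order coefficients in FIG. 21 are, by the usual convention for tables of this form, the parameters of the standard even-asphere sag equation z(h) = ch² / (1 + √(1 − (1 + K)c²h²)) + Σ A₂ᵢ h^(2i), with c = 1/R the paraxial curvature; the patent does not spell the equation out, so this is the conventional reading rather than a statement from the source. A minimal sketch with hypothetical coefficient values:

```python
import math

def asphere_sag(h, R, K, coeffs):
    """Sag z(h) of a standard even asphere at radial height h.

    R: paraxial radius of curvature, K: conic constant,
    coeffs: even-order aspheric coefficients (A4, A6, ..) for h**4, h**6, ..
    """
    c = 1.0 / R  # paraxial curvature
    # conic base surface
    z = c * h**2 / (1.0 + math.sqrt(1.0 - (1.0 + K) * c**2 * h**2))
    # polynomial correction terms: A4*h^4 + A6*h^6 + ... + A20*h^20
    for i, a in enumerate(coeffs):
        z += a * h ** (2 * i + 4)
    return z

# Hypothetical parameter values; the actual ones are listed in FIG. 21.
sag = asphere_sag(h=0.5, R=2.0, K=-1.0, coeffs=[1e-3, -1e-5])
```

For K = -1 the base surface reduces to a parabola z = ch²/2, which gives a quick sanity check on the implementation.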
<Fifth Example of Spherical Aberration, Field Curvature, and Distortion>
FIG. 22 is a graph showing the spherical aberration, field curvature, and distortion that occur in the lens optical system 25 of FIG. 19.
A in FIG. 22 is a graph showing the longitudinal spherical aberration at wavelengths of 443.58 nm, 486.13 nm, 546.07 nm, 587.56 nm, and 656.47 nm that occurs in the lens optical system 25 of FIG. 19. B in FIG. 22 is a graph showing the field curvature that occurs in the lens optical system 25 of FIG. 19. C in FIG. 22 is a graph showing the distortion for light with a wavelength of 587.56 nm that occurs in the lens optical system 25 of FIG. 19.
<Values of Parameters and Conditional Expressions>
FIG. 23 is a table showing the values of the parameters and conditional expressions for the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19.
From top to bottom, the rows of the table in FIG. 23 correspond to TTL/(D6+D8), (R2+R1)/(R2-R1), (R4+R3)/(R4-R3), (R8+R7)/(R8-R7), V1, V2, |f3/f|, f1/f, CRA, and TTL/IH. From left to right, the columns correspond to the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19.
Here, D6 is a collective term for the surface spacings Dj06 (j = 1, 2, 3, 4, 5), and D8 is a collective term for the surface spacings Dj08. R1 to R4 are collective terms for the radii of curvature Rj01 to Rj04, respectively, and R7 and R8 for the radii of curvature Rj07 and Rj08, respectively. V1 and V2 are collective terms for the Abbe numbers Vj01 and Vj02, respectively. f is the focal length of the entire lens optical system 25. f1 is a collective term for the focal lengths of lenses 71, 171, 271, 371, and 471, and f3 for the focal lengths of lenses 73, 173, 273, 373, and 473. CRA is the maximum angle formed between the imaging surface 31a and the chief ray of the light incident on the imaging surface 31a from the lens optical system 25. IH is the distance from the center of the imaging surface 31a to the position reached by the chief ray of the light at the maximum angle of view, that is, the maximum image height of the lens optical system 25.
As shown in FIG. 23, in the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19, TTL/(D6+D8) is 5.73, 6.33, 6.19, 5.79, and 6.46, respectively. The lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 therefore satisfy the following conditional expression (1).
5.0 < TTL/(D6+D8) < 10.0 ... (1)
If the value exceeds the upper limit of conditional expression (1), the total length TTL becomes long relative to the spacing between lens 73 (173, 273, 373, 473) and lens 75 (175, 275, 375, 475), making it difficult to miniaturize the lens optical system 25. If the value falls below the lower limit of conditional expression (1), the total length TTL becomes short relative to the spacing between lens 73 (173, 273, 373, 473) and lens 75 (175, 275, 375, 475). This makes it difficult for lens 74 (174, 274, 374, 474) to suitably correct the exit angle of the light rays, and as a result it becomes difficult to increase the size of the solid-state imaging element 21.
As shown in FIG. 23, in the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19, (R2+R1)/(R2-R1) is 1.90, 1.76, 1.78, 2.50, and 2.57, respectively. The lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 therefore satisfy the following conditional expression (2).
0.5 < (R2+R1)/(R2-R1) < 3.0 ... (2)
If the value exceeds the upper limit of conditional expression (2), the power (refractive power) of lens 71 (171, 271, 371, 471) becomes weak and the total length TTL becomes long, making it difficult to miniaturize the lens optical system 25. If the value falls below the lower limit of conditional expression (2), the power of lens 71 (171, 271, 371, 471) becomes strong and spherical aberration worsens.
As shown in FIG. 23, in the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19, (R4+R3)/(R4-R3) is -8.02, -7.77, -7.57, -9.18, and -8.11, respectively. The lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 therefore satisfy the following conditional expression (3).
-10.0 < (R4+R3)/(R4-R3) < -3.0 ... (3)
If the value exceeds the upper limit of conditional expression (3), the power of lens 72 (172, 272, 372, 472) becomes strong, making it difficult to correct spherical aberration. If the value falls below the lower limit of conditional expression (3), the power of lens 72 (172, 272, 372, 472) becomes weak, so the total length TTL becomes long and it is difficult to miniaturize the lens optical system 25.
As shown in FIG. 23, in the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19, (R8+R7)/(R8-R7) is -2.82, -1.14, -1.33, -1.28, and -1.19, respectively. The lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 therefore satisfy the following conditional expression (4).
-10.0 < (R8+R7)/(R8-R7) < -1.0 ... (4)
If the value exceeds the upper limit of conditional expression (4), the power of lens 72 (172, 272, 372, 472) becomes strong, making it difficult to correct coma and astigmatism. If the value falls below the lower limit of conditional expression (4), the power of lens 72 (172, 272, 372, 472) becomes weak and the exit angle of the light rays relative to the optical axis becomes gentle (small), making it difficult to increase the size of the solid-state imaging element 21.
As shown in FIG. 23, the Abbe numbers V1 of the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 are 71.68, 71.68, 64.92, 81.00, and 71.68, respectively. The lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 therefore satisfy the following conditional expression (5).
50.0 < V1 < 90.0 ... (5)
If the value exceeds the upper limit of conditional expression (5), the refractive index of lens 71 (171, 271, 371, 471) is low, making it difficult to correct spherical aberration. If the value falls below the lower limit of conditional expression (5), it is difficult to correct axial chromatic aberration.
As shown in FIG. 23, in the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19, V2 is 19.32, 19.32, 17.67, 19.32, and 19.32, respectively. The lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 therefore satisfy the following conditional expression (6).
15.0 < V2 < 20.0 ... (6)
If the value exceeds the upper limit of conditional expression (6), the refractive index of lens 72 (172, 272, 372, 472) is low, making it difficult to correct spherical aberration. If the value falls below the lower limit of conditional expression (6), it is difficult to correct axial chromatic aberration.
As shown in FIG. 23, in the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19, |f3/f| is 5.62, 23.37, 51.29, 6.25, and 7.94, respectively. The lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 therefore satisfy the following conditional expression (7).
5 < |f3/f| < 100.0 ... (7)
If the value exceeds the upper limit of conditional expression (7), the focal length f3 is long, making it difficult to correct the exit angle of the light rays. If the value falls below the lower limit of conditional expression (7), the focal length f3 is short, making it difficult to correct various aberrations.
As shown in FIG. 23, in the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19, f1/f is 0.92, 0.87, 0.90, 1.03, and 0.89, respectively. The lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 therefore satisfy the following conditional expression (8).
0 < f1/f < 2.0 ... (8)
If the value exceeds the upper limit of conditional expression (8), the focal length f1 is long, making it difficult to correct spherical aberration.
As shown in FIG. 23, the CRA of the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 is 39.30 in every case. The lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 therefore satisfy the following conditional expression (9).
CRA < 40.0 ... (9)
If the value exceeds the upper limit of conditional expression (9), the angle of the light rays incident on the imaging surface 31a becomes large, and the conversion efficiency of the solid-state imaging element 21 decreases.
As shown in FIG. 23, in the lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19, TTL/IH is 1.02, 1.03, 1.03, 1.03, and 1.11, respectively. The lens optical systems 25 of FIGS. 2, 7, 11, 15, and 19 therefore satisfy the following conditional expression (10).
1.0 < TTL/IH < 1.3 ... (10)
If the value exceeds the upper limit of conditional expression (10), the total length TTL of the lens optical system 25 relative to the solid-state imaging element 21 becomes long, making it difficult to miniaturize the lens optical system 25 and, by extension, the imaging device 10. If the value falls below the lower limit of conditional expression (10), it is difficult to correct various aberrations.
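Taken together, conditional expressions (1) through (10) define an acceptance window for each quantity tabulated in FIG. 23. A minimal sketch that checks all ten conditions against the values quoted above for the lens optical system 25 of FIG. 2 (the dictionary keys and function name are illustrative, not from the source):

```python
# Values for the lens optical system 25 of FIG. 2, as listed in FIG. 23.
fig2 = {
    "TTL/(D6+D8)": 5.73,
    "(R2+R1)/(R2-R1)": 1.90,
    "(R4+R3)/(R4-R3)": -8.02,
    "(R8+R7)/(R8-R7)": -2.82,
    "V1": 71.68,
    "V2": 19.32,
    "|f3/f|": 5.62,
    "f1/f": 0.92,
    "CRA": 39.30,
    "TTL/IH": 1.02,
}

# (lower, upper) bounds of conditional expressions (1)-(10);
# None means the expression is one-sided.
bounds = {
    "TTL/(D6+D8)": (5.0, 10.0),        # (1)
    "(R2+R1)/(R2-R1)": (0.5, 3.0),     # (2)
    "(R4+R3)/(R4-R3)": (-10.0, -3.0),  # (3)
    "(R8+R7)/(R8-R7)": (-10.0, -1.0),  # (4)
    "V1": (50.0, 90.0),                # (5)
    "V2": (15.0, 20.0),                # (6)
    "|f3/f|": (5.0, 100.0),            # (7)
    "f1/f": (0.0, 2.0),                # (8)
    "CRA": (None, 40.0),               # (9)
    "TTL/IH": (1.0, 1.3),              # (10)
}

def satisfies_all(values):
    """True if every value lies strictly inside its conditional-expression bounds."""
    ok = True
    for name, value in values.items():
        lo, hi = bounds[name]
        ok = ok and (lo is None or lo < value) and (hi is None or value < hi)
    return ok
```

Running `satisfies_all` on the columns of FIG. 23 for FIGS. 7, 11, 15, and 19 in the same way confirms that every configuration example stays inside all ten windows.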
As described above, the lens optical system 25 includes, in order from the object side to the image side, lenses 71 (171, 271, 371, 471) to 77 (177, 277, 377, 477). Lens 71 (171, 271, 371, 471) has positive refractive power near the optical axis and a meniscus shape convex toward the object side. Lens 72 (172, 272, 372, 472) has negative refractive power near the optical axis and a meniscus shape convex toward the object side. Lens 73 (173, 273, 373, 473) has refractive power near the optical axis. Lens 74 (174, 274, 374, 474) has positive refractive power near the optical axis and a meniscus shape convex toward the image side. Lens 75 (175, 275, 375, 475) has refractive power near the optical axis and a meniscus shape convex toward the image side. Lens 76 (176, 276, 376, 476) has refractive power near the optical axis and a meniscus shape convex toward the object side. Lens 77 (177, 277, 377, 477) has negative refractive power near the optical axis and a meniscus shape convex toward the object side. Among the surface spacings between the opposing surfaces of each pair of adjacent lenses among lenses 71 (171, 271, 371, 471) to 77 (177, 277, 377, 477), surface spacing D108 (D208, D308, D408, D508) is the longest.
This makes it possible to realize a lens optical system 25 with high optical performance that is compatible with a large, high-pixel-count solid-state imaging element 21. Furthermore, since the lens optical system 25 satisfies conditional expression (10), the lens optical system 25, and by extension the imaging device 10, can be made compact.
Note that the numerical values of the lens data and aspheric data of the lens optical system 25 are not limited to those described above.
<2. Examples of Application to Electronic Devices>
The imaging device 10 described above can be applied to various electronic devices, such as digital still cameras, digital video cameras, and mobile devices such as mobile phones and smartphones equipped with an imaging function.
FIG. 24 is a block diagram showing an example of the hardware configuration of a smartphone as an electronic device to which the present technology is applied.
In the smartphone 1000, a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are interconnected by a bus 1004.
An input/output interface 1005 is also connected to the bus 1004. An imaging unit 1006, an input unit 1007, an output unit 1008, and a communication unit 1009 are connected to the input/output interface 1005.
The imaging unit 1006 includes the imaging device 10 described above. The imaging unit 1006 captures a subject and acquires an image, which is stored in the RAM 1003 and/or displayed on the output unit 1008. The input unit 1007 includes a touchpad, which is a position input device forming part of a touch panel, a microphone, and the like. The output unit 1008 includes a liquid crystal panel forming part of the touch panel, a speaker, and the like. The communication unit 1009 includes a network interface and the like.
In the smartphone 1000 configured as described above as well, applying the imaging device 10 as the imaging unit 1006 enables the lens optical system 25 to achieve high optical performance compatible with a large, high-pixel-count solid-state imaging element 21. As a result, the smartphone 1000 can capture high-pixel-count, high-quality images.
<3. Examples of Use of the Imaging Device>
FIG. 25 is a diagram showing examples of the use of the imaging device 10 described above.
The imaging device 10 described above can be used, for example, in the following various cases of sensing light such as visible light, infrared light, ultraviolet light, and X-rays.
- Devices that capture images for viewing, such as digital cameras and mobile devices with camera functions
- Devices for traffic use, such as in-vehicle sensors that image the front, rear, surroundings, and interior of a vehicle for safe driving (e.g., automatic stopping) and driver-state recognition, surveillance cameras that monitor moving vehicles and roads, and ranging sensors that measure inter-vehicle distances
- Devices for home appliances such as TVs, refrigerators, and air conditioners, which capture a user's gestures and operate the appliance according to those gestures
- Devices for medical and healthcare use, such as endoscopes and devices that image blood vessels by receiving infrared light
- Devices for security use, such as surveillance cameras for crime prevention and cameras for person authentication
- Devices for beauty care use, such as skin measuring instruments that image the skin and microscopes that image the scalp
- Devices for sports use, such as action cameras and wearable cameras for sports applications
- Devices for agricultural use, such as cameras for monitoring the condition of fields and crops
<4. Example of Application to an Endoscopic Surgery System>
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
FIG. 26 is a diagram showing an example of the schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (the present technology) can be applied.
FIG. 26 shows an operator (surgeon) 11131 performing surgery on a patient 11132 on a patient bed 11133 using an endoscopic surgery system 11000. As illustrated, the endoscopic surgery system 11000 includes an endoscope 11100; other surgical tools 11110 such as an insufflation tube 11111 and an energy treatment tool 11112; a support arm device 11120 that supports the endoscope 11100; and a cart 11200 on which various devices for endoscopic surgery are mounted.
The endoscope 11100 includes a lens barrel 11101, a region of which extending a predetermined length from the tip is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the base end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is configured as a so-called rigid scope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible scope having a flexible lens barrel.
The tip of the lens barrel 11101 has an opening into which an objective lens is fitted. A light source device 11203 is connected to the endoscope 11100; light generated by the light source device 11203 is guided to the tip of the lens barrel by a light guide extending inside the lens barrel 11101 and is emitted through the objective lens toward the observation target in the body cavity of the patient 11132. Note that the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from the observation target is focused onto the image sensor by the optical system. The observation light is photoelectrically converted by the image sensor to generate an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observed image. The image signal is transmitted as RAW data to a camera control unit (CCU) 11201.
The CCU 11201 includes a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and centrally controls the operation of the endoscope 11100 and a display device 11202. The CCU 11201 also receives the image signal from the camera head 11102 and subjects it to various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing).
Under the control of the CCU 11201, the display device 11202 displays an image based on the image signal that has undergone image processing by the CCU 11201.
The light source device 11203 includes a light source such as an LED (Light Emitting Diode) and supplies the endoscope 11100 with irradiation light for imaging the surgical site and the like.
The input device 11204 is an input interface for the endoscopic surgery system 11000. Via the input device 11204, a user can input various kinds of information and instructions to the endoscopic surgery system 11000. For example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, etc.).
The treatment tool control device 11205 controls the driving of the energy treatment tool 11112 for cauterizing or incising tissue, sealing blood vessels, and the like. The insufflation device 11206 sends gas into the body cavity of the patient 11132 via the insufflation tube 11111 to inflate the body cavity in order to secure the field of view of the endoscope 11100 and a working space for the operator. The recorder 11207 is a device capable of recording various kinds of information related to the surgery. The printer 11208 is a device capable of printing various kinds of information related to the surgery in various formats such as text, images, and graphs.
Note that the light source device 11203, which supplies the endoscope 11100 with irradiation light for imaging the surgical site, can be configured as a white light source composed of, for example, an LED, a laser light source, or a combination of these. When the white light source is configured as a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high precision, so the white balance of the captured image can be adjusted in the light source device 11203. In this case, it is also possible to capture images corresponding to each of R, G, and B in a time-division manner by irradiating the observation target with laser light from each of the RGB laser light sources in a time-division manner and controlling the driving of the image sensor of the camera head 11102 in synchronization with the irradiation timing. With this method, a color image can be obtained without providing a color filter on the image sensor.
The light source device 11203 may be controlled to change the intensity of the light it outputs at predetermined time intervals. The image sensor of the camera head 11102 may be controlled to acquire images in a time-division manner in synchronization with the timing of the change in the light intensity, and the images may be synthesized to generate an image with a high dynamic range that is free of so-called blackout and whiteout.
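The synthesis described above can be illustrated with a minimal exposure-fusion sketch. It assumes frames captured under known relative light intensities and 8-bit pixel values; the weighting scheme and function name are illustrative choices, not details from the publication.

```python
import numpy as np

def merge_exposures(frames, exposures):
    """Merge time-division frames taken at different light intensities
    into one high-dynamic-range frame.

    frames: list of 2-D arrays with pixel values in [0, 255]
    exposures: relative light intensity used for each frame
    """
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    wsum = np.zeros_like(acc)
    for frame, exposure in zip(frames, exposures):
        f = frame.astype(np.float64)
        # Weight each pixel by its distance from blackout (0) and
        # whiteout (255); well-exposed mid-tones contribute the most.
        w = np.clip(1.0 - np.abs(f / 255.0 - 0.5) * 2.0, 1e-3, None)
        # Normalize the pixel value back to scene radiance by the
        # relative exposure before accumulating.
        acc += w * (f / exposure)
        wsum += w
    return acc / wsum
```

A pixel that reads 50 under half exposure and 200 under double exposure is recovered as a radiance of 100 from both samples, so the merged result carries detail from both the dark and the bright frame.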
The light source device 11203 may be configured to supply light of a predetermined wavelength band corresponding to special light observation. In special light observation, so-called narrow band imaging is performed: by exploiting the wavelength dependence of light absorption in body tissue, light of a narrower band than the illumination light used in normal observation (i.e., white light) is applied, and specific tissue such as blood vessels in the mucosal surface layer is imaged with high contrast. Alternatively, in special light observation, fluorescence observation may be performed in which an image is obtained from the fluorescence generated by irradiation with excitation light. In fluorescence observation, body tissue may be irradiated with excitation light and the fluorescence from that tissue observed (autofluorescence observation), or a reagent such as indocyanine green (ICG) may be locally injected into body tissue and the tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image. The light source device 11203 may be configured to supply narrow band light and/or excitation light corresponding to such special light observation.
FIG. 27 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG. 26.
The camera head 11102 has a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405. The CCU 11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera head 11102 and the CCU 11201 are connected to each other via a transmission cable 11400 so that they can communicate with each other.
The lens unit 11401 is an optical system provided at the connection with the lens barrel 11101. Observation light taken in from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401. The lens unit 11401 is composed of a combination of multiple lenses including a zoom lens and a focus lens.
The imaging unit 11402 is composed of an imaging element. The imaging unit 11402 may have one imaging element (a so-called single-plate type) or multiple imaging elements (a so-called multi-plate type). When the imaging unit 11402 is of the multi-plate type, for example, each imaging element may generate an image signal corresponding to one of R, G, and B, and a color image may be obtained by combining them. Alternatively, the imaging unit 11402 may be configured to have a pair of imaging elements for acquiring image signals for the right eye and the left eye for 3D (three-dimensional) display. 3D display enables the surgeon 11131 to grasp the depth of the biological tissue at the surgical site more accurately. Note that when the imaging unit 11402 is of the multi-plate type, multiple systems of lens units 11401 may be provided, one for each imaging element.
Furthermore, the imaging unit 11402 does not necessarily have to be provided in the camera head 11102. For example, the imaging unit 11402 may be provided inside the lens barrel 11101, immediately after the objective lens.
The driving unit 11403 is composed of an actuator, and moves the zoom lens and focus lens of the lens unit 11401 a predetermined distance along the optical axis under the control of the camera head control unit 11405. This allows the magnification and focus of the image captured by the imaging unit 11402 to be adjusted appropriately.
The communication unit 11404 is configured with a communication device for transmitting and receiving various information to and from the CCU 11201. The communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
The communication unit 11404 also receives control signals for controlling the operation of the camera head 11102 from the CCU 11201, and supplies them to the camera head control unit 11405. The control signals include information on the imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value during imaging, and/or information specifying the magnification and focus of the captured image.
The above-mentioned imaging conditions, such as frame rate, exposure value, magnification, and focus, may be specified by the user as appropriate, or may be set automatically by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, the endoscope 11100 is equipped with so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions.
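The automatic-setting idea can be sketched from per-frame statistics. The gray-world white balance and the target mean luminance below are common textbook choices, not details from this publication, and the function name is illustrative.

```python
def auto_adjust(mean_r, mean_g, mean_b, target_luma=118.0):
    """Return (exposure_scale, r_gain, b_gain) from frame statistics.

    Auto exposure: scale the exposure so the mean luminance of the
    next frame lands on the target value.
    Gray-world white balance: scale R and B so their channel means
    match the G channel mean.
    """
    # Rec. 601 luma weights applied to the channel means.
    luma = 0.299 * mean_r + 0.587 * mean_g + 0.114 * mean_b
    exposure_scale = target_luma / max(luma, 1e-6)
    r_gain = mean_g / max(mean_r, 1e-6)
    b_gain = mean_g / max(mean_b, 1e-6)
    return exposure_scale, r_gain, b_gain
```

For a neutral gray frame the gains stay at 1.0 and only the exposure is scaled toward the target luminance; a color cast shows up as R/B gains away from 1.0.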
The camera head control unit 11405 controls the operation of the camera head 11102 based on a control signal from the CCU 11201 received via the communication unit 11404.
The communication unit 11411 is configured with a communication device for transmitting and receiving various information to and from the camera head 11102. The communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
The communication unit 11411 also transmits to the camera head 11102 a control signal for controlling the operation of the camera head 11102. The image signal and the control signal can be transmitted by electrical communication, optical communication, etc.
The image processing unit 11412 performs various image processing operations on the image signal, which is the RAW data transmitted from the camera head 11102.
The control unit 11413 performs various controls related to the imaging of the surgical site, etc. by the endoscope 11100, and the display of the captured images obtained by imaging the surgical site, etc. For example, the control unit 11413 generates a control signal for controlling the driving of the camera head 11102.
The control unit 11413 also causes the display device 11202 to display the captured image showing the surgical site, etc., based on the image signal that has been image-processed by the image processing unit 11412. At this time, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist generated when the energy treatment tool 11112 is used, etc., by detecting the shape and color of the edges of objects included in the captured image. When the control unit 11413 causes the display device 11202 to display the captured image, it may use the recognition result to superimpose various types of surgical support information on the image of the surgical site. By superimposing the surgical support information and presenting it to the surgeon 11131, the burden on the surgeon 11131 can be reduced and the surgeon 11131 can proceed with the surgery reliably.
The transmission cable 11400 that connects the camera head 11102 and the CCU 11201 is an electrical signal cable that supports electrical signal communication, an optical fiber that supports optical communication, or a composite cable of these.
In the illustrated example, communication is performed wired using a transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may also be performed wirelessly.
The above describes an example of an endoscopic surgery system to which the technology according to the present disclosure can be applied. Among the configurations described above, the technology according to the present disclosure can be applied to the lens unit 11401, the imaging unit 11402, and the like. Specifically, the imaging device 10 described above can be applied to the lens unit 11401, the imaging unit 11402, and the drive unit 11403. By applying the technology according to the present disclosure to the lens unit 11401 and the imaging unit 11402, the lens optical system 25 can achieve the high optical performance required by the large, high-pixel-count solid-state imaging element 21. As a result, a high-resolution, high-quality image of the surgical site enables the surgeon, for example, to check the surgical site reliably.
Note that although an endoscopic surgery system has been described here as an example, the technology disclosed herein may also be applied to other systems, such as a microsurgery system.
<5. Examples of applications to moving objects>
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of moving body such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
FIG. 28 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile object control system to which the technology disclosed herein can be applied.
The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example shown in FIG. 28, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, an outside vehicle information detection unit 12030, an inside vehicle information detection unit 12040, and an integrated control unit 12050. Also shown as functional components of the integrated control unit 12050 are a microcomputer 12051, an audio/video output unit 12052, and an in-vehicle network I/F (interface) 12053.
The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device for a drive force generating device for generating the drive force of the vehicle, such as an internal combustion engine or a drive motor, a drive force transmission mechanism for transmitting the drive force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating a braking force for the vehicle.
The body system control unit 12020 controls the operation of various devices installed in the vehicle body according to various programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, tail lamps, brake lamps, turn signals, and fog lamps. In this case, radio waves or signals from various switches transmitted from a portable device that replaces a key can be input to the body system control unit 12020. The body system control unit 12020 accepts the input of these radio waves or signals and controls the vehicle's door lock device, power window device, lamps, etc.
The outside-vehicle information detection unit 12030 detects information outside the vehicle equipped with the vehicle control system 12000. For example, the image capturing unit 12031 is connected to the outside-vehicle information detection unit 12030. The outside-vehicle information detection unit 12030 causes the image capturing unit 12031 to capture images outside the vehicle and receives the captured images. The outside-vehicle information detection unit 12030 may perform object detection processing or distance detection processing for people, cars, obstacles, signs, or characters on the road surface based on the received images.
The imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal according to the amount of light received. The imaging unit 12031 can output the electrical signal as an image, or as distance measurement information. The light received by the imaging unit 12031 may be visible light, or may be invisible light such as infrared light.
The in-vehicle information detection unit 12040 detects information inside the vehicle. To the in-vehicle information detection unit 12040, for example, a driver state detection unit 12041 that detects the state of the driver is connected. The driver state detection unit 12041 includes, for example, a camera that captures an image of the driver, and the in-vehicle information detection unit 12040 may calculate the driver's degree of fatigue or concentration based on the detection information input from the driver state detection unit 12041, or may determine whether the driver is dozing off.
The microcomputer 12051 can calculate the control target values of the driving force generating device, steering mechanism, or braking device based on the information inside and outside the vehicle acquired by the outside vehicle information detection unit 12030 or the inside vehicle information detection unit 12040, and output a control command to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, following driving based on the distance between vehicles, maintaining vehicle speed, vehicle collision warning, or vehicle lane departure warning.
The microcomputer 12051 can also control the driving force generating device, steering mechanism, braking device, etc. based on information about the surroundings of the vehicle acquired by the outside vehicle information detection unit 12030 or the inside vehicle information detection unit 12040, thereby performing cooperative control aimed at automatic driving, which allows the vehicle to travel autonomously without relying on the driver's operation.
The microcomputer 12051 can also output control commands to the body system control unit 12020 based on information outside the vehicle acquired by the outside-vehicle information detection unit 12030. For example, the microcomputer 12051 can control the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the outside-vehicle information detection unit 12030, and perform cooperative control aimed at preventing glare, such as switching high beams to low beams.
The audio/image output unit 12052 transmits at least one output signal of audio and image to an output device capable of visually or audibly notifying the occupants of the vehicle or the outside of the vehicle of information. In the example of FIG. 28, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are exemplified as output devices. The display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
FIG. 29 shows an example of the installation position of the imaging unit 12031.
In FIG. 29, the vehicle 12100 has imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
The imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at the front nose, side mirrors, rear bumper, back door, and the top of the windshield inside the vehicle cabin of the vehicle 12100. The imaging unit 12101 provided at the front nose and the imaging unit 12105 provided at the top of the windshield inside the vehicle cabin mainly acquire images of the front of the vehicle 12100. The imaging units 12102 and 12103 provided at the side mirrors mainly acquire images of the sides of the vehicle 12100. The imaging unit 12104 provided at the rear bumper or back door mainly acquires images of the rear of the vehicle 12100. The images of the front acquired by the imaging units 12101 and 12105 are mainly used to detect preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, etc.
Note that FIG. 29 shows an example of the imaging ranges of the imaging units 12101 to 12104. Imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or back door. For example, an overhead image of the vehicle 12100 viewed from above is obtained by superimposing the image data captured by the imaging units 12101 to 12104.
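A common way to build such an overhead view is to warp each camera image onto the ground plane and composite the results. The following nearest-neighbour inverse-mapping sketch assumes a known inverse homography `h_inv` from output (ground-plane) coordinates back to source-image coordinates; the calibration that produces `h_inv` is outside the scope of this sketch and not described in the publication.

```python
import numpy as np

def warp_to_ground(image, h_inv, out_shape):
    """Project a camera image onto a ground-plane grid by inverse
    mapping through the homography h_inv (nearest-neighbour sampling).
    Pixels that map outside the source image are left at zero."""
    h, w = out_shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Homogeneous coordinates of every output pixel, one per column.
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)]).astype(float)
    src = h_inv @ pts
    src = src / src[2]                     # perspective divide
    sx = np.rint(src[0]).astype(int)
    sy = np.rint(src[1]).astype(int)
    inside = (sx >= 0) & (sx < image.shape[1]) & \
             (sy >= 0) & (sy < image.shape[0])
    out = np.zeros(out_shape, dtype=image.dtype)
    out.ravel()[inside] = image[sy[inside], sx[inside]]
    return out
```

One such warp per camera (front, sides, rear), each with its own homography, followed by blending of the overlapping regions, yields the composite bird's-eye image.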
At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera consisting of multiple imaging elements, or an imaging element having pixels for detecting phase differences.
For example, the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the change in this distance over time (relative speed with respect to the vehicle 12100) based on the distance information obtained from the imaging units 12101 to 12104, and can thereby extract as a preceding vehicle the closest three-dimensional object that is on the path of the vehicle 12100 and is traveling in approximately the same direction as the vehicle 12100 at a predetermined speed (e.g., 0 km/h or faster). Furthermore, the microcomputer 12051 can set in advance the inter-vehicle distance to be maintained from the preceding vehicle, and perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like. In this way, cooperative control can be performed for the purpose of, for example, automated driving in which the vehicle travels autonomously without depending on the driver's operation.
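The preceding-vehicle extraction described above can be sketched from two distance samples per tracked object. The data layout, field names, and thresholds below are hypothetical placeholders, not details from the publication.

```python
def select_preceding_vehicle(objects, dt, own_speed_kmh, min_speed_kmh=0.0):
    """Pick the closest on-path object traveling in roughly the same
    direction at or above min_speed_kmh.

    objects: list of dicts with 'on_path' (bool), and 'dist_prev' /
    'dist_now' in metres, measured dt seconds apart.
    """
    best = None
    for obj in objects:
        if not obj['on_path']:
            continue
        # Positive closing speed means the gap to the object shrinks.
        closing_mps = (obj['dist_prev'] - obj['dist_now']) / dt
        obj_speed_kmh = own_speed_kmh - closing_mps * 3.6
        if obj_speed_kmh < min_speed_kmh:
            continue  # oncoming or effectively stationary relative filter
        if best is None or obj['dist_now'] < best['dist_now']:
            best = obj
    return best
```

The selected object's current distance can then be compared against the preset following distance to decide between follow-up braking and follow-up acceleration.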
For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 classifies three-dimensional object data into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extracts the data, and uses it to automatically avoid obstacles. For example, the microcomputer 12051 classifies obstacles around the vehicle 12100 as obstacles that the driver of the vehicle 12100 can see and obstacles that are difficult to see. The microcomputer 12051 then determines the collision risk, which indicates the degree of danger of collision with each obstacle, and when the collision risk is equal to or greater than a set value and there is a possibility of a collision, it can provide driving assistance for collision avoidance by outputting an alarm to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or evasive steering via the drive system control unit 12010.
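A minimal sketch of the collision-risk decision, using time-to-collision (TTC) as the risk measure: the publication only speaks of a risk value compared against a "set value", so the TTC formulation and the thresholds here are illustrative assumptions.

```python
def assess_collision_risk(distance_m, closing_speed_mps,
                          warn_ttc_s=2.6, brake_ttc_s=1.3):
    """Map time-to-collision to an action: 'none', 'warn' (alert via
    speaker/display), or 'brake' (forced deceleration)."""
    if closing_speed_mps <= 0.0:
        return 'none'              # the gap is not shrinking
    ttc = distance_m / closing_speed_mps
    if ttc < brake_ttc_s:
        return 'brake'
    if ttc < warn_ttc_s:
        return 'warn'
    return 'none'
```

A 'warn' result would drive the audio speaker 12061 or display unit 12062, while 'brake' would go to the drive system control unit 12010.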
At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays. For example, the microcomputer 12051 can recognize a pedestrian by determining whether or not a pedestrian is present in the captured image of the imaging units 12101 to 12104. The recognition of such a pedestrian is performed, for example, by a procedure of extracting feature points in the captured image of the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points that indicate the contour of an object to determine whether or not it is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the captured image of the imaging units 12101 to 12104 and recognizes a pedestrian, the audio/image output unit 12052 controls the display unit 12062 to superimpose a rectangular contour line for emphasis on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 to display an icon or the like indicating a pedestrian at a desired position.
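The pattern-matching step can be illustrated with a brute-force sum-of-squared-differences search over candidate windows. Production systems use far more robust matchers (the publication does not specify one); the function below is only a sketch of the sliding-window idea.

```python
import numpy as np

def find_template(image, template):
    """Slide the template over the image and return the (row, col) of
    the window with the smallest sum of squared differences (SSD)."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = None, None
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            window = image[r:r + th, c:c + tw]
            ssd = float(np.sum((window - template) ** 2))
            if best is None or ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos
```

The returned position would anchor the rectangular contour line that the display unit 12062 superimposes on the recognized pedestrian.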
Above, an example of a vehicle control system to which the technology according to the present disclosure can be applied has been described. Among the configurations described above, the technology according to the present disclosure can be applied to the imaging unit 12031 and the like. Specifically, the imaging device 10 described above can be applied to the imaging unit 12031. By applying the technology according to the present disclosure to the imaging unit 12031, the lens optical system 25 can achieve the high optical performance required by the large, high-pixel-count solid-state imaging element 21. As a result, high-resolution, high-quality captured images make it possible, for example, to reduce driver fatigue.
The embodiments of the present technology are not limited to the above-mentioned embodiments, and various modifications are possible within the scope of the gist of the present technology. For example, the number and positions of inflection points of each surface, lens data of each lens, aspheric data of each surface, and tangent angles of the peripheral parts of the effective diameter of the lens are not limited to the above-mentioned examples. It is also possible to adopt a form that combines all or part of the above-mentioned embodiments.
The effects described in this specification are merely examples and are not limiting, and there may be effects other than those described in this specification.
The present technology can take the following configurations.
(1)
A lens optical system comprising, in order from the object side to the image side:
a first lens having a positive refractive power near the optical axis and a meniscus shape convex toward the object side;
a second lens having a negative refractive power near the optical axis and a meniscus shape convex toward the object side;
a third lens having a refractive power near the optical axis;
a fourth lens having a positive refractive power near the optical axis and a meniscus shape convex toward the image side;
a fifth lens having a refractive power near the optical axis and a meniscus shape convex toward the image side;
a sixth lens having a refractive power near the optical axis and a meniscus shape convex toward the object side; and
a seventh lens having a negative refractive power near the optical axis and a meniscus shape convex toward the object side,
wherein, among the air gaps on the optical axis between the opposing surfaces of each pair of two adjacent lenses among the first to seventh lenses, the gap of the pair consisting of the fourth lens and the fifth lens is the longest.
(2)
The lens optical system according to (1), configured to satisfy the condition
5.0<TTL/(D6+D8)<10.0,
where TTL is the total length of the lens optical system, D6 is the gap of the pair consisting of the third lens and the fourth lens, and D8 is the gap of the pair consisting of the fourth lens and the fifth lens.
(3)
The lens optical system according to (1) or (2), configured to satisfy the condition
0.5<(R2+R1)/(R2-R1)<3.0,
where R1 is the radius of curvature of the object-side surface of the first lens and R2 is the radius of curvature of the image-side surface of the first lens.
(4)
The lens optical system according to any one of (1) to (3), configured to satisfy the condition
-10.0<(R4+R3)/(R4-R3)<-3.0,
where R3 is the radius of curvature of the object-side surface of the second lens and R4 is the radius of curvature of the image-side surface of the second lens.
(5)
The lens optical system according to any one of (1) to (4), configured to satisfy the condition
-10.0<(R8+R7)/(R8-R7)<-1.0,
where R7 is the radius of curvature of the object-side surface of the fourth lens and R8 is the radius of curvature of the image-side surface of the fourth lens.
(6)
The lens optical system according to any one of (1) to (5), configured to satisfy the condition
50.0<V1<90.0,
where V1 is the Abbe number of the first lens at the d-line.
(7)
The lens optical system according to any one of (1) to (6), configured to satisfy the condition
15.0<V2<20.0,
where V2 is the Abbe number of the second lens at the d-line.
(8)
The lens optical system according to any one of (1) to (7), configured to satisfy the condition
5<|f3/f|<100.0,
where f3 is the focal length of the third lens and f is the focal length of the lens optical system.
(9)
The lens optical system according to any one of (1) to (8), configured to satisfy the condition
0<f1/f<2.0,
where f1 is the focal length of the first lens and f is the focal length of the lens optical system.
(10)
The lens optical system according to any one of (1) to (9), configured to satisfy the condition
CRA<40.0,
where CRA is the maximum angle formed between the imaging surface and the chief ray of light incident on the imaging surface from the lens optical system.
(11)
The lens optical system according to any one of (1) to (10), configured to satisfy the condition
1.0<TTL/IH<1.3,
where TTL is the total length of the lens optical system and IH is the maximum image height of the lens optical system.
(12)
An imaging device comprising:
a lens optical system including, in order from the object side to the image side:
a first lens having a positive refractive power near the optical axis and a meniscus shape convex toward the object side;
a second lens having a negative refractive power near the optical axis and a meniscus shape convex toward the object side;
a third lens having a refractive power near the optical axis;
a fourth lens having a positive refractive power near the optical axis and a meniscus shape convex toward the image side;
a fifth lens having a refractive power near the optical axis and a meniscus shape convex toward the image side;
a sixth lens having a refractive power near the optical axis and a meniscus shape convex toward the object side; and
a seventh lens having a negative refractive power near the optical axis and a meniscus shape convex toward the object side,
the lens optical system being configured such that, among the air gaps on the optical axis between the opposing surfaces of each pair of two adjacent lenses among the first to seventh lenses, the gap of the pair consisting of the fourth lens and the fifth lens is the longest; and
an image sensor that converts an optical image formed by the lens optical system into an electrical signal.
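The conditional expressions in (2) through (11) above can be evaluated mechanically against any candidate lens prescription. The sketch below does exactly that; the example prescription values are hypothetical, chosen only so that every bound is met, and are not the patent's embodiment data.

```python
def check_conditions(p):
    """Evaluate conditional expressions (2)-(11) for a lens prescription p,
    a dict keyed by the symbols defined above (TTL, D6, D8, R1..R8, V1, V2,
    f, f1, f3, CRA, IH). Returns a dict mapping each condition to a bool."""
    return {
        "(2) 5.0<TTL/(D6+D8)<10.0": 5.0 < p["TTL"] / (p["D6"] + p["D8"]) < 10.0,
        "(3) 0.5<(R2+R1)/(R2-R1)<3.0": 0.5 < (p["R2"] + p["R1"]) / (p["R2"] - p["R1"]) < 3.0,
        "(4) -10.0<(R4+R3)/(R4-R3)<-3.0": -10.0 < (p["R4"] + p["R3"]) / (p["R4"] - p["R3"]) < -3.0,
        "(5) -10.0<(R8+R7)/(R8-R7)<-1.0": -10.0 < (p["R8"] + p["R7"]) / (p["R8"] - p["R7"]) < -1.0,
        "(6) 50.0<V1<90.0": 50.0 < p["V1"] < 90.0,
        "(7) 15.0<V2<20.0": 15.0 < p["V2"] < 20.0,
        "(8) 5<|f3/f|<100.0": 5 < abs(p["f3"] / p["f"]) < 100.0,
        "(9) 0<f1/f<2.0": 0 < p["f1"] / p["f"] < 2.0,
        "(10) CRA<40.0": p["CRA"] < 40.0,
        "(11) 1.0<TTL/IH<1.3": 1.0 < p["TTL"] / p["IH"] < 1.3,
    }

# Hypothetical prescription (illustration only): a positive first lens
# (R1<R2, both positive), a negative object-convex second meniscus (R3>R4>0),
# and a positive image-convex fourth meniscus (R8>R7, both negative).
example = dict(TTL=8.0, D6=0.40, D8=0.80, R1=3.0, R2=9.0,
               R3=5.0, R4=3.0, R7=-5.0, R8=-3.0,
               V1=56.0, V2=19.0, f3=60.0, f=7.0, f1=6.0,
               CRA=35.0, IH=7.0)
assert all(check_conditions(example).values())
```

Returning per-condition booleans rather than a single flag makes it easy to see which bound a trial design violates, which mirrors how such conditional expressions are typically used when scaling or perturbing an embodiment.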
10 imaging device, 21 solid-state imaging element, 25 lens optical system, 31a imaging surface, 71 to 77 lenses, 71a, 71b, 72a, 72b, 73a, 73b, 74a, 74b, 75a, 75b, 76a, 76b, 77a surfaces, 171 to 177 lenses, 171a, 171b, 172a, 172b, 173a, 173b, 174a, 174b, 175a, 175b, 176a, 176b, 177a surfaces, 271 to 277 lenses, 271a, 271b, 272a, 272b, 273a, 273b, 274a, 274b, 275a, 275b, 276a, 276b, 277a surfaces, 371 to 377 lenses, 371a, 371b, 372a, 372b, 373a, 373b, 374a, 374b, 375a, 375b, 376a, 376b, 377a surfaces, 471 to 477 lenses, 471a, 471b, 472a, 472b, 473a, 473b, 474a, 474b, 475a, 475b, 476a, 476b, 477a surfaces
Claims (12)
- A lens optical system comprising, in order from the object side to the image side:
a first lens having a positive refractive power near the optical axis and a meniscus shape convex toward the object side;
a second lens having a negative refractive power near the optical axis and a meniscus shape convex toward the object side;
a third lens having a refractive power near the optical axis;
a fourth lens having a positive refractive power near the optical axis and a meniscus shape convex toward the image side;
a fifth lens having a refractive power near the optical axis and a meniscus shape convex toward the image side;
a sixth lens having a refractive power near the optical axis and a meniscus shape convex toward the object side; and
a seventh lens having a negative refractive power near the optical axis and a meniscus shape convex toward the object side,
wherein, among the air gaps on the optical axis between the opposing surfaces of each pair of two adjacent lenses among the first to seventh lenses, the gap of the pair consisting of the fourth lens and the fifth lens is the longest.
- The lens optical system according to claim 1, configured to satisfy the condition
5.0<TTL/(D6+D8)<10.0,
where TTL is the total length of the lens optical system, D6 is the gap of the pair consisting of the third lens and the fourth lens, and D8 is the gap of the pair consisting of the fourth lens and the fifth lens.
- The lens optical system according to claim 1, configured to satisfy the condition
0.5<(R2+R1)/(R2-R1)<3.0,
where R1 is the radius of curvature of the object-side surface of the first lens and R2 is the radius of curvature of the image-side surface of the first lens.
- The lens optical system according to claim 1, configured to satisfy the condition
-10.0<(R4+R3)/(R4-R3)<-3.0,
where R3 is the radius of curvature of the object-side surface of the second lens and R4 is the radius of curvature of the image-side surface of the second lens.
- The lens optical system according to claim 1, configured to satisfy the condition
-10.0<(R8+R7)/(R8-R7)<-1.0,
where R7 is the radius of curvature of the object-side surface of the fourth lens and R8 is the radius of curvature of the image-side surface of the fourth lens.
- The lens optical system according to claim 1, configured to satisfy the condition
50.0<V1<90.0,
where V1 is the Abbe number of the first lens at the d-line.
- The lens optical system according to claim 1, configured to satisfy the condition
15.0<V2<20.0,
where V2 is the Abbe number of the second lens at the d-line.
- The lens optical system according to claim 1, configured to satisfy the condition
5<|f3/f|<100.0,
where f3 is the focal length of the third lens and f is the focal length of the lens optical system.
- The lens optical system according to claim 1, configured to satisfy the condition
0<f1/f<2.0,
where f1 is the focal length of the first lens and f is the focal length of the lens optical system.
- The lens optical system according to claim 1, configured to satisfy the condition
CRA<40.0,
where CRA is the maximum angle formed between the imaging surface and the chief ray of light incident on the imaging surface from the lens optical system.
- The lens optical system according to claim 1, configured to satisfy the condition
1.0<TTL/IH<1.3,
where TTL is the total length of the lens optical system and IH is the maximum image height of the lens optical system.
- An imaging device comprising:
a lens optical system including, in order from the object side to the image side:
a first lens having a positive refractive power near the optical axis and a meniscus shape convex toward the object side;
a second lens having a negative refractive power near the optical axis and a meniscus shape convex toward the object side;
a third lens having a refractive power near the optical axis;
a fourth lens having a positive refractive power near the optical axis and a meniscus shape convex toward the image side;
a fifth lens having a refractive power near the optical axis and a meniscus shape convex toward the image side;
a sixth lens having a refractive power near the optical axis and a meniscus shape convex toward the object side; and
a seventh lens having a negative refractive power near the optical axis and a meniscus shape convex toward the object side,
the lens optical system being configured such that, among the air gaps on the optical axis between the opposing surfaces of each pair of two adjacent lenses among the first to seventh lenses, the gap of the pair consisting of the fourth lens and the fifth lens is the longest; and
an image sensor that converts an optical image formed by the lens optical system into an electrical signal.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022207205 | 2022-12-23 | ||
JP2022-207205 | 2022-12-23 | |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024135306A1 true WO2024135306A1 (en) | 2024-06-27 |
Family
ID=91588303
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2023/043217 WO2024135306A1 (en) | 2022-12-23 | 2023-12-04 | Lens optical system and imaging device |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2024135306A1 (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015161871A (en) * | 2014-02-28 | 2015-09-07 | 株式会社オプトロジック | imaging lens |
JP2019082663A (en) * | 2017-10-30 | 2019-05-30 | エーエーシー テクノロジーズ ピーティーイー リミテッドAac Technologies Pte.Ltd. | Imaging optical lens |
WO2022116145A1 (en) * | 2020-12-04 | 2022-06-09 | 欧菲光集团股份有限公司 | Optical system, image capturing device, and electronic device |
US20220206271A1 (en) * | 2020-12-29 | 2022-06-30 | Aac Optics (Changzhou) Co., Ltd. | Camera Optical Lens |
WO2022157730A1 (en) * | 2021-01-25 | 2022-07-28 | Corephotonics Ltd. | Slim pop-out wide camera lenses |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7364022B2 (en) | Imaging lens and imaging device | |
JP7449317B2 (en) | Imaging device | |
US20230299102A1 (en) | Imaging element, fabrication method, and electronic equipment | |
JP2018200423A (en) | Imaging device and electronic apparatus | |
JP7552586B2 (en) | Imaging device | |
WO2020090368A1 (en) | Imaging lens and imaging device | |
WO2018139280A1 (en) | Camera module, method for manufacturing same, and electronic device | |
JP7237595B2 (en) | IMAGE SENSOR, MANUFACTURING METHOD, AND ELECTRONIC DEVICE | |
US20230013088A1 (en) | Imaging device and method of manufacturing imaging device | |
WO2022064853A1 (en) | Solid-state imaging device and electronic apparatus | |
WO2019207978A1 (en) | Image capture element and method of manufacturing image capture element | |
JP2018110302A (en) | Imaging device, manufacturing method thereof, and electronic device | |
US11553118B2 (en) | Imaging apparatus, manufacturing method therefor, and electronic apparatus | |
WO2021117497A1 (en) | Imaging lens and imaging device | |
US12040338B2 (en) | Imaging apparatus | |
WO2024135306A1 (en) | Lens optical system and imaging device | |
WO2024181136A1 (en) | Lens optical system and imaging device | |
WO2022091576A1 (en) | Solid-state imaging device and electronic apparatus | |
WO2024116818A1 (en) | Lens optical system and imaging device | |
WO2019039277A1 (en) | Imaging device, camera module and electronic device | |
WO2024070611A1 (en) | Lens optical system and imaging device | |
WO2018131264A1 (en) | Imaging unit and electronic device | |
US20240329371A1 (en) | Imaging lens and imaging device | |
WO2022190616A1 (en) | Semiconductor chip, method for manufacturing same, and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23906672 Country of ref document: EP Kind code of ref document: A1 |