WO2018131264A1 - Imaging unit and electronic device - Google Patents

Imaging unit and electronic device

Info

Publication number
WO2018131264A1
WO2018131264A1 (PCT/JP2017/039197)
Authority
WO
WIPO (PCT)
Prior art keywords
optical system
lens
imaging
angle
image sensor
Application number
PCT/JP2017/039197
Other languages
French (fr)
Japanese (ja)
Inventor
典宏 田部
洋司 崎岡
茂幸 馬場
豪 浅山
Original Assignee
Sony Semiconductor Solutions Corporation
Application filed by Sony Semiconductor Solutions Corporation
Publication of WO2018131264A1


Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 13/00 Optical objectives specially designed for the purposes specified below
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01L SEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L 27/00 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L 27/14 Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components sensitive to infrared radiation, light, electromagnetic radiation of shorter wavelength or corpuscular radiation and specially adapted either for the conversion of the energy of such radiation into electrical energy or for the control of electrical energy by such radiation
    • H01L 27/144 Devices controlled by radiation
    • H01L 27/146 Imager structures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N 25/70 SSIS architectures; Circuits associated therewith

Definitions

  • This disclosure relates to an imaging unit and an electronic device.
  • A WL-CSP (Wafer Level Chip Size Package) is manufactured through a process in which a lens optical system (a cover glass or the like) is bonded to a wafer on which image sensors are formed, and the wafer provided with the lens optical system is then separated into individual pieces. In a WL-CSP, the cover glass is fixed to the image sensor, with an air layer between the cover glass and the image sensor.
  • Patent Document 1 discloses a technique in which, in a WL-CSP, a film and a transparent resin are filled between the imaging element and the cover glass.
  • In such an imaging unit, it is conceivable to correct the aberration by changing the shape, material, and the like of the imaging optical system (for example, an imaging lens) provided closer to the subject than the cover glass; however, as the cover glass becomes thicker, correcting the aberration in this way becomes difficult.
  • The present disclosure has been made in view of the above, and provides an imaging unit and an electronic device that can further reduce the influence of aberration caused by the WL-CSP structure.
  • According to the present disclosure, there is provided an imaging unit including: an imaging element on which an image of a subject is formed; a lens optical system that is provided on the subject side of the imaging element and is fixed to the imaging element; and an imaging optical system that is provided between the subject and the lens optical system and has a numerical aperture NA. Where λ is the wavelength of incident light incident on the imaging element, θ [deg] is the angle of view, and L(θ) is the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following formula (1).
  • According to the present disclosure, there is also provided an imaging unit including: an imaging element on which an image of a subject is formed; a lens optical system that is provided on the subject side of the imaging element and is fixed to the imaging element; and an imaging optical system provided between the subject and the lens optical system. Where L(θ)max is the maximum value of the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (2).
  • According to the present disclosure, there is also provided an imaging unit including: an imaging element on which an image of a subject is formed; a lens optical system that is provided on the subject side of the imaging element and is fixed to the imaging element; and an imaging optical system provided between the subject and the lens optical system. Where f(θ) is the optical path difference from an ideal lens for each angle of view θ of the imaging optical system, the surface shape of the lens optical system located on the subject side satisfies the following expression (3).
  • According to the present disclosure, there is also provided an imaging unit including: an imaging element on which an image of a subject is formed; a lens optical system that is provided on the subject side of the imaging element and is fixed to the imaging element; and an imaging optical system provided between the subject and the lens optical system. Where L(θ) is the optical path difference from an ideal lens for each angle of view θ and f(θ) is the optical path difference from an ideal lens for each angle of view θ of the imaging optical system, the surface shape of the lens optical system located on the subject side satisfies the following expression (4).
  • According to the present disclosure, there is also provided an electronic device including an imaging unit, the imaging unit including: an imaging element on which an image of a subject is formed; a lens optical system that is provided on the subject side of the imaging element and is fixed to the imaging element; and an imaging optical system that is provided between the subject and the lens optical system and has a numerical aperture NA. Where λ is the wavelength of incident light incident on the imaging element, θ [deg] is the angle of view, and L(θ) is the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (5).
  • According to the present disclosure, there is also provided an electronic device including an imaging unit, the imaging unit including: an imaging element on which an image of a subject is formed; a lens optical system that is provided on the subject side of the imaging element and is fixed to the imaging element; and an imaging optical system provided between the subject and the lens optical system. Where L(θ) is the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (6).
  • According to the present disclosure, there is also provided an electronic device including an imaging unit, the imaging unit including: an imaging element on which an image of a subject is formed; a lens optical system that is provided on the subject side of the imaging element and is fixed to the imaging element; and an imaging optical system provided between the subject and the lens optical system. Where f(θ) is the optical path difference from an ideal lens for each angle of view θ of the imaging optical system, the surface shape of the lens optical system located on the subject side satisfies the following expression (7).
  • According to the present disclosure, there is also provided an electronic device including an imaging unit, the imaging unit including: an imaging element on which an image of a subject is formed; a lens optical system that is provided on the subject side of the imaging element and is fixed to the imaging element; and an imaging optical system provided between the subject and the lens optical system. Where L(θ) is the optical path difference from an ideal lens for each angle of view θ and f(θ) is the optical path difference from an ideal lens for each angle of view θ of the imaging optical system, the surface shape of the lens optical system located on the subject side satisfies the following formula (8).
  • According to the present disclosure, since the surface shape of the lens optical system located on the subject side has the predetermined shape described above, the aberration caused by the WL-CSP structure is corrected.
  • FIG. 1 is a diagram illustrating a configuration of an imaging unit 10 according to the present disclosure.
  • the imaging unit 10 includes at least a cover glass 11 that is an example of a lens optical system, an imaging element 12, and an imaging lens 13 that is an example of an imaging optical system.
  • The imaging unit 10 forms an image of the incident light on the imaging surface with the imaging lens 13 and the cover glass 11, and thereby causes the imaging element 12 to generate captured image data representing the luminance distribution of the incident light.
  • the imaging unit 10 according to the present disclosure can be used for electronic devices such as smartphones, digital cameras, personal computers, and tablet computers. However, these application examples are merely examples, and the imaging unit 10 according to the present disclosure can be used in various applications or apparatuses.
  • The cover glass 11 and the image sensor 12 in the present disclosure have a WL-CSP structure. That is, they form a package manufactured by fixing a light-transmitting substrate (the substrate on which the cover glass 11 is based) onto a semiconductor substrate in a wafer state on which a plurality of image sensors 12 are formed, and then separating the result into individual pieces. Because WL-CSP performs packaging at the wafer level, the package can be reduced in size and thickness, and it is excellent in terms of manufacturing cost.
  • The cover glass 11 is obtained by separating the light-transmitting substrate into individual pieces, and transmits incident light. The cover glass 11 is positioned closer to the subject than the image sensor 12; it transmits the incident light and forms an image of the incident light on the imaging surface positioned between the cover glass 11 and the image sensor 12.
  • the cover glass 11 is formed of a material that can transmit light in a wavelength band to which light to be imaged on the image sensor 12 belongs (a material that can be regarded as transparent in the wavelength band of interest).
  • the cover glass 11 is fixed to the image sensor 12, and the space between the cover glass 11 and the image sensor 12 is not hollow.
  • The imaging element 12 receives the incident light that has passed through the imaging lens 13 and the cover glass 11 and formed an image on the imaging surface, and photoelectrically converts it to generate captured image data corresponding to the incident light. The imaging element 12 is formed on a semiconductor substrate made of a semiconductor capable of detecting the incident light of interest. For example, when the wavelength band of the incident light of interest is the so-called visible light band, an image sensor capable of color photography with a Bayer array is used as the imaging element 12.
  • the imaging device 12 may be various known devices such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor.
  • the imaging lens 13 transmits incident light incident on the imaging unit 10 and forms an image of the transmitted incident light on the imaging surface.
  • a single lens is shown as the imaging lens 13 in FIG. 1, this is merely an example, and the imaging lens 13 may be an assembly of a plurality of lenses.
  • the imaging lens 13 having a convex shape is shown.
  • the shape of the imaging lens 13 is arbitrary.
  • the material of the imaging lens 13, the focal length, and the like are not particularly limited.
  • WL-CSP is often adopted as the structure of an imaging unit because the package can be reduced in size and thickness and is excellent in terms of manufacturing cost.
  • the WL-CSP has a structure in which the cover glass is fixed to the imaging element and integrated in the manufacturing process.
  • However, the influence of the aberration caused by the cover glass increases as the cover glass becomes thicker, and the quality of the captured image therefore falls. More specifically, when the cover glass is fixed to the image sensor as in a WL-CSP, the optical path difference from the ideal lens grows with the thickness of the cover glass, and the influence of aberration and the like grows with it. As the influence of aberration increases, image degradation such as defocus and blur occurs, and the quality of the captured image decreases.
  • the aberration correction may become difficult as the cover glass becomes thicker.
  • The inventors of the present case have arrived at the present disclosure in view of the above circumstances.
  • In the present disclosure, a cover glass 11 is used whose shape is such that the optical path difference from the ideal lens at each angle of view θ of the cover glass 11 is equal to or less than the depth of focus.
  • the imaging unit 10 according to the present disclosure can correct the aberration and improve the quality of the captured image.
  • In the following, examples of the present disclosure, the functional configuration of an electronic device including the imaging unit 10 according to the present disclosure, application examples, and the like will be described in order.
  • FIG. 2 is a diagram schematically showing the optical path length L from the imaging lens 13 to the imaging surface when the cover glass 11 is not provided on the image sensor 12.
  • the optical path length L from the imaging lens 13 to a predetermined imaging plane is simply expressed by the following formula (9).
  • the angle of view ⁇ refers to the direction of incident light with respect to the optical axis of the cover glass 11, and is represented by the angle formed by the optical axis direction and the traveling direction of incident light, as shown in FIG. Is done.
  • FIG. 3 is a diagram schematically showing an optical path length L ′ from the imaging lens 13 to the imaging surface when the flat cover glass 11 is provided in the imaging device 12.
  • FIG. 3 shows an optical path length L ′ when a flat glass plate having a thickness t and a refractive index n is provided as the cover glass 11 in the image sensor 12.
  • The plate thickness t of the cover glass 11 corresponds to a thickness of t/n when converted into a distance in air. Accordingly, the distance between the imaging lens 13 and the cover glass 11 to be taken into account when correcting the focus shift caused by providing the cover glass 11 is D − t/n.
  • Compared with equation (9), the optical path length changes under the influence of the flat cover glass 11, and aberration occurs. The value obtained by subtracting L of equation (9) from L′ of equation (10) is the optical path difference caused by providing the flat cover glass 11, and this optical path difference gives rise to aberration.
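  • As a numerical illustration of the flat-plate optical path difference just described, the following Python sketch traces the chief ray through a plane-parallel cover glass using Snell's law. Since formulas (9) and (10) are not reproduced in this text, the function opd_flat_cover_glass, its default parameters, and the assumption L = D/cos θ for the glass-free path are illustrative, not the patent's exact expressions.

        import numpy as np

        def opd_flat_cover_glass(theta_deg, D=4.0e-3, t=0.2e-3, n=1.5):
            # Optical path difference (with glass minus without) for a flat
            # plate of thickness t and index n resting on the sensor, with
            # the exit pupil at distance D. Illustrative, not eq. (10).
            theta = np.radians(theta_deg)
            theta_g = np.arcsin(np.sin(theta) / n)  # refraction angle in glass
            L = D / np.cos(theta)                   # glass-free path, cf. eq. (9)
            L_prime = (D - t) / np.cos(theta) + n * t / np.cos(theta_g)
            return L_prime - L

        # The field-dependent part (value minus the on-axis value) is what
        # varies with the angle of view; the on-axis part is a common focus
        # shift that can be compensated by refocusing.
        for a in (0.0, 10.0, 20.0, 30.0):
            print(a, opd_flat_cover_glass(a) - opd_flat_cover_glass(0.0))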
  • FIG. 4 is a diagram showing the optical path length from the imaging lens 13 to the imaging surface when the imaging element 12 is provided with a cover glass 11 having a spherical shape with a radius of curvature R on the surface.
  • As shown in FIG. 4, the coordinates of the intersection of the incident light with the surface of the cover glass 11 are expressed as functions of the radius of curvature R, (x′(R), y′(R)).
  • the angle formed by the optical axis and the straight line connecting the center of curvature of the cover glass 11 and the intersection point coordinates is ⁇ (R) as a function of the radius of curvature R, and similarly, the refraction angle on the surface of the cover glass 11 is ⁇ ′′ (R).
  • the optical path length L ′′ from the imaging lens 13 to the imaging surface of the light beam incident at the angle of view ⁇ is simply expressed by the following equation (11).
  • a value obtained by subtracting L represented by Equation (9) from L ′′ represented by Equation (11) is an optical path difference caused by the provision of the cover glass 11 having a spherical surface with a radius of curvature R.
  • the exit pupil distance D, the plate thickness t of the cover glass 11 and the refractive index n are fixed values as shown in the above equation (11).
  • Each of the other quantities in equation (11) is expressed as a function of the radius of curvature R. Therefore, the optical path length L″ given by equation (11) is also a function of the radius of curvature R, which derives from the surface shape of the cover glass 11.
  • In the present embodiment, the surface shape of the cover glass 11 is determined so that the optical path difference obtained by subtracting L of expression (9) from L″ of expression (11) is equal to or less than the depth of focus at each angle of view θ.
  • When a cover glass 11 having a spherical surface with a radius of curvature R as described above is employed, where NA is the numerical aperture of the imaging lens 13 and λ [nm] is the wavelength of the incident light, the depth of focus Df is represented by the following formula (12).
  • In the present embodiment, the shape of the surface of the cover glass 11 on the imaging lens 13 side (in other words, the surface located closest to the subject among the optical elements such as lenses constituting the lens optical system) is determined so that the relationship of the following expression (13), whose left side indicates the optical path difference, is satisfied.
  • The depth of focus Df is the tolerable range in the optical axis direction within which a practically clear image is considered to be formed. Therefore, when, as in this embodiment, the optical path difference at each angle of view θ is kept at or below the depth of focus Df, a clear image is formed regardless of the angle of view θ at which light is incident. In other words, the method of this embodiment can correct the aberration of the cover glass 11 and improve the quality of the captured image.
  • the imaging unit 10 of this embodiment is used in an electronic device such as a smartphone.
  • The F value of the imaging unit 10 mounted on an electronic device such as a smartphone is generally about 2.0. Accordingly, if the F value of the imaging lens 13 constituting the imaging unit 10 is 2.0, the numerical aperture NA of the imaging lens 13 is 0.25 according to the following formula (14) (in the formula, the F value is written as "F").
  • electronic devices such as smartphones basically capture visible light.
  • the wavelength ⁇ of incident light to be detected is a median value of a general visible light wavelength band, and the median value is about 550 [nm]
  • the depth of focus D is calculated based on the equation (12).
  • f is 4.4 [ ⁇ m]. That is, when the imaging unit 10 of the present embodiment is used in an electronic device such as a smartphone, the optical path difference is desirably 4.4 [ ⁇ m] or less based on the equation (13).
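  • Expressions (12) and (14) are not reproduced in this text, but the values quoted above pin down their standard forms: the usual paraxial relation NA = 1/(2F) gives NA = 0.25 for F = 2.0, and the conventional depth-of-focus formula below gives 550 nm / (2 × 0.25²) = 4.4 μm, matching the stated value. These reconstructions are therefore offered as assumptions consistent with the text:

        \mathrm{NA} = \frac{1}{2F}, \qquad D_f = \frac{\lambda}{2\,\mathrm{NA}^{2}}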
  • an optical path difference may occur due to the difference between the optical path length in the sagittal direction and the optical path length in the tangential direction due to the influence of astigmatism.
  • Non-Patent Document 1: Warren J. Smith, "Modern Optical Engineering," 2007
  • The amount of astigmatism AS(θ) is expressed by the following equation (15).
  • To take astigmatism into account, a value obtained based on AS(θ) of equation (15) may be applied to expression (13). More specifically, the optical path difference taking astigmatism into account may be calculated by subtracting half of AS(θ) of equation (15) from the optical path difference in expression (13); that is, the shape of the cover glass 11 may be calculated so as to satisfy the following expression (16).
  • Note that expression (16) is merely an example and may be changed as appropriate; for example, the mathematical formula for calculating the astigmatism amount AS based on the surface shape of the cover glass 11 may be changed as appropriate.
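  • Since expression (16) itself is not reproduced in this text, the following is only a plausible reading of the description above (subtract AS(θ)/2 from the optical path difference of expression (13) and compare against the depth of focus), offered as an assumption:

        \bigl|L''(\theta) - L(\theta)\bigr| - \frac{AS(\theta)}{2} \le D_f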
  • As described above, the imaging unit 10 can generate a captured image of sufficient quality by satisfying expression (13) or (16), but a cover glass 11 whose optical path difference from the ideal lens is as small as possible is still more preferable. Here, with reference to FIG. 5, the radius of curvature R of the cover glass 11 that makes the optical path difference from the ideal lens as small as possible will be described.
  • FIG. 5 shows the square of the optical path difference for each angle of view θ of the incident light, under the conditions that the exit pupil distance D is 4.0 [mm], the refractive index n is 1.5, and the plate thickness t is 0.2 [mm].
  • the value of each variable is a value assumed for an electronic device including a small imaging device such as a smartphone or a digital camera.
  • When the radius of curvature R of the cover glass 11 is infinite (indicated as "INF" in the figure), that is, when the cover glass 11 is a flat plate, the optical path difference is larger than when R takes the other values. The optical path difference decreases as the radius of curvature R decreases from infinity, reaches a minimum when R is 50.7 [mm], and grows again as R becomes smaller than 50.7 [mm]. That is, under the above conditions, it is more preferable to employ a cover glass 11 with a radius of curvature R of 50.7 [mm].
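  • The following Python sketch illustrates the kind of calculation behind FIG. 5: a two-dimensional ray trace through a cover glass whose subject-side surface is spherical with radius R, followed by a scan over R. The patent's exact expression (11) is not reproduced here, so the assumed geometry (exit pupil on the axis at distance D, sphere vertex at height t above the sensor) and the error metric (sum of squared field-dependent optical path differences over 0 to 30 [deg]) are illustrative choices; the minimizing R this reports should not be read as a reproduction of the 50.7 [mm] value.

        import numpy as np

        def opd_spherical_cover(theta_deg, R, D=4.0e-3, t=0.2e-3, n=1.5):
            # 2-D chief-ray trace through a cover glass whose top surface is
            # a sphere of radius R with its vertex at height t above the
            # sensor plane. A sketch under assumed geometry, not eq. (11).
            th = np.radians(theta_deg)
            d = np.array([np.sin(th), -np.cos(th)])   # ray direction (x, z)
            p0 = np.array([0.0, D])                   # exit pupil on the axis
            c = np.array([0.0, t - R])                # centre of curvature
            oc = p0 - c
            b = np.dot(oc, d)
            s = -b - np.sqrt(b * b - (np.dot(oc, oc) - R * R))  # air segment
            p = p0 + s * d                            # hit point on the sphere
            nv = (p - c) / R                          # outward surface normal
            eta = 1.0 / n                             # Snell, air -> glass
            ci = -np.dot(d, nv)
            ct = np.sqrt(1.0 - eta * eta * (1.0 - ci * ci))
            dt = eta * d + (eta * ci - ct) * nv       # refracted direction
            s_glass = p[1] / -dt[1]                   # run in glass to z = 0
            return (s + n * s_glass) - D / np.cos(th) # OPD vs. eq. (9) path

        angles = np.linspace(0.0, 30.0, 31)

        def field_error(R):
            opd = np.array([opd_spherical_cover(a, R) for a in angles])
            return np.sum((opd - opd[0]) ** 2)        # drop common focus shift

        radii = np.linspace(10e-3, 200e-3, 96)
        best_R = min(radii, key=field_error)
        print("illustrative best R [mm]:", best_R * 1e3)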
  • Next, the defocus MTF (Modulation Transfer Function) characteristics of the imaging unit 10 will be described with reference to FIGS. 6 to 8. In FIGS. 6 to 8, the horizontal axis indicates the defocus amount and the vertical axis indicates the MTF intensity. FIGS. 6 to 8 show, as an example, the MTF when the angle of view θ of the incident light is 0 [deg] and 30 [deg]; for θ = 30 [deg], both the sagittal MTF and the tangential MTF are shown.
  • FIG. 6 is a graph showing the MTF when the cover glass 11 is not provided. As shown in FIG. 6, in the case of an imaging unit 10 in which the aberration is corrected and no cover glass 11 is provided, the MTF at each angle of view θ shows a good value when the defocus is 0.
  • FIG. 7 is a graph showing the MTF when the flat cover glass 11 is provided.
  • As shown in FIG. 7, the MTF when the angle of view θ of the incident light is 0 [deg] shows a good value, but the MTF deteriorates when θ is 30 [deg]. More specifically, when the angle of view θ of the incident light is 30 [deg], the MTF peak position moves, so the MTF at a defocus of 0 is lowered.
  • FIG. 8 is a graph showing the MTF when a cover glass 11 having a spherical surface with a radius of curvature R according to the first embodiment (for example, the cover glass 11 with a radius of curvature R of 50.7 [mm]) is provided.
  • As shown in FIG. 8, when a cover glass 11 having a spherical surface with a radius of curvature R according to the present embodiment is provided, not only does the MTF at an angle of view θ of 0 [deg] show a good value, but the degradation of the MTF at θ = 30 [deg] is also reduced. More specifically, when the angle of view θ of the incident light is 30 [deg], the movement of the MTF peak position is smaller than in the example shown in FIG. 7, so the MTF at a defocus of 0 is kept high.
  • <Second Embodiment> The first embodiment of the present disclosure has been described above. Next, the second embodiment of the present disclosure will be described.
  • the imaging unit 10 can correct aberrations including astigmatism, and can further improve the quality of the captured image.
  • If, as in the first embodiment, the F value of the electronic device is set to 2.0 and the wavelength λ of the incident light is 550 [nm], the average value of the optical path difference from the ideal lens at each angle of view θ is preferably 4.4 [μm] or less.
  • the angle of view ⁇ of the imaging unit 10 mounted on an electronic device such as a smartphone is generally 80.0 [deg] or less
  • the refractive index n of the cover glass 11 is 1.5
  • the focal length is generally It is assumed that it is 4.0 [mm].
  • the focal length and the exit pupil distance D are the same.
  • FIG. 9 shows a curve indicating the maximum condition of Expression (18) and a curve indicating the minimum condition of Expression (18).
  • the suitable radius of curvature R varies depending on the plate thickness t.
  • Expression (19) is derived by linearly approximating each of the curve indicating the maximum condition of expression (18) and the curve indicating the minimum condition of expression (18) in FIG. 9 (expression (19) is also shown in FIG. 9). With an imaging unit 10 that satisfies expression (19), when used in an electronic device such as a smartphone, the aberration caused by the cover glass 11 is appropriately corrected and the quality of the captured image can be improved.
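  • As an illustration of the linear approximation just described, the sketch below (reusing opd_spherical_cover from the earlier sketch) finds, for each plate thickness t, the smallest and largest R that keep the field-dependent optical path difference within the 4.4 [μm] depth of focus, and fits straight lines through those bounds with numpy.polyfit. The search ranges, the way the budget is applied, and any coefficients this produces are illustrative assumptions; expressions (18) and (19) themselves are not reproduced in this text.

        import numpy as np

        angles = np.linspace(0.0, 30.0, 31)

        def within_budget(R, t, budget=4.4e-6):
            # True if the field-dependent OPD stays inside the depth of focus.
            opd = np.array([opd_spherical_cover(a, R, t=t) for a in angles])
            return np.max(np.abs(opd - opd[0])) <= budget

        ts = np.linspace(0.1e-3, 0.5e-3, 9)
        radii = np.linspace(5e-3, 300e-3, 300)
        lo = np.array([min((r for r in radii if within_budget(r, t)),
                           default=np.nan) for t in ts])
        hi = np.array([max((r for r in radii if within_budget(r, t)),
                           default=np.nan) for t in ts])
        ok = ~np.isnan(lo) & ~np.isnan(hi)  # thicknesses with a feasible R
        a_lo, b_lo = np.polyfit(ts[ok], lo[ok], 1)  # lower bound: R >= a_lo*t + b_lo
        a_hi, b_hi = np.polyfit(ts[ok], hi[ok], 1)  # upper bound: R <= a_hi*t + b_hi
        print("illustrative linear bounds:", a_lo, b_lo, a_hi, b_hi)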
  • In the first and second embodiments, a cover glass 11 whose surface has a spherical shape with a radius of curvature R was used; that is, the cover glass 11 is a spherical lens. In the third embodiment, a case where the cover glass 11 is an aspherical lens will be described.
  • FIG. 10 is a diagram illustrating the optical path length from the imaging lens 13 to the imaging surface when the cover glass 11 having an aspherical structure is fixed to the imaging device 12.
  • ⁇ x is the amount of deviation in the optical axis direction from the aspherical vertex
  • c is the reciprocal of the radius of curvature R of the aspherical lens (cover glass 11)
  • k is the conic constant
  • ⁇ i is an aspheric coefficient.
  • intersection coordinates ( ⁇ x ′, y ′) are calculated by the following simultaneous equations (21) and (22).
  • Let the intersection coordinates obtained as the solutions of the simultaneous equations (21) and (22) be (Δx′(R, α4, α6), y′(R, α4, α6)). Where θ′ is the angle formed between the normal vector at the intersection coordinates and the optical axis, the angle of incidence formed between the normal vector and the light ray is θ + θ′.
  • FIG. 11 is a diagram showing the relationship between the angle of view ⁇ of incident light and the square of the optical path difference when the surface of the cover glass 11 has a spherical shape and an aspherical shape.
  • In FIG. 11, both cover glasses are drawn under the conditions that the exit pupil distance D is 4.0 [mm], the refractive index n is 1.5, the plate thickness t is 0.2 [mm], and the radius of curvature R is 50.8 [mm]; for the cover glass 11 having an aspherical shape, the fourth-order aspherical coefficient α4 is additionally set to 3 × 10^-4 and the sixth-order aspherical coefficient α6 to 5 × 10^-5.
  • In this embodiment as well, the influence of astigmatism may be taken into account, as in the first embodiment. More specifically, the optical path difference taking astigmatism into account may be calculated by the same method as in the first embodiment, based on AS(θ) of equation (15). Accordingly, the imaging unit 10 can correct aberrations including astigmatism and further improve the quality of the captured image.
  • FIG. 11 shows an example in which the effect of astigmatism is taken into account.
  • the cover glass 11 having an aspherical shape reduces the optical path difference from the ideal lens more than the cover glass 11 having a spherical shape. That is, the imaging unit 10 according to the third embodiment may be able to further reduce the influence of aberration caused by the cover glass 11 by making the surface shape of the cover glass 11 an aspherical shape.
  • In the first to third embodiments, the influence of the aberration due to the imaging lens 13 was not taken into consideration; that is, the imaging lens 13 was assumed to be an aberration-free lens. In the fourth embodiment, the influence of aberration caused by the imaging lens 13 is taken into consideration.
  • the optical path difference with the ideal lens for each angle of view ⁇ of the imaging lens 13 may be calculated, and the optical path difference may be used in the above formula (13) or formula (17).
  • the calculation method of the optical path difference with the ideal lens for every angle of view ⁇ of the imaging lens 13 is arbitrary.
  • the optical path difference between the imaging lens 13 and the ideal lens for each angle of view ⁇ may be calculated based on design data of the imaging lens 13 or may be obtained from an actual measurement value.
  • Where f(θ) is the optical path difference between the imaging lens 13 and the ideal lens for each angle of view θ, the following relational expression (23) is obtained by applying f(θ) to expression (13).
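  • Since expression (23) is described as expression (13) with f(θ) applied, a plausible reading, offered only as an assumption because the expression is not reproduced in this text, is that the cover-glass term and the imaging-lens term are combined and compared against the same depth-of-focus budget:

        \bigl|L''(\theta) - L(\theta) + f(\theta)\bigr| \le D_f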
  • the quality of the captured image can be improved by appropriately correcting these aberrations.
  • FIG. 12 is a diagram showing the relationship between the angle of view ⁇ of incident light and the square of the optical path difference in the fourth embodiment.
  • the drawing is performed under conditions where the exit pupil distance D is 4.0 [mm], the refractive index n is 1.5, and the plate thickness t is 0.2 [mm].
  • When the imaging lens 13 is not an aberration-free lens, that is, when the aberration due to the imaging lens 13 is not well corrected, the degree of field curvature produced by the imaging lens 13 is proportional to the square of the angle of view θ of the incident light.
  • the optical path difference f ( ⁇ ) between the imaging lens 13 and the ideal lens for each angle of view ⁇ satisfies the following formula (25).
  • In this embodiment as well, the influence of astigmatism may be taken into account. More specifically, the optical path difference taking astigmatism into account may be calculated by the same method as in the first embodiment, based on AS(θ) of equation (15). Accordingly, the imaging unit 10 can correct aberrations including astigmatism and further improve the quality of the captured image.
  • FIG. 12 shows an example in which the effect of astigmatism is taken into account.
  • As shown in FIG. 12, at the radius of curvature of 50.7 [mm], which is optimal when the imaging lens 13 is an aberration-free lens (as in the first embodiment), the optical path difference from the ideal lens becomes large.
  • FIG. 13 is a diagram illustrating a functional configuration of the electronic device 100 including the imaging unit 10 according to the present disclosure.
  • the electronic device 100 includes an imaging unit 10, an input information acquisition unit 20, a display control unit 30, a storage unit 40, and a control unit 50.
  • (Imaging unit 10) The imaging unit 10 has the features and functions described above, and can generate good captured image data by correcting aberrations through the shape of the cover glass 11.
  • the imaging unit 10 provides the generated captured image data to the control unit 50 described later. Since the configuration of the imaging unit 10 is as described above, detailed description thereof will be omitted below.
  • the input information acquisition unit 20 is an interface used for input by the user of the electronic device 100.
  • the input information acquisition unit 20 includes a button, a touch panel, a keyboard, a microphone, a pointing device, and the like, and the user inputs to the electronic device 100 using these devices.
  • the input information acquisition unit 20 provides information input by the user to the control unit 50 described later. Note that the input information acquisition unit 20 may acquire input information from an external device including a button and provide the input information to the control unit 50.
  • the display control unit 30 controls display of various information.
  • the display control unit 30 includes a display device such as a display, and displays various types of information in various formats such as images, text, and graphs.
  • the display control unit 30 may realize display by transmitting control information for display to an external device including a display or the like.
  • the storage unit 40 stores various parameters and databases, various programs, and the like that can be referred to when the control unit 50 described later performs various control processes. Further, the storage unit 40 may store temporary data generated when various control processes are performed by the control unit 50, various history information, and the like. The control unit 50 can freely perform data read / write processing on the storage unit 40.
  • the storage unit 40 is realized by, for example, a ROM, a RAM, a storage device, and the like.
  • (Control unit 50) The control unit 50 controls various processes in the electronic device 100.
  • the control unit 50 controls the imaging process by the imaging unit 10 based on the information input by the user, and controls the display process of the captured image by the display control unit 30.
  • the process is merely an example, and the control unit 50 may appropriately control other various processes.
  • the control unit 50 is realized by, for example, a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
  • Note that the control unit 50 may implement some of the functions of the imaging unit 10, the input information acquisition unit 20, the display control unit 30, or the storage unit 40.
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 14 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
  • In FIG. 14, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated as the functional configuration of the integrated control unit 12050.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • For example, the drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, and a braking device for generating the braking force of the vehicle.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
  • Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
  • the vehicle outside information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
  • the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform detection processing for objects such as people, vehicles, obstacles, signs, or characters on the road surface, or distance detection processing.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light.
  • the imaging unit 12031 can output an electrical signal as an image, or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
  • the vehicle interior information detection unit 12040 detects vehicle interior information.
  • a driver state detection unit 12041 that detects a driver's state is connected to the in-vehicle information detection unit 12040.
  • The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the driver's degree of fatigue or concentration, or may determine whether the driver is dozing off.
  • The microcomputer 12051 can calculate a control target value of the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control aimed at realizing ADAS (Advanced Driver Assistance System) functions, including collision avoidance or impact mitigation for the vehicle, following traveling based on inter-vehicle distance, speed-maintaining traveling, vehicle collision warning, and lane departure warning.
  • In addition, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on information about the surroundings of the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, the microcomputer 12051 can perform cooperative control aimed at automated driving, in which the vehicle travels autonomously without depending on the driver's operation.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle outside information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control for anti-glare purposes, such as controlling the headlamps according to the position of a preceding or oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
  • the sound image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 15 is a diagram illustrating an example of an installation position of the imaging unit 12031.
  • the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, a side mirror, a rear bumper, a back door, and an upper part of a windshield in the vehicle interior of the vehicle 12100.
  • the imaging unit 12101 provided in the front nose and the imaging unit 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirror mainly acquire an image of the side of the vehicle 12100.
  • the imaging unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
  • the forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 15 shows an example of the shooting range of the imaging units 12101 to 12104.
  • the imaging range 12111 indicates the imaging range of the imaging unit 12101 provided in the front nose
  • the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided in the side mirrors, respectively
  • and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 as viewed from above is obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the nearest three-dimensional object that is on the traveling path of the vehicle 12100 and travels in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or more).
  • Further, the microcomputer 12051 can set an inter-vehicle distance to be maintained behind the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like.
  • In this way, cooperative control aimed at automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, can be performed.
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles.
  • For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult to see.
  • The microcomputer 12051 then determines the collision risk indicating the degree of danger of collision with each obstacle, and in a situation where the collision risk is at or above a set value and there is a possibility of collision, it can perform driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • The microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is carried out, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether it is a pedestrian.
  • When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating the pedestrian is displayed at a desired position.
  • the technology according to the present disclosure may be applied to the imaging unit 12031, for example.
  • the imaging unit 10 according to the present embodiment can be applied to the imaging unit 12031.
  • the technology according to the present disclosure can be applied to various products.
  • the technology according to the present disclosure may be applied to an endoscopic surgery system.
  • FIG. 16 is a diagram illustrating an example of a schematic configuration of an endoscopic surgery system to which the technology (present technology) according to the present disclosure can be applied.
  • FIG. 16 shows a state in which an operator (doctor) 11131 is performing an operation on a patient 11132 on a patient bed 11133 using an endoscopic operation system 11000.
  • As illustrated, the endoscopic surgery system 11000 includes an endoscope 11100, other surgical instruments 11110 such as a pneumoperitoneum tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
  • the endoscope 11100 includes a lens barrel 11101 in which a region having a predetermined length from the distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101.
  • In the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having a rigid lens barrel 11101, but the endoscope 11100 may instead be configured as a so-called flexible endoscope having a flexible lens barrel.
  • An opening into which the objective lens is fitted is provided at the tip of the lens barrel 11101.
  • A light source device 11203 is connected to the endoscope 11100; light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101, and is emitted through the objective lens toward the observation target in the body cavity of the patient 11132.
  • Note that the endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
  • An optical system and an image sensor are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed on the image sensor by the optical system. Observation light is photoelectrically converted by the imaging element, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image is generated.
  • the image signal is transmitted to a camera control unit (CCU: Camera Control Unit) 11201 as RAW data.
  • the CCU 11201 is configured by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls operations of the endoscope 11100 and the display device 11202. Further, the CCU 11201 receives an image signal from the camera head 11102 and performs various kinds of image processing for displaying an image based on the image signal, such as development processing (demosaic processing), for example.
  • the display device 11202 displays an image based on an image signal subjected to image processing by the CCU 11201 under the control of the CCU 11201.
  • the light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies irradiation light to the endoscope 11100 when photographing a surgical site or the like.
  • the input device 11204 is an input interface for the endoscopic surgery system 11000.
  • a user can input various information and instructions to the endoscopic surgery system 11000 via the input device 11204.
  • For example, the user inputs an instruction to change the imaging conditions of the endoscope 11100 (type of irradiation light, magnification, focal length, and the like).
  • the treatment instrument control device 11205 controls the drive of the energy treatment instrument 11112 for tissue ablation, incision, blood vessel sealing, or the like.
  • the pneumoperitoneum device 11206 passes gas into the body cavity via the pneumoperitoneum tube 11111.
  • the recorder 11207 is an apparatus capable of recording various types of information related to surgery.
  • the printer 11208 is a device that can print various types of information related to surgery in various formats such as text, images, or graphs.
  • the light source device 11203 that supplies the irradiation light when the surgical site is imaged to the endoscope 11100 can be configured by, for example, a white light source configured by an LED, a laser light source, or a combination thereof.
  • When a white light source is configured by a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high accuracy, so the light source device 11203 can adjust the white balance of the captured image.
  • Further, the driving of the light source device 11203 may be controlled so as to change the intensity of the output light at predetermined intervals. By controlling the driving of the image sensor of the camera head 11102 in synchronization with the timing of the change in light intensity, acquiring images in a time-division manner, and combining the images, a high-dynamic-range image without so-called blocked-up shadows or blown-out highlights can be generated.
  • the light source device 11203 may be configured to be able to supply light of a predetermined wavelength band corresponding to special light observation.
  • In special light observation, for example, so-called narrow band imaging is performed: by utilizing the wavelength dependence of light absorption in body tissue, light in a narrower band than the irradiation light used in normal observation (that is, white light) is emitted, and predetermined tissue such as blood vessels in the mucosal surface layer is imaged with high contrast.
  • Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiation with excitation light. In fluorescence observation, body tissue can be irradiated with excitation light to observe fluorescence from the body tissue (autofluorescence observation), or a reagent such as indocyanine green (ICG) can be locally administered to the body tissue and the body tissue irradiated with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image.
  • the light source device 11203 can be configured to be able to supply narrowband light and / or excitation light corresponding to such special light observation.
  • FIG. 17 is a block diagram showing an example of the functional configuration of the camera head 11102 and CCU 11201 shown in FIG.
  • the camera head 11102 includes a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405.
  • the CCU 11201 includes a communication unit 11411, an image processing unit 11412, and a control unit 11413.
  • the camera head 11102 and the CCU 11201 are connected to each other by a transmission cable 11400 so that they can communicate with each other.
  • the lens unit 11401 is an optical system provided at a connection portion with the lens barrel 11101. Observation light taken from the tip of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401.
  • the lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
  • the imaging unit 11402 includes an imaging element.
  • The imaging unit 11402 may include one image sensor (so-called single-plate type) or a plurality of image sensors (so-called multi-plate type).
  • In the case of the multi-plate type, for example, image signals corresponding to R, G, and B may be generated by the respective image sensors and combined to obtain a color image.
  • the imaging unit 11402 may be configured to include a pair of imaging elements for acquiring right-eye and left-eye image signals corresponding to 3D (Dimensional) display. By performing the 3D display, the operator 11131 can more accurately grasp the depth of the living tissue in the surgical site.
  • the imaging unit 11402 is not necessarily provided in the camera head 11102.
  • the imaging unit 11402 may be provided inside the lens barrel 11101 immediately after the objective lens.
  • the driving unit 11403 is configured by an actuator, and moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis under the control of the camera head control unit 11405. Thereby, the magnification and the focus of the image captured by the imaging unit 11402 can be adjusted as appropriate.
  • the communication unit 11404 is configured by a communication device for transmitting and receiving various types of information to and from the CCU 11201.
  • the communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
  • the communication unit 11404 receives a control signal for controlling driving of the camera head 11102 from the CCU 11201 and supplies the control signal to the camera head control unit 11405.
  • The control signal includes information about imaging conditions, such as information designating the frame rate of the captured image, information designating the exposure value at the time of imaging, and/or information designating the magnification and focus of the captured image.
  • The above imaging conditions, such as the frame rate, exposure value, magnification, and focus, may be designated by the user as appropriate, or may be set automatically by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, so-called AE (Auto Exposure), AF (Auto Focus), and AWB (Auto White Balance) functions are mounted on the endoscope 11100.
  • the camera head control unit 11405 controls driving of the camera head 11102 based on a control signal from the CCU 11201 received via the communication unit 11404.
  • the communication unit 11411 is configured by a communication device for transmitting and receiving various types of information to and from the camera head 11102.
  • the communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
  • the communication unit 11411 transmits a control signal for controlling driving of the camera head 11102 to the camera head 11102.
  • the image signal and the control signal can be transmitted by electrical communication, optical communication, or the like.
  • the image processing unit 11412 performs various types of image processing on the image signal that is RAW data transmitted from the camera head 11102.
  • the control unit 11413 performs various types of control related to imaging of the surgical site by the endoscope 11100 and display of a captured image obtained by imaging of the surgical site. For example, the control unit 11413 generates a control signal for controlling driving of the camera head 11102.
  • control unit 11413 causes the display device 11202 to display a picked-up image showing the surgical part or the like based on the image signal subjected to the image processing by the image processing unit 11412.
  • the control unit 11413 may recognize various objects in the captured image using various image recognition techniques.
  • For example, the control unit 11413 can recognize surgical tools such as forceps, specific body sites, bleeding, mist generated when the energy treatment tool 11112 is used, and the like by detecting the shapes, colors, and the like of the edges of objects included in the captured image.
  • The control unit 11413 may display various types of surgery support information superimposed on the image of the surgical site using the recognition result; one illustrative detection heuristic is sketched below. Displaying the surgery support information in a superimposed manner and presenting it to the operator 11131 reduces the burden on the operator 11131 and allows the operator 11131 to proceed with the surgery reliably.
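As one illustrative way to realize such shape- and color-based detection, the following Python sketch masks low-saturation (metallic-looking) regions and keeps elongated contours as forceps-like candidates. The thresholds and the heuristic itself are assumptions for illustration, not the patent's recognition method.

```python
import cv2
import numpy as np

def find_forceps_like_regions(bgr: np.ndarray) -> list:
    """Heuristic sketch: metallic tools tend to be low-saturation and elongated."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    # Low saturation, medium-to-high brightness: a rough proxy for metal surfaces.
    mask = cv2.inRange(hsv, np.array((0, 0, 80)), np.array((180, 60, 255)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    hits = []
    for c in contours:
        if cv2.contourArea(c) < 500:                       # skip small specular spots
            continue
        (_, (w, h), _) = cv2.minAreaRect(c)                # oriented bounding box
        if min(w, h) > 0 and max(w, h) / min(w, h) > 3.0:  # elongated, tool-like
            hits.append(cv2.boundingRect(c))               # (x, y, w, h) for overlays
    return hits
```

The returned rectangles could then be used to draw the superimposed support information mentioned above.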
  • the transmission cable 11400 for connecting the camera head 11102 and the CCU 11201 is an electric signal cable corresponding to electric signal communication, an optical fiber corresponding to optical communication, or a composite cable thereof.
  • In the illustrated example, communication is performed by wire using the transmission cable 11400, but communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
  • the technology according to the present disclosure can be applied to, for example, the imaging unit 11402 of the camera head 11102.
  • the imaging unit 10 according to the present embodiment can be applied to the imaging unit 11402 of the camera head 11102.
  • By applying the imaging unit 10 to the imaging unit 11402 of the camera head 11102, it is possible to generate a high-quality captured image in which aberrations are appropriately corrected.
  • the technology according to the present disclosure can be applied to various products.
  • For example, the technology according to the present disclosure may be applied to an in-vivo information acquisition system of a patient using a capsule endoscope.
  • FIG. 18 is a block diagram illustrating an example of a schematic configuration of a patient in-vivo information acquisition system using a capsule endoscope to which the technique (present technique) according to the present disclosure can be applied.
  • the in-vivo information acquisition system 10001 includes a capsule endoscope 10100 and an external control device 10200.
  • the capsule endoscope 10100 is swallowed by the patient at the time of examination.
  • The capsule endoscope 10100 has an imaging function and a wireless communication function; until it is naturally discharged from the patient, it moves inside organs such as the stomach and the intestine by peristaltic motion or the like, sequentially captures images of the inside of the organs (hereinafter also referred to as in-vivo images) at predetermined intervals, and sequentially transmits information about the in-vivo images wirelessly to the external control device 10200 outside the body.
  • The external control device 10200 comprehensively controls the operation of the in-vivo information acquisition system 10001. The external control device 10200 also receives the information about the in-vivo images transmitted from the capsule endoscope 10100 and, based on the received information, generates image data for displaying the in-vivo images on a display device (not shown).
  • In this manner, in-vivo images of the inside of the patient's body can be obtained at any time from when the capsule endoscope 10100 is swallowed until it is discharged.
  • the capsule endoscope 10100 includes a capsule-type casing 10101.
  • In the casing 10101, a light source unit 10111, an imaging unit 10112, an image processing unit 10113, a wireless communication unit 10114, a power feeding unit 10115, a power supply unit 10116, and a control unit 10117 are housed.
  • the light source unit 10111 is composed of a light source such as an LED (Light Emitting Diode), for example, and irradiates the imaging field of the imaging unit 10112 with light.
  • The imaging unit 10112 includes an image sensor and an optical system composed of a plurality of lenses provided in front of the image sensor. Reflected light (hereinafter referred to as observation light) of the light irradiated onto the body tissue to be observed is collected by the optical system and enters the image sensor. In the image sensor, the incident observation light is photoelectrically converted, and an image signal corresponding to the observation light is generated. The image signal generated by the imaging unit 10112 is provided to the image processing unit 10113.
  • the image processing unit 10113 is configured by a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), and performs various signal processing on the image signal generated by the imaging unit 10112.
  • the image processing unit 10113 provides the radio communication unit 10114 with the image signal subjected to signal processing as RAW data.
  • the wireless communication unit 10114 performs predetermined processing such as modulation processing on the image signal that has been subjected to signal processing by the image processing unit 10113, and transmits the image signal to the external control apparatus 10200 via the antenna 10114A.
  • the wireless communication unit 10114 receives a control signal related to drive control of the capsule endoscope 10100 from the external control device 10200 via the antenna 10114A.
  • the wireless communication unit 10114 provides a control signal received from the external control device 10200 to the control unit 10117.
  • the power feeding unit 10115 includes a power receiving antenna coil, a power regeneration circuit that regenerates power from a current generated in the antenna coil, a booster circuit, and the like. In the power feeding unit 10115, electric power is generated using a so-called non-contact charging principle.
  • The power supply unit 10116 is composed of a secondary battery, and stores the electric power generated by the power feeding unit 10115.
  • In FIG. 18, in order to avoid complicating the drawing, arrows and the like indicating the power supply destinations from the power supply unit 10116 are omitted; however, the power stored in the power supply unit 10116 is supplied to the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the control unit 10117, and can be used to drive them.
  • The control unit 10117 includes a processor such as a CPU, and appropriately controls the driving of the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the power feeding unit 10115 in accordance with control signals transmitted from the external control device 10200.
  • the external control device 10200 is configured by a processor such as a CPU or GPU, or a microcomputer or a control board in which a processor and a storage element such as a memory are mounted.
  • the external control device 10200 controls the operation of the capsule endoscope 10100 by transmitting a control signal to the control unit 10117 of the capsule endoscope 10100 via the antenna 10200A.
  • In the capsule endoscope 10100, for example, the irradiation conditions of light for the observation target in the light source unit 10111 can be changed by a control signal from the external control device 10200. In addition, the imaging conditions (for example, the frame rate and the exposure value in the imaging unit 10112) can also be changed by a control signal from the external control device 10200.
  • Further, the contents of the processing in the image processing unit 10113 and the conditions under which the wireless communication unit 10114 transmits image signals (for example, the transmission interval and the number of images to be transmitted) may be changed by a control signal from the external control device 10200.
  • the external control device 10200 performs various image processing on the image signal transmitted from the capsule endoscope 10100, and generates image data for displaying the captured in-vivo image on the display device.
  • As the image processing, various kinds of signal processing can be performed, for example, development processing (demosaic processing), image quality enhancement processing (band enhancement processing, super-resolution processing, NR (Noise Reduction) processing, and/or camera shake correction processing, etc.), and/or enlargement processing (electronic zoom processing); a sketch of the development step follows below.
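As a concrete illustration of the development (demosaic) step named above, here is a minimal sketch of bilinear demosaicing of a Bayer mosaic with NumPy and SciPy; the function name and the assumed RGGB layout are illustrative choices, not taken from the patent.

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw: np.ndarray) -> np.ndarray:
    """Bilinear demosaic of an RGGB Bayer mosaic (H x W float array) into RGB."""
    h, w = raw.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0   # R at even rows/cols
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0   # B at odd rows/cols
    g_mask = 1.0 - r_mask - b_mask                        # G on the remaining sites

    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0    # interpolate G
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0   # interpolate R and B

    rgb = np.empty((h, w, 3))
    rgb[..., 0] = convolve(raw * r_mask, k_rb, mode="mirror")
    rgb[..., 1] = convolve(raw * g_mask, k_g, mode="mirror")
    rgb[..., 2] = convolve(raw * b_mask, k_rb, mode="mirror")
    return rgb
```

The resulting RGB frame is what the subsequent quality-enhancement steps (band enhancement, NR, electronic zoom) would then operate on.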
  • the external control device 10200 controls driving of the display device to display an in-vivo image captured based on the generated image data.
  • the external control device 10200 may cause the generated image data to be recorded on a recording device (not shown) or may be printed out on a printing device (not shown).
  • the technology according to the present disclosure can be applied to, for example, the imaging unit 10112 among the configurations described above.
  • the imaging unit 10 according to the present embodiment can be applied to the imaging unit 10112. By applying the imaging unit 10 to the imaging unit 10112, it is possible to generate a high-quality captured image in which aberrations are appropriately corrected.
  • In the above embodiments, the refraction phenomenon at the surface of the cover glass 11 is used as the aberration correcting means.
  • However, the present invention is not limited to this, and the aberration may be corrected by using a cover glass 11 having a diffractive-lens shape on its surface.
  • A cover glass 11 having an aspherical surface may also be used (third embodiment).
  • In either case, the optical path difference from the ideal lens is obtained, and the shape of the cover glass 11 is determined so that the optical path difference is equal to or less than the depth of focus D_f.
  • the imaging lens 13 may also be a diffractive lens.
  • In the above description, the cover glass 11 and the image sensor 12 have a WL-CSP structure, and the cover glass 11 is formed by grinding down the light-transmissive substrate (the substrate from which the cover glass 11 is derived) fixed on the image sensor 12.
  • However, the cover glass 11 may instead be formed by applying an optical resin onto the image sensor 12.
  • In any case, it suffices to use a cover glass 11 having a shape such that the optical path difference from the ideal lens for each angle of view θ is equal to or less than the depth of focus D_f.
  • the imaging unit 10 according to the present disclosure can correct the aberration and improve the quality of the captured image.
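The numbered expressions (26)–(38) referenced below survive in this extraction only as image references, but the surrounding text fixes their content: the optical path difference from the ideal lens must fit within the depth of focus at every angle of view. A plausible rendering of that core criterion, as an assumed reconstruction rather than the literal claimed expressions, is:

```latex
% Assumed reconstruction of the core design criterion (cf. expressions (26)-(38)):
% the optical path difference L(theta) from the ideal lens must not exceed the
% depth of focus D_f of the imaging optical system at any angle of view theta.
\[
  \left| L(\theta) \right| \;\le\; D_f = \frac{\lambda}{2\,\mathrm{NA}^{2}},
  \qquad 0 \le \theta \le \theta_{\max}
\]
% Variants additionally combine L(theta) with f(theta), the path difference of
% the imaging optical system, inside the bound.
```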
  • An imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], and the optical path difference from an ideal lens for each angle of view θ is L(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (26).
  • the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (27).
  • Where the optical path difference from the ideal lens for each angle of view θ of the imaging optical system is f(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (28); the imaging unit according to (1) or (2).
  • Where the maximum value of the angle of view is θmax [deg], the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (29).
  • At least a part of the region of the subject-side surface of the lens optical system on which the incident light is incident has a convex structure with a radius of curvature R.
  • The radius of curvature R satisfies the following expression (31), where t [mm] is the thickness of the lens optical system.
  • The object-side surface of the lens optical system has an aspherical structure; the imaging unit according to any one of (1) to (5).
  • (9) The object-side surface of the lens optical system has a diffractive structure; the imaging unit according to any one of (1) to (5).
  • (10) The lens optical system is formed of an optical resin; the imaging unit according to any one of (1) to (9).
  • An imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the maximum value of the angle of view is θmax, and the optical path difference from the ideal lens for each angle of view θ is L(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (32).
  • An imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the optical path difference from the ideal lens for each angle of view θ is L(θ), and the optical path difference from the ideal lens for each angle of view θ of the imaging optical system is f(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (33).
  • An imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the maximum value of the angle of view is θmax, the optical path difference from the ideal lens for each angle of view θ is L(θ), and the optical path difference from the ideal lens for each angle of view θ of the imaging optical system is f(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (34).
  • (14) An image sensor on which an image of a subject is formed;
  • a lens optical system provided on the subject side of the image sensor and fixed to the image sensor;
  • the wavelength of incident light incident on the image sensor is ⁇
  • the angle of view is ⁇ [deg]
  • the optical path difference from the ideal lens for each angle of view ⁇ is L ( ⁇ )
  • An electronic apparatus including an imaging unit, wherein the surface shape of the lens optical system is a shape that satisfies the following expression (35).
  • An electronic apparatus including an imaging unit, the imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the maximum value of the angle of view is θmax, and the optical path difference from the ideal lens for each angle of view θ is L(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (36).
  • An electronic apparatus including an imaging unit, the imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the optical path difference from the ideal lens for each angle of view θ is L(θ), and the optical path difference from the ideal lens for each angle of view θ of the imaging optical system is f(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (37).
  • An electronic apparatus including an imaging unit, the imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the maximum value of the angle of view is θmax, the optical path difference from the ideal lens for each angle of view θ is L(θ), and the optical path difference from the ideal lens for each angle of view θ of the imaging optical system is f(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (38).


Abstract

[Problem] To make it possible to reduce the effects of aberration resulting from a WL-CSP structure. [Solution] Provided is an imaging unit comprising: an imaging element on which an image of a subject is formed; a lens optical system which is provided on the subject side of the imaging element and is fixed to the imaging element; and an imaging optical system which is provided between the subject and the lens optical system and has a numerical aperture NA. Defining λ as the wavelength of the light incident on the imaging element, θ [deg] as the angle of view, and L(θ) as the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side is a shape that satisfies expression (1).

Description

Imaging unit and electronic device
This disclosure relates to an imaging unit and an electronic device.
In recent years, WL-CSP (Wafer Level Chip Size Package) has often been adopted as the structure of an imaging unit because it allows the package to be reduced in size and thickness and is also excellent in terms of manufacturing cost. Since WL-CSP goes through a manufacturing process in which a lens optical system (a cover glass or the like) is bonded to a wafer on which image sensors are formed and the wafer provided with the lens optical system is then separated into individual pieces, it has a structure in which the cover glass is fixed to the image sensor and no air layer exists between the cover glass and the image sensor. Patent Document 1 below discloses a technique in which a film having a predetermined specific gravity and a transparent resin are filled between an image sensor and a cover glass in a WL-CSP.
Patent Document 1: JP 2014-175465 A
Here, with the existing techniques including the technique disclosed in Patent Document 1, it has been difficult to reduce the influence of aberration caused by a cover glass fixed to an image sensor. For example, when light enters the imaging unit, passes through the cover glass, and forms an image on an imaging plane located between the cover glass and the image sensor, the influence of the aberration caused by the cover glass increases as the cover glass becomes thicker, so the quality of the captured image deteriorates.
It is also conceivable to correct the aberration by changing the shape, material, and the like of an imaging optical system (for example, an imaging lens) provided closer to the subject than the cover glass, but the thicker the cover glass, the more difficult such aberration correction becomes.
The present disclosure has been made in view of the above, and provides an imaging unit and an electronic device capable of further reducing the influence of aberration caused by the WL-CSP structure.
According to the present disclosure, there is provided an imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], and the optical path difference from an ideal lens for each angle of view θ is L(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (1).
Figure JPOXMLDOC01-appb-M000014
Further, according to the present disclosure, there is provided an imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the maximum value of the angle of view is θmax, and the optical path difference from an ideal lens for each angle of view θ is L(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (2).
Figure JPOXMLDOC01-appb-M000015
Further, according to the present disclosure, there is provided an imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the optical path difference from an ideal lens for each angle of view θ is L(θ), and the optical path difference from the ideal lens for each angle of view θ of the imaging optical system is f(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (3).
Figure JPOXMLDOC01-appb-M000016
Further, according to the present disclosure, there is provided an imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the maximum value of the angle of view is θmax, the optical path difference from an ideal lens for each angle of view θ is L(θ), and the optical path difference from the ideal lens for each angle of view θ of the imaging optical system is f(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (4).
Figure JPOXMLDOC01-appb-M000017
Further, according to the present disclosure, there is provided an electronic device including an imaging unit, the imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], and the optical path difference from an ideal lens for each angle of view θ is L(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (5).
Figure JPOXMLDOC01-appb-M000018
Further, according to the present disclosure, there is provided an electronic device including an imaging unit, the imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the maximum value of the angle of view is θmax, and the optical path difference from an ideal lens for each angle of view θ is L(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (6).
Figure JPOXMLDOC01-appb-M000019
Further, according to the present disclosure, there is provided an electronic device including an imaging unit, the imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the optical path difference from an ideal lens for each angle of view θ is L(θ), and the optical path difference from the ideal lens for each angle of view θ of the imaging optical system is f(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (7).
Figure JPOXMLDOC01-appb-M000020
Further, according to the present disclosure, there is provided an electronic device including an imaging unit, the imaging unit including: an image sensor on which an image of a subject is formed; a lens optical system provided on the subject side of the image sensor and fixed to the image sensor; and an imaging optical system having a numerical aperture NA provided between the subject and the lens optical system, wherein, when the wavelength of incident light incident on the image sensor is λ, the angle of view is θ [deg], the maximum value of the angle of view is θmax, the optical path difference from an ideal lens for each angle of view θ is L(θ), and the optical path difference from the ideal lens for each angle of view θ of the imaging optical system is f(θ), the surface shape of the lens optical system located on the subject side is a shape that satisfies the following expression (8).
Figure JPOXMLDOC01-appb-M000021
According to the present disclosure, since the surface shape of the lens optical system located on the subject side has the predetermined shape described above, the aberration caused by the WL-CSP structure is corrected.
As described above, according to the present disclosure, it is possible to further reduce the influence of aberration caused by the WL-CSP structure.
Note that the above effects are not necessarily limited, and any of the effects shown in this specification, or other effects that can be grasped from this specification, may be achieved together with or in place of the above effects.
FIG. 1 is a diagram showing the configuration of an imaging unit according to the present disclosure.
FIG. 2 is a diagram showing the optical path length from the imaging lens to the imaging plane when no cover glass is provided on the image sensor.
FIG. 3 is a diagram showing the optical path length from the imaging lens to the imaging plane when a flat cover glass is provided on the image sensor.
FIG. 4 is a diagram showing the optical path length from the imaging lens to the imaging plane when a cover glass having a spherical surface with a radius of curvature R is provided on the image sensor.
FIG. 5 is a diagram showing, for each radius of curvature R, the relationship between the angle of view θ of incident light and the square of the optical path difference.
FIG. 6 is a graph showing the MTF when no cover glass is provided.
FIG. 7 is a graph showing the MTF when a flat cover glass is provided.
FIG. 8 is a graph showing the MTF when a cover glass having a spherical surface with a radius of curvature R according to the first example is provided.
FIG. 9 is a diagram showing the relationship between the thickness of the cover glass and the radius of curvature when the imaging unit according to the second example is used in an electronic device such as a smartphone.
FIG. 10 is a diagram showing the optical path length from the imaging lens to the imaging plane when a cover glass having an aspherical structure is fixed to the image sensor.
FIG. 11 is a diagram showing the relationship between the angle of view θ of incident light and the square of the optical path difference for a cover glass having a spherical shape and a cover glass having an aspherical shape.
FIG. 12 is a diagram showing the relationship between the angle of view θ of incident light and the square of the optical path difference in the fourth example.
FIG. 13 is a diagram showing the functional configuration of an electronic device including the imaging unit according to the present disclosure.
FIG. 14 is a block diagram showing an example of a schematic configuration of a vehicle control system.
FIG. 15 is an explanatory diagram showing an example of installation positions of a vehicle exterior information detection unit and imaging units.
FIG. 16 is a diagram showing an example of a schematic configuration of an endoscopic surgery system.
FIG. 17 is a block diagram showing an example of the functional configuration of a camera head and a CCU.
FIG. 18 is a block diagram showing an example of a schematic configuration of an in-vivo information acquisition system.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
The description will be given in the following order.
1. Summary of the disclosure
2. First example
3. Second example
4. Third example
5. Fourth example
6. Functional configuration of an electronic device including the imaging unit according to the present disclosure
7. Application example to a moving body
8. Application example to an endoscopic surgery system
9. Application example to an in-vivo information acquisition system
10. Remarks
11. Conclusion
<1. Summary of the disclosure>
(1-1. Overview of the imaging unit 10)
First, an overview of the imaging unit 10 according to the present disclosure will be described with reference to FIG. 1. FIG. 1 is a diagram showing the configuration of the imaging unit 10 according to the present disclosure.
As shown in FIG. 1, the imaging unit 10 according to the present disclosure includes at least a cover glass 11 as an example of a lens optical system, an image sensor 12, and an imaging lens 13 as an example of an imaging optical system. The imaging unit 10 forms an image of incident light on the imaging plane by means of the imaging lens 13 and the cover glass 11, and thereby generates, with the image sensor 12, captured image data representing the luminance distribution of the incident light. The imaging unit 10 according to the present disclosure can be used in electronic devices such as smartphones, digital cameras, personal computers, and tablet computers. However, these application examples are merely examples, and the imaging unit 10 according to the present disclosure can be used for various applications or apparatuses.
The cover glass 11 and the image sensor 12 in the present disclosure have a WL-CSP structure. That is, the cover glass 11 and the image sensor 12 form a package manufactured by fixing a light-transmissive substrate (the substrate from which the cover glass 11 is derived) onto a wafer-state semiconductor substrate on which a plurality of image sensors 12 are formed, and then separating the result into individual pieces. Since WL-CSP is processed collectively in the wafer state, the package can be reduced in size and thickness, and it is also excellent in terms of manufacturing cost.
The cover glass 11 is a component constructed by separating the light-transmissive substrate into individual pieces, and transmits incident light. Being located closer to the subject than the image sensor 12, the cover glass 11 transmits incident light and forms an image of the incident light on an imaging plane located between the cover glass 11 and the image sensor 12. That is, the cover glass 11 is formed of a material that can transmit light in the wavelength band to which the light to be imaged on the image sensor 12 belongs (a material that can be regarded as transparent in the wavelength band of interest). Further, due to the WL-CSP manufacturing process, the cover glass 11 is fixed to the image sensor 12, and there is no hollow space between the cover glass 11 and the image sensor 12.
The image sensor 12 receives and photoelectrically converts the incident light that has passed through the imaging lens 13 and the cover glass 11 and formed an image on the imaging plane, thereby generating captured image data corresponding to the incident light. That is, the image sensor 12 is formed on a semiconductor substrate made of a semiconductor capable of detecting the incident light of interest. For example, when the wavelength band of the incident light of interest is the so-called visible light band, an image sensor capable of color imaging with, for example, a Bayer array is used as the image sensor 12. The image sensor 12 may be any of various known elements such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor.
The imaging lens 13 transmits the incident light that has entered the imaging unit 10 and forms an image of the transmitted incident light on the imaging plane. Although FIG. 1 shows a single lens as the imaging lens 13, this is merely an example, and the imaging lens 13 may be an assembly of a plurality of lenses. Although FIG. 1 shows an imaging lens 13 having a convex shape, this is also merely an example; the shape of the imaging lens 13 is arbitrary, and it may, for example, have a concave shape or an aspherical structure. The material, focal length, and the like of the imaging lens 13 are not particularly limited either.
(1-2. Background)
The overview of the imaging unit 10 according to the present disclosure has been described above. Next, the background of the present disclosure will be described.
In recent years, WL-CSP has often been adopted as the structure of an imaging unit because the package can be reduced in size and thickness and it is also excellent in terms of manufacturing cost. As described above, WL-CSP has an integrated structure in which the cover glass is fixed to the image sensor in the manufacturing process.
Therefore, when light enters the imaging unit, passes through the cover glass, and forms an image on the imaging plane located between the cover glass and the image sensor, the influence of the aberration caused by the cover glass increases as the cover glass becomes thicker, and the quality of the captured image deteriorates. More specifically, when the cover glass is fixed to the image sensor as in WL-CSP, the optical path difference from the ideal lens increases as the cover glass becomes thicker, so the influence of the aberration (spherical aberration and the like) caused by the cover glass increases. As the influence of the aberration increases, image disturbances such as defocus and blurring occur, and the quality of the captured image deteriorates.
It is also conceivable to perform aberration correction by changing at least one of the shape and material of the imaging lens, but the thicker the cover glass, the more difficult such aberration correction may become.
The present discloser arrived at the present disclosure by focusing on the above circumstances. In the imaging unit 10 according to the present disclosure, a cover glass 11 having a shape such that the optical path difference from the ideal lens for each angle of view θ of the cover glass 11 is equal to or less than the depth of focus is used. Thereby, the imaging unit 10 according to the present disclosure can correct the aberration and improve the quality of the captured image. Hereinafter, examples of the present disclosure, the functional configuration of an electronic device including the imaging unit 10 according to the present disclosure, application examples, and the like will be described in order.
<2. First example>
The background of the present disclosure has been described above. Next, the first example of the present disclosure will be described.
First, the influence of the aberration caused by the cover glass 11 will be described with reference to FIGS. 2 to 4. FIG. 2 is a diagram showing the optical path length L from the imaging lens 13 to the imaging plane when the cover glass 11 is not provided on the image sensor 12.
As shown in FIG. 2, consider a case where the exit pupil distance of the imaging lens 13 is D [mm] and light of a predetermined wavelength is incident on the imaging lens 13 at an angle of view θ [deg]. In this case, the optical path length L from the imaging lens 13 to a predetermined imaging plane is simply expressed by the following expression (9). In this specification, the angle of view θ refers to the direction of the incident light with respect to the optical axis of the cover glass 11, and is represented by the angle formed between the optical axis direction and the traveling direction of the incident light, as shown in FIG. 2.
    L = D / cosθ   … (9)
Next, FIG. 3 will be described. FIG. 3 is a diagram schematically showing the optical path length L' from the imaging lens 13 to the imaging plane when a flat cover glass 11 is provided on the image sensor 12. FIG. 3 shows the optical path length L' when a flat glass plate having a thickness t and a refractive index n is provided on the image sensor 12 as the cover glass 11.
In FIG. 3, the thickness t of the cover glass 11 is expressed as t/n when converted into an equivalent distance in air. Therefore, the distance between the imaging lens 13 and the cover glass 11 to be taken into account when correcting the defocus caused by providing the cover glass 11 is D − t/n. Here, when the incident light is incident at an angle of view θ, the relationship between the angle of view θ and the refraction angle θ' is expressed by sinθ = n × sinθ' according to Snell's law. Therefore, when a flat glass plate is provided on the image sensor 12 as the cover glass 11 as shown in FIG. 3, the optical path length L' from the imaging lens 13 to the imaging plane of a light ray incident at the angle of view θ is simply expressed by the following expression (10).
Figure JPOXMLDOC01-appb-M000023
As is clear from a comparison between expression (9) and expression (10), the optical path length changes under the influence of the flat cover glass 11, and aberration occurs. In other words, the value obtained by subtracting L expressed by expression (9) from L' expressed by expression (10) is the optical path difference caused by providing the flat cover glass 11, and aberration occurs due to this optical path difference.
Next, FIG. 4 will be described. FIG. 4 is a diagram showing the optical path length from the imaging lens 13 to the imaging plane when a cover glass 11 having a spherical surface with a radius of curvature R is provided on the image sensor 12. As shown in FIG. 4, when the optical center of the imaging lens 13 is taken as the origin, the coordinates of the intersection of the incident light and the cover glass 11 are expressed as (x'(R), y'(R)) as functions of the radius of curvature R. The angle formed between the optical axis and the straight line connecting the center of curvature of the cover glass 11 and the intersection coordinates is denoted by φ(R) as a function of the radius of curvature R, and similarly, the refraction angle at the surface of the cover glass 11 is denoted by θ''(R). In this case, the optical path length L'' from the imaging lens 13 to the imaging plane of a light ray incident at the angle of view θ is simply expressed by the following expression (11).
Figure JPOXMLDOC01-appb-M000024
The value obtained by subtracting L expressed by expression (9) from L'' expressed by expression (11) is the optical path difference caused by providing the cover glass 11 having a spherical surface with the radius of curvature R, and aberration occurs due to this optical path difference. As shown in expression (11), when the exit pupil distance D, the thickness t of the cover glass 11, and the refractive index n are fixed values, each of the other values is expressed as a function of the radius of curvature R. Therefore, the optical path difference derived from expression (11) is also a function of the radius of curvature R, which comes from the surface shape of the cover glass 11.
In the imaging unit 10 according to this example, a cover glass 11 having a spherical surface with a radius of curvature R such that the optical path difference obtained by subtracting L expressed by expression (9) from L'' expressed by expression (11) is equal to or less than the depth of focus at each angle of view θ is employed. Here, when the numerical aperture of the imaging lens 13 is NA and the wavelength of the incident light is λ [nm], the depth of focus D_f is expressed by the following expression (12).
    D_f = λ / (2 × NA²)   … (12)
That is, in this example, the shape of the cover glass 11 on the imaging lens 13 side (in other words, the shape of the lens located closest to the subject among the optical elements such as lenses constituting the lens optical system) is defined so that the relationship of the following expression (13) is satisfied. Here, |Diff(θ, R)| denotes the optical path difference.
    |Diff(θ, R)| ≤ D_f = λ / (2 × NA²)   … (13)
The depth of focus D_f is the allowable range in the optical axis direction within which a practically sharp image is regarded as being formed. Therefore, by making the optical path difference at each angle of view θ equal to or less than the depth of focus D_f as in this example, a sharp image is formed regardless of the angle of view θ at which light is incident. That is, the method of this example can correct the aberration caused by the cover glass 11 and improve the quality of the captured image.
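Stated as a check, the criterion of expression (13) is easy to compute. Below is a minimal Python sketch, assuming the path-difference function Diff(θ) (the quantity built from expressions (9) to (11)) has been implemented elsewhere and is passed in as a callable; units are meters for lengths and degrees for angles.

```python
import numpy as np
from typing import Callable

def satisfies_depth_of_focus(
    diff: Callable[[float], float],  # Diff(theta_deg) -> optical path difference [m]
    theta_max_deg: float,
    wavelength_m: float,
    na: float,
    samples: int = 361,
) -> bool:
    """Check |Diff(theta)| <= D_f = lambda / (2 * NA^2) over the field (expr. (13))."""
    d_f = wavelength_m / (2.0 * na ** 2)
    thetas = np.linspace(0.0, theta_max_deg, samples)
    return all(abs(diff(t)) <= d_f for t in thetas)
```

For the smartphone case discussed below (NA = 0.25, λ = 550 nm), the bound d_f evaluates to 4.4 μm, matching the text.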
Here, as an example, consider a case where the imaging unit 10 of this example is used in an electronic device such as a smartphone. The F-number of an imaging unit 10 mounted on an electronic device such as a smartphone is generally about 2.0. Therefore, assuming that the F-number of the imaging lens 13 constituting the imaging unit 10 is 2.0, the numerical aperture NA of the imaging lens 13 is 0.25 according to the following expression (14) (in the expression, the F-number is written as "F").
    NA = 1 / (2 × F)   … (14)
In addition, electronic devices such as smartphones basically capture visible light. Here, if the wavelength λ of the detected incident light is taken to be the median of the general visible light wavelength band, about 550 [nm], the depth of focus D_f is 4.4 [μm] based on expression (12). That is, when the imaging unit 10 of this example is used in an electronic device such as a smartphone, the optical path difference is desirably 4.4 [μm] or less based on expression (13).
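These two numbers follow directly from expressions (14) and (12); the following plain-arithmetic check uses only the stated F-number and wavelength.

```python
F = 2.0                                # typical F-number of a smartphone imaging unit
NA = 1.0 / (2.0 * F)                   # expression (14): NA = 1 / (2F)
wavelength_m = 550e-9                  # mid-band visible wavelength, 550 [nm]
D_f = wavelength_m / (2.0 * NA ** 2)   # expression (12): depth of focus
print(f"NA = {NA}, D_f = {D_f * 1e6:.1f} um")   # NA = 0.25, D_f = 4.4 um
```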
Further, the optical path difference may be calculated in consideration of the influence of astigmatism in addition to the influence of spherical aberration as described above. More specifically, in an actual optical system, an optical path difference can arise from the difference between the optical path length in the sagittal direction and that in the tangential direction due to the influence of astigmatism. Non-Patent Document 1 (Warren J. Smith, "Modern Optical Engineering", 2007) discloses that the astigmatism amount AS(θ) of a light ray incident at an angle of view θ on a flat glass plate having a refractive index n and a thickness t is expressed by the following expression (15).
Figure JPOXMLDOC01-appb-M000028
In this example, a value obtained based on AS(θ) of expression (15) may be applied to expression (13). For example, an optical path difference that takes astigmatism into account may be calculated by subtracting the value obtained by dividing AS(θ) of expression (15) by 2 from the optical path difference of expression (13). That is, the shape of the cover glass 11 may be determined by the following expression (16). In this way, the imaging unit 10 can correct aberrations including not only spherical aberration but also astigmatism, and can further improve the quality of the captured image. Note that expression (16) is merely an example and may be changed as appropriate. For example, the formula for calculating the astigmatism amount AS based on the surface shape of the cover glass 11 (that is, the above expression (15)) may be changed as appropriate.
    |Diff(θ, R) − AS(θ) / 2| ≤ λ / (2 × NA²)   … (16)
Note that the imaging unit 10 can generate a captured image of sufficient quality by satisfying the above expression (13) or (16), but it is more preferable to use a cover glass 11 such that the optical path difference from the ideal lens becomes as small as possible. The radius of curvature R of the cover glass 11 that can make the optical path difference from the ideal lens as small as possible will now be described with reference to FIG. 5.
FIG. 5 shows the value given by the square of the optical path difference for each angle of view θ of incident light, under the conditions that the exit pupil distance D is 4.0 [mm], the refractive index n is 1.5, and the thickness t is 0.2 [mm]. The value of each variable assumes an electronic device equipped with a small imaging device, such as a smartphone or a digital camera.
As shown in FIG. 5, when the radius of curvature R of the cover glass 11 is infinite (denoted "INF" in the figure), that is, when the cover glass 11 is a flat plate, the optical path difference is larger than for any other value of R. The optical path difference decreases as the radius of curvature R decreases from infinity, reaches a minimum when R is 50.7 [mm], and increases again when R becomes smaller than 50.7 [mm]. That is, when the variables take the above values, it is more preferable to employ a cover glass 11 having a radius of curvature R of 50.7 [mm].
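The optimum reported in FIG. 5 can be understood as the result of a simple one-dimensional search. The sketch below is an assumption-laden illustration, not the patent's method; `squared_opd(theta_deg, radius_mm)` is a hypothetical stand-in for the squared optical path difference of the preceding equations:

```python
import numpy as np

# A minimal grid-search sketch: pick the radius of curvature whose worst-case
# squared path difference over the field angles is smallest, mirroring how
# the 50.7 mm optimum in FIG. 5 could be located numerically.
def best_radius(squared_opd, radii_mm, thetas_deg):
    worst = [max(squared_opd(t, r) for t in thetas_deg) for r in radii_mm]
    return radii_mm[int(np.argmin(worst))]

# e.g. radii_mm = np.linspace(20.0, 200.0, 1801) brackets the reported
# optimum for D = 4.0 mm, n = 1.5, t = 0.2 mm.
```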
Next, the defocus modulation transfer function (MTF) characteristics of the imaging unit 10 are described with reference to FIGS. 6 to 8. In FIGS. 6 to 8, the horizontal axis indicates the defocus amount and the vertical axis indicates the MTF intensity. As an example, FIGS. 6 to 8 show the MTF when the angle of view θ of the incident light is 0 [deg] and when it is 30 [deg]. For θ of 30 [deg], both the sagittal MTF and the tangential MTF are shown.
First, FIG. 6 is described. FIG. 6 is a graph showing the MTF when the cover glass 11 is not provided. As shown in FIG. 6, for an imaging unit 10 whose aberrations are corrected in the absence of the cover glass 11, the MTF at each angle of view θ shows a good value when the defocus is 0.
Next, FIG. 7 is described. FIG. 7 is a graph showing the MTF when a flat cover glass 11 is provided. As shown in FIG. 7, with a flat cover glass 11, the MTF shows a good value when the angle of view θ of the incident light is 0 [deg] but deteriorates when θ is 30 [deg]. More specifically, when θ is 30 [deg], the peak position of the MTF shifts, so the MTF at a defocus of 0 is lowered.
Next, FIG. 8 is described. FIG. 8 is a graph showing the MTF when a cover glass 11 having a spherical surface with the radius of curvature R according to the first embodiment is provided (as an example, the cover glass 11 with a radius of curvature R of 50.7 [mm]). As shown in FIG. 8, with this cover glass 11, not only does the MTF show a good value when the angle of view θ of the incident light is 0 [deg], but the deterioration of the MTF when θ is 30 [deg] is also reduced. More specifically, when θ is 30 [deg], the shift of the MTF peak position is smaller than in the example of FIG. 7, so the MTF at a defocus of 0 is kept high.
<3. Second Embodiment>
The first embodiment of the present disclosure has been described above. Next, a second embodiment of the present disclosure is described.
In the first embodiment, the radius of curvature R of the cover glass 11 was determined so that the optical path difference from the ideal lens at each angle of view θ is equal to or less than the depth of focus Df. In the second embodiment, by contrast, the radius of curvature R is determined so that the average of the optical path differences from the ideal lens over the angles of view θ is equal to or less than the depth of focus Df. That is, in the second embodiment, the following equation (17) or (18) is satisfied. As in the first embodiment, equation (18) is the conditional expression for the case where the influence of astigmatism (AS(θ) in equation (15) above) is taken into account. By equation (18), the imaging unit 10 can correct aberrations including astigmatism and can further improve the quality of the captured image.
[Equation (17); image: Figure JPOXMLDOC01-appb-M000030]
[Equation (18); image: Figure JPOXMLDOC01-appb-M000031]
Equation (18) in particular can realize sufficient imaging performance under a condition that is more relaxed than equation (16) of the first embodiment. In other words, even if the optical path difference from the ideal lens exceeds the depth of focus Df at some angles of view θ, equation (18) is still satisfied as long as the average of the optical path differences over the angles of view θ is equal to or less than the depth of focus Df, so sufficient imaging performance can be realized.
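The relaxation can be made concrete with a small sketch (again under the assumptions noted earlier): the first embodiment requires the corrected path difference to stay within Df at every angle, while the second embodiment only constrains the average; `corrected_opd` is a hypothetical per-angle evaluation:

```python
# Per-angle condition (first embodiment, eqs. (13)/(16)) versus the averaged
# condition (second embodiment, eqs. (17)/(18)).
def per_angle_ok(corrected_opd, depth_of_focus_um, thetas_deg):
    return all(abs(corrected_opd(t)) <= depth_of_focus_um for t in thetas_deg)

def average_ok(corrected_opd, depth_of_focus_um, thetas_deg):
    mean = sum(abs(corrected_opd(t)) for t in thetas_deg) / len(thetas_deg)
    return mean <= depth_of_focus_um  # some angles may individually exceed Df
```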
Considering the case where the imaging unit 10 according to the second embodiment is used in an electronic device such as a smartphone, and assuming, as in the first embodiment, that the F-value of the electronic device is 2.0 and the wavelength λ of the incident light is 550 [nm], the average of the optical path differences from the ideal lens over the angles of view θ is desirably 4.4 [μm] or less.
Also, the angle of view θ of an imaging unit 10 mounted on an electronic device such as a smartphone is generally 80.0 [deg] or less; assume further that the refractive index n of the cover glass 11 is 1.5, that the focal length is a typical 4.0 [mm], and that the focal length and the exit pupil distance D are the same. Substituting these parameters into equation (18) yields curves with the radius of curvature R and the plate thickness t as variables, as shown in FIG. 9. FIG. 9 shows a curve for the maximum condition of equation (18) and a curve for the minimum condition of equation (18). As shown in FIG. 9, the suitable radius of curvature R varies with the plate thickness t.
Furthermore, the following equation (19) is derived by linearly approximating each of the curves for the maximum and minimum conditions of equation (18) in FIG. 9 (equation (19) is also shown in FIG. 9).
[Equation (19); image: Figure JPOXMLDOC01-appb-M000032]
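The linear approximation step can be sketched as follows (an illustration under assumptions, not the patent's computation): fit a straight line R ≈ a·t + b to sampled points of each boundary curve of FIG. 9; applying it to both the maximum-condition and minimum-condition curves brackets R between two linear functions of t, which is the form of equation (19):

```python
import numpy as np

# Fit R = slope * t + intercept to one boundary curve sampled from FIG. 9;
# t_mm and r_mm are hypothetical sample arrays of plate thickness and radius.
def linear_fit(t_mm, r_mm):
    slope, intercept = np.polyfit(t_mm, r_mm, deg=1)
    return slope, intercept
```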
From the above, when an imaging unit 10 that satisfies equation (19) is used in an electronic device such as a smartphone, the quality of the captured image can be improved by appropriately correcting the aberration caused by the cover glass 11.
<4. Third Embodiment>
The second embodiment of the present disclosure has been described above. Next, a third embodiment of the present disclosure is described.
In the first and second embodiments, a cover glass 11 having a spherical surface with a radius of curvature R was used; that is, the cover glass 11 was a spherical lens. In the third embodiment below, the case where the cover glass 11 is an aspherical lens is described.
Here, the third embodiment is described more specifically with reference to FIG. 10. FIG. 10 is a diagram showing the optical path length from the imaging lens 13 to the imaging plane when a cover glass 11 having an aspherical structure is fixed to the imaging device 12.
The surface shape of a general even-order aspherical lens, well known as a lens having an aspherical structure, is expressed by the following equation (20). In equation (20), Δx is the deviation in the optical axis direction from the aspherical vertex, c is the reciprocal of the radius of curvature R of the aspherical lens (cover glass 11), k is the conic constant, and αi are the aspheric coefficients.
[Equation (20); image: Figure JPOXMLDOC01-appb-M000033]
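For reference, the standard even-order asphere sag equation that this passage appears to describe is reproduced below; this is a reconstruction from the definitions above (with y the radial height from the optical axis), not a verbatim copy of the patent's equation (20):

```latex
\Delta x \;=\; \frac{c\,y^{2}}{1+\sqrt{1-(1+k)\,c^{2}y^{2}}}\;+\;\sum_{i}\alpha_{i}\,y^{2i}
```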
Here, assuming for example that the conic constant k is 0, that the fourth-order aspheric coefficient is α4, and that the sixth-order aspheric coefficient is α6, the coordinates (Δx', y') of the intersection of the incident light with the cover glass 11 in FIG. 10 are calculated from the following simultaneous equations (21) and (22).
[Equation (21); image: Figure JPOXMLDOC01-appb-M000034]
[Equation (22); image: Figure JPOXMLDOC01-appb-M000035]
Let (Δx'(R, α4, α6), y'(R, α4, α6)) be the intersection coordinates that solve the simultaneous equations (21) and (22), and let φ' be the angle between the normal vector at the intersection and the x-axis. The angle between the normal vector and the ray is then θ + φ', so by Snell's law the refraction angle θ'' of the ray satisfies sin(θ + φ') = n·sin θ''. From the above, the optical path length can be calculated in the same manner as in equation (11) above.
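The two steps described here, intersecting the ray with the aspheric surface and refracting it by Snell's law, can be sketched numerically as below. This is a hedged illustration under assumed coordinate conventions (exit pupil centre at x = -d on the axis, asphere vertex at x_vertex, k = 0), not the patent's procedure; the bracketing interval passed to the root finder is likewise an assumption:

```python
import math
from scipy.optimize import brentq

def sag(y, c, a4, a6):
    # Even-order asphere with k = 0, the case assumed in this passage.
    return c * y**2 / (1.0 + math.sqrt(1.0 - c**2 * y**2)) + a4 * y**4 + a6 * y**6

def dsag_dy(y, c, a4, a6, eps=1e-7):
    # Numerical surface slope; its arctangent is the normal tilt phi'.
    return (sag(y + eps, c, a4, a6) - sag(y - eps, c, a4, a6)) / (2.0 * eps)

def intersect_and_refract(theta_deg, d_mm, x_vertex_mm, c, a4, a6, n):
    """Chief ray from the exit pupil centre (x = -d, y = 0) at field angle
    theta; returns the hit height y' and the refraction angle theta''."""
    th = math.radians(theta_deg)
    if th == 0.0:
        return 0.0, 0.0  # on-axis ray meets the vertex and is undeviated
    # Equations (21)/(22) reduced to a single root problem in y:
    #   ray     x_ray(y)  = y / tan(theta) - d
    #   surface x_surf(y) = x_vertex + sag(y)
    f = lambda y: (y / math.tan(th) - d_mm) - (x_vertex_mm + sag(y, c, a4, a6))
    y_hit = brentq(f, 1e-9, 0.99 / c)           # bracket assumed to contain the hit
    phi = math.atan(dsag_dy(y_hit, c, a4, a6))  # normal tilt vs the x-axis
    theta_refr = math.asin(math.sin(th + phi) / n)  # sin(theta+phi') = n sin theta''
    return y_hit, theta_refr
```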
Here, the effect of giving the surface of the cover glass 11 an aspherical shape is described with reference to FIG. 11. FIG. 11 is a diagram showing the relationship between the angle of view θ of the incident light and the square of the optical path difference for a cover glass 11 with a spherical surface and for one with an aspherical surface. The spherical cover glass 11 is plotted under the conditions that the exit pupil distance D is 4.0 [mm], the refractive index n is 1.5, the plate thickness t is 0.2 [mm], and the radius of curvature R is 50.8 [mm]; the aspherical cover glass 11 is plotted under the conditions that the fourth-order aspheric coefficient α4 is 3×10^-4 and the sixth-order aspheric coefficient α6 is 5×10^-5.
Also in the third embodiment, as in the first embodiment, the influence of astigmatism may be taken into account. More specifically, in this embodiment, an optical path difference that accounts for astigmatism may be calculated from AS(θ) of equation (15) above, in the same manner as in the first embodiment. The imaging unit 10 can thereby correct aberrations including astigmatism and can further improve the quality of the captured image. FIG. 11 shows an example in which the influence of astigmatism is taken into account.
In the example shown in FIG. 11, the cover glass 11 with the aspherical surface reduces the optical path difference from the ideal lens more than the cover glass 11 with the spherical surface does. That is, the imaging unit 10 according to the third embodiment may be able to further reduce the influence of the aberration caused by the cover glass 11 by making the surface of the cover glass 11 aspherical.
<5. Fourth Embodiment>
The third embodiment of the present disclosure has been described above. Next, a fourth embodiment of the present disclosure is described.
In the first to third embodiments, the influence of the aberration of the imaging lens 13 was not considered; that is, the imaging lens 13 was assumed to be an aberration-free lens. In the fourth embodiment, by contrast, the influence of the aberration of the imaging lens 13 is taken into account.
More specifically, the optical path difference of the imaging lens 13 from the ideal lens may be calculated for each angle of view θ, and that optical path difference may be used in equation (13) or (17) above. The method of calculating this optical path difference for each angle of view θ is arbitrary; for example, it may be calculated from the design data of the imaging lens 13 or obtained from measured values.
The following equation (23) is the relational expression obtained when f(θ), the optical path difference of the imaging lens 13 from the ideal lens for each angle of view θ, is applied to equation (13).
[Equation (23); image: Figure JPOXMLDOC01-appb-M000036]
The following equation (24) is the relational expression obtained when f(θ) is applied to equation (17).
[Equation (24); image: Figure JPOXMLDOC01-appb-M000037]
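A brief sketch of how f(θ) enters the conditions (under the same assumptions as the earlier sketches; the combination is shown here as a simple sum, and all callables are hypothetical placeholders):

```python
# Equation (23): per-angle check with the lens term f(theta) included.
def eq23_ok(opd_cover, f_lens, depth_of_focus_um, thetas_deg):
    return all(abs(opd_cover(t) + f_lens(t)) <= depth_of_focus_um
               for t in thetas_deg)

# Equation (24): the same combination, constrained on average over theta.
def eq24_ok(opd_cover, f_lens, depth_of_focus_um, thetas_deg):
    mean = sum(abs(opd_cover(t) + f_lens(t)) for t in thetas_deg) / len(thetas_deg)
    return mean <= depth_of_focus_um
```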
The fourth embodiment considers not only the aberration caused by the cover glass 11 but also the aberration caused by the imaging lens 13, and by appropriately correcting these aberrations, the quality of the captured image can be improved.
Next, the optimum radius of curvature R when the aberration of the imaging lens 13 is also considered is described. FIG. 12 is a diagram showing the relationship between the angle of view θ of the incident light and the square of the optical path difference in the fourth embodiment. FIG. 12 is plotted under the conditions that the exit pupil distance D is 4.0 [mm], the refractive index n is 1.5, and the plate thickness t is 0.2 [mm].
Here, when the imaging lens 13 is not an aberration-free lens, that is, when the aberration of the imaging lens 13 is not well corrected, the degree of field curvature produced by the imaging lens 13 is proportional to the square of the angle of view θ of the incident light. In FIG. 12, as an example, it is assumed that the optical path difference f(θ) of the imaging lens 13 from the ideal lens for each angle of view θ satisfies the following equation (25).
[Equation (25); image: Figure JPOXMLDOC01-appb-M000038]
Also in the fourth embodiment, as in the first embodiment, the influence of astigmatism may be taken into account. More specifically, in this embodiment, an optical path difference that accounts for astigmatism may be calculated from AS(θ) of equation (15) above, in the same manner as in the first embodiment. The imaging unit 10 can thereby correct aberrations including astigmatism and can further improve the quality of the captured image. FIG. 12 shows an example in which the influence of astigmatism is taken into account.
As shown in FIG. 12, the optical path difference from the ideal lens at the radius of curvature of 50.7 [mm], which was optimal when the imaging lens 13 was treated as an aberration-free lens (in the first embodiment), becomes large; the optical path difference instead reaches a minimum when the radius of curvature R is 33.0 [mm]. That is, when the variables take the above values, it is more preferable to employ a cover glass 11 having a radius of curvature R of 33.0 [mm].
<6. Functional configuration of electronic device including imaging unit according to present disclosure>
The fourth embodiment of the present disclosure has been described above. Next, the functional configuration of an electronic device 100, such as a smartphone, in which the imaging unit 10 according to the present disclosure is used is described with reference to FIG. 13. FIG. 13 is a diagram showing the functional configuration of the electronic device 100 including the imaging unit 10 according to the present disclosure.
As shown in FIG. 13, the electronic device 100 according to the present disclosure includes an imaging unit 10, an input information acquisition unit 20, a display control unit 30, a storage unit 40, and a control unit 50.
(Imaging unit 10)
The imaging unit 10 has the features and functions described above and can generate good captured image data because its aberrations are corrected by the shape of the cover glass 11. The imaging unit 10 provides the generated captured image data to the control unit 50 described later. Since the configuration of the imaging unit 10 has already been described, a detailed description is omitted here.
(Input information acquisition unit 20)
The input information acquisition unit 20 is an interface used for input by the user of the electronic device 100. For example, the input information acquisition unit 20 includes buttons, a touch panel, a keyboard, a microphone, a pointing device, and the like, and the user uses these devices to provide input to the electronic device 100. The input information acquisition unit 20 provides the information input by the user to the control unit 50 described later. The input information acquisition unit 20 may also acquire input information from an external device equipped with buttons or the like and provide that input information to the control unit 50.
(Display control unit 30)
The display control unit 30 controls the display of various information. Specifically, the display control unit 30 includes a display device such as a display and presents various information in a variety of formats such as images, text, and graphs. The display control unit 30 may also realize display by transmitting control information for display to an external device equipped with a display or the like.
(Storage unit 40)
The storage unit 40 stores various parameters, databases, programs, and the like that the control unit 50 described later can refer to when performing various control processes. The storage unit 40 may also store temporary data and various history information generated when the control unit 50 performs various control processes. The control unit 50 can freely read data from and write data to the storage unit 40. The storage unit 40 is realized by, for example, a ROM, a RAM, a storage device, or the like.
(Control unit 50)
The control unit 50 controls various processes in the electronic device 100. For example, the control unit 50 controls the imaging process of the imaging unit 10 based on information input by the user, and controls the display process of the captured image by the display control unit 30. These processes are merely examples, and the control unit 50 may control other various processes as appropriate. The control unit 50 is realized by, for example, a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
Note that the functional configuration in FIG. 13 is merely an example, and the functional configuration of the electronic device 100 according to the present disclosure may be changed as appropriate. Some of the functions of the electronic device 100 may also be embodied by the control unit 50; for example, the control unit 50 may embody part of the functions of the imaging unit 10, the input information acquisition unit 20, the display control unit 30, or the storage unit 40.
<7. Application example to mobile objects>
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
FIG. 14 is a block diagram showing a schematic configuration example of a vehicle control system, which is an example of a mobile body control system to which the technology according to the present disclosure can be applied.
The vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001. In the example shown in FIG. 14, the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050. As the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio/image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
The drive system control unit 12010 controls the operation of devices related to the drive system of the vehicle according to various programs. For example, the drive system control unit 12010 functions as a control device for a driving force generating device for generating the driving force of the vehicle, such as an internal combustion engine or a driving motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism for adjusting the steering angle of the vehicle, a braking device for generating the braking force of the vehicle, and the like.
The body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs. For example, the body system control unit 12020 functions as a control device for a keyless entry system, a smart key system, a power window device, or various lamps such as headlamps, back lamps, brake lamps, turn signals, or fog lamps. In this case, radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020. The body system control unit 12020 accepts the input of these radio waves or signals and controls the door lock device, the power window device, the lamps, and the like of the vehicle.
The vehicle exterior information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted. For example, an imaging unit 12031 is connected to the vehicle exterior information detection unit 12030. The vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image. Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for people, vehicles, obstacles, signs, characters on the road surface, and the like.
The imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light. The imaging unit 12031 can output the electrical signal as an image or as distance measurement information. The light received by the imaging unit 12031 may be visible light or invisible light such as infrared light.
The vehicle interior information detection unit 12040 detects information about the inside of the vehicle. For example, a driver state detection unit 12041 that detects the state of the driver is connected to the vehicle interior information detection unit 12040. The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the driver's degree of fatigue or concentration, or may determine whether the driver is dozing off.
The microcomputer 12051 can calculate control target values for the driving force generating device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output control commands to the drive system control unit 12010. For example, the microcomputer 12051 can perform cooperative control aimed at realizing the functions of an ADAS (Advanced Driver Assistance System), including vehicle collision avoidance or impact mitigation, following travel based on the inter-vehicle distance, vehicle speed maintaining travel, vehicle collision warning, vehicle lane departure warning, and the like.
Further, the microcomputer 12051 can perform cooperative control aimed at automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generating device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
The microcomputer 12051 can also output control commands to the body system control unit 12020 based on the information outside the vehicle acquired by the vehicle exterior information detection unit 12030. For example, the microcomputer 12051 can perform cooperative control aimed at anti-glare, such as controlling the headlamps according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
The audio/image output unit 12052 transmits an output signal of at least one of audio and image to an output device capable of visually or audibly notifying the vehicle occupants or the outside of the vehicle of information. In the example of FIG. 14, an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices. The display unit 12062 may include, for example, at least one of an on-board display and a head-up display.
FIG. 15 is a diagram showing an example of the installation positions of the imaging unit 12031.
In FIG. 15, the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
The imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as the front nose, the side mirrors, the rear bumper, the back door, and the upper part of the windshield in the vehicle interior of the vehicle 12100. The imaging unit 12101 provided on the front nose and the imaging unit 12105 provided at the upper part of the windshield in the vehicle interior mainly acquire images ahead of the vehicle 12100. The imaging units 12102 and 12103 provided on the side mirrors mainly acquire images of the sides of the vehicle 12100. The imaging unit 12104 provided on the rear bumper or the back door mainly acquires images behind the vehicle 12100. The forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting preceding vehicles, pedestrians, obstacles, traffic lights, traffic signs, lanes, and the like.
FIG. 15 also shows an example of the imaging ranges of the imaging units 12101 to 12104. The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 viewed from above is obtained.
At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information. For example, at least one of the imaging units 12101 to 12104 may be a stereo camera composed of a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can obtain the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract as a preceding vehicle, in particular, the closest three-dimensional object on the traveling path of the vehicle 12100 that travels at a predetermined speed (for example, 0 km/h or more) in substantially the same direction as the vehicle 12100. Furthermore, the microcomputer 12051 can set in advance the inter-vehicle distance to be secured from the preceding vehicle, and can perform automatic brake control (including following stop control), automatic acceleration control (including following start control), and the like. In this way, it is possible to perform cooperative control aimed at automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify three-dimensional object data relating to three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, extract them, and use them for automatic avoidance of obstacles. For example, the microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles visible to the driver of the vehicle 12100 and obstacles difficult for the driver to see. The microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, or by performing forced deceleration or avoidance steering via the drive system control unit 12010.
At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared light. For example, the microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the images captured by the imaging units 12101 to 12104. Such pedestrian recognition is performed, for example, by a procedure of extracting feature points in the images captured by the imaging units 12101 to 12104 as infrared cameras, and a procedure of performing pattern matching processing on a series of feature points indicating the outline of an object to determine whether it is a pedestrian. When the microcomputer 12051 determines that a pedestrian is present in the images captured by the imaging units 12101 to 12104 and recognizes the pedestrian, the audio/image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian. The audio/image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
An example of a vehicle control system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to, for example, the imaging unit 12031. Specifically, the imaging unit 10 according to the present embodiment can be applied to the imaging unit 12031. By applying the imaging unit 10 to the imaging unit 12031, it becomes possible to generate high-quality captured images in which aberrations are appropriately corrected.
<8. Application example to endoscopic surgery system>
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an endoscopic surgery system.
FIG. 16 is a diagram showing an example of the schematic configuration of an endoscopic surgery system to which the technology according to the present disclosure (the present technology) can be applied.
FIG. 16 shows an operator (doctor) 11131 performing surgery on a patient 11132 on a patient bed 11133 using the endoscopic surgery system 11000. As shown in the figure, the endoscopic surgery system 11000 is composed of an endoscope 11100, other surgical instruments 11110 such as a pneumoperitoneum tube 11111 and an energy treatment instrument 11112, a support arm device 11120 that supports the endoscope 11100, and a cart 11200 on which various devices for endoscopic surgery are mounted.
The endoscope 11100 is composed of a lens barrel 11101, a region of which of a predetermined length from its distal end is inserted into the body cavity of the patient 11132, and a camera head 11102 connected to the proximal end of the lens barrel 11101. In the illustrated example, the endoscope 11100 is configured as a so-called rigid endoscope having a rigid lens barrel 11101, but the endoscope 11100 may also be configured as a so-called flexible endoscope having a flexible lens barrel.
An opening into which an objective lens is fitted is provided at the distal end of the lens barrel 11101. A light source device 11203 is connected to the endoscope 11100; light generated by the light source device 11203 is guided to the distal end of the lens barrel by a light guide extending inside the lens barrel 11101, and is emitted through the objective lens toward the observation target in the body cavity of the patient 11132. The endoscope 11100 may be a forward-viewing endoscope, an oblique-viewing endoscope, or a side-viewing endoscope.
An optical system and an imaging element are provided inside the camera head 11102, and reflected light (observation light) from the observation target is condensed onto the imaging element by the optical system. The observation light is photoelectrically converted by the imaging element, and an electrical signal corresponding to the observation light, that is, an image signal corresponding to the observation image, is generated. The image signal is transmitted as RAW data to a camera control unit (CCU) 11201.
The CCU 11201 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), and the like, and comprehensively controls the operation of the endoscope 11100 and a display device 11202. Furthermore, the CCU 11201 receives the image signal from the camera head 11102 and applies to it various image processing for displaying an image based on the image signal, such as development processing (demosaic processing).
The display device 11202 displays an image based on the image signal processed by the CCU 11201, under the control of the CCU 11201.
The light source device 11203 is composed of a light source such as an LED (Light Emitting Diode), for example, and supplies the endoscope 11100 with irradiation light for imaging the surgical site or the like.
The input device 11204 is an input interface for the endoscopic surgery system 11000. Through the input device 11204, the user can input various information and instructions to the endoscopic surgery system 11000. For example, the user inputs an instruction to change the imaging conditions (type of irradiation light, magnification, focal length, etc.) of the endoscope 11100.
The treatment instrument control device 11205 controls the driving of the energy treatment instrument 11112 for tissue cauterization, incision, blood vessel sealing, or the like. The pneumoperitoneum device 11206 sends gas into the body cavity of the patient 11132 through the pneumoperitoneum tube 11111 to inflate the body cavity, for the purpose of securing the field of view of the endoscope 11100 and securing the operator's working space. The recorder 11207 is a device capable of recording various information relating to the surgery. The printer 11208 is a device capable of printing various information relating to the surgery in various formats such as text, images, or graphs.
The light source device 11203 that supplies the endoscope 11100 with irradiation light for imaging the surgical site can be composed of, for example, a white light source composed of an LED, a laser light source, or a combination thereof. When a white light source is composed of a combination of RGB laser light sources, the output intensity and output timing of each color (each wavelength) can be controlled with high precision, so the white balance of the captured image can be adjusted in the light source device 11203. In this case, it is also possible to capture images corresponding to R, G, and B in a time-division manner by irradiating the observation target with laser light from each of the RGB laser light sources in a time-division manner and controlling the driving of the imaging element of the camera head 11102 in synchronization with the irradiation timing. According to this method, a color image can be obtained without providing a color filter on the imaging element.
The driving of the light source device 11203 may also be controlled so as to change the intensity of the output light at predetermined time intervals. By controlling the driving of the imaging element of the camera head 11102 in synchronization with the timing of the change in light intensity to acquire images in a time-division manner and combining the images, a high-dynamic-range image free of so-called blocked-up shadows and blown-out highlights can be generated.
The light source device 11203 may also be configured to be able to supply light of a predetermined wavelength band corresponding to special light observation. In special light observation, for example, so-called narrow band imaging is performed, in which a predetermined tissue such as a blood vessel in the mucosal surface layer is imaged with high contrast by irradiating light of a narrower band than the irradiation light (that is, white light) used during normal observation, utilizing the wavelength dependence of light absorption in body tissue. Alternatively, in special light observation, fluorescence observation may be performed, in which an image is obtained from fluorescence generated by irradiating excitation light. In fluorescence observation, it is possible, for example, to irradiate body tissue with excitation light and observe the fluorescence from that body tissue (autofluorescence observation), or to locally inject a reagent such as indocyanine green (ICG) into body tissue and irradiate the body tissue with excitation light corresponding to the fluorescence wavelength of the reagent to obtain a fluorescence image. The light source device 11203 can be configured to be able to supply narrow band light and/or excitation light corresponding to such special light observation.
FIG. 17 is a block diagram showing an example of the functional configuration of the camera head 11102 and the CCU 11201 shown in FIG. 16.
The camera head 11102 has a lens unit 11401, an imaging unit 11402, a drive unit 11403, a communication unit 11404, and a camera head control unit 11405. The CCU 11201 has a communication unit 11411, an image processing unit 11412, and a control unit 11413. The camera head 11102 and the CCU 11201 are communicably connected to each other by a transmission cable 11400.
The lens unit 11401 is an optical system provided at the connection with the lens barrel 11101. Observation light taken in from the distal end of the lens barrel 11101 is guided to the camera head 11102 and enters the lens unit 11401. The lens unit 11401 is configured by combining a plurality of lenses including a zoom lens and a focus lens.
The imaging unit 11402 is composed of an imaging element. The imaging unit 11402 may be composed of one imaging element (a so-called single-plate type) or a plurality of imaging elements (a so-called multi-plate type). When the imaging unit 11402 is of the multi-plate type, for example, image signals corresponding to R, G, and B may be generated by the respective imaging elements and combined to obtain a color image. Alternatively, the imaging unit 11402 may be configured to have a pair of imaging elements for acquiring image signals for the right eye and the left eye corresponding to 3D (dimensional) display. By performing 3D display, the operator 11131 can grasp the depth of the living tissue at the surgical site more accurately. When the imaging unit 11402 is of the multi-plate type, a plurality of lens units 11401 may be provided corresponding to the respective imaging elements.
The imaging unit 11402 does not necessarily have to be provided in the camera head 11102. For example, the imaging unit 11402 may be provided inside the lens barrel 11101, immediately behind the objective lens.
The drive unit 11403 is composed of an actuator and, under the control of the camera head control unit 11405, moves the zoom lens and the focus lens of the lens unit 11401 by a predetermined distance along the optical axis. The magnification and focus of the image captured by the imaging unit 11402 can thereby be adjusted as appropriate.
The communication unit 11404 is composed of a communication device for transmitting and receiving various information to and from the CCU 11201. The communication unit 11404 transmits the image signal obtained from the imaging unit 11402 as RAW data to the CCU 11201 via the transmission cable 11400.
The communication unit 11404 also receives from the CCU 11201 a control signal for controlling the driving of the camera head 11102, and supplies it to the camera head control unit 11405. The control signal includes information about the imaging conditions, such as information specifying the frame rate of the captured image, information specifying the exposure value at the time of imaging, and/or information specifying the magnification and focus of the captured image.
The imaging conditions described above, such as the frame rate, exposure value, magnification, and focus, may be specified by the user as appropriate, or may be set automatically by the control unit 11413 of the CCU 11201 based on the acquired image signal. In the latter case, a so-called AE (Auto Exposure) function, AF (Auto Focus) function, and AWB (Auto White Balance) function are mounted on the endoscope 11100.
The camera head control unit 11405 controls the driving of the camera head 11102 based on the control signal from the CCU 11201 received via the communication unit 11404.
 通信部11411は、カメラヘッド11102との間で各種の情報を送受信するための通信装置によって構成される。通信部11411は、カメラヘッド11102から、伝送ケーブル11400を介して送信される画像信号を受信する。 The communication unit 11411 is configured by a communication device for transmitting and receiving various types of information to and from the camera head 11102. The communication unit 11411 receives an image signal transmitted from the camera head 11102 via the transmission cable 11400.
The communication unit 11411 also transmits, to the camera head 11102, a control signal for controlling the driving of the camera head 11102. The image signal and the control signal can be transmitted by electrical communication, optical communication, or the like.
The image processing unit 11412 performs various kinds of image processing on the image signal, which is RAW data transmitted from the camera head 11102.
The control unit 11413 performs various kinds of control related to the imaging of the surgical site and the like by the endoscope 11100 and to the display of the captured image obtained by that imaging. For example, the control unit 11413 generates a control signal for controlling the driving of the camera head 11102.
The control unit 11413 also causes the display device 11202 to display a captured image showing the surgical site and the like, based on the image signal that has undergone image processing by the image processing unit 11412. In doing so, the control unit 11413 may recognize various objects in the captured image using various image recognition techniques. For example, by detecting the edge shapes, colors, and the like of objects included in the captured image, the control unit 11413 can recognize surgical tools such as forceps, specific body parts, bleeding, mist during use of the energy treatment tool 11112, and so on. When causing the display device 11202 to display the captured image, the control unit 11413 may use the recognition results to superimpose various kinds of surgery support information on the image of the surgical site. Superimposing the surgery support information and presenting it to the operator 11131 reduces the burden on the operator 11131 and allows the operator 11131 to proceed with the surgery reliably.
The transmission cable 11400 connecting the camera head 11102 and the CCU 11201 is an electrical signal cable supporting electrical signal communication, an optical fiber supporting optical communication, or a composite cable thereof.
Here, in the illustrated example, communication is performed by wire using the transmission cable 11400, but the communication between the camera head 11102 and the CCU 11201 may be performed wirelessly.
An example of an endoscopic surgery system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to, for example, the imaging unit 11402 of the camera head 11102. Specifically, the imaging unit 10 according to the present embodiment can be applied to the imaging unit 11402 of the camera head 11102. Applying the imaging unit 10 to the imaging unit 11402 of the camera head 11102 makes it possible to generate high-quality captured images in which aberrations are appropriately corrected.
Note that although an endoscopic surgery system has been described here as an example, the technology according to the present disclosure may also be applied to, for example, a microscopic surgery system.
<9. Application example for in-vivo information acquisition system>
The technology according to the present disclosure (the present technology) can be applied to various products. For example, the technology according to the present disclosure may be applied to an in-vivo information acquisition system.
FIG. 18 is a block diagram illustrating an example of the schematic configuration of an in-vivo information acquisition system for a patient using a capsule endoscope, to which the technology according to the present disclosure (the present technology) can be applied.
The in-vivo information acquisition system 10001 includes a capsule endoscope 10100 and an external control device 10200.
The capsule endoscope 10100 is swallowed by the patient at the time of examination. The capsule endoscope 10100 has an imaging function and a wireless communication function; while moving through the interior of organs such as the stomach and intestines by peristaltic motion and the like until it is naturally discharged from the patient, it sequentially captures images of the interior of those organs (hereinafter also referred to as in-vivo images) at predetermined intervals, and sequentially transmits information on the in-vivo images wirelessly to the external control device 10200 outside the body.
The external control device 10200 comprehensively controls the operation of the in-vivo information acquisition system 10001. The external control device 10200 also receives the information on the in-vivo images transmitted from the capsule endoscope 10100 and, based on the received information, generates image data for displaying the in-vivo images on a display device (not shown).
In this way, in the in-vivo information acquisition system 10001, in-vivo images capturing the state of the inside of the patient's body can be obtained at any time from when the capsule endoscope 10100 is swallowed until it is discharged.
The configurations and functions of the capsule endoscope 10100 and the external control device 10200 will now be described in more detail.
The capsule endoscope 10100 has a capsule-shaped casing 10101, in which a light source unit 10111, an imaging unit 10112, an image processing unit 10113, a wireless communication unit 10114, a power feeding unit 10115, a power supply unit 10116, and a control unit 10117 are housed.
The light source unit 10111 includes a light source such as an LED (Light Emitting Diode), for example, and irradiates the imaging field of view of the imaging unit 10112 with light.
The imaging unit 10112 includes an image sensor and an optical system including a plurality of lenses provided in front of the image sensor. Reflected light (hereinafter referred to as observation light) of the light irradiated onto the body tissue to be observed is collected by the optical system and enters the image sensor. In the imaging unit 10112, the observation light incident on the image sensor is photoelectrically converted, and an image signal corresponding to the observation light is generated. The image signal generated by the imaging unit 10112 is provided to the image processing unit 10113.
The image processing unit 10113 includes a processor such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), and performs various kinds of signal processing on the image signal generated by the imaging unit 10112. The image processing unit 10113 provides the signal-processed image signal to the wireless communication unit 10114 as RAW data.
The wireless communication unit 10114 performs predetermined processing such as modulation on the image signal that has undergone signal processing by the image processing unit 10113, and transmits the image signal to the external control device 10200 via the antenna 10114A. The wireless communication unit 10114 also receives, from the external control device 10200 via the antenna 10114A, a control signal related to the drive control of the capsule endoscope 10100. The wireless communication unit 10114 provides the control signal received from the external control device 10200 to the control unit 10117.
The power feeding unit 10115 includes an antenna coil for power reception, a power regeneration circuit that regenerates electric power from the current generated in the antenna coil, a booster circuit, and the like. In the power feeding unit 10115, electric power is generated using the principle of so-called contactless charging.
The power supply unit 10116 includes a secondary battery and stores the electric power generated by the power feeding unit 10115. In FIG. 18, to avoid cluttering the drawing, arrows and the like indicating the destinations of the power supplied from the power supply unit 10116 are omitted; the power stored in the power supply unit 10116 is supplied to the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the control unit 10117, and can be used to drive them.
The control unit 10117 includes a processor such as a CPU, and appropriately controls the driving of the light source unit 10111, the imaging unit 10112, the image processing unit 10113, the wireless communication unit 10114, and the power feeding unit 10115 in accordance with control signals transmitted from the external control device 10200.
The external control device 10200 includes a processor such as a CPU or GPU, or a microcomputer, control board, or the like on which a processor and storage elements such as memory are mounted together. The external control device 10200 controls the operation of the capsule endoscope 10100 by transmitting a control signal to the control unit 10117 of the capsule endoscope 10100 via the antenna 10200A. In the capsule endoscope 10100, for example, the light irradiation conditions of the light source unit 10111 for the observation target can be changed by a control signal from the external control device 10200. Imaging conditions (for example, the frame rate and exposure value in the imaging unit 10112) can also be changed by a control signal from the external control device 10200. Further, the contents of the processing in the image processing unit 10113 and the conditions under which the wireless communication unit 10114 transmits the image signal (for example, the transmission interval and the number of transmitted images) may be changed by a control signal from the external control device 10200.
The external control device 10200 also performs various kinds of image processing on the image signal transmitted from the capsule endoscope 10100, and generates image data for displaying the captured in-vivo images on the display device. As the image processing, various kinds of signal processing can be performed, such as development processing (demosaic processing), image quality enhancement processing (band enhancement processing, super-resolution processing, NR (noise reduction) processing, and/or camera shake correction processing, etc.), and/or enlargement processing (electronic zoom processing). The external control device 10200 controls the driving of the display device to display the captured in-vivo images based on the generated image data. Alternatively, the external control device 10200 may cause a recording device (not shown) to record the generated image data, or cause a printing device (not shown) to print it out.
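As one concrete illustration of the enlargement processing (electronic zoom) mentioned above, the following minimal sketch crops the center of a frame by the zoom factor and resamples it back to the original grid. The nearest-neighbor resampling keeps the sketch dependency-free and is an assumption of this illustration; a real pipeline would interpolate. A zoom factor of at least 1 is assumed.

import numpy as np

def electronic_zoom(img: np.ndarray, zoom: float) -> np.ndarray:
    """Center-crop by the zoom factor, then resample back to the original size."""
    h, w = img.shape[:2]
    ch, cw = max(1, int(h / zoom)), max(1, int(w / zoom))
    y0, x0 = (h - ch) // 2, (w - cw) // 2
    crop = img[y0:y0 + ch, x0:x0 + cw]
    yi = (np.arange(h) * ch / h).astype(int)   # map output rows to crop rows
    xi = (np.arange(w) * cw / w).astype(int)   # map output cols to crop cols
    return crop[yi][:, xi]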
An example of the in-vivo information acquisition system to which the technology according to the present disclosure can be applied has been described above. Among the configurations described above, the technology according to the present disclosure can be applied to, for example, the imaging unit 10112. Specifically, the imaging unit 10 according to the present embodiment can be applied to the imaging unit 10112. Applying the imaging unit 10 to the imaging unit 10112 makes it possible to generate high-quality captured images in which aberrations are appropriately corrected.
<10. Remarks>
In the first to fourth embodiments described above, the refraction of light at the cover glass 11 was used as the means of correcting aberrations. However, the present disclosure is not limited to this, and aberrations may instead be corrected by using a cover glass 11 whose surface has a shape similar to that of a diffractive lens. In that case, as when a cover glass 11 having an aspherical surface is used (the third embodiment), the optical path difference from the ideal lens is obtained by defining new parameters that represent the surface shape of the cover glass 11. The shape of the cover glass 11 is then determined so that this optical path difference is equal to or less than the focal depth Df. Note that the imaging lens 13 may also be a diffractive lens.
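As a numerical illustration of this design procedure, the following sketch checks whether a candidate surface keeps the optical path difference L(θ) within the focal depth at every sampled angle of view. The focal-depth expression Df = λ/(2·NA²) is the common textbook estimate, and the quadratic L(θ) is a hypothetical stand-in for a real surface model; neither is taken from the expressions of this disclosure, which appear only as equation images in this text.

import numpy as np

wavelength_mm = 550e-6   # λ = 550 nm, expressed in mm (assumed)
NA = 0.15                # numerical aperture of the imaging optical system (assumed)
theta_max_deg = 35.0     # maximum angle of view (assumed)

Df = wavelength_mm / (2.0 * NA**2)   # textbook focal depth [mm]

def L(theta_deg: np.ndarray) -> np.ndarray:
    """Hypothetical optical path difference vs. the ideal lens [mm]."""
    return 5e-6 * np.deg2rad(theta_deg) ** 2

thetas = np.linspace(0.0, theta_max_deg, 200)
ok = bool(np.all(np.abs(L(thetas)) <= Df))
print(f"Df = {Df:.4e} mm, criterion satisfied at all angles: {ok}")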
Further, in the first to fourth embodiments, the cover glass 11 and the image sensor 12 have the WL-CSP structure, and the cover glass 11 was formed by grinding a light-transmissive substrate (the substrate from which the cover glass 11 originates) fixed onto the image sensor 12. However, the present disclosure is not limited to this; for example, the cover glass 11 may be formed by laminating an optical resin onto the image sensor 12.
<11. Conclusion>
As described above, the imaging unit 10 according to the present disclosure uses a cover glass 11 shaped so that the optical path difference from the ideal lens for each angle of view θ is equal to or less than the focal depth Df. The imaging unit 10 according to the present disclosure can thereby correct aberrations and improve the quality of captured images.
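Configuration (5) and claim 5 below correct L(θ) using an astigmatism amount AS(θ) determined by the thickness t and refractive index n of the lens optical system; their exact expression appears only as an equation image in this text. The following sketch therefore uses the standard textbook astigmatism of a tilted plane-parallel plate, AS(θ) = t·(n² − 1)·sin²θ/(n² − sin²θ)^(3/2), as a plausible stand-in, not as the disclosure's expression (30); the thickness and index values are illustrative.

import numpy as np

def astigmatism_mm(theta_deg: float, t_mm: float = 0.4, n: float = 1.5) -> float:
    """Textbook astigmatism of a plane-parallel plate of thickness t_mm and index n
    for a beam incident at theta_deg (difference of tangential and sagittal foci)."""
    s2 = np.sin(np.deg2rad(theta_deg)) ** 2
    return float(t_mm * (n**2 - 1.0) * s2 / (n**2 - s2) ** 1.5)

for theta in (0.0, 10.0, 20.0, 30.0):
    print(f"θ = {theta:4.1f} deg -> AS ≈ {astigmatism_mm(theta):.5f} mm")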
The preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, but the technical scope of the present disclosure is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical ideas described in the claims, and it should be understood that these also naturally belong to the technical scope of the present disclosure.
The effects described in this specification are merely explanatory or illustrative, and are not limiting. That is, the technology according to the present disclosure may exhibit other effects that are apparent to those skilled in the art from the description of this specification, in addition to or instead of the above effects.
The following configurations also belong to the technical scope of the present disclosure.
(1)
An imaging unit including:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
in which, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, and L(θ) is the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (26).
Figure JPOXMLDOC01-appb-M000039
(2)
The imaging unit according to (1), in which, where θmax is the maximum value of the angle of view, the surface shape of the lens optical system located on the subject side satisfies the following expression (27).
Figure JPOXMLDOC01-appb-M000040
(3)
The imaging unit according to (1) or (2), in which, where f(θ) is the optical path difference of the imaging optical system from the ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (28).
Figure JPOXMLDOC01-appb-M000041
(4)
The imaging unit according to (3), in which, where θmax [deg] is the maximum value of the angle of view, the surface shape of the lens optical system located on the subject side satisfies the following expression (29).
Figure JPOXMLDOC01-appb-M000042
(5)
The imaging unit according to any one of (1) to (4), in which, where t [mm] is the thickness of the lens optical system and n is its refractive index, the optical path difference L(θ) is corrected based on the astigmatism amount AS(θ) expressed by the following expression (30).
Figure JPOXMLDOC01-appb-M000043
(6)
The imaging unit according to (2) or (4), in which at least a part of the portion of the subject-side surface of the lens optical system on which the incident light is incident has a convex structure with a radius of curvature R.
(7)
The imaging unit according to (6), in which the radius of curvature R satisfies the following expression (31), where t [mm] is the thickness of the lens optical system.
Figure JPOXMLDOC01-appb-M000044
(8)
The imaging unit according to any one of (1) to (5), in which the subject-side surface of the lens optical system has an aspherical structure.
(9)
The imaging unit according to any one of (1) to (5), in which the subject-side surface of the lens optical system has a diffractive structure.
(10)
The imaging unit according to any one of (1) to (9), in which the lens optical system is formed of an optical resin.
(11)
An imaging unit including:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
in which, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, θmax is the maximum value of the angle of view, and L(θ) is the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (32).
Figure JPOXMLDOC01-appb-M000045
(12)
An imaging unit including:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
in which, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, L(θ) is the optical path difference from an ideal lens for each angle of view θ, and f(θ) is the optical path difference of the imaging optical system from the ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (33).
Figure JPOXMLDOC01-appb-M000046
(13)
An imaging unit including:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
in which, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, θmax is the maximum value of the angle of view, L(θ) is the optical path difference from an ideal lens for each angle of view θ, and f(θ) is the optical path difference of the imaging optical system from the ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (34).
Figure JPOXMLDOC01-appb-M000047
(14)
An electronic device including an imaging unit, the imaging unit including:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
in which, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, and L(θ) is the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (35).
Figure JPOXMLDOC01-appb-M000048
(15)
An electronic device including an imaging unit, the imaging unit including:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
in which, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, θmax is the maximum value of the angle of view, and L(θ) is the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (36).
Figure JPOXMLDOC01-appb-M000049
(16)
An electronic device including an imaging unit, the imaging unit including:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
in which, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, L(θ) is the optical path difference from an ideal lens for each angle of view θ, and f(θ) is the optical path difference of the imaging optical system from the ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (37).
Figure JPOXMLDOC01-appb-M000050
(17)
An electronic device including an imaging unit, the imaging unit including:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
in which, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, θmax is the maximum value of the angle of view, L(θ) is the optical path difference from an ideal lens for each angle of view θ, and f(θ) is the optical path difference of the imaging optical system from the ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (38).
Figure JPOXMLDOC01-appb-M000051
DESCRIPTION OF SYMBOLS
100 Electronic device
10 Imaging unit
11 Cover glass
12 Image sensor
13 Imaging lens
20 Input information acquisition unit
30 Display control unit
40 Storage unit
50 Control unit

Claims (17)

1. An imaging unit comprising:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
wherein, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, and L(θ) is the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (1).
    Figure JPOXMLDOC01-appb-M000001
2. The imaging unit according to claim 1, wherein, where θmax is the maximum value of the angle of view, the surface shape of the lens optical system located on the subject side satisfies the following expression (2).
    Figure JPOXMLDOC01-appb-M000002
3. The imaging unit according to claim 1, wherein, where f(θ) is the optical path difference of the imaging optical system from the ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (3).
    Figure JPOXMLDOC01-appb-M000003
4. The imaging unit according to claim 3, wherein, where θmax [deg] is the maximum value of the angle of view, the surface shape of the lens optical system located on the subject side satisfies the following expression (4).
5. The imaging unit according to claim 1, wherein, where t [mm] is the thickness of the lens optical system and n is its refractive index, the optical path difference L(θ) is corrected based on the astigmatism amount AS(θ) expressed by the following expression (5).
    Figure JPOXMLDOC01-appb-M000005
6. The imaging unit according to claim 2, wherein at least a part of the portion of the subject-side surface of the lens optical system on which the incident light is incident has a convex structure with a radius of curvature R.
7. The imaging unit according to claim 6, wherein the radius of curvature R satisfies the following expression (6), where t [mm] is the thickness of the lens optical system.
    Figure JPOXMLDOC01-appb-M000006
8. The imaging unit according to claim 1, wherein the subject-side surface of the lens optical system has an aspherical structure.
9. The imaging unit according to claim 1, wherein the subject-side surface of the lens optical system has a diffractive structure.
10. The imaging unit according to claim 1, wherein the lens optical system is formed of an optical resin.
11. An imaging unit comprising:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
wherein, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, θmax is the maximum value of the angle of view, and L(θ) is the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (7).
    Figure JPOXMLDOC01-appb-M000007
12. An imaging unit comprising:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
wherein, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, L(θ) is the optical path difference from an ideal lens for each angle of view θ, and f(θ) is the optical path difference of the imaging optical system from the ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (8).
    Figure JPOXMLDOC01-appb-M000008
13. An imaging unit comprising:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
wherein, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, θmax is the maximum value of the angle of view, L(θ) is the optical path difference from an ideal lens for each angle of view θ, and f(θ) is the optical path difference of the imaging optical system from the ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (9).
    Figure JPOXMLDOC01-appb-M000009
14. An electronic device comprising an imaging unit, the imaging unit comprising:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
wherein, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, and L(θ) is the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (10).
    Figure JPOXMLDOC01-appb-M000010
15. An electronic device comprising an imaging unit, the imaging unit comprising:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
wherein, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, θmax is the maximum value of the angle of view, and L(θ) is the optical path difference from an ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (11).
    Figure JPOXMLDOC01-appb-M000011
16. An electronic device comprising an imaging unit, the imaging unit comprising:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
wherein, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, L(θ) is the optical path difference from an ideal lens for each angle of view θ, and f(θ) is the optical path difference of the imaging optical system from the ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (12).
    Figure JPOXMLDOC01-appb-M000012
17. An electronic device comprising an imaging unit, the imaging unit comprising:
an image sensor on which an image of a subject is formed;
a lens optical system provided closer to the subject than the image sensor and fixed to the image sensor; and
an imaging optical system with a numerical aperture NA provided between the subject and the lens optical system,
wherein, where λ is the wavelength of incident light entering the image sensor, θ [deg] is the angle of view, θmax is the maximum value of the angle of view, L(θ) is the optical path difference from an ideal lens for each angle of view θ, and f(θ) is the optical path difference of the imaging optical system from the ideal lens for each angle of view θ, the surface shape of the lens optical system located on the subject side satisfies the following expression (13).
    Figure JPOXMLDOC01-appb-M000013
PCT/JP2017/039197 2017-01-12 2017-10-30 Imaging unit and electronic device WO2018131264A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-003413 2017-01-12
JP2017003413A JP2018112677A (en) 2017-01-12 2017-01-12 Imaging unit and electronic device

Publications (1)

Publication Number Publication Date
WO2018131264A1 true WO2018131264A1 (en) 2018-07-19

Family

ID=62839337

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/039197 WO2018131264A1 (en) 2017-01-12 2017-10-30 Imaging unit and electronic device

Country Status (2)

Country Link
JP (1) JP2018112677A (en)
WO (1) WO2018131264A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7084218B2 (en) 2018-06-13 2022-06-14 三菱重工業株式会社 Information relay device, remote service system, information relay method and program

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03200911A (en) * 1989-10-13 1991-09-02 Olympus Optical Co Ltd Objective lens for endoscope
JPH04218011A (en) * 1990-10-09 1992-08-07 Olympus Optical Co Ltd Objective optical system of endoscope
JP2000089102A (en) * 1999-10-15 2000-03-31 Olympus Optical Co Ltd Objective lens
WO2004102247A1 (en) * 2003-05-15 2004-11-25 Olympus Corporation Object lens and endoscope using it
JP2008216807A (en) * 2007-03-06 2008-09-18 Sharp Corp Imaging lens, imaging unit and portable information terminal with the same
JP2009008956A (en) * 2007-06-28 2009-01-15 Sharp Corp Imaging lens, imaging unit, and personal digital assistance incorporating the imaging unit
WO2011077972A1 (en) * 2009-12-24 2011-06-30 オリンパスメディカルシステムズ株式会社 Objective lens for endoscope, and endoscope using same
WO2011125539A1 (en) * 2010-04-07 2011-10-13 オリンパスメディカルシステムズ株式会社 Objective lens and endoscope using same
WO2014006971A1 (en) * 2012-07-03 2014-01-09 オリンパスメディカルシステムズ株式会社 Objective optical system for endoscope
WO2015025802A1 (en) * 2013-08-22 2015-02-26 オリンパスメディカルシステムズ株式会社 Enlarging endoscope optical system
WO2016031586A1 (en) * 2014-08-28 2016-03-03 オリンパス株式会社 Endoscope objective optical system

Also Published As

Publication number Publication date
JP2018112677A (en) 2018-07-19

Similar Documents

Publication Publication Date Title
CN111492288B (en) Imaging lens and imaging apparatus
TWI785049B (en) Imaging device, solid state image sensor, and electronic device
WO2017163927A1 (en) Chip size package, production method, electronic apparatus, and endoscope
US10915009B2 (en) Compound-eye camera module and electronic device
WO2020202965A1 (en) Imaging lens and imaging device
JP2018200423A (en) Imaging device and electronic apparatus
US20210382280A1 (en) Imaging lens and imaging apparatus
US10868052B2 (en) Imaging element, manufacturing method, and electronic apparatus
US11750932B2 (en) Image processing apparatus, image processing method, and electronic apparatus
US20210191019A1 (en) Resonator structure, imaging element, and electronic apparatus
JP2018147974A (en) Solid state imaging device, electronic apparatus, and semiconductor device
WO2017169889A1 (en) Camera module, camera module production method, imaging device, and electronic device
US20220293656A1 (en) Solid-state imaging device and electronic apparatus
US11553118B2 (en) Imaging apparatus, manufacturing method therefor, and electronic apparatus
WO2021117497A1 (en) Imaging lens and imaging device
US20230013088A1 (en) Imaging device and method of manufacturing imaging device
WO2018180570A1 (en) Solid-state image pickup element, electronic apparatus, and semiconductor device
JP2019160866A (en) Imaging apparatus
WO2018131264A1 (en) Imaging unit and electronic device
WO2022059463A1 (en) Wide-angle lens and imaging device
CN113692367B (en) Optical system and imaging device
WO2020084973A1 (en) Image processing device
WO2021085154A1 (en) Imaging lens and imaging device
WO2021200257A1 (en) Zoom lens and image pick-up device
JP2019040892A (en) Imaging device, camera module, and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17891050

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17891050

Country of ref document: EP

Kind code of ref document: A1