WO2020218180A1 - Image processing device and method, and electronic apparatus - Google Patents

Image processing device and method, and electronic apparatus Download PDF

Info

Publication number
WO2020218180A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
reflected light
light image
unit
contrast
Prior art date
Application number
PCT/JP2020/016845
Other languages
French (fr)
Japanese (ja)
Inventor
Teppei Kurita
Shinichiro Gomi
Original Assignee
Sony Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2020218180A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17: Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/21: Polarisation-affecting properties
    • G01N21/47: Scattering, i.e. diffuse reflection
    • G01N21/49: Scattering, i.e. diffuse reflection within a body or fluid
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00: Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28: Systems for automatic generation of focusing signals
    • G02B7/36: Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B11/00: Filters or other obturators specially adapted for photographic purposes
    • G03B13/00: Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32: Means for focusing
    • G03B13/34: Power focusing
    • G03B13/36: Autofocus systems
    • G03B15/00: Special procedures for taking photographs; Apparatus therefor
    • G03B17/00: Details of cameras or camera bodies; Accessories therefor
    • G03B17/02: Bodies
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules

Definitions

  • This disclosure relates to an image processing device and method, and an electronic apparatus.
  • The contrast autofocus method is known as one of the conventional autofocus methods.
  • In the contrast autofocus method, focusing is achieved by measuring the contrast of the image captured on the sensor plane through the lens. More specifically, the focus lens is moved in the direction in which the contrast increases, and focusing control is terminated when the amount of contrast fluctuation falls to or below a predetermined threshold value.
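The hill-climbing behavior described in this bullet can be sketched as follows. This is a minimal illustration, not the embodiment's implementation: `measure_contrast` and `move_lens` are hypothetical stand-ins for the camera's contrast detector and lens drive, and halving the step on each reversal is an added convergence aid not stated in the text.

```python
def contrast_af(measure_contrast, move_lens, threshold=0.01, max_steps=100):
    """Generic contrast (hill-climbing) autofocus sketch.

    Moves the focus lens in the direction of increasing contrast and
    terminates when the contrast fluctuation falls to or below `threshold`.
    """
    direction, step = +1, 1.0
    prev = measure_contrast()
    for _ in range(max_steps):
        move_lens(direction * step)
        cur = measure_contrast()
        if abs(cur - prev) <= threshold:   # fluctuation small enough: in focus
            return True
        if cur < prev:                     # contrast fell: overshot the peak
            direction, step = -direction, step / 2
        prev = cur
    return False
```

With a unimodal contrast curve the lens settles near the contrast peak, which is exactly the termination condition the bullet describes.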
  • When the object is irradiated, the reflected light is a mixture of surface reflected light, which is reflected at the surface, and internally reflected light, which is reflected from the inside, and the appropriate focus position differs for each.
  • When the object to be imaged is skin, the surface of the skin transmits visible light, so the image of the skin surface and the image of the skin interior (subcutaneous tissue) overlap each other. As a result, because each image is affected by the other, focusing control could not be performed correctly on either the skin surface or the skin interior.
  • Moreover, the contrast of the skin surface is very high compared to the contrast inside the skin, so the contrast inside the skin cannot be measured normally, making contrast autofocus control difficult.
  • The present disclosure has been made in view of this situation, and aims to provide an image processing device, a method, and an electronic apparatus capable of correctly focusing on either the surface or the interior of an object even when, as with the skin surface and the skin interior, the object's surface is translucent so that the surface and the interior are imaged simultaneously.
  • The image processing apparatus of the present disclosure includes: an image generation unit that separates and generates a surface reflected light image and an internally reflected light image from a captured image, based on an imaging signal corresponding to the captured image of an object irradiated with linearly polarized light; a detection unit that detects a contrast evaluation value for each of the surface reflected light image and the internally reflected light image; and a determination unit that determines, based on the contrast evaluation value, whether the focus position has been reached for the reflected light image to be focused.
  • FIG. 1 is an explanatory diagram of the principle of the image processing apparatus of the embodiment.
  • The image processing device 100 has a focusing lens 101 that can be driven along the optical axis, and images an object OBJ irradiated with linearly polarized light having a predetermined vibration plane, outputting a captured RAW image GRAW.
  • The image processing device 100 also includes a detection unit 104 that detects a contrast evaluation value EVC for each of the surface reflected light image GS and the internally reflected light image GI, a contrast AF control unit 105 that functions as a determination unit for determining, based on the contrast evaluation value EVC, whether the focus position has been reached for each of the surface reflected light image GS and the internally reflected light image GI and that controls the drive of the focusing lens 101, and a drive unit 106 to which a drive control signal CDR is input from the contrast AF control unit 105 and which drives the focusing lens 101 under its control.
  • The contrast evaluation value EVC is a value corresponding to the contrast (brightness difference) of the captured image; the larger the contrast difference, that is, the larger the contrast evaluation value EVC, the closer the image is to being in focus.
  • Alternatively, whether the focus position has been reached may be determined for only one of the reflected light images (for example, the surface reflected light image GS), based on the contrast evaluation value EVC corresponding to that image.
  • FIG. 2 is an explanatory diagram of the detection deviation of the in-focus position.
  • When the polarized RAW image shown in FIG. 2(a) is used as-is, as in the conventional case, that is, when the surface reflected light image GS and the internally reflected light image GI are used without separation, the focus position FCSp of the surface image shown in FIG. 2(b) is detected shifted from the optimum focus position FCS toward the focus position of the internally reflected light image GI, under the influence of the internally reflected light image GI shown in FIGS. 2(c) and 2(d).
  • Likewise, the focus position FCIp of the internal image is detected shifted from the optimum focus position FCI toward the focus position of the surface reflected light image GS, under the influence of the surface reflected light image GS shown in FIGS. 2(e) and 2(f). As a result, the correct focus position cannot be obtained for either the surface image or the internal image.
  • By separating the surface reflected light image GS corresponding to the surface reflected light LRS from the internally reflected light image GI corresponding to the internally reflected light LRI, the focus position can be calculated for each image, and a more suitable focus position can be obtained.
  • The surface reflected light LRS reflected at the surface of the skin consists of specularly reflected light and surface scattered light, both of which remain polarized.
  • The internally reflected light LRI, which enters the skin and is reflected inside, becomes internally scattered light through scattering inside the skin. This internally scattered light is unpolarized.
  • Polarization components in four directions are detected using a four-polarization sensor capable of detecting each of them, and the irradiated linearly polarized light has a vibration plane of 90°.
  • In this case, the surface reflected light contains mostly the 90° polarization component (mainly the specular reflection component),
  • whereas in the internally reflected light the 0°, 45°, 90°, and 135° polarization components appear almost evenly.
  • Therefore, by subtracting the 0° polarization component, whose vibration plane is orthogonal to the 90° polarization component, from the 90° polarization component, mainly the surface reflected light component remains.
  • Accordingly, the surface reflected light image (specular reflection image + surface scattering image) GS is generated by forming an image from the pixel data corresponding to the 90° polarization component, and the internally reflected light image (internal scattering image) GI is generated by forming an image from the pixel data corresponding to the 0° polarization component. The detection unit 104 then outputs a contrast evaluation value EVC for each of the surface reflected light image GS and the internally reflected light image GI.
  • The contrast AF control unit 105 determines whether the focus position has been reached based on the contrast evaluation value EVC, treating, for each of the surface reflected light image GS and the internally reflected light image GI, the image with the higher contrast as the in-focus image.
  • That is, for each of the surface reflected light image GS and the internally reflected light image GI, the contrast AF control unit 105 compares the images obtained at a plurality of different focus positions by driving the focusing lens 101 via the drive unit 106, and takes the image with the higher contrast as the in-focus image. The optimum focus position can therefore be detected for each of the surface reflected light image GS and the internally reflected light image GI, and an optimally focused image can be obtained for each.
  • Without separation, the contrast of the surface reflected light image GS is much higher than that of the internally reflected light image GI, so the contrast of the internal image cannot be measured correctly and the accuracy of the contrast autofocus inevitably drops.
  • With the separation described above, the contrast can be detected correctly even in the internally reflected light image GI, and an optimally focused image can be obtained.
  • FIG. 3 is a schematic block diagram of the image processing apparatus of the first embodiment.
  • The image processing device 10 roughly comprises an image capturing unit 11, a reflection component calculation unit 12, a focus target selection unit 13, a contrast detection unit 14, a contrast storage unit 15, a contrast autofocus (AF) control unit 16, and a focus lens driving unit 17.
  • The image capturing unit 11 has a focusing lens 21 and a four-polarization sensor unit 22, images the object OBJ (human skin in this embodiment), and outputs a polarized RAW image (data) GRAW to the reflection component calculation unit 12.
  • The reflection component calculation unit 12 separates and generates a surface reflected light image (data) GS and an internally reflected light image (data) GI based on the input polarized RAW image (data) GRAW, and outputs them to the focus target selection unit 13.
  • The surface reflected light image GS is a composite of the specular reflection image produced by specular reflection of the irradiated linearly polarized light (for example, linearly polarized light with a 90° vibration plane) at the surface of the object OBJ, and the surface scattering image produced by surface scattering at the object.
  • The internally reflected light image GI is an internal scattering image formed by the linearly polarized light that entered the object OBJ and became almost unpolarized through internal scattering.
  • The focus target selection unit 13 outputs either the surface reflected light image GS or the internally reflected light image GI to the contrast detection unit 14, according to which one the user of the image processing apparatus has selected, that is, which is the reflected light image to be focused.
  • Based on whichever of the surface reflected light image GS and the internally reflected light image GI is output by the focus target selection unit 13, and on the user's area designation information, the contrast detection unit 14 outputs the contrast evaluation value EVC of the current frame image (either the surface reflected light image GS or the internally reflected light image GI) to the contrast AF control unit 16.
  • the contrast storage unit 15 stores a predetermined number of contrast evaluation values EVC output by the contrast detection unit 14 in chronological order.
  • The contrast autofocus (AF) control unit 16 compares the difference between the contrast evaluation value EVC of the current frame image and the contrast evaluation value EVC of the previous frame image with a predetermined threshold value (contrast difference threshold).
  • When the difference is at or below the threshold, the control unit regards a sufficiently good focus position as having been reached and treats the current frame image, which is a polarized RAW image, as the focused image.
  • When the difference between the contrast evaluation value EVC of the current frame image and that of the previous frame still exceeds the predetermined threshold value, the focus lens driving unit 17 drives the focusing lens 21 of the image capturing unit 11 to change the focus position and image again.
  • The image capturing unit 11 includes the four-polarization sensor unit 22, as described above.
  • In the four-polarization sensor unit 22, a first sensor that receives polarized light with a 0° vibration plane, a second sensor that receives polarized light with a 45° vibration plane, a third sensor that receives polarized light with a 90° vibration plane, and a fourth sensor that receives polarized light with a 135° vibration plane are arranged two-dimensionally in association with each of the R, G, and B color components.
  • That is, the four-polarization sensor unit 22 adopts a configuration in which polarizing filters of a plurality of (four in the figure) polarization directions are arranged per pixel on an image sensor provided with a color mosaic filter on its imaging surface.
  • FIG. 4 is an explanatory diagram of a configuration example of one pixel constituting the 4-polarization sensor unit.
  • For one pixel of the image, a color mosaic filter with an R:G:B area ratio of 1:2:1 is used.
  • To the R filter FR correspond four light receiving elements: DR1, with a polarizing filter having a 0° vibration plane arranged on its incident surface; DR2, with a 45° polarizing filter; DR3, with a 90° polarizing filter; and DR4, with a 135° polarizing filter.
  • To the G filter FG1 correspond the light receiving elements DG11 (0° polarizing filter on the incident surface), DG12 (45°), DG13 (90°), and DG14 (135°).
  • To the G filter FG2 correspond the light receiving elements DG21 (0° polarizing filter on the incident surface), DG22 (45°), DG23 (90°), and DG24 (135°).
  • To the B filter FB correspond the light receiving elements DB1 (0° polarizing filter on the incident surface), DB2 (45°), DB3 (90°), and DB4 (135°).
  • It suffices to provide at least two types of light receiving elements corresponding to two polarizing filters whose vibration planes are orthogonal to each other; a light receiving element without a polarizing filter may also be provided.
  • In the above, a plurality of light receiving elements correspond to one filter, but it is also possible to assign a filter to each light receiving element.
  • FIG. 5 is an explanatory diagram of the reflection component calculation unit of the first embodiment.
  • In FIG. 5, for ease of understanding, only the configuration of one pixel of the captured RAW image GRAW, and hence one pixel of the surface reflected light image GS or the internally reflected light image GI, is shown.
  • The reflection component calculation unit 12 includes a polarization component interpolation unit 31 that interpolates the polarization components based on the captured RAW image GRAW to generate the interpolated four-polarization image GFC described later, and a reflection component separation unit 32 that separates the reflection components from the interpolated four-polarization image GFC to generate the surface reflected light image GS and the internally reflected light image GI.
  • the operation of the polarization component interpolation unit 31 will be described.
  • The R component includes a first R polarization component R1 with a 0° vibration plane, a second R polarization component R2 with a 45° vibration plane, a third R polarization component R3 with a 90° vibration plane, and a fourth R polarization component R4 with a 135° vibration plane.
  • The G components include first G polarization components G11 and G21 with a 0° vibration plane, second G polarization components G12 and G22 with a 45° vibration plane, third G polarization components G13 and G23 with a 90° vibration plane, and fourth G polarization components G14 and G24 with a 135° vibration plane.
  • The B component includes a first B polarization component B1 with a 0° vibration plane, a second B polarization component B2 with a 45° vibration plane, a third B polarization component B3 with a 90° vibration plane, and a fourth B polarization component B4 with a 135° vibration plane.
  • For the R component, the polarization component interpolation unit 31 of the reflection component calculation unit 12 generates a first polarized image RG1 in which every pixel in the image is the first R polarization component R1, a second polarized image RG2 in which every pixel is the second R polarization component R2, a third polarized image RG3 in which every pixel is the third R polarization component R3, and a fourth polarized image RG4 in which every pixel is the fourth R polarization component R4.
  • For the G component in the captured RAW image GRAW, for which polarizing filters corresponding to the two G filters are arranged, the polarization component interpolation unit 31 likewise generates, for example, a first polarized image GG1 in which every pixel is the first G polarization component (G11 and G21), and similarly the second through fourth polarized images.
  • Similarly, for the B component, the polarization component interpolation unit 31 generates a first polarized image BG1 in which every pixel in the image is the first B polarization component B1, a second polarized image BG2 in which every pixel is the second B polarization component B2, a third polarized image BG3 in which every pixel is the third B polarization component B3, and a fourth polarized image BG4 in which every pixel is the fourth B polarization component B4.
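The interpolation described above can be sketched for a single color plane. The 2x2 polarizer layout and the nearest-neighbor interpolation used here are illustrative assumptions; this excerpt does not specify the actual cell layout or interpolation method.

```python
import numpy as np

# Assumed per-color 2x2 polarizer layout (illustrative only):
#   row 0: 0 deg, 45 deg    row 1: 135 deg, 90 deg
# Values index into the angle order (0 deg, 45 deg, 90 deg, 135 deg).
ANGLE_OF_CELL = np.array([[0, 1],
                          [3, 2]])

def interpolate_four_pol(raw):
    """Expand a polarization mosaic into four full-resolution planes.

    raw: (H, W) mosaic of one color plane, with H and W even.
    Returns planes of shape (4, H, W); each angle's sparse samples are
    repeated over their 2x2 cell (nearest-neighbor interpolation).
    """
    h, w = raw.shape
    planes = np.empty((4, h, w), dtype=raw.dtype)
    for dy in range(2):
        for dx in range(2):
            a = ANGLE_OF_CELL[dy, dx]
            sub = raw[dy::2, dx::2]  # sparse samples of angle a
            planes[a] = np.repeat(np.repeat(sub, 2, axis=0), 2, axis=1)
    return planes
```

For the R and B planes this yields the images RG1 to RG4 and BG1 to BG4 directly; for G, the two filter positions would first be averaged into one plane under the same scheme.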
  • When the irradiated linearly polarized light has a 90° vibration plane, the reflection component separation unit 32 of the reflection component calculation unit 12 generates the internally reflected light image GI from the polarized images corresponding to the orthogonal 0° vibration plane: the first polarized image RG1, in which every pixel is the first R polarization component R1, the first polarized image GG1, and the first polarized image BG1, in which every pixel is the first B polarization component B1.
  • The surface reflected light image GS is generated from the R image obtained by subtracting the value of the first polarized image RG1 (0°) from the third polarized image RG3, in which every pixel corresponding to the 90° vibration plane is the third R polarization component R3; the G image obtained by subtracting the value of the first polarized image GG1 (0°) from the third polarized image GG3, in which every pixel corresponding to the 90° vibration plane is the third G polarization component G3; and the corresponding B image obtained by subtracting the value of the first polarized image BG1 (0°) from the third polarized image BG3 (90°).
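The subtraction just described can be written directly for one color channel. A minimal sketch, assuming the illumination has a 90° vibration plane as in the text; `p0` and `p90` stand for the interpolated 0° and 90° polarization planes (e.g. RG1 and RG3 for the R channel), and clipping negative values to zero is an added safeguard, not something this excerpt specifies.

```python
import numpy as np

def separate_reflections(p0, p90):
    """Separate surface and internal reflection for one color channel.

    p0  : 0 deg polarization plane (orthogonal to the 90 deg illumination)
    p90 : 90 deg polarization plane (parallel to the illumination)

    Internally scattered light is unpolarized, so it contributes equally
    to both planes; subtracting p0 from p90 leaves mainly the surface
    (specular + surface-scattered) component, while p0 itself serves as
    the internally reflected light image.
    """
    internal = p0.astype(float)                                 # image GI
    surface = np.clip(p90.astype(float) - internal, 0.0, None)  # image GS
    return surface, internal
```

Applying this per channel to (RG1, RG3), (GG1, GG3), and (BG1, BG3) yields the R, G, and B planes of GS and GI.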
  • FIG. 6 is an explanatory diagram of an example of a user interface screen of the focus target selection unit.
  • On the display screen of the touch panel display are arranged an image display area AR1 that displays the image of the currently focused portion, a surface selection button BT1 for setting the focus position to the surface of the object (skin in this example), and an internal selection button BT2 for setting the focus position to the inside of the object.
  • The image display area AR1 also displays information indicating whether the current focus target is the surface or the inside ("surface" in the example of FIG. 6).
  • When the user touches the surface selection button BT1, the image of the surface of the skin, which is the object to be imaged, is displayed in the image display area AR1.
  • When the user touches the internal selection button BT2 while the image of the skin surface is displayed, the image of the inside of the skin is displayed in the image display area AR1.
  • When the user touches the surface selection button BT1 again while the image of the inside of the skin is displayed, the image of the skin surface is displayed again in the image display area AR1. The image of the last selected side continues to be displayed until the user switches the display target.
  • the contrast detection unit 14 will be described.
  • the user can specify the contrast detection area in the image display area AR1 of the touch panel display.
  • If no area is specified, a preset (default) area is used.
  • the preset area is not particularly limited, and may be the entire area corresponding to the image display area AR1, a predetermined rectangular area in the center of the image display area AR1, or the like.
  • The contrast detection unit 14 calculates the luminance L for each pixel of the image corresponding to the designated contrast detection region, and extracts the maximum luminance Lmax, which is the maximum value of the luminance L, and the minimum luminance Lmin, which is the minimum value.
  • The luminance L is calculated, for example, by the following equation, where R, G, and B are the values of the R, G, and B components of the pixel for which the luminance L is calculated:
  • L = (R + 2G + B) / 4
  • Then, the contrast detection unit 14 calculates the contrast evaluation value EVC from the extracted maximum luminance Lmax and minimum luminance Lmin by the following equation:
  • EVC = (Lmax - Lmin) / (Lmax + Lmin)
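The two equations above combine into a short routine. This is a sketch only; representing the designated detection region as an (H, W, 3) array of R, G, B values is an assumption about the data layout.

```python
import numpy as np

def contrast_evaluation(region):
    """Contrast evaluation value EVC over a contrast detection region.

    region: (H, W, 3) array of R, G, B pixel values.
    Per-pixel luminance:  L = (R + 2G + B) / 4
    EVC = (Lmax - Lmin) / (Lmax + Lmin)
    """
    r, g, b = region[..., 0], region[..., 1], region[..., 2]
    lum = (r + 2.0 * g + b) / 4.0
    lmax, lmin = lum.max(), lum.min()
    return (lmax - lmin) / (lmax + lmin)
```

A larger EVC corresponds to a larger brightness difference within the region, that is, a better-focused image.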
  • the contrast detection unit 14 outputs the calculated contrast evaluation value EVC to the contrast storage unit 15 and the contrast AF control unit 16.
  • the contrast storage unit 15 stores the contrast evaluation value EVC in chronological order (input order) (step S16).
  • The contrast AF control unit 16 determines whether the focus position has been reached based on the difference between the contrast evaluation value EVC of the image corresponding to the current focus position and that of the image corresponding to the previous focus position; if the focus position has not been reached, it controls the focus lens driving unit 17 to change the focus position and image again.
  • FIG. 7 is an operation processing flowchart of the first embodiment.
  • First, the contrast AF control unit 16 performs initial setting of the driving direction of the focusing lens 21 and of the initial contrast evaluation value EVC used for determining the end of contrast AF (step S11).
  • The driving direction of the focusing lens 21 may be stored in advance in a memory (not shown), and the initial contrast evaluation value EVC may be stored in advance in the contrast storage unit 15.
  • the image capturing unit 11 images the imaged object OBJ and outputs the captured RAW image GRAW to the reflection component calculation unit 12 (step S12).
  • The reflection component calculation unit 12 calculates the reflection components, generates a surface reflected light image GS and an internally reflected light image GI, and outputs them to the focus target selection unit 13 (step S13).
  • the user selects the focus target (surface or inside) and specifies the contrast detection target area (step S14).
  • the focus target selection unit 13 outputs the reflected light image (either the surface reflected light image GS or the internal reflected light image GI) of the focus target selected by the user to the contrast detection unit 14.
  • the contrast detection unit 14 performs contrast detection on the input reflected light image in the contrast detection region designated by the user.
  • the contrast detection unit 14 outputs the detected contrast evaluation value EVC to the contrast storage unit 15 and the contrast AF control unit 16 (step S15).
  • the contrast storage unit 15 stores the contrast evaluation value EVC (step S16).
  • The contrast AF control unit 16 compares (calculates the difference between) the contrast evaluation value EVC of the image corresponding to the current focus position and the contrast evaluation value EVCp of the image corresponding to the previous focus position read from the contrast storage unit 15 (step S17).
  • Next, the contrast AF control unit 16 determines whether the calculated difference in contrast evaluation value is equal to or less than a predetermined threshold value (step S18).
  • If the calculated difference is equal to or less than the predetermined threshold value (step S18; Yes), the contrast AF control unit 16 terminates the process, regarding the image as in focus.
  • If the calculated difference exceeds the predetermined threshold value (step S18; No), it is determined whether the contrast evaluation value EVC of the image corresponding to the current focus position is higher than the contrast evaluation value EVCp of the image corresponding to the previous focus position (step S19).
  • in step S19, if the contrast evaluation value EVC of the image corresponding to the current focus position is higher than the contrast evaluation value EVCp of the image corresponding to the previous focus position (step S19; Yes), the contrast AF control unit 16 controls the focus lens drive unit 17 to drive the focusing lens 21 in the same direction as the previous time (step S20), and returns the process to step S12 to perform imaging again (step S12).
  • in step S19, if the contrast evaluation value EVC of the image corresponding to the current focus position is lower than the contrast evaluation value EVCp of the image corresponding to the previous focus position (step S19; No), the contrast AF control unit 16 controls the focus lens drive unit 17 to drive the focusing lens 21 in the direction opposite to the previous time (step S21), and returns the process to step S12 to perform imaging again (step S12).
  • as described above, in the first embodiment, the surface reflected light image GS and the internally reflected light image GI are separated and generated from the captured RAW image GRAW obtained by irradiating the imaging object OBJ with linearly polarized light, and contrast AF control is performed separately for each of them; therefore, high-precision autofocus control can be performed without one image affecting the other and lowering the autofocus accuracy.
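The processing of steps S12 to S21 above is a classic hill-climbing contrast AF loop. It can be sketched as follows; the function names, the lens model, and the step-halving refinement (added here so the search converges) are illustrative assumptions, not taken from the patent:

```python
def contrast_af(capture, evaluate_contrast, threshold=1e-3, step=1.0):
    """Hill-climbing contrast AF over steps S12-S21 (illustrative names).

    capture(pos) returns the image at focus position pos (steps S12-S13);
    evaluate_contrast(image) returns the evaluation value EVC (step S15).
    """
    pos = 0.0
    direction = 1.0                              # current lens drive direction
    prev_evc = evaluate_contrast(capture(pos))   # first pass, stored as EVCp
    while True:
        pos += direction * step                  # steps S20/S21: drive lens
        evc = evaluate_contrast(capture(pos))    # steps S12-S15
        diff = evc - prev_evc                    # step S17: compare with EVCp
        if abs(diff) <= threshold:               # step S18: in focus -> stop
            return pos
        if diff < 0:                             # step S19; No: contrast fell
            direction = -direction               # step S21: reverse direction
            step *= 0.5                          # shrink step for convergence
                                                 # (addition beyond the patent)
        prev_evc = evc                           # step S16: store EVC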
  • FIG. 8 is a schematic block diagram of the image processing apparatus of the second embodiment.
  • the same parts as those in the first embodiment of FIG. 3 are designated by the same reference numerals, and detailed description thereof is omitted.
  • the image processing device 10A of the second embodiment differs from the image processing device 10 of the first embodiment in that the reflection component calculation unit 12 also calculates the degree of polarization (degree-of-polarization image Gρ) and in that a contrast calculation pixel determination unit 18 is provided, which uses the degree of polarization and the area specified by the user.
  • FIG. 9 is an explanatory diagram of the reflection component calculation unit of the second embodiment.
  • the reflection component calculation unit 12 of the second embodiment includes a polarization component interpolation unit 31 that interpolates the polarization components based on the captured RAW image GRAW and generates the interpolated four-polarization image GFC described later, and a reflection component separation unit 32 that separates the reflection components based on the interpolated four-polarization image GFC and generates a degree-of-polarization image Gρ, a surface reflected light image GS, and an internally reflected light image GI.
  • since the operation of the polarization component interpolation unit 31 is the same as in the first embodiment, only the operation of the reflection component separation unit 32 will be described.
  • when the irradiated linearly polarized light has a 90° vibration plane, the reflection component separation unit 32 of the reflection component calculation unit 12 generates the internally reflected light image GI based on the polarized images corresponding to the orthogonal 0° vibration plane: the first polarized image RG1, in which every pixel is the first R polarization component R1; the first polarized image GG1, in which every pixel is the first G polarization component G1; and the first polarized image BG1, in which every pixel is the first B polarization component B1.
  • the surface reflected light image GS is then generated from: the R image obtained by subtracting the value of the first polarized image RG1 (0°, first R polarization component R1) from the third polarized image RG3, in which every pixel corresponding to the 90° vibration plane is the third R polarization component R3; the G image obtained by subtracting the value of the first polarized image GG1 (0°, first G polarization component G1) from the third polarized image GG3, in which every pixel corresponding to the 90° vibration plane is the third G polarization component G3; and the B image similarly obtained for the B polarization components.
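The separation described above reduces to two per-channel operations. A minimal sketch follows; the dictionary layout and the zero clipping are assumptions for illustration, with `i0` standing in for the 0° images (RG1/GG1/BG1) and `i90` for the 90° images (RG3/GG3/BG3):

```python
import numpy as np

def separate_reflections(i0, i90):
    """Sketch of the separation in the reflection component separation unit 32.

    i0  : per-channel images ('R', 'G', 'B') for the 0-deg vibration plane
          (internally scattered, unpolarized light only).
    i90 : per-channel images for the 90-deg vibration plane.
    The internal image GI is taken from the orthogonal (0-deg) components;
    subtracting them from the 90-deg components leaves mainly the surface
    reflection, giving GS. Clipping at zero is an added safeguard.
    """
    gi = {c: i0[c] for c in "RGB"}
    gs = {c: np.clip(i90[c] - i0[c], 0.0, None) for c in "RGB"}
    return gs, gi
```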
  • the reflection component separation unit 32 also calculates the degree of polarization ρ for all pixels and generates the degree-of-polarization image Gρ.
  • the degree of polarization ρ can be calculated by equation (1); its parameters a, b, and c are expressed from the luminance values I0, I45, I90, and I135 of the four polarization components having vibration planes of 0°, 45°, 90°, and 135°.
  • the degree-of-polarization image Gρ generated from all the obtained degrees of polarization ρ is output to the contrast calculation pixel determination unit 18.
  • the luminance values I0, I45, I90, and I135 are determined by equation (5) from the R component value R, the G component value G, and the B component value B of the corresponding pixels.
  • I = (R + 2G + B) / 4 ... (5)
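Equation (1) and its parameters a, b, c are not reproduced in this excerpt, so the sketch below pairs equation (5) with the standard Stokes-vector form of the degree of polarization, ρ = sqrt((I0 − I90)² + (I45 − I135)²) / (I0 + I90), which should be treated as an assumed equivalent rather than the patent's own formula:

```python
import numpy as np

def luminance(r, g, b):
    """Equation (5): per-pixel luminance I from the R, G, B component values."""
    return (r + 2.0 * g + b) / 4.0

def degree_of_polarization(i0, i45, i90, i135):
    """Degree of polarization rho from the four luminances I0..I135
    (standard Stokes-based form, assumed here in place of equation (1))."""
    s0 = i0 + i90              # total intensity
    s1 = i0 - i90              # 0/90 linear component
    s2 = i45 - i135            # 45/135 linear component
    return np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)
```

Unpolarized light (all four luminances equal) gives ρ = 0; fully linearly polarized light gives ρ = 1.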
  • FIG. 10 is an operation explanatory view of the contrast calculation pixel determination unit.
  • for example, as shown in FIG. 10(a), a portion containing a large amount of fat or moisture is a so-called "shiny" portion, where the luminance level or the degree-of-polarization level is extremely high.
  • FIG. 10B is an explanatory diagram of the brightness value or the degree of polarization of the pixels used for contrast detection.
  • pixels whose luminance value or degree of polarization exceeds a predetermined threshold value should not be used for contrast detection and are therefore excluded.
  • it is desirable to perform contrast detection using only the remaining pixels, that is, those whose luminance value or degree of polarization does not exceed the threshold value (the components located to the left of the threshold in the figure).
  • the contrast calculation pixel determination unit 18 generates a target pixel mask image GM for masking the pixels to be excluded from contrast detection, based on the input surface reflected light image GS, the degree of polarization, the designated region, and a predetermined threshold value.
  • FIG. 10C is an explanatory diagram of an example of the target pixel mask image.
  • in the target pixel mask image GM, which is a binary image, the pixels excluded from contrast detection are displayed in white (1) and the pixels used for contrast detection are displayed in black (0).
  • as a result, pixels with a high luminance or degree of polarization, that is, pixels where the irradiated light is reflected as-is by fat or moisture (the pixels corresponding to the white portions in FIG. 10(c)), do not affect the contrast value calculation in autofocus, so contrast AF can be performed more reliably. Therefore, according to the second embodiment, more accurate contrast AF can be realized than in the first embodiment.
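The masking above can be sketched as a simple per-pixel threshold test. The 1-for-excluded / 0-for-used convention follows the text; the variance-based contrast metric and all names are illustrative assumptions:

```python
import numpy as np

def target_pixel_mask(luma, dop, luma_thresh, dop_thresh):
    """Binary mask GM: 1 (white) = excluded from contrast detection,
    0 (black) = used. Pixels whose luminance or degree of polarization
    exceeds its threshold (specular 'shine' on fat/moisture) are excluded."""
    return ((luma > luma_thresh) | (dop > dop_thresh)).astype(np.uint8)

def masked_contrast(image, mask):
    """Contrast evaluation (variance, an illustrative metric) computed
    only over the non-masked pixels."""
    valid = image[mask == 0]
    return float(valid.var()) if valid.size else 0.0
```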
  • FIG. 11 is an explanatory diagram of the third embodiment.
  • the human body (arms, face, etc.) has an uneven shape, so the imaging range naturally contains both in-focus regions, as indicated by the ellipse in FIG. 11(a), and out-of-focus regions, as indicated by the area outside the ellipse in FIG. 11(a).
  • this is because, when the imaging object is arm-shaped, its cross section is substantially elliptical; even if focus is achieved at one position, the image departs from the focus position as the distance from the focused position increases along the elliptical surface.
  • therefore, the region in which the texture of the skin is detected by image processing such as edge detection of the captured image (the region in which linear edges are detected) is set as the contrast detection target region.
  • as a result, as shown in FIG. 11(c), the contrast AF processing can be performed faster and more reliably than when the entire captured image shown in FIG. 11(a) is processed, including the out-of-focus regions.
  • as a skin texture detection method, for example, the image processing method described in JP 2013-188341 A can be used, which allows the skin texture to be detected quickly and easily.
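The texture-region selection can be approximated with ordinary gradient-based edge detection. The central-difference gradient and the threshold below are illustrative, not the method of JP 2013-188341 A:

```python
import numpy as np

def texture_region(gray, grad_thresh):
    """Return a boolean map of pixels with strong local gradients
    (linear skin-texture edges) to be used as the contrast AF target area."""
    gx = np.zeros_like(gray)
    gy = np.zeros_like(gray)
    gx[:, 1:-1] = gray[:, 2:] - gray[:, :-2]   # horizontal central differences
    gy[1:-1, :] = gray[2:, :] - gray[:-2, :]   # vertical central differences
    magnitude = np.hypot(gx, gy)
    return magnitude > grad_thresh
```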
  • the contrast calculation pixel determination unit of the second embodiment and the third embodiment described above aim to improve the accuracy of the contrast AF on the surface of the imaging object (the skin surface, in the above example).
  • the fourth embodiment aims to improve the accuracy of the contrast AF inside the image pickup object.
  • a stain (melanin) inside the skin has a high light absorption rate in a specific wavelength band; by utilizing this and increasing the weight of the detection result in that wavelength band, the accuracy of the contrast AF can be improved.
  • FIG. 12 is an explanatory diagram of the absorption wavelength.
  • for example, by increasing the weight of the detection result of a B sensor with a short wavelength (for example, around 380 nm), autofocus can be performed sensitively to the region where a specific substance inside the skin, such as hemoglobin or melanin (the measurement target inside the object), is present. As a result, the accuracy of the contrast AF inside the imaging object can be improved.
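The channel weighting can be sketched as a weighted sum of per-channel contrast values. Variance is used as an illustrative contrast metric, and the weights (B emphasized for a melanin-like absorber) are assumptions, not values from the patent:

```python
import numpy as np

def weighted_contrast(r, g, b, weights=(0.2, 0.2, 0.6)):
    """Contrast evaluation of the internally reflected light image with the
    weight of the strongly absorbed band (here B) increased."""
    wr, wg, wb = weights
    return (wr * float(np.var(r))
            + wg * float(np.var(g))
            + wb * float(np.var(b)))
```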
  • FIG. 13 is an explanatory diagram of an example of the user interface screen of the focus target selection unit of the fifth embodiment.
  • on the display screen of the touch panel display, an image display area AR1 that displays the image of the portion currently in focus is shown, and a slide bar SL is arranged for setting the focus position to an arbitrary point within a range from the surface of the imaging object (skin in this example) to a predetermined position (predetermined depth) inside the object.
  • the slide bar SL is provided with a slide button SLB whose display position can be moved left and right. The user moves the slide button SLB in the left-right direction so that the image displayed in the image display area AR1 corresponds to the desired focus position.
  • when the slide button SLB is at one end of the slide bar SL (α = 0), the values of all the pixels in the image display area AR1 are the values Is of the pixels constituting the surface reflected light image GS, that is, the surface reflected light image GS is displayed.
  • when α = 1, that is, in FIG. 13, when the slide button SLB is at the left end of the slide bar SL, the values of all the pixels in the image display area AR1 are the values Id of the pixels constituting the internally reflected light image GI, that is, the internally reflected light image GI is displayed. Until the user switches the display target, the image corresponding to the value of α at the position of the last selected slide button SLB continues to be displayed.
  • according to the fifth embodiment, an arbitrary position within the range from the surface of the imaging object to a predetermined internal position can be set as the focus position.
  • the user interface of the focus target selection unit is not limited to this; various user interfaces can be used.
  • for example, the focus position may be changed according to the pressing pressure, by directly entering a number (for example, 0 to 10 or 0 to 100) on a numeric keypad, or by selecting one of a plurality of buttons (for example, ten buttons).
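One plausible reading of the slider, consistent with configuration (8)'s combining of the two images "by changing the ratio," is a linear per-pixel blend between Is and Id; the mapping below is an assumption for illustration:

```python
def focus_preview(gs_pixel, gi_pixel, alpha):
    """Display value for one pixel given slider position alpha in [0, 1]:
    alpha = 0 -> surface value Is only; alpha = 1 (slide button SLB at the
    left end in FIG. 13) -> internal value Id only. Linear blend assumed."""
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must lie in [0, 1]")
    return (1.0 - alpha) * gs_pixel + alpha * gi_pixel
```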
  • FIG. 14 is a schematic block diagram of the image processing apparatus of the sixth embodiment.
  • the same parts as those in the first embodiment of FIG. 3 are designated by the same reference numerals.
  • the image processing device of the sixth embodiment includes a preview image output unit 19 that, when the reflection component calculation unit generates and outputs the surface reflected light image GS and the internally reflected light image GI, displays those images together on one screen.
  • FIG. 15 is an explanatory diagram of an example of a display image of the preview image output unit.
  • as shown in FIG. 15, the preview image output unit displays both the surface reflected light image GS and the internally reflected light image GI on the display screen, so the user can easily select the image to be focused by touching either image.
  • in addition to the surface reflected light image GS and the internally reflected light image GI, the target pixel mask image GM and the degree-of-polarization image Gρ described in the second embodiment can also be displayed as preview images as needed.
  • although both the surface reflected light image GS and the internally reflected light image GI are displayed as the preview image in the above description, it is also possible to adopt a configuration in which only one of them is displayed at the time of the main shooting, or a configuration in which both the surface reflected light image GS and the internally reflected light image GI are automatically acquired.
  • as the preview image, a normal image captured under normal illumination can also be used instead of an image captured under polarized illumination.
  • in the above description, the surface reflected light image GS and the internally reflected light image GI are treated as separate images; however, by combining both images, it is also possible to generate an image in which both the surface and the inside are in focus.
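Generating an image in focus on both the surface and the inside amounts to focus compositing. A minimal per-pixel sketch follows; the gradient-magnitude sharpness measure and all names are illustrative assumptions, not the patent's method:

```python
import numpy as np

def compose_all_in_focus(gs, gi):
    """Pick, per pixel, the value from whichever image (surface GS or
    internal GI, each focused at its own depth) is locally sharper,
    judged by local gradient magnitude. Ties go to the surface image."""
    def sharpness(img):
        gx = np.abs(np.diff(img, axis=1, prepend=img[:, :1]))
        gy = np.abs(np.diff(img, axis=0, prepend=img[:1, :]))
        return gx + gy
    return np.where(sharpness(gs) >= sharpness(gi), gs, gi)
```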
  • note that the present technology can also have the following configurations.
  • (1) An image processing device comprising: an image generation unit that separates and generates a surface reflected light image and an internally reflected light image from a captured image based on an imaging signal corresponding to the captured image of an imaging object irradiated with linearly polarized light; a detection unit that detects a contrast evaluation value for each of the surface reflected light image and the internally reflected light image; and a determination unit that determines, based on the contrast evaluation value, whether or not the reflected light image to be focused has reached the focus position.
  • (2) The image processing device according to (1), further comprising: an imaging unit that has a focusing lens, images the imaging object, and outputs the imaging signal; and a drive unit that drives the focusing lens to change the focus position based on the determination result of the determination unit.
  • (3) The image processing device according to (1) or (2), wherein the image generation unit generates the surface reflected light image based on the polarization component obtained by removing the polarization component orthogonal to the linearly polarized light from the linearly polarized light component, and generates the internally reflected light image based on the polarization component orthogonal to the linearly polarized light.
  • (4) The image processing device according to any one of (1) to (3), further comprising a determination unit that determines, as the image region in which the detection unit detects the contrast evaluation value corresponding to the surface reflected light image, a region in which the luminance level exceeds a predetermined first threshold value or a region in which the degree-of-polarization level exceeds a predetermined second threshold value.
  • (5) The image processing device according to any one of (1) to (3), wherein the imaging object is human skin, and the device further comprises a determination unit that determines a texture region in which the texture of the skin is detected as the image region in which the detection unit detects the contrast evaluation value corresponding to the surface reflected light image.
  • (6) The image processing device according to any one of (1) to (5), wherein, when a substance having a relatively high absorption rate in a specific wavelength band is contained inside the imaging object, the detection unit increases the weight of that wavelength band and obtains the contrast evaluation value for the internally reflected light image.
  • (7) The image processing device according to any one of (1) to (6), further comprising a selection unit to which the surface reflected light image and the internally reflected light image are input and which outputs either the surface reflected light image or the internally reflected light image to the detection unit based on a designated input.
  • (8) The image processing device according to any one of (1) to (6), further comprising a selection unit to which the surface reflected light image and the internally reflected light image are input and which combines the surface reflected light image and the internally reflected light image while changing the ratio based on a designated input, and outputs the combined reflected light image as the target of contrast evaluation in the detection unit.
  • (9) The image processing device according to any one of (1) to (8), further comprising an image output unit for displaying the surface reflected light image and the internally reflected light image.
  • (10) A method performed by an image processing device, comprising: a process of separating and generating a surface reflected light image and an internally reflected light image from a captured image based on an imaging signal corresponding to the captured image of an imaging object irradiated with linearly polarized light; a process of detecting a contrast evaluation value for each of the surface reflected light image and the internally reflected light image; and a process of determining, based on the contrast evaluation value, whether or not the reflected light image to be focused has reached the focus position.
  • (11) An electronic apparatus comprising: an imaging unit that has a focusing lens, images an imaging object irradiated with linearly polarized light, and outputs an imaging signal; an image generation unit that separates and generates a surface reflected light image and an internally reflected light image from the captured image based on the imaging signal corresponding to the captured image of the object; a detection unit that detects a contrast evaluation value for each of the surface reflected light image and the internally reflected light image; a determination unit that determines, based on the contrast evaluation value, whether or not the reflected light image to be focused has reached the focus position; and a drive unit that drives the focusing lens to change the focus position based on the determination result of the determination unit.


Abstract

This image processing device (100) is provided with: an image generation unit (103) that separates and generates a surface reflected light image (GS) and an internally reflected light image (GI) from a captured image (GRAW) on the basis of an imaging signal corresponding to the captured image of an imaging object (OBJ) irradiated with linearly polarized light; a detection unit (104) that detects contrast evaluation values (EVC) for the surface reflected light image (GS) and the internally reflected light image (GI); and a determination unit (105) that determines, on the basis of the contrast evaluation values (EVC), whether the reflected light image to be focused has reached a focus position.

Description

Image processing device, method, and electronic apparatus
The present disclosure relates to an image processing device, a method, and an electronic apparatus.
The contrast autofocus method is known as one of the conventional autofocus methods.

The contrast autofocus method performs focusing by measuring, through the lens, the contrast of the image captured in the sensor field. More specifically, in the contrast autofocus method, the focus lens is moved in the direction in which the contrast becomes higher, and the focusing control is terminated when the contrast fluctuation amount becomes equal to or less than a predetermined threshold value.
Japanese Patent No. 5951211; Japanese Unexamined Patent Application Publication No. 2015-141394
When imaging an object having a layered structure, such as human skin, in a situation where the irradiation light used for imaging can penetrate into the object, the reflected light of the irradiation light is a mixture of surface reflected light, which is light reflected from the surface, and internally reflected light, which is light reflected from the inside.
In this case, since the optical distance differs between the surface and the inside of the object, the appropriate focus position also differs.

For example, when the object to be imaged is skin, the skin surface transmits visible light, so imaging is performed with the image of the skin surface and the image of the inside of the skin (subcutaneous tissue) superimposed. As a result, the two images affect each other, and focusing control cannot be performed correctly on either the skin surface or the inside of the skin.
In particular, for the inside of the skin, the contrast of the skin surface is very high compared to the contrast inside the skin, so the contrast inside the skin cannot be measured normally, making contrast autofocus control difficult.
The present disclosure has been made in view of such a situation, and an object thereof is to provide an image processing device, a method, and an electronic apparatus capable of correctly focusing on either the surface or the inside of an imaging object even when the surface of the object is transmissive and the inside can be imaged at the same time, as with the skin surface and the inside of the skin.
In order to achieve the above object, an image processing device of the present disclosure includes: an image generation unit that separates and generates a surface reflected light image and an internally reflected light image from a captured image based on an imaging signal corresponding to the captured image of an imaging object irradiated with linearly polarized light; a detection unit that detects a contrast evaluation value for each of the surface reflected light image and the internally reflected light image; and a determination unit that determines, based on the contrast evaluation value, whether or not the reflected light image to be focused has reached the focus position.
FIG. 1 is an explanatory diagram of the principle of the image processing device of the embodiment.
FIG. 2 is an explanatory diagram of the detection deviation of the in-focus position.
FIG. 3 is a schematic configuration block diagram of the image processing device of the first embodiment.
FIG. 4 is an explanatory diagram of a configuration example of one pixel constituting the four-polarization sensor unit.
FIG. 5 is an explanatory diagram of the reflection component calculation unit of the first embodiment.
FIG. 6 is an explanatory diagram of an example of the user interface screen of the focus target selection unit.
FIG. 7 is an operation processing flowchart of the first embodiment.
FIG. 8 is a schematic configuration block diagram of the image processing device of the second embodiment.
FIG. 9 is an explanatory diagram of the reflection component calculation unit of the second embodiment.
FIG. 10 is an operation explanatory diagram of the contrast calculation pixel determination unit.
FIG. 11 is an explanatory diagram of the third embodiment.
FIG. 12 is an explanatory diagram of absorption wavelengths.
FIG. 13 is an explanatory diagram of an example of the user interface screen of the focus target selection unit of the fifth embodiment.
FIG. 14 is a schematic configuration block diagram of the image processing device of the sixth embodiment.
FIG. 15 is an explanatory diagram of an example of a display image of the preview image output unit.
Next, embodiments of the present disclosure will be described in detail with reference to the drawings.

[1] Principle of the Embodiments

First, prior to the detailed description of the embodiments, the principle of the embodiments will be described.

FIG. 1 is an explanatory diagram of the principle of the image processing device of the embodiment.

The image processing device 100 includes: an imaging unit 102 that has a focusing lens 101 drivable along the optical-axis direction, images an imaging object OBJ irradiated with linearly polarized light having a predetermined vibration plane, and outputs a captured RAW image GRAW; and an image generation unit 103 that generates and outputs a surface reflected light image GS, generated from the surface reflected light LRS component of the imaging object contained in the polarized RAW image output by the imaging unit 102, and an internally reflected light image GI, generated from the internally reflected light LRI component of the imaging object OBJ.
The image processing device 100 further includes: a detection unit 104 that detects a contrast evaluation value EVC for each of the surface reflected light image GS and the internally reflected light image GI; a contrast AF control unit 105 that functions as a determination unit for determining, based on the contrast evaluation value EVC, whether or not each of the surface reflected light image GS and the internally reflected light image GI has reached the focus position, and that controls the driving of the focusing lens 101; and a drive unit 106 that receives a drive control signal CDR from the contrast AF control unit 105 and drives the focusing lens 101 under the control of the contrast AF control unit 105.
In this case, the contrast evaluation value EVC is a value corresponding to the contrast difference (difference between light and dark) of the captured image; the higher the contrast difference, that is, the larger the contrast evaluation value EVC, the better the focus.
In the above description, whether or not the focus position has been reached is determined for each of the surface reflected light image GS and the internally reflected light image GI based on the contrast evaluation value EVC. However, when focusing on only one of the surface reflected light image GS and the internally reflected light image GI, whether or not that reflected light image has reached the focus position may be determined based on the contrast evaluation value EVC corresponding to that one reflected light image (for example, the surface reflected light image GS).
Here, the principle of the embodiments will be described.

FIG. 2 is an explanatory diagram of the detection deviation of the in-focus position.

When the polarized RAW image shown in FIG. 2(a) is used as it is, as in the conventional case, that is, when the surface reflected light image GS and the internally reflected light image GI are used without being separated, the focus position FCSp of the surface image is, as shown in FIG. 2(b), detected shifted from the optimum focus position FCS toward the focus position of the internally reflected light image GI, under the influence of the internally reflected light image GI shown in FIGS. 2(c) and 2(d).
Similarly, as shown in FIG. 2(b), the focus position FCIp of the internal image is detected shifted from the optimum focus position FCI toward the focus position of the surface reflected light image GS, under the influence of the surface reflected light image GS shown in FIGS. 2(e) and 2(f); therefore, the correct focus position cannot be obtained for either the surface image or the internal image.
To solve this, in the present embodiment, the surface reflected light image GS corresponding to the surface reflected light LRS and the internally reflected light image GI corresponding to the internally reflected light LRI are separated from each other.

Thus, according to the present embodiment, the focus position can be calculated for each image, and a more suitable focus position can be obtained.
The principle of the present embodiment will now be described more specifically.

For example, when human skin is used as the imaging object OBJ and is irradiated with linearly polarized light (for example, linearly polarized light having a 90° vibration plane), the incident light is reflected by the surface of the skin and also enters the inside of the skin, where it is scattered and reflected before being emitted again from the skin surface.
In this case, the surface reflected light LRS reflected at the surface of the skin consists of specularly reflected light and surface scattered light, both of which are polarized.
On the other hand, the internally reflected light LRI, which entered the skin and was reflected inside it, becomes internally scattered light through internal scattering. This internally scattered light is unpolarized.
Therefore, when, for example, a four-polarization sensor capable of detecting four kinds of polarized light (for example, 0°, 45°, 90°, and 135°) is used for detection and the plane of vibration of the irradiated linearly polarized light is 90°, the 90° polarization component (mainly the specular reflection component) is the largest component of the surface reflected light. In contrast, the polarization components of the unpolarized internally scattered light appear almost evenly at 0°, 45°, 90°, and 135°.
Furthermore, in the above example, by subtracting the 0° polarization component, whose plane of vibration is orthogonal to that of the 90° polarization component, from the 90° polarization component, mainly the surface reflected light component remains.
Accordingly, the image generation unit 103 generates the surface reflected light image (specular reflection image + surface scattering image) GS from the pixel data corresponding to the 90° polarization component, and generates the internally reflected light image (internal scattering image) GI from the pixel data corresponding to the 0° polarization component.
Then, the detection unit 104 outputs a contrast evaluation value EVC for each of the surface reflected light image GS and the internally reflected light image GI.
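The subtraction described above can be sketched per pixel as follows (a minimal illustration; the function name and the scalar per-pixel form are ours, not part of the embodiment):

```python
def separate_reflections(i0, i90):
    """Separate surface and internal reflection components for one pixel.

    i0, i90: intensities detected through 0-degree and 90-degree polarizing
    filters, with the illumination linearly polarized at 90 degrees.
    The unpolarized internally scattered light contributes equally to both
    channels, so subtracting i0 from i90 leaves mainly the surface reflection.
    """
    surface = max(i90 - i0, 0.0)  # mainly specular + surface-scattered light
    internal = i0                 # mainly internally scattered light
    return surface, internal
```

For example, with i0 = 0.25 and i90 = 0.75, the surface component is 0.5 and the internal component is 0.25.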
As a result, the contrast AF control unit 105 determines, on the basis of the contrast evaluation value EVC, whether the focus position has been reached, and regards, for each of the surface reflected light image GS and the internally reflected light image GI, the image with the higher contrast as the in-focus image.
That is, for each of the surface reflected light image GS and the internally reflected light image GI, the contrast AF control unit 105 regards as the in-focus image the image with the highest contrast among a plurality of images captured at different focal positions obtained by driving the focusing lens 101 via the drive unit 106. Therefore, the optimum focus position can be detected for each of the surface reflected light image GS and the internally reflected light image GI, and an optimum in-focus image can be obtained for each.
In other words, according to the principle of the present embodiment, the surface reflected light image GS and the internally reflected light image GI are prevented from interfering with each other and shifting the detected focus position away from the optimum focus position.
In contrast, when the surface reflected light image GS and the internally reflected light image GI cannot be separated, as in the conventional case, the contrast of the surface reflected light image GS is relatively much higher than that of the internally reflected light image GI, so the contrast of the internal image cannot be measured correctly, and the accuracy of the contrast autofocus naturally deteriorates.
According to the principle of the present embodiment, however, as described above, the contrast can be detected correctly even for the internally reflected light image GI, and an optimum in-focus image can be obtained.
[2] First Embodiment
First, the image processing apparatus of the first embodiment will be described.
FIG. 3 is a schematic block diagram of the image processing apparatus of the first embodiment.
The image processing apparatus 10 broadly comprises an image capturing unit 11, a reflection component calculation unit 12, a focus target selection unit 13, a contrast detection unit 14, a contrast storage unit 15, a contrast autofocus (AF) control unit 16, and a focus lens driving unit 17.
The image capturing unit 11 has a focusing lens 21 and a four-polarization sensor unit 22; it captures an image of the imaging object OBJ (human skin in the present embodiment) and outputs a polarized RAW image (data) GRAW to the reflection component calculation unit 12.
The reflection component calculation unit 12 separates and generates a surface reflected light image (data) GS and an internally reflected light image (data) GI from the input polarized RAW image (data) GRAW, and outputs them to the focus target selection unit 13.
In this case, the surface reflected light image GS is a composite of a specular reflection image, produced by specular reflection of the irradiated linearly polarized light (for example, linearly polarized light having a 90° plane of vibration) at the surface of the imaging object OBJ, and a surface scattering image produced by scattering at the surface of the imaging object.
The internally reflected light image GI is an internal scattering image in which the linearly polarized light that entered the imaging object OBJ has become almost unpolarized through internal scattering.
The focus target selection unit 13 outputs either the surface reflected light image GS or the internally reflected light image GI to the contrast detection unit 14, depending on which of the two the user of the image processing apparatus has selected, that is, depending on which reflected light image is the focus target.
The contrast detection unit 14 outputs to the contrast AF control unit 16 the contrast evaluation value EVC of the current frame image (either the surface reflected light image GS or the internally reflected light image GI), based on the image output by the focus target selection unit 13 and on the user's region designation information.
The contrast storage unit 15 stores a predetermined number of the contrast evaluation values EVC output by the contrast detection unit 14 in chronological order.
The contrast autofocus (AF) control unit 16 compares the difference between the contrast evaluation value EVC of the current frame image and that of the previous frame image with a predetermined threshold (contrast difference threshold); when the difference is equal to or less than the threshold, it regards a more suitable focus position as having been reached and takes the current frame image as the in-focus polarized RAW image.
The focus lens driving unit 17 drives the focusing lens 21 of the image capturing unit 11 to change the focus position and causes re-imaging to be performed when the difference between the contrast evaluation value EVC of the current frame image and that of the previous frame image still exceeds the predetermined threshold.
Here, the configuration of the image capturing unit will be described.
As described above, the image capturing unit 11 includes the four-polarization sensor unit 22.
In the four-polarization sensor unit 22, a first sensor that receives polarized light having a 0° plane of vibration, a second sensor that receives polarized light having a 45° plane of vibration, a third sensor that receives polarized light having a 90° plane of vibration, and a fourth sensor that receives polarized light having a 135° plane of vibration are arranged two-dimensionally, each associated with the R, G, and B color components.
Although the above configuration includes four types of sensors whose planes of vibration differ by 45°, any configuration can likewise be applied as long as at least three types of polarization images with different planes of vibration (including unpolarized) are obtained and at least one pair of the planes of vibration is orthogonal.
More specifically, the four-polarization sensor unit 22 adopts a configuration in which polarizing filters with a plurality of (four, in the figure) polarization directions per pixel are arranged on an image sensor provided with a color mosaic filter on its imaging surface.
A specific description follows.
FIG. 4 is an explanatory diagram of a configuration example of one pixel constituting the four-polarization sensor unit.
The example of FIG. 4 uses a color mosaic filter with an R:G:B area ratio of 1:2:1 for one pixel of the image.
The R filter FR corresponds to a light receiving element DR1 whose incident surface carries a polarizing filter with a 0° plane of vibration, a light receiving element DR2 with a 45° polarizing filter, a light receiving element DR3 with a 90° polarizing filter, and a light receiving element DR4 with a 135° polarizing filter.
Similarly, the G filter FG1 corresponds to light receiving elements DG11, DG12, DG13, and DG14, whose incident surfaces carry polarizing filters with planes of vibration of 0°, 45°, 90°, and 135°, respectively.
The G filter FG2 likewise corresponds to light receiving elements DG21, DG22, DG23, and DG24, with polarizing filters of 0°, 45°, 90°, and 135°, respectively.
The B filter FB corresponds to light receiving elements DB1, DB2, DB3, and DB4, with polarizing filters of 0°, 45°, 90°, and 135°, respectively.
In the four-polarization sensor unit 22, then, a number of light receiving elements equal to four times the number of pixels of the image sensor are arranged two-dimensionally.
The above configuration of the polarization sensor unit (four-polarization sensor unit) is only an example; any configuration can be adopted as long as it can separate the surface reflected light LRS (= specularly reflected light + surface scattered light [polarized]) from the internally reflected light LRI (= internally scattered light [unpolarized]).
For example, the unit may comprise two types of light receiving elements corresponding to two types of polarizing filters whose planes of vibration are at least orthogonal to each other, together with light receiving elements provided with no polarizing filter.
Also, while in the example of FIG. 4 a plurality of light receiving elements (four in that example) correspond to one filter, it is also possible to assign a filter to each light receiving element.
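The pixel structure of FIG. 4 can be modeled as follows (a simplified sketch; the function names, the dictionary representation, and the sample ordering are ours, chosen only to illustrate how four polarization samples per color filter map to one image pixel):

```python
# One image pixel of the 4-polarization sensor of FIG. 4: each color filter
# (R, G1, G2, B) covers four photodiodes behind 0/45/90/135-degree polarizers.
ANGLES = (0, 45, 90, 135)

def make_pixel(r, g1, g2, b):
    """Build one RAW pixel from per-filter 4-tuples ordered (0, 45, 90, 135)."""
    return {"R": dict(zip(ANGLES, r)),
            "G1": dict(zip(ANGLES, g1)),
            "G2": dict(zip(ANGLES, g2)),
            "B": dict(zip(ANGLES, b))}

def channel(pixel, angle):
    """Per-color intensities for one polarization angle; G averages G1 and G2."""
    g = (pixel["G1"][angle] + pixel["G2"][angle]) / 2.0
    return {"R": pixel["R"][angle], "G": g, "B": pixel["B"][angle]}
```

For example, `channel(pixel, 90)` collects the R, averaged G, and B intensities seen through the 90° polarizers of one pixel.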
Next, the reflection component calculation unit will be described.
FIG. 5 is an explanatory diagram of the reflection component calculation unit of the first embodiment.
For ease of understanding, FIG. 5 shows the configuration for only one pixel of the captured RAW image GRAW, and hence for only one pixel of the surface reflected light image GS or the internally reflected light image GI.
The reflection component calculation unit 12 comprises a polarization component interpolation unit 31, which interpolates the polarization components of the captured RAW image GRAW to generate the interpolated four-polarization images GFC described later, and a reflection component separation unit 32, which separates the reflection components on the basis of the interpolated four-polarization images GFC to generate the surface reflected light image GS and the internally reflected light image GI.
First, the operation of the polarization component interpolation unit 31 will be described.
In the case of the four-polarization sensor unit 22 of FIG. 4, the captured RAW image GRAW contains, for the R component, a first R polarization component R1 with a 0° plane of vibration, a second R polarization component R2 with a 45° plane of vibration, a third R polarization component R3 with a 90° plane of vibration, and a fourth R polarization component R4 with a 135° plane of vibration.
Similarly, for the G component, the captured RAW image GRAW contains first G polarization components G11 and G21 with a 0° plane of vibration, second G polarization components G12 and G22 with a 45° plane of vibration, third G polarization components G13 and G23 with a 90° plane of vibration, and fourth G polarization components G14 and G24 with a 135° plane of vibration.
For the B component, the captured RAW image GRAW contains a first B polarization component B1 with a 0° plane of vibration, a second B polarization component B2 with a 45° plane of vibration, a third B polarization component B3 with a 90° plane of vibration, and a fourth B polarization component B4 with a 135° plane of vibration.
The polarization component interpolation unit 31 of the reflection component calculation unit 12 therefore generates, for the R component of the captured RAW image GRAW, a first polarization image RG1 whose pixels all consist of the first R polarization component R1, a second polarization image RG2 whose pixels all consist of the second R polarization component R2, a third polarization image RG3 whose pixels all consist of the third R polarization component R3, and a fourth polarization image RG4 whose pixels all consist of the fourth R polarization component R4.
For the G component, since polarizing filters are arranged for two G filters in the captured RAW image GRAW, the polarization component interpolation unit 31 generates, for example, a first polarization image GG1 whose pixels all consist of the average value G1 {= (G11 + G21)/2} of the first G polarization components G11 and G21, a second polarization image GG2 whose pixels all consist of the average value G2 {= (G12 + G22)/2} of the second G polarization components G12 and G22, a third polarization image GG3 whose pixels all consist of the average value G3 {= (G13 + G23)/2} of the third G polarization components G13 and G23, and a fourth polarization image GG4 whose pixels all consist of the average value G4 {= (G14 + G24)/2} of the fourth G polarization components G14 and G24.
For the B component, the polarization component interpolation unit 31 generates a first polarization image BG1 whose pixels all consist of the first B polarization component B1, a second polarization image BG2 whose pixels all consist of the second B polarization component B2, a third polarization image BG3 whose pixels all consist of the third B polarization component B3, and a fourth polarization image BG4 whose pixels all consist of the fourth B polarization component B4.
Further, when the irradiated linearly polarized light has a 90° plane of vibration, the reflection component separation unit 32 of the reflection component calculation unit 12 generates the internally reflected light image GI on the basis of the images corresponding to the orthogonal 0° plane of vibration: the first polarization image RG1 whose pixels all consist of the first R polarization component R1, the first polarization image GG1 whose pixels all consist of the average value G1 {= (G11 + G21)/2} of the first G polarization components G11 and G21, and the first polarization image BG1 whose pixels all consist of the first B polarization component B1.
The reflection component separation unit 32 also generates the surface reflected light image GS on the basis of an R image obtained by subtracting the value of the first polarization image RG1 (0° plane of vibration) from the third polarization image RG3 (90° plane of vibration), a G image obtained by subtracting the value of the first polarization image GG1 (0°) from the third polarization image GG3 (90°), and a B image obtained by subtracting the value of the first polarization image BG1 (0°) from the third polarization image BG3 (90°).
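The interpolation and separation steps above can be sketched for a single pixel as follows (a minimal illustration under this embodiment's 90° illumination assumption; the function names and the plain-tuple representation are ours):

```python
def interpolate_g(g1_samples, g2_samples):
    """Average the two G-filter samples per polarization angle (0, 45, 90, 135)."""
    return [(a + b) / 2.0 for a, b in zip(g1_samples, g2_samples)]

def separate_pixel(r, g, b):
    """Split one pixel into surface (GS) and internal (GI) RGB values.

    r, g, b: 4-tuples of intensities for the 0, 45, 90, and 135 degree
    polarization components of one pixel, ordered (0, 45, 90, 135).
    With illumination polarized at 90 degrees, the 0-degree component is
    mainly internal scattering, and the 90-degree component minus the
    0-degree component is mainly surface reflection.
    """
    internal = (r[0], g[0], b[0])         # GI: 0-degree components
    surface = (max(r[2] - r[0], 0.0),     # GS: 90-degree minus 0-degree
               max(g[2] - g[0], 0.0),
               max(b[2] - b[0], 0.0))
    return surface, internal
```

Here `interpolate_g` corresponds to the averaging G1 = (G11 + G21)/2 and so on, and `separate_pixel` to the per-channel subtraction performed by the reflection component separation unit 32.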
Next, the focus target selection unit 13 will be described.
FIG. 6 is an explanatory diagram of an example of a user interface screen of the focus target selection unit.
In the example of FIG. 6, arranged on the display screen of a touch panel display are an image display area AR1 that displays the image of the part currently being focused on, a surface selection button BT1 for setting the focus position to the surface of the imaging object (skin, in this example), and an internal selection button BT2 for setting the focus position to the interior of the imaging object.
In this case, the image display area AR1 displays information indicating whether the current focus target is the surface or the interior ("surface" in the example of FIG. 6).
That is, when the user touches the surface selection button BT1, an image of the surface of the skin, which is the imaging object, is displayed in the image display area AR1. When the user touches the internal selection button BT2 while the surface image is displayed, an image of the interior of the skin is displayed in the image display area AR1.
Further, when the user touches the surface selection button BT1 again while the internal image is displayed, the image of the surface of the skin is displayed again in the image display area AR1.
The image of the last selected side continues to be displayed until the user switches the display target.
Next, the contrast detection unit 14 will be described.
In this example, it is assumed that the user can designate a contrast detection region within the image display area AR1 of the touch panel display.
When the user does not designate a contrast detection region, a preset (default) region is used. The preset region is not particularly limited; it may be, for example, the entire region corresponding to the image display area AR1, or a predetermined rectangular region at the center of the image display area AR1.
In this case too, once a contrast detection region has been designated by the user, the designated region remains in effect until the user designates a contrast detection region again.
The contrast detection unit 14 then calculates the luminance L for each pixel of the image corresponding to the designated contrast detection region, and extracts the maximum luminance value Lmax and the minimum luminance value Lmin of the luminance L.
Here, the luminance L is calculated, for example, by the following expression.
With R, G, and B denoting the values of the R, G, and B components of the pixel whose luminance L is to be calculated, in the example of FIG. 4:
    L = (R + 2G + B) / 4
The contrast detection unit 14 then calculates the contrast evaluation value EVC from the extracted maximum luminance value Lmax and minimum luminance value Lmin by the following expression.
    EVC = (Lmax - Lmin) / (Lmax + Lmin)
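The two expressions above can be sketched as follows (a minimal illustration; the function names and the list-of-tuples region representation are ours):

```python
def luminance(r, g, b):
    """Per-pixel luminance L = (R + 2G + B) / 4 for the R:G:B = 1:2:1 mosaic of FIG. 4."""
    return (r + 2 * g + b) / 4.0

def contrast_evaluation(region):
    """Contrast evaluation value EVC = (Lmax - Lmin) / (Lmax + Lmin).

    region: iterable of (R, G, B) tuples for the pixels of the designated
    contrast detection region.
    """
    lums = [luminance(r, g, b) for r, g, b in region]
    lmax, lmin = max(lums), min(lums)
    return (lmax - lmin) / (lmax + lmin)
```

For example, a region whose pixel luminances span 1.0 to 3.0 yields EVC = (3 - 1)/(3 + 1) = 0.5.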
The contrast detection unit 14 then outputs the calculated contrast evaluation value EVC to the contrast storage unit 15 and the contrast AF control unit 16.
The contrast storage unit 15 thereby stores the contrast evaluation values EVC in chronological order (input order) (step S16).
Next, the contrast AF control unit 16 will be described.
The contrast AF control unit 16 determines whether the focus position has been reached on the basis of the difference between the contrast evaluation value EVC of the image corresponding to the current focus position and that of the image corresponding to the previous focus position. If the focus position has not been reached, it controls the focus lens driving unit 17 to change the focus position and causes imaging to be performed again.
Next, the overall operation of the first embodiment will be described.
FIG. 7 is an operation processing flowchart of the first embodiment.
First, the contrast AF control unit 16 sets the initial driving direction of the focusing lens 21 and the initial contrast evaluation value EVC used for determining the end of contrast AF (step S11).
In this case, the driving direction of the focusing lens 21 may be stored in advance in a memory (not shown).
The initial contrast evaluation value EVC may likewise be stored in advance in the contrast storage unit 15.
Next, the image capturing unit 11 captures an image of the imaging object OBJ and outputs the captured RAW image GRAW to the reflection component calculation unit 12 (step S12).
The reflection component calculation unit 12 calculates the reflection components, generates the surface reflected light image GS and the internally reflected light image GI, and outputs them to the focus target selection unit 13 (step S13).
 続いて、ユーザがフォーカス対象(表面あるいは内部)を選択し、コントラスト検出対象領域を指定する(ステップS14)。 Subsequently, the user selects the focus target (surface or inside) and specifies the contrast detection target area (step S14).
As a result, the focus target selection unit 13 outputs the reflected light image of the focus target selected by the user (either the surface reflected light image GS or the internally reflected light image GI) to the contrast detection unit 14. The contrast detection unit 14 then performs contrast detection on the input reflected light image within the contrast detection area designated by the user.
Then, the contrast detection unit 14 outputs the detected contrast evaluation value EVC to the contrast storage unit 15 and the contrast AF control unit 16 (step S15).
As a result, the contrast storage unit 15 stores the contrast evaluation value EVC (step S16).
Subsequently, the contrast AF control unit 16 compares (computes the difference between) the contrast evaluation value EVC of the image at the current focus position and the contrast evaluation value EVCp of the image at the previous focus position, read from the contrast storage unit 15 (step S17).
The contrast AF control unit 16 determines whether the computed difference between the contrast evaluation values is equal to or less than a predetermined threshold value (step S18).
If, in the determination of step S18, the computed difference is equal to or less than the predetermined threshold value (step S18; Yes), the contrast AF control unit 16 determines that the image is in focus and terminates the process.
If, in the determination of step S18, the computed difference exceeds the predetermined threshold value, the contrast AF control unit 16 determines whether the contrast evaluation value EVC of the image at the current focus position is higher than the contrast evaluation value EVCp of the image at the previous focus position (step S19).
If, in the determination of step S19, the contrast evaluation value EVC at the current focus position is higher than the contrast evaluation value EVCp at the previous focus position (step S19; Yes), the contrast AF control unit 16 controls the focus lens driving unit 17 to drive the focusing lens 21 in the same direction as before (step S20), and the process returns to step S12 to perform imaging again (step S12).
If, in the determination of step S19, the contrast evaluation value EVC at the current focus position is lower than the contrast evaluation value EVCp at the previous focus position (step S19; No), the contrast AF control unit 16 controls the focus lens driving unit 17 to drive the focusing lens 21 in the direction opposite to before (step S21), and the process returns to step S12 to perform imaging again (step S12).
As described above, according to the first embodiment, the surface reflected light image GS and the internally reflected light image GI are separated and generated from the captured RAW image GRAW obtained by irradiating the imaging object OBJ with linearly polarized light, and contrast AF control is performed separately for each. Therefore, high-precision autofocus control can be performed without one image affecting the other and degrading the autofocus accuracy.
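The loop of steps S11 through S21 amounts to a hill-climbing search over focus positions. The following Python sketch shows that control flow under stated assumptions: `capture_image`, `compute_contrast`, and `move_lens` are hypothetical stand-ins for the image capturing unit 11 (plus reflection component calculation and focus target selection), the contrast detection unit 14, and the focus lens driving unit 17, and the fixed step size and iteration cap are illustrative, not part of the embodiment.

```python
def contrast_af(capture_image, compute_contrast, move_lens,
                threshold=1e-3, direction=+1, max_iters=100):
    """Hill-climbing contrast AF sketch (steps S11-S21).

    capture_image():       reflected-light image at the current focus
                           position (units 11-13).
    compute_contrast(img): contrast evaluation value EVC (unit 14).
    move_lens(direction):  drives the focusing lens one step (unit 17).
    """
    prev_evc = float("-inf")            # initial EVC (step S11)
    for _ in range(max_iters):
        img = capture_image()           # steps S12/S13
        evc = compute_contrast(img)     # steps S14/S15 (stored, S16)
        if abs(evc - prev_evc) <= threshold:   # steps S17/S18
            return True                 # in focus -> terminate
        if evc < prev_evc:              # step S19: contrast dropped,
            direction = -direction      # step S21: reverse direction
        move_lens(direction)            # steps S20/S21: drive lens
        prev_evc = evc                  # loop back to step S12
    return False
```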
[3] Second Embodiment
Next, the second embodiment will be described.
FIG. 8 is a schematic block diagram of the image processing apparatus of the second embodiment.
In FIG. 8, parts that are the same as in the first embodiment of FIG. 3 are given the same reference numerals, and their detailed description given above applies here as well.
The image processing device 10A of the second embodiment differs from the image processing device 10 of the first embodiment in that the reflection component calculation unit 12 also calculates the degree of polarization (polarization degree image Gρ), and in that a contrast calculation pixel determination unit 18 is provided, which determines the contrast calculation pixels based on the area designated by the user, the degree of polarization (polarization degree image Gρ), and the reflection component image to be focused.
First, the polarization degree calculation operation of the reflection component calculation unit of the second embodiment will be described.
FIG. 9 is an explanatory diagram of the reflection component calculation unit of the second embodiment.
The reflection component calculation unit 12 of the second embodiment includes a polarization component interpolation unit 31, which interpolates the polarization components based on the captured RAW image GRAW to generate an interpolated four-polarization image GFC (described later), and a reflection component separation unit 32, which separates the reflection components based on the interpolated four-polarization image GFC to generate a polarization degree image Gρ, a surface reflected light image GS, and an internally reflected light image GI.
Here, since the operation of the polarization component interpolation unit 31 is the same as that of the first embodiment, the operation of the reflection component separation unit 32 will be described.
When the vibration plane of the irradiated linearly polarized light is 90°, the reflection component separation unit 32 of the reflection component calculation unit 12 generates the internally reflected light image GI based on the first polarized image RG1, in which each pixel of the image corresponding to the orthogonal vibration plane of 0° is the first R polarization component R1; the first polarized image GG1, in which each pixel is the average value G1 {= (G11 + G21) / 2} of the first G polarization components G11 and G21; and the first polarized image BG1, in which each pixel is the first B polarization component B1.
The reflection component separation unit 32 also generates the surface reflected light image GS based on: an R image obtained by subtracting the values of the first polarized image RG1 (each pixel of which, for the vibration plane of 0°, is the first R polarization component R1) from the third polarized image RG3 (each pixel of which, for the vibration plane of 90°, is the third R polarization component R3); a G image obtained by subtracting the values of the first polarized image GG1 (each pixel of which is the first G polarization component G1) from the third polarized image GG3 (each pixel of which is the third G polarization component G3); and a B image obtained by subtracting the values of the first polarized image BG1 (each pixel of which is the first B polarization component B1) from the third polarized image BG3 (each pixel of which is the third B polarization component B3).
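Assuming, for illustration, that the per-channel polarized images are already assembled into RGB arrays (`p0` for the 0° vibration plane, with G already averaged as G1 = (G11 + G21) / 2, and `p90` for the 90° plane), the separation described above can be sketched as follows; clipping negative differences to zero is an added assumption, not something the text specifies.

```python
import numpy as np

def separate_reflections(p0, p90):
    """Split polarized captures into surface / internal images.

    p0  : HxWx3 RGB image for the 0 deg vibration plane (orthogonal
          to the 90 deg linearly polarized illumination).
    p90 : HxWx3 RGB image for the 90 deg vibration plane (parallel
          to the illumination).
    """
    # Cross-polarized light has lost its polarization by scattering
    # inside the object -> internally reflected light image GI.
    gi = p0.astype(np.float64)
    # Parallel minus cross removes the internal component and leaves
    # the specular part -> surface reflected light image GS.
    gs = np.clip(p90.astype(np.float64) - gi, 0.0, None)
    return gs, gi
```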
Further, the reflection component separation unit 32 calculates the degree of polarization ρ of all pixels and generates the degree of polarization image Gρ.
Here, the degree of polarization ρ can be calculated by the equation (1).
(Equation (1): given as an image in the original publication, not reproduced here)
Here, the parameters a, b, and c are expressed as follows from the luminance values I0, I45, I90, and I135 of the four polarization components whose vibration planes are 0°, 45°, 90°, and 135°.
(Equations for a, b, and c: given as an image in the original publication, not reproduced here)
Then, a polarization degree image Gρ is generated from all the obtained degrees of polarization ρ and output to the contrast calculation pixel determination unit 18.
In the above processing, the luminance values I0, I45, I90, and I135 are determined from the R component value R, the G component value G, and the B component value B of the corresponding pixel by equation (5).
I = (R + 2G + B) / 4 ... (5)
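Since equations (1) and (2) appear only as image attachments in the publication, the following sketch uses the standard sinusoid-fit formulation of the degree of linear polarization, which is consistent with deriving ρ from parameters a, b, c computed from I0, I45, I90, I135; the exact parametrisation is therefore an assumption, while the luminance formula is equation (5) from the text.

```python
import numpy as np

def polarization_degree(p0, p45, p90, p135):
    """Per-pixel degree of polarization rho (polarization degree
    image G_rho) from four vibration-plane captures (HxWx3 RGB).

    Assumes the standard fit I(theta) = a + b*cos(2*theta)
    + c*sin(2*theta), so that rho = sqrt(b^2 + c^2) / a.
    """
    def luminance(img):                        # equation (5)
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        return (r + 2.0 * g + b) / 4.0

    i0, i45 = luminance(p0), luminance(p45)
    i90, i135 = luminance(p90), luminance(p135)
    a = (i0 + i45 + i90 + i135) / 4.0          # mean intensity
    b = (i0 - i90) / 2.0
    c = (i45 - i135) / 2.0
    # 0 = unpolarized, 1 = fully polarized; guard against a = 0
    return np.sqrt(b ** 2 + c ** 2) / np.maximum(a, 1e-12)
```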
FIG. 10 is an operation explanatory view of the contrast calculation pixel determination unit.
When focusing on the surface of the skin to be imaged, that is, when the main component of the light entering the image capturing unit is the specular reflection component, portions with an extremely high luminance level or polarization degree level, for example portions with much sebum or moisture as shown in FIG. 10(a), appear as so-called "shiny" portions with high luminance or a high degree of polarization.
FIG. 10B is an explanatory diagram of the brightness value or the degree of polarization of the pixels used for contrast detection.
As shown in FIG. 10(b), pixels whose luminance value or degree of polarization exceeds a predetermined threshold value should not be used for contrast detection and are therefore excluded. It is desirable to perform contrast detection using only the remaining pixels (in the figure, those whose luminance value or degree of polarization lies to the left of the threshold).
Therefore, the contrast calculation pixel determination unit 18 generates a target pixel mask image GM for masking the pixels to be excluded from contrast detection, based on the degree of polarization, the designated area, and a predetermined threshold value applied to the surface reflected light image GS.
FIG. 10C is an explanatory diagram of an example of the target pixel mask image.
In FIG. 10(c), the target mask image GM is generated as a binary image in which the pixels excluded from contrast detection are displayed in white (1) and the pixels used for contrast detection are displayed in black (0).
As a result, light that strikes sebum or moisture and is reflected as it is, that is, pixels with high luminance or a high degree of polarization (in FIG. 10(c), the pixels corresponding to the white portions), no longer affects the calculation of the contrast value in autofocus, and contrast AF can be performed more reliably.
Therefore, according to the second embodiment, more accurate contrast AF can be realized than in the first embodiment.
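A minimal sketch of the target pixel mask image GM and of a masked contrast evaluation follows, assuming the user-designated area arrives as a boolean array (True inside the area) and using a simple variance measure for contrast; the thresholds, the region encoding, and the variance metric are illustrative assumptions, since the text does not fix a specific contrast metric.

```python
import numpy as np

def target_pixel_mask(luminance, rho, region, lum_threshold, rho_threshold):
    """Binary target pixel mask image GM (second embodiment).

    1 (white) = excluded from contrast detection: "shiny" pixels whose
    luminance or degree of polarization exceeds its threshold, or
    pixels outside the designated region.
    0 (black) = used for contrast detection.
    """
    shiny = (luminance > lum_threshold) | (rho > rho_threshold)
    return (shiny | ~region).astype(np.uint8)

def masked_contrast(image, gm):
    """Contrast evaluation value over unmasked pixels only (here a
    simple variance measure over the gm == 0 pixels)."""
    vals = image[gm == 0]
    return float(vals.var()) if vals.size else 0.0
```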
[4] Third Embodiment
The third embodiment differs from the second embodiment described above in that contrast detection is performed using an area that is likely to be in focus at the time of contrast detection, thereby further improving the detection accuracy.
FIG. 11 is an explanatory diagram of the third embodiment.
The human body (arms, face) has an uneven shape, so the imaging area naturally contains both in-focus areas, shown in FIG. 11(a) as the area enclosed by the ellipse, and out-of-focus areas, shown in FIG. 11(a) as the area outside the ellipse. When the imaging object is something like an arm, whose cross section is roughly elliptical, focusing on any one position causes the image to drift out of focus as the distance from that position increases along the elliptical surface.
Therefore, in the third embodiment, when focusing on the surface of the skin, an area in which the skin texture is detected (an area in which linear edges are detected) is set as the contrast detection target area by image processing such as edge detection of the captured image, as shown in FIG. 11(b). As a result, contrast AF processing can be performed faster and more reliably than when the entire captured image of FIG. 11(a) is processed, including the out-of-focus areas shown in FIG. 11(c).
In this case, as a method of detecting the skin texture, for example, an image processing method such as that disclosed in JP 2013-188341 A can be used, and the skin texture can be detected quickly and easily.
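A rough sketch of restricting the contrast detection target area to texture blocks follows, substituting a simple block-wise gradient-magnitude criterion for the texture detection of JP 2013-188341 A; the finite-difference gradient, block size, and threshold are all illustrative assumptions.

```python
import numpy as np

def texture_region(gray, block=8, edge_threshold=10.0):
    """Select blocks with detected texture edges as the contrast
    detection target area (third embodiment, sketch).

    Blocks whose mean gradient magnitude exceeds edge_threshold are
    treated as areas where linear texture edges are detected.
    """
    gy, gx = np.gradient(gray.astype(np.float64))   # simple gradients
    mag = np.hypot(gx, gy)
    h, w = gray.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = mag[y:y + block, x:x + block]
            if tile.mean() > edge_threshold:        # textured block
                mask[y:y + block, x:x + block] = True
    return mask
```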
[5] Fourth Embodiment
The contrast calculation pixel determination units of the second and third embodiments described above improve the accuracy of contrast AF on the surface of the imaging object (in the above example, the skin surface), whereas the fourth embodiment improves the accuracy of contrast AF inside the imaging object.
The fourth embodiment presupposes that a substance that is not present on the surface but has a high light absorption rate in a specific wavelength band exists inside the imaging object.
That is, when human skin is the imaging object, spots (melanin) inside the skin have a high light absorption rate in a specific wavelength band. By exploiting this and giving greater weight to the detection results in that wavelength band, the accuracy of contrast AF can be improved.
FIG. 12 is an explanatory diagram of the absorption wavelength.
For example, with an RGB sensor, as shown in FIG. 12, weighting the output signal of the short-wavelength B sensor (for example, a wavelength of 380 nm) more heavily makes it possible to autofocus sensitively on areas where specific substances such as hemoglobin and melanin exist inside the skin (inside the measurement object).
As a result, it is possible to improve the accuracy of the contrast AF inside the image pickup object.
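One way to realise this wavelength weighting is to fold per-channel weights into the luminance before computing the contrast metric. The weight values and the gradient-energy metric below are illustrative assumptions; the embodiment only requires that the absorbing band (here the B channel) be weighted more heavily.

```python
import numpy as np

def weighted_contrast(image, weights=(0.15, 0.15, 0.7)):
    """Contrast evaluation value with per-channel weighting (fourth
    embodiment, sketch): channels in the absorption band of the
    target substance (here B, for melanin/hemoglobin) get a larger
    weight before a gradient-energy contrast measure is taken.
    """
    w = np.asarray(weights, dtype=np.float64)
    # weighted luminance emphasising the absorbing wavelength band
    lum = (image.astype(np.float64) * w).sum(axis=-1)
    gy, gx = np.gradient(lum)
    return float((gx ** 2 + gy ** 2).mean())   # gradient-energy contrast
```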
[6] Fifth Embodiment
Each of the above embodiments focuses on either the surface of the skin to be imaged or a predetermined interior position, whereas the fifth embodiment allows the user, by operation, to focus arbitrarily anywhere within a range from the surface to a predetermined interior position.
First, the focus target selection unit of the fifth embodiment will be described.
FIG. 13 is an explanatory diagram of an example of the user interface screen of the focus target selection unit of the fifth embodiment.
In the example of FIG. 13, the display screen of the touch panel display contains an image display area AR1, which displays the image of the portion currently being focused on, and a slide bar SL for setting the focus position arbitrarily within a range from the surface of the imaging object (skin in this example) to a predetermined position (predetermined depth) inside the imaging object.
In this case, the slide bar SL is provided with a slide button SLB whose display position can be moved left and right.
The user therefore moves the slide button SLB left and right so that the image displayed in the image display area AR1 corresponds to the desired in-focus position.
In this case, internally, letting Is be the value of a pixel of the surface reflected light image GS and Id the value of the corresponding pixel of the internally reflected light image GI, the pixel value I of the image used for contrast detection is synthesized according to equation (6).
I = α·Id + (1 - α)·Is      …(6)
Here, 0 ≤ α ≤ 1. When α = 0, that is, when the slide button SLB is at the right end of the slide bar SL in FIG. 13, the image display area AR1 displays the surface reflected light image GS, each pixel of which has the value Is of the corresponding pixel of the surface reflected light image GS.
When α = 1, that is, when the slide button SLB is at the left end of the slide bar SL in FIG. 13, the image display area AR1 displays the internally reflected light image GI, each pixel of which has the value Id of the corresponding pixel of the internally reflected light image GI.
The image corresponding to the value of α for the last selected position of the slide button SLB continues to be displayed until the user switches the display target.
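Equation (6) can be sketched directly; `gs` and `gi` stand for the surface and internal reflected light images and `alpha` for the slider-derived mixing ratio (the function and array types are illustrative).

```python
import numpy as np

def blend_for_focus_depth(gs, gi, alpha):
    """Equation (6): image used for contrast detection at an
    intermediate focus depth (fifth embodiment).

    alpha = 0 -> pure surface reflected light image GS
              (slide button SLB at the right end of slide bar SL);
    alpha = 1 -> pure internally reflected light image GI
              (slide button SLB at the left end).
    """
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must satisfy 0 <= alpha <= 1")
    # I = alpha * Id + (1 - alpha) * Is, per pixel
    return alpha * gi.astype(np.float64) + (1.0 - alpha) * gs.astype(np.float64)
```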
As described above, according to the fifth embodiment, an arbitrary position within the range from the surface of the imaging object to a predetermined interior position can be set as the focus position.
The above description covers the case where the focus position is changed according to the position of the slide button SLB on the slide bar SL, but the user interface of the focus target selection unit is not limited to this, and various other user interfaces can be used.
For example, when the touch panel display is pressure-sensitive, the focus position can be changed according to the pressing force; it can also be changed by directly entering a number with a numeric keypad (for example, 0 to 10 or 0 to 100), or by having the user select one of a plurality of buttons (for example, ten).
[7] Sixth Embodiment
The sixth embodiment differs from each of the above embodiments in that it includes a preview image output unit that displays preview images of both the surface reflected light image GS and the internally reflected light image GI output by the reflection component calculation unit.
FIG. 14 is a schematic block diagram of the image processing apparatus of the sixth embodiment.
In FIG. 14, the same parts as those in the first embodiment of FIG. 3 are designated by the same reference numerals.
When the reflection component calculation unit generates and outputs the surface reflected light image GS and the internally reflected light image GI, the preview image output unit 19 displays those images on a single screen.
FIG. 15 is an explanatory diagram of an example of a display image of the preview image output unit.
As shown in FIG. 15A, the preview image output unit displays both the surface reflected light image GS and the internally reflected light image GI on the display screen.
Therefore, the user can easily select the image to be focused by touching any of the images.
Further, as shown in FIG. 15(b), in addition to the surface reflected light image GS and the internally reflected light image GI, the mask image GM and the polarization degree image Gρ described in the second embodiment, and the like, can also be displayed as preview images as necessary.
By adopting such a configuration, these images can be used as reference images when the user designates an area, or can give the user information on why autofocus is not behaving as desired.
In the above description of the sixth embodiment, both the surface reflected light image GS and the internally reflected light image GI are displayed as preview images; however, it is also possible to display only one of them and to automatically acquire both images only at the time of the actual shooting.
Furthermore, a normal image captured under normal illumination, rather than an image captured under polarized illumination, may be used as the preview image.
[8] Modifications of the Embodiments
In the above description, the case where the imaging object is skin has been described as an example, but the same technique can also be applied to imaging objects in which the polarized illumination light is not only reflected at the surface but also transmitted into and scattered inside the object, such as foods (fruits, etc.).
The application is not limited to two-layer structures; in a multilayer structure, the technique can likewise be applied to the surface and any one of the layers (including layers closer to the surface).
In the above description, an image processing device using only contrast AF has been described, but the technique can also be applied in combination with image plane phase difference AF, which has become the mainstream autofocus method in many imaging devices.
In the above description, the surface reflected light image GS and the internally reflected light image GI are treated separately, but by combining the two images it is also possible to generate an all-in-focus image that is in focus both at the surface and in the interior.
Note that the effects described in this specification are merely examples and are not limiting; other effects may also be obtained.
In addition, this technology can also adopt the following configurations.
(1)
An image processing device including:
an image generation unit that separates and generates a surface reflected light image and an internally reflected light image from a captured image, based on an imaging signal corresponding to the captured image of an imaging object irradiated with linearly polarized light;
a detection unit that detects a contrast evaluation value for each of the surface reflected light image and the internally reflected light image; and
a determination unit that determines, based on the contrast evaluation value, whether the reflected light image to be focused has reached the in-focus position.
(2)
The image processing device according to (1), further including:
an imaging unit that has a focusing lens, images the imaging object, and outputs the imaging signal; and
a drive unit that drives the focusing lens to change the focus position based on the determination result of the determination unit.
(3)
The image processing device according to (1) or (2), wherein the image generation unit generates the surface reflected light image based on a polarization component obtained by removing, from the components of the linearly polarized light, the polarization component orthogonal to the linearly polarized light, and generates the internally reflected light image based on the polarization component orthogonal to the linearly polarized light.
(4)
The image processing device according to any one of (1) to (3), further including a determination unit that determines an area in which the luminance level exceeds a predetermined first threshold value or in which the polarization degree level exceeds a predetermined second threshold value as the image area in which the detection unit detects the contrast evaluation value corresponding to the surface reflected light image.
(5)
The image processing device according to any one of (1) to (3), wherein the imaging object is human skin, and the detection unit includes a determination unit that determines a texture area in which the texture of the skin is detected as the image area in which the detection unit detects the contrast evaluation value corresponding to the surface reflected light image.
(6)
The image processing device according to any one of (1) to (5), wherein, when a substance having a relatively high absorption rate in a specific wavelength band is contained inside the imaging object, the detection unit detects the contrast evaluation value for the internally reflected light image with an increased weight on that wavelength band.
(7)
The image processing device according to any one of (1) to (6), further including a selection unit to which the surface reflected light image and the internally reflected light image are input and which outputs either the surface reflected light image or the internally reflected light image to the detection unit based on a designation input.
(8)
The surface reflected light image and the internally reflected light image are input to a selection unit, which combines them at a ratio changed on the basis of a designation input and outputs the combined reflected light image as the target for which the detection unit detects the contrast evaluation value,
The image processing apparatus according to any one of (1) to (6).
(9)
An image output unit that displays the surface reflected light image and the internally reflected light image is provided.
The image processing apparatus according to any one of (1) to (8).
(10)
A method performed by an image processing apparatus, the method comprising:
A process of separating a surface reflected light image and an internally reflected light image from a captured image, and generating the separated images, on the basis of an imaging signal corresponding to the captured image of an imaging object irradiated with linearly polarized light;
A process of detecting a contrast evaluation value for each of the surface reflected light image and the internally reflected light image; and
A process of determining, on the basis of the contrast evaluation value, whether a reflected light image to be focused has reached a focus position.
(11)
An imaging unit that has a focusing lens, images an imaging object irradiated with linearly polarized light, and outputs an imaging signal,
An image generation unit that separates a surface reflected light image and an internally reflected light image from a captured image, and generates the separated images, on the basis of the imaging signal corresponding to the captured image of the imaging object,
A detection unit that detects a contrast evaluation value for each of the surface reflected light image and the internally reflected light image,
A determination unit that determines, on the basis of the contrast evaluation value, whether a focus position has been reached, and
A drive unit that drives the focusing lens to change the focus position for a reflected light image to be focused, on the basis of a determination result of the determination unit.
An electronic apparatus comprising the above units.
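As an illustrative aside (not part of the publication), item (3) describes separating surface reflection, which preserves the polarization plane of the linearly polarized illumination, from internal reflection, which is depolarized by subsurface scattering. A minimal sketch of that separation is shown below; the function name, array conventions, and the simple parallel-minus-cross model are assumptions for illustration only, not the apparatus's actual implementation.

```python
import numpy as np

def separate_reflections(i_parallel: np.ndarray, i_cross: np.ndarray):
    """Split a polarized capture into surface and internal reflection images.

    Under linearly polarized illumination, specular (surface) reflection
    preserves the polarization plane, while light scattered inside the
    object is depolarized. A common first-order model therefore takes:

      internal ~ 2 * I_cross          (half of the depolarized light
                                       appears in each analyzer channel)
      surface  ~ I_parallel - I_cross (what remains after removing the
                                       depolarized share)
    """
    internal = 2.0 * i_cross
    surface = np.clip(i_parallel - i_cross, 0.0, None)  # clamp sensor noise
    return surface, internal
```

For example, a pixel measuring 10 in the parallel channel and 3 in the cross channel would be modeled as 7 units of surface reflection and 6 units of internally scattered light.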
 10, 10A Image processing device
 11  Image capturing unit
 12  Reflection component calculation unit
 13  Focus target selection unit
 14  Contrast detection unit
 15  Contrast storage unit
 16  Contrast AF control unit
 17  Focus lens drive unit
 18  Contrast calculation pixel determination unit
 19  Preview image output unit
 21  Focusing lens
 22  Four-polarization sensor unit
 31  Polarization component interpolation unit
 32  Reflection component separation unit
 100 Image processing device
 101 Focusing lens
 102 Imaging unit
 103 Image generation unit
 104 Detection unit
 105 Contrast AF control unit
 106 Drive unit
 AR1 Image display area
 CDR Drive control signal
 EVC, EVCp Contrast evaluation value
 FCI Optimal focus position
 FCIp Focus position
 FCS Optimal focus position
 FCSp Focus position
 GFC Interpolated four-polarization image
 GI  Internally reflected light image
 GM  Target pixel mask image
 GRAW Captured RAW image
 GS  Surface reflected light image
 Gρ  Degree-of-polarization image
 LRI Internally reflected light
 LRS Surface reflected light
 Lmax Maximum brightness value
 Lmin Minimum brightness value
 OBJ Imaging object
 SL  Slide bar
 SLB Slide button
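As an illustrative aside (not part of the publication), the four-polarization sensor unit (reference numeral 22) and degree-of-polarization image (Gρ) listed above suggest the standard Stokes-parameter computation from four analyzer channels. A minimal sketch, under the assumption of 0°/45°/90°/135° analyzer orientations, is:

```python
import numpy as np

def degree_of_polarization(i0, i45, i90, i135):
    """Per-pixel degree of linear polarization (DoLP) from the four
    analyzer images of a four-polarization sensor.

    Stokes parameters from intensities behind 0/45/90/135-degree analyzers:
      S0 = (I0 + I45 + I90 + I135) / 2   (total intensity)
      S1 = I0 - I90
      S2 = I45 - I135
    DoLP = sqrt(S1^2 + S2^2) / S0, ranging from 0 (unpolarized)
    to 1 (fully linearly polarized).
    """
    s0 = (i0 + i45 + i90 + i135) / 2.0
    s1 = i0 - i90
    s2 = i45 - i135
    return np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, 1e-12)  # avoid /0
```

A thresholded version of this map could serve as the "degree-of-polarization level exceeds a predetermined second threshold value" test of item (4), though the publication does not specify the computation.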

Claims (11)

  1.  An image processing apparatus comprising:
     an image generation unit that separates a surface reflected light image and an internally reflected light image from a captured image, and generates the separated images, on the basis of an imaging signal corresponding to the captured image of an imaging object irradiated with linearly polarized light;
     a detection unit that detects a contrast evaluation value for each of the surface reflected light image and the internally reflected light image; and
     a determination unit that determines, on the basis of the contrast evaluation value, whether a reflected light image to be focused has reached a focus position.
  2.  The image processing apparatus according to claim 1, further comprising:
     an imaging unit that has a focusing lens, images the imaging object, and outputs the imaging signal; and
     a drive unit that drives the focusing lens to change the focus position on the basis of a determination result of the determination unit.
  3.  The image processing apparatus according to claim 1, wherein the image generation unit generates the surface reflected light image on the basis of a polarization component obtained by removing, from the component of the linearly polarized light, a polarization component orthogonal to the linearly polarized light, and generates the internally reflected light image on the basis of the polarization component orthogonal to the linearly polarized light.
  4.  The image processing apparatus according to claim 1, further comprising a determination unit that determines, as the image region in which the detection unit detects the contrast evaluation value corresponding to the surface reflected light image, a region whose brightness level exceeds a predetermined first threshold value or a region whose degree-of-polarization level exceeds a predetermined second threshold value.
  5.  The image processing apparatus according to claim 1, wherein:
     the imaging object is human skin; and
     the detection unit includes a determination unit that determines a texture region, in which the texture of the skin is detected, as the image region in which the detection unit detects the contrast evaluation value corresponding to the surface reflected light image.
  6.  The image processing apparatus according to claim 1, wherein, when the inside of the imaging object contains a substance whose absorption rate is relatively high in a specific wavelength band, the detection unit detects the contrast evaluation value for the internally reflected light image with an increased weight given to the wavelength band.
  7.  The image processing apparatus according to claim 1, further comprising a selection unit to which the surface reflected light image and the internally reflected light image are input, and which outputs either the surface reflected light image or the internally reflected light image to the detection unit on the basis of a designation input.
  8.  The image processing apparatus according to claim 1, further comprising a selection unit to which the surface reflected light image and the internally reflected light image are input, and which combines the surface reflected light image and the internally reflected light image at a ratio changed on the basis of a designation input, and outputs the combined reflected light image as the target for which the detection unit detects the contrast evaluation value.
  9.  The image processing apparatus according to claim 1, further comprising an image output unit that displays the surface reflected light image and the internally reflected light image.
  10.  A method performed by an image processing apparatus, the method comprising:
     a process of separating a surface reflected light image and an internally reflected light image from a captured image, and generating the separated images, on the basis of an imaging signal corresponding to the captured image of an imaging object irradiated with linearly polarized light;
     a process of detecting a contrast evaluation value for each of the surface reflected light image and the internally reflected light image; and
     a process of determining, on the basis of the contrast evaluation value, whether a reflected light image to be focused has reached a focus position.
  11.  An electronic apparatus comprising:
     an imaging unit that has a focusing lens, images an imaging object irradiated with linearly polarized light, and outputs an imaging signal;
     an image generation unit that separates a surface reflected light image and an internally reflected light image from a captured image, and generates the separated images, on the basis of the imaging signal corresponding to the captured image of the imaging object;
     a detection unit that detects a contrast evaluation value for each of the surface reflected light image and the internally reflected light image;
     a determination unit that determines, on the basis of the contrast evaluation value, whether a focus position has been reached; and
     a drive unit that drives the focusing lens to change the focus position for a reflected light image to be focused, on the basis of a determination result of the determination unit.
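As an illustrative aside (not part of the publication), the contrast-evaluation autofocus loop recited in claims 1, 2, and 11 can be sketched as follows: compute a contrast value for the selected reflected-light image at each focus position, and drive the focusing lens to the position where the value peaks. The function names, the Laplacian-variance contrast measure, and the exhaustive position scan are assumptions for illustration; the publication does not specify the contrast measure or the search strategy.

```python
import numpy as np

def contrast_evaluation_value(img: np.ndarray) -> float:
    """Contrast measure: variance of a discrete Laplacian. A sharply
    focused image has stronger high-frequency content, hence a larger
    variance of its Laplacian response."""
    lap = (-4.0 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return float(lap.var())

def find_best_focus(capture, positions):
    """Scan focus positions and return the one maximizing the contrast
    evaluation value.

    `capture(pos)` is assumed to move the focusing lens to `pos` and
    return the reflected-light image selected as the focus target
    (surface or internal, per the selection unit).
    """
    best_pos, best_ev = None, -np.inf
    for pos in positions:
        ev = contrast_evaluation_value(capture(pos))
        if ev > best_ev:
            best_pos, best_ev = pos, ev
    return best_pos
```

In practice a hill-climbing search that stops once the evaluation value starts decreasing would replace the exhaustive scan, which is why the claims phrase the determination as "whether the focus position has been reached."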
PCT/JP2020/016845 2019-04-24 2020-04-17 Image processing device and method, and electronic apparatus WO2020218180A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-082758 2019-04-24
JP2019082758 2019-04-24

Publications (1)

Publication Number Publication Date
WO2020218180A1 true WO2020218180A1 (en) 2020-10-29

Family

ID=72942678

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/016845 WO2020218180A1 (en) 2019-04-24 2020-04-17 Image processing device and method, and electronic apparatus

Country Status (1)

Country Link
WO (1) WO2020218180A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013205571A (en) * 2012-03-28 2013-10-07 Sony Corp Imaging device, imaging method, program, imaging system and accessory device
WO2014027522A1 (en) * 2012-08-17 2014-02-20 ソニー株式会社 Image processing device, image processing method, program, and image processing system
WO2014027523A1 (en) * 2012-08-17 2014-02-20 ソニー株式会社 Image-processing device, image-processing method, program and image-processing system
JP2015085039A (en) * 2013-10-31 2015-05-07 シャープ株式会社 Measuring device
JP2016010063A (en) * 2014-06-25 2016-01-18 キヤノン株式会社 Imaging apparatus
WO2016121518A1 (en) * 2015-01-29 2016-08-04 ソニー株式会社 Information processing device, information processing method, and program


Similar Documents

Publication Publication Date Title
JP4172898B2 (en) Electronic endoscope device
WO2017145529A1 (en) Calculation system
US20110077462A1 (en) Electronic endoscope system, processor for electronic endoscope, and method of displaying vascular information
JP6066595B2 (en) Fundus image processing apparatus, fundus image processing method, and program
US20030040668A1 (en) Endoscope apparatus
JP5808031B2 (en) Endoscope system
US10874293B2 (en) Endoscope device
JP2004350849A (en) Fundus camera
TW200833094A (en) Focus assist system and method
US7534205B2 (en) Methods and apparatuses for selecting and displaying an image with the best focus
JP2022027501A (en) Imaging device, method for performing phase-difference auto-focus, endoscope system, and program
JP2007054113A (en) Electronic endoscope, endoscope light source device, endoscope processor, and endoscope system
WO2018131631A1 (en) Endoscope system and image display device
WO2018043726A1 (en) Endoscope system
JP6120491B2 (en) Endoscope apparatus and focus control method for endoscope apparatus
WO2020218180A1 (en) Image processing device and method, and electronic apparatus
CN109561808B (en) Analysis device
WO2021157392A1 (en) Image-processing device, endoscopic system, and image-processing method
JP6017749B1 (en) IMAGING DEVICE AND IMAGING DEVICE OPERATING METHOD
WO2015025672A1 (en) Endoscope system, processor device, operation method, and table creation method
US9347830B2 (en) Apparatus and method for obtaining spectral image
JP5048103B2 (en) Ophthalmic imaging equipment
WO2022014258A1 (en) Processor device and processor device operation method
JP2009033612A (en) Imaging apparatus
JP5990141B2 (en) ENDOSCOPE SYSTEM, PROCESSOR DEVICE, AND OPERATION METHOD

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20796342

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20796342

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP