WO2019208155A1 - Image processing device, method and program, and imaging device - Google Patents

Image processing device, method and program, and imaging device

Info

Publication number
WO2019208155A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
image
emitting
distribution information
unit
Prior art date
Application number
PCT/JP2019/015046
Other languages
French (fr)
Japanese (ja)
Inventor
祐也 西尾
智紀 増田
智大 島田
Original Assignee
FUJIFILM Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by FUJIFILM Corporation
Priority to JP2020516173A priority Critical patent/JP6810299B2/en
Publication of WO2019208155A1 publication Critical patent/WO2019208155A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/02 Illuminating scene
    • G03B15/03 Combinations of cameras with lighting apparatus; Flash units
    • G03B15/05 Combinations of cameras with electronic flash apparatus; Electronic flash units
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00 Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08 Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091 Digital circuits
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10141 Special mode during image acquisition
    • G06T2207/10144 Varying exposure
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10141 Special mode during image acquisition
    • G06T2207/10152 Varying illumination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20208 High dynamic range [HDR] image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Definitions

  • The present invention relates to an image processing apparatus, method, and program, and to an imaging apparatus, and in particular to high dynamic range synthesis (HDR synthesis), which combines a plurality of images captured with different exposures to expand the dynamic range.
  • HDR processing is one method of expressing the dynamic range of a subject more fully than a single image can.
  • In HDR processing, an image captured with a reduced exposure amount relative to the appropriate exposure (an underexposed image) and an image captured with an increased exposure amount (an overexposed image) are acquired. Pixels of the overexposed image are used where a dark subject is imaged, and pixels of the underexposed image are used where a bright subject is imaged (in practice, the two may also be mixed rather than using only one), creating an HDR image with little blown-out highlight or blocked-up shadow.
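For reference in the discussion that follows, the conventional luminance-driven blend can be sketched as below. This is an illustrative sketch only, not code from the patent; the function name, the threshold values, and the linear weighting curve are assumptions.

```python
import numpy as np

def blend_conventional(under, over, lo=0.2, hi=0.8):
    """Conventional HDR blend: the mixing ratio depends only on luminance.

    under, over: float arrays in [0, 1] of the same scene, normalized to a
    common scale. Dark pixels take the over image, bright pixels the under
    image, with a linear mix between the thresholds lo and hi.
    """
    # weight of the under image: 0 below lo, 1 above hi, linear in between
    w_under = np.clip((over - lo) / (hi - lo), 0.0, 1.0)
    return w_under * under + (1.0 - w_under) * over
```

Because the weight is a function of luminance alone, a dark subject in bright light and a bright subject in shade receive the same treatment; this is exactly the failure mode discussed below.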
  • Conventionally, as apparatuses for generating one HDR image from a plurality of images captured at different exposures, there are the devices described in Patent Documents 1 and 2.
  • The imaging system described in Patent Document 1 takes, from a plurality of images of the same subject captured under different exposure conditions (a long-exposure image and a short-exposure image), the luminance signal of the long-exposure image in the region other than the blown-out region (that is, in the appropriate exposure region), and, from the luminance signal of the short-exposure image, the luminance signal corresponding to the inappropriate exposure region of the long-exposure image. A wide dynamic range image is then generated by combining the appropriate exposure regions of the respective images.
  • The imaging system described in Patent Document 1 also estimates a main subject area from the long-exposure image and the short-exposure image and performs gradation correction so that gradation width is allocated to the estimated main subject area. By combining the appropriate exposure areas of the images after gradation correction, the gradation of the main subject in the appropriate exposure area is enriched.
  • The imaging device described in Patent Document 2 includes a single image sensor having a pixel group exposed for a long time and a pixel group exposed for a short time, and generates an HDR image by combining the image data of the first pixel group and the image data of the second pixel group read from the image sensor.
  • Patent Document 2 also describes obtaining a single composite image by adding flash image data, which is long-exposure data, and non-flash image data, which is short-exposure data, while adjusting their mixing ratio according to the estimated distance to the subject. As a result, an image with a high signal-to-noise ratio for the main subject is obtained while preserving the atmosphere of the background.
  • Patent Document 1 contains no explicit description of the mixing ratio between the long-exposure image and the short-exposure image when generating the HDR image, but since the luminance signal of the short-exposure image is used for the luminance signal corresponding to the inappropriate exposure region of the long-exposure image, it appears to be a general HDR synthesis in which the mixing ratio between the two images is determined by the magnitude of the luminance signal. Therefore, as described later, when a shadow falls on part of the main subject, there is a problem that the shadowed area remains dark.
  • The invention described in Patent Document 2 generates one composite image by adding flash image data, which is long-exposure data, and non-flash image data, which is short-exposure data, while adjusting their mixing ratio according to the estimated distance to the subject. It thus combines flash image data with non-flash image data; it does not combine non-light-emitting images (non-flash images) captured under different exposure conditions.
  • That is, neither Patent Document 1 nor Patent Document 2 describes adjusting the composition ratio of a plurality of images of the same subject, captured under different exposure conditions, according to the intensity of the light striking the subject when combining them into an HDR image.
  • In conventional HDR processing, whether to use the pixel value of the over image or that of the under image is decided based on the luminance value obtained from the image sensor, so the same HDR processing is performed regardless of the intensity of the light striking the subject, and the following problem occurs.
  • In the example of FIG. 21, the left half of the person's face is the sunlit area A exposed to sunlight, and the right half is the shaded area B. If the black hair (1) exposed to strong light and the shaded skin (2) have the same or nearly the same luminance value, they are treated identically in the HDR composition: the over image is used for both, so not only does the shaded skin (2) become bright, but the black hair (1) also takes on an unnaturally bright color.
  • The present invention has been made in view of such circumstances, and an object thereof is to provide an image processing apparatus, method, and program, and an imaging apparatus, capable of acquiring an HDR image with excellent gradation characteristics in which the contrast between light and dark is adjusted according to the intensity of the light striking the subject.
  • To achieve the above object, an image processing apparatus according to one aspect of the present invention comprises: an image acquisition unit that acquires a plurality of non-light-emitting images of the same subject captured under different exposure conditions without flash emission, and a light-emitting image captured under flash emission with the same exposure condition as one of the plurality of non-light-emitting images; a distance distribution information acquisition unit that acquires distance distribution information indicating the distribution of distances to the subject; a light intensity distribution information calculation unit that calculates light intensity distribution information indicating the intensity of the light striking the subject, based on the light-emitting image, the non-light-emitting image captured under the same exposure condition as the light-emitting image, and the distance distribution information; a composition ratio determination unit that determines a composition ratio of the plurality of non-light-emitting images based on the light intensity distribution information; and an image composition unit that combines the plurality of non-light-emitting images according to the composition ratio to generate a composite image with an expanded dynamic range.
  • According to this aspect, the light intensity distribution information of the light striking the subject is calculated based on the light-emitting image, the non-light-emitting image captured under the same exposure condition as the light-emitting image, and the distance distribution information. A composition ratio of the plurality of non-light-emitting images is then determined based on the calculated light intensity distribution information, and the plurality of non-light-emitting images are combined according to the determined ratio, generating a composite image with an expanded dynamic range (an HDR image).
  • The HDR image generated in this way has excellent gradation characteristics, with the contrast between light and dark adjusted according to the intensity of the light striking the subject, and without losing the contrast of the subject itself.
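The text above names the inputs of the light intensity calculation but not a formula. One plausible realization, under an inverse-square flash model, is sketched below; the model, the function names, and the linear ramp are all assumptions, not the patent's own equations.

```python
import numpy as np

def light_intensity_distribution(y_flash, y_ambient, dist, eps=1e-6):
    """Estimate the intensity of ambient light striking the subject.

    y_flash:   luminance of the light-emitting image
    y_ambient: luminance of the non-light-emitting image captured under
               the same exposure condition
    dist:      per-pixel distance distribution information

    The flash-only component (y_flash - y_ambient) falls off roughly as
    1/d^2, so multiplying it by d^2 estimates per-pixel reflectance up to
    a constant; dividing the ambient luminance by that reflectance then
    estimates the light actually hitting the subject, independent of how
    dark or bright the subject itself is.
    """
    flash_component = np.clip(y_flash - y_ambient, eps, None)
    reflectance = flash_component * dist ** 2
    return y_ambient / np.clip(reflectance, eps, None)

def hdr_compose(under, over, s, s_min, s_max):
    """Blend the two non-light-emitting images by light intensity s:
    strongly lit pixels take the under image, dimly lit pixels the over
    image, with a monotonic ramp in between."""
    alpha_over = np.clip((s_max - s) / (s_max - s_min), 0.0, 1.0)
    return alpha_over * over + (1.0 - alpha_over) * under
```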
  • In the image processing apparatus according to another aspect, it is preferable that the composition ratio determination unit determines a composition ratio for each corresponding pixel or region of the plurality of non-light-emitting images, and that the image composition unit combines the plurality of non-light-emitting images by applying the composition ratio determined for each corresponding pixel or region.
  • In the image processing apparatus according to another aspect, it is preferable that the plurality of exposure conditions corresponding to the plurality of non-light-emitting images include an exposure condition underexposed relative to the appropriate exposure and an exposure condition overexposed relative to it. This makes it possible to create an HDR image with less blown-out highlight and blocked-up shadow.
  • In the image processing apparatus according to another aspect, it is preferable that, in a pixel or region with high light intensity, the mixing ratio of the first non-light-emitting image, captured under the exposure condition with the smallest exposure value among the plurality of non-light-emitting images, is set larger than the mixing ratio of the second non-light-emitting image, captured under the exposure condition with the largest exposure value, and that, in a pixel or region with low light intensity, the mixing ratio of the second non-light-emitting image is set larger than that of the first. In areas struck by strong light, increasing the ratio of the first non-light-emitting image (the under image) yields an HDR image with less blown-out highlight; in areas struck by weak light, increasing the ratio of the second non-light-emitting image (the over image) yields an HDR image with less blocked-up shadow.
  • In the image processing apparatus according to another aspect, it is preferable that, when determining the composition ratio for each corresponding pixel or region of the plurality of non-light-emitting images based on the light intensity distribution information, the composition ratio determination unit monotonically decreases the mixing ratio of the second non-light-emitting image (and monotonically increases that of the first) as the light intensity increases from a minimum to a maximum. The minimum light intensity here is not limited to the minimum value in the light intensity distribution information; it may be a light intensity offset from that minimum by a fixed amount. Likewise, the maximum light intensity may be a value offset from the maximum value in the light intensity distribution information by a fixed amount. It goes without saying that a monotonic decrease includes a linear decrease.
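A minimal sketch of this ramp, including the optional offsets from the extreme values, might look as follows; the names and the fractional-offset choice are assumptions.

```python
import numpy as np

def ramp_endpoints(s, offset_frac=0.05):
    """Endpoints of the ramp of Figs. 6 to 8. The minimum and maximum need
    not be the extreme values of the light intensity distribution; here
    they are offset inward by a fixed fraction of the range."""
    s_lo, s_hi = float(s.min()), float(s.max())
    span = s_hi - s_lo
    return s_lo + offset_frac * span, s_hi - offset_frac * span

def alpha_second(s, s_min, s_max):
    """Mixing ratio of the second non-light-emitting image: maximal at or
    below s_min, minimal at or above s_max, decreasing linearly between
    them (a linear decrease is one admissible monotonic decrease)."""
    return np.clip((s_max - s) / max(s_max - s_min, 1e-12), 0.0, 1.0)
```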
  • In the image processing apparatus according to another aspect, it is preferable that the apparatus includes a distribution information creation unit that creates distribution information indicating a frequency distribution of the light intensity based on the light intensity distribution information, and that the composition ratio determination unit obtains, from the created distribution information, a first light intensity corresponding to a first peak of high frequency and a second light intensity, larger than the first, corresponding to a second peak of high frequency. When determining the composition ratio for each corresponding pixel or region of the plurality of non-light-emitting images, the mixing ratio of the second non-light-emitting image is set to its maximum at light intensities at or below the first light intensity, to its minimum at light intensities at or above the second light intensity, and is decreased monotonically between the maximum and the minimum at light intensities between the first and second light intensities.
  • For example, when the sun is out in the daytime, the imaging target can be roughly divided into shaded subjects and sunlit subjects, and in this case two peaks (the first peak and the second peak) exist in the distribution information. The first peak corresponds to the frequency of the light intensity distribution information of the subject region in shade, and the second peak corresponds to that of the subject region in sunlight.
  • Accordingly, the light intensity corresponding to the first peak (the first light intensity) and the light intensity corresponding to the second peak (the second light intensity) are obtained, and, for each corresponding pixel or region of the plurality of non-light-emitting images, the mixing ratio of the second non-light-emitting image is maximized at light intensities at or below the first light intensity and minimized at light intensities at or above the second light intensity. This is because the region at or below the first light intensity can be regarded as shade, and the region at or above the second light intensity can be regarded as sunlit.
  • At light intensities between the first and second light intensities, the mixing ratio of the second non-light-emitting image is decreased monotonically between its maximum and minimum, so the ratio changes continuously as the light intensity increases or decreases. Note that two peaks may appear in the distribution curve of the light intensity frequency not only in a daytime outdoor scene on a sunny day, but also in a scene where sunlight enters part of a room, a scene where an area illuminated by an artificial light source coexists with a shadowed area, and so on.
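A naive sketch of the two-peak analysis described above follows. The bin count and the peak-separation heuristic are assumptions, and a production implementation would smooth the histogram first.

```python
import numpy as np

def two_peak_light_intensities(s, bins=64):
    """Find the shade peak and the sun peak of the light intensity
    histogram (Figs. 10 and 11) and return the corresponding first and
    second light intensities."""
    hist, edges = np.histogram(s.ravel(), bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p1 = int(np.argmax(hist))                            # tallest peak
    masked = hist.copy()
    masked[max(0, p1 - bins // 4):p1 + bins // 4] = 0    # suppress its neighborhood
    p2 = int(np.argmax(masked))                          # second, well-separated peak
    s1, s2 = sorted((centers[p1], centers[p2]))
    return s1, s2

def alpha_second_between_peaks(s, s1, s2):
    """Ratio of the second non-light-emitting image: maximal at or below
    the first light intensity, minimal at or above the second, monotonic
    (here linear) in between."""
    return np.clip((s2 - s) / max(s2 - s1, 1e-12), 0.0, 1.0)
```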
  • In the image processing apparatus according to another aspect, it is preferable that the apparatus includes a distribution information creation unit that creates distribution information indicating a frequency distribution of the light intensity based on the light intensity distribution information, and that the composition ratio determination unit obtains, from the created distribution information, the light intensity corresponding to the bottom point of low frequency and uses it as a threshold value: when determining the composition ratio for each corresponding pixel or region of the plurality of non-light-emitting images, the mixing ratio of the second non-light-emitting image is maximized when the light intensity is at or below the threshold value and minimized when the light intensity exceeds the threshold value.
  • That is, the bottom point is obtained from the distribution information indicating the frequency distribution of the light intensity, and the light intensity corresponding to the bottom point is taken as the threshold value. At or below the threshold value, the mixing ratio of the second non-light-emitting image is set to its maximum (the mixing ratio of the first non-light-emitting image to its minimum); above the threshold value, the mixing ratio of the second non-light-emitting image is set to its minimum (the mixing ratio of the first non-light-emitting image to its maximum). Around the threshold value, the mixing ratio of the second non-light-emitting image may also be changed continuously between the maximum and the minimum according to the magnitude of the light intensity.
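The bottom-point variant can be sketched in the same style; again the peak/valley search is a naive stand-in with assumed parameters.

```python
import numpy as np

def valley_threshold(s, bins=64):
    """Light intensity at the histogram valley between the two peaks
    (Fig. 12), used as the threshold of Fig. 13."""
    hist, edges = np.histogram(s.ravel(), bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p1 = int(np.argmax(hist))
    masked = hist.copy()
    masked[max(0, p1 - bins // 4):p1 + bins // 4] = 0
    p2 = int(np.argmax(masked))
    lo, hi = sorted((p1, p2))
    bottom = lo + int(np.argmin(hist[lo:hi + 1]))  # lowest bin between the peaks
    return float(centers[bottom])

def alpha_second_step(s, threshold):
    """Hard variant: the second non-light-emitting image is used at or
    below the threshold, the first above it. The text also allows a
    continuous transition around the threshold."""
    return (s <= threshold).astype(float)
```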
  • An image processing apparatus according to another aspect includes a luminance distribution information calculation unit that calculates luminance distribution information indicating the luminance distribution within at least one of the plurality of non-light-emitting images, and the composition ratio determination unit preferably determines the composition ratio of the plurality of non-light-emitting images based on the light intensity distribution information and the luminance distribution information.
  • In the image processing apparatus according to another aspect, it is preferable that the apparatus includes a detection unit that detects pixels or regions of the light-emitting image that the flash light did not reach, and that, when determining a composition ratio for such a pixel or region, the composition ratio determination unit determines the composition ratio of the plurality of non-light-emitting images based only on the luminance distribution information. Where the flash light does not reach, the light intensity for that pixel or region is unknown; in that case it is preferable to determine the composition ratio of the plurality of non-light-emitting images from the luminance distribution information, as is done conventionally.
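A sketch of this fallback; the detection criterion and the noise margin are assumptions.

```python
import numpy as np

def flash_reached_mask(y_flash, y_ambient, tau=0.01):
    """A pixel is deemed reached by the flash when the light-emitting
    image is brighter than the non-light-emitting one by more than a
    noise margin tau."""
    return (y_flash - y_ambient) > tau

def alpha_with_fallback(alpha_light, alpha_luma, reached):
    """Use the light-intensity-based ratio where the flash reached;
    elsewhere fall back to the conventional luminance-based ratio."""
    return np.where(reached, alpha_light, alpha_luma)
```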
  • In the image processing apparatus according to another aspect, the composition ratio determination unit determines a composition ratio for each pixel or region corresponding to the plurality of non-light-emitting images, using a parameter indicating the position of the pixel or region.
  • In the image processing apparatus according to another aspect, the exposure condition of the light-emitting image is preferably the exposure condition with the smallest exposure value among the plurality of exposure conditions corresponding to the plurality of non-light-emitting images.
  • An imaging apparatus according to another aspect includes: an imaging unit capable of imaging a subject under different exposure conditions; a flash light emitting unit that emits flash light; a flash control unit that controls the flash light emitted from the flash light emitting unit; and the image processing apparatus described above. The image acquisition unit acquires a plurality of non-light-emitting images captured by the imaging unit under different exposure conditions without emission of flash light from the flash light emitting unit, and a light-emitting image captured by the imaging unit, under emission of flash light from the flash light emitting unit, with the same exposure condition as one of the plurality of non-light-emitting images.
  • In the imaging apparatus according to another aspect, it is preferable that the flash control unit performs dimming emission (a pre-flash) of the flash light from the flash light emitting unit, detects a region reached by the dimmed flash light based on the dimming image captured by the imaging unit under the dimming emission and an image captured by the imaging unit without the dimming emission, calculates the maximum emission amount at which the region reached by the dimmed flash light does not saturate, and emits the flash light from the flash light emitting unit at the calculated maximum emission amount when the light-emitting image is acquired.
  • In the imaging apparatus according to another aspect, it is preferable that the flash control unit calculates the maximum emission amount based on the average reflectance of the subject and the distance distribution information acquired by the distance distribution information acquisition unit. This makes it possible to calculate the maximum emission amount at which the region reached by the flash light does not blow out.
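The text bases this calculation on the average reflectance and the distance distribution; the sketch below instead derives the bound directly from the measured pre-flash response, which is one way such a limit can be computed. The linear-scaling model, the noise margin, and all names are assumptions.

```python
import numpy as np

def max_emission(y_pre, y_ambient, pre_amount, full_scale=1.0, margin=0.01):
    """Largest main-emission amount that keeps every flash-reached pixel
    below saturation. The flash contribution scales linearly with the
    emission amount, so each reached pixel tolerates scaling the
    pre-flash by (full_scale - y_ambient) / (y_pre - y_ambient)."""
    delta = y_pre - y_ambient
    reached = delta > margin
    if not np.any(reached):
        return None  # the dimmed flash reached nothing; see the next aspects
    headroom = (full_scale - y_ambient[reached]) / delta[reached]
    return pre_amount * float(np.min(headroom))
```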
  • In the imaging apparatus according to another aspect, the flash control unit performs dimming emission of the flash light from the flash light emitting unit and compares the dimming image captured by the imaging unit under the dimming emission with an image captured by the imaging unit without the dimming emission; in this aspect, the image acquisition unit preferably acquires the dimming image as the light-emitting image.
  • In the imaging apparatus according to another aspect, the flash control unit performs dimming emission of the flash light from the flash light emitting unit, detects, based on the dimming image captured by the imaging unit under the dimming emission and an image captured by the imaging unit without the dimming emission, a region that the flash light did not reach, and determines from the distance distribution information whether the flash light emitted from the flash light emitting unit can reach the distance of that region; the image acquisition unit then preferably acquires the dimming image as the light-emitting image. This is because pixels that the dimmed flash light did not reach will not be reached by the flash light even at the main emission, so there is no need to perform a new emission.
  • An image processing method according to another aspect of the present invention includes: a step of acquiring a plurality of non-light-emitting images of the same subject captured under different exposure conditions without flash emission; a step of acquiring a light-emitting image captured under flash emission with the same exposure condition as one of the plurality of non-light-emitting images; a step of acquiring distance distribution information indicating the distribution of distances to the subject; a step of calculating light intensity distribution information of the light striking the subject based on the light-emitting image, the non-light-emitting image captured under the same exposure condition as the light-emitting image, and the distance distribution information; a step of determining a composition ratio of the plurality of non-light-emitting images based on the light intensity distribution information; and a step of combining the plurality of non-light-emitting images according to the composition ratio to generate a composite image with an expanded dynamic range.
  • In the image processing method according to another aspect, it is preferable that the step of determining the composition ratio, when determining the ratio for each corresponding pixel or region of the plurality of non-light-emitting images based on the light intensity distribution information, sets the mixing ratio of the first non-light-emitting image, captured under the exposure condition with the smallest exposure value among the plurality of non-light-emitting images, larger than that of the second non-light-emitting image, captured under the exposure condition with the largest exposure value, in a pixel or region with high light intensity, and sets the mixing ratio of the second non-light-emitting image larger than that of the first in a pixel or region with low light intensity.
  • An image processing program according to another aspect of the present invention causes a computer to realize: a function of acquiring a plurality of non-light-emitting images of the same subject captured under different exposure conditions without flash emission; a function of acquiring a light-emitting image captured under flash emission with the same exposure condition as one of the plurality of non-light-emitting images; a function of acquiring distance distribution information indicating the distance distribution of the subject; a function of calculating light intensity distribution information of the light striking the subject based on the light-emitting image, the non-light-emitting image captured under the same exposure condition as the light-emitting image, and the distance distribution information; a function of determining a composition ratio of the plurality of non-light-emitting images based on the light intensity distribution information; and a function of combining the plurality of non-light-emitting images according to the composition ratio to generate a composite image with an expanded dynamic range.
  • In the image processing program according to another aspect, it is preferable that the function of determining the composition ratio, when determining the ratio for each corresponding pixel or region of the plurality of non-light-emitting images based on the light intensity distribution information, sets the mixing ratio of the first non-light-emitting image, captured under the exposure condition with the smallest exposure value among the plurality of non-light-emitting images, larger than that of the second non-light-emitting image, captured under the exposure condition with the largest exposure value, in a pixel or region with high light intensity, and sets the mixing ratio of the second non-light-emitting image larger than that of the first in a pixel or region with low light intensity.
  • According to the present invention, the composition ratio of the plurality of non-light-emitting images is determined according to the intensity of the light striking the subject and HDR composition is performed, so an HDR image with excellent gradation characteristics can be obtained in which the contrast between light and dark is adjusted according to the intensity of the light striking the subject.
  • FIG. 1 is a perspective view of an imaging apparatus according to the present invention as viewed obliquely from the front.
  • FIG. 2 is a rear view of the imaging apparatus.
  • FIG. 3 is a block diagram illustrating an embodiment of the internal configuration of the imaging apparatus.
  • FIG. 4 is a front view showing a configuration example of the image sensor.
  • FIG. 5 is a functional block diagram showing the first embodiment of the main body side CPU 220 functioning as the image processing apparatus according to the present invention.
  • FIG. 6 is a graph showing the relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-light-emitting image.
  • FIG. 7 is a graph showing another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-light-emitting image.
  • FIG. 8 is a graph showing still another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-light-emitting image.
  • FIG. 9 is a functional block diagram showing a second embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention.
  • FIG. 10 is a frequency distribution diagram showing an example of the frequency distribution of light intensity.
  • FIG. 11 is a graph showing still another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-light-emitting image.
  • FIG. 12 is a frequency distribution diagram similar to the frequency distribution of light intensity shown in FIG. 10.
  • FIG. 13 is a graph showing still another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-light-emitting image.
  • FIG. 14 is a functional block diagram showing a third embodiment of the main body side CPU 220 functioning as the image processing apparatus according to the invention.
  • FIG. 15 is a functional block diagram showing a fourth embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the invention.
  • FIG. 16 is a functional block diagram showing a fifth embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the invention.
  • FIG. 17 is a flowchart showing an embodiment of an image processing method according to the present invention.
  • FIG. 18 is a flowchart showing detailed operations in step S12 of FIG.
  • FIG. 19 is an external view of a smartphone which is an embodiment of an imaging apparatus according to the present invention.
  • FIG. 20 is a block diagram illustrating a configuration of the smartphone.
  • FIG. 21 is a diagram used for explaining a problem of the conventional HDR synthesis.
  • FIG. 1 is a perspective view of an imaging apparatus according to the present invention as viewed obliquely from the front, and FIG. 2 is a rear view of the imaging apparatus.
  • the imaging apparatus 10 is a mirrorless digital single-lens camera including an interchangeable lens 100 and a camera body 200 to which the interchangeable lens 100 can be attached and detached.
  • A main body mount 260 to which the interchangeable lens 100 is attached, a finder window 20 of an optical finder, and the like are provided on the front surface of the camera body 200, while a shutter release switch 22, a shutter speed dial 23, an exposure correction dial 24, a power lever 25, and a built-in flash 30 are provided mainly on the top surface of the camera body 200.
  • As shown in FIG. 2, a liquid crystal monitor 216, an optical viewfinder eyepiece 26, a MENU/OK key 27, a cross key 28, a playback button 29, and the like are provided mainly on the back of the camera body 200.
  • The liquid crystal monitor 216 functions as a display unit that displays a live view image in the shooting mode, plays back captured images in the playback mode, displays various menu screens, and presents various information to the user.
  • The MENU/OK key 27 is an operation key having both a function as a menu button for instructing display of a menu on the screen of the liquid crystal monitor 216 and a function as an OK button for instructing confirmation and execution of a selection.
  • the cross key 28 is an operation unit for inputting instructions in four directions, up, down, left, and right, and functions as a multi-function key for selecting an item from the menu screen and instructing selection of various setting items from each menu.
  • The up and down keys of the cross key 28 function as a zoom switch during imaging or as a playback zoom switch in the playback mode, and the left and right keys function as frame advance (forward and reverse) buttons in the playback mode. The cross key also functions as an operation unit for designating an arbitrary subject for focus adjustment from among a plurality of subjects displayed on the liquid crystal monitor 216.
  • The imaging apparatus 10 can perform various shooting modes, including a continuous shooting mode in which still images are captured continuously, an HDR shooting mode, a kind of continuous shooting mode that captures the plurality of images used for the HDR composition that expands the dynamic range, and a moving image shooting mode that captures moving images.
  • the playback button 29 is a button for switching to a playback mode in which the recorded still image or moving image is displayed on the liquid crystal monitor 216.
  • FIG. 3 is a block diagram illustrating an embodiment of the internal configuration of the imaging apparatus 10.
  • The interchangeable lens 100, which functions as the imaging optical system of the imaging apparatus 10, is manufactured in conformity with the communication standard of the camera body 200 and can communicate with the camera body 200 as described later.
  • The interchangeable lens 100 includes an imaging optical system 102, a focus lens control unit 116, an aperture control unit 118, a lens side CPU (Central Processing Unit) 120, a flash ROM (Read Only Memory) 126, a lens side communication unit 150, and a lens mount 160.
  • the imaging optical system 102 of the interchangeable lens 100 includes a lens group 104 including a focus lens and a diaphragm 108.
  • the focus lens control unit 116 moves the focus lens in accordance with a command from the lens side CPU 120 and controls the position (focus position) of the focus lens.
  • the diaphragm control unit 118 controls the diaphragm 108 in accordance with a command from the lens side CPU 120.
  • the lens-side CPU 120 controls the interchangeable lens 100 as a whole, and includes a ROM 124 and a RAM (Random Access Memory) 122.
  • the flash ROM 126 is a nonvolatile memory that stores programs downloaded from the camera body 200.
  • the lens-side CPU 120 performs overall control of each part of the interchangeable lens 100 using the RAM 122 as a work area according to a control program stored in the ROM 124 or the flash ROM 126.
  • The lens side communication unit 150 communicates with the camera body 200 via a plurality of signal terminals (lens side signal terminals) provided on the lens mount 160 while the lens mount 160 is attached to the main body mount 260 of the camera body 200. That is, in accordance with commands from the lens side CPU 120, it transmits and receives request signals and response signals to and from the main body side communication unit 250 of the camera body 200 connected via the lens mount 160 and the main body mount 260 (bidirectional communication), and notifies the camera body 200 of the lens information (focus lens position information, focal length information, aperture information, and the like) of each optical member of the imaging optical system 102.
  • the interchangeable lens 100 also includes a detection unit (not shown) that detects focus lens position information and aperture information.
  • the aperture information is information indicating the aperture value (F value) of the aperture 108, the aperture diameter of the aperture 108, and the like.
  • The lens side CPU 120 stores various lens information, including the detected focus lens position information and aperture information, in the RAM 122. The lens information is detected when the camera body 200 requests it, when an optical member is driven, or at a fixed period (a period sufficiently shorter than the frame period of a moving image), and the detection result can be held.
  • [Camera body] As shown in FIG. 3, the camera body 200 includes an image sensor 201, an image sensor control unit 202, an analog signal processing unit 203, an A/D (Analog/Digital) converter 204, an image input controller 205, a digital signal processing unit 206, a RAM 207, a compression/decompression processing unit 208, a media control unit 210, a memory card 212, a display control unit 214, a liquid crystal monitor 216, a main body side CPU 220, an operation unit 222, a clock unit 224, a flash ROM 226, a ROM 228, an AF control unit 230, an AE control unit 232, a white balance correction unit 234, a wireless communication unit 236, a GPS receiving unit 238, a power supply control unit 240, a battery 242, a lens power switch 244, a main body side communication unit 250, a main body mount 260, the built-in flash 30 (FIG. 1) consisting of a flash light emitting unit 270 and a flash control unit 272, a focal-plane shutter (FPS) 280, and an FPS control unit 296.
  • the image sensor 201 is composed of a complementary metal-oxide semiconductor (CMOS) type color image sensor.
  • the image sensor 201 is not limited to the CMOS type, but may be an XY address type or a CCD (Charge Coupled Device) type image sensor.
  • FIG. 4 is a front view showing the configuration of the image sensor 201.
  • The image sensor 201 has a plurality of pixels configured by photoelectric conversion elements (light receiving cells) arranged two-dimensionally in the horizontal direction (x direction) and the vertical direction (y direction), and includes first and second pixels SA and SB configured to selectively receive the light flux incident through the interchangeable lens 100 after it is divided by the pupil dividing means described below.
  • In this example, the image sensor 201 employs a pupil image separation method using pupil imaging lenses (microlenses) 201A as the pupil dividing means.
  • In the image sensor 201, a plurality of (in this example, two) pixels are assigned to each microlens 201A, and the light incident on one microlens 201A is pupil-divided and received by the pair of first pixel SA and second pixel SB. The pupil image is separated according to the incidence angle of the light entering the microlens 201A and is received by the corresponding pair of first pixel SA and second pixel SB; the pair therefore corresponds to a pair of phase difference pixels usable for phase difference AF. Since the pixel value obtained by adding the pixel values of a pair of first pixel SA and second pixel SB is equal to the pixel value of a normal pixel that is not pupil-divided, the pair of first pixel SA and second pixel SB can also be handled as one normal pixel.
  • On each pair of first pixel SA and second pixel SB of the image sensor 201, a color filter of one of the three primary colors red (R), green (G), and blue (B) (an R filter, a G filter, or a B filter) is arranged according to a predetermined color filter array. The color filter array shown in FIG. 4 is a general Bayer array, but the color filter array is not limited to this and may be another array such as the X-Trans (registered trademark) array.
  • The optical image of the subject formed on the light receiving surface of the image sensor 201 by the imaging optical system 102 of the interchangeable lens 100 is converted into an electrical signal by the image sensor 201. Charge corresponding to the amount of incident light accumulates in each pixel of the image sensor 201 (the first pixel SA and the second pixel SB shown in FIG. 4), and an electrical signal corresponding to the accumulated charge amount (signal charge) is read out from the image sensor 201 as an image signal; the signals corresponding to the charges accumulated in the first pixel SA and the second pixel SB (the pixel values of the first pixel SA and the second pixel SB) can be read out independently.
  • The image sensor control unit 202 performs read control of the image signal from the image sensor 201 in accordance with commands from the main body side CPU 220. When a still image is captured, the exposure time is controlled by opening and closing the FPS 280, after which all lines of the image sensor 201 are read out with the FPS 280 closed. The image sensor 201 and the image sensor control unit 202 of this example can also perform the exposure operation sequentially for at least each line or each pixel (that is, sequentially reset each line or each pixel and accumulate charge).
  • the analog signal processing unit 203 performs various types of analog signal processing on an analog image signal obtained by imaging the subject with the image sensor 201.
  • the analog signal processing unit 203 includes a sampling hold circuit, a color separation circuit, an AGC (Automatic Gain Control) circuit, and the like.
  • The AGC circuit functions as a sensitivity adjustment unit that adjusts the sensitivity at the time of imaging (ISO sensitivity; ISO: International Organization for Standardization), adjusting the gain of an amplifier that amplifies the input image signal so that the signal level of the image signal falls within an appropriate range.
  • the A / D converter 204 converts the analog image signal output from the analog signal processing unit 203 into a digital image signal.
  • Image data (mosaic image data) for each RGB pixel, output via the image sensor 201, the analog signal processing unit 203, and the A/D converter 204 when a still image or moving image is captured, is temporarily stored in the RAM 207 by the image input controller 205. When the image sensor 201 is a CMOS type image sensor, the analog signal processing unit 203 and the A/D converter 204 are often built into the image sensor 201.
  • the digital signal processing unit 206 performs various types of digital signal processing on the image data stored in the RAM 207.
  • The digital signal processing unit 206 reads image data stored in the RAM 207 as appropriate and applies digital signal processing to it, such as offset processing, gain control processing including sensitivity correction, gamma correction processing, demosaicing (also called synchronization processing), and RGB/YCrCb conversion processing, and stores the processed image data in the RAM 207 again. Demosaicing is the process of calculating all RGB color information for each pixel from the RGB mosaic image associated with the color filter array of a single-chip image sensor; for an image sensor with RGB color filters, it generates three planes of RGB image data from the mosaic data (dot-sequential RGB data). RGB/YCrCb conversion is the process of converting the demosaiced RGB data into luminance data (Y) and color difference data (Cr, Cb).
  • When recording a still image or moving image, the compression/decompression processing unit 208 compresses the uncompressed luminance data Y and color difference data Cb, Cr stored in the RAM 207: a still image is compressed in, for example, the JPEG (Joint Photographic Experts Group) format, and a moving image in, for example, the H.264 format.
  • the image data compressed by the compression / decompression processing unit 208 is recorded on the memory card 212 via the media control unit 210.
  • the compression / decompression processing unit 208 performs decompression processing on the compressed image data obtained from the memory card 212 via the media control unit 210 in the playback mode, and generates uncompressed image data.
  • the media control unit 210 performs control to record the image data compressed by the compression / decompression processing unit 208 in the memory card 212. In addition, the media control unit 210 performs control for reading compressed image data from the memory card 212.
  • the display control unit 214 performs control to display uncompressed image data stored in the RAM 207 on the liquid crystal monitor 216.
  • The liquid crystal monitor 216 is configured by a liquid crystal display device, but a display device such as an organic electroluminescence display may be used instead. When a live view image is displayed on the liquid crystal monitor 216, digital image signals continuously generated by the digital signal processing unit 206 are temporarily stored in the RAM 207.
  • the display control unit 214 converts the digital image signal temporarily stored in the RAM 207 into a display signal format and sequentially outputs it to the liquid crystal monitor 216. Thereby, the captured image is displayed on the liquid crystal monitor 216 in real time, and the liquid crystal monitor 216 can be used as an electronic viewfinder.
  • the shutter release switch 22 is an imaging instruction unit for inputting an imaging instruction of a still image or a moving image, and is configured by a two-stage stroke type switch composed of so-called “half press” and “full press”.
  • An S1 ON signal is output when the shutter release switch 22 is half-pressed, and an S2 ON signal is output when it is subsequently fully pressed. When the S1 ON signal is output, the main body side CPU 220 executes shooting preparation processing such as AF control (automatic focus adjustment) and AE control (automatic exposure control); when the S2 ON signal is output, it executes still image shooting processing and recording processing. Needless to say, AF control and AE control are performed automatically when the auto mode is set by the operation unit 222, and are not performed when the manual mode is set.
  • In the moving image capturing mode, when the shutter release switch 22 is fully pressed and an S2 ON signal is output, the camera body 200 enters a moving image recording mode in which recording and image processing of the moving image are started; when the shutter release switch 22 is fully pressed again and an S2 ON signal is output, the camera body 200 enters a standby state and temporarily stops the moving image recording processing.
  • The shutter release switch 22 is not limited to a two-stage stroke type switch with half-press and full-press; it may be a pair of switches each of which outputs the S1 ON signal or the S2 ON signal in a single operation. Where a touch panel serves as the operation means, the operation instruction may be output by touching the area of the screen corresponding to the instruction; the form of the operation means is not limited to these as long as the preparation processing and the imaging processing are instructed.
  • A still image or moving image acquired by imaging is compressed by the compression/decompression processing unit 208, and the compressed image data is converted into an image file in which required attached information such as the imaging date and time, GPS information, and imaging conditions (F value, shutter speed, ISO sensitivity, and the like) is added to a header, and is then stored in the memory card 212 via the media control unit 210.
  • The main body side CPU 220 controls the overall operation of the camera body 200 and the driving of the optical members of the interchangeable lens 100; based on inputs from the operation unit 222, including the shutter release switch 22, it controls each part of the camera body 200 and the interchangeable lens 100.
  • the clock unit 224 measures time based on a command from the main body CPU 220 as a timer.
  • the clock unit 224 measures the current date and time as a calendar.
  • the flash ROM 226 is a non-volatile memory that can be read and written, and stores setting information.
  • The ROM 228 stores the camera control program executed by the main body side CPU 220, a program according to the present invention for executing imaging in the shooting mode for composition, defect information of the image sensor 201, and various parameters and tables used for image processing and the like. The main body side CPU 220 controls each part of the camera body 200 and the interchangeable lens 100, using the RAM 207 as a work area, in accordance with the camera control program or the above program stored in the ROM 228.
  • The AF control unit 230, functioning as an automatic focus adjustment unit, calculates the defocus amount necessary for phase difference AF control and, based on the calculated defocus amount, notifies the interchangeable lens 100, via the main body side CPU 220 and the main body side communication unit 250, of the position (focus position) command to which the focus lens should be moved.
  • the AF control unit 230 includes a phase difference detection unit and a defocus amount calculation unit.
  • The phase difference detection unit acquires pixel data (first pixel values and second pixel values) from a first pixel group composed of first pixels SA and a second pixel group composed of second pixels SB on which color filters of the same color are arranged, within the AF area of the image sensor 201 (the area of the main subject specified by the user, the area of the main subject automatically detected by face detection or the like, or an area set by default), and detects the phase difference based on the first pixel values and the second pixel values. The phase difference can be calculated from the shift amount in the pupil division direction between the first pixel values and the second pixel values when the correlation between the plurality of first pixel values of the first pixel group and the plurality of second pixel values of the second pixel group is maximized (when the integrated absolute difference between the plurality of first pixel values and the plurality of second pixel values is minimized).
  • The defocus amount calculation unit calculates the defocus amount by multiplying the phase difference detected by the phase difference detection unit by a coefficient corresponding to the current F value (ray angle) of the interchangeable lens 100. The focus lens position command corresponding to the defocus amount calculated by the AF control unit 230 is notified to the interchangeable lens 100, and the lens side CPU 120 of the interchangeable lens 100, on receiving the command, moves the focus lens via the focus lens control unit 116 and controls the position (focus position) of the focus lens.
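A sketch of the phase-difference detection and defocus conversion described above, using a summed-absolute-difference search; the names and the normalization are assumptions, and max_shift must be smaller than the row length.

```python
import numpy as np

def defocus_amount(first, second, max_shift, k_f):
    """Find the shift that minimizes the summed absolute difference
    between the two pupil-divided pixel rows (first pixel values and
    second pixel values), then scale by the F-value-dependent
    coefficient k_f: defocus = phase difference x coefficient."""
    n = len(first)
    best_shift, best_sad = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        a = first[max(0, shift):n + min(0, shift)]
        b = second[max(0, -shift):n + min(0, -shift)]
        sad = float(np.abs(a - b).sum()) / len(a)  # normalize by overlap length
        if sad < best_sad:
            best_sad, best_shift = sad, shift
    return k_f * best_shift
```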
  • The AE control unit 232 detects the brightness of the subject (subject luminance) and calculates the numerical value (exposure value (EV value)) corresponding to the subject luminance that is necessary for AE control and AWB (Auto White Balance) control.
  • the AE control unit 232 calculates an EV value based on the brightness of the image acquired via the image sensor 201, the shutter speed when the brightness of the image is acquired, and the F value.
  • the main body side CPU 220 can determine the F value, shutter speed, and ISO sensitivity from a predetermined program diagram based on the EV value obtained from the AE control unit 232, and perform AE control.
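A sketch of the EV calculation using the APEX relations; the mid-grey target and the ISO handling are assumptions, as the text states only that the image brightness, shutter speed, and F value enter the calculation.

```python
import math

def scene_ev(f_number, shutter_s, mean_luma, target_luma=0.18, iso=100):
    """EV implied by the settings in force, shifted by how far the
    measured mean image brightness sits from a mid-grey target."""
    ev_settings = math.log2(f_number ** 2 / shutter_s)  # APEX: AV + TV
    ev_settings -= math.log2(iso / 100)                 # normalize to ISO 100
    return ev_settings + math.log2(mean_luma / target_luma)
```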
  • The white balance correction unit 234 calculates white balance (WB) gains Gr, Gg, and Gb for the respective color data of the RGB data (R data, G data, and B data), and performs white balance correction by multiplying the R data, G data, and B data by the calculated WB gains Gr, Gg, and Gb, respectively. As a method of calculating the WB gains Gr, Gg, and Gb, a method is conceivable in which the type of light source illuminating the subject is identified based on scene recognition (outdoor/indoor determination and the like) using the brightness (EV value) of the subject, the color temperature of the ambient light, and so on, and the WB gain corresponding to the identified light source type is read out from a storage unit in which an appropriate WB gain is stored in advance for each light source type; other known methods that use at least the EV value to determine Gr, Gg, and Gb are also conceivable.
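The storage-lookup method can be sketched as a simple table; the light source labels and gain values below are purely illustrative, since real gains are calibrated per camera.

```python
WB_GAINS = {  # light source -> (Gr, Gg, Gb); illustrative values only
    "daylight": (1.9, 1.0, 1.6),
    "shade":    (2.2, 1.0, 1.4),
    "tungsten": (1.3, 1.0, 2.4),
}

def apply_wb(r, g, b, light_source):
    """Multiply each color plane by the gains stored for the light
    source identified by scene recognition."""
    gr, gg, gb = WB_GAINS[light_source]
    return r * gr, g * gg, b * gb
```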
  • The wireless communication unit 236 performs short-range wireless communication of standards such as Wi-Fi (Wireless Fidelity, registered trademark) and Bluetooth (registered trademark), transmitting and receiving necessary information to and from peripheral digital devices (mobile terminals such as smartphones).
  • The GPS receiving unit 238 receives GPS signals transmitted from a plurality of GPS satellites in accordance with instructions from the main body side CPU 220, executes positioning calculation processing based on the received GPS signals, and acquires GPS information consisting of the latitude, longitude, and altitude of the camera body 200. The acquired GPS information can be recorded in the header of the image file as attached information indicating the imaging position of the captured image.
  • the power supply control unit 240 applies power supply voltage supplied from the battery 242 to each unit of the camera main body 200 in accordance with a command from the main body side CPU 220.
  • the power supply control unit 240 supplies the power supply voltage supplied from the battery 242 to each unit of the interchangeable lens 100 via the main body mount 260 and the lens mount 160 in accordance with a command from the main body side CPU 220.
  • the lens power switch 244 switches on and off the power supply voltage applied to the interchangeable lens 100 via the main body mount 260 and the lens mount 160 and switches the level in accordance with a command from the main body side CPU 220.
  • the main body side communication unit 250 transmits / receives a request signal and a response signal to / from the lens side communication unit 150 of the interchangeable lens 100 connected via the main body mount 260 and the lens mount 160 in accordance with a command from the main body side CPU 220 ( (Bidirectional communication).
  • the main body mount 260 is provided with a plurality of terminals 260A as shown in FIG. 1, and when the interchangeable lens 100 is attached to the camera main body 200 (the lens mount 160 and the main body mount 260 are connected), the main body A plurality of terminals 260A (FIG. 1) provided on the mount 260 and a plurality of terminals (not shown) provided on the lens mount 160 are electrically connected, and the main body side communication unit 250 and the lens side communication unit 150 are connected. Bi-directional communication is possible.
  • the built-in flash 30 (FIG. 1) is, for example, a TTL (Through The Lens) automatic dimming flash, and includes a flash light emitting unit 270 and a flash control unit 272.
  • the flash control unit 272 has a function of adjusting the light emission amount (guide number) of the flash light emitted from the flash light emitting unit 270. That is, in synchronization with a flash imaging instruction from the main body side CPU 220, the flash control unit 272 causes the flash light emitting unit 270 to pre-flash (dimming emission) a small amount of flash light, determines the amount of flash light to be actually emitted based on the reflected light (including ambient light) incident via the imaging optical system 102 of the interchangeable lens 100, and then causes the flash light emitting unit 270 to emit the determined amount of flash light (main emission). Note that when the HDR imaging mode is selected, the flash control unit 272 performs light emission control different from that for normal flash imaging; details will be described later.
  • the FPS 280 constitutes a mechanical shutter of the imaging apparatus 10 and is disposed immediately before the image sensor 201.
  • the FPS control unit 296 controls the opening and closing of the front and rear curtains of the FPS 280 based on input information (S2 ON signal, shutter speed, etc.) from the main body side CPU 220, thereby controlling the exposure time (shutter speed) of the image sensor 201.
  • FIG. 5 is a functional block diagram showing a first embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention, mainly when image processing (HDR processing) for HDR synthesis based on a plurality of images for HDR synthesis captured in the HDR imaging mode is executed.
  • when performing imaging in the HDR imaging mode, the main body side CPU 220 mainly controls the imaging in the HDR imaging mode and, as shown in FIG. 5, functions as a distance distribution information acquisition unit 220A, a light intensity distribution information calculation unit 220B, a composition ratio determination unit 220C, and an image composition unit 220D.
  • when performing imaging in the HDR imaging mode, the main body side CPU 220 causes a plurality (two in this example) of non-light-emitting images to be captured under different exposure conditions without flash emission, and causes a light-emitting image to be captured under flash emission under the same exposure condition as any one of the plurality of non-light-emitting images.
  • the plurality of exposure conditions corresponding to the plurality of non-light-emitting images include an underexposure condition and an overexposure condition relative to the appropriate exposure condition.
  • that is, the main body side CPU 220 causes two non-light-emitting images to be captured under the two exposure conditions of underexposure and overexposure with respect to the appropriate exposure determined according to the EV value calculated by the AE control unit 232.
  • the exposure correction values (for example, ±3 EV) for underexposure and overexposure with respect to the appropriate exposure may be set arbitrarily by the user, or may be set automatically according to the brightness (EV value) of the object scene.
  • further, the main body side CPU 220 causes the light-emitting image to be captured under flash emission under the same exposure condition (in this example, the underexposure condition) as one of the plurality of non-light-emitting images.
  • the two non-light-emitting images and the one light-emitting image are images captured of the same subject, and are preferably continuously shot images captured with as short an imaging interval as possible.
  • the image acquisition unit 221 is a part that acquires the non-light-emitting image (first non-light-emitting image) captured by the imaging unit (the interchangeable lens 100 and the image sensor 201) under underexposure without flash emission, the non-light-emitting image (second non-light-emitting image) captured under overexposure, and the light-emitting image captured under flash emission under the same exposure condition as the first non-light-emitting image.
  • the image acquisition unit 221 may acquire the first non-light-emitting image, the second non-light-emitting image, and the light-emitting image temporarily stored in the RAM 207 via the image input controller 205 from the RAM 207, or may directly acquire the first non-light-emitting image, second non-light-emitting image, and light-emitting image captured by the imaging unit via the image input controller 205.
  • in this embodiment, the image input controller 205 adds the pixel values of the paired first pixel SA and second pixel SB of the image sensor 201 shown in FIG. 4 to obtain image data having one pixel value per microlens 201A, and the first non-light-emitting image, the second non-light-emitting image, and the light-emitting image are acquired in this form. In addition, a first image composed of the first pixels SA and a second image composed of the second pixels SB are phase difference images having a phase difference according to the subject distance.
  • the distance distribution information acquisition unit 220A acquires the first image and the second image, which are phase difference images having a mutual phase difference, from the image acquisition unit 221, and, based on the first image and the second image, acquires distance distribution information indicating the distribution of the distance of the subject in the image.
  • specifically, the distance distribution information acquisition unit 220A calculates the shift amount (phase difference) between each feature point in the first image and the corresponding feature point in the second image, calculates the defocus amount of each feature point based on the phase difference and the current F value of the diaphragm 108, and calculates the distance of each feature point (subject) from the calculated defocus amount of each feature point and the current lens position of the interchangeable lens 100 (the focus lens of the imaging optical system 102).
  • a feature point with a defocus amount of zero is at the in-focus distance on which the current focus lens position is focused, and as the defocus amount of each feature point shifts toward the front-focus side or the back-focus side, the distance of that feature point moves from the in-focus distance toward the far side or the near side, respectively.
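  • A minimal sketch of the defocus-to-distance step, using the plain thin-lens relation 1/f = 1/u + 1/v rather than the device's actual calibration; the sign convention, units, and function names are assumptions for illustration:

```python
def subject_distance(defocus_mm, focus_distance_mm, focal_length_mm):
    """Estimate subject distance from a signed defocus amount via the
    thin-lens equation 1/f = 1/u + 1/v (all values in millimetres).

    defocus_mm > 0 is taken here as a shift of the image plane away
    from the lens; this sign convention is an assumption.
    """
    f = focal_length_mm
    # Image distance v0 for the plane currently in focus at focus_distance_mm.
    v0 = 1.0 / (1.0 / f - 1.0 / focus_distance_mm)
    v = v0 + defocus_mm                 # shifted image plane
    return 1.0 / (1.0 / f - 1.0 / v)    # corresponding subject distance

# A feature point with zero defocus sits at the in-focus distance (2 m here):
print(subject_distance(0.0, focus_distance_mm=2000.0, focal_length_mm=50.0))
```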
  • the light intensity distribution information calculation unit 220B acquires, from the image acquisition unit 221, the light-emitting image and the first non-light-emitting image captured under the same exposure condition, and acquires, from the distance distribution information acquisition unit 220A, the distance distribution information indicating the distribution of the distance of the subject in the image. Then, based on the acquired light-emitting image, first non-light-emitting image, and distance distribution information, it calculates light intensity distribution information, which is distribution information of an index indicating the intensity of the light (not the flash light but the ambient light) striking the subject.
  • let i be a parameter indicating the coordinates of each pixel of the first non-light-emitting image, Yi be the luminance value of the pixel i, YLi be the luminance value of the pixel i at the same coordinates of the light-emitting image, and di be the distance information of the pixel i obtained from the distance distribution information. The light intensity distribution information Si is then expressed by the following equation:

[Equation 1]  Si = Yi / {(YLi − Yi) × di²}
  • the parameter i is not limited to the parameter indicating the pixel coordinates, and may be a parameter indicating the position of each region obtained by dividing the image into a plurality of regions.
  • in that case, the luminance values Yi and YLi, the distance information di, and the light intensity distribution information Si are likewise the luminance value (representative luminance value), distance information, and light intensity distribution information for each region.
  • here, the luminance value Yi of the pixel i of the first non-light-emitting image is represented by the following equation, where Bi is the intensity of the ambient light striking the subject, Ri is the reflectance of the subject, and K is the photoelectric conversion coefficient of the image sensor 201:

[Equation 2]  Yi = Bi × Ri × K

Since the flash component (YLi − Yi) is likewise proportional to the subject reflectance Ri and attenuates with the square of the distance di, dividing Yi by (YLi − Yi) × di² cancels the reflectance, so that Si serves as an index of the intensity of the ambient light striking the subject that does not depend on the reflectance.
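  • A minimal NumPy sketch of the per-pixel computation of Equation 1; the epsilon guard for pixels the flash barely reaches, where YLi − Yi approaches 0, is an assumption added for numerical safety and is not in the patent text:

```python
import numpy as np

def light_intensity_distribution(y_non, y_flash, d, eps=1e-6):
    """Per-pixel index Si = Yi / ((YLi - Yi) * di^2) from Equation 1.

    y_non:   luminance of the underexposed non-flash image (Yi)
    y_flash: luminance of the flash image at the same exposure (YLi)
    d:       distance map (di), same shape as the luminance arrays
    """
    flash_component = np.maximum(y_flash - y_non, eps)  # guard near-zero flash
    return y_non / (flash_component * d ** 2)
```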
  • the composition ratio determination unit 220C determines the composition ratio of the plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image) based on the light intensity distribution information. Since a portion where the value of the light intensity distribution information Si is large is struck by strong light, it is preferable to use the first non-light-emitting image, which is the under image, there; since a portion where the value of Si is small is struck by little light, it is preferable to use the second non-light-emitting image, which is the over image, there.
  • when determining the composition ratio for each pixel corresponding to each of the plurality of non-light-emitting images, the composition ratio determination unit 220C makes, for a pixel with high light intensity according to the light intensity distribution information Si, the mixing ratio of the first non-light-emitting image (captured under the exposure condition with the smallest exposure among the plurality of non-light-emitting images) larger than the mixing ratio of the second non-light-emitting image (captured under the exposure condition with the largest exposure); for a pixel with low light intensity, it makes the mixing ratio of the second non-light-emitting image larger than the mixing ratio of the first non-light-emitting image.
  • the composition ratio may be determined for each region obtained by dividing the image into a plurality of regions.
  • FIGS. 6 and 7 are graphs each showing the relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-light-emitting image. Note that the mixing ratio of the first non-light-emitting image is (1 − αi), and the composition ratio of the first non-light-emitting image to the second non-light-emitting image is (1 − αi) : αi.
  • as shown in FIG. 6, the composition ratio determination unit 220C sets a minimum value αmin (> 0) and a maximum value αmax (< 1) for the mixing ratio αi, and, in the range of the light intensity distribution information Si between a minimum value Smin and a maximum value Smax, determines the mixing ratio αi by linear interpolation between the maximum value αmax and the minimum value αmin, thereby determining the composition ratio of the first non-light-emitting image and the second non-light-emitting image.
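  • The linear mapping of FIG. 6 can be sketched as follows; the default αmin/αmax values are illustrative assumptions:

```python
import numpy as np

def mixing_ratio(s, s_min, s_max, a_min=0.1, a_max=0.9):
    """Mixing ratio alpha_i of the overexposed second non-emission image.

    alpha decreases linearly from a_max (dim ambient light) to a_min
    (strong ambient light) as Si goes from s_min to s_max, and is held
    at those extremes outside that range (cf. FIG. 6).
    """
    return np.interp(s, [s_min, s_max], [a_max, a_min])
```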
  • the image composition unit 220D composes the underexposed first non-light-emitting image and the overexposed second non-light-emitting image in accordance with the composition ratio determined by the composition ratio determination unit 220C, and generates a composite image (HDR image) with an enlarged dynamic range. That is, the image composition unit 220D combines the first non-light-emitting image and the second non-light-emitting image by applying the composition ratio determined for each corresponding pixel of the two images.
  • when the composition ratio is determined for each region obtained by dividing the image into a plurality of regions, the composition ratio determined for each corresponding region of the first non-light-emitting image and the second non-light-emitting image is applied, and the two images are combined to generate the HDR image.
  • the HDR image generated in this way is an HDR image with excellent gradation characteristics in which the contrast between light and dark is adjusted according to the intensity of the light striking the subject, without losing the contrast of the subject.
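  • A sketch of the per-pixel composition at the ratio (1 − αi) : αi; the linear-light inputs and the exposure-gap gain used to bring the two frames onto a common scale are assumptions, since the patent does not specify the normalization:

```python
import numpy as np

def hdr_blend(under, over, alpha, ev_gap=2.0):
    """Blend the underexposed first image and overexposed second image.

    Both inputs are linear-light float arrays. The under image is gained
    up by the exposure gap (2**ev_gap) so both frames describe the same
    scene radiance before mixing; ev_gap is an illustrative value.
    """
    under_matched = under * (2.0 ** ev_gap)      # match the over exposure
    return (1.0 - alpha) * under_matched + alpha * over
```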
  • FIG. 9 is a functional block diagram showing a second embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention.
  • parts that are the same as those in the first embodiment shown in FIG. 5 are given the same reference numerals, and detailed descriptions thereof are omitted.
  • the main body side CPU 220 of the second embodiment shown in FIG. 9 is different from the first embodiment shown in FIG. 5 in that the function of the distribution information creation unit 220E is added.
  • the distribution information creation unit 220E creates distribution information (for example, a frequency distribution chart or a frequency distribution table) indicating the frequency distribution of the light intensity, based on the light intensity distribution information Si calculated by the light intensity distribution information calculation unit 220B.
  • FIG. 10 is a frequency distribution diagram showing an example of the frequency distribution of light intensity.
  • outdoors on a sunny day, the subject to be imaged has a wide dynamic range, ranging from dark subjects not struck by sunlight (shaded subjects) to bright subjects struck by sunlight (sunlit subjects); in this case, the subjects to be imaged can be roughly divided into shaded subjects and sunlit subjects.
  • in the light intensity frequency distribution diagram created based on the light intensity distribution information of such a scene, two peaks (a first vertex P1 and a second vertex P2) exist, as shown in FIG. 10.
  • the first vertex P1 is the vertex corresponding to the frequency of the light intensity distribution information of the subject region corresponding to the shaded subjects, and the second vertex P2 is the vertex corresponding to the frequency of the light intensity distribution information of the subject region corresponding to the sunlit subjects.
  • the distribution information creation unit 220E may create distribution information indicating the frequency distribution of the light intensity as it is, or may create distribution information obtained by approximating the frequency distribution with a curve.
  • the composition ratio determination unit 220C shown in FIG. 9 obtains, based on the distribution information created by the distribution information creation unit 220E, a first light intensity (Sdark) corresponding to the first vertex P1 with a high light intensity frequency and a second light intensity (Sbright) corresponding to the second vertex P2 with a high light intensity frequency, as shown in FIG. 10.
  • when determining the composition ratio for each pixel corresponding to each of the first non-light-emitting image and the second non-light-emitting image, the composition ratio determination unit 220C, as shown in FIG. 11, sets the mixing ratio αi of the second non-light-emitting image to the maximum value αmax when the light intensity is equal to or less than the first light intensity (Sdark), sets it to the minimum value αmin when the light intensity is equal to or greater than the second light intensity (Sbright), and linearly decreases (monotonically decreases) αi between the maximum value αmax and the minimum value αmin when the light intensity is greater than the first light intensity (Sdark) and smaller than the second light intensity (Sbright).
  • a pixel or region with a light intensity equal to or less than the first light intensity (Sdark) is considered to be in shade, so it is preferable to set the mixing ratio αi of the overexposed second non-light-emitting image to the maximum value αmax; a pixel or region with a light intensity equal to or greater than the second light intensity (Sbright) is considered to be in sunlight, so it is preferable to set the mixing ratio αi of the overexposed second non-light-emitting image to the minimum value αmin (that is, to maximize the mixing ratio of the underexposed first non-light-emitting image). The gradation of the highlight portions and the shadow portions can thereby be enriched.
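  • A sketch of this two-peak variant: Sdark and Sbright are estimated as the two strongest local maxima of the Si histogram, then the piecewise-linear mapping of FIG. 11 is applied. The crude peak picker (which assumes a bimodal histogram and no smoothing) and the default values are assumptions:

```python
import numpy as np

def two_peak_mixing_ratio(s, a_min=0.1, a_max=0.9, bins=64):
    """Estimate Sdark/Sbright from the Si histogram and map to alpha_i."""
    hist, edges = np.histogram(s, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Interior local maxima; keep the two most frequent, in intensity order.
    peaks = [i for i in range(1, bins - 1)
             if hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1]]
    top2 = sorted(sorted(peaks, key=lambda i: hist[i])[-2:])
    s_dark, s_bright = centers[top2[0]], centers[top2[1]]
    # a_max at or below Sdark, a_min at or above Sbright, linear in between.
    return np.interp(s, [s_dark, s_bright], [a_max, a_min])
```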
  • FIG. 12 is a frequency distribution diagram similar to the frequency distribution of the light intensity shown in FIG. 10.
  • the composition ratio determination unit 220C shown in FIG. 9 obtains, based on the distribution information created by the distribution information creation unit 220E, the light intensity (St) corresponding to the bottom point V where the light intensity frequency is low, as shown in FIG. 12.
  • when determining the composition ratio for each pixel corresponding to each of the first non-light-emitting image and the second non-light-emitting image, the composition ratio determination unit 220C uses the light intensity (St) as a threshold value, as shown in FIG. 13: when the light intensity is equal to or less than the threshold value (St), the mixing ratio αi of the second non-light-emitting image is set to the maximum value αmax, and when the light intensity exceeds the threshold value (St), the mixing ratio αi of the second non-light-emitting image is set to the minimum value αmin (the mixing ratio of the underexposed first non-light-emitting image is maximized).
  • note that setting the mixing ratio of the second non-light-emitting image according to the light intensity includes the case where the mixing ratio is changed continuously between the maximum value αmax and the minimum value αmin.
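  • A sketch of the threshold variant of FIGS. 12 and 13, taking St as the interior histogram bin with the lowest frequency; this simple global-minimum search is a simplifying assumption for finding the bottom point V:

```python
import numpy as np

def valley_mixing_ratio(s, a_min=0.1, a_max=0.9, bins=64):
    """Binary alpha_i: a_max at or below the valley intensity St, a_min above."""
    hist, edges = np.histogram(s, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    st = centers[np.argmin(hist[1:-1]) + 1]  # interior bin with lowest frequency
    return np.where(s <= st, a_max, a_min)
```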
  • FIG. 14 is a functional block diagram showing a third embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention.
  • the same reference numerals are given to the portions common to the first embodiment shown in FIG. 5, and the detailed description thereof is omitted.
  • the main body side CPU 220 of the third embodiment shown in FIG. 14 is different from the first embodiment shown in FIG. 5 in that the function of the luminance distribution information calculation unit 220F is added.
  • the luminance distribution information calculation unit 220F is a part that calculates luminance distribution information indicating the luminance distribution in the image of at least one non-light-emitting image among the plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image).
  • the luminance distribution information can be created, for example, as an image made up of luminance signals of the first non-emission image, or as information indicating the luminance (representative luminance) for each area into which the image is divided.
  • the composition ratio determination unit 220C shown in FIG. 14 determines the composition ratio of the plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image) based on the light intensity distribution information calculated by the light intensity distribution information calculation unit 220B and the luminance distribution information calculated by the luminance distribution information calculation unit 220F.
  • when determining the composition ratio for each pixel or region corresponding to each of the first non-light-emitting image and the second non-light-emitting image, with i being a parameter indicating the position of the pixel or region, the composition ratio determination unit 220C determines the mixing ratio αi of the second non-light-emitting image based on both the light intensity distribution information and the luminance distribution information; the mixing ratio of the first non-light-emitting image is (1 − αi), and the composition ratio of the first non-light-emitting image to the second non-light-emitting image is (1 − αi) : αi.
  • FIG. 15 is a functional block diagram showing a fourth embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention.
  • the same reference numerals are given to the portions common to the third embodiment shown in FIG. 14, and the detailed description thereof is omitted.
  • the main body side CPU 220 of the fourth embodiment shown in FIG. 15 is different from the third embodiment shown in FIG. 14 in that the function of the flash light non-reaching pixel detection unit 220G is added.
  • the flash light non-reaching pixel detection unit 220G calculates the difference (YLi − Yi) between the luminance value Yi of the pixel i of the first non-light-emitting image and the luminance value YLi of the pixel i at the same coordinates of the light-emitting image, and detects a pixel i for which the difference (YLi − Yi) is 0 as a flash light non-reaching pixel.
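  • A sketch of this detection; the patent tests for a difference of exactly 0, and the optional tolerance for sensor noise is an assumption:

```python
import numpy as np

def flash_non_reaching_mask(y_non, y_flash, tol=0.0):
    """Boolean mask of pixels where the flash added no luminance.

    True where (YLi - Yi) <= tol; tol=0.0 reproduces the exact test,
    while a small positive tol absorbs sensor noise.
    """
    return (y_flash - y_non) <= tol
```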
  • the composition ratio determination unit 220C shown in FIG. 15 determines the composition ratio of the plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image) based on the light intensity distribution information calculated by the light intensity distribution information calculation unit 220B and the luminance distribution information calculated by the luminance distribution information calculation unit 220F.
  • however, when the flash light non-reaching pixel detection unit 220G detects a flash light non-reaching pixel, the composition ratio for that flash light non-reaching pixel is determined based only on the luminance distribution information.
  • that is, the composition ratio determination unit 220C calculates the mixing ratio αi of the second non-light-emitting image based on the light intensity distribution information and the luminance distribution information using equation [6]; however, since the light intensity distribution information cannot be calculated for a pixel that the flash light does not reach, the mixing ratio component based on the light intensity distribution information in equation [6] is set to 1 for such a pixel, so that the mixing ratio αi of the second non-light-emitting image is obtained only from the mixing ratio based on the luminance distribution information.
  • the flash light non-reaching pixel detection unit 220G is not limited to detecting flash light non-reaching pixels; when the image is divided into a plurality of regions, it may function as a detection unit that detects flash light non-reaching regions.
  • the flash light non-reaching pixel detection unit 220G is also used when the flash control unit 272 controls the light emission amount of the flash light emitted from the flash light emitting unit 270. Details of the light emission amount control by the flash control unit 272 will be described later.
  • FIG. 16 is a functional block diagram showing a fifth embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention.
  • the same reference numerals are given to the portions common to the first embodiment shown in FIG. 5, and detailed description thereof is omitted.
  • the main body side CPU 220 of the fifth embodiment shown in FIG. 16 does not include the distance distribution information acquisition unit 220A of the first embodiment, and instead acquires distance distribution information from an external distance distribution information acquisition unit 223.
  • the distance distribution information acquisition unit 223 can be configured by, for example, a distance image sensor in which a plurality of light receiving elements are arranged two-dimensionally.
  • a distance image sensor one that obtains a distance image by a TOF (Time Of Flight) method is conceivable.
  • the TOF method is a method for obtaining the distance to the subject by irradiating the subject with light and measuring the time until the reflected light is received by the sensor.
  • known methods include a method of obtaining a distance image of the subject from the amount of light received (light reception intensity) at each pixel of the distance image sensor, and a method of acquiring a distance image by irradiating the subject with light modulated at a high frequency and detecting the phase shift from the time of irradiation until the reflected light is received.
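  • For the direct variant, the distance follows from the round-trip time alone, as in this small sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_seconds):
    """Direct time-of-flight: light travels to the subject and back,
    so distance is half the round trip times the speed of light."""
    return C * round_trip_seconds / 2.0

# A ~6.7 ns round trip corresponds to roughly 1 m.
print(tof_distance(6.67e-9))
```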
  • as the distance distribution information acquisition unit 223, a three-dimensional laser measuring instrument, a stereo camera, or the like can also be applied.
  • in this case, the image sensor 201 is not limited to an image sensor capable of distance measurement as in this example, and may be an image sensor that cannot measure distance.
  • FIG. 17 is a flowchart showing an embodiment of an image processing method according to the present invention.
  • the operation of the HDR process performed by the main body side CPU 220 having the functions of the units illustrated in FIG. 5 and the like will be mainly described.
  • the distance distribution information acquisition unit 220A of the main body side CPU 220 acquires distance distribution information indicating the distribution of the distance of the subject in the image (step S10).
  • the distance distribution information acquisition unit 220A acquires a first image and a second image that are phase difference images having a phase difference from each other from the image acquisition unit 221, and based on the first image and the second image. Then, distance distribution information indicating the distribution of the distance of the subject in the image is acquired.
  • the image acquisition unit 221 acquires a plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image) of the same subject captured under different exposure conditions without flash emission, and a light-emitting image captured under flash emission under the same exposure condition as one of the plurality of non-light-emitting images (in this example, the underexposure condition of the first non-light-emitting image) (step S12).
  • the detailed operation in step S12 will be described with reference to the flowchart shown in FIG. 18.
  • first, the flash control unit 272 determines, based on the distance distribution information acquired in step S10, the dimming light emission amount to be emitted from the flash light emitting unit 270 for capturing a dimming image (step S100).
  • for example, the dimming light emission amount is calculated from the distance distribution as the maximum light emission amount that does not cause whiteout, assuming that the average reflectance of the subjects within the angle of view is 18%.
  • the flash control unit 272 causes the flash light emitting unit 270 to emit flash light with the calculated dimming light emission amount, and the main body side CPU 220 acquires a dimming image captured under the same exposure condition (underexposure) as the first non-light-emitting image under the dimming emission (step S102).
  • the main body side CPU 220 determines whether there is a pixel or a region where the flash light has not reached, based on the detection output of the flash light non-reaching pixel detection unit 220G (FIG. 15) (step S104).
  • the flash light non-reaching pixel detection unit 220G acquires a frame image (non-light-emitting image) of the live view image captured under the same exposure condition as the dimming image, compares the dimming image with the non-light-emitting image, and detects flash light non-reaching pixels or regions.
  • if it is determined in step S104 that there is no pixel or region the flash light has not reached (in the case of "No"), the main body side CPU 220 determines whether or not there is a whiteout region in the dimming image (step S106). If there is a region of the maximum pixel value (255 when the pixel value is represented by 8 bits) in the dimming image, that region can be determined to be a whiteout region.
  • if it is determined in step S106 that there is no whiteout region in the dimming image (in the case of "No"), the dimming image is acquired as the light-emitting image (step S108). In this case, there is no need to capture a light-emitting image again, and the number of captured images can be reduced.
  • if it is determined in step S106 that there is a whiteout region in the dimming image (in the case of "Yes"), the flash control unit 272 calculates the maximum light emission amount at which the whiteout region does not white out, and sets the calculated maximum light emission amount as the main light emission amount for acquiring the light-emitting image (step S110).
  • if it is determined in step S104 that there is a region the flash light has not reached (in the case of "Yes"), the main body side CPU 220 determines whether or not that region is closer than the maximum reach distance of the flash light (step S112).
  • the distance of the region the flash light has not reached can be obtained from the distance distribution information, and the maximum reach distance of the flash light can be obtained from the guide number and the F value of the flash light emitting unit 270.
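  • The guide number relation gives a quick sketch of the maximum reach: GN = distance × F-number at ISO 100 is standard photographic practice, and the ISO scaling term is an assumption beyond what the patent text states:

```python
def flash_max_reach_m(guide_number, f_number, iso=100):
    """Maximum flash reach from GN = distance * F (GN in metres, ISO 100)."""
    return (guide_number / f_number) * (iso / 100) ** 0.5

# A GN 12 built-in flash at f/4 reaches about 3 m at ISO 100.
print(flash_max_reach_m(12, 4.0))
```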
  • if it is determined in step S112 that the region the flash light has not reached is farther than the maximum reach distance of the flash light (in the case of "No"), the composition ratio determination unit 220C shown in FIG. 15 sets the mixing ratio component based on the light intensity distribution information for that region (shown in equation [6]) to 1 (step S114). As a result, the mixing ratio αi of the second non-light-emitting image in the region the flash light has not reached is determined only by the mixing ratio based on the luminance distribution information.
  • on the other hand, if it is determined in step S112 that the region the flash light has not reached is closer than the maximum reach distance of the flash light (in the case of "Yes"), the flash control unit 272 calculates the maximum light emission amount at which the region the flash light does reach does not white out, and sets the calculated maximum light emission amount as the main light emission amount for acquiring the light-emitting image (step S116).
  • when the main light emission amount is calculated in step S110 or step S116, the flash control unit 272 causes the flash light emitting unit 270 to perform the main flash emission with the calculated main light emission amount, and the image acquisition unit 221 acquires the light-emitting image captured by the imaging unit (the interchangeable lens 100 and the image sensor 201) under the main emission, under the same exposure condition as the first non-light-emitting image (step S118). Subsequently, the image acquisition unit 221 acquires the non-light-emitting image (first non-light-emitting image) captured under the same exposure condition (underexposure) as the light-emitting image and the second non-light-emitting image captured with overexposure.
  • returning to FIG. 17, the light intensity distribution information calculation unit 220B calculates the light intensity distribution information Si from equation [1] based on the non-light-emitting image (the underexposed first non-light-emitting image), the light-emitting image, and the distance distribution information acquired by the distance distribution information acquisition unit 220A (step S14).
  • the composition ratio determination unit 220C determines the composition ratio for HDR-combining the plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image) based on the light intensity distribution information Si calculated in step S14 (step S16). For example, when determining the composition ratio for each pixel or region corresponding to each of the plurality of non-light-emitting images, the composition ratio determination unit 220C makes the mixing ratio of the underexposed first non-light-emitting image larger than the mixing ratio of the overexposed second non-light-emitting image for a pixel or region with high light intensity according to the light intensity distribution information Si, and makes the mixing ratio of the second non-light-emitting image larger than the mixing ratio of the first non-light-emitting image for a pixel or region with low light intensity.
  • the image composition unit 220D performs HDR composition of the first non-light-emitting image and the second non-light-emitting image based on the composition ratio determined in Step S16, and generates an HDR image (Step S18).
  • in the above embodiments, the imaging device 10 is a mirrorless digital single-lens camera, but the present invention is not limited to this; it may be a single-lens reflex camera, a lens-integrated imaging device, a digital video camera, or the like.
  • the present invention is also applicable to mobile devices having other functions (call function, communication function, and other computer functions).
  • Other modes to which the present invention can be applied include, for example, mobile phones and smartphones having a camera function, PDAs (Personal Digital Assistants), and portable game machines.
  • FIG. 19 shows the appearance of a smartphone 500 that is an embodiment of the imaging apparatus of the present invention.
  • the smartphone 500 illustrated in FIG. 19 includes a flat housing 502 and a display input unit 520 in which a display panel 521 as a display unit and an operation panel 522 as an input unit are integrated on one surface of the housing 502.
  • the casing 502 includes a speaker 531, a microphone 532, an operation unit 540, and a camera unit 541.
  • the configuration of the housing 502 is not limited to this, and, for example, a configuration in which the display unit and the input unit are independent, or a configuration having a folding structure or a slide mechanism may be employed.
  • FIG. 20 is a block diagram showing the configuration of the smartphone 500 shown in FIG. 19. As shown in FIG. 20, the main components of the smartphone are a wireless communication unit 510 that performs mobile wireless communication via a base station and a mobile communication network, a display input unit 520, a call unit 530, an operation unit 540, a camera unit 541, a recording unit 550, an external input/output unit 560, a GPS (Global Positioning System) receiving unit 570, a motion sensor unit 580, a power supply unit 590, and a main control unit 501.
  • the wireless communication unit 510 performs wireless communication with a base station accommodated in the mobile communication network in accordance with an instruction from the main control unit 501. Using this wireless communication, transmission / reception of various file data such as audio data and image data, e-mail data, and reception of Web data and streaming data are performed.
  • the display input unit 520 displays images (still images and moving images), character information, and the like, and visually transmits information to the user under the control of the main control unit 501, and detects a user operation on the displayed information.
  • This is a so-called touch panel, and includes a display panel 521 and an operation panel 522.
  • the display panel 521 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device.
  • the operation panel 522 is a device that is placed so that an image displayed on the display surface of the display panel 521 is visible and detects one or a plurality of coordinates operated by a user's finger or stylus. When such a device is operated by a user's finger or stylus, a detection signal generated due to the operation is output to the main control unit 501. Next, the main control unit 501 detects an operation position (coordinates) on the display panel 521 based on the received detection signal.
  • the display panel 521 and the operation panel 522 of the smartphone 500, exemplified as an embodiment of the imaging apparatus of the present invention, integrally constitute the display input unit 520, in an arrangement in which the operation panel 522 completely covers the display panel 521.
  • the operation panel 522 may have a function of detecting a user operation even in an area outside the display panel 521.
  • in the case of such an arrangement, the operation panel 522 may include a detection area for the overlapping portion that overlaps the display panel 521 (hereinafter referred to as a display area) and a detection area for the outer edge portion that does not overlap the display panel 521 (hereinafter referred to as a non-display area).
  • the operation panel 522 may include two sensitive regions, the outer edge portion and the inner portion. Furthermore, the width of the outer edge portion is designed as appropriate according to the size of the housing 502 and the like. Examples of the position detection method employed in the operation panel 522 include a matrix switch method, a resistance film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method, and any of these methods can be adopted.
  • the call unit 530 includes a speaker 531 and a microphone 532; it converts the user's voice input through the microphone 532 into voice data that can be processed by the main control unit 501 and outputs the voice data to the main control unit 501, and decodes voice data received by the wireless communication unit 510 or the external input/output unit 560 and outputs it from the speaker 531.
  • the speaker 531 and the microphone 532 can be mounted on the same surface as the surface on which the display input unit 520 is provided.
  • the operation unit 540 is a hardware key using a key switch or the like, and receives an instruction from the user.
  • for example, the operation unit 540 is mounted on the side surface of the housing 502 of the smartphone 500, and is a push-button switch that is turned on when pressed with a finger or the like and turned off by the restoring force of a spring or the like when the finger is released.
  • the recording unit 550 stores the control program and control data of the main control unit 501, application software (including the image processing program according to the present invention), address data associating the names, telephone numbers, and the like of communication partners, data of sent and received e-mails, web data downloaded by web browsing, and downloaded content data, and also temporarily stores streaming data and the like.
  • the recording unit 550 includes an internal storage unit 551 built into the smartphone and an external storage unit 552 having a removable external memory slot.
  • each of the internal storage unit 551 and the external storage unit 552 constituting the recording unit 550 is realized using a recording medium such as a flash memory type, hard disk type, multimedia card micro type, or card type memory (for example, Micro SD (registered trademark) memory), a RAM (Random Access Memory), or a ROM (Read Only Memory).
  • the external input/output unit 560 serves as an interface with all external devices connected to the smartphone 500, and is used to connect directly or indirectly to other external devices by communication or the like (for example, universal serial bus (USB), IEEE 1394, etc.) or by a network (for example, the Internet, wireless LAN (Local Area Network), Bluetooth (registered trademark), RFID (Radio Frequency Identification), infrared communication (Infrared Data Association: IrDA) (registered trademark), UWB (Ultra Wideband) (registered trademark), ZigBee (registered trademark), etc.).
  • examples of external devices connected to the smartphone 500 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, a memory card or SIM (Subscriber Identity Module) / UIM (User Identity Module) card connected via a card socket, external audio/video equipment connected via an audio/video I/O (Input/Output) terminal, wirelessly connected external audio/video equipment, a smartphone connected by wire or wirelessly, a personal computer connected by wire or wirelessly, a PDA connected by wire or wirelessly, and an earphone.
  • the external input/output unit 560 can transmit data received from such external devices to each component inside the smartphone 500, and can transmit data inside the smartphone 500 to external devices.
  • the GPS receiving unit 570 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with instructions from the main control unit 501, executes positioning calculation processing based on the received GPS signals, and detects the position of the smartphone 500 consisting of latitude, longitude, and altitude.
  • when the GPS receiving unit 570 can acquire position information from the wireless communication unit 510 or the external input/output unit 560 (for example, a wireless LAN), it can also detect the position using that position information.
  • the motion sensor unit 580 includes, for example, a triaxial acceleration sensor and a gyro sensor, and detects the physical movement of the smartphone 500 in accordance with an instruction from the main control unit 501. By detecting the physical movement of the smartphone 500, the moving direction and acceleration of the smartphone 500 are detected. This detection result is output to the main control unit 501.
  • the power supply unit 590 supplies power stored in a battery (not shown) to each unit of the smartphone 500 in accordance with an instruction from the main control unit 501.
  • the main control unit 501 includes a microprocessor, operates according to a control program and control data stored in the recording unit 550, and controls each unit of the smartphone 500 in an integrated manner. Further, the main control unit 501 includes a mobile communication control function and an application processing function for controlling each unit of the communication system in order to perform voice communication and data communication through the wireless communication unit 510.
  • the application processing function is realized by the main control unit 501 operating in accordance with application software stored in the recording unit 550.
  • examples of the application processing function include an infrared communication function for controlling the external input/output unit 560 to perform data communication with a counterpart device, an e-mail function for sending and receiving e-mails, a web browsing function for browsing web pages, and a function of executing the image processing program according to the present invention.
  • the main control unit 501 has an image processing function such as displaying video on the display input unit 520 based on image data (still image or moving image data) such as received data or downloaded streaming data.
  • the image processing function refers to a function in which the main control unit 501 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 520.
  • the main control unit 501 executes display control for the display panel 521 and operation detection control for detecting a user operation through the operation unit 540 and the operation panel 522.
  • by executing the display control, the main control unit 501 displays icons for starting application software and software keys such as a scroll bar, or displays a window for creating an e-mail.
  • the scroll bar refers to a software key for accepting an instruction to move the display portion of a large image that does not fit in the display area of the display panel 521.
  • by executing the operation detection control, the main control unit 501 detects user operations through the operation unit 540, accepts operations on icons and input of character strings in the input fields of windows through the operation panel 522, and accepts requests to scroll the displayed image through the scroll bar.
  • furthermore, by executing the operation detection control, the main control unit 501 determines whether an operation position on the operation panel 522 is in the overlapping portion that overlaps the display panel 521 (display area) or in the outer edge portion that does not overlap the display panel 521 (non-display area), and has a touch panel control function for controlling the sensitive region of the operation panel 522 and the display positions of the software keys.
  • the main control unit 501 can also detect a gesture operation on the operation panel 522 and execute a preset function in accordance with the detected gesture operation.
  • a gesture operation is not a conventional simple touch operation, but means an operation of drawing a trajectory with a finger or the like, specifying a plurality of positions simultaneously, or a combination of these, such as drawing a trajectory from at least one of a plurality of positions.
  • the camera unit 541 is a digital camera that performs electronic photography using an imaging device such as a complementary metal oxide semiconductor (CMOS) or a charge-coupled device (CCD), and corresponds to the imaging device 10 illustrated in FIG.
  • the camera unit 541 converts image data obtained by imaging into compressed image data such as JPEG (Joint Photographic Experts Group) data under the control of the main control unit 501, and records the data in the recording unit 550.
  • the data can also be output through the external input/output unit 560 or the wireless communication unit 510. In the smartphone 500 shown in FIG. 19, the camera unit 541 is mounted on the same surface as the display input unit 520, but the mounting position of the camera unit 541 is not limited to this: the camera unit 541 may be mounted on the back surface of the display input unit 520, or a plurality of camera units 541 may be mounted. When a plurality of camera units 541 are mounted, the camera unit 541 used for shooting can be switched to shoot independently, or a plurality of camera units 541 can be used simultaneously for shooting.
  • the camera unit 541 can be used for various functions of the smartphone 500.
  • an image acquired by the camera unit 541 can be displayed on the display panel 521, or the image of the camera unit 541 can be used as one of operation inputs of the operation panel 522.
  • when the GPS receiving unit 570 detects the position, the position can also be detected with reference to an image from the camera unit 541.
  • furthermore, with reference to an image from the camera unit 541, it is also possible to determine the optical axis direction of the camera unit 541 of the smartphone 500 and the current usage environment, either without using the triaxial acceleration sensor or in combination with the triaxial acceleration sensor (gyro sensor).
  • the image from the camera unit 541 can be used in the application software.
  • in addition, position information acquired by the GPS receiving unit 570, voice information acquired by the microphone 532 (which may be converted into text information through speech-to-text conversion by the main control unit or the like), posture information acquired by the motion sensor unit 580, and the like can be added to image data of still images or moving images and recorded in the recording unit 550, or output through the external input/output unit 560 or the wireless communication unit 510.
  • in the above embodiments, the image processing apparatus (main body side CPU 220) that performs HDR composition is built into the imaging apparatus, but the image processing apparatus according to the present invention may be, for example, a personal computer or portable terminal separate from the imaging apparatus that executes the image processing program according to the present invention. In this case, it is necessary to input a plurality of non-light-emitting images, a light-emitting image, distance distribution information, and the like as information for HDR composition.
  • the plurality of non-light-emitting images is not limited to the two non-light-emitting images of the underexposed first non-light-emitting image and the overexposed second non-light-emitting image; three or more non-light-emitting images captured under different exposure conditions may be used. In that case, the composition ratio determination unit determines the composition ratio of the three or more non-light-emitting images.
  • the hardware structure of the processing units that execute various kinds of processing is any of the following various processors.
  • the various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), which is a processor whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively for executing specific processing.
  • one processing unit may be configured by one of these various processors, or by two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be configured by a single processor. As a first example of configuring a plurality of processing units with one processor, as typified by computers such as clients and servers, one processor is configured by a combination of one or more CPUs and software, and this processor functions as the plurality of processing units. As a second example, as typified by a system on chip (SoC), a processor that realizes the functions of an entire system including the plurality of processing units with a single IC chip is used.
  • various processing units are configured using one or more of the various processors as a hardware structure.
  • the hardware structure of these various processors is more specifically an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
  • the present invention also includes an image processing program that, when installed in an imaging apparatus or a computer, causes it to function as the imaging apparatus or the image processing apparatus according to the present invention, and a recording medium on which the image processing program is recorded.
  • Imaging device 10
  • Finder window 20
  • Shutter release switch 22
  • Shutter speed dial 23
  • Exposure compensation dial 24
  • Power lever 25
  • Eyepiece 26
  • MENU/OK key 27
  • Play button 28
  • Built-in flash 30
  • Interchangeable lens 100
  • Imaging optical system 102
  • Lens group 104
  • Diaphragm 108
  • Focus lens control unit 116
  • Aperture control unit 118
  • Lens side CPU 120
  • RAM 122
  • ROM 124
  • Lens side communication unit 150
  • Lens mount 160
  • Camera body 200
  • Image sensor 201
  • Micro lens 201A
  • Image sensor control unit 202
  • Analog signal processing unit 203
  • A/D converter 204
  • Image input controller 205
  • Digital signal processing unit 206
  • RAM 207
  • Compression/decompression processing unit 208
  • Memory control unit 210
  • Memory card 212
  • Display control unit 216
  • Main body side CPU 220
  • Distance distribution information acquisition unit 220A
  • Light intensity distribution information calculation unit 220B
  • Composition ratio determination unit 220C
  • Image composition unit 220D
  • Distribution information creation unit 220E

Abstract

Provided are an image processing device, method and program, and an imaging device which can acquire an HDR image in which the brightness/darkness difference has been adjusted according to the brightness of light hitting a subject. This image processing device comprises: an image acquisition unit 221 that acquires a first non-emission image captured by underexposure, a second non-emission image captured by over-exposure, and an emission image with the same exposure conditions as the first non-emission image; a distance distribution information acquisition unit 220A that acquires distance distribution information indicating the distribution of distances of the subject; a light intensity distribution information calculation unit 220B that, on the basis of the emission image, the first and the second non-emission images and the distance distribution information, calculates light intensity distribution information of the light being projected onto the subject; a synthesis ratio determination unit 220C that determines a synthesis ratio for the first and the second non-emission images on the basis of the light intensity distribution information; and an image synthesizing unit 220D that synthesizes the first and the second non-emission images according to the synthesis ratio.

Description

Image processing apparatus, method, and program, and imaging apparatus


The present invention relates to an image processing apparatus, method, and program, and an imaging apparatus, and in particular to a technique for performing high dynamic range synthesis (HDR (High-dynamic-range) synthesis), which combines a plurality of images with different exposures to expand the dynamic range.


When a subject with a wide dynamic range is imaged with a digital camera or the like, the output of the image sensor saturates in the highlight areas, causing so-called "whiteout", and the gradation of the image sensor output is lost in the shadow areas, causing so-called "blackout".


Performing HDR processing is known as one method of expressing the wide dynamic range of a subject in an image. In HDR processing, for example, an image captured with the exposure reduced relative to the appropriate exposure (underexposed image) and an image captured with the exposure increased (overexposed image) are acquired, and pixels of the overexposed image are used where a dark subject is imaged while pixels of the underexposed image are used where a bright subject is imaged (in practice the two may also be mixed rather than simply selected), thereby creating an HDR image with little whiteout and blackout.


Conventionally, devices described in Patent Documents 1 and 2 are known as apparatuses that generate one HDR image from a plurality of images captured at different exposures.


The imaging system described in Patent Document 1 acquires, from a plurality of images (a long-exposure image and a short-exposure image) captured of the same subject under different exposure conditions, the luminance signal of the region (appropriate exposure region) other than the overexposed region (inappropriate exposure region) of the luminance signal of the long-exposure image, and, for the luminance signal of the short-exposure image, acquires the luminance signal corresponding to the inappropriate exposure region of the long-exposure image. A wide dynamic range image is then generated by combining the appropriate exposure regions of the respective images.


In addition, the imaging system described in Patent Document 1 estimates a main subject region from the long-exposure image and the short-exposure image and performs gradation correction so that the gradation width is preferentially allocated to the estimated main subject region; by combining the appropriate exposure regions of the images after gradation correction, the gradation of the main subject within the appropriate exposure region is enriched.

On the other hand, the imaging device described in Patent Literature 2 includes a single image sensor having a pixel group exposed for a long time and a pixel group exposed for a short time, and generates an HDR image by combining the image data of the first pixel group and the image data of the second pixel group read from the sensor.

Patent Literature 2 also describes obtaining a single composite image by adding flash image data, which is long-exposure data, and non-flash image data, which is short-exposure data, while adjusting their mixing ratio according to an estimate of the distance to the subject. This yields an image with a high signal-to-noise ratio (SNR) for the main subject while preserving the atmosphere of the background image.

Patent Literature 1: JP 2002-232777 A
Patent Literature 2: JP 2009-206925 A

Patent Literature 1 contains no explicit description of the mixing ratio between the long-exposure and short-exposure images when generating the HDR image. However, since the luminance signal of the short-exposure image is used for the region corresponding to the improperly exposed region of the long-exposure image, it is considered to be a typical HDR composition in which the mixing ratio between the long-exposure and short-exposure images is determined by the magnitude of the luminance signal. Consequently, as described later, when part of the main subject is in shadow, there is a problem that the shadowed region remains dark.

On the other hand, the invention described in Patent Literature 2 generates a single composite image by adding flash image data (long-exposure data) and non-flash image data (short-exposure data) while adjusting their mixing ratio according to the estimated distance to the subject. Moreover, the invention of Patent Literature 2 combines flash image data with non-flash image data; it does not combine non-emission (non-flash) images captured under different exposure conditions.

Thus, neither Patent Literature 1 nor Patent Literature 2 describes adjusting the composition ratio of a plurality of images according to the intensity of the light illuminating the subject when combining a plurality of images of the same subject captured under different exposure conditions to generate an HDR image.

Incidentally, when whether to use the pixel value of the overexposed image or that of the underexposed image is determined by the luminance value obtained from the image sensor, as in typical HDR processing, the same HDR processing is performed regardless of the intensity of the light striking the subject, and the following problem arises.

For example, as shown in FIG. 21, assume a case where the left half of a person's face is a sunlit region A and the right half is a shaded region B. In this case, if "black hair struck by strong light (1)" and "skin in shadow (2)" have the same or substantially the same luminance value, the overexposed image is used for both in the HDR composition, so that not only does the shadowed skin (2) become brighter, but the black hair (1) also takes on a bright color.

The present invention has been made in view of such circumstances, and an object thereof is to provide an image processing apparatus, method, and program, and an imaging apparatus capable of acquiring an HDR image with excellent gradation characteristics in which the light-dark contrast is adjusted according to the brightness of the light striking the subject.

In order to achieve the above object, an image processing apparatus according to one aspect of the present invention comprises: an image acquisition unit that acquires a plurality of non-emission images of the same subject, each captured under a different exposure condition without flash emission, and an emission image captured under flash emission with the same exposure condition as one of the plurality of non-emission images; a distance distribution information acquisition unit that acquires distance distribution information indicating the distribution of distances to the subject; a light intensity distribution information calculation unit that calculates light intensity distribution information of the light illuminating the subject based on the emission image, the non-emission image captured under the same exposure condition as the emission image, and the distance distribution information; a composition ratio determination unit that determines a composition ratio of the plurality of non-emission images based on the light intensity distribution information; and an image composition unit that combines the plurality of non-emission images according to the composition ratio to generate a composite image with an expanded dynamic range.

According to this aspect of the present invention, the light intensity distribution information of the light illuminating the subject is calculated based on the emission image, the non-emission image captured under the same exposure condition as the emission image, and the distance distribution information. The composition ratio of the plurality of non-emission images is then determined based on the calculated light intensity distribution information, and the plurality of non-emission images are combined according to the determined ratio to generate a composite image (HDR image) with an expanded dynamic range. An HDR image generated in this way has excellent gradation characteristics: the light-dark contrast is adjusted according to the brightness of the light striking the subject without losing the inherent light and shade of the subject.
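The disclosure does not spell out the arithmetic of the light intensity estimate at this point. The following is one plausible minimal sketch, under the assumption that the flash contribution to a pixel falls off with the square of the subject distance; all names and the eps guard are hypothetical, not taken from the patent.

```python
import numpy as np

def light_intensity_distribution(flash, no_flash, distance, eps=1e-6):
    """Hypothetical sketch of the light-intensity estimate described above.

    flash, no_flash: luminance maps captured with identical exposure,
    with and without flash emission. distance: per-pixel subject distance.
    Assuming the flash contribution follows an inverse-square falloff,
    (flash - no_flash) * distance**2 is proportional to subject reflectance;
    dividing the ambient (no-flash) luminance by that reflectance estimate
    leaves a quantity proportional to the ambient light striking the subject.
    """
    flash_gain = np.maximum(flash - no_flash, 0.0)   # flash-only component
    reflectance = flash_gain * distance ** 2         # inverse-square correction
    return no_flash / (reflectance + eps)            # ambient light intensity S_i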

In an image processing apparatus according to another aspect of the present invention, it is preferable that the composition ratio is determined for each corresponding pixel or region of the plurality of non-emission images, and that the image composition unit combines the plurality of non-emission images by applying the composition ratio determined for each corresponding pixel or region.

In an image processing apparatus according to still another aspect of the present invention, the plurality of exposure conditions corresponding to the plurality of non-emission images preferably include an exposure condition underexposed relative to the proper exposure and an exposure condition overexposed relative to the proper exposure. This makes it possible to create an HDR image with few blown-out highlights and crushed shadows.

In an image processing apparatus according to still another aspect of the present invention, when the composition ratio is determined for each corresponding pixel or region of the plurality of non-emission images, it is preferable that, in pixels or regions where the light intensity is high according to the light intensity distribution information, the mixing ratio of a first non-emission image captured under the exposure condition with the smallest exposure value among the plurality of non-emission images is made larger than the mixing ratio of a second non-emission image captured under the exposure condition with the largest exposure value among the plurality of non-emission images, and that, in pixels or regions where the light intensity is low, the mixing ratio of the second non-emission image is made larger than that of the first non-emission image.

That is, in pixels or regions with high light intensity (pixels corresponding to a subject struck by strong light), making the mixing ratio of the first non-emission image larger than that of the second non-emission image produces an HDR image with few blown-out highlights, and in pixels or regions with low light intensity (pixels corresponding to a subject struck by weak light), making the mixing ratio of the second non-emission image larger than that of the first non-emission image produces an HDR image with few crushed shadows.

In an image processing apparatus according to still another aspect of the present invention, when determining the composition ratio for each corresponding pixel or region of the plurality of non-emission images, the composition ratio determination unit preferably decreases the mixing ratio of the second non-emission image monotonically as the light intensity increases between the minimum light intensity and the maximum light intensity of the light intensity distribution information. By monotonically decreasing the mixing ratio of the second non-emission image (monotonically increasing that of the first non-emission image) as the light intensity increases between the minimum and maximum light intensities, HDR composition can be performed in which the mixing ratio changes continuously with increases and decreases in light intensity. Note that the minimum light intensity is not limited to the minimum value in the light intensity distribution information; it may be, for example, a value offset from the minimum by a fixed amount. Likewise, the maximum light intensity is not limited to the maximum value in the light intensity distribution information; it may be, for example, a value offset from the maximum by a fixed amount. Needless to say, monotonic decrease includes the case of linear decrease.
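A minimal sketch of one such monotonic (here linear) relationship, corresponding to the kind of curve shown in FIG. 6, follows; the endpoint values 1.0 and 0.0 and the parameter names are illustrative.

```python
def mixing_ratio_alpha(s, s_min, s_max):
    """Mixing ratio alpha_i of the second (largest-EV) non-emission image as
    a linear, monotonically decreasing function of light intensity s between
    s_min and s_max: one simple instance of the relationship described above
    (cf. FIG. 6). alpha = 1.0 means "use only the second non-emission image",
    alpha = 0.0 means "use only the first"."""
    if s <= s_min:
        return 1.0
    if s >= s_max:
        return 0.0
    return (s_max - s) / (s_max - s_min)
```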

An image processing apparatus according to still another aspect of the present invention preferably comprises a distribution information creation unit that creates distribution information indicating a frequency distribution of light intensity based on the light intensity distribution information, wherein the composition ratio determination unit obtains, based on the created distribution information, a first light intensity corresponding to a first peak of high frequency and a second light intensity corresponding to a second peak of high frequency, the second light intensity being greater than the first light intensity, and, when determining the composition ratio for each corresponding pixel or region of the plurality of non-emission images, sets the mixing ratio of the second non-emission image to its maximum value for light intensities equal to or below the first light intensity, sets it to its minimum value for light intensities equal to or above the second light intensity, and monotonically decreases it between the maximum and minimum values for light intensities greater than the first light intensity and smaller than the second light intensity.

For example, outdoors in the daytime on a sunny day, subjects with a wide dynamic range, from dark subjects not in sunlight (shaded subjects) to bright subjects in sunlight (sunlit subjects), become the imaging target, and the imaging target in this case can be broadly divided into shaded subjects and sunlit subjects. When distribution information indicating the frequency distribution of light intensity is created from the light intensity distribution information of such a scene, the distribution has two peaks (a first peak and a second peak). The first peak corresponds to the frequency of the light intensity distribution information of the subject region corresponding to the shaded subjects, and the second peak corresponds to that of the sunlit subjects.

Then, the light intensity corresponding to the first peak (first light intensity) and the light intensity corresponding to the second peak (second light intensity) are obtained, and, when determining the composition ratio for each corresponding pixel or region of the plurality of non-emission images, the mixing ratio of the second non-emission image is set to its maximum value for light intensities equal to or below the first light intensity and to its minimum value for light intensities equal to or above the second light intensity. This is because a region with a light intensity at or below the first light intensity can be regarded as shaded, and a region at or above the second light intensity can be regarded as sunlit. For light intensities between the first and second light intensities, the mixing ratio of the second non-emission image is monotonically decreased between the maximum and minimum values, so that the mixing ratio changes continuously with increases and decreases in light intensity. Note that the distribution curve of light intensity frequencies can have two peaks not only in daytime outdoor scenes on sunny days but also, for example, in scenes where sunlight enters part of an indoor space, or in scenes where regions illuminated by an artificial light source coexist with shadowed regions.
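As a sketch of this two-peak variant, the histogram peaks can be found and the mixing ratio ramped between them. The peak picking below is a naive placeholder (the patent does not specify a peak-detection algorithm), and the code assumes at least two local maxima with distinct centers.

```python
import numpy as np

def alpha_from_two_peak_histogram(s_map, bins=64):
    """Build a histogram of the light intensity map, take the two most
    populated local maxima as the shade peak (s1) and sun peak (s2), and
    ramp the second image's mixing ratio from 1 (at or below s1) down to 0
    (at or above s2), as in FIG. 11."""
    hist, edges = np.histogram(s_map, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # Local maxima of the histogram (interior bins only).
    interior = np.arange(1, bins - 1)
    is_peak = (hist[interior] >= hist[interior - 1]) & \
              (hist[interior] >= hist[interior + 1])
    peaks = interior[is_peak]
    top_two = peaks[np.argsort(hist[peaks])[-2:]]   # assumes >= 2 peaks exist
    s1, s2 = np.sort(centers[top_two])              # shade and sun peaks
    return np.clip((s2 - s_map) / (s2 - s1), 0.0, 1.0)
```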

An image processing apparatus according to still another aspect of the present invention preferably comprises a distribution information creation unit that creates distribution information indicating a frequency distribution of light intensity based on the light intensity distribution information, wherein the composition ratio determination unit obtains the light intensity corresponding to a bottom point of low frequency based on the created distribution information and, when determining the composition ratio for each corresponding pixel or region of the plurality of non-emission images, uses the light intensity corresponding to the bottom point as a threshold, setting the mixing ratio of the second non-emission image to its maximum value for light intensities at or below the threshold and to its minimum value for light intensities above the threshold.

As above, in the case of a daytime outdoor scene on a sunny day, the distribution information indicating the frequency distribution of light intensity has a valley (bottom point) between the two peaks. The bottom point is therefore obtained from the distribution information, and the light intensity corresponding to it is used as a threshold: for light intensities at or below the threshold, the mixing ratio of the second non-emission image is preferably set to its maximum value (the mixing ratio of the first non-emission image to its minimum value), and for light intensities above the threshold, the mixing ratio of the second non-emission image is preferably set to its minimum value (the mixing ratio of the first non-emission image to its maximum value). This includes the case where, for light intensities within a certain range around the threshold, the mixing ratio of the second non-emission image is changed continuously between the maximum and minimum values according to the magnitude of the light intensity.
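A sketch of this valley-threshold variant follows. The valley pick is naive (it assumes a bimodal histogram whose lowest interior bin lies between the two peaks), and the hard switch corresponds to the step-shaped relationship of FIG. 13 rather than a ramp.

```python
import numpy as np

def alpha_from_valley_threshold(s_map, bins=64):
    """Treat the light intensity at the histogram's lowest interior bin as
    threshold T; the second image's mixing ratio is 1 at or below T and 0
    above it."""
    hist, edges = np.histogram(s_map, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    t = centers[1:-1][np.argmin(hist[1:-1])]   # ignore the outermost bins
    return (s_map <= t).astype(float)
```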

An image processing apparatus according to still another aspect of the present invention preferably comprises a luminance distribution information calculation unit that calculates luminance distribution information indicating the distribution of luminance within at least one of the plurality of non-emission images, wherein the composition ratio determination unit determines the composition ratio of the plurality of non-emission images based on the light intensity distribution information and the luminance distribution information.

This makes it possible to determine the composition ratio of the plurality of non-emission images taking into account not only the light intensity distribution information but also conventional luminance distribution information.

An image processing apparatus according to still another aspect of the present invention preferably comprises a detection unit that detects pixels or regions in the emission image that the flash light has not reached, wherein, when determining the composition ratio for pixels or regions that the flash light has not reached, the composition ratio determination unit determines the composition ratio of the plurality of non-emission images based only on the luminance distribution information.

For a pixel or region that the flash light has not reached, the light intensity corresponding to that pixel or region is unknown. In this case, therefore, it is preferable to determine the composition ratio of the plurality of non-emission images based on the luminance distribution information, as in the conventional method.

In an image processing apparatus according to still another aspect of the present invention, when determining the composition ratio for each corresponding pixel or region of the plurality of non-emission images, the composition ratio determination unit preferably calculates the mixing ratio γi of the second non-emission image based on the light intensity distribution information and the luminance distribution information by the following expression:

γi = αi × βi

where i is a parameter indicating the position of the pixel or region, αi is the mixing ratio, based on the light intensity distribution information, of the second non-emission image captured under the exposure condition with the largest exposure value among the plurality of non-emission images, and βi is the mixing ratio of the second non-emission image based on the luminance distribution information, which takes smaller values as the luminance increases.
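In code form, the combination and the final blend might look like the following sketch; broadcasting γ over the color channels is an implementation assumption for the example.

```python
def combined_mixing_ratio(alpha, beta):
    """gamma_i = alpha_i * beta_i: per-pixel ratio of the second non-emission
    image, combining the light-intensity-based ratio alpha with the
    luminance-based ratio beta (beta decreasing as luminance rises)."""
    return alpha * beta

def compose_hdr(first, second, gamma):
    """Blend the first (smallest-EV) and second (largest-EV) non-emission
    images with the per-pixel ratio gamma of the second image. For RGB
    arrays, gamma is expanded over the trailing channel axis."""
    if gamma.ndim == first.ndim - 1:
        gamma = gamma[..., None]
    return gamma * second + (1.0 - gamma) * first
```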

In an image processing apparatus according to still another aspect of the present invention, the exposure condition of the emission image is preferably the exposure condition with the smallest exposure value among the plurality of exposure conditions corresponding to the plurality of non-emission images. This makes it possible to increase the flash emission amount when capturing the emission image and to minimize the number of pixels that the flash light does not reach.

An imaging apparatus according to still another aspect of the present invention comprises: an imaging unit capable of imaging a subject under different exposure conditions; a flash emission unit that emits flash light; a flash control unit that controls the flash light emitted from the flash emission unit; and the image processing apparatus described above, wherein the image acquisition unit acquires a plurality of non-emission images captured by the imaging unit under respectively different exposure conditions without emission of flash light from the flash emission unit, and an emission image captured by the imaging unit under emission of flash light from the flash emission unit with the same exposure condition as one of the plurality of non-emission images.

In an imaging apparatus according to still another aspect of the present invention, it is preferable that the flash control unit causes the flash emission unit to perform preliminary (dimming) emission and, when a region that the preliminary flash light has not reached is detected based on a dimming image captured by the imaging unit under the preliminary emission and an image captured without it, calculates the maximum emission amount at which the region reached by the preliminary flash light does not saturate, and causes the flash emission unit to perform main emission at the calculated maximum emission amount when acquiring the emission image. Determining the main emission amount in this way makes it possible to minimize the number of pixels that the flash light does not reach.

In an imaging apparatus according to still another aspect of the present invention, the flash control unit preferably calculates the maximum emission amount based on the average reflectance of the subject and the distance distribution information acquired by the distance distribution information acquisition unit. This makes it possible to calculate the maximum emission amount at which the region reached by the flash light does not blow out.
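The disclosure does not give a formula for this calculation. One hypothetical sketch, assuming the flash contribution at a pixel scales as k · ρ · E / d² with an unspecified camera/flash constant k, average reflectance ρ, emission amount E, and distance d, is shown below; every name here is an assumption for illustration.

```python
import numpy as np

def max_flash_emission(ambient, distance, reached, avg_reflectance,
                       saturation=255.0, k=1.0):
    """Largest main-emission amount E that keeps every flash-reached pixel
    below saturation. `ambient` is the no-flash luminance, `distance` the
    per-pixel subject distance, and `reached` a boolean mask of pixels the
    preliminary flash reached."""
    headroom = saturation - ambient[reached]               # remaining sensor range
    limits = headroom * distance[reached] ** 2 / (k * avg_reflectance)
    return limits.min()                                    # tightest constraint wins
```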

In an imaging apparatus according to still another aspect of the present invention, when it is detected, based on the dimming image captured by the imaging unit under preliminary emission and an image captured without it, that the flash light has reached the entire area of the dimming image, the image acquisition unit preferably acquires the dimming image as the emission image.

This is because, when the flash light reaches the entire area of the dimming image, the light intensity distribution information of the entire area can be calculated, and there is no need to perform the main emission again.

In an imaging apparatus according to still another aspect of the present invention, when a region that the flash light has not reached is detected in the dimming image based on the dimming image captured under preliminary emission and an image captured without it, and the distance of the region that the flash light has not reached is, based on the distance distribution information, greater than the reach of the flash light at the maximum emission amount of the flash emission unit, the image acquisition unit preferably acquires the dimming image as the emission image.

This is because, when the distance of the region that the preliminary flash light has not reached exceeds the reach of the flash light at its maximum emission amount, the flash light cannot reach the unreached pixels even with the main emission, and there is no need to perform the main emission again.

An image processing method according to still another aspect of the present invention includes: a step of acquiring a plurality of non-emission images of the same subject, each captured under a different exposure condition without flash emission; a step of acquiring an emission image captured under flash emission with the same exposure condition as one of the plurality of non-emission images; a step of acquiring distance distribution information indicating the distribution of distances to the subject; a step of calculating light intensity distribution information of the light illuminating the subject based on the emission image, the non-emission image captured under the same exposure condition as the emission image, and the distance distribution information; a step of determining a composition ratio of the plurality of non-emission images based on the light intensity distribution information; and a step of combining the plurality of non-emission images according to the composition ratio to generate a composite image with an expanded dynamic range.
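Tying the steps together, a minimal end-to-end sketch reusing the hypothetical helpers above (light_intensity_distribution, compose_hdr) might read as follows; single-channel luminance frames are assumed for simplicity, and the linear ramp over the observed intensity range is one of the ratio rules described earlier.

```python
import numpy as np

def hdr_pipeline(non_emission_images, exposure_values, flash_image,
                 flash_ev_index, distance):
    """Estimate the ambient light intensity from the flash/no-flash pair and
    the distance map, derive a per-pixel mixing ratio, and blend the
    smallest-EV (first) and largest-EV (second) non-emission images."""
    first = non_emission_images[int(np.argmin(exposure_values))]
    second = non_emission_images[int(np.argmax(exposure_values))]
    no_flash = non_emission_images[flash_ev_index]   # same EV as the flash shot
    s = light_intensity_distribution(flash_image, no_flash, distance)
    alpha = np.clip((s.max() - s) / (s.max() - s.min() + 1e-6), 0.0, 1.0)
    return compose_hdr(first, second, alpha)
```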

In an image processing method according to still another aspect of the present invention, in the step of determining the composition ratio, when the composition ratio is determined for each corresponding pixel or region of the plurality of non-emission images, it is preferable that, in pixels or regions where the light intensity is high according to the light intensity distribution information, the mixing ratio of a first non-emission image captured under the exposure condition with the smallest exposure value among the plurality of non-emission images is made larger than the mixing ratio of a second non-emission image captured under the exposure condition with the largest exposure value among the plurality of non-emission images, and that, in pixels or regions where the light intensity is low, the mixing ratio of the second non-emission image is made larger than that of the first non-emission image.

An image processing program according to still another aspect of the present invention causes a computer to realize: a function of acquiring a plurality of non-emission images of the same subject, each captured under a different exposure condition without flash emission; a function of acquiring an emission image captured under flash emission with the same exposure condition as one of the plurality of non-emission images; a function of acquiring distance distribution information indicating the distribution of distances to the subject; a function of calculating light intensity distribution information of the light illuminating the subject based on the emission image, the non-emission image captured under the same exposure condition as the emission image, and the distance distribution information; a function of determining a composition ratio of the plurality of non-emission images based on the light intensity distribution information; and a function of combining the plurality of non-emission images according to the composition ratio to generate a composite image with an expanded dynamic range.

In an image processing program according to still another aspect of the present invention, in the function of determining the composition ratio, when the composition ratio is determined for each corresponding pixel or region of the plurality of non-emission images, it is preferable that, in pixels or regions where the light intensity is high according to the light intensity distribution information, the mixing ratio of a first non-emission image captured under the exposure condition with the smallest exposure value among the plurality of non-emission images is made larger than the mixing ratio of a second non-emission image captured under the exposure condition with the largest exposure value among the plurality of non-emission images, and that, in pixels or regions where the light intensity is low, the mixing ratio of the second non-emission image is made larger than that of the first non-emission image.

According to the present invention, since the composition ratio of the plurality of non-emission images is determined according to the brightness (light intensity) of the light striking the subject and HDR composition is performed accordingly, it is possible to acquire an HDR image with excellent gradation characteristics in which the light-dark contrast is adjusted according to the brightness of the light striking the subject without losing the inherent light and shade of the subject.

FIG. 1 is a perspective view of an imaging apparatus according to the present invention as viewed obliquely from the front.
FIG. 2 is a rear view of the imaging apparatus.
FIG. 3 is a block diagram illustrating an embodiment of the internal configuration of the imaging apparatus.
FIG. 4 is a front view illustrating a configuration example of the image sensor.
FIG. 5 is a functional block diagram illustrating a first embodiment of the body-side CPU 220 functioning as the image processing apparatus according to the present invention.
FIG. 6 is a graph illustrating the relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-emission image.
FIG. 7 is a graph illustrating another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-emission image.
FIG. 8 is a graph illustrating still another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-emission image.
FIG. 9 is a functional block diagram illustrating a second embodiment of the body-side CPU 220 functioning as the image processing apparatus according to the present invention.
FIG. 10 is a frequency distribution diagram illustrating an example of the frequency distribution of light intensity.
FIG. 11 is a graph illustrating still another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-emission image.
FIG. 12 is a frequency distribution diagram similar to the frequency distribution of light intensity shown in FIG. 10.
FIG. 13 is a graph illustrating still another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-emission image.
FIG. 14 is a functional block diagram illustrating a third embodiment of the body-side CPU 220 functioning as the image processing apparatus according to the present invention.
FIG. 15 is a functional block diagram illustrating a fourth embodiment of the body-side CPU 220 functioning as the image processing apparatus according to the present invention.
FIG. 16 is a functional block diagram illustrating a fifth embodiment of the body-side CPU 220 functioning as the image processing apparatus according to the present invention.
FIG. 17 is a flowchart illustrating an embodiment of an image processing method according to the present invention.
FIG. 18 is a flowchart illustrating detailed operations in step S12 of FIG. 17.
FIG. 19 is an external view of a smartphone which is an embodiment of the imaging apparatus according to the present invention.
FIG. 20 is a block diagram illustrating the configuration of the smartphone.
FIG. 21 is a diagram used to explain a problem of conventional HDR composition.

Hereinafter, preferred embodiments of an image processing apparatus, method, and program, and an imaging apparatus according to the present invention will be described with reference to the accompanying drawings.

<Appearance of the imaging apparatus>

FIG. 1 is a perspective view of the imaging apparatus according to the present invention as viewed obliquely from the front, and FIG. 2 is a rear view of the imaging apparatus.

As shown in FIG. 1, the imaging apparatus 10 is a mirrorless digital single-lens camera comprising an interchangeable lens 100 and a camera body 200 to which the interchangeable lens 100 can be attached and from which it can be detached.

In FIG. 1, the front surface of the camera body 200 is provided with a body mount 260 to which the interchangeable lens 100 is attached, a finder window 20 of an optical viewfinder, and the like, and the top surface of the camera body 200 is mainly provided with a shutter release switch 22, a shutter speed dial 23, an exposure compensation dial 24, a power lever 25, and a built-in flash 30.

As shown in FIG. 2, the back of the camera body 200 is mainly provided with a liquid crystal monitor 216, an eyepiece 26 of the optical viewfinder, a MENU/OK key 27, a cross key 28, a playback button 29, and the like.

The liquid crystal monitor 216 displays a live view image in the shooting mode and plays back captured images in the playback mode; it also functions as a display unit that displays various menu screens and as a notification unit that notifies the user of various kinds of information. The MENU/OK key 27 is an operation key combining the function of a menu button for commanding the display of a menu on the screen of the liquid crystal monitor 216 and the function of an OK button for commanding the confirmation and execution of a selection. The cross key 28 is an operation unit for inputting instructions in four directions (up, down, left, and right) and functions as a multifunction key for selecting items from the menu screen and instructing the selection of various setting items from each menu. The up and down keys of the cross key 28 function as a zoom switch during imaging or a playback zoom switch in the playback mode, and the left and right keys function as frame-advance (forward and reverse) buttons in the playback mode. The cross key 28 also functions as an operation unit for designating, from among a plurality of subjects displayed on the liquid crystal monitor 216, an arbitrary subject on which to perform focus adjustment.

Further, by using the MENU/OK key 27, the cross key 28, and the menu screens displayed on the liquid crystal monitor 216, various imaging modes can be set, including, in addition to the still image capture mode for capturing a single still image, a continuous shooting mode for capturing still images continuously, an HDR imaging mode (a kind of continuous shooting mode) for capturing the plurality of images used for the dynamic-range-expanding HDR composition, and a moving image capture mode for capturing moving images.

The playback button 29 is a button for switching to the playback mode, in which a recorded still image or moving image is displayed on the liquid crystal monitor 216.

<Internal configuration of the imaging apparatus>

[Interchangeable lens]

FIG. 3 is a block diagram illustrating an embodiment of the internal configuration of the imaging apparatus 10.

The interchangeable lens 100, which functions as the imaging optical system of the imaging apparatus 10, is manufactured in conformity with the communication standard of the camera body 200 and, as described later, is capable of communicating with the camera body 200. The interchangeable lens 100 comprises an imaging optical system 102, a focus lens control unit 116, an aperture control unit 118, a lens-side CPU (Central Processing Unit) 120, a flash ROM (Read Only Memory) 126, a lens-side communication unit 150, and a lens mount 160.

The imaging optical system 102 of the interchangeable lens 100 includes a lens group 104, which includes a focus lens, and a diaphragm 108.

The focus lens control unit 116 moves the focus lens in accordance with commands from the lens-side CPU 120 and controls the position (in-focus position) of the focus lens. The aperture control unit 118 controls the diaphragm 108 in accordance with commands from the lens-side CPU 120.

The lens-side CPU 120 performs overall control of the interchangeable lens 100 and incorporates a ROM 124 and a RAM (Random Access Memory) 122.

The flash ROM 126 is a nonvolatile memory that stores programs and the like downloaded from the camera body 200.

The lens-side CPU 120 performs overall control of each part of the interchangeable lens 100 in accordance with a control program stored in the ROM 124 or the flash ROM 126, using the RAM 122 as a work area.

The lens-side communication unit 150 communicates with the camera body 200 via a plurality of signal terminals (lens-side signal terminals) provided on the lens mount 160 while the lens mount 160 is attached to the body mount 260 of the camera body 200. That is, in accordance with commands from the lens-side CPU 120, the lens-side communication unit 150 exchanges request signals and response signals (bidirectional communication) with the body-side communication unit 250 of the camera body 200 connected via the lens mount 160 and the body mount 260, and notifies the camera body 200 of lens information on each optical member of the imaging optical system 102 (position information of the focus lens, focal length information, aperture information, and the like).

The interchangeable lens 100 also includes a detection unit (not shown) that detects the position information of the focus lens and the aperture information. Here, the aperture information is information indicating the aperture value (F-number) of the diaphragm 108, the aperture diameter of the diaphragm 108, and the like.

To respond to lens information requests from the camera body 200, the lens-side CPU 120 preferably holds various kinds of lens information, including the detected focus lens position information and aperture information, in the RAM 122. The lens information is detected when a lens information request is received from the camera body 200, when an optical member is driven, or at a fixed period (a period sufficiently shorter than the frame period of a moving image), and the detection results can be held.

[Camera body]

The camera body 200 of the imaging apparatus 10 shown in FIG. 3 comprises an image sensor 201, an image sensor control unit 202, an analog signal processing unit 203, an A/D (Analog/Digital) converter 204, an image input controller 205, a digital signal processing unit 206, a RAM 207, a compression/decompression processing unit 208, a media control unit 210, a memory card 212, a display control unit 214, a liquid crystal monitor 216, a body-side CPU 220, an operation unit 222, a clock unit 224, a flash ROM 226, a ROM 228, an AF (Autofocus) control unit 230, an AE (Auto Exposure) control unit 232, a white balance correction unit 234, a wireless communication unit 236, a GPS (Global Positioning System) reception unit 238, a power supply control unit 240, a battery 242, a body-side communication unit 250, a body mount 260, a flash emission unit 270 and a flash control unit 272 constituting the built-in flash 30 (FIG. 1), a focal-plane shutter (FPS) 280, and an FPS control unit 296.

<Configuration of the image sensor>

The image sensor 201 is a CMOS (Complementary Metal-Oxide Semiconductor) color image sensor. Note that the image sensor 201 is not limited to the CMOS type and may be an XY-address type or CCD (Charge Coupled Device) type image sensor.

FIG. 4 is a front view illustrating the configuration of the image sensor 201.

As shown in FIG. 4, the image sensor 201 has a plurality of pixels composed of photoelectric conversion elements (light-receiving cells) arranged two-dimensionally in the horizontal direction (x direction) and the vertical direction (y direction), and includes first pixels SA and second pixels SB that selectively receive the light flux incident through the interchangeable lens 100 after pupil division by the pupil dividing means described below.

As shown in FIG. 4, the image sensor 201 employs a pupil-image separation method using pupil imaging lenses (microlenses) 201A as the pupil dividing means.

In the pupil-image separation method, a plurality of pixels (two in this example: a first pixel SA and a second pixel SB) are assigned to one microlens 201A. A pupil image incident on one microlens 201A is pupil-divided by the microlens 201A and received by the pair of pixels, the first pixel SA and the second pixel SB. The pupil image is thus separated according to the incident angle of the light entering the microlens 201A and received by the corresponding pair of the first pixel SA and the second pixel SB.

The pair of the first pixel SA and the second pixel SB corresponds to a pair of phase difference pixels usable for phase-difference AF. In addition, since the pixel value obtained by adding the pixel values of the paired first pixel SA and second pixel SB is equivalent to the pixel value of a normal pixel that is not pupil-divided, the pair of the first pixel SA and the second pixel SB can be treated as a normal pixel (one pixel).
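As a toy illustration of why the pair is useful for phase-difference AF (and hence for the kind of distance distribution information used elsewhere in this disclosure), the local shift between the SA and SB viewpoint signals of a row can be estimated as below; a real implementation is considerably more elaborate, and nothing here is taken from the patent.

```python
import numpy as np

def row_phase_shift(sa_row, sb_row, max_shift=8):
    """Estimate the phase difference between the SA and SB signals of one
    row as the integer shift minimizing the mean absolute difference.
    The shift relates to defocus, from which a distance can be derived."""
    best_shift, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        a = sa_row[max(0, s):len(sa_row) + min(0, s)]
        b = sb_row[max(0, -s):len(sb_row) + min(0, -s)]
        cost = np.mean(np.abs(a - b))
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift
```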

As shown in FIG. 4, color filters of one of the three primary colors red (R), green (G), and blue (B) (R filter, G filter, B filter) are arranged on the pairs of first pixels SA and second pixels SB of the image sensor 201 in accordance with a predetermined color filter array. The color filter array shown in FIG. 4 is a general Bayer array, but the color filter array is not limited to this and may be another color filter array such as the X-Trans (registered trademark) array.

Returning to FIG. 3, the optical image of the subject formed on the light-receiving surface of the image sensor 201 by the imaging optical system 102 of the interchangeable lens 100 is converted into electrical signals by the image sensor 201. Charge corresponding to the amount of incident light is accumulated in each pixel of the image sensor 201 (the first pixels SA and second pixels SB shown in FIG. 4), and an electrical signal corresponding to the amount of charge (signal charge) accumulated in each pixel is read out from the image sensor 201 as an image signal. That is, the signals corresponding to the charge amounts accumulated in the first pixel SA and the second pixel SB (the pixel values of the first pixel SA and the second pixel SB) can be read out independently.

The image sensor control unit 202 controls the reading of image signals from the image sensor 201 in accordance with commands from the body-side CPU 220. When a still image is captured, the image sensor control unit 202 reads out all lines of the image sensor 201 with the FPS 280 closed, after the exposure time has been controlled by opening and closing the FPS 280. The image sensor 201 and the image sensor control unit 202 of this example can also be driven by a so-called rolling shutter method, in which the exposure operation is performed sequentially for each line (at least one line) or each pixel (that is, each line or pixel is sequentially reset, charge accumulation is started, and the accumulated charge is read out); in particular, they have a function of capturing moving images or live view images by the rolling shutter method with the FPS 280 open.

The analog signal processing unit 203 performs various kinds of analog signal processing on the analog image signal obtained by imaging the subject with the image sensor 201. The analog signal processing unit 203 includes a sample-and-hold circuit, a color separation circuit, an AGC (Automatic Gain Control) circuit, and the like. The AGC circuit functions as a sensitivity adjustment unit that adjusts the sensitivity at the time of imaging (ISO sensitivity; ISO: International Organization for Standardization), adjusting the gain of the amplifier that amplifies the input image signal so that the signal level of the image signal falls within an appropriate range. The A/D converter 204 converts the analog image signal output from the analog signal processing unit 203 into a digital image signal.

Image data for each RGB pixel (mosaic image data) output via the image sensor 201, the analog signal processing unit 203, and the A/D converter 204 when capturing a still image or moving image is input from the image input controller 205 to the RAM 207 and temporarily stored there. Note that when the image sensor 201 is a CMOS image sensor, the analog signal processing unit 203 and the A/D converter 204 are often built into the image sensor 201.

The digital signal processing unit 206 performs various types of digital signal processing on the image data stored in the RAM 207. The digital signal processing unit 206 reads image data stored in the RAM 207 as appropriate and applies digital signal processing to it, such as offset processing, gain control processing including sensitivity correction, gamma correction processing, demosaicing (also called synchronization processing), and RGB/YCrCb conversion processing, and then stores the processed image data in the RAM 207 again. In the case of an image sensor with color filters of the three RGB colors, demosaicing is the process of calculating complete RGB color information for every pixel from the RGB mosaic image; it generates synchronized three-plane RGB image data from the mosaic data (dot-sequential RGB data).

The RGB/YCrCb conversion process converts the synchronized RGB data into luminance data (Y) and color difference data (Cr, Cb).

The compression/decompression processing unit 208 compresses the uncompressed luminance data Y and color difference data Cb and Cr stored in the RAM 207 when a still image or moving image is recorded. A still image is compressed, for example, in the JPEG (Joint Photographic coding Experts Group) format, and a moving image is compressed, for example, in the H.264 format. The image data compressed by the compression/decompression processing unit 208 is recorded on the memory card 212 via the media control unit 210. In the playback mode, the compression/decompression processing unit 208 also decompresses compressed image data obtained from the memory card 212 via the media control unit 210 to generate uncompressed image data.

The media control unit 210 controls the recording of the image data compressed by the compression/decompression processing unit 208 onto the memory card 212, and also controls the reading of compressed image data from the memory card 212.

The display control unit 214 controls the display of the uncompressed image data stored in the RAM 207 on the liquid crystal monitor 216. The liquid crystal monitor 216 is a liquid crystal display device, but a display device such as an organic electroluminescence display may be used instead.

When a live-view image is displayed on the liquid crystal monitor 216, digital image signals continuously generated by the digital signal processing unit 206 are temporarily stored in the RAM 207. The display control unit 214 converts the digital image signals temporarily stored in the RAM 207 into a display signal format and outputs them sequentially to the liquid crystal monitor 216. The captured image is thereby displayed on the liquid crystal monitor 216 in real time, and the liquid crystal monitor 216 can be used as an electronic viewfinder.

The shutter release switch 22 is an imaging instruction unit for inputting an instruction to capture a still image or a moving image, and is a two-stage stroke switch with so-called "half-press" and "full-press" positions.

In the still image capturing mode, an S1-on signal is output when the shutter release switch 22 is half-pressed, and an S2-on signal is output when it is then pressed further to the full-press position. When the S1-on signal is output, the main-body-side CPU 220 executes shooting preparation processing such as AF control (automatic focus adjustment) and AE control (automatic exposure control); when the S2-on signal is output, it executes still image capturing and recording processing.

Note that AF control and AE control are each performed automatically when the auto mode is set via the operation unit 222; needless to say, they are not performed when the manual mode is set.

In the moving image capturing mode, when the shutter release switch 22 is fully pressed and the S2-on signal is output, the camera body 200 enters a moving image recording mode in which it starts recording the moving image and executes moving image processing and recording. When the shutter release switch 22 is fully pressed again and the S2-on signal is output once more, the camera body 200 enters a standby state and pauses the moving image recording process.

The shutter release switch 22 is not limited to a two-stage stroke switch with half-press and full-press positions; it may output the S1-on signal and the S2-on signal with a single operation, or separate switches may be provided to output the S1-on signal and the S2-on signal, respectively.

In a configuration in which operation instructions are given via a touch panel or the like, the operation instruction may be output by touching an area on the touch panel screen corresponding to the displayed instruction; the form of the operation means is not limited to these, as long as it instructs the shooting preparation processing or the imaging processing.

A still image or moving image acquired by imaging is compressed by the compression/decompression processing unit 208, and the compressed image data is made into an image file in which the required attached information, such as the imaging date and time, GPS information, and imaging conditions (F-number, shutter speed, ISO sensitivity, etc.), is added to the header; the file is then stored on the memory card 212 via the media control unit 210.

The main-body-side CPU 220 performs overall control of the operation of the camera body 200 and the driving of the optical members of the interchangeable lens 100; based on inputs from the operation unit 222 including the shutter release switch 22, it controls each part of the camera body 200 and the interchangeable lens 100.

The clock unit 224 measures time as a timer based on commands from the main-body-side CPU 220, and also measures the current date and time as a calendar.

The flash ROM 226 is a readable and writable non-volatile memory and stores setting information.

The ROM 228 stores the camera control program executed by the main-body-side CPU 220, an image alignment assistance program for executing imaging in the compositing image capturing mode according to the present invention, defect information of the image sensor 201, and various parameters and tables used for image processing and the like. The main-body-side CPU 220 controls each part of the camera body 200 and the interchangeable lens 100 in accordance with the camera control program or the image alignment assistance program stored in the ROM 228, using the RAM 207 as a work area.

The AF control unit 230, which functions as an automatic focus adjustment unit, calculates the defocus amount required for phase-difference AF control and, based on the calculated defocus amount, notifies the interchangeable lens 100 of a position command (in-focus position) to which the focus lens should move, via the main-body-side CPU 220 and the main-body-side communication unit 250.

Here, the AF control unit 230 includes a phase difference detection unit and a defocus amount calculation unit. Within the AF area of the image sensor 201 (an area containing the main subject specified by the user, an area of the main subject detected automatically by face detection or the like, or a default area), the phase difference detection unit acquires pixel data (first pixel values and second pixel values) from a first pixel group consisting of the first pixels S_A and a second pixel group consisting of the second pixels S_B, both provided with color filters of the same color, and detects the phase difference based on these first and second pixel values. This phase difference can be calculated from the shift amount in the pupil division direction between the first pixel values and the second pixel values at which the correlation between the plurality of first pixel values of the first pixel group and the plurality of second pixel values of the second pixel group is maximized (that is, when the integrated absolute difference between the plurality of first pixel values and the plurality of second pixel values is minimized).
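
For illustration, this correlation search can be sketched as follows. This is a minimal example assuming one-dimensional arrays of first and second pixel values of equal length; the function name and the search range are illustrative, not details of the embodiment.

```python
import numpy as np

def detect_phase_shift(first_vals: np.ndarray, second_vals: np.ndarray,
                       max_shift: int = 16) -> int:
    """Return the shift (in pixels, pupil division direction) that minimizes
    the integrated absolute difference between the two pixel groups."""
    n = len(first_vals)
    best_shift, best_sad = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        a = first_vals[max(0, shift):n + min(0, shift)].astype(np.float64)
        b = second_vals[max(0, -shift):n + min(0, -shift)].astype(np.float64)
        sad = np.abs(a - b).sum()  # integrated absolute difference at this shift
        if sad < best_sad:
            best_shift, best_sad = shift, sad
    return best_shift
```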

The defocus amount calculation unit calculates the defocus amount by multiplying the phase difference detected by the phase difference detection unit by a coefficient corresponding to the current F-number (ray angle) of the interchangeable lens 100.

The focus lens position command corresponding to the defocus amount calculated by the AF control unit 230 is sent to the interchangeable lens 100, and the lens-side CPU 120 of the interchangeable lens 100, having received the focus lens position command, moves the focus lens via the focus lens control unit 116 to control the position (in-focus position) of the focus lens.

The AE control unit 232 detects the brightness of the subject (subject luminance) and calculates the numerical value required for AE control and AWB (Auto White Balance) control corresponding to the subject luminance, namely the exposure value (EV value). The AE control unit 232 calculates the EV value from the luminance of the image acquired via the image sensor 201 and the shutter speed and F-number in effect when that luminance was acquired.
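
As a rough sketch of such an EV calculation: the ISO-100 base, the target luminance level, and the logarithmic form below are common photographic conventions assumed here, not details taken from the embodiment.

```python
import math

def compute_ev(f_number: float, shutter_s: float, iso: float,
               mean_luma: float, target_luma: float = 118.0) -> float:
    """Scene EV estimated from the capture settings plus the deviation of
    the measured image luminance from the target luminance."""
    # EV of the settings used for the frame, referenced to ISO 100.
    ev_at_iso100 = math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100.0)
    # A brighter-than-target frame means the scene EV is correspondingly higher.
    return ev_at_iso100 + math.log2(mean_luma / target_luma)
```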

Based on the EV value obtained from the AE control unit 232, the main-body-side CPU 220 can determine the F-number, shutter speed, and ISO sensitivity from a predetermined program diagram and perform AE control.

The white balance correction unit 234 calculates a white balance gain (WB gain) Gr, Gg, Gb for each color of the RGB data (R data, G data, and B data) and performs white balance correction by multiplying the R data, G data, and B data by the calculated WB gains Gr, Gg, and Gb, respectively. As a method of calculating the WB gains Gr, Gg, and Gb, one approach is to identify the type of light source illuminating the subject based on scene recognition (outdoor/indoor determination, etc.) using the brightness of the subject (EV value) and the color temperature of the ambient light, and to read the WB gain corresponding to the identified light source type from a storage unit in which an appropriate WB gain is stored in advance for each light source type; other known methods that determine the WB gains Gr, Gg, and Gb using at least the EV value are also conceivable.

The wireless communication unit 236 performs short-range wireless communication in accordance with standards such as Wi-Fi (Wireless Fidelity) (registered trademark) and Bluetooth (registered trademark), and exchanges the necessary information with nearby digital devices (mobile terminals such as smartphones).

The GPS receiving unit 238 receives GPS signals transmitted from a plurality of GPS satellites in accordance with instructions from the main-body-side CPU 220, executes positioning calculation processing based on the received GPS signals, and acquires GPS information consisting of the latitude, longitude, and altitude of the camera body 200. The acquired GPS information can be recorded in the header of the image file as attached information indicating the imaging position of the captured image.

The power supply control unit 240 supplies the power supply voltage provided by the battery 242 to each part of the camera body 200 in accordance with commands from the main-body-side CPU 220. The power supply control unit 240 also supplies the power supply voltage provided by the battery 242 to each part of the interchangeable lens 100 via the body mount 260 and the lens mount 160 in accordance with commands from the main-body-side CPU 220.

The lens power switch 244 switches the power supply voltage applied to the interchangeable lens 100 via the body mount 260 and the lens mount 160 on and off and switches its level in accordance with commands from the main-body-side CPU 220.

The main-body-side communication unit 250 transmits and receives request signals and response signals (bidirectional communication) to and from the lens-side communication unit 150 of the interchangeable lens 100 connected via the body mount 260 and the lens mount 160, in accordance with commands from the main-body-side CPU 220. As shown in FIG. 1, the body mount 260 is provided with a plurality of terminals 260A; when the interchangeable lens 100 is attached to the camera body 200 (the lens mount 160 and the body mount 260 are connected), the plurality of terminals 260A (FIG. 1) provided on the body mount 260 and a plurality of terminals (not shown) provided on the lens mount 160 are electrically connected, enabling bidirectional communication between the main-body-side communication unit 250 and the lens-side communication unit 150.

The built-in flash 30 (FIG. 1) is, for example, a TTL (Through The Lens) automatic dimming flash and includes a flash light emitting unit 270 and a flash control unit 272.

The flash control unit 272 has a function of adjusting the emission amount (guide number) of the flash light emitted from the flash light emitting unit 270. That is, in synchronization with a flash imaging instruction from the main-body-side CPU 220, the flash control unit 272 causes the flash light emitting unit 270 to pre-emit a small amount of flash light (dimming emission), determines the emission amount of the main flash based on the reflected light (including ambient light) incident via the imaging optical system 102 of the interchangeable lens 100, and causes the flash light emitting unit 270 to emit flash light of the determined amount (main emission). When the HDR imaging mode is selected, the flash control unit 272 performs emission control that differs from that used for capturing a normal flash image; the details will be described later.

The FPS 280 constitutes the mechanical shutter of the imaging device 10 and is disposed immediately in front of the image sensor 201. The FPS control unit 296 controls the opening and closing of the front and rear curtains of the FPS 280 based on input information from the main-body-side CPU 220 (S2-on signal, shutter speed, etc.) and thereby controls the exposure time (shutter speed) of the image sensor 201.

Next, the control of the imaging device 10 that captures a plurality of images with different exposures for use in HDR composition, which expands the dynamic range in the HDR imaging mode, and performs HDR composition of the captured images will be described.

[First Embodiment of HDR Processing]

FIG. 5 is a functional block diagram showing a first embodiment of the main-body-side CPU 220 functioning as the image processing device according to the present invention, mainly for the case where image processing for HDR composition (HDR processing) is executed based on a plurality of images for HDR composition captured in the HDR imaging mode.

When imaging in the HDR imaging mode, the main-body-side CPU 220 mainly performs overall control of imaging in the HDR imaging mode and, as shown in FIG. 5, functions as a distance distribution information acquisition unit 220A, a light intensity distribution information calculation unit 220B, a composition ratio determination unit 220C, and an image composition unit 220D.

That is, when imaging in the HDR imaging mode, the main-body-side CPU 220 captures a plurality (two in this example) of non-emission images under different exposure conditions without flash emission, and captures an emission image under flash emission with the same exposure condition as one of the plurality of non-emission images.

Here, the plurality of exposure conditions corresponding to the plurality of non-emission images include an exposure condition that is under-exposed and an exposure condition that is over-exposed relative to the proper exposure. In this example, the main-body-side CPU 220 has two non-emission images captured under two exposure conditions, under-exposure and over-exposure, relative to the proper exposure determined according to the EV value calculated by the AE control unit 232. The exposure correction values for under-exposure and over-exposure relative to the proper exposure (for example, ±3 EV) may be set arbitrarily by the user, or may be set automatically according to the brightness of the scene (EV value).

The main-body-side CPU 220 also captures the emission image under flash emission with the same exposure condition as one of the plurality of non-emission images (in this example, the under-exposure condition).

The two non-emission images and the one emission image are images of the same subject, and are preferably continuous shots captured with the shortest possible interval between exposures.

In FIG. 5, the image acquisition unit 221 acquires a non-emission image captured with under-exposure (first non-emission image) and a non-emission image captured with over-exposure (second non-emission image), both captured by the imaging unit (interchangeable lens 100 and image sensor 201) without flash emission, as well as an emission image captured by the imaging unit under the flash light emitted from the flash light emitting unit 270 with the same exposure condition as the first non-emission image.

The image acquisition unit 221 may acquire the first non-emission image, the second non-emission image, and the emission image temporarily held in the RAM 207 via the image input controller 205 from the RAM 207, or may acquire them directly from the imaging unit via the image input controller 205.

In addition to the first non-emission image, the second non-emission image, and the emission image, each formed by adding the pixel values of the paired first pixel S_A and second pixel S_B of the image sensor 201 shown in FIG. 4 to obtain one pixel value per microlens 201A, the image input controller 205 of this example also acquires, for one of those three images, a first image consisting of the pixel group of only the first pixels S_A and a second image consisting of the pixel group of only the second pixels S_B. The first image and the second image are phase difference images having a phase difference according to the subject distance.

The distance distribution information acquisition unit 220A acquires the first image and the second image, which are phase difference images having a phase difference from each other, from the image acquisition unit 221, and acquires distance distribution information indicating the distribution of subject distances within the image based on these first and second images.

That is, the distance distribution information acquisition unit 220A calculates the shift amount (phase difference) between feature points in the first image and the corresponding feature points in the second image, calculates the defocus amount of each feature point based on the calculated phase difference and the current F-number of the diaphragm 108, and further calculates the distance of each feature point (subject) from the calculated defocus amount and the current lens position of the interchangeable lens 100 (the focus lens of the imaging optical system 102). For example, a feature point with a defocus amount of zero lies at the in-focus distance of the current focus lens position, and as the defocus amount of a feature point shifts toward the front-focus or back-focus side, the distance of that feature point moves correspondingly farther than or closer than the in-focus distance.
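
One way to carry out the last step can be sketched under a simple thin-lens assumption; the sign convention for the defocus amount and all parameter names are illustrative, not taken from the embodiment.

```python
def defocus_to_distance(defocus_mm: float, focus_dist_mm: float,
                        focal_len_mm: float) -> float:
    """Offset the image plane by the defocus amount and re-project it
    through the thin-lens equation 1/f = 1/u + 1/v to a subject distance."""
    # Image distance of the plane that is currently in focus.
    v_focused = 1.0 / (1.0 / focal_len_mm - 1.0 / focus_dist_mm)
    # Image-plane position of the defocused feature point.
    v = v_focused + defocus_mm
    # Subject distance that would focus at that image-plane position.
    return 1.0 / (1.0 / focal_len_mm - 1.0 / v)
```

With defocus_mm equal to zero, the function returns focus_dist_mm, matching the statement above that a zero defocus amount corresponds to the in-focus distance.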

The light intensity distribution information calculation unit 220B acquires the emission image and the first non-emission image, captured under the same exposure condition, from the image acquisition unit 221, and acquires the distance distribution information indicating the distribution of subject distances within the image from the distance distribution information acquisition unit 220A. Then, based on the acquired emission image, first non-emission image, and distance distribution information, it calculates light intensity distribution information, which is distribution information of an index indicating the intensity of the light striking the subject (ambient light, not flash light).

The method by which the light intensity distribution information calculation unit 220B calculates the light intensity distribution information is described in detail below.

Let i be a parameter indicating the coordinates of each pixel of the first non-emission image, Y_i the luminance value of pixel i, Y_Li the luminance value of the pixel i at the same coordinates in the emission image, and d_i the distance information for those coordinates (the distance from the imaging device 10 to the subject appearing there). The light intensity distribution information S_i is then defined by the following equation:

[Equation 1]

S_i = Y_i / {(Y_Li − Y_i) × d_i^2}

The parameter i is not limited to a parameter indicating pixel coordinates; it may also be a parameter indicating the position of each region obtained by dividing the image into a plurality of regions. In that case, the luminance values Y_i and Y_Li, the distance information d_i, and the light intensity distribution information S_i are likewise the luminance value (representative luminance value), distance information, and light intensity distribution information for each region i.

Equation 1 is derived as follows.

Let B_i be the intensity of the light striking the subject at pixel i, R_i the reflectance of the subject, and K the photoelectric conversion coefficient of the image sensor 201. The luminance Y_i of the first non-emission image is then expressed by the following equation:

[Equation 2]

Y_i = B_i × R_i × K

Let F_i be the intensity of the flash light striking the subject. The luminance Y_Li of the emission image is then expressed by the following equation:

[Equation 3]

Y_Li = (B_i + F_i) × R_i × K

Let Fs be the intensity of the flash light at unit distance from the flash light emitting unit 270. By the inverse square law of light attenuation, F_i is expressed by the following equation:

[Equation 4]

F_i = Fs / d_i^2

Using Equations 2 to 4, Equation 1 for the light intensity distribution information can be rewritten as Equation 5:

[Equation 5]

S_i = Y_i / {(Y_Li − Y_i) × d_i^2}
    = (B_i × R_i × K) / [{(B_i + F_i) × R_i × K − B_i × R_i × K} × d_i^2]
    = B_i / {F_i × d_i^2}
    = B_i / Fs

Since Fs is constant regardless of the pixel i, the light intensity distribution information S_i can be regarded as an index that allows the light intensity B_i striking the subject to be compared relatively for each pixel i (i = 1 to N, where N is the number of pixels).
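
Per-pixel evaluation of Equation 1 can be sketched as follows, assuming NumPy arrays for the luminance planes and the distance map. Pixels the flash light does not reach, where Y_Li − Y_i is zero, are left undefined here; that case is taken up in the fourth embodiment.

```python
import numpy as np

def light_intensity_distribution(y_non_flash: np.ndarray,
                                 y_flash: np.ndarray,
                                 distance: np.ndarray) -> np.ndarray:
    """Per-pixel index S_i = Y_i / ((Y_Li - Y_i) * d_i^2) from Equation 1.
    Inputs are the luminance planes of the non-emission and emission images
    (same exposure condition) and the per-pixel subject distance."""
    y0 = y_non_flash.astype(np.float64)
    delta = y_flash.astype(np.float64) - y0  # flash contribution (Y_Li - Y_i)
    s = np.full_like(y0, np.nan)             # undefined where the flash did not reach
    reached = delta > 0
    s[reached] = y0[reached] / (delta[reached] * distance[reached] ** 2)
    return s
```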

The composition ratio determination unit 220C determines the composition ratio of the plurality of non-emission images (the first non-emission image and the second non-emission image) based on the light intensity distribution information. Locations where the value of the light intensity distribution information S_i is large are strongly lit, so it is preferable to use the first non-emission image, which is the under-exposed image; locations where the value of S_i is small receive little light, so it is preferable to use the second non-emission image, which is the over-exposed image.

Next, methods by which the composition ratio determination unit 220C determines the composition ratio will be described.

<First method of determining the composition ratio>

When determining the composition ratio for each corresponding pixel of the plurality of non-emission images, the composition ratio determination unit 220C, based on the light intensity distribution information S_i, makes the mixing ratio of the first non-emission image, captured under the exposure condition with the smallest exposure value among the plurality of non-emission images, larger than the mixing ratio of the second non-emission image, captured under the exposure condition with the largest exposure value, for pixels where the light intensity is high; for pixels where the light intensity is low, it makes the mixing ratio of the second non-emission image larger than that of the first non-emission image. The composition ratio may also be determined for each region obtained by dividing the image into a plurality of regions.

FIGS. 6 to 8 are graphs each showing the relationship between the light intensity distribution information S_i and the mixing ratio αi of the second non-emission image.

As shown in FIG. 6, let αi be the mixing ratio of the second non-emission image, and let S_min and S_max be the minimum and maximum values of the light intensity distribution information S_i. In the simplest case, the composition ratio determination unit 220C determines the composition ratio of the first non-emission image and the second non-emission image by linearly interpolating the mixing ratio αi (= 0 to 1) over the light intensity distribution information S_i from the minimum value S_min to the maximum value S_max. In this case, the mixing ratio of the first non-emission image is (1 − αi), and the composition ratio of the first non-emission image to the second non-emission image is (1 − αi) : αi.

Alternatively, as shown in FIG. 7, the composition ratio determination unit 220C can provide a minimum value α_min (> 0) and a maximum value α_max (< 1) for the mixing ratio αi, and determine the composition ratio of the first non-emission image and the second non-emission image by linearly interpolating αi within the range from α_min to α_max over the light intensity distribution information S_i from S_min to S_max.

Furthermore, as shown in FIG. 8, the composition ratio determination unit 220C can provide a minimum value α_min (> 0) and a maximum value α_max (< 1) for the mixing ratio αi, and determine the composition ratio of the first non-emission image and the second non-emission image by nonlinearly interpolating αi within the range from α_min to α_max (nonlinear interpolation such that αi decreases monotonically) over the light intensity distribution information S_i from S_min to S_max.
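
The mappings of FIGS. 6 to 8 can be sketched as a single function. The smoothstep curve below stands in for the unspecified monotonically decreasing nonlinear interpolation of FIG. 8; it is an assumed choice, not the curve of the embodiment.

```python
import numpy as np

def over_image_mix_ratio(s: np.ndarray,
                         alpha_min: float = 0.0,
                         alpha_max: float = 1.0,
                         nonlinear: bool = False) -> np.ndarray:
    """Map light intensity S to the mixing ratio of the over-exposed image:
    alpha_max where light is weakest (S_min), alpha_min where it is
    strongest (S_max), decreasing monotonically in between."""
    t = (s - np.nanmin(s)) / (np.nanmax(s) - np.nanmin(s))  # 0..1 over S_min..S_max
    if nonlinear:
        t = t * t * (3.0 - 2.0 * t)  # smoothstep: one monotone nonlinear option
    return alpha_max - t * (alpha_max - alpha_min)
```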

Returning to FIG. 5, the image composition unit 220D composites the under-exposed first non-emission image and the over-exposed second non-emission image according to the composition ratio determined by the composition ratio determination unit 220C, and generates a composite image (HDR image) with an expanded dynamic range. That is, the image composition unit 220D composites the first non-emission image and the second non-emission image by applying the composition ratio determined for each corresponding pixel of the two images. When the composition ratio is determined for each region obtained by dividing the image into a plurality of regions, the HDR image is generated by compositing the first non-emission image and the second non-emission image using the composition ratio determined for each corresponding region.
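
A sketch of the per-pixel composition follows. It assumes 8-bit images that have already been brought to a common brightness scale; a full HDR pipeline would normalize the two exposures before blending, a step omitted here.

```python
import numpy as np

def hdr_blend(under_img: np.ndarray, over_img: np.ndarray,
              alpha: np.ndarray) -> np.ndarray:
    """Per-pixel weighted sum (1 - alpha) * under + alpha * over, where alpha
    is the mixing ratio of the over-exposed image; alpha is broadcast over
    the color channels if the inputs are RGB."""
    if under_img.ndim == 3 and alpha.ndim == 2:
        alpha = alpha[..., np.newaxis]
    out = ((1.0 - alpha) * under_img.astype(np.float64)
           + alpha * over_img.astype(np.float64))
    return np.clip(out, 0, 255).astype(np.uint8)
```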

The HDR image generated in this way is an HDR image with excellent gradation characteristics, in which the contrast inherent to the subject is preserved and the difference between light and dark is adjusted according to the brightness of the light striking the subject.

[Second Embodiment of HDR Processing]

FIG. 9 is a functional block diagram showing a second embodiment of the main-body-side CPU 220 functioning as the image processing device according to the present invention. In FIG. 9, parts common to the first embodiment shown in FIG. 5 are given the same reference numerals, and their detailed description is omitted.

The main-body-side CPU 220 of the second embodiment shown in FIG. 9 differs from the first embodiment shown in FIG. 5 in that the function of a distribution information creation unit 220E is added.

The distribution information creation unit 220E creates distribution information indicating the frequency distribution of the light intensity (for example, a frequency distribution chart or a frequency distribution table) based on the light intensity distribution information S_i calculated by the light intensity distribution information calculation unit 220B.

FIG. 10 is a frequency distribution chart showing an example of the frequency distribution of light intensity.

For example, outdoors in the daytime on a sunny day, subjects with a wide dynamic range, from dark subjects not lit by sunlight (shaded subjects) to bright subjects lit by sunlight (sunlit subjects), become the imaging target; in this case, the imaging target can be broadly divided into shaded subjects and sunlit subjects. A light intensity frequency distribution chart generated based on the light intensity distribution information of such a scene has two peaks (a first vertex P1 and a second vertex P2), as shown in FIG. 10. The first vertex P1 corresponds to the frequency of the light intensity distribution information of the subject region corresponding to the shaded subjects, and the second vertex P2 corresponds to the frequency of the light intensity distribution information of the subject region corresponding to the sunlit subjects.

The light intensity frequency distribution chart can have two vertices as described above not only for daytime outdoor scenes on a sunny day, but also for scenes where sunlight enters part of an indoor space, scenes containing both a region illuminated by an artificial light source and a shadowed region, and so on.

In this example, distribution information indicating the frequency distribution of the light intensity is created, but distribution information indicating the relative frequency distribution of the light intensity may be created instead, or distribution information obtained by curve-fitting the frequency distribution or relative frequency distribution may be created.

Based on the distribution information created by the distribution information creation unit 220E, the composition ratio determination unit 220C shown in FIG. 9 obtains a first light intensity (S_dark) corresponding to the first vertex P1, where the light intensity frequency is high, and a second light intensity (S_bright) corresponding to the second vertex P2, where the light intensity frequency is also high, as shown in FIG. 10.

Then, when determining the composition ratio for each corresponding pixel of the first non-emission image and the second non-emission image, the composition ratio determination unit 220C, as shown in FIG. 11, sets the mixing ratio αi of the second non-emission image to the maximum value α_max for light intensities at or below the first light intensity (S_dark), sets it to the minimum value α_min for light intensities at or above the second light intensity (S_bright), and decreases it linearly (monotonically) between α_max and α_min for light intensities greater than S_dark and less than S_bright.

A pixel or region with a light intensity at or below the first light intensity (S_dark) is considered a shaded area, and the mixing ratio αi of the over-exposed second non-emission image is preferably set to the maximum value α_max; conversely, a pixel or region with a light intensity at or above the second light intensity (S_bright) is considered a sunlit area, and the mixing ratio αi of the over-exposed second non-emission image is preferably set to the minimum value α_min (the mixing ratio of the under-exposed first non-emission image to its maximum). This enriches the gradation of the highlight and shadow portions.
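
A sketch of this second embodiment: build the frequency distribution of S_i, take the two dominant local maxima as S_dark and S_bright, and ramp the mixing ratio between them. The bin count and the peak-picking rule are assumed details, and the code presumes a clearly bimodal distribution as in FIG. 10.

```python
import numpy as np

def mix_ratio_from_two_peaks(s: np.ndarray, bins: int = 64,
                             alpha_min: float = 0.1,
                             alpha_max: float = 0.9) -> np.ndarray:
    """alpha_max at or below the shade peak S_dark, alpha_min at or above
    the sunlit peak S_bright, linear monotone decrease in between."""
    hist, edges = np.histogram(s[np.isfinite(s)], bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    # Local maxima: bins at least as high as both neighbors.
    peaks = [i for i in range(1, bins - 1)
             if hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1]]
    top2 = sorted(sorted(peaks, key=lambda i: hist[i])[-2:])  # two highest, by S
    s_dark, s_bright = centers[top2[0]], centers[top2[1]]
    t = np.clip((s - s_dark) / (s_bright - s_dark), 0.0, 1.0)
    return alpha_max - t * (alpha_max - alpha_min)
```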

FIG. 12 is a frequency distribution chart similar to the frequency distribution of light intensity shown in FIG. 10.

Since the frequency distribution chart shown in FIG. 12 has two peaks where the frequency is high, there is a valley (bottom point V) between the two peaks where the frequency is low.

In another embodiment of the composition ratio determination unit 220C shown in FIG. 9, a light intensity (St) corresponding to the bottom point V, where the light intensity frequency is low, is obtained based on the distribution information created by the distribution information creation unit 220E, as shown in FIG. 12.

Then, when determining the composition ratio for each corresponding pixel of the first non-emission image and the second non-emission image, the composition ratio determination unit 220C uses the light intensity (St) as a threshold, as shown in FIG. 13: for light intensities at or below the threshold, it sets the mixing ratio αi of the second non-emission image to the maximum value α_max, and for light intensities exceeding the threshold, it sets αi to the minimum value α_min (the mixing ratio of the under-exposed first non-emission image to its maximum).

For light intensities within a certain range ΔS before and after the threshold light intensity (St), this includes the case where the mixing ratio of the second non-emission image is varied continuously between the maximum value α_max and the minimum value α_min according to the magnitude of the light intensity.

[Third Embodiment of HDR Processing]

FIG. 14 is a functional block diagram showing a third embodiment of the main-body-side CPU 220 functioning as the image processing device according to the present invention. In FIG. 14, parts common to the first embodiment shown in FIG. 5 are given the same reference numerals, and their detailed description is omitted.

The main-body-side CPU 220 of the third embodiment shown in FIG. 14 differs from the first embodiment shown in FIG. 5 in that the function of a luminance distribution information calculation unit 220F is added.

The luminance distribution information calculation unit 220F calculates luminance distribution information indicating the distribution of luminance within at least one of the plurality of non-emission images (the first non-emission image and the second non-emission image). The luminance distribution information can be created, for example, as an image consisting of the luminance signals of the first non-emission image, or as information indicating the luminance (representative luminance) of each region into which the image is divided.

The composition ratio determination unit 220C shown in FIG. 14 determines the composition ratio of the plurality of non-emission images (the first non-emission image and the second non-emission image) based on the light intensity distribution information calculated by the light intensity distribution information calculation unit 220B and the luminance distribution information calculated by the luminance distribution information calculation unit 220F.

Specifically, when determining the composition ratio for each corresponding pixel or region of the first non-emission image and the second non-emission image, let i be a parameter indicating the position of the pixel or region, αi the mixing ratio of the second non-emission image based on the light intensity distribution information, and βi the mixing ratio of the second non-emission image based on the luminance distribution information, which takes a smaller value the higher the luminance. The composition ratio determination unit 220C then calculates the mixing ratio γi of the second non-emission image based on both the light intensity distribution information and the luminance distribution information by the following equation:

[Equation 6]

γi = αi × βi

In this case, the mixing ratio of the first non-emission image is (1 − γi), and the composition ratio of the first non-emission image to the second non-emission image is (1 − γi) : γi.
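
A sketch of Equation 6 follows. The linear mapping from luminance to βi is an assumed example; the text above only requires that βi decrease as the luminance increases.

```python
import numpy as np

def combined_mix_ratio(alpha: np.ndarray, luma: np.ndarray) -> np.ndarray:
    """gamma_i = alpha_i * beta_i, with beta_i decreasing as the pixel
    luminance increases (here a simple linear decrease, an assumed choice)."""
    l = luma.astype(np.float64)
    beta = 1.0 - (l - l.min()) / (l.max() - l.min() + 1e-12)  # bright -> small beta
    return alpha * beta
```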

[Fourth Embodiment of HDR Processing]

FIG. 15 is a functional block diagram showing a fourth embodiment of the main-body-side CPU 220 functioning as the image processing device according to the present invention. In FIG. 15, parts common to the third embodiment shown in FIG. 14 are given the same reference numerals, and their detailed description is omitted.

The main-body-side CPU 220 of the fourth embodiment shown in FIG. 15 differs from the third embodiment shown in FIG. 14 in that the function of a flash-light-unreached pixel detection unit 220G is added.

The flash-light-unreached pixel detection unit 220G calculates the difference (Y_Li − Y_i) between the luminance value Y_i of pixel i of the first non-emission image and the luminance value Y_Li of the pixel i at the same coordinates in the emission image, and detects any pixel i for which the difference (Y_Li − Y_i) is 0 as a flash-light-unreached pixel.

The composition ratio determination unit 220C shown in FIG. 15 determines the composition ratio of the plurality of non-emission images (the first non-emission image and the second non-emission image) based on the light intensity distribution information calculated by the light intensity distribution information calculation unit 220B and the luminance distribution information calculated by the luminance distribution information calculation unit 220F; however, when a flash-light-unreached pixel is detected by the flash-light-unreached pixel detection unit 220G, the composition ratio for that pixel is determined based only on the luminance distribution information.

This is because when the background of the subject or the like is far away, the flash light does not reach it, (Y_Li − Y_i) becomes 0, and the light intensity distribution information S_i shown in Equation 1 cannot be calculated; as a result, the mixing ratio αi of the second non-emission image based on S_i cannot be obtained either.

In this case, therefore, by determining the composition ratio of the plurality of non-emission images based only on the luminance distribution information, an appropriate composition ratio can be determined even for subjects that the flash light does not reach.

Specifically, the composition ratio determination unit 220C calculates the mixing ratio γi of the second non-emission image based on the light intensity distribution information and the luminance distribution information using Equation 6; when a pixel that the flash light does not reach (a flash-light-unreached pixel) is detected by the flash-light-unreached pixel detection unit 220G, the mixing ratio αi of the second non-emission image for that pixel is set to 1, so that the mixing ratio γi of the second non-emission image is obtained from the luminance-based mixing ratio βi alone.
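
A sketch of this fallback follows. The unreached test uses Y_Li − Y_i ≤ 0 rather than strict equality so the sketch tolerates sensor noise; that tolerance is an assumption, since the text above states the difference is 0.

```python
import numpy as np

def mix_ratio_with_fallback(y_non_flash: np.ndarray, y_flash: np.ndarray,
                            alpha: np.ndarray, beta: np.ndarray) -> np.ndarray:
    """Where the flash contribution (Y_Li - Y_i) vanishes, S_i and hence
    alpha_i are undefined, so alpha_i is forced to 1 and gamma_i = beta_i."""
    unreached = (y_flash.astype(np.float64) - y_non_flash.astype(np.float64)) <= 0
    alpha = alpha.copy()
    alpha[unreached] = 1.0
    return alpha * beta  # Equation 6 with the per-pixel fallback applied
```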

The flash-light-unreached pixel detection unit 220G is not limited to detecting flash-light-unreached pixels; when the image is divided into a plurality of regions, it may function as a detection unit that detects flash-light-unreached regions.

The flash-light-unreached pixel detection unit 220G is also used by the flash control unit 272 to control the emission amount of the flash light emitted from the flash light emitting unit 270; the details of the emission amount control by the flash control unit 272 will be described later.

[Fifth Embodiment of HDR Processing]

FIG. 16 is a functional block diagram showing a fifth embodiment of the main-body-side CPU 220 functioning as the image processing device according to the present invention. In FIG. 16, parts common to the first embodiment shown in FIG. 5 are given the same reference numerals, and their detailed description is omitted.

The main-body-side CPU 220 of the fifth embodiment shown in FIG. 16 does not have the distance distribution information acquisition unit 220A described in the first embodiment, and instead acquires the distance distribution information from an external distance distribution information acquisition unit 223.

The distance distribution information acquisition unit 223 can be configured, for example, by a distance image sensor in which a plurality of light-receiving elements are arranged two-dimensionally. As the distance image sensor, one that acquires a distance image by the TOF (Time Of Flight) method is conceivable. The TOF method determines the distance to the subject by irradiating the subject with light and measuring the time until the reflected light is received by the sensor. Two approaches are known: one irradiates the subject with pulsed light, receives the reflected light with a distance image sensor having a plurality of pixels, and acquires a distance image of the subject from the amount of light received (received light intensity) at each pixel of the distance image sensor; the other irradiates the subject with light modulated at a high frequency and acquires a distance image by detecting the phase shift from the time of irradiation until the reflected light is received.

A three-dimensional laser measuring instrument, a stereo camera, or the like can also be applied as the distance distribution information acquisition unit 223. When the distance distribution information acquisition unit 223 is used, the image sensor 201 is not limited to an image sensor of the type in this example and may be an image sensor incapable of distance measurement.

[Image Processing Method]

FIG. 17 is a flowchart showing an embodiment of the image processing method according to the present invention. The following mainly describes the HDR processing operation performed by the main body side CPU 220, which has the functions of the units shown in FIG. 5 and elsewhere.

In FIG. 17, the distance distribution information acquisition unit 220A of the main body side CPU 220 acquires distance distribution information indicating the distribution of distances to the subject in the image (step S10). In this example, the distance distribution information acquisition unit 220A acquires from the image acquisition unit 221 a first image and a second image, which are phase difference images having a phase difference from each other, and acquires the distance distribution information based on these first and second images.

The image acquisition unit 221 also acquires a plurality of non-light-emitting images (a first non-light-emitting image and a second non-light-emitting image) of the same subject, captured under mutually different exposure conditions without emission of flash light, and a light-emitting image captured under flash light emission with the same exposure condition as one of the plurality of non-light-emitting images (in this example, the exposure condition of the under-exposed first non-light-emitting image) (step S12).

Next, the detailed operation in step S12 will be described with reference to the flowchart shown in FIG. 18.

In FIG. 18, the flash control unit 272 determines the dimming light emission amount for the dimming image to be emitted from the flash light emitting unit 270, based on the distance distribution information acquired in step S10 (step S100). Assuming that the average reflectance of the subjects within the angle of view is 18%, the dimming light emission amount is calculated from the distance distribution as the maximum light emission amount that does not cause blown-out highlights.
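The specification does not give the exact computation; the sketch below is a hypothetical reading of it, assuming the flash light returned from a subject at distance d falls off as 1/d² and that the nearest subject saturates first (all names and the constant k are assumptions):

```python
import numpy as np

def dimming_emission_amount(distance_map, saturation_level=1.0,
                            reflectance=0.18, k=1.0):
    """Largest emission E such that no subject saturates, assuming the
    flash light returned from distance d scales as k * R * E / d**2.

    distance_map: per-pixel subject distances from step S10 (meters)
    k: lumped lens/sensor/exposure constant (hypothetical)
    """
    d_min = float(np.min(distance_map))  # the nearest subject saturates first
    return saturation_level * d_min ** 2 / (k * reflectance)
```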

Next, the flash control unit 272 causes the flash light emitting unit 270 to emit flash light at the calculated dimming light emission amount, and the main body side CPU 220 acquires a dimming image captured under this dimming emission with the same exposure condition (under exposure) as the first non-light-emitting image (step S102).

Subsequently, the main body side CPU 220 determines, from the detection output of the flash light non-reaching pixel detection unit 220G (FIG. 15), whether there is a pixel or region that the flash light has not reached (step S104). The flash light non-reaching pixel detection unit 220G acquires one frame of the live view image (a non-light-emitting image) captured under the same exposure condition as the dimming image, and detects flash light non-reaching pixels or regions by comparing the dimming image with this non-light-emitting image.
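A minimal sketch of such a comparison (the threshold value and function names are assumptions, not the patent's specification): pixels whose value barely changes between the dimming image and the same-exposure non-light-emitting frame received essentially no flash light.

```python
import numpy as np

def flash_not_reached_mask(dimming_img, non_emission_img, threshold=4):
    """Boolean mask of pixels the pre-flash did not measurably brighten.

    Both frames share the same (under) exposure condition, so any
    increase in pixel value is attributable to the flash light.
    """
    diff = dimming_img.astype(np.int32) - non_emission_img.astype(np.int32)
    return diff <= threshold
```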

When it is determined in step S104 that the flash light has reached the entire area (in the case of "No"), the main body side CPU 220 determines whether there is a blown-out (whiteout) region in the dimming image (step S106). If the dimming image contains a region at the maximum pixel value (255 when pixel values are expressed in 8 bits), that region can be judged to be blown out.
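A direct reading of this 8-bit criterion, for illustration:

```python
import numpy as np

def has_whiteout(dimming_img, max_value=255):
    """True if any pixel of the 8-bit dimming image is saturated (blown out)."""
    return bool(np.any(dimming_img >= max_value))
```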

When it is determined in step S106 that there is no blown-out region in the dimming image (in the case of "No"), the dimming image is acquired as the light-emitting image (step S108). In this case there is no need to capture a light-emitting image again, and the number of captured images can be reduced.

When it is determined in step S106 that there is a blown-out region in the dimming image (in the case of "Yes"), the flash control unit 272 calculates the maximum light emission amount at which the blown-out region does not blow out, and sets the calculated maximum light emission amount as the main light emission amount used when acquiring the light-emitting image (step S110).

On the other hand, when it is determined in step S104 that there is a region that the flash light has not reached (in the case of "Yes"), the main body side CPU 220 determines whether the region the flash light has not reached is nearer than the maximum reach of the flash light (step S112). The distance of the region the flash light has not reached can be obtained from the distance distribution information, and the maximum reach of the flash light can be obtained from the guide number of the flash light emitting unit 270 and the F-number.
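The guide-number relation is the standard photographic one (illustrative sketch; ISO 100 sensitivity assumed):

```python
def flash_max_reach_m(guide_number: float, f_number: float) -> float:
    """Standard photographic relation GN = distance x F-number (at ISO 100)."""
    return guide_number / f_number

print(flash_max_reach_m(guide_number=20.0, f_number=2.8))  # ~7.14 m
```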

When it is determined in step S112 that the region the flash light has not reached is farther than the maximum reach of the flash light (in the case of "No"), the composition ratio determination unit 220C shown in FIG. 15 sets the mixing ratio αi of that region (the mixing ratio αi of the second non-light-emitting image shown in formula [6]) to 1 (step S114). As a result, the mixing ratio γi of the second non-light-emitting image in the region the flash light has not reached is determined only by the mixing ratio βi of the second non-light-emitting image based on the luminance distribution information.

On the other hand, when it is determined in step S112 that the region the flash light has not reached is nearer than the maximum reach of the flash light (in the case of "Yes"), the flash control unit 272 calculates the maximum light emission amount at which the regions the flash light has reached do not blow out, and sets the calculated maximum light emission amount as the main light emission amount used when acquiring the light-emitting image (step S116).

When the flash control unit 272 has calculated the main light emission amount in step S110 or step S116, it causes the flash light emitting unit 270 to perform the main flash emission at the calculated amount, and the image acquisition unit 221 acquires a light-emitting image captured by the imaging unit (the interchangeable lens 100 and the image sensor 201) under the same exposure condition as the first non-light-emitting image (step S118).

Once the light-emitting image has been acquired as described above, a non-light-emitting image (the first non-light-emitting image) captured under the same exposure condition as the light-emitting image (under exposure) and a second non-light-emitting image captured with over exposure are subsequently acquired.

Returning to FIG. 17, the light intensity distribution information calculation unit 220B calculates the light intensity distribution information Si by formula [1], based on the non-light-emitting image (the under-exposed first non-light-emitting image), the light-emitting image, and the distance distribution information acquired by the distance distribution information acquisition unit 220A (step S14).
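Formula [1] is defined earlier in the patent and is not reproduced in this excerpt; purely as a labeled assumption, one form consistent with the listed symbols (YLi: luminance with flash, Yi: luminance without, di: distance) scales the flash-induced luminance difference by the squared distance, since the flash contribution falls off as 1/d²:

```python
import numpy as np

def light_intensity_distribution(y_flash, y_ambient, distance):
    """Hypothetical stand-in for formula [1]: S_i ~ (Y_Li - Y_i) * d_i**2.

    y_flash:   luminance Y_Li of the light-emitting image
    y_ambient: luminance Y_i of the same-exposure non-light-emitting image
    distance:  per-pixel subject distance d_i
    Multiplying the flash-induced difference by d**2 undoes the 1/d**2
    fall-off of the flash. Illustration only, not the patent's formula.
    """
    return (np.asarray(y_flash, dtype=np.float64) - y_ambient) * distance ** 2
```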

The composition ratio determination unit 220C determines the composition ratio for HDR-compositing the plurality of non-light-emitting images (the first and second non-light-emitting images) based on the light intensity distribution information Si calculated in step S14 (step S16). For example, when determining the composition ratio for each corresponding pixel or region of the plurality of non-light-emitting images, the composition ratio determination unit 220C makes the mixing ratio of the under-exposed first non-light-emitting image larger than that of the over-exposed second non-light-emitting image in pixels or regions where the light intensity is high according to Si, and makes the mixing ratio of the second non-light-emitting image larger than that of the first non-light-emitting image in pixels or regions where the light intensity is low.
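One way to realize such a rule is a weight for the over-exposed image that decreases monotonically with light intensity; the linear ramp below is an assumption (claim 5 requires only monotone decrease between the minimum and maximum intensity):

```python
import numpy as np

def over_exposure_mix_ratio(intensity):
    """Light-intensity-based weight (the alpha_i of formula [6]) for the
    over-exposed second image: maximal at the minimum light intensity,
    minimal at the maximum, monotonically decreasing in between."""
    s = np.asarray(intensity, dtype=np.float64)
    s_min, s_max = float(s.min()), float(s.max())
    if s_max == s_min:
        return np.full_like(s, 0.5)  # flat scene: split evenly
    return (s_max - s) / (s_max - s_min)
```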

The image composition unit 220D performs HDR composition of the first non-light-emitting image and the second non-light-emitting image based on the composition ratio determined in step S16, and generates an HDR image (step S18).
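Putting the pieces together, the final composition is a per-pixel weighted sum (sketch; the two frames are assumed already aligned):

```python
import numpy as np

def hdr_composite(under_img, over_img, gamma):
    """Per-pixel weighted HDR blend: gamma is the mixing ratio of the
    over-exposed second non-light-emitting image, in [0, 1]."""
    under = np.asarray(under_img, dtype=np.float64)
    over = np.asarray(over_img, dtype=np.float64)
    return gamma * over + (1.0 - gamma) * under
```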

The imaging device 10 of the present embodiment is a mirrorless digital single-lens camera, but it is not limited to this; it may be a single-lens reflex camera, a lens-integrated imaging device, a digital video camera, or the like, and the invention is also applicable to mobile devices that have, in addition to the imaging function, functions other than imaging (a call function, a communication function, and other computer functions). Other modes to which the present invention can be applied include, for example, mobile phones and smartphones with a camera function, PDAs (Personal Digital Assistants), and portable game machines. An example of a smartphone to which the present invention can be applied is described below.

<Configuration of Smartphone>

FIG. 19 shows the appearance of a smartphone 500 that is an embodiment of the imaging device of the present invention. The smartphone 500 shown in FIG. 19 has a flat-plate housing 502, and on one face of the housing 502 it includes a display input unit 520 in which a display panel 521 as a display unit and an operation panel 522 as an input unit are integrated. The housing 502 also includes a speaker 531, a microphone 532, an operation unit 540, and a camera unit 541. The configuration of the housing 502 is not limited to this; for example, a configuration in which the display unit and the input unit are independent, or a configuration with a folding structure or a slide mechanism, may also be adopted.

FIG. 20 is a block diagram showing the configuration of the smartphone 500 shown in FIG. 19. As shown in FIG. 20, the main components of the smartphone are a wireless communication unit 510 that performs mobile wireless communication via a base station and a mobile communication network, a display input unit 520, a call unit 530, an operation unit 540, a camera unit 541, a recording unit 550, an external input/output unit 560, a GPS (Global Positioning System) receiving unit 570, a motion sensor unit 580, a power supply unit 590, and a main control unit 501.

The wireless communication unit 510 performs wireless communication with a base station accommodated in the mobile communication network in accordance with instructions from the main control unit 501. Using this wireless communication, it transmits and receives various file data such as audio data and image data, e-mail data, and the like, and receives Web data, streaming data, and the like.

The display input unit 520 is a so-called touch panel that, under the control of the main control unit 501, displays images (still images and moving images), character information, and the like to convey information visually to the user, and detects user operations on the displayed information; it includes a display panel 521 and an operation panel 522.

The display panel 521 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device. The operation panel 522 is a device that is mounted so that images displayed on the display surface of the display panel 521 are visible, and that detects one or more coordinates operated by the user's finger or a stylus. When this device is operated with a finger or stylus, a detection signal generated by the operation is output to the main control unit 501. The main control unit 501 then detects the operation position (coordinates) on the display panel 521 based on the received detection signal.

As shown in FIG. 19, the display panel 521 and the operation panel 522 of the smartphone 500, illustrated as an embodiment of the imaging device of the present invention, are integrated to form the display input unit 520, with the operation panel 522 arranged so as to completely cover the display panel 521. With this arrangement, the operation panel 522 may also have a function of detecting user operations in the area outside the display panel 521. In other words, the operation panel 522 may include a detection area for the overlapping portion that overlaps the display panel 521 (hereinafter referred to as the display area) and a detection area for the outer edge portion that does not overlap the display panel 521 (hereinafter referred to as the non-display area).

The size of the display area may be made to match the size of the display panel 521 exactly, but the two do not necessarily have to match. The operation panel 522 may also have two sensitive regions, the outer edge portion and the inner portion other than it. Furthermore, the width of the outer edge portion is designed appropriately according to the size of the housing 502 and other factors. Position detection methods that can be adopted in the operation panel 522 include the matrix switch method, resistive film method, surface acoustic wave method, infrared method, electromagnetic induction method, and capacitive method, and any of these may be adopted.

The call unit 530 includes the speaker 531 and the microphone 532; it converts the user's voice input through the microphone 532 into audio data processable by the main control unit 501 and outputs it to the main control unit 501, and decodes audio data received by the wireless communication unit 510 or the external input/output unit 560 and outputs it from the speaker 531. As shown in FIG. 19, for example, the speaker 531 and the microphone 532 can be mounted on the same face as the face on which the display input unit 520 is provided.

The operation unit 540 is a hardware key using a key switch or the like, and receives instructions from the user. For example, as shown in FIG. 19, the operation unit 540 is a push-button switch mounted on the side face of the housing 502 of the smartphone 500; it is turned on when pressed with a finger or the like, and turned off by the restoring force of a spring or the like when the finger is released.

The recording unit 550 stores the control program and control data of the main control unit 501, application software (including the image processing program according to the present invention), address data associating the names and telephone numbers of communication partners, the data of sent and received e-mails, Web data downloaded by Web browsing, and downloaded content data, and also temporarily stores streaming data and the like. The recording unit 550 is composed of an internal storage unit 551 built into the smartphone and an external storage unit 562 having a removable external memory slot. Each of the internal storage unit 551 and the external storage unit 552 constituting the recording unit 550 is realized using a recording medium such as a flash memory type, hard disk type, multimedia card micro type, or card type memory (for example, Micro SD (registered trademark) memory), RAM (Random Access Memory), or ROM (Read Only Memory).

The external input/output unit 560 serves as an interface with all external devices connected to the smartphone 500, for connecting directly or indirectly to other external devices via communication or the like (for example, Universal Serial Bus (USB), IEEE 1394, and the like) or via a network (for example, the Internet, wireless LAN (Local Area Network), Bluetooth (registered trademark), RFID (Radio Frequency Identification), infrared communication (Infrared Data Association: IrDA) (registered trademark), UWB (Ultra Wideband) (registered trademark), ZigBee (registered trademark), and the like).

Examples of external devices connected to the smartphone 500 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, a memory card or SIM (Subscriber Identity Module Card)/UIM (User Identity Module Card) card connected via a card socket, external audio/video equipment connected via an audio/video I/O (Input/Output) terminal, wirelessly connected external audio/video equipment, a wired/wirelessly connected smartphone, a wired/wirelessly connected personal computer, a wired/wirelessly connected PDA, and earphones. The external input/output unit can transmit data received from such external devices to each component inside the smartphone 500, and can transmit data inside the smartphone 500 to external devices.

The GPS receiving unit 570 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with instructions from the main control unit 501, executes positioning calculation processing based on the plurality of received GPS signals, and detects the position of the smartphone 500, consisting of latitude, longitude, and altitude. When position information can be acquired from the wireless communication unit 510 or the external input/output unit 560 (for example, a wireless LAN), the GPS receiving unit 570 can also detect the position using that position information.

The motion sensor unit 580 includes, for example, a triaxial acceleration sensor and a gyro sensor, and detects the physical movement of the smartphone 500 in accordance with instructions from the main control unit 501. By detecting the physical movement of the smartphone 500, the direction of movement and the acceleration of the smartphone 500 are detected. The detection result is output to the main control unit 501.

The power supply unit 590 supplies electric power stored in a battery (not shown) to each unit of the smartphone 500 in accordance with instructions from the main control unit 501.

The main control unit 501 includes a microprocessor, operates according to the control program and control data stored in the recording unit 550, and controls each unit of the smartphone 500 in an integrated manner. The main control unit 501 also has a mobile communication control function for controlling each unit of the communication system, and an application processing function, in order to perform voice communication and data communication through the wireless communication unit 510.

The application processing function is realized by the main control unit 501 operating according to the application software stored in the recording unit 550. Examples of the application processing function include an infrared communication function for controlling the external input/output unit 560 to perform data communication with a counterpart device, an e-mail function for sending and receiving e-mails, a Web browsing function for viewing Web pages, and an image processing function for performing the HDR composition according to the present invention.

The main control unit 501 also has an image processing function, such as displaying video on the display input unit 520 based on image data (still image or moving image data) such as received data or downloaded streaming data. The image processing function refers to the function by which the main control unit 501 decodes the image data, applies image processing to the decoding result, and displays the image on the display input unit 520.

Furthermore, the main control unit 501 executes display control for the display panel 521 and operation detection control for detecting user operations through the operation unit 540 and the operation panel 522.

By executing the display control, the main control unit 501 displays icons for starting application software and software keys such as a scroll bar, or displays a window for composing an e-mail. The scroll bar is a software key for accepting an instruction to move the displayed portion of an image, such as a large image that does not fit in the display area of the display panel 521.

By executing the operation detection control, the main control unit 501 detects user operations through the operation unit 540, accepts operations on icons and input of character strings in the input fields of windows through the operation panel 522, and accepts scroll requests for the displayed image through the scroll bar.

Furthermore, by executing the operation detection control, the main control unit 501 has a touch panel control function that determines whether the operation position on the operation panel 522 is in the overlapping portion that overlaps the display panel 521 (the display area) or in the outer edge portion that does not overlap the display panel 521 (the non-display area), and controls the sensitive region of the operation panel 522 and the display positions of the software keys.

The main control unit 501 can also detect gesture operations on the operation panel 522 and execute preset functions in accordance with the detected gesture operation. A gesture operation is not a conventional simple touch operation, but an operation of drawing a trajectory with a finger or the like, specifying a plurality of positions simultaneously, or, combining these, drawing a trajectory from at least one of a plurality of positions.

The camera unit 541 is a digital camera that performs electronic photography using an imaging element such as a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge-Coupled Device) sensor, and corresponds to the imaging device 10 shown in FIG. 1. Under the control of the main control unit 501, the camera unit 541 can convert image data obtained by imaging into compressed image data such as JPEG (Joint Photographic coding Experts Group), record it in the recording unit 550, and output it through the external input/output unit 560 or the wireless communication unit 510. In the smartphone 500 shown in FIG. 19, the camera unit 541 is mounted on the same face as the display input unit 520, but the mounting position of the camera unit 541 is not limited to this; it may be mounted on the back face of the display input unit 520, or a plurality of camera units 541 may be mounted. When a plurality of camera units 541 are mounted, the camera unit 541 used for shooting can be switched so that one camera shoots alone, or a plurality of camera units 541 can be used simultaneously for shooting.

The camera unit 541 can also be used for various functions of the smartphone 500. For example, an image acquired by the camera unit 541 can be displayed on the display panel 521, and the image of the camera unit 541 can be used as one of the operation inputs of the operation panel 522. When the GPS receiving unit 570 detects the position, the position can also be detected with reference to an image from the camera unit 541. Furthermore, with reference to an image from the camera unit 541, the optical axis direction of the camera unit 541 of the smartphone 500 and the current usage environment can be determined without using the triaxial acceleration sensor, or in combination with the triaxial acceleration sensor (gyro sensor). Of course, the image from the camera unit 541 can also be used within application software.

In addition, position information acquired by the GPS receiving unit 570, audio information acquired by the microphone 532 (which may be converted into text information by speech-to-text conversion performed by the main control unit or the like), posture information acquired by the motion sensor unit 580, and the like can be added to still image or moving image data and recorded in the recording unit 550, or output through the external input/output unit 560 or the wireless communication unit 510.

[Others]

In the present embodiment, the image processing device that performs HDR composition (the main body side CPU 220) is built into the imaging device, but the image processing device according to the present invention may instead be, for example, a personal computer, a portable terminal, or the like that executes the image processing program according to the present invention and is separate from the imaging device. In that case, the plurality of non-light-emitting images, the light-emitting image, the distance distribution information, and the like need to be input as information for HDR composition.

The plurality of non-light-emitting images are not limited to the two non-light-emitting images of the under-exposed first non-light-emitting image and the over-exposed second non-light-emitting image; they may be three or more non-light-emitting images of the same subject captured under mutually different imaging conditions. In this case, the composition ratio determination unit determines the composition ratio of the three or more non-light-emitting images.

In the present embodiment, the hardware structure of the processing units that execute various kinds of processing, such as the main body side CPU 220, is any of the following various processors. The various processors include a CPU (Central Processing Unit), which is a general-purpose processor that executes software (programs) to function as various processing units; programmable logic devices (PLDs) such as FPGAs (Field Programmable Gate Arrays), whose circuit configuration can be changed after manufacture; and dedicated electric circuits, which are processors having a circuit configuration designed exclusively for executing specific processing, such as ASICs (Application Specific Integrated Circuits).

One processing unit may be composed of one of these various processors, or may be composed of two or more processors of the same or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be composed of a single processor. As examples of composing a plurality of processing units with a single processor: first, as typified by computers such as clients and servers, one processor is composed of a combination of one or more CPUs and software, and this processor functions as a plurality of processing units; second, as typified by a System On Chip (SoC), a processor is used that realizes the functions of an entire system including a plurality of processing units with a single IC (Integrated Circuit) chip. In this way, the various processing units are configured using one or more of the various processors above as their hardware structure.

Furthermore, the hardware structure of these various processors is, more specifically, electric circuitry in which circuit elements such as semiconductor elements are combined.

Furthermore, the present invention includes an image processing program that, by being installed in an imaging device or a computer, causes it to function as the imaging device or image processing device according to the present invention, and a recording medium on which this image processing program is recorded.

It goes without saying that the present invention is not limited to the embodiments described above, and that various modifications are possible without departing from the spirit of the present invention.

10 Imaging device

20 Finder window

22 Shutter release switch

23 Shutter speed dial

24 Exposure compensation dial

25 Power lever

26 Eyepiece

27 MENU / OK key

28 Four-way controller

29 Play button

30 Built-in flash

100 interchangeable lenses

102 Imaging optical system

104 Lens group

108 aperture

116 Focus lens control unit

118 Aperture control unit

120 Lens side CPU

122, 207 RAM

124, 228 ROM

126 Flash ROM

150 Lens side communication unit

160 Lens mount

200 Camera body

201 Image sensor

201A micro lens

202 Image sensor controller

203 Analog signal processor

204 A / D converter

205 Image input controller

206 Digital signal processor

208 Compression / decompression processor

210 Media control unit

212 memory card

214 Display control unit

216 LCD monitor

220 Main CPU

220A Distance distribution information acquisition unit

220B Light intensity distribution information calculation unit

220C Composite ratio determination unit

220D image composition unit

220E Distribution information creation unit

220F Luminance distribution information calculation unit

220G Flash light non-reaching pixel detector

221 Image acquisition unit

222 Operation unit

223 Distance distribution information acquisition unit

224 clock part

226 flash ROM

230 AF control unit

232 AE control unit

234 White balance correction section

236 Wireless communication unit

238 GPS receiver

240 Power control unit

242 battery

244 Lens power switch

250 Communication unit on the main unit side

260 Body mount

260A terminal

270 Flash light emitting unit

272 Flash control unit

296 FPS controller

500 Smartphone

501 Main control unit

502 housing

510 wireless communication unit

520 Display input section

521 Display panel

522 Control panel

530 Call section

531 Speaker

532 Microphone

540 operation unit

541 Camera section

550 recording section

551 internal storage

552 External storage unit

560 External input / output unit

562 External storage unit

570 GPS receiver

580 Motion sensor

590 power supply

Steps S10 to S18, S100 to S118

Si light intensity distribution information

YLi luminance value

Yi luminance value

di distance information

αi, βi, γi mixing ratio

Claims (20)


1. An image processing device comprising:

an image acquisition unit that acquires a plurality of non-light-emitting images, which are images of the same subject captured under mutually different exposure conditions without emission of flash light, and a light-emitting image captured under flash light emission with the same exposure condition as one of the plurality of non-light-emitting images;

a distance distribution information acquisition unit that acquires distance distribution information indicating a distribution of distances to the subject;

a light intensity distribution information calculation unit that calculates light intensity distribution information of the light irradiating the subject, based on the light-emitting image, the non-light-emitting image captured under the same exposure condition as the light-emitting image, and the distance distribution information;

a composition ratio determination unit that determines a composition ratio of the plurality of non-light-emitting images based on the light intensity distribution information; and

an image composition unit that composites the plurality of non-light-emitting images according to the composition ratio and generates a composite image with an expanded dynamic range.

2. The image processing device according to claim 1, wherein the composition ratio determination unit determines the composition ratio for each corresponding pixel or region of the plurality of non-light-emitting images, and the image composition unit composites the plurality of non-light-emitting images by applying the determined composition ratio to each corresponding pixel or region of the plurality of non-light-emitting images.

3. The image processing device according to claim 1 or 2, wherein the plurality of exposure conditions corresponding to the plurality of non-light-emitting images include an exposure condition under-exposed relative to the proper exposure and an exposure condition over-exposed relative to the proper exposure.

4. The image processing device according to any one of claims 1 to 3, wherein, when determining the composition ratio for each corresponding pixel or region of the plurality of non-light-emitting images, the composition ratio determination unit makes, in pixels or regions where the light intensity is high based on the light intensity distribution information, the mixing ratio of a first non-light-emitting image captured under the exposure condition with the smallest exposure value among the plurality of non-light-emitting images larger than the mixing ratio of a second non-light-emitting image captured under the exposure condition with the largest exposure value among the plurality of non-light-emitting images, and makes, in pixels or regions where the light intensity is low, the mixing ratio of the second non-light-emitting image larger than the mixing ratio of the first non-light-emitting image.

5. The image processing device according to claim 4, wherein, when determining the composition ratio for each corresponding pixel or region of the plurality of non-light-emitting images, the composition ratio determination unit monotonically decreases the mixing ratio of the second non-light-emitting image as the light intensity increases between the minimum light intensity and the maximum light intensity of the light intensity distribution information.

6. The image processing device according to claim 4, further comprising a distribution information creation unit that creates distribution information indicating a count distribution or frequency distribution of the light intensity based on the light intensity distribution information, wherein the composition ratio determination unit obtains, based on the created distribution information, a first light intensity corresponding to a first peak where the count or frequency is high and a second light intensity corresponding to a second peak where the count or frequency is high, the second light intensity being larger than the first light intensity, and, when determining the composition ratio for each corresponding pixel or region of the plurality of non-light-emitting images, sets the mixing ratio of the second non-light-emitting image to its maximum value for light intensities equal to or lower than the first light intensity, sets it to its minimum value for light intensities equal to or higher than the second light intensity, and monotonically decreases it between the maximum value and the minimum value for light intensities larger than the first light intensity and smaller than the second light intensity.

7. The image processing device according to claim 5, further comprising a distribution information creation unit that creates distribution information indicating a count distribution or frequency distribution of the light intensity based on the light intensity distribution information, wherein the composition ratio determination unit obtains, based on the created distribution information, the light intensity corresponding to a valley where the count or frequency is low, and, when determining the composition ratio for each corresponding pixel or region of the plurality of non-light-emitting images, uses the light intensity corresponding to the valley as a threshold, sets the mixing ratio of the second non-light-emitting image to its maximum value for light intensities equal to or lower than the threshold, and sets it to its minimum value for light intensities exceeding the threshold.

8. The image processing device according to any one of claims 1 to 7, further comprising a luminance distribution information calculation unit that calculates luminance distribution information indicating a distribution of luminance within at least one of the plurality of non-light-emitting images, wherein the composition ratio determination unit determines the composition ratio of the plurality of non-light-emitting images based on the light intensity distribution information and the luminance distribution information.

9. The image processing device according to claim 8, further comprising a detection unit that detects pixels or regions in the light-emitting image that the flash light has not reached, wherein, when determining the composition ratio for pixels or regions that the flash light has not reached, the composition ratio determination unit determines the composition ratio of the plurality of non-light-emitting images based only on the luminance distribution information.

10. The image processing device according to claim 8 or 9, wherein, when determining the composition ratio for each corresponding pixel or region of the plurality of non-light-emitting images, with i denoting a parameter indicating the position of a pixel or region, αi denoting the mixing ratio of a second non-light-emitting image captured under the exposure condition with the largest exposure value among the plurality of non-light-emitting images, and βi denoting the mixing ratio of the second non-light-emitting image based on the luminance distribution information, which takes smaller values at higher luminance, the composition ratio determination unit calculates the mixing ratio γi of the second non-light-emitting image based on the light intensity distribution information and the luminance distribution information by the following expression: γi = αi × βi.

11. The image processing device according to any one of claims 1 to 10, wherein the exposure condition of the light-emitting image is the exposure condition with the smallest exposure value among the plurality of exposure conditions corresponding to the plurality of non-light-emitting images.

12. An imaging device comprising:

an imaging unit capable of imaging a subject under different exposure conditions;

a flash light emitting unit that emits flash light;

a flash control unit that controls the flash light emitted from the flash light emitting unit; and

the image processing device according to any one of claims 1 to 11,

wherein the image acquisition unit acquires the plurality of non-light-emitting images captured by the imaging unit under mutually different exposure conditions without emission of flash light from the flash light emitting unit, and the light-emitting image captured by the imaging unit, under emission of flash light from the flash light emitting unit, with the same exposure condition as one of the plurality of non-light-emitting images.

13. The imaging device according to claim 12, wherein the flash control unit causes the flash light emitting unit to perform dimming emission of flash light, and, when a region that the dimming-emitted flash light has not reached is detected based on a dimming image captured by the imaging unit under the dimming emission and an image captured by the imaging unit without the dimming emission, calculates the maximum light emission amount at which the regions the dimming-emitted flash light has reached do not saturate, and causes the flash light emitting unit to perform the main flash emission at the calculated maximum light emission amount when the light-emitting image is acquired.

14. The imaging device according to claim 13, wherein the flash control unit calculates the maximum light emission amount based on the average reflectance of the subject and the distance distribution information acquired by the distance distribution information acquisition unit.

15. The imaging device according to claim 12, wherein the flash control unit causes the flash light emitting unit to perform dimming emission of flash light, and, when it is detected, based on a dimming image captured by the imaging unit under the dimming emission and an image captured by the imaging unit without the dimming emission, that the flash light has reached the entire area of the dimming image, the image acquisition unit acquires the dimming image as the light-emitting image.

16. The imaging device according to claim 12, wherein the flash control unit causes the flash light emitting unit to perform dimming emission of flash light, and, when a region that the flash light has not reached is detected in the dimming image based on a dimming image captured by the imaging unit under the dimming emission and an image captured by the imaging unit without the dimming emission, and the distance of the region the flash light has not reached, based on the distance distribution information, is farther than the reach of the maximum light emission amount of the flash light emitted from the flash light emitting unit, the image acquisition unit acquires the dimming image as the light-emitting image.

  17.  An image processing method comprising:

     a step of acquiring a plurality of non-light-emission images of the same subject, the images being captured under respectively different exposure conditions without emission of flash light;

     a step of acquiring a light emission image captured, under emission of flash light, under the same exposure condition as one of the plurality of non-light-emission images;

     a step of acquiring distance distribution information indicating a distribution of distances to the subject;

     a step of calculating light intensity distribution information of the light with which the subject is irradiated, on the basis of the light emission image, the non-light-emission image captured under the same exposure condition as the light emission image, and the distance distribution information;

     a step of determining a composition ratio of the plurality of non-light-emission images on the basis of the light intensity distribution information; and

     a step of compositing the plurality of non-light-emission images according to the composition ratio to generate a composite image with an expanded dynamic range.
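
Putting the steps of this method claim together, the following sketch composites an expanded-dynamic-range image. The light-intensity model, which treats the flash-minus-ambient difference as an inverse-square-corrected reflectance probe, is one plausible reading; the claim only names the inputs, so every formula and name below is an assumption.

```python
import numpy as np

def hdr_composite(non_emission_imgs, exposure_values, emission_img,
                  matched_idx, depth_map):
    """Composite several non-emission exposures, using flash + distance
    as an illumination probe (a sketch of claim 17, formulas assumed)."""
    matched = np.asarray(non_emission_imgs[matched_idx], dtype=np.float64)

    # Light intensity distribution: the flash adds roughly
    # emission * reflectance / distance**2, so the inverse-square-corrected
    # difference approximates per-pixel reflectance; dividing the ambient
    # image by it approximates the ambient light intensity.
    flash_only = np.maximum(emission_img - matched, 1e-6)
    reflectance = flash_only * depth_map ** 2
    intensity = matched / np.maximum(reflectance, 1e-6)
    intensity /= intensity.max()                   # normalize to [0, 1]

    # Composition ratio and blend: bright areas take the smallest-EV
    # shot, dark areas the largest-EV shot (claim 18's ordering rule,
    # realized here as a simple linear per-pixel blend).
    order = np.argsort(exposure_values)
    low_ev = np.asarray(non_emission_imgs[order[0]], dtype=np.float64)
    high_ev = np.asarray(non_emission_imgs[order[-1]], dtype=np.float64)
    return intensity * low_ev + (1.0 - intensity) * high_ev
```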

  18.  The image processing method according to claim 17, wherein, in determining the composition ratio for each corresponding pixel or region of the plurality of non-light-emission images, the step of determining the composition ratio makes, for a pixel or region whose light intensity is high according to the light intensity distribution information, the mixing ratio of a first non-light-emission image captured under the exposure condition with the smallest exposure value among the plurality of non-light-emission images larger than the mixing ratio of a second non-light-emission image captured under the exposure condition with the largest exposure value among the plurality of non-light-emission images, and makes, for a pixel or region whose light intensity is low, the mixing ratio of the second non-light-emission image larger than the mixing ratio of the first non-light-emission image.
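
The ordering rule in this claim (the smallest-EV image dominates bright pixels, the largest-EV image dominates dark pixels) can be realized with any monotone weight; a linear ramp between two thresholds is a common choice and is assumed here:

```python
import numpy as np

def composition_ratio(intensity, lo=0.25, hi=0.75):
    """Per-pixel weights for the first (smallest-EV) and second
    (largest-EV) non-emission images. The linear ramp and the two
    thresholds are assumptions; the claim only fixes which weight is
    larger where.
    """
    w_first = np.clip((intensity - lo) / (hi - lo), 0.0, 1.0)
    return w_first, 1.0 - w_first
```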

  19.  An image processing program causing a computer to realize:

     a function of acquiring a plurality of non-light-emission images of the same subject, the images being captured under respectively different exposure conditions without emission of flash light;

     a function of acquiring a light emission image captured, under emission of flash light, under the same exposure condition as one of the plurality of non-light-emission images;

     a function of acquiring distance distribution information indicating a distribution of distances to the subject;

     a function of calculating light intensity distribution information of the light with which the subject is irradiated, on the basis of the light emission image, the non-light-emission image captured under the same exposure condition as the light emission image, and the distance distribution information;

     a function of determining a composition ratio of the plurality of non-light-emission images on the basis of the light intensity distribution information; and

     a function of compositing the plurality of non-light-emission images according to the composition ratio to generate a composite image with an expanded dynamic range.

  20.  The image processing program according to claim 19, wherein, in determining the composition ratio for each corresponding pixel or region of the plurality of non-light-emission images, the function of determining the composition ratio makes, for a pixel or region whose light intensity is high according to the light intensity distribution information, the mixing ratio of a first non-light-emission image captured under the exposure condition with the smallest exposure value among the plurality of non-light-emission images larger than the mixing ratio of a second non-light-emission image captured under the exposure condition with the largest exposure value among the plurality of non-light-emission images, and makes, for a pixel or region whose light intensity is low, the mixing ratio of the second non-light-emission image larger than the mixing ratio of the first non-light-emission image.
PCT/JP2019/015046 2018-04-26 2019-04-05 Image processing device, method and program, and imaging device WO2019208155A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020516173A JP6810299B2 (en) 2018-04-26 2019-04-05 Image processing device, method and program, and imaging device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-085582 2018-04-26
JP2018085582 2018-04-26

Publications (1)

Publication Number Publication Date
WO2019208155A1 (en)

Family

ID=68294051

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/015046 WO2019208155A1 (en) 2018-04-26 2019-04-05 Image processing device, method and program, and imaging device

Country Status (2)

Country Link
JP (1) JP6810299B2 (en)
WO (1) WO2019208155A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003018617A (en) * 2001-07-03 2003-01-17 Olympus Optical Co Ltd Imaging apparatus
JP2008205935A (en) * 2007-02-21 2008-09-04 National Univ Corp Shizuoka Univ Exposure ratio determination method in image synthesis
WO2017065949A1 (en) * 2015-10-16 2017-04-20 CapsoVision, Inc. Single image sensor for capturing mixed structured-light images and regular images

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111327800A (en) * 2020-01-08 2020-06-23 深圳深知未来智能有限公司 All-weather vehicle-mounted vision system and method suitable for complex illumination environment
CN114513589A (en) * 2020-10-06 2022-05-17 联发科技股份有限公司 Image acquisition method and related image acquisition system
CN114690510A (en) * 2020-12-29 2022-07-01 财团法人工业技术研究院 Image capturing method

Also Published As

Publication number Publication date
JP6810299B2 (en) 2021-01-06
JPWO2019208155A1 (en) 2021-03-25

Similar Documents

Publication Publication Date Title
CN101610363B (en) Apparatus and method of blurring background of image in digital image processing device
JP6302554B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP6017225B2 (en) Imaging device
TW201240448A (en) Combined ambient and flash exposure for improved image quality
JP6063093B2 (en) Image processing apparatus, imaging apparatus, image processing method, and program
JP6810299B2 (en) Image processing device, method and program, and imaging device
JP6921972B2 (en) Image processing equipment, imaging equipment, image processing methods, imaging methods, and programs
US11032483B2 (en) Imaging apparatus, imaging method, and program
JP7112529B2 (en) IMAGING DEVICE, IMAGING METHOD, AND PROGRAM
JP6192416B2 (en) Imaging apparatus, control method therefor, program, and storage medium
WO2019111659A1 (en) Image processing device, imaging device, image processing method, and program
JP6534780B2 (en) Imaging device, imaging method, and program
JP6998454B2 (en) Imaging equipment, imaging methods, programs and recording media
WO2020161969A1 (en) Image processing device, photographing device, image processing method, and image processing program
JP6810298B2 (en) Image alignment aids, methods and programs and imaging devices
JP6941744B2 (en) Image processing device, photographing device, image processing method and image processing program
JP7352034B2 (en) Imaging device, imaging instruction method, and imaging instruction program
KR101128518B1 (en) Imaging method for digital image processing device
JP2004364163A (en) Imaging apparatus
JP2008268361A (en) Photographing device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19793277

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020516173

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19793277

Country of ref document: EP

Kind code of ref document: A1