WO2019208155A1 - Image processing device, method and program, and imaging device - Google Patents

Image processing device, method and program, and imaging device

Info

Publication number
WO2019208155A1
Authority
WO
WIPO (PCT)
Prior art keywords: light, image, emitting, distribution information, unit
Prior art date
Application number
PCT/JP2019/015046
Other languages
English (en)
Japanese (ja)
Inventor
祐也 西尾
智紀 増田
智大 島田
Original Assignee
富士フイルム株式会社
Priority date
Filing date
Publication date
Application filed by 富士フイルム株式会社
Priority to JP2020516173A (patent JP6810299B2)
Publication of WO2019208155A1

Classifications

    • G06T 5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G03B 15/00: Special procedures for taking photographs; Apparatus therefor
    • G03B 15/05: Combinations of cameras with electronic flash apparatus; Electronic flash units
    • G03B 7/091: Digital circuits for control of exposure effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • G06T 2207/10144: Special mode during image acquisition; varying exposure
    • G06T 2207/10152: Special mode during image acquisition; varying illumination
    • G06T 2207/20208: High dynamic range [HDR] image processing
    • G06T 2207/20221: Image fusion; Image merging

Definitions

  • The present invention relates to an image processing apparatus, method, and program, and to an imaging apparatus, and more particularly to high dynamic range synthesis (HDR (High-dynamic-range) synthesis), which combines a plurality of images captured at different exposures to expand the dynamic range.
  • HDR processing is one method of expressing the dynamic range of a subject within a wider range of an image.
  • In HDR processing, an image captured with the exposure amount reduced relative to the appropriate exposure (an under-exposed image) and an image captured with the exposure amount increased (an over-exposed image) are acquired. For pixels capturing a dark subject the over-exposed image is used, and for pixels capturing a bright subject the under-exposed image is used (in practice the two may be mixed rather than using only one of them), thereby creating an HDR image with little blown-out highlight or blocked-up shadow.
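  • For reference in the discussion that follows, the following is a minimal sketch of such conventional luminance-driven HDR blending, assuming linear RGB images normalized to [0, 1]; the threshold values low and high are illustrative and not taken from this publication:

```python
import numpy as np

def conventional_hdr_merge(under, over, low=0.25, high=0.75):
    """Conventional HDR blend of an under-exposed and an over-exposed image
    (float RGB arrays in [0, 1]) driven only by luminance: dark pixels take
    the over image, bright pixels the under image, with a ramp in between.
    The thresholds are illustrative, not values from this publication."""
    luma = 0.299 * under[..., 0] + 0.587 * under[..., 1] + 0.114 * under[..., 2]
    w_over = np.clip((high - luma) / (high - low), 0.0, 1.0)  # weight of over image
    return w_over[..., None] * over + (1.0 - w_over[..., None]) * under
```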
  • Conventionally, apparatuses that generate one HDR image from a plurality of images captured at different exposures are described in Patent Documents 1 and 2.
  • The imaging system described in Patent Document 1 captures a plurality of images of the same subject under different exposure conditions (a long-exposure image and a short-exposure image). From the luminance signal of the long-exposure image it acquires the luminance signal of the region other than the over-exposed region (the properly exposed region), and from the luminance signal of the short-exposure image it acquires the luminance signal corresponding to the improperly exposed region of the long-exposure image. A wide dynamic range image is then generated by combining the properly exposed regions of the respective images.
  • The imaging system of Patent Document 1 also estimates a main subject region from the long-exposure image and the short-exposure image, and performs tone correction so that tone width is allocated to the estimated main subject region. By combining the properly exposed regions of the images after tone correction, the gradation of the main subject in the properly exposed region is enriched.
  • The imaging device described in Patent Document 2 includes a single image sensor having a pixel group exposed for a long time and a pixel group exposed for a short time, and generates an HDR image by combining the image data read from the first pixel group and the image data read from the second pixel group.
  • Patent Document 2 also describes obtaining one composite image by adding flash image data (long-exposure data) and non-flash image data (short-exposure data) while adjusting their mixing ratio according to the estimated distance to the subject. As a result, an image with a high signal-to-noise ratio (SNR) for the main subject is obtained while preserving the atmosphere of the background.
  • Patent Document 1 contains no explicit description of the mixing ratio between the long-exposure image and the short-exposure image when generating the HDR image. However, since the luminance signal of the short-exposure image is used for the improperly exposed region of the long-exposure image, this is considered to be a general HDR synthesis in which the mixing ratio is determined by the magnitude of the luminance signal. Therefore, as described later, when part of the main subject is in shadow, there is a problem that the shadow region remains dark.
  • The invention described in Patent Document 2 generates one composite image by adding flash image data (long-exposure data) and non-flash image data (short-exposure data) while adjusting their mixing ratio according to the estimated distance to the subject. It combines flash image data with non-flash image data; it does not combine non-flash images captured under different exposure conditions.
  • Neither Patent Document 1 nor Patent Document 2 describes adjusting the composition ratio of a plurality of images according to the intensity of the light striking the subject when generating an HDR image by combining a plurality of images of the same subject captured under different exposure conditions.
  • In conventional HDR processing, whether to use the pixel value of the over-exposed image or that of the under-exposed image is decided based on the luminance value obtained from the image sensor. Because the same HDR processing is performed regardless of the intensity of the light striking the subject, the following problem occurs.
  • For example, consider a scene in which the left half of a person's face is a sunlit region A exposed to sunlight and the right half is a shaded region B. If dark hair (1) struck by strong light and shaded skin (2) have the same, or almost the same, luminance value, the same HDR composition is applied to both: the over-exposed image is used, so not only does the shaded skin (2) become bright, the black hair (1) also takes on an unnaturally bright color.
  • The present invention has been made in view of such circumstances, and an object thereof is to provide an image processing apparatus, method, and program, and an imaging apparatus, capable of acquiring an HDR image with excellent gradation characteristics in which the contrast between light and dark is adjusted according to the intensity of the light striking the subject.
  • To achieve the above object, an image processing apparatus according to one aspect of the present invention comprises: an image acquisition unit that acquires, as images of the same subject, a plurality of non-emission images captured under different exposure conditions without flash emission, and an emission image captured with flash emission under the same exposure condition as one of the plurality of non-emission images; a distance distribution information acquisition unit that acquires distance distribution information indicating the distribution of distances to the subject; a light intensity distribution information calculation unit that calculates light intensity distribution information of the light striking the subject, based on the emission image, the non-emission image captured under the same exposure condition as the emission image, and the distance distribution information; a composition ratio determination unit that determines a composition ratio of the plurality of non-emission images based on the light intensity distribution information; and an image composition unit that combines the plurality of non-emission images according to the composition ratio to generate a composite image with an expanded dynamic range.
  • According to this aspect, the light intensity distribution information of the light striking the subject is calculated from the emission image, the non-emission image captured under the same exposure condition as the emission image, and the distance distribution information. The composition ratio of the plurality of non-emission images is then determined from the calculated light intensity distribution information, and the non-emission images are combined according to that ratio to generate a composite image (HDR image) with an expanded dynamic range.
  • An HDR image generated in this way has excellent gradation characteristics: the contrast between light and dark is adjusted according to the intensity of the light striking the subject, without losing the subject's own contrast.
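  • To make the data flow concrete, the following is a minimal sketch of one plausible reading of this calculation (not necessarily the publication's exact formula): the flash-only component (emission image minus non-emission image at the same exposure) falls off with the square of the subject distance, so multiplying it by the squared distance from the distance distribution information cancels the falloff and leaves a value proportional to subject reflectance; dividing the non-emission image by that reflectance then gives a value proportional to the intensity of the ambient light striking the subject:

```python
import numpy as np

def estimate_light_intensity(emit, no_emit, dist, eps=1e-6):
    """Light intensity distribution sketch. `emit` and `no_emit` are linear
    luminance images captured under the same exposure condition; `dist` is
    the per-pixel subject distance (distance distribution information).
    All names and the exact normalization are assumptions."""
    flash_only = np.clip(emit - no_emit, 0.0, None)  # reflected flash component
    reflectance = flash_only * dist ** 2             # inverse-square falloff cancelled
    return no_emit / (reflectance + eps)             # proportional to ambient light intensity
```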
  • Preferably, the composition ratio determination unit determines a composition ratio for each corresponding pixel or region of the plurality of non-emission images, and the image composition unit combines the plurality of non-emission images by applying the composition ratio determined for each corresponding pixel or region.
  • Preferably, the plurality of exposure conditions corresponding to the plurality of non-emission images include an exposure condition under-exposed relative to the appropriate exposure and an exposure condition over-exposed relative to it. This makes it possible to create an HDR image with less blown-out highlight and blocked-up shadow.
  • Preferably, in a pixel or region where the light intensity indicated by the light intensity distribution information is high, the mixing ratio of the first non-emission image, captured under the exposure condition with the smallest exposure value among the plurality of non-emission images, is made larger than the mixing ratio of the second non-emission image, captured under the exposure condition with the largest exposure value; and in a pixel or region where the light intensity is low, the mixing ratio of the second non-emission image is made larger than that of the first non-emission image.
  • That is, for a pixel or region struck by strong light, the mixing ratio of the first non-emission image is increased to generate an HDR image with less blown-out highlight, and for a pixel or region struck by weak light, the mixing ratio of the second non-emission image is increased to generate an HDR image with less blocked-up shadow.
  • Preferably, when determining the composition ratio for each corresponding pixel or region of the plurality of non-emission images based on the light intensity distribution information, the composition ratio determination unit monotonically decreases the mixing ratio of the second non-emission image (monotonically increases the mixing ratio of the first non-emission image) as the light intensity increases from a minimum light intensity to a maximum light intensity.
  • The minimum light intensity is not limited to the minimum value in the light intensity distribution information; it may be a light intensity offset from the minimum value by a fixed amount. Likewise, the maximum light intensity may be a light intensity offset from the maximum value in the light intensity distribution information by a fixed amount. It goes without saying that a monotonic decrease includes a linear decrease.
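  • A minimal sketch of this mapping, assuming a linear decrease between the two endpoints and complementary mixing ratios (the names s_min and s_max stand for the minimum and maximum light intensities discussed above):

```python
import numpy as np

def mixing_ratios(s, s_min, s_max):
    """Mixing ratio alpha2 of the second (largest exposure value) non-emission
    image decreases linearly from 1 at s_min to 0 at s_max; alpha1 of the
    first (smallest exposure value) image is its complement."""
    alpha2 = np.clip((s_max - s) / (s_max - s_min), 0.0, 1.0)
    alpha1 = 1.0 - alpha2
    return alpha1, alpha2

# per-pixel blend of the two non-emission images:
# hdr = alpha1[..., None] * first_img + alpha2[..., None] * second_img
```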
  • Preferably, the image processing apparatus includes a distribution information creation unit that creates, from the light intensity distribution information, distribution information indicating the frequency distribution of the light intensity. Based on the created distribution information, the composition ratio determination unit obtains a first light intensity corresponding to a first peak of high frequency and a second light intensity, greater than the first light intensity, corresponding to a second peak of high frequency. When determining the composition ratio for each corresponding pixel or region of the plurality of non-emission images, it sets the mixing ratio of the second non-emission image to its maximum value at light intensities at or below the first light intensity, to its minimum value at light intensities at or above the second light intensity, and to intermediate values at light intensities between the first and second light intensities.
  • For example, in a daytime outdoor scene on a sunny day, the imaging targets can be roughly divided into shaded subjects and sunlit subjects. In this case the distribution information has two peaks (a first peak and a second peak): the first peak corresponds to the frequency of the light intensities in the subject region in shade, and the second peak corresponds to the frequency of the light intensities in the sunlit subject region.
  • The light intensity corresponding to the first peak (the first light intensity) and the light intensity corresponding to the second peak (the second light intensity) are obtained, and for each corresponding pixel or region of the plurality of non-emission images, the mixing ratio of the second non-emission image is maximized when the light intensity is at or below the first light intensity and minimized when it is at or above the second light intensity. This is because the region at or below the first light intensity can be regarded as shade, and the region at or above the second light intensity as sunlit.
  • When the light intensity is between the first and second light intensities, the mixing ratio of the second non-emission image is monotonically decreased between the maximum and minimum values, so that the mixing ratio changes continuously as the light intensity increases or decreases. Note that this situation is not limited to daytime outdoor scenes on sunny days: in a scene where sunlight enters part of a room, or a scene containing both a region illuminated by an artificial light source and a shadowed region, the distribution curve showing the frequency distribution of the light intensity may likewise have two peaks.
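  • A minimal sketch of this two-peak variant; the naive local-maximum peak picking stands in for whatever peak detection the publication intends:

```python
import numpy as np

def two_peak_mixing(s, bins=256):
    """Builds the frequency distribution of the light intensity s, takes the
    two most frequent local maxima as the first (shade) and second (sun)
    peaks, and ramps the second image's mixing ratio from 1 at or below the
    first light intensity to 0 at or above the second."""
    hist, edges = np.histogram(s, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    peaks = [i for i in range(1, bins - 1)
             if hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1]]
    i1, i2 = sorted(sorted(peaks, key=lambda i: hist[i])[-2:])
    s1, s2 = centers[i1], centers[i2]                # first/second light intensity
    alpha2 = np.clip((s2 - s) / (s2 - s1), 0.0, 1.0)
    return alpha2, s1, s2
```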
  • Preferably, the image processing apparatus includes a distribution information creation unit that creates, from the light intensity distribution information, distribution information indicating the frequency distribution of the light intensity, and the composition ratio determination unit obtains from the created distribution information the light intensity corresponding to the bottom point of low frequency, sets that light intensity as a threshold value, and, when determining the composition ratio for each corresponding pixel or region of the plurality of non-emission images, maximizes the mixing ratio of the second non-emission image when the light intensity is at or below the threshold and minimizes it when the light intensity exceeds the threshold.
  • That is, the bottom point is obtained from the distribution information indicating the frequency distribution of the light intensity, and the light intensity corresponding to the bottom point is used as a threshold. At or below the threshold, the mixing ratio of the second non-emission image is set to its maximum value (the mixing ratio of the first non-emission image to its minimum value); above the threshold, the mixing ratio of the second non-emission image is set to its minimum value (the mixing ratio of the first non-emission image to its maximum value). Near the threshold, the mixing ratio of the second non-emission image may instead be changed continuously between the maximum and minimum values according to the magnitude of the light intensity.
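  • A minimal sketch of the bottom-point variant; taking the global minimum of the interior of the histogram is an assumed stand-in for the bottom-point search:

```python
import numpy as np

def valley_threshold_mixing(s, bins=256):
    """Sets the threshold at the light intensity of the lowest-frequency
    interior bin and assigns the second image's mixing ratio as a step:
    maximal at or below the threshold, minimal above it. The step could be
    smoothed near the threshold, as the text allows."""
    hist, edges = np.histogram(s, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    threshold = centers[1 + np.argmin(hist[1:-1])]  # bottom point, edge bins ignored
    alpha2 = np.where(s <= threshold, 1.0, 0.0)
    return alpha2, threshold
```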
  • Preferably, the image processing apparatus includes a luminance distribution information calculation unit that calculates luminance distribution information indicating the in-image luminance distribution of at least one of the plurality of non-emission images, and the composition ratio determination unit determines the composition ratio of the plurality of non-emission images based on the light intensity distribution information and the luminance distribution information.
  • Preferably, the image processing apparatus includes a detection unit that detects pixels or regions of the emission image that the flash light did not reach, and when determining the composition ratio for a pixel or region that the flash light did not reach, the composition ratio determination unit determines the composition ratio of the plurality of non-emission images based only on the luminance distribution information.
  • For a pixel or region that the flash light did not reach, the corresponding light intensity is unknown, so in this case it is preferable to determine the composition ratio of the plurality of non-emission images from the luminance distribution information, as in conventional processing.
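  • A minimal sketch of this fallback, assuming a small constant tau (not from the publication) for deciding whether the flash light reached a pixel:

```python
import numpy as np

def flash_reached(emit, no_emit, tau=0.01):
    """A pixel is deemed reached by the flash when the flash-only component
    exceeds the small threshold tau (an assumed constant)."""
    return (emit - no_emit) > tau

def mixing_with_fallback(reached, alpha2_light, alpha2_luma):
    """Uses the light-intensity-based ratio where the flash reached and the
    conventional luminance-based ratio elsewhere; both per-pixel maps are
    assumed to come from the functions sketched earlier."""
    return np.where(reached, alpha2_light, alpha2_luma)
```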
  • Preferably, the composition ratio determination unit determines the composition ratio for each corresponding pixel or region of the plurality of non-emission images, using a parameter indicating the position of the pixel or region.
  • Preferably, the exposure condition of the emission image is the exposure condition with the smallest exposure value among the plurality of exposure conditions corresponding to the plurality of non-emission images.
  • An imaging apparatus according to another aspect of the present invention comprises: an imaging unit capable of imaging a subject under different exposure conditions; a flash emission unit that emits flash light; a flash control unit that controls the flash light emitted from the flash emission unit; and the image processing apparatus described above. The image acquisition unit acquires a plurality of non-emission images captured by the imaging unit under different exposure conditions without emission of flash light from the flash emission unit, and an emission image captured by the imaging unit, with emission of flash light from the flash emission unit, under the same exposure condition as one of the plurality of non-emission images.
  • Preferably, the flash control unit performs a dimmed (pre-flash) emission of flash light from the flash emission unit and, based on a dimmed image captured by the imaging unit during the dimmed emission and an image captured by the imaging unit without it, detects any region that the dimmed flash light did not reach and calculates the maximum emission amount at which the region that the dimmed flash light did reach does not saturate. When the emission image is acquired, the flash light is preferably emitted from the flash emission unit at the calculated maximum emission amount.
  • Preferably, the flash control unit calculates the maximum emission amount based on the average reflectance of the subject and the distance distribution information acquired by the distance distribution information acquisition unit. This makes it possible to calculate the maximum emission amount at which the region reached by the flash light is not blown out.
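  • The publication derives the maximum emission amount from the average reflectance and the distance distribution information; the following minimal sketch instead uses the fact that the flash return scales linearly with the emission amount, scaling the measured pre-flash return up to the sensor's saturation headroom, which is a related but assumed formulation:

```python
import numpy as np

def max_emission_amount(pre_emit, no_emit, reached, pre_level=1.0, full_scale=1.0):
    """Largest main-emission amount at which no pixel reached by the dimmed
    (pre-)flash saturates. `pre_level` is the pre-flash emission amount in
    assumed units; images are linear in [0, full_scale]."""
    flash_only = np.clip(pre_emit - no_emit, 1e-6, None)    # pre-flash return
    headroom = full_scale - no_emit                         # room before saturation
    gain = np.min(headroom[reached] / flash_only[reached])  # worst-case pixel
    return pre_level * gain
```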
  • Preferably, the flash control unit performs a dimmed emission of flash light from the flash emission unit and, when no region unreached by the flash light is detected from the dimmed image captured by the imaging unit during the dimmed emission and the image captured without it, the image acquisition unit acquires the dimmed image as the emission image.
  • Also preferably, when a region that the flash light did not reach is detected in the dimmed image, and it is determined from the distance distribution information that the flash light emitted from the flash emission unit cannot reach that region at the distance involved, the image acquisition unit acquires the dimmed image as the emission image. This is because, for pixels that the dimmed flash light did not reach and that the flash light cannot reach even at main emission, there is no need to perform a new emission.
  • An image processing method according to another aspect of the present invention comprises: a step of acquiring a plurality of non-emission images of the same subject captured under different exposure conditions without flash emission; a step of acquiring an emission image captured with flash emission under the same exposure condition as one of the plurality of non-emission images; a step of acquiring distance distribution information indicating the distribution of distances to the subject; a step of calculating light intensity distribution information of the light striking the subject based on the emission image, the non-emission image captured under the same exposure condition as the emission image, and the distance distribution information; a step of determining a composition ratio of the plurality of non-emission images based on the light intensity distribution information; and a step of combining the plurality of non-emission images according to the composition ratio to generate a composite image with an expanded dynamic range.
  • Preferably, in the step of determining the composition ratio, when the composition ratio is determined for each corresponding pixel or region of the plurality of non-emission images based on the light intensity distribution information, the mixing ratio of the first non-emission image, captured under the exposure condition with the smallest exposure value among the plurality of non-emission images, is made larger than the mixing ratio of the second non-emission image, captured under the exposure condition with the largest exposure value, in pixels or regions of high light intensity, and the mixing ratio of the second non-emission image is made larger than that of the first non-emission image in pixels or regions of low light intensity.
  • An image processing program according to yet another aspect of the present invention causes a computer to realize: a function of acquiring a plurality of non-emission images of the same subject captured under different exposure conditions without flash emission; a function of acquiring an emission image captured with flash emission under the same exposure condition as one of the plurality of non-emission images; a function of acquiring distance distribution information indicating the distance distribution of the subject; a function of calculating light intensity distribution information of the light striking the subject based on the emission image, the non-emission image captured under the same exposure condition as the emission image, and the distance distribution information; a function of determining a composition ratio of the plurality of non-emission images based on the light intensity distribution information; and a function of combining the plurality of non-emission images according to the composition ratio to generate a composite image with an expanded dynamic range.
  • Preferably, in the function of determining the composition ratio, when the composition ratio is determined for each corresponding pixel or region of the plurality of non-emission images based on the light intensity distribution information, the mixing ratio of the first non-emission image, captured under the exposure condition with the smallest exposure value, is made larger than the mixing ratio of the second non-emission image, captured under the exposure condition with the largest exposure value, in pixels or regions of high light intensity, and the mixing ratio of the second non-emission image is made larger than that of the first non-emission image in pixels or regions of low light intensity.
  • According to the present invention, the composition ratio of a plurality of non-emission images is determined according to the intensity of the light striking the subject and HDR composition is performed, so that an HDR image with excellent gradation characteristics, in which the contrast between light and dark is adjusted according to the intensity of the light striking the subject, can be obtained.
  • FIG. 1 is a perspective view of an imaging apparatus according to the present invention as viewed obliquely from the front.
  • FIG. 2 is a rear view of the imaging apparatus.
  • FIG. 3 is a block diagram illustrating an embodiment of the internal configuration of the imaging apparatus.
  • FIG. 4 is a front view showing a configuration example of the image sensor.
  • FIG. 5 is a functional block diagram showing the first embodiment of the main body side CPU 220 functioning as the image processing apparatus according to the present invention.
  • FIG. 6 is a graph showing the relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-emission image.
  • FIG. 7 is a graph showing another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-emission image.
  • FIG. 8 is a graph showing still another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-emission image.
  • FIG. 9 is a functional block diagram showing a second embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention.
  • FIG. 10 is a frequency distribution diagram showing an example of the frequency distribution of light intensity.
  • FIG. 11 is a graph showing still another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-emission image.
  • FIG. 12 is a frequency distribution diagram similar to the frequency distribution of light intensity shown in FIG. 10.
  • FIG. 13 is a graph showing still another relationship between the light intensity distribution information Si and the mixing ratio αi of the second non-emission image.
  • FIG. 14 is a functional block diagram showing a third embodiment of the main body side CPU 220 functioning as the image processing apparatus according to the invention.
  • FIG. 15 is a functional block diagram showing a fourth embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the invention.
  • FIG. 16 is a functional block diagram showing a fifth embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the invention.
  • FIG. 17 is a flowchart showing an embodiment of an image processing method according to the present invention.
  • FIG. 18 is a flowchart showing detailed operations in step S12 of FIG.
  • FIG. 19 is an external view of a smartphone which is an embodiment of an imaging apparatus according to the present invention.
  • FIG. 20 is a block diagram illustrating a configuration of the smartphone.
  • FIG. 21 is a diagram used for explaining a problem of conventional HDR synthesis.
  • FIG. 1 is a perspective view of an imaging apparatus according to the present invention as viewed obliquely from the front.
  • FIG. 2 is a rear view of the imaging apparatus.
  • the imaging apparatus 10 is a mirrorless digital single-lens camera including an interchangeable lens 100 and a camera body 200 to which the interchangeable lens 100 can be attached and detached.
  • A main body mount 260 to which the interchangeable lens 100 is attached, a finder window 20 of an optical finder, and the like are provided on the front surface of the camera body 200, while a shutter release switch 22, a shutter speed dial 23, an exposure correction dial 24, a power lever 25, and the built-in flash 30 are mainly provided on the upper surface of the camera body 200.
  • As shown in FIG. 2, a liquid crystal monitor 216, an optical viewfinder eyepiece 26, a MENU/OK key 27, a cross key 28, a playback button 29, and the like are mainly provided on the back of the camera body 200.
  • The liquid crystal monitor 216 functions as a display unit that displays a live view image in the shooting mode, reproduces and displays captured images in the playback mode, and displays various menu screens and various information to the user.
  • The MENU/OK key 27 is an operation key having both a function as a menu button for instructing display of a menu on the screen of the liquid crystal monitor 216 and a function as an OK button for instructing confirmation and execution of a selection.
  • The cross key 28 is an operation unit for inputting instructions in four directions (up, down, left, and right) and functions as a multi-function key for selecting an item from a menu screen and instructing selection of various setting items from each menu.
  • The up and down keys of the cross key 28 function as a zoom switch during imaging or a playback zoom switch in the playback mode, and the left and right keys function as frame advance (forward and reverse) buttons in the playback mode. The cross key 28 also functions as an operation unit for designating an arbitrary subject for focus adjustment from among a plurality of subjects displayed on the liquid crystal monitor 216.
  • In addition to a still image capturing mode and a continuous shooting mode in which still images are captured continuously, various shooting modes can be selected, including an HDR shooting mode (a kind of continuous shooting mode) that captures the plurality of images used for the dynamic-range-expanding HDR composition, and a moving image capturing mode that captures moving images.
  • the playback button 29 is a button for switching to a playback mode in which the recorded still image or moving image is displayed on the liquid crystal monitor 216.
  • FIG. 3 is a block diagram illustrating an embodiment of the internal configuration of the imaging apparatus 10.
  • The interchangeable lens 100, which functions as the imaging optical system of the imaging apparatus 10, is manufactured in conformity with the communication standard of the camera body 200 and, as described later, is an interchangeable lens capable of communicating with the camera body 200.
  • The interchangeable lens 100 includes an imaging optical system 102, a focus lens control unit 116, an aperture control unit 118, a lens side CPU (Central Processing Unit) 120, a flash ROM (Read Only Memory) 126, a lens side communication unit 150, and a lens mount 160.
  • the imaging optical system 102 of the interchangeable lens 100 includes a lens group 104 including a focus lens and a diaphragm 108.
  • the focus lens control unit 116 moves the focus lens in accordance with a command from the lens side CPU 120 and controls the position (focus position) of the focus lens.
  • the diaphragm control unit 118 controls the diaphragm 108 in accordance with a command from the lens side CPU 120.
  • the lens-side CPU 120 controls the interchangeable lens 100 as a whole, and includes a ROM 124 and a RAM (Random Access Memory) 122.
  • the flash ROM 126 is a nonvolatile memory that stores programs downloaded from the camera body 200.
  • the lens-side CPU 120 performs overall control of each part of the interchangeable lens 100 using the RAM 122 as a work area according to a control program stored in the ROM 124 or the flash ROM 126.
  • The lens side communication unit 150 communicates with the camera body 200 via a plurality of signal terminals (lens side signal terminals) provided on the lens mount 160, in the state where the lens mount 160 is attached to the body mount 260 of the camera body 200. That is, in accordance with commands from the lens side CPU 120, the lens side communication unit 150 transmits and receives request signals and response signals (bidirectional communication) to and from the main body side communication unit 250 of the camera body 200 connected via the lens mount 160 and the main body mount 260, and notifies the camera body 200 of lens information for each optical member of the imaging optical system 102 (focus lens position information, focal length information, aperture information, and the like).
  • the interchangeable lens 100 also includes a detection unit (not shown) that detects focus lens position information and aperture information.
  • the aperture information is information indicating the aperture value (F value) of the aperture 108, the aperture diameter of the aperture 108, and the like.
  • The lens side CPU 120 stores various lens information, including the detected focus lens position information and aperture information, in the RAM 122. The lens information is detected when lens information is requested by the camera body 200, when an optical member is driven, or at a fixed period (a period sufficiently shorter than the frame period of a moving image), and the detection result can be held.
  • [Camera Body] As shown in FIG. 3, the camera body 200 includes an image sensor 201, an image sensor control unit 202, an analog signal processing unit 203, an A/D (Analog/Digital) converter 204, an image input controller 205, a digital signal processing unit 206, the main body mount 260, the built-in flash 30 (FIG. 1), a flash light emitting unit 270, a flash control unit 272, a focal-plane shutter (FPS) 280, and an FPS control unit 296, among other components.
  • The image sensor 201 is a complementary metal-oxide semiconductor (CMOS) type color image sensor.
  • The image sensor 201 is not limited to the CMOS type, and may be an XY address type or CCD (Charge Coupled Device) type image sensor.
  • FIG. 4 is a front view showing the configuration of the image sensor 201.
  • The image sensor 201 has a plurality of pixels constituted by photoelectric conversion elements (light receiving cells) arranged two-dimensionally in the horizontal direction (x direction) and the vertical direction (y direction).
  • The first pixel SA and the second pixel SB are configured to selectively receive light obtained by pupil-dividing the light beam incident through the interchangeable lens 100 by the pupil dividing means described below.
  • The image sensor 201 of this example employs a pupil image separation method using a microlens (pupil imaging lens) 201A as the pupil dividing means.
  • A plurality of (two in this example) pixels are assigned to one microlens 201A, and the pupil image incident on one microlens 201A is pupil-divided by that microlens and received by a pair of first pixel SA and second pixel SB.
  • That is, the pupil image is separated according to the incident angle of the light entering the microlens 201A and received by the corresponding pair of first pixel SA and second pixel SB.
  • A pair of first pixel SA and second pixel SB corresponds to a pair of phase difference pixels usable for phase difference AF.
  • Since the pixel value obtained by adding the pixel values of a pair of first pixel SA and second pixel SB is equal to the pixel value of a normal pixel that is not pupil-divided, the pair of first pixel SA and second pixel SB can also be handled as one normal pixel.
  • On each pair of first pixel SA and second pixel SB of the image sensor 201, a color filter of one of the three primary colors red (R), green (G), and blue (B) (R filter, G filter, B filter) is arranged according to a predetermined color filter array.
  • The color filter array shown in FIG. 4 is a general Bayer array, but the color filter array is not limited to this and may be another array such as the X-Trans (registered trademark) array.
  • The optical image of the subject formed on the light receiving surface of the image sensor 201 by the imaging optical system 102 of the interchangeable lens 100 is converted into an electrical signal by the image sensor 201.
  • Charge corresponding to the amount of incident light is accumulated in each pixel of the image sensor 201 (the first pixel SA and the second pixel SB shown in FIG. 4), and an electrical signal corresponding to the charge amount (signal charge) accumulated in each pixel is read out from the image sensor 201 as an image signal. That is, a signal corresponding to the charge amount accumulated in the first pixel SA and the second pixel SB (the pixel values of the first pixel SA and the second pixel SB) can be read out independently.
  • The image sensor control unit 202 controls the reading of the image signal from the image sensor 201 in accordance with a command from the main body side CPU 220. When a still image is captured, the image sensor control unit 202 reads all lines of the image sensor 201 with the FPS 280 closed, after the exposure time has been controlled by opening and closing the FPS 280. The image sensor 201 and the image sensor control unit 202 of this example also support a readout in which an exposure operation is performed sequentially for each line of at least one line, or for each pixel (that is, each line or pixel is sequentially reset to accumulate charge and is then read out).
  • the analog signal processing unit 203 performs various types of analog signal processing on an analog image signal obtained by imaging the subject with the image sensor 201.
  • the analog signal processing unit 203 includes a sampling hold circuit, a color separation circuit, an AGC (Automatic Gain Control) circuit, and the like.
  • The AGC circuit functions as a sensitivity adjustment unit that adjusts the sensitivity (ISO sensitivity; ISO: International Organization for Standardization) at the time of imaging, adjusting the gain of an amplifier that amplifies the input image signal so that the signal level of the image signal falls within an appropriate range.
  • the A / D converter 204 converts the analog image signal output from the analog signal processing unit 203 into a digital image signal.
  • When a still image or moving image is captured, the image data (mosaic image data) for each RGB pixel output via the image sensor 201, the analog signal processing unit 203, and the A/D converter 204 is temporarily stored in the RAM 207 via the image input controller 205.
  • When the image sensor 201 is a CMOS type image sensor, the analog signal processing unit 203 and the A/D converter 204 are often built into the image sensor 201.
  • the digital signal processing unit 206 performs various types of digital signal processing on the image data stored in the RAM 207.
  • The digital signal processing unit 206 reads the image data stored in the RAM 207 as appropriate, performs digital signal processing on it, such as offset processing, gain control processing including sensitivity correction, gamma correction processing, demosaicing (synchronization) processing, and RGB/YCrCb conversion processing, and stores the processed image data in the RAM 207 again.
  • The demosaicing process calculates full RGB color information for each pixel from the RGB mosaic image corresponding to the color filter array; for example, in the case of an image sensor with RGB color filters, it generates three planes of RGB image data from the mosaic data (dot-sequential RGB data).
  • The RGB/YCrCb conversion process converts the demosaiced (synchronized) RGB data into luminance data (Y) and color difference data (Cr, Cb).
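  • As an illustration, a minimal sketch of this conversion using the BT.601 coefficients commonly used in cameras (the publication does not specify the exact matrix):

```python
import numpy as np

def rgb_to_ycrcb(rgb):
    """Converts demosaiced RGB (float array in [0, 1], shape (..., 3)) into
    luminance data Y and color difference data Cr, Cb."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cr = 0.713 * (r - y)   # = 0.5 * (r - y) / (1 - 0.299)
    cb = 0.564 * (b - y)   # = 0.5 * (b - y) / (1 - 0.114)
    return np.stack([y, cr, cb], axis=-1)
```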
  • When a still image or moving image is recorded, the compression/decompression processing unit 208 compresses the uncompressed luminance data Y and color difference data Cb, Cr stored in the RAM 207. A still image is compressed in, for example, JPEG (Joint Photographic Experts Group) format, and a moving image is compressed in, for example, H.264 format.
  • the image data compressed by the compression / decompression processing unit 208 is recorded on the memory card 212 via the media control unit 210.
  • the compression / decompression processing unit 208 performs decompression processing on the compressed image data obtained from the memory card 212 via the media control unit 210 in the playback mode, and generates uncompressed image data.
  • the media control unit 210 performs control to record the image data compressed by the compression / decompression processing unit 208 in the memory card 212. In addition, the media control unit 210 performs control for reading compressed image data from the memory card 212.
  • the display control unit 214 performs control to display uncompressed image data stored in the RAM 207 on the liquid crystal monitor 216.
  • In this example the display unit is the liquid crystal monitor 216, but a display device of another type, such as an organic electroluminescence display, may be used instead.
  • When a live view image is displayed on the liquid crystal monitor 216, the digital image signals continuously generated by the digital signal processing unit 206 are temporarily stored in the RAM 207.
  • the display control unit 214 converts the digital image signal temporarily stored in the RAM 207 into a display signal format and sequentially outputs it to the liquid crystal monitor 216. Thereby, the captured image is displayed on the liquid crystal monitor 216 in real time, and the liquid crystal monitor 216 can be used as an electronic viewfinder.
  • the shutter release switch 22 is an imaging instruction unit for inputting an imaging instruction of a still image or a moving image, and is configured by a two-stage stroke type switch composed of so-called “half press” and “full press”.
  • An S1 ON signal is output when the shutter release switch 22 is half-pressed, and an S2 ON signal is output when it is subsequently fully pressed. When the S1 ON signal is output, the main body side CPU 220 executes shooting preparation processing such as AF control (automatic focus adjustment) and AE control (automatic exposure control); when the S2 ON signal is output, it executes still image shooting processing and recording processing.
  • Needless to say, AF control and AE control are performed automatically when the auto mode is set by the operation unit 222, and are not performed when the manual mode is set.
  • In the moving image capturing mode, when the S2 ON signal is output by fully pressing the shutter release switch 22, the camera body 200 enters a moving image recording mode in which recording of a moving image is started and image processing of the moving image is performed; when the shutter release switch 22 is fully pressed again and the S2 ON signal is output, the camera body 200 enters a standby state and temporarily stops the moving image recording processing.
  • The shutter release switch 22 is not limited to a two-stage stroke type switch with half-press and full-press; the S1 ON signal and the S2 ON signal may be output by a single operation, or individual switches may be provided to output the S1 ON signal and the S2 ON signal.
  • When a touch panel is used as the operation means, the operation instruction may be output by touching the area of the screen corresponding to the displayed operation instruction; the form of the operation means is not limited to these, as long as the preparation processing or the imaging processing is instructed.
  • The still image or moving image acquired by imaging is compressed by the compression/decompression processing unit 208, and the compressed image data is converted into an image file with required attached information (the imaging date and time, GPS information, and imaging conditions such as the F-number, shutter speed, and ISO sensitivity) added to its header, and then stored in the memory card 212 via the media control unit 210.
  • The main body side CPU 220 controls the overall operation of the camera body 200 and the driving of the optical members of the interchangeable lens 100, and controls each part of the camera body 200 and the interchangeable lens 100 based on input from the operation unit 222 including the shutter release switch 22.
  • the clock unit 224 measures time based on a command from the main body CPU 220 as a timer.
  • the clock unit 224 measures the current date and time as a calendar.
  • the flash ROM 226 is a non-volatile memory that can be read and written, and stores setting information.
  • The ROM 228 stores the camera control program executed by the main body side CPU 220, a program for executing the imaging and composition in the HDR imaging mode according to the present invention, defect information of the image sensor 201, and various parameters and tables used for image processing.
  • The main body side CPU 220 controls each part of the camera body 200 and the interchangeable lens 100, using the RAM 207 as a work area, in accordance with the camera control program and the above program stored in the ROM 228.
  • The AF control unit 230, which functions as an automatic focus adjustment unit, calculates the defocus amount necessary for phase difference AF control and, based on the calculated defocus amount, notifies the interchangeable lens 100, via the main body side CPU 220 and the main body side communication unit 250, of the position (focus position) command to which the focus lens should move.
  • the AF control unit 230 includes a phase difference detection unit and a defocus amount calculation unit.
  • Within the AF area of the image sensor 201 (the area containing the main subject designated by the user, the area of the main subject automatically detected by face detection or the like, or the area set by default), the phase difference detection unit acquires pixel data (first pixel values and second pixel values) from a first pixel group consisting of the first pixels SA and a second pixel group consisting of the second pixels SB on which color filters of the same color are arranged, and detects the phase difference based on the first pixel values and the second pixel values.
  • This phase difference can be calculated from the shift amount in the pupil division direction between the plurality of first pixel values of the first pixel group and the plurality of second pixel values of the second pixel group at the point where their correlation is maximized (where the integrated absolute difference between the first pixel values and the second pixel values is minimized).
  • the defocus amount calculation unit calculates the defocus amount by multiplying the phase difference detected by the phase difference detection unit by a coefficient corresponding to the current F value (ray angle) of the interchangeable lens 100.
  • The focus lens position command corresponding to the defocus amount calculated by the AF control unit 230 is notified to the interchangeable lens 100, and the lens side CPU 120 of the interchangeable lens 100, on receiving the command, moves the focus lens via the focus lens control unit 116 and controls the position (focus position) of the focus lens.
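  • A minimal sketch of the correlation search and the defocus conversion, assuming one-dimensional signals taken along the pupil division direction and a per-aperture coefficient k_f supplied from a lens table that is not given here:

```python
import numpy as np

def phase_difference(a, b, max_shift=16):
    """Shifts the SB signal against the SA signal and returns the shift that
    minimizes the summed absolute difference, i.e., maximizes correlation."""
    n = len(a)
    best_shift, best_sad = 0, np.inf
    for shift in range(-max_shift, max_shift + 1):
        lo, hi = max(0, shift), min(n, n + shift)
        sad = np.abs(a[lo:hi] - b[lo - shift:hi - shift]).sum()
        if sad < best_sad:
            best_sad, best_shift = sad, shift
    return best_shift

def defocus_amount(phase_diff, k_f):
    """Defocus amount = phase difference x coefficient k_f corresponding to
    the current F-number (ray angle) of the interchangeable lens."""
    return phase_diff * k_f
```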
  • The AE control unit 232 detects the brightness of the subject (subject luminance) and calculates the numerical value required for AE control and AWB (Auto White Balance) control corresponding to the subject luminance, namely the exposure value (EV value).
  • The AE control unit 232 calculates the EV value based on the brightness of the image acquired via the image sensor 201 and the shutter speed and F-number at which that brightness was acquired.
  • The main body side CPU 220 can determine the F-number, shutter speed, and ISO sensitivity from a predetermined program diagram based on the EV value obtained from the AE control unit 232, and thereby perform AE control.
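  • A minimal sketch of one common way to compute such an EV value, assuming an 18% mid-grey target and an ISO 100 reference (constants that are not given in the publication):

```python
import math

def exposure_value(f_number, shutter_s, mean_luma, iso=100, target_luma=0.18):
    """EV of the scene: the capture settings give the EV of the exposure, and
    the deviation of the measured mean luminance from the mid-grey target
    shifts it to the scene's brightness."""
    ev_settings = math.log2(f_number ** 2 / shutter_s) - math.log2(iso / 100)
    return ev_settings + math.log2(mean_luma / target_luma)
```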
  • The white balance correction unit 234 calculates white balance gains (WB (White Balance) gains) Gr, Gg, Gb for the respective colors of the RGB data (R data, G data, and B data), and performs white balance correction by multiplying the R data, G data, and B data by the calculated WB gains Gr, Gg, and Gb, respectively.
  • As a method of determining the WB gains Gr, Gg, Gb, the type of light source illuminating the subject may be identified by scene recognition (outdoor/indoor determination, etc.) based on the brightness of the subject (EV value), the color temperature of the ambient light, and so on, and the WB gain corresponding to the identified light source type may be read from a storage unit in which appropriate WB gains are stored in advance for each light source type; other known methods that use at least the EV value to determine the WB gains Gr, Gg, Gb are also conceivable.
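  • A minimal sketch of applying the WB gains; the gray-world estimator shown is an illustrative stand-in for the per-light-source gain table, which this summary does not enumerate:

```python
import numpy as np

def gray_world_gains(rgb):
    """Illustrative estimator only: scales each channel so its mean matches
    the green channel's mean."""
    means = rgb.reshape(-1, 3).mean(axis=0)
    return means[1] / means          # (Gr, Gg, Gb)

def apply_white_balance(rgb, gains):
    """Multiplies the R, G, B data by the gains Gr, Gg, Gb respectively."""
    return np.clip(rgb * np.asarray(gains), 0.0, 1.0)
```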
  • The wireless communication unit 236 performs short-range wireless communication according to standards such as Wi-Fi (Wireless Fidelity) (registered trademark) and Bluetooth (registered trademark), transmitting and receiving necessary information to and from peripheral digital devices (mobile terminals such as smartphones).
  • The GPS receiving unit 238 receives GPS signals transmitted from a plurality of GPS satellites in accordance with an instruction from the main body side CPU 220, executes positioning calculation processing based on the received GPS signals, and acquires GPS information consisting of the latitude, longitude, and altitude of the camera body 200. The acquired GPS information can be recorded in the header of an image file as attached information indicating the imaging position of the captured image.
  • the power supply control unit 240 applies power supply voltage supplied from the battery 242 to each unit of the camera main body 200 in accordance with a command from the main body side CPU 220.
  • the power supply control unit 240 supplies the power supply voltage supplied from the battery 242 to each unit of the interchangeable lens 100 via the main body mount 260 and the lens mount 160 in accordance with a command from the main body side CPU 220.
  • the lens power switch 244 switches on and off the power supply voltage applied to the interchangeable lens 100 via the main body mount 260 and the lens mount 160 and switches the level in accordance with a command from the main body side CPU 220.
  • the main body side communication unit 250 transmits / receives a request signal and a response signal to / from the lens side communication unit 150 of the interchangeable lens 100 connected via the main body mount 260 and the lens mount 160 in accordance with a command from the main body side CPU 220 ( (Bidirectional communication).
  • The main body mount 260 is provided with a plurality of terminals 260A as shown in FIG. 1. When the interchangeable lens 100 is attached to the camera body 200 (when the lens mount 160 and the main body mount 260 are connected), the plurality of terminals 260A (FIG. 1) provided on the main body mount 260 and the plurality of terminals (not shown) provided on the lens mount 160 are electrically connected, enabling bidirectional communication between the main body side communication unit 250 and the lens side communication unit 150.
  • the built-in flash 30 (FIG. 1) is, for example, a TTL (Through The Lens) automatic dimming flash, and includes a flash light emitting unit 270 and a flash control unit 272.
  • The flash control unit 272 has a function of adjusting the light emission amount (guide number) of the flash light emitted from the flash light emitting unit 270. That is, in synchronization with a flash imaging instruction from the main body side CPU 220, the flash control unit 272 causes the flash light emitting unit 270 to pre-flash (dimming emission) flash light with a small emission amount, determines the emission amount of the main flash based on the reflected light (including ambient light) incident via the imaging optical system 102 of the interchangeable lens 100, and then causes the flash light emitting unit 270 to emit flash light of the determined emission amount (main emission). Note that, when the HDR imaging mode is selected, the flash control unit 272 performs light emission control different from normal flash imaging; the details will be described later.
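  • The exact dimming rule of the flash control unit 272 is not given here, but one plausible sketch, assuming that the reflected flash luminance scales linearly with the emission amount, is the following; the function name and the linear model itself are assumptions.

```python
def main_emission_from_preflash(y_pre, y_ambient, y_target, pre_amount):
    # Contribution of the pre-flash alone, with ambient light subtracted.
    flash_component = max(y_pre - y_ambient, 1e-6)
    # Scale the emission so the flash component lifts the scene to the target.
    scale = (y_target - y_ambient) / flash_component
    return pre_amount * max(scale, 0.0)

# Example: a pre-flash of amount 1.0 raised luminance from 40 to 60;
# reaching a target of 120 would then call for a main emission of about 4.0.
# main = main_emission_from_preflash(60, 40, 120, 1.0)
```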
  • the FPS 280 constitutes a mechanical shutter of the imaging apparatus 10 and is disposed immediately before the image sensor 201.
  • The FPS control unit 296 controls the opening and closing of the front curtain and rear curtain of the FPS 280 based on input information (S2 ON signal, shutter speed, and the like) from the main body side CPU 220, and thereby controls the exposure time (shutter speed) of the image sensor 201.
  • FIG. 5 is a functional block diagram showing a first embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention, mainly showing the case where image processing (HDR processing) for HDR synthesis is executed based on a plurality of images for HDR synthesis captured in the HDR imaging mode.
  • When performing imaging in the HDR imaging mode, the main body side CPU 220 mainly performs imaging control for the HDR imaging mode and, as shown in FIG. 5, functions as a distance distribution information acquisition unit 220A, a light intensity distribution information calculation unit 220B, a composition ratio determination unit 220C, and an image composition unit 220D.
  • In the HDR imaging mode, the main body side CPU 220 causes a plurality of non-light-emitting images (two in this example) to be captured under different exposure conditions without flash emission, and causes a light-emitting image to be captured under flash emission under the same exposure condition as one of the plurality of non-light-emitting images. The plurality of exposure conditions corresponding to the plurality of non-light-emitting images include an exposure condition under-exposed and an exposure condition over-exposed relative to the appropriate exposure condition.
  • In this example, the main body side CPU 220 causes two non-light-emitting images to be captured under two exposure conditions, underexposure and overexposure, relative to the appropriate exposure determined according to the EV value calculated by the AE control unit 232. The exposure correction values (for example, ±3 EV) for underexposure and overexposure relative to the appropriate exposure may be set arbitrarily by the user, or may be set automatically according to the brightness (EV value) of the object scene.
  • The main body side CPU 220 also causes a light-emitting image to be captured under flash emission under the same exposure condition as one of the plurality of non-light-emitting images (in this example, the under-exposure condition). The two non-light-emitting images and the one light-emitting image are images captured of the same subject, and are preferably continuously shot images captured with the shortest possible imaging interval.
  • The image acquisition unit 221 is a part that acquires a non-light-emitting image (first non-light-emitting image) captured by the imaging unit (the interchangeable lens 100 and the image sensor 201) under underexposure without flash emission, a second non-light-emitting image captured under overexposure, and a light-emitting image captured by the imaging unit under flash emission under the same exposure condition as that of the first non-light-emitting image. The image acquisition unit 221 may acquire the first non-light-emitting image, the second non-light-emitting image, and the light-emitting image temporarily stored in the RAM 207 via the image input controller 205 from the RAM 207, or may acquire the first non-light-emitting image, the second non-light-emitting image, and the light-emitting image captured by the imaging unit directly via the image input controller 205.
  • The image input controller 205 of this embodiment adds the pixel values of the paired first pixel S A and second pixel S B of the image sensor 201 shown in FIG. 4, and thereby acquires the first non-light-emitting image, the second non-light-emitting image, and the light-emitting image at one pixel per microlens 201A. A first image composed of the first pixels S A and a second image composed of the second pixels S B can also be acquired; the first image and the second image are phase difference images having a phase difference according to the subject distance.
  • The distance distribution information acquisition unit 220A acquires the first image and the second image, which are phase difference images having a phase difference, from the image acquisition unit 221, and acquires distance distribution information indicating the distribution of the distance of the subject in the image based on the first image and the second image.
  • That is, the distance distribution information acquisition unit 220A calculates the amount of deviation (phase difference) between each feature point in the first image and the corresponding feature point in the second image, calculates the defocus amount of each feature point based on the phase difference and the current F-number of the diaphragm 108, and calculates the distance of each feature point (subject) from the calculated defocus amount and the current lens position of the focus lens of the imaging optical system 102 of the interchangeable lens 100.
  • A feature point whose defocus amount is zero is at the in-focus distance on which the current lens position of the focus lens is focused; as the defocus amount of a feature point shifts toward the front-focus side or the back-focus side, the distance of that feature point moves from the in-focus distance toward the far side or the near side.
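  • As a rough sketch of this distance calculation, assuming the common linear approximation that the defocus amount grows with the phase shift and the F-number, and treating the lens-specific mapping from defocus to object-side distance as a supplied function (both assumptions, since the embodiment does not give the exact optics), one might write:

```python
def subject_distances(phase_shifts_px, f_number, k_sensor, defocus_to_distance):
    # defocus ~ phase_shift * F-number * sensor constant is a common
    # linear approximation for on-sensor phase-difference detection.
    distances = []
    for shift in phase_shifts_px:
        defocus = shift * f_number * k_sensor
        # Lens-specific mapping from defocus (at the current focus-lens
        # position) to an object-side distance, assumed to be supplied.
        distances.append(defocus_to_distance(defocus))
    return distances
```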
  • The light intensity distribution information calculation unit 220B acquires, from the image acquisition unit 221, the light-emitting image and the first non-light-emitting image captured under the same exposure condition, and acquires, from the distance distribution information acquisition unit 220A, the distance distribution information indicating the distribution of the distance of the subject in the image. Then, based on the acquired light-emitting image, first non-light-emitting image, and distance distribution information, it calculates light intensity distribution information, which is distribution information of an index indicating the intensity of the light (not the flash light but the ambient light) striking the subject.
  • Where i is a parameter indicating the coordinates of each pixel of the first non-light-emitting image, Y i is the luminance value of the pixel i, Y Li is the luminance value of the pixel at the same coordinates of the light-emitting image, and d i is the distance information of the pixel i obtained from the distance distribution information, the light intensity distribution information S i is expressed by the following equation:
  [Equation 1]  S i = Y i / {(Y Li − Y i) × d i ²}
  • The parameter i is not limited to a parameter indicating pixel coordinates, and may be a parameter indicating the position of each region obtained by dividing the image into a plurality of regions; in that case, the luminance values Y i and Y Li, the distance information d i, and the light intensity distribution information S i are the luminance value (representative luminance value), distance information, and light intensity distribution information for each region.
  • The luminance value Y i of the first non-light-emitting image is expressed by the following equation, where B i is the intensity of the light striking the subject, R i is the reflectance of the subject, and K is the photoelectric conversion coefficient of the image sensor 201:
  Y i = B i × R i × K
  Since the reflectance R i and the coefficient K cancel out when this is substituted into Equation 1, the light intensity distribution information S i serves as an index of the intensity of the light striking the subject that does not depend on the reflectance of the subject.
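  • In outline, Equation 1 can be evaluated per pixel as in the following sketch; because the two input images share the same exposure condition, their difference isolates the flash component. The array names and the epsilon guard against pixels the flash barely reaches are assumptions added for numerical safety, not part of the embodiment.

```python
import numpy as np

def light_intensity_distribution(y_non, y_flash, dist, eps=1e-6):
    # Flash-only component of the luminance; eps guards pixels the flash
    # barely reaches (such pixels are handled separately in the fourth
    # embodiment described later).
    flash_gain = np.maximum(y_flash - y_non, eps)
    # Equation 1: S_i = Y_i / ((Y_Li - Y_i) * d_i^2)
    return y_non / (flash_gain * np.square(dist))
```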
  • The combination ratio determination unit 220C is a part that determines the combination ratio of the plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image) based on the light intensity distribution information. Since a portion where the value of the light intensity distribution information S i is large is struck by strong light, it is preferable to use the first non-light-emitting image, which is the under image, for that portion; since a portion where the value of the light intensity distribution information S i is small is struck by little light, it is preferable to use the second non-light-emitting image, which is the over image, for that portion.
  • When the combination ratio determination unit 220C determines the combination ratio for each pixel corresponding to each of the plurality of non-light-emitting images, for a pixel with high light intensity based on the light intensity distribution information S i, the mixing ratio of the first non-light-emitting image, captured under the exposure condition with the smallest exposure value among the plurality of non-light-emitting images, is made larger than the mixing ratio of the second non-light-emitting image, captured under the exposure condition with the largest exposure value; conversely, for a pixel with low light intensity, the mixing ratio of the second non-light-emitting image is made larger than the mixing ratio of the first non-light-emitting image.
  • Note that the combination ratio may be determined for each region obtained by dividing the image into a plurality of regions instead of for each pixel.
  • FIGS. 6 and 7 are graphs each showing an example of the relationship between the light intensity distribution information S i and the mixing ratio α i of the second non-light-emitting image. When the mixing ratio of the second non-light-emitting image is α i, the mixing ratio of the first non-light-emitting image is (1 − α i), and the combination ratio of the first non-light-emitting image and the second non-light-emitting image is (1 − α i):α i.
  • As shown in FIG. 6, the combination ratio determination unit 220C can provide a minimum value α min (> 0) and a maximum value α max (< 1) for the mixing ratio α i, and can determine the combination ratio of the first non-light-emitting image and the second non-light-emitting image by linearly interpolating the mixing ratio α i between the maximum value α max and the minimum value α min as the light intensity distribution information S i varies between its minimum value S min and maximum value S max.
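  • A minimal sketch of this linear interpolation, with illustrative default values for α min and α max (the embodiment does not prescribe concrete values), might look as follows:

```python
import numpy as np

def mixing_ratio_linear(s, s_min, s_max, a_min=0.1, a_max=0.9):
    # Normalize the light intensity to [0, 1] over [S_min, S_max] ...
    t = np.clip((s - s_min) / (s_max - s_min), 0.0, 1.0)
    # ... and let alpha fall linearly from a_max (dark) to a_min (bright).
    return a_max - t * (a_max - a_min)
```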
  • The image composition unit 220D composes the under-exposed first non-light-emitting image and the over-exposed second non-light-emitting image in accordance with the combination ratio determined by the combination ratio determination unit 220C, and generates a composite image (HDR image) with an enlarged dynamic range. That is, the image composition unit 220D composes the first non-light-emitting image and the second non-light-emitting image by applying the combination ratio determined for each corresponding pixel of the two images; when the combination ratio is determined for each region obtained by dividing the image into a plurality of regions, it composes the first non-light-emitting image and the second non-light-emitting image by applying the combination ratio determined for each corresponding region, and generates the HDR image.
  • The HDR image generated in this way is an HDR image with excellent gradation characteristics in which the difference between light and dark is adjusted according to the brightness of the light striking the subject, without losing the contrast of the subject.
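  • The per-pixel composition itself reduces to a weighted sum, as in the following sketch; normalization of the two exposures and tone mapping of the result, which a practical HDR pipeline would also need, are deliberately omitted here.

```python
def hdr_composite(under, over, alpha):
    # Per-pixel weighted sum: (1 - alpha) * first (under) + alpha * second (over).
    return (1.0 - alpha) * under + alpha * over
```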
  • FIG. 9 is a functional block diagram showing a second embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention.
  • parts that are the same as those in the first embodiment shown in FIG. 5 are given the same reference numerals, and detailed descriptions thereof are omitted.
  • the main body side CPU 220 of the second embodiment shown in FIG. 9 is different from the first embodiment shown in FIG. 5 in that the function of the distribution information creation unit 220E is added.
  • The distribution information creation unit 220E creates distribution information (for example, a frequency distribution chart or a frequency distribution table) indicating the frequency distribution of the light intensity, based on the light intensity distribution information S i calculated by the light intensity distribution information calculation unit 220B.
  • FIG. 10 is a frequency distribution diagram showing an example of the frequency distribution of light intensity.
  • Outdoors on a sunny day, the subject to be imaged has a wide dynamic range, ranging from dark subjects not struck by sunlight (shaded subjects) to bright subjects struck by sunlight (sunlit subjects); in this case, the subject to be imaged can be roughly divided into shaded subjects and sunlit subjects. In the light intensity frequency distribution diagram created based on the light intensity distribution information of such a scene, two peaks (a first vertex P1 and a second vertex P2) exist, as shown in FIG. 10. The first vertex P1 is the vertex corresponding to the frequency of the light intensity distribution information of the subject region corresponding to the shaded subjects, and the second vertex P2 is the vertex corresponding to the frequency of the light intensity distribution information of the subject region corresponding to the sunlit subjects.
  • As the distribution information indicating the frequency distribution of the light intensity, the frequency distribution itself may be created, or distribution information that approximates the frequency distribution with a curve may be created.
  • Based on the distribution information created by the distribution information creation unit 220E, the combination ratio determination unit 220C shown in FIG. 9 obtains a first light intensity (S dark) corresponding to the first vertex P1, where the light intensity frequency is high, and a second light intensity (S bright) corresponding to the second vertex P2, where the light intensity frequency is also high, as shown in FIG. 11. When determining the combination ratio for each pixel corresponding to each of the first non-light-emitting image and the second non-light-emitting image, the combination ratio determination unit 220C sets the mixing ratio α i of the second non-light-emitting image to the maximum value α max when the light intensity is equal to or lower than the first light intensity (S dark), sets it to the minimum value α min when the light intensity is equal to or higher than the second light intensity (S bright), and decreases it linearly (monotonically) between the maximum value α max and the minimum value α min when the light intensity is larger than the first light intensity (S dark) and smaller than the second light intensity (S bright), as shown in FIG. 11.
  • A pixel or region having a light intensity equal to or lower than the first light intensity (S dark) is considered to be in shade, so it is preferable to set the mixing ratio α i of the over-exposed second non-light-emitting image to the maximum value α max. A pixel or region having a light intensity equal to or higher than the second light intensity (S bright) is considered to be in the sun, so it is preferable to set the mixing ratio α i of the over-exposed second non-light-emitting image to the minimum value α min (that is, to maximize the mixing ratio of the under-exposed first non-light-emitting image). The gradation of the highlight portion and the shadow portion can thereby be enriched.
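  • A sketch of this two-threshold mapping (again with illustrative α min and α max values, which are assumptions) follows directly from FIG. 11:

```python
import numpy as np

def mixing_ratio_two_peaks(s, s_dark, s_bright, a_min=0.1, a_max=0.9):
    # alpha = a_max at or below S_dark, a_min at or above S_bright,
    # linearly decreasing in between (cf. FIG. 11).
    t = np.clip((s - s_dark) / (s_bright - s_dark), 0.0, 1.0)
    return a_max - t * (a_max - a_min)
```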
  • FIG. 12 is a frequency distribution diagram similar to the frequency distribution of the light intensity shown in FIG. 10. Based on the distribution information created by the distribution information creation unit 220E, the combination ratio determination unit 220C shown in FIG. 9 obtains the light intensity (St) corresponding to the bottom point V, where the light intensity frequency is low, as shown in FIG. 12.
  • When determining the combination ratio for each pixel corresponding to each of the first non-light-emitting image and the second non-light-emitting image, the combination ratio determination unit 220C uses the light intensity (St) as a threshold value, as shown in FIG. 13: for a pixel whose light intensity is equal to or lower than the threshold value, the mixing ratio α i of the second non-light-emitting image is set to the maximum value α max, and for a pixel whose light intensity exceeds the threshold value, the mixing ratio α i of the second non-light-emitting image is set to the minimum value α min (the mixing ratio of the under-exposed first non-light-emitting image is maximized). Note that setting the mixing ratio of the second non-light-emitting image to the maximum value α max or the minimum value α min according to the light intensity includes the case where the mixing ratio is changed continuously between the maximum value α max and the minimum value α min in the vicinity of the threshold value.
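  • One simple way to locate the bottom point V, assuming a roughly bimodal frequency distribution as in FIG. 12 (the histogram bin count and the peak-picking rule are assumptions), is:

```python
import numpy as np

def valley_threshold(s_values, bins=64):
    # Histogram of the light intensity values (assumes a bimodal scene).
    hist, edges = np.histogram(s_values, bins=bins)
    # Local maxima of the histogram.
    peaks = [i for i in range(1, bins - 1)
             if hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1]]
    # Two most frequent peaks, in index order (vertices P1 and P2).
    p1, p2 = sorted(sorted(peaks, key=lambda i: hist[i])[-2:])
    # Lowest-frequency bin between them is the bottom point V.
    v = p1 + int(np.argmin(hist[p1:p2 + 1]))
    # Return the bin center as the threshold St.
    return 0.5 * (edges[v] + edges[v + 1])
```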
  • FIG. 14 is a functional block diagram showing a third embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention.
  • the same reference numerals are given to the portions common to the first embodiment shown in FIG. 5, and the detailed description thereof is omitted.
  • the main body side CPU 220 of the third embodiment shown in FIG. 14 is different from the first embodiment shown in FIG. 5 in that the function of the luminance distribution information calculation unit 220F is added.
  • The luminance distribution information calculation unit 220F is a part that calculates luminance distribution information indicating the luminance distribution within the image of at least one non-light-emitting image among the plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image).
  • the luminance distribution information can be created, for example, as an image made up of luminance signals of the first non-emission image, or as information indicating the luminance (representative luminance) for each area into which the image is divided.
  • The combination ratio determination unit 220C shown in FIG. 14 determines the combination ratio of the plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image) based on the light intensity distribution information calculated by the light intensity distribution information calculation unit 220B and the luminance distribution information calculated by the luminance distribution information calculation unit 220F.
  • When the combination ratio determination unit 220C determines the combination ratio for each pixel or region corresponding to each of the first non-light-emitting image and the second non-light-emitting image, with i as the parameter indicating the position of the pixel or region and α i as the mixing ratio of the second non-light-emitting image, the mixing ratio of the first non-light-emitting image is (1 − α i), and the combination ratio of the first non-light-emitting image and the second non-light-emitting image is (1 − α i):α i.
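  • Equation [6], which combines the light-intensity-based ratio and the luminance-based ratio, is not reproduced in this text; as a purely hypothetical stand-in that is at least consistent with the behavior described below for flash light non-reaching pixels, the two ratios could be combined multiplicatively:

```python
def combined_mixing_ratio(alpha_light, alpha_luma):
    # Hypothetical combination (Equation [6] itself is not given here).
    # Setting the light-intensity term to 1 for a pixel the flash does
    # not reach leaves only the luminance-based term, which matches the
    # behavior described for the fourth embodiment.
    return alpha_light * alpha_luma
```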
  • FIG. 15 is a functional block diagram showing a fourth embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention.
  • the same reference numerals are given to the portions common to the third embodiment shown in FIG. 14, and the detailed description thereof is omitted.
  • the main body side CPU 220 of the fourth embodiment shown in FIG. 15 is different from the third embodiment shown in FIG. 14 in that the function of the flash light non-reaching pixel detection unit 220G is added.
  • The flash light non-reaching pixel detection unit 220G calculates the difference (Y Li − Y i) between the luminance value Y i of the pixel i of the first non-light-emitting image and the luminance value Y Li of the pixel i at the same coordinates of the light-emitting image, and detects a pixel i for which the difference (Y Li − Y i) is 0 as a flash light non-reaching pixel.
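  • This detection is a per-pixel comparison, sketched below; the optional tolerance parameter is an assumption for absorbing sensor noise, while tol = 0 follows the text exactly.

```python
import numpy as np

def flash_non_reaching_mask(y_non, y_flash, tol=0.0):
    # Pixels where the flash adds no luminance: (Y_Li - Y_i) <= tol.
    return (np.asarray(y_flash) - np.asarray(y_non)) <= tol
```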
  • The combination ratio determination unit 220C shown in FIG. 15 determines the combination ratio of the plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image) based on the light intensity distribution information calculated by the light intensity distribution information calculation unit 220B and the luminance distribution information calculated by the luminance distribution information calculation unit 220F. However, when a flash light non-reaching pixel is detected by the flash light non-reaching pixel detection unit 220G, the combination ratio for that flash light non-reaching pixel is determined based only on the luminance distribution information.
  • That is, the combination ratio determination unit 220C calculates the mixing ratio α i of the second non-light-emitting image based on the light intensity distribution information and the luminance distribution information using Equation [6]; for a flash light non-reaching pixel, however, the term of Equation [6] corresponding to the mixing ratio based on the light intensity distribution information is set to 1, so that the mixing ratio α i of the second non-light-emitting image is obtained only from the mixing ratio based on the luminance distribution information.
  • The flash light non-reaching pixel detection unit 220G is not limited to detecting flash light non-reaching pixels; it may function as a detection unit that detects flash light non-reaching regions when the image is divided into a plurality of regions. The flash light non-reaching pixel detection unit 220G is also used by the flash control unit 272 for controlling the light emission amount of the flash light emitted from the flash light emitting unit 270; the details of the light emission amount control by the flash control unit 272 will be described later.
  • FIG. 16 is a functional block diagram showing a fifth embodiment of the main body side CPU 220 functioning as an image processing apparatus according to the present invention.
  • the same reference numerals are given to the portions common to the first embodiment shown in FIG. 5, and detailed description thereof is omitted.
  • The main body side CPU 220 of the fifth embodiment shown in FIG. 16 does not include the distance distribution information acquisition unit 220A of the first embodiment, and instead acquires distance distribution information from an external distance distribution information acquisition unit 223.
  • the distance distribution information acquisition unit 223 can be configured by, for example, a distance image sensor in which a plurality of light receiving elements are arranged two-dimensionally.
  • As such a distance image sensor, one that acquires a distance image by the TOF (Time Of Flight) method is conceivable.
  • The TOF method is a method of obtaining the distance to the subject by irradiating the subject with light and measuring the time until the reflected light is received by the sensor. As the TOF method, there are known a method of irradiating the subject with pulsed light and acquiring a distance image of the subject from the amount of light received (light reception intensity) at each pixel of the distance image sensor, and a method of irradiating the subject with light modulated at a high frequency and acquiring a distance image by detecting the phase shift from the irradiation until the reflected light is received.
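  • Both TOF variants reduce to the same relation: the light travels to the subject and back, so d = c·Δt/2, and in the phase-shift variant Δt is recovered from the phase Δφ of the modulation frequency f as Δt = Δφ/(2πf). A minimal sketch:

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def distance_from_round_trip(dt_s):
    # Directly measured round trip: d = c * dt / 2.
    return 0.5 * C * dt_s

def distance_from_phase(phi_rad, f_mod_hz):
    # Phase-shift variant: dt = phi / (2 * pi * f_mod),
    # unambiguous only up to c / (2 * f_mod).
    return 0.5 * C * phi_rad / (2.0 * math.pi * f_mod_hz)
```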
  • A three-dimensional laser measuring instrument, a stereo camera, or the like can also be applied as the distance distribution information acquisition unit 223. In this case, the image sensor 201 is not limited to an image sensor capable of distance measurement as in this example, and may be an image sensor that cannot measure distance.
  • FIG. 17 is a flowchart showing an embodiment of an image processing method according to the present invention.
  • the operation of the HDR process performed by the main body side CPU 220 having the functions of the units illustrated in FIG. 5 and the like will be mainly described.
  • The distance distribution information acquisition unit 220A of the main body side CPU 220 acquires distance distribution information indicating the distribution of the distance of the subject in the image (step S10).
  • the distance distribution information acquisition unit 220A acquires a first image and a second image that are phase difference images having a phase difference from each other from the image acquisition unit 221, and based on the first image and the second image. Then, distance distribution information indicating the distribution of the distance of the subject in the image is acquired.
  • Next, the image acquisition unit 221 acquires a plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image) of the same subject captured under different exposure conditions without flash emission, and a light-emitting image captured under flash emission under the same exposure condition as one of the plurality of non-light-emitting images (in this example, the under-exposure condition of the first non-light-emitting image) (step S12).
  • The detailed operation in step S12 will now be described with reference to the flowchart shown in FIG. 18.
  • First, the flash control unit 272 determines, based on the distance distribution information acquired in step S10, the dimming light emission amount for the light control image to be emitted from the flash light emitting unit 270 (step S100). The dimming light emission amount is calculated from the distance distribution as the maximum light emission amount at which no whiteout occurs, assuming that the average reflectance of the subjects within the angle of view is 18%.
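  • The embodiment does not give the exact calculation, but with the guide-number relation GN = distance × F-number (specified at the reference ISO for an average reflectance of 18%), the nearest subject bounds the usable emission, as in this hypothetical sketch:

```python
def dimming_guide_number(dist_map, f_number):
    # The nearest subject within the angle of view is the first to
    # white out, so it bounds the usable guide number: GN = d * F.
    d_near = float(min(dist_map))
    return d_near * f_number

# Example: nearest subject at 1.5 m with F4 bounds the pre-flash
# at roughly GN 6 (reference ISO, 18% average reflectance).
# gn = dimming_guide_number([1.5, 3.0, 7.0], 4.0)
```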
  • Next, the flash control unit 272 causes the flash light emitting unit 270 to emit flash light with the calculated dimming light emission amount, and the main body side CPU 220 acquires a light control image captured under the dimming emission under the same exposure condition (underexposure) as the first non-light-emitting image (step S102).
  • Subsequently, the main body side CPU 220 determines, based on the detection output of the flash light non-reaching pixel detection unit 220G (FIG. 15), whether or not there is a pixel or region that the flash light has not reached (step S104). The flash light non-reaching pixel detection unit 220G acquires a frame image (non-light-emitting image) of the live view image captured under the same exposure condition as the light control image, compares the light control image with the non-light-emitting image, and detects flash light non-reaching pixels or regions.
  • If it is determined in step S104 that there is no pixel or region that the flash light has not reached (in the case of "No"), the main body side CPU 220 determines whether or not there is a whiteout region in the light control image (step S106). If there is a region of the maximum pixel value (255 when the pixel value is represented by 8 bits) in the light control image, that region can be determined to be a whiteout region.
  • If it is determined in step S106 that there is no whiteout region in the light control image (in the case of "No"), the light control image is acquired as the light-emitting image (step S108). In this case, there is no need to capture a light-emitting image again, and the number of captured images can be reduced. If it is determined in step S106 that there is a whiteout region in the light control image (in the case of "Yes"), the flash control unit 272 calculates the maximum light emission amount at which the whiteout region does not white out, and sets the calculated maximum light emission amount as the main light emission amount for acquiring the light-emitting image (step S110).
  • On the other hand, if it is determined in step S104 that there is a region that the flash light has not reached (in the case of "Yes"), the main body side CPU 220 determines whether or not the region that the flash light has not reached is within the maximum reach distance of the flash light (step S112). The distance of the region that the flash light has not reached can be obtained from the distance distribution information, and the maximum reach distance of the flash light can be obtained from the guide number and the F-number of the flash light emitting unit 270.
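  • The guide-number relation gives this bound directly: since GN = distance × F-number at the reference ISO, the maximum reach distance is d max = GN / F, for example:

```python
def max_flash_reach(guide_number, f_number):
    # GN = distance x F-number at the reference ISO, so the farthest
    # distance the flash can properly expose is d_max = GN / F.
    return guide_number / f_number

# e.g. max_flash_reach(20, 2.8) -> about 7.1 m
```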
  • If it is determined in step S112 that the region that the flash light has not reached is farther than the maximum reach distance of the flash light (in the case of "No"), the combination ratio determination unit 220C shown in FIG. 15 sets the mixing ratio of that region (the light-intensity-based mixing ratio of the second non-light-emitting image shown in Equation [6]) to 1 (step S114). As a result, the mixing ratio α i of the second non-light-emitting image in the region that the flash light has not reached is determined only by the mixing ratio based on the luminance distribution information.
  • On the other hand, if it is determined in step S112 that the region that the flash light has not reached is closer than the maximum reach distance of the flash light (in the case of "Yes"), the flash control unit 272 calculates the maximum light emission amount at which the region that the flash light has reached does not white out, and sets the calculated maximum light emission amount as the main light emission amount for acquiring the light-emitting image (step S116).
  • When the main light emission amount is calculated in step S110 or step S116, the flash control unit 272 performs the main flash emission from the flash light emitting unit 270 with the calculated main light emission amount, and the image acquisition unit 221 acquires a light-emitting image captured by the imaging unit (the interchangeable lens 100 and the image sensor 201) under the main emission under the same exposure condition as the first non-light-emitting image (step S118). Subsequently, the image acquisition unit 221 acquires a non-light-emitting image (the first non-light-emitting image) captured under the same exposure condition (underexposure) as the light-emitting image, and the second non-light-emitting image captured under overexposure.
  • Next, the light intensity distribution information calculation unit 220B calculates the light intensity distribution information S i from Equation [1], based on the non-light-emitting image (the under-exposed first non-light-emitting image), the light-emitting image, and the distance distribution information acquired by the distance distribution information acquisition unit 220A (step S14).
  • The combination ratio determination unit 220C determines a combination ratio for HDR-composing the plurality of non-light-emitting images (the first non-light-emitting image and the second non-light-emitting image) based on the light intensity distribution information S i calculated in step S14 (step S16). For example, when determining the combination ratio for each pixel or region corresponding to each of the plurality of non-light-emitting images, the combination ratio determination unit 220C makes the mixing ratio of the under-exposed first non-light-emitting image larger than the mixing ratio of the over-exposed second non-light-emitting image for a pixel or region with high light intensity based on the light intensity distribution information S i, and makes the mixing ratio of the second non-light-emitting image larger than the mixing ratio of the first non-light-emitting image for a pixel or region with low light intensity.
  • the image composition unit 220D performs HDR composition of the first non-light-emitting image and the second non-light-emitting image based on the composition ratio determined in Step S16, and generates an HDR image (Step S18).
  • In the embodiments described above, the imaging apparatus 10 is a mirrorless digital single-lens camera, but the imaging apparatus is not limited thereto and may be a single-lens reflex camera, a lens-integrated imaging apparatus, a digital video camera, or the like. The present invention is also applicable to mobile devices having functions other than imaging (a call function, a communication function, and other computer functions).
  • Other modes to which the present invention can be applied include, for example, mobile phones and smartphones having a camera function, PDAs (Personal Digital Assistants), and portable game machines.
  • FIG. 19 shows the appearance of a smartphone 500 that is an embodiment of the imaging apparatus of the present invention.
  • The smartphone 500 illustrated in FIG. 19 includes a flat housing 502 and a display input unit 520 in which a display panel 521 as a display unit and an operation panel 522 as an input unit are integrated on one surface of the housing 502.
  • the casing 502 includes a speaker 531, a microphone 532, an operation unit 540, and a camera unit 541.
  • the configuration of the housing 502 is not limited to this, and, for example, a configuration in which the display unit and the input unit are independent, or a configuration having a folding structure or a slide mechanism may be employed.
  • FIG. 20 is a block diagram showing the configuration of the smartphone 500 shown in FIG. 19. As shown in FIG. 20, the main components of the smartphone are a wireless communication unit 510 that performs mobile wireless communication via a base station and a mobile communication network, a display input unit 520, a call unit 530, an operation unit 540, a camera unit 541, a recording unit 550, an external input/output unit 560, a GPS (Global Positioning System) receiving unit 570, a motion sensor unit 580, a power supply unit 590, and a main control unit 501.
  • The wireless communication unit 510 performs wireless communication with a base station accommodated in the mobile communication network in accordance with instructions from the main control unit 501. Using this wireless communication, it transmits and receives various file data such as audio data and image data and e-mail data, and receives web data, streaming data, and the like.
  • The display input unit 520 is a so-called touch panel that, under the control of the main control unit 501, displays images (still images and moving images), character information, and the like to visually convey information to the user, and detects user operations on the displayed information; it includes a display panel 521 and an operation panel 522.
  • the display panel 521 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device.
  • the operation panel 522 is a device that is placed so that an image displayed on the display surface of the display panel 521 is visible and detects one or a plurality of coordinates operated by a user's finger or stylus. When such a device is operated by a user's finger or stylus, a detection signal generated due to the operation is output to the main control unit 501. Next, the main control unit 501 detects an operation position (coordinates) on the display panel 521 based on the received detection signal.
  • The display panel 521 and the operation panel 522 of the smartphone 500 exemplified as an embodiment of the imaging apparatus of the present invention integrally constitute the display input unit 520, arranged such that the operation panel 522 completely covers the display panel 521.
  • the operation panel 522 may have a function of detecting a user operation even in an area outside the display panel 521.
  • In other words, the operation panel 522 may include a detection area for the overlapping portion that overlaps the display panel 521 (hereinafter referred to as the display area) and a detection area for the outer edge portion that does not overlap the display panel 521 (hereinafter referred to as the non-display area).
  • The operation panel 522 may include two sensitive regions, the outer edge portion and the inner portion. The width of the outer edge portion is appropriately designed according to the size of the housing 502 and the like. Examples of the position detection method employed in the operation panel 522 include a matrix switch method, a resistance film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method; any of these methods may be adopted.
  • The call unit 530 includes a speaker 531 and a microphone 532; it converts the user's voice input through the microphone 532 into voice data that can be processed by the main control unit 501 and outputs the voice data to the main control unit 501, and decodes voice data received by the wireless communication unit 510 or the external input/output unit 560 and outputs it from the speaker 531.
  • the speaker 531 and the microphone 532 can be mounted on the same surface as the surface on which the display input unit 520 is provided.
  • the operation unit 540 is a hardware key using a key switch or the like, and receives an instruction from the user.
  • For example, the operation unit 540 is a push-button switch that is mounted on the side surface of the housing 502 of the smartphone 500, is turned on when pressed with a finger or the like, and is turned off by the restoring force of a spring or the like when the finger is released.
  • The recording unit 550 stores the control program and control data of the main control unit 501, application software (including the image processing program according to the present invention), address data associating the names and telephone numbers of communication partners, sent and received e-mail data, web data downloaded by web browsing, and downloaded content data, and temporarily stores streaming data and the like.
  • The recording unit 550 includes an internal storage unit 551 built in the smartphone and an external storage unit 552 having a removable external memory slot. Each of the internal storage unit 551 and the external storage unit 552 constituting the recording unit 550 is realized by using a recording medium such as a flash memory type, hard disk type, multimedia card micro type, or card type memory (for example, Micro SD (registered trademark) memory), a RAM (Random Access Memory), or a ROM (Read Only Memory).
  • The external input/output unit 560 serves as an interface with all external devices connected to the smartphone 500, and directly or indirectly connects to other external devices by communication (for example, universal serial bus (USB), IEEE 1394, etc.) or by network (for example, the Internet, wireless LAN (Local Area Network), Bluetooth (registered trademark), RFID (Radio Frequency Identification), infrared communication (Infrared Data Association: IrDA) (registered trademark), UWB (Ultra Wideband) (registered trademark), ZigBee (registered trademark), etc.).
  • Examples of external devices connected to the smartphone 500 include a wired/wireless headset, a wired/wireless external charger, a wired/wireless data port, a memory card or SIM (Subscriber Identity Module)/UIM (User Identity Module) card connected via a card socket, external audio/video equipment connected via an audio/video I/O (Input/Output) terminal, wirelessly connected external audio/video equipment, a wired/wirelessly connected smartphone, a wired/wirelessly connected personal computer, a wired/wirelessly connected PDA, and an earphone.
  • The external input/output unit 560 can transmit data received from such external devices to each component inside the smartphone 500, and can transmit data inside the smartphone 500 to external devices.
  • The GPS receiving unit 570 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with instructions from the main control unit 501, executes positioning calculation processing based on the plurality of received GPS signals, and detects the position of the smartphone 500 consisting of latitude, longitude, and altitude. When position information can be acquired from the wireless communication unit 510 or the external input/output unit 560 (for example, a wireless LAN), the GPS receiving unit 570 can also detect the position using that position information.
  • the motion sensor unit 580 includes, for example, a triaxial acceleration sensor and a gyro sensor, and detects the physical movement of the smartphone 500 in accordance with an instruction from the main control unit 501. By detecting the physical movement of the smartphone 500, the moving direction and acceleration of the smartphone 500 are detected. This detection result is output to the main control unit 501.
  • the power supply unit 590 supplies power stored in a battery (not shown) to each unit of the smartphone 500 in accordance with an instruction from the main control unit 501.
  • the main control unit 501 includes a microprocessor, operates according to a control program and control data stored in the recording unit 550, and controls each unit of the smartphone 500 in an integrated manner. Further, the main control unit 501 includes a mobile communication control function and an application processing function for controlling each unit of the communication system in order to perform voice communication and data communication through the wireless communication unit 510.
  • the application processing function is realized by the main control unit 501 operating in accordance with application software stored in the recording unit 550.
  • Examples of the application processing function include an infrared communication function for controlling the external input/output unit 560 to perform data communication with a counterpart device, an e-mail function for sending and receiving e-mails, a web browsing function for browsing web pages, and the image processing function according to the present invention.
  • the main control unit 501 has an image processing function such as displaying video on the display input unit 520 based on image data (still image or moving image data) such as received data or downloaded streaming data.
  • the image processing function refers to a function in which the main control unit 501 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 520.
  • the main control unit 501 executes display control for the display panel 521 and operation detection control for detecting a user operation through the operation unit 540 and the operation panel 522.
  • By executing the display control, the main control unit 501 displays icons for starting application software and software keys such as a scroll bar, or displays a window for creating an e-mail. The scroll bar is a software key for accepting an instruction to move the displayed portion of an image that is too large to fit in the display area of the display panel 521. By executing the operation detection control, the main control unit 501 detects user operations through the operation unit 540, accepts operations on the icons and input of character strings into the input fields of the window through the operation panel 522, and accepts requests to scroll the displayed image through the scroll bar.
  • Furthermore, by executing the operation detection control, the main control unit 501 determines whether the operation position on the operation panel 522 is in the overlapping portion that overlaps the display panel 521 (display area) or in the outer edge portion that does not overlap the display panel 521 (non-display area), and has a touch panel control function for controlling the sensitive area of the operation panel 522 and the display positions of the software keys.
  • the main control unit 501 can also detect a gesture operation on the operation panel 522 and execute a preset function in accordance with the detected gesture operation.
  • A gesture operation means not a conventional simple touch operation but an operation of drawing a trajectory with a finger or the like, specifying a plurality of positions simultaneously, or, by combining these, drawing a trajectory for at least one of a plurality of positions.
  • the camera unit 541 is a digital camera that performs electronic photography using an imaging device such as a complementary metal oxide semiconductor (CMOS) or a charge-coupled device (CCD), and corresponds to the imaging device 10 illustrated in FIG.
  • Under the control of the main control unit 501, the camera unit 541 can convert image data obtained by imaging into compressed image data such as JPEG (Joint Photographic coding Experts Group), record the data in the recording unit 550, and output the data through the external input/output unit 560 or the wireless communication unit 510. As shown in FIG. 19, in the smartphone 500 the camera unit 541 is mounted on the same surface as the display input unit 520, but the mounting position of the camera unit 541 is not limited to this; it may be mounted on the back surface of the display input unit 520, or a plurality of camera units 541 may be mounted. When a plurality of camera units 541 are mounted, the camera unit 541 used for shooting can be switched to shoot alone, or a plurality of camera units 541 can be used simultaneously for shooting.
  • the camera unit 541 can be used for various functions of the smartphone 500.
  • an image acquired by the camera unit 541 can be displayed on the display panel 521, or the image of the camera unit 541 can be used as one of operation inputs of the operation panel 522.
  • When the GPS receiving unit 570 detects the position, the position can also be detected with reference to an image from the camera unit 541.
  • Furthermore, by referring to the image from the camera unit 541, the optical axis direction of the camera unit 541 of the smartphone 500 can be determined without using the triaxial acceleration sensor, or in combination with the triaxial acceleration sensor (gyro sensor), and the current usage environment can also be determined.
  • the image from the camera unit 541 can be used in the application software.
  • Position information acquired by the GPS receiving unit 570, voice information acquired by the microphone 532 (which may be converted into text information through voice-to-text conversion by the main control unit or the like), posture information acquired by the motion sensor unit 580, and the like can be added to the image data of a still image or a moving image and recorded in the recording unit 550, or output through the external input/output unit 560 or the wireless communication unit 510.
  • In the embodiments described above, the image processing apparatus (main body side CPU 220) that performs HDR composition is built into the imaging apparatus, but the image processing apparatus according to the present invention may be, for example, a personal computer or a portable terminal separate from the imaging apparatus that executes the image processing program according to the present invention. In this case, it is necessary to input a plurality of non-light-emitting images, a light-emitting image, distance distribution information, and the like as information for HDR synthesis.
  • The plurality of non-light-emitting images are not limited to the two non-light-emitting images of the under-exposed first non-light-emitting image and the over-exposed second non-light-emitting image; three or more non-light-emitting images captured under different exposure conditions may be used. In this case, the combination ratio determination unit determines the combination ratio of the three or more non-light-emitting images.
  • The hardware structure of the processing units that execute various processes is any of the following various processors: a CPU (Central Processing Unit), which is a general-purpose processor that functions as various processing units by executing software (programs); a programmable logic device (PLD), such as an FPGA (Field Programmable Gate Array), whose circuit configuration can be changed after manufacture; and a dedicated electric circuit, such as an ASIC (Application Specific Integrated Circuit), which is a processor having a circuit configuration designed exclusively to execute a specific process.
  • One processing unit may be configured by one of these various processors, or may be configured by two or more processors of the same type or different types (for example, a plurality of FPGAs, or a combination of a CPU and an FPGA). A plurality of processing units may also be configured by one processor. As examples of configuring a plurality of processing units with one processor, first, as represented by computers such as clients and servers, there is a form in which one processor is configured by a combination of one or more CPUs and software, and this processor functions as a plurality of processing units. Second, as represented by a system on chip (SoC), there is a form of using a processor that realizes the functions of an entire system including a plurality of processing units with a single IC chip. As described above, the various processing units are configured using one or more of the various processors as a hardware structure.
  • the hardware structure of these various processors is more specifically an electric circuit (circuitry) in which circuit elements such as semiconductor elements are combined.
  • The present invention also includes an image processing program that, by being installed in an imaging apparatus or a computer, causes it to function as the imaging apparatus or the image processing apparatus according to the present invention, and a recording medium on which the image processing program is recorded.
  • Explanation of references: 10 imaging device; 22 finder window; 23 shutter release switch; 24 shutter speed dial; 25 exposure compensation dial; 26 power lever; 27 eyepiece; 28 MENU/OK key; play button; 30 built-in flash; 100 interchangeable lens; 102 imaging optical system; 104 lens group; 108 diaphragm; focus lens control unit; 118 aperture control unit; 120 lens side CPU; 150 lens side communication unit; 160 lens mount; 200 camera body; 201 image sensor; 201A microlens; image sensor control unit; analog signal processing unit; 205 image input controller; digital signal processing unit; 207 RAM; compression/decompression processing unit; memory control unit; 212 memory card; display control unit; 220 main body side CPU; 220A distance distribution information acquisition unit; 220B light intensity distribution information calculation unit; 220C composition ratio determination unit; 220D image composition unit; 220E distribution information creation unit

Abstract

The invention relates to an image processing device, method, and program, and an imaging device, capable of acquiring an HDR image in which the light/dark difference has been adjusted according to the brightness of the light striking a subject. This image processing device comprises: an image acquisition unit (221) that acquires a first non-light-emitting image captured with underexposure, a second non-light-emitting image captured with overexposure, and a light-emitting image captured under the same exposure conditions as the first non-light-emitting image; a distance distribution information acquisition unit (220A) that acquires distance distribution information indicating the distribution of the distances of the subject; a light intensity distribution information calculation unit (220B) that calculates light intensity distribution information of the light projected onto the subject, based on the light-emitting image, the first and second non-light-emitting images, and the distance distribution information; a composition ratio determination unit (220C) that determines a composition ratio for the first and second non-light-emitting images based on the light intensity distribution information; and an image composition unit (220D) that composes the first and second non-light-emitting images according to the composition ratio.
PCT/JP2019/015046 2018-04-26 2019-04-05 Dispositif, procédé et programme de traitement d'images, et dispositif d'imagerie WO2019208155A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020516173A JP6810299B2 (ja) 2018-04-26 2019-04-05 Image processing apparatus, method, and program, and imaging apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-085582 2018-04-26
JP2018085582 2018-04-26

Publications (1)

Publication Number Publication Date
WO2019208155A1 true WO2019208155A1 (fr) 2019-10-31

Family

ID=68294051

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/015046 WO2019208155A1 (fr) 2018-04-26 2019-04-05 Dispositif, procédé et programme de traitement d'images, et dispositif d'imagerie

Country Status (2)

Country Link
JP (1) JP6810299B2 (fr)
WO (1) WO2019208155A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111327800A (zh) * 2020-01-08 2020-06-23 深圳深知未来智能有限公司 All-weather vehicle-mounted vision system and method adapted to complex illumination environments
CN114513589A (zh) * 2020-10-06 2022-05-17 联发科技股份有限公司 Image acquisition method and related image acquisition system
CN114690510A (zh) * 2020-12-29 2022-07-01 财团法人工业技术研究院 Image capturing method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003018617A (ja) * 2001-07-03 2003-01-17 Olympus Optical Co Ltd Imaging apparatus
JP2008205935A (ja) * 2007-02-21 2008-09-04 National Univ Corp Shizuoka Univ Exposure ratio determination method in image composition
WO2017065949A1 (fr) * 2015-10-16 2017-04-20 CapsoVision, Inc. Single image sensor for capturing mixed structured-light images and regular images


Also Published As

Publication number Publication date
JPWO2019208155A1 (ja) 2021-03-25
JP6810299B2 (ja) 2021-01-06

Similar Documents

Publication Publication Date Title
CN101610363B (zh) Apparatus and method for blurring an image background in a digital image processing device
JP6302554B2 (ja) Image processing apparatus, imaging apparatus, image processing method, and image processing program
JP6017225B2 (ja) Imaging apparatus
TW201240448A (en) Combined ambient and flash exposure for improved image quality
JP6063093B2 (ja) Image processing apparatus, imaging apparatus, image processing method, and program
JP6810299B2 (ja) Image processing apparatus, method, and program, and imaging apparatus
JP6921972B2 (ja) Image processing apparatus, imaging apparatus, image processing method, imaging method, and program
US11032483B2 (en) Imaging apparatus, imaging method, and program
JP7112529B2 (ja) Imaging apparatus, imaging method, and program
JP6192416B2 (ja) Imaging apparatus and control method therefor, program, and storage medium
WO2019111659A1 (fr) Image processing device, imaging device, image processing method, and program
JP6534780B2 (ja) Imaging apparatus, imaging method, and program
JP6998454B2 (ja) Imaging apparatus, imaging method, program, and recording medium
WO2020161969A1 (fr) Image processing device, photographing device, image processing method, and image processing program
JP6810298B2 (ja) Image registration assisting device, method, and program, and imaging apparatus
JP6941744B2 (ja) Image processing apparatus, photographing apparatus, image processing method, and image processing program
JP7352034B2 (ja) Imaging apparatus, imaging instruction method, and imaging instruction program
KR101128518B1 (ko) Photographing method of a digital image processing apparatus
JP2004364163A (ja) Imaging apparatus
JP2008268361A (ja) Photographing apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19793277

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020516173

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19793277

Country of ref document: EP

Kind code of ref document: A1