WO2014045738A1 - Image processing device, imaging device, image processing method, and image processing program


Info

Publication number
WO2014045738A1
WO2014045738A1 PCT/JP2013/071180 JP2013071180W WO2014045738A1 WO 2014045738 A1 WO2014045738 A1 WO 2014045738A1 JP 2013071180 W JP2013071180 W JP 2013071180W WO 2014045738 A1 WO2014045738 A1 WO 2014045738A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
color
display
display image
unit
Prior art date
Application number
PCT/JP2013/071180
Other languages
English (en)
Japanese (ja)
Inventor
Katsutoshi Izawa (井澤 克俊)
Original Assignee
FUJIFILM Corporation (富士フイルム株式会社)
Priority date
Filing date
Publication date
Application filed by FUJIFILM Corporation
Priority to JP2014536664A (JP5901780B2)
Priority to CN201380048302.XA (CN104737527B)
Publication of WO2014045738A1
Priority to US14/642,774 (US9712752B2)


Classifications

    • H04N23/633: Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • G02B7/34: Systems for automatic generation of focusing signals using different areas in a pupil plane
    • G03B17/18: Signals indicating condition of a camera member or suitability of light
    • H04N13/232: Image signal generators using stereoscopic image cameras using a single 2D image sensor using fly-eye lenses, e.g. arrangements of circular lenses
    • H04N13/257: Image signal generators; colour aspects
    • H04N13/286: Image signal generators having separate monoscopic and stereoscopic modes
    • H04N23/12: Cameras or camera modules comprising electronic image sensors, for generating image signals from different wavelengths with one sensor only
    • H04N23/632: Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters, for displaying or modifying preview images prior to image capturing
    • H04N23/672: Focus control based on electronic image sensor signals, based on the phase difference signals
    • H04N23/843: Camera processing pipelines; demosaicing, e.g. interpolating colour pixel values
    • H04N25/134: Arrangement of colour filter arrays [CFA] characterised by the spectral characteristics of the filter elements, based on three different wavelength filter elements
    • H04N25/704: SSIS architectures incorporating pixels specially adapted for focusing, e.g. phase difference pixel sets
    • H04N2209/045: Picture signal generators using solid-state devices having a single pick-up sensor using a mosaic colour filter
    • H04N23/667: Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes

Definitions

  • the present invention relates to an image processing device, an imaging device, an image processing method, and an image processing program.
  • a digital camera having a so-called manual focus mode in which a user can manually perform focus adjustment in addition to auto-focus using a phase difference detection method or a contrast detection method is widely known.
  • a method using a split micro prism screen that displays a phase difference visually by providing a reflex mirror so that focus adjustment can be performed while checking a subject is known.
  • a method employing a method for visually confirming contrast is also known.
  • a split image is displayed in a live view image (also referred to as a through image) in order to make it easier for the operator to focus on the subject in the manual focus mode.
  • The split image is, for example, an image divided into two parts (for example, divided in the vertical direction) that are shifted relative to each other in the parallax generation direction (for example, the horizontal direction) according to the focus shift; in the in-focus state, the shift in the parallax generation direction disappears.
  • An operator (for example, a photographer) adjusts the focus by operating the manual focus ring so that the split image (for example, each image divided in the vertical direction) is no longer displaced.
  • An imaging apparatus described in Japanese Patent Application Laid-Open No. 2009-147665 (hereinafter referred to as Patent Document 1) forms a first subject image and a second subject image with light beams divided by a pupil dividing unit from among the light beams from the imaging optical system.
  • a first image and a second image are generated by photoelectrically converting the two subject images.
  • a split image is generated using the first and second images, and a third image is generated by photoelectric conversion of a third subject image formed by a light beam that is not divided by the pupil dividing unit.
  • the third image is displayed on the display unit, the generated split image is displayed in the third image, and the color information extracted from the third image is added to the split image. In this way, by adding the color information extracted from the third image to the split image, the visibility of the split image can be improved.
  • An image processing apparatus described in Japanese Patent Application Laid-Open No. 2000-078607 (hereinafter referred to as Patent Document 2) has a color balance adjustment processing unit that performs color balance adjustment on an image signal input from the image input apparatus.
  • the color balance adjustment processing unit estimates the chromaticity of the illumination light source when the image signal is acquired from the image signal, and performs color balance adjustment on the image signal based on the estimated chromaticity of the illumination light source.
  • An object of the present invention is to provide an image processing apparatus, an imaging apparatus, an image processing method, and an image processing program capable of improving the visibility of the boundary between the first display image and the second display image.
  • An image processing apparatus according to a first aspect of the present invention includes: an image sensor including first and second pixel groups on which a subject image that has passed through first and second regions of a photographic lens is formed after being pupil-divided; a first display image generation unit that generates a first display image based on an image signal output from the image sensor; a second display image generation unit that generates a second display image used for focus confirmation based on first and second image signals output from the first and second pixel groups; an acquisition unit that acquires color information of the first display image generated by the first display image generation unit; a determination unit that determines, based on the color information acquired by the acquisition unit, a color having a color characteristic different from the color characteristic of the first display image as a display color of the second display image; a display unit that displays images; and a display control unit that causes the display unit to display the first display image generated by the first display image generation unit and to display, within the display area of the first display image, the second display image generated by the second display image generation unit.
  • In another aspect of the present invention, the color having the different color characteristic may be a color whose color characteristic differs visibly from the color characteristic of the first display image. Accordingly, the first display image and the second display image can be visually distinguished more easily than when this configuration is not provided.
  • In another aspect of the present invention, the acquisition unit may acquire the color information from part or all of the first display image. Thereby, compared with the case where this configuration is not provided, the color characteristics of the second display image can be differentiated from those of the first display image with high accuracy using a simple configuration.
  • In another aspect of the present invention, the acquisition unit may acquire the color information based on the first display image, the first image, the second image, or the second display image. Thereby, the color characteristics of the second display image can be differentiated from those of the first display image with high accuracy compared with the case without this configuration.
  • According to the fourth aspect of the present invention, even if the first display image is overexposed, color information can still be acquired because the first and second pixel groups have low sensitivity.
  • In another aspect of the present invention, the image sensor includes a normal imaging region used for capturing the first display image and a focus-confirmation imaging region that is used for capturing the second display image and is adjacent to the normal imaging region, and the acquisition unit may acquire the color information from the image output from the edge portion of the normal imaging region on the boundary side with the focus-confirmation imaging region. Accordingly, the boundary between the first display image and the second display image can be made clearer than when this configuration is not provided.
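  • As a rough illustration of this aspect, the following sketch (not code from the patent; the function and parameter names are hypothetical) averages the pixels of the normal image in a thin strip surrounding the split-image rectangle and uses that average as the color information.
```python
import numpy as np

def edge_color_info(normal_image, split_box, border=8):
    """Average R, G, B of the strip of the normal image that borders the
    focus-confirmation (split image) area.

    normal_image: H x W x 3 array (RGB, after white balance)
    split_box:    (top, left, bottom, right) of the split-image area
    border:       width in pixels of the strip sampled on the normal-image side
    """
    top, left, bottom, right = split_box
    h, w, _ = normal_image.shape
    # Mark a frame of pixels just outside the split-image rectangle.
    mask = np.zeros((h, w), dtype=bool)
    mask[max(top - border, 0):min(bottom + border, h),
         max(left - border, 0):min(right + border, w)] = True
    mask[top:bottom, left:right] = False   # exclude the split-image area itself
    pixels = normal_image[mask].reshape(-1, 3).astype(np.float64)
    return pixels.mean(axis=0)             # mean (R, G, B) used as color information
```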
  • In another aspect of the present invention, the color information may be color information indicating a color characteristic of the object color of the first display image. As a result, even when the object color is more dominant than the light source color in the first display image, the visibility of the boundary line between the first display image and the second display image can be improved compared with the case without this configuration.
  • In the seventh aspect of the present invention, the color characteristic in the sixth aspect may be a hue. Thereby, the visibility of the boundary line between the first display image and the second display image can be further improved with a simple configuration compared with the case where this configuration is not provided.
  • In another aspect of the present invention, the color having the different color characteristic may be a color separated from the object color of the first display image by a predetermined angle in the hue circle. As a result, an object color suitable as a color having a color characteristic different from the color characteristic of the first display image can be easily determined compared with the case without this configuration.
  • In another aspect of the present invention, the color characteristic of the object color of the first display image may be the composition ratio of the primary color components in the first display image. As a result, compared with the case without this configuration, a color characteristic of the object color of the first display image that is suitable for improving the visibility of the boundary line between the first display image and the second display image can be easily identified.
  • In another aspect of the present invention, the converted composition ratio obtained by converting the composition ratio with an input/output function including a decreasing function may be used as the different color characteristic. Thereby, the visibility of the boundary line between the first display image and the second display image can be improved with a simple configuration compared with the case where this configuration is not provided.
  • the converted composition ratio obtained by converting the composition ratio by color conversion data in the ninth aspect of the present invention may be the different color characteristics. Thereby, the visibility of the boundary line between the first display image and the second display image can be improved with a simple configuration as compared with the case where the present configuration is not provided.
  • The color conversion data may be color conversion data created based on an input/output function including a decreasing function.
  • In another aspect of the present invention, the color information may be color information indicating a color characteristic of the light source color of the first display image. As a result, even when the light source color is more dominant than the object color in the first display image, the visibility of the boundary line between the first display image and the second display image can be improved compared with the case without this configuration.
  • In another aspect of the present invention, each of the first and second pixel groups may be a single-color pixel group, and the second display image generation unit may generate an achromatic image as the second display image based on the first and second image signals output from the first and second pixel groups. Thereby, an achromatic second display image can be obtained with a simple configuration compared with the case where this configuration is not provided.
  • In another aspect of the present invention, the determination unit may determine, based on the color information acquired by the acquisition unit, a chromatic color having a color characteristic different from the color characteristic of the first display image as the display color of the second display image.
  • In another aspect of the present invention, when the first display image and the second display image are achromatic, the determination unit may determine a chromatic color having a color characteristic different from the color characteristic of the first display image as the display color of the second display image. Thereby, the visibility of the boundary line between the first display image and the second display image can be improved compared with the case where this configuration is not provided.
  • In another aspect of the present invention, the determination unit may determine, based on the color information acquired by the acquisition unit, chromatic colors having color characteristics different from the color characteristic of the first display image, and differing between the region based on the first image and the region based on the second image in the second display image, as the display colors of the second display image. This makes it easier to visually distinguish the image based on the first image from the image based on the second image within the second display image compared with the case without this configuration.
  • In another aspect of the present invention, when a predetermined condition for bringing the color characteristic of the second display image close to the color characteristic of the first display image is satisfied, the determination unit may further determine, based on the color information acquired by the acquisition unit, a color having a color characteristic that approaches the color characteristic of the first display image by a predetermined degree as the display color of the second display image.
  • In another aspect of the present invention, the image sensor may further include a third pixel group that outputs a third image signal by forming the subject image without pupil division, and the first display image generation unit may generate the first display image based on the third image signal output from the third pixel group. Thereby, the image quality of the first display image can be improved compared with the case without this configuration.
  • An imaging apparatus according to another aspect of the present invention includes the image processing apparatus according to any one of the first to nineteenth aspects of the present invention and a storage unit that stores images output from the image sensor. Thereby, the visibility of the boundary line between the first display image and the second display image can be improved compared with the case where this configuration is not provided.
  • An image processing method according to another aspect of the present invention uses an image sensor including first and second pixel groups on which a subject image that has passed through first and second regions of a photographic lens is formed after being pupil-divided, and includes: a first display image generation step of generating a first display image based on an image signal output from the image sensor; a second display image generation step of generating a second display image used for focus confirmation based on first and second image signals output from the first and second pixel groups; an acquisition step of acquiring color information of the first display image generated in the first display image generation step; a determination step of determining, based on the acquired color information, a color having a color characteristic different from the color characteristic of the first display image as a display color of the second display image; and a display control step of causing a display unit that displays images to display the first display image generated in the first display image generation step and to display, within the display area of the first display image, the second display image generated in the second display image generation step.
  • A program according to a twenty-second aspect of the present invention causes a computer to function as the first display image generation unit, the second display image generation unit, the acquisition unit, the determination unit, and the display control unit of the image processing apparatus according to any one of the first to nineteenth aspects of the present invention.
  • A schematic layout diagram illustrates an example of the layout of the color filters provided in the image sensor included in the imaging device according to the first embodiment.
  • FIG. 5 is a diagram for explaining a method of determining a correlation direction from the pixel values of 2 × 2 G pixels included in the color filter illustrated in FIG. 4, and a diagram for explaining the concept of a basic array pattern.
  • A schematic arrangement diagram illustrates an example of the arrangement of the light-shielding members provided in the image sensor included in the imaging apparatus according to the first embodiment.
  • A schematic configuration diagram shows an example of the configuration of the phase difference pixels (first pixel and second pixel) included in the image sensor of the imaging device according to the first embodiment.
  • A block diagram shows an example of the main functions of the imaging device according to the first embodiment.
  • A schematic diagram shows an example of the respective display areas of the normal image and the split image on the display device of the imaging device according to the first embodiment.
  • A flowchart shows an example of the flow of the chromatic-color application process performed by the image processing unit included in the imaging device according to the first embodiment.
  • A flowchart shows an example of the flow of the display color differentiation process performed by the image processing unit included in the imaging device according to the first embodiment.
  • A flowchart shows an example of the flow of the display color assimilation process performed by the image processing unit included in the imaging device according to the first embodiment.
  • A screen view shows an example of the normal image and split image displayed as a live view image on the display device of the imaging device according to the first embodiment when the display color differentiation process is performed.
  • A screen view illustrates an example of a normal image and a color-divided split image when the display color differentiation process is performed on the live view image displayed on the display device of the imaging device according to the first embodiment.
  • A screen view shows an example of an achromatic normal image and split image displayed as a live view image on the display device of the imaging device according to the first embodiment.
  • FIG. 1 is a perspective view illustrating an example of an appearance of the imaging apparatus 100 according to the first embodiment
  • FIG. 2 is a rear view of the imaging apparatus 100 illustrated in FIG. 1.
  • The imaging apparatus 100 is an interchangeable-lens camera that includes a camera body 200 and an interchangeable lens 300 (photographing lens; focus lens 302 (manual operation unit)) interchangeably attached to the camera body 200, and it is a digital camera in which the reflex mirror is omitted.
  • the camera body 200 is provided with a hybrid finder (registered trademark) 220.
  • the hybrid viewfinder 220 here refers to a viewfinder in which, for example, an optical viewfinder (hereinafter referred to as “OVF”) and an electronic viewfinder (hereinafter referred to as “EVF”) are selectively used.
  • OVF: optical viewfinder
  • EVF: electronic viewfinder
  • the camera body 200 and the interchangeable lens 300 are interchangeably mounted by combining a mount 256 provided in the camera body 200 and a mount 346 (see FIG. 3) on the interchangeable lens 300 side corresponding to the mount 256.
  • a front view of the camera body 200 is provided with an OVF viewfinder window 241 included in the hybrid viewfinder 220.
  • a finder switching lever (finder switching unit) 214 is provided on the front surface of the camera body 200. When the viewfinder switching lever 214 is rotated in the direction of the arrow SW, it switches between an optical image that can be viewed with OVF and an electronic image (live view image) that can be viewed with EVF (described later).
  • the optical axis L2 of the OVF is an optical axis different from the optical axis L1 of the interchangeable lens 300.
  • a release button 211 and a dial 212 for setting a shooting mode, a playback mode, and the like are mainly provided.
  • On the back of the camera body 200, an OVF viewfinder eyepiece 242, a display unit 213, a cross key 222, a MENU/OK key 224, and a BACK/DISP button 225 are provided.
  • the cross key 222 functions as a multi-function key that outputs various command signals such as menu selection, zoom and frame advance.
  • The MENU/OK key 224 has both a function as a menu button for instructing display of a menu on the screen of the display unit 213 and a function as an OK button for instructing confirmation and execution of the selected contents.
  • the BACK / DISP button 225 is used for deleting a desired object such as a selection item, canceling a designated content, or returning to the previous operation state.
  • the display unit 213 is realized by, for example, an LCD, and is used to display a live view image (through image) that is an example of a continuous frame image obtained by capturing a continuous frame in the shooting mode.
  • the display unit 213 is also used to display a still image that is an example of a single frame image obtained by capturing a single frame when a still image shooting instruction is given.
  • the display unit 213 is also used for displaying a playback image and a menu screen in the playback mode.
  • FIG. 3 is a block diagram showing an example of the electrical configuration (internal configuration) of the imaging apparatus 100 according to the first embodiment.
  • the imaging device 100 is a digital camera that records captured still images and moving images, and the overall operation of the camera is controlled by a CPU (central processing unit) 12.
  • the imaging apparatus 100 includes an operation unit 14, an interface unit 24, a memory 26, an encoder 34, a display control unit 36, and an eyepiece detection unit 37.
  • the imaging apparatus 100 includes an image processing unit 28 that is an example of a first display image generation unit, a second display image generation unit, an acquisition unit, a determination unit, and an estimation unit according to the present invention.
  • the CPU 12, the operation unit 14, the interface unit 24, the memory 26 as an example of a storage unit, the image processing unit 28, the encoder 34, the display control unit 36, the eyepiece detection unit 37, and the external interface (I / F) 39 are connected to the bus 40.
  • the memory 26 includes a non-volatile storage area (such as an EEPROM) that stores parameters, programs, and the like, and a volatile storage area (such as an SDRAM) that temporarily stores various information such as images.
  • the operation unit 14 includes a release button 211, a dial (focus mode switching unit) 212 for selecting a shooting mode, a display unit 213, a viewfinder switching lever 214, a cross key 222, a MENU / OK key 224, and a BACK / DISP button 225. .
  • the operation unit 14 also includes a touch panel that accepts various types of information. This touch panel is overlaid on the display screen of the display unit 213, for example. Various operation signals output from the operation unit 14 are input to the CPU 12.
  • Image light representing the subject is formed on the light-receiving surface of the color image sensor (for example, a CMOS sensor) 20 via the photographing lens 16, which includes a manually movable focus lens, and the shutter 18.
  • the signal charge accumulated in the image sensor 20 is sequentially read out as a digital signal corresponding to the signal charge (voltage) by a read signal applied from the device control unit 22.
  • the imaging element 20 has a so-called electronic shutter function, and controls the charge accumulation time (shutter speed) of each photosensor according to the timing of the readout signal by using the electronic shutter function.
  • the image sensor 20 according to the first embodiment is a CMOS image sensor, but is not limited thereto, and may be a CCD image sensor.
  • FIG. 4 schematically shows an example of the arrangement of the color filters 21 provided in the image sensor 20.
  • (4896 × 3264) pixels are adopted as an example of the number of pixels, and 3:2 is adopted as the aspect ratio.
  • the number of pixels and the aspect ratio are not limited thereto.
  • The color filter 21 includes a first filter G corresponding to G (green), which contributes most to obtaining a luminance signal, a second filter R corresponding to R (red), and a third filter B corresponding to B (blue).
  • The arrangement pattern of the first filter G (hereinafter, G filter), the second filter R (hereinafter, R filter), and the third filter B (hereinafter, B filter) is classified into a first array pattern A and a second array pattern B.
  • In the first array pattern A, the G filters are arranged at the four corners and the center of a 3 × 3 pixel square array.
  • In the first array pattern A, the R filters are arranged on the vertical line at the horizontal center of the square array.
  • In the first array pattern A, the B filters are arranged on the horizontal line at the vertical center of the square array.
  • the second arrangement pattern B is a pattern in which the arrangement of the filter G is the same as that of the first basic arrangement pattern A and the arrangement of the filter R and the arrangement of the filter B are interchanged.
  • The color filter 21 includes a basic array pattern C composed of a square array pattern corresponding to 6 × 6 pixels.
  • The basic array pattern C is a 6 × 6 pixel pattern in which the first array pattern A and the second array pattern B are arranged point-symmetrically, and the basic array pattern C is repeatedly arranged in the horizontal direction and the vertical direction.
  • Filters of the respective colors R, G, and B (R filter, G filter, and B filter) are arrayed with a predetermined periodicity.
  • Therefore, the color filter array of the reduced image after thinning processing can be made the same as the color filter array before the thinning processing, and a common processing circuit can be used.
  • In the color filter 21, a G filter corresponding to the color that contributes most to obtaining a luminance signal (the G color in the first embodiment) is arranged in each horizontal, vertical, and diagonal line of the color filter array. Therefore, the reproduction accuracy of the synchronization (demosaicing) process in the high-frequency region can be improved regardless of the direction of the high frequency.
  • In the color filter 21, R filters and B filters corresponding to two or more colors other than the G color (the R and B colors in the first embodiment) are arranged within each horizontal and vertical line of the color filter array. For this reason, the occurrence of color moiré (false color) is suppressed, so that an optical low-pass filter for suppressing false color need not be arranged in the optical path from the incident surface of the optical system to the imaging surface. Even when an optical low-pass filter is applied, one with a weak high-frequency-cutting function for preventing false color can be used, so that resolution is not impaired.
  • The basic array pattern C can also be understood as an array in which the 3 × 3 pixel first array pattern A, surrounded by a broken-line frame, and the 3 × 3 pixel second array pattern B, surrounded by a one-dot chain-line frame, are arranged alternately in the horizontal and vertical directions.
  • G filters which are luminance pixels, are arranged at the four corners and the center, and are arranged on both diagonal lines.
  • the B filter is arranged in the horizontal direction and the R filter is arranged in the vertical direction with the central G filter interposed therebetween.
  • the R filters are arranged in the horizontal direction and the B filters are arranged in the vertical direction across the center G filter. That is, in the first arrangement pattern A and the second arrangement pattern B, the positional relationship between the R filter and the B filter is reversed, but the other arrangements are the same.
  • As shown as an example in FIG. 5, because the first array pattern A and the second array pattern B are arranged alternately in the horizontal and vertical directions, the G filters at the four corners of the first array pattern A and the second array pattern B form a square array of G filters corresponding to 2 × 2 pixels.
  • Using the 2 × 2 pixels of G pixels, the absolute difference of the pixel values of the G pixels in the horizontal direction, the absolute difference in the vertical direction, and the absolute differences in the diagonal directions (upper-right diagonal and upper-left diagonal) are calculated, and the direction with the smallest absolute difference can be judged to be the direction of highest correlation.
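  • A minimal sketch of this correlation-direction test follows; it is my own illustration (the patent's exact weighting is not reproduced here), and the function name and return labels are hypothetical.
```python
import numpy as np

def correlation_direction(g):
    """Given a 2 x 2 array of G pixel values [[g00, g01], [g10, g11]],
    return the direction whose absolute difference of G pixel values is smallest."""
    diffs = {
        "horizontal": abs(g[0, 0] - g[0, 1]) + abs(g[1, 0] - g[1, 1]),
        "vertical":   abs(g[0, 0] - g[1, 0]) + abs(g[0, 1] - g[1, 1]),
        "diagonal_upper_right": abs(g[1, 0] - g[0, 1]),  # lower-left vs upper-right
        "diagonal_upper_left":  abs(g[0, 0] - g[1, 1]),  # upper-left vs lower-right
    }
    return min(diffs, key=diffs.get)

print(correlation_direction(np.array([[100, 102], [140, 143]])))  # "horizontal"
```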
  • the basic array pattern C of the color filter 21 is arranged point-symmetrically with respect to the center of the basic array pattern C (the centers of the four G filters).
  • The first array pattern A and the second array pattern B in the basic array pattern C are also arranged point-symmetrically with respect to the central G filter, so that the circuit scale of the subsequent processing circuit can be reduced or simplified.
  • the color filter array of the first and third lines of the first to sixth lines in the horizontal direction is GRGGBG.
  • the color filter array of the second line is BGBRGR.
  • the color filter array of the fourth and sixth lines is GBGGRG.
  • the color filter array of the fifth line is RGRBGB.
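  • A small sketch (my own reconstruction, not code from the patent) that builds the 6 × 6 basic array pattern C from the two 3 × 3 array patterns and prints the six lines listed above:
```python
import numpy as np

# First array pattern A: G at the four corners and center,
# R on the vertical center line, B on the horizontal center line.
A = np.array([["G", "R", "G"],
              ["B", "G", "B"],
              ["G", "R", "G"]])
# Second array pattern B: same G placement, with R and B interchanged.
B = np.array([["G", "B", "G"],
              ["R", "G", "R"],
              ["G", "B", "G"]])

# Basic array pattern C: A and B arranged point-symmetrically (A B / B A).
C = np.block([[A, B],
              [B, A]])
for row in C:
    print("".join(row))  # GRGGBG, BGBRGR, GRGGBG, GBGGRG, RGRBGB, GBGGRG
```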
  • Basic array patterns C, C′, and C″ are shown.
  • The basic array pattern C′ is a pattern obtained by shifting the basic array pattern C by one pixel each in the horizontal and vertical directions, and the basic array pattern C″ is a pattern obtained by shifting the basic array pattern C by two pixels each in the horizontal and vertical directions.
  • The color filter 21 has the same color filter array even when the basic array patterns C′ and C″ are repeatedly arranged in the horizontal and vertical directions.
  • In the first embodiment, the basic array pattern C is referred to as the basic array pattern for convenience.
  • the imaging apparatus 100 has a phase difference AF function.
  • the image sensor 20 includes a plurality of phase difference detection pixels used when the phase difference AF function is activated.
  • the plurality of phase difference detection pixels are arranged in a predetermined pattern.
  • a light shielding member 20A for shielding the left half pixel in the horizontal direction and a light shielding member 20B for shielding the right half pixel in the horizontal direction are provided on the phase difference detection pixels.
  • In the following, the phase difference detection pixel provided with the light shielding member 20A is referred to as the "first pixel", and the phase difference detection pixel provided with the light shielding member 20B is referred to as the "second pixel".
  • When there is no need to distinguish between the first pixel and the second pixel, they are referred to as "phase difference pixels".
  • FIG. 8 shows an example of the first pixel and the second pixel arranged in the image sensor 20.
  • the first pixel shown in FIG. 8 has a light shielding member 20A
  • the second pixel has a light shielding member 20B.
  • the light shielding member 20A is provided on the front side (microlens L side) of the photodiode PD, and shields the left half of the light receiving surface.
  • the light shielding member 20B is provided on the front side of the photodiode PD and shields the right half of the light receiving surface.
  • the microlens L and the light shielding members 20A and 20B function as a pupil dividing unit, the first pixel receives only the left side of the optical axis of the light beam passing through the exit pupil of the photographing lens 16, and the second pixel is photographed. Only the right side of the optical axis of the light beam passing through the exit pupil of the lens 16 is received. In this way, the light beam passing through the exit pupil is divided into left and right by the microlens L and the light shielding members 20A and 20B, which are pupil dividing portions, and enter the first pixel and the second pixel, respectively.
  • The subject image corresponding to the left half of the light beam passing through the exit pupil of the photographing lens 16 and the subject image corresponding to the right half of the light beam are formed at the same position on the image sensor 20 in an in-focus portion, whereas in a front-focused or rear-focused portion they are incident on different positions on the image sensor 20 (the phase is shifted).
  • the subject image corresponding to the left half light beam and the subject image corresponding to the right half light beam can be acquired as parallax images (left eye image and right eye image) having different parallaxes.
  • the imaging apparatus 100 detects the amount of phase shift based on the pixel value of the first pixel and the pixel value of the second pixel by using the phase difference AF function. Then, the focal position of the photographing lens is adjusted based on the detected phase shift amount.
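  • The following is an illustrative sketch of phase-shift estimation, assuming a simple sum-of-absolute-differences search over trial shifts; it is not the patent's algorithm, and the names and the max_shift value are made up.
```python
import numpy as np

def phase_shift(first_line, second_line, max_shift=16):
    """Estimate the shift between a row of first-pixel values and a row of
    second-pixel values by minimizing the mean absolute difference."""
    first = np.asarray(first_line, dtype=np.float64)
    second = np.asarray(second_line, dtype=np.float64)
    best_shift, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        a = first[max(0, s):len(first) + min(0, s)]
        b = second[max(0, -s):len(second) + min(0, -s)]
        cost = np.mean(np.abs(a - b))
        if cost < best_cost:
            best_shift, best_cost = s, cost
    return best_shift  # related to the defocus amount; 0 when in focus
```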
  • the light shielding members 20A and 20B are referred to as “light shielding members” without reference numerals.
  • The light shielding members are provided on the pixels of the G filter at the upper-left corner of the two sets of the first array pattern A and the second array pattern B included in the basic array pattern C.
  • the light shielding member 20A is arranged on the (6n + 1) th line in the vertical direction, and the light shielding member 20B is arranged on the (6n + 4) th line.
  • the light shielding members are provided for all the basic array patterns C.
  • However, the present invention is not limited to this, and the light shielding members may be provided only for the basic array patterns C in a predetermined region of the image sensor 20.
  • The light shielding members are provided for the pixels of the G filter at the upper-left corner of all the first array patterns A and second array patterns B, so that one phase difference pixel is regularly arranged for every three pixels in both the vertical and horizontal directions. For this reason, since a relatively large number of normal pixels are arranged around each phase difference pixel, the accuracy of interpolating the pixel value of a phase difference pixel from the pixel values of the normal pixels can be improved.
  • the “normal pixel” refers to a pixel other than the phase difference pixel (for example, a pixel that does not have the light shielding members 20A and 20B).
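  • As a hedged sketch of such interpolation (not the device's actual method; function and array names are hypothetical), each phase-difference pixel value could be replaced by the mean of nearby normal pixels of the same filter color:
```python
import numpy as np

def fill_phase_difference_pixels(raw, cfa, pd_mask, radius=2):
    """raw: mosaic image, cfa: per-pixel filter color ('R'/'G'/'B'),
    pd_mask: True at phase-difference pixel positions."""
    out = raw.astype(np.float64)
    h, w = raw.shape
    for y, x in zip(*np.nonzero(pd_mask)):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        # Same-color normal pixels inside the local window.
        window = (cfa[y0:y1, x0:x1] == cfa[y, x]) & ~pd_mask[y0:y1, x0:x1]
        if window.any():
            out[y, x] = raw[y0:y1, x0:x1][window].mean()
    return out
```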
  • the image sensor 20 is classified into a first pixel group, a second pixel group, and a third pixel group.
  • the first pixel group refers to a plurality of first pixels, for example.
  • the second pixel group refers to, for example, a plurality of second pixels.
  • the third pixel group refers to a plurality of normal pixels, for example.
  • In the following, the RAW image output from the first pixel group is referred to as the "first image", the RAW image output from the second pixel group as the "second image", and the RAW image output from the third pixel group as the "third image".
  • the image sensor 20 outputs a first image (a digital signal indicating the pixel value of each first pixel) from the first pixel group, and outputs a second image (from the second pixel group). A digital signal indicating the pixel value of each second pixel). Further, the image sensor 20 outputs a third image (a digital signal indicating the pixel value of each normal pixel) from the third pixel group. Note that the third image output from the third pixel group is a chromatic image, for example, a color image having the same color array as the normal pixel array.
  • the first image, the second image, and the third image output from the image sensor 20 are temporarily stored in a volatile storage area in the memory 26 via the interface unit 24.
  • the image processing unit 28 includes a normal processing unit 30.
  • the normal processing unit 30 generates a chromatic color normal image (an example of a first display image) by processing the R, G, and B signals corresponding to the third pixel group.
  • the image processing unit 28 includes a split image processing unit 32.
  • the split image processing unit 32 generates an achromatic split image by processing G signals corresponding to the first pixel group and the second pixel group.
  • The image processing unit 28 according to the first embodiment is realized by an ASIC (Application Specific Integrated Circuit), which is an integrated circuit in which a plurality of functions related to image processing are integrated into one.
  • the hardware configuration is not limited to this, and may be another hardware configuration such as a programmable logic device, a computer including a CPU, a ROM, and a RAM.
  • the Encoder 34 converts the input signal into a signal of another format and outputs it.
  • the hybrid viewfinder 220 has an LCD 247 that displays an electronic image.
  • the number of pixels in a predetermined direction on the LCD 247 (for example, the number of pixels in the horizontal direction, which is a parallax generation direction) is smaller than the number of pixels in the same direction on the display unit 213.
  • the display control unit 36 is connected to the display unit 213 and the LCD 247, and selectively displays the image on the LCD 247 or the display unit 213 by selectively controlling the LCD 247 and the display unit 213.
  • the display unit 213 and the LCD 247 are referred to as “display devices” when it is not necessary to distinguish between them.
  • the imaging apparatus 100 is configured to be able to switch between a manual focus mode and an autofocus mode by a dial 212 (focus mode switching unit).
  • the display control unit 36 causes the display device to display a live view image obtained by combining the split images.
  • the CPU 12 operates as a phase difference detection unit and an automatic focus adjustment unit.
  • the phase difference detection unit detects a phase difference between the first image output from the first pixel group and the second image output from the second pixel group.
  • the automatic focus adjustment unit controls the lens driving unit (not shown) from the device control unit 22 via the mounts 256 and 346 so that the defocus amount of the photographing lens 16 is zero based on the detected phase difference. Then, the photographing lens 16 is moved to the in-focus position.
  • defocus amount refers to, for example, the amount of phase shift between the first image and the second image.
  • the eyepiece detection unit 37 detects that a person (for example, a photographer) has looked into the viewfinder eyepiece 242 and outputs the detection result to the CPU 12. Therefore, the CPU 12 can grasp whether or not the finder eyepiece unit 242 is used based on the detection result of the eyepiece detection unit 37.
  • the external I / F 39 is connected to a communication network such as a LAN (Local Area Network) or the Internet, and controls transmission / reception of various information between the external device (for example, a printer) and the CPU 12 via the communication network. Therefore, when a printer is connected as an external device, the imaging apparatus 100 can output a captured still image to the printer for printing. Further, when a display is connected as an external device, the imaging apparatus 100 can output and display a captured still image or live view image on the display.
  • FIG. 9 is a functional block diagram illustrating an example of main functions of the imaging apparatus 100 according to the first embodiment.
  • The same components as those described above are denoted by the same reference numerals, and description thereof is omitted.
  • The normal processing unit 30 and the split image processing unit 32 each have a WB gain unit, a gamma correction unit, and a synchronization processing unit (none of which are shown), and these processing units sequentially perform signal processing on the original digital signal (RAW image) temporarily stored in the memory 26. That is, the WB gain unit executes white balance (WB) correction by adjusting the gains of the R, G, and B signals.
  • the gamma correction unit performs gamma correction on each of the R, G, and B signals that have been subjected to WB by the WB gain unit.
  • the synchronization processing unit performs color interpolation processing corresponding to the color filter array (here, Bayer array as an example) of the image sensor 20, and generates synchronized R, G, and B signals.
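  • A deliberately simplified sketch of that processing chain (WB gain, then gamma correction, then synchronization) is shown below; the gain and gamma values are placeholders, the data is assumed to be already separated into full R, G, B planes, and the demosaicing step is therefore only indicated, not implemented.
```python
import numpy as np

def develop(raw_rgb, wb_gains=(2.0, 1.0, 1.5), gamma=1 / 2.2):
    """raw_rgb: H x W x 3 array of linear sensor values (assumption for this sketch)."""
    img = raw_rgb.astype(np.float64)
    # WB gain unit: per-channel gains applied to the R, G, B signals.
    img *= np.asarray(wb_gains)
    # Gamma correction unit: simple power-law curve on normalized values.
    img = np.clip(img / img.max(), 0.0, 1.0) ** gamma
    # Synchronization (demosaicing) would interpolate the missing colors at each
    # pixel of the mosaic; with full planes assumed here it is a no-op placeholder.
    return img
```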
  • the normal processing unit 30 and the split image processing unit 32 perform image processing on the RAW image in parallel every time a RAW image for one screen is acquired by the image sensor 20.
  • The normal processing unit 30 receives the R, G, and B RAW images from the interface unit 24, and generates the pixel values at the positions of the first and second pixel groups by interpolating them with surrounding pixels of the same color in the third pixel group (for example, adjacent G pixels). Thereby, a normal image for recording can be generated based on the third image output from the third pixel group.
  • the normal processing unit 30 outputs the generated image data of the normal image for recording to the encoder 34.
  • the R, G, B signals processed by the normal processing unit 30 are converted (encoded) into recording signals by the encoder 34 and recorded in the recording unit 40 (see FIG. 7).
  • a normal image for display that is an image based on the third image processed by the normal processing unit 30 is output to the display control unit 36.
  • Hereinafter, when it is not necessary to distinguish between the recording image and the display image, the words "for recording" and "for display" are omitted, and each is referred to as a "normal image".
  • The image sensor 20 can change the exposure conditions of the first pixel group and the second pixel group (for example, the shutter speed of the electronic shutter), and can thereby simultaneously acquire images having different exposure conditions. Therefore, the image processing unit 28 can generate an image with a wide dynamic range based on the images with different exposure conditions. In addition, a plurality of images can be acquired simultaneously under the same exposure condition, and by adding these images, a high-sensitivity image with little noise or a high-resolution image can be generated.
  • The split image processing unit 32 extracts the G signals of the first pixel group and the second pixel group from the RAW image temporarily stored in the memory 26, and generates an achromatic split image based on those G signals.
  • Each of the first pixel group and the second pixel group extracted from the RAW image is a pixel group including G filter pixels as described above. Therefore, the split image processing unit 32 can generate an achromatic left parallax image and an achromatic right parallax image based on the G signals of the first pixel group and the second pixel group.
  • the above “achromatic left parallax image” is referred to as a “left eye image”
  • the above “achromatic right parallax image” is referred to as a “right eye image”.
  • the split image processing unit 32 combines the left-eye image based on the first image output from the first pixel group and the right-eye image based on the second image output from the second pixel group to split the image. Generate an image.
  • the generated split image data is output to the display controller 36.
  • The display control unit 36 generates display image data based on the recording image data corresponding to the third pixel group input from the normal processing unit 30 and the split image data corresponding to the first and second pixel groups input from the split image processing unit 32. For example, the display control unit 36 combines the split image indicated by the image data input from the split image processing unit 32 into the display area of the normal image indicated by the recording image data corresponding to the third pixel group input from the normal processing unit 30. Then, the image data obtained by the synthesis is output to the display unit 213.
  • the split image generated by the split image processing unit 32 is a multi-divided image obtained by combining a part of the left eye image and a part of the right eye image.
  • The split image shown in FIG. 11 is an image obtained by synthesizing part of the left-eye image and part of the right-eye image at a position corresponding to a predetermined area of the display unit 213; the images divided into four in the vertical direction are shifted relative to one another in a predetermined direction (for example, the parallax generation direction) according to the in-focus state.
  • the aspect of the split image is not limited to the example shown in FIG. 11 and may be an image obtained by combining the upper half image of the left eye image and the lower half image of the right eye image. In this case, the upper and lower images divided into two are shifted in a predetermined direction (for example, the parallax generation direction) according to the in-focus state.
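  • As a hedged illustration of this composition (my own sketch, with hypothetical names and an arbitrary number of bands), alternating horizontal bands can be taken from the left-eye and right-eye images:
```python
import numpy as np

def make_split_image(left_eye, right_eye, bands=4):
    """left_eye, right_eye: H x W grayscale arrays covering the split-image area.
    Out-of-focus, the bands appear mutually shifted in the parallax direction;
    in focus, they line up."""
    h = left_eye.shape[0]
    out = np.empty_like(left_eye)
    edges = np.linspace(0, h, bands + 1).astype(int)
    for i in range(bands):
        src = left_eye if i % 2 == 0 else right_eye
        out[edges[i]:edges[i + 1]] = src[edges[i]:edges[i + 1]]
    return out
```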
  • the method of combining the split image with the normal image is not limited to the combining method of fitting the split image in place of a part of the normal image.
  • a synthesis method in which a split image is superimposed on a normal image may be used.
  • a combining method may be used in which the transmittance of a part of the normal image on which the split image is superimposed and the split image are appropriately adjusted and superimposed.
  • the hybrid finder 220 includes an OVF 240 and an EVF 248.
  • The OVF 240 is a reverse Galilean finder having an objective lens 244 and an eyepiece 246, and the EVF 248 has an LCD 247, a prism 245, and the eyepiece 246.
  • a liquid crystal shutter 243 is disposed in front of the objective lens 244, and the liquid crystal shutter 243 shields light so that an optical image does not enter the objective lens 244 when the EVF 248 is used.
  • the prism 245 reflects an electronic image or various information displayed on the LCD 247 and guides it to the eyepiece 246, and combines the optical image and information (electronic image and various information) displayed on the LCD 247.
  • Each time the finder switching lever 214 is rotated, the mode is alternately switched between an OVF mode in which an optical image can be viewed with the OVF 240 and an EVF mode in which an electronic image can be viewed with the EVF 248.
  • In the OVF mode, the display control unit 36 controls the liquid crystal shutter 243 to a non-light-shielding state so that the optical image can be viewed from the eyepiece unit, and displays only the split image on the LCD 247. Thereby, a finder image in which the split image is superimposed on part of the optical image can be displayed.
  • In the EVF mode, the display control unit 36 controls the liquid crystal shutter 243 to a light-shielding state so that only the electronic image displayed on the LCD 247 can be viewed from the eyepiece unit.
  • the image data equivalent to the image data obtained by combining the split image output to the display unit 213 is input to the LCD 247, whereby the split image is combined with a part of the normal image in the same manner as the display unit 213. Electronic images can be displayed.
  • FIG. 10 shows an example of display areas of the normal image and the split image in the display device.
  • the split image is displayed in a rectangular frame at the center of the screen of the display device, and a normal image is displayed in the outer peripheral area of the split image. Note that the edge line representing the rectangular frame shown in FIG. 10 is not actually displayed, but is shown in FIG. 10 for convenience of explanation.
  • FIG. 11 schematically shows an example of a normal image and a split image displayed on the same screen of the display device as a comparative example to the first embodiment.
  • the split image is displayed in an achromatic color
  • the normal image is displayed in a chromatic color.
  • the color of the split image does not stand out from the color of the normal image, and it becomes difficult to visually recognize the boundary between the split image and the normal image.
  • Therefore, in the first embodiment, the image processing unit 28 performs image output processing that includes a process of visually differentiating the display color of the normal image from the display color of the split image (hereinafter, the "display color differentiation process"). The display color differentiation process includes acquiring color information of the normal image and, based on the acquired color information, determining a color having a color characteristic different from the color characteristic of the normal image as the display color of the split image.
  • the “color information” refers to color information indicating the color characteristics of the object color of a normal image, for example.
  • the “color characteristic different from the color characteristic of the normal image” refers to a color characteristic different from the color characteristic of the object color of the normal image, for example.
  • the “color characteristic different from the color characteristic of the normal image” refers to, for example, a color separated from the object color of the normal image by a predetermined angle in the hue circle.
  • the “predetermined angle” here refers to an angle that is predetermined as an angle at which a color that is visually recognizable as having a color characteristic different from the color characteristic of the object color of the normal image is obtained.
  • the "predetermined angle" is preferably 160 degrees or more and 200 degrees or less, more preferably 170 degrees or more and 190 degrees or less, and most preferably 180 degrees (an angle that specifies a complementary color).
  • however, the angle is not limited to this, and may be an angle designated by the user via the operation unit 14 (for example, an angle at which a color that the user can visually distinguish is obtained).
  • the present invention is not limited thereto, and an angle determined based on an evaluation result obtained by a sensory test on a plurality of subjects may be adopted as the "predetermined angle". In this case, for example, an angle at which a majority of the plurality of subjects evaluate the resulting color as having a color characteristic different from the color characteristic of the object color of the normal image (sample image) (for example, a hue different from the hue of the object color of the sample image) is adopted as the "predetermined angle".
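  • As a non-limiting illustration of the hue-circle relationship described above, the following sketch derives a display color whose hue is separated from a given object color by the predetermined angle (180 degrees here, i.e., the complementary hue). The function name and the use of Python's colorsys module are illustrative assumptions, not part of the embodiment.

```python
import colorsys

def hue_separated_color(rgb, angle_deg=180.0):
    """Return an RGB color whose hue is rotated by angle_deg on the hue circle.

    rgb: tuple of floats in [0, 1] representing the object color of the normal image.
    angle_deg: the "predetermined angle" (e.g., 160-200 deg, 180 deg for the complementary color).
    """
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    h_new = (h + angle_deg / 360.0) % 1.0  # rotate on the hue circle
    return colorsys.hsv_to_rgb(h_new, s, v)

# Example: an orange object color yields a bluish display color at 180 degrees.
print(hue_separated_color((1.0, 0.5, 0.0), 180.0))
```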
  • in the first embodiment, the composition ratio of the primary color components in the normal image after WB is adopted as the color characteristic of the object color of the normal image.
  • the "composition ratio of primary color components" here refers to a ratio normalized such that the sum of the R, G, and B pixel values is "1".
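  • A minimal sketch of the normalization just described, in which the average R, G, and B pixel values are scaled so that their sum is 1 (the numeric values are illustrative only):

```python
import numpy as np

def composition_ratio(rgb_means):
    """Normalize per-channel averages so that R + G + B = 1."""
    rgb_means = np.asarray(rgb_means, dtype=float)
    return rgb_means / rgb_means.sum()

print(composition_ratio([120.0, 200.0, 80.0]))  # -> [0.3, 0.5, 0.2]
```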
  • in the display color differentiation process, the first input/output function F(W) shown in FIG. 12 is used as an example.
  • the horizontal axis shown in FIG. 12 represents the input value of the first input/output function F(W), that is, one component of the R, G, and B composition ratio in the normal image after WB, which is obtained from the average values of the R, G, and B pixel values.
  • the vertical axis shown in FIG. 12 represents the output value of the first input/output function F(W), that is, one component of the R:G:B composition ratio of each pixel of the split image. In the example shown in FIG. 12, the split image becomes a blue-green gray.
  • the first input / output function F (W) is a function for converting the composition ratio of the primary color components in the normal image after WB to the composition ratio of the primary color components of the split image.
  • the first input / output function F (W) includes a function f1 that is a decreasing function (a function having a characteristic that the output value decreases as the input value increases).
  • the first input/output function F(W) shown in FIG. 12 includes a linear function having a negative slope as an example of the function f1; however, instead of the linear function, a nonlinear function having a characteristic that the output value decreases as the input value increases may be employed.
  • the first input / output function F (W) is a bent function (a function combining the functions f1 to f3).
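  • The exact breakpoints and slopes of F(W) are not specified here; the following sketch merely illustrates a bent function made of a decreasing linear segment f1 flanked by flat segments, as described above. The breakpoint values (0.2, 0.8) and the output limits are assumptions for illustration.

```python
def F(w, w_low=0.2, w_high=0.8, out_high=0.9, out_low=0.1):
    """Bent (piecewise linear) decreasing function: flat, then f1 (negative slope), then flat.

    w is one component of the composition ratio of the normal image after WB;
    the return value is the corresponding component of the split-image composition ratio.
    """
    if w <= w_low:            # flat segment for small inputs
        return out_high
    if w >= w_high:           # flat segment for large inputs
        return out_low
    # f1: linear segment with negative slope between the breakpoints
    t = (w - w_low) / (w_high - w_low)
    return out_high + t * (out_low - out_high)
```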
  • the image processing unit 28 can also assimilate the display color of the normal image and the display color of the split image by bringing the color characteristic of the split image close to the color characteristic of the normal image (hereinafter, the "display color assimilation process").
  • in the display color assimilation process, the second input/output function G(W) shown in FIG. 13 is used as an example.
  • the horizontal axis shown in FIG. 13 represents the input value of the second input/output function G(W) and, like the horizontal axis shown in FIG. 12, shows one component of the R, G, and B composition ratio in the normal image after WB.
  • the vertical axis shown in FIG. 13 is the output value of the second input / output function G (W), and shows one component of the composition ratio of R, G, and B in the split image, similar to the vertical axis shown in FIG.
  • like the first input/output function F(W), the second input/output function G(W) is a function that converts the composition ratio of the primary color components in the normal image after WB into the composition ratio of the primary color components of the split image.
  • the second input / output function G (W) is a function symmetrical to the first input / output function F (W) shown in FIG. 12, and includes functions g1, g2, and g3.
  • the function g1 is an increasing function (a function in which the output value increases as the input value increases).
  • the second input/output function G(W) shown in FIG. 13 includes a linear function having a positive slope as an example of the function g1; however, instead of the linear function, a nonlinear function having a characteristic that the output value increases as the input value increases may be employed.
  • the second input/output function G(W) shown in FIG. 13 also includes functions g2 and g3, which are linear functions continuous with the function g1 and have a slope of "0". Instead of these functions g2 and g3, a nonlinear function may be used, or a linear function having a positive slope may be used.
  • the second input / output function G (W) is a bent function (a function combining the functions g1 to g3).
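  • Under the same illustrative assumptions as the F(W) sketch above, G(W) can be sketched as its increasing counterpart, with a positively sloped segment g1 between two flat segments:

```python
def G(w, w_low=0.2, w_high=0.8, out_low=0.1, out_high=0.9):
    """Bent (piecewise linear) increasing function symmetrical to the illustrative F(W)."""
    if w <= w_low:            # g2-like flat segment for small inputs
        return out_low
    if w >= w_high:           # g3-like flat segment for large inputs
        return out_high
    # g1: linear segment with positive slope between the breakpoints
    t = (w - w_low) / (w_high - w_low)
    return out_low + t * (out_high - out_low)
```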
  • the first input/output function F(W) and the second input/output function G(W) are stored in advance in a predetermined storage area (for example, the memory 26).
  • next, the image output processing performed by the image processing unit 28 as an operation of the first embodiment will be described with reference to FIG. 14.
  • here, the case where the image output processing is performed by the image processing unit 28 is illustrated; however, the present invention is not limited to this. For example, the CPU 12 may execute an image output processing program so that the imaging apparatus 100 performs the image output processing.
  • in step 400, the image processing unit 28 acquires a normal image.
  • the normal image acquired in step 400 is stored in a predetermined storage area (for example, the memory 26) by the image processing unit 28.
  • in step 402, the image processing unit 28 acquires the left-eye image and the right-eye image, and generates a split image based on the acquired left-eye image and right-eye image.
  • in step 404, the image processing unit 28 executes WB (white balance) processing.
  • FIG. 15 shows an example of the flow of WB processing.
  • in step 404A, the image processing unit 28 acquires color information from the normal image, and estimates the light source color at the time of imaging based on the acquired color information.
  • in the next step 404B, the image processing unit 28 acquires gains corresponding to the light source color estimated in step 404A as the gains for the R, G, and B colors used in WB.
  • for example, gains are determined in advance for each of the sun on a clear day, the sun on a cloudy day, a fluorescent lamp, and a tungsten light source (and are stored, for example, in the memory 26), and the gain of the light source corresponding to the estimated light source color is acquired.
  • in step 404C, the image processing unit 28 executes white balance with the gains acquired in step 404B, and then ends the white balance processing.
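  • The embodiment does not fix a particular light source estimation algorithm; the sketch below assumes a simple nearest-preset match against stored gains for the four light sources listed above, then applies the per-channel gains. The gain values, the gray-world estimation, and all names are illustrative assumptions roughly corresponding to steps 404A to 404C.

```python
import numpy as np

# Illustrative per-light-source R, G, B gains (assumed values standing in for
# the gains stored in advance in the memory 26).
WB_GAINS = {
    "clear_sky":   (1.9, 1.0, 1.5),
    "cloudy_sky":  (2.1, 1.0, 1.4),
    "fluorescent": (1.6, 1.0, 1.9),
    "tungsten":    (1.3, 1.0, 2.4),
}

def estimate_light_source(normal_image):
    """Estimate the light source from the normal image (gray-world assumption)."""
    means = normal_image.reshape(-1, 3).mean(axis=0)
    ratio = means / means.sum()
    def score(gains):
        g = np.asarray(gains)
        expected = (1.0 / g) / (1.0 / g).sum()  # ratio a gray scene would show under this source
        return np.abs(expected - ratio).sum()
    return min(WB_GAINS, key=lambda k: score(WB_GAINS[k]))

def apply_white_balance(normal_image):
    light = estimate_light_source(normal_image)           # step 404A
    gains = np.asarray(WB_GAINS[light])                   # step 404B
    return np.clip(normal_image * gains, 0, 255), light   # step 404C
```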
  • in step 406, the image processing unit 28 sets a pixel to be processed (target pixel (i, j)) in the split image generated in step 402.
  • the target pixel (i, j) is sequentially moved from the pixel (1, 1) to the pixel (m, n) each time this step 406 is performed.
  • next, the image processing unit 28 determines whether or not the normal image obtained by executing the white balance processing in step 404 is an achromatic image. If the normal image is an achromatic image, the process proceeds to step 412; otherwise, the process proceeds to step 410.
  • the "achromatic image" referred to here is, for example, an image in which the composition ratio of the R, G, and B pixel values is substantially equal.
  • whether or not the normal image is an achromatic image may be determined, for example, by determining whether the occupancy of the achromatic region (the number of achromatic pixels) in the entire region (the total number of pixels) of the normal image is equal to or greater than a predetermined occupancy rate (for example, 95%). Further, when a predetermined region in the normal image is an achromatic region, the normal image may be determined to be an achromatic image. In addition, the normal image may be determined to be an achromatic image when the achromatic region is equal to or larger than a predetermined area (a predetermined number of pixels) in the region adjacent to the boundary with the split image in the normal image.
  • conversely, even when the occupancy of the achromatic region exceeds the predetermined occupancy rate because a region other than the region adjacent to the boundary with the split image is an achromatic region, it may be determined that the normal image is not an achromatic image.
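  • A minimal sketch of one of the determination methods described above: counting how many pixels are close to achromatic (R, G, and B composition ratios approximately equal) and comparing the occupancy with a predetermined occupancy rate. The 95% threshold matches the example above; the tolerance is an assumption.

```python
import numpy as np

def is_achromatic_image(normal_image, occupancy_threshold=0.95, tol=0.02):
    """Judge whether the normal image is an achromatic image.

    A pixel is regarded as achromatic when each of its R, G, B composition
    ratio components is within `tol` of 1/3.
    """
    pixels = normal_image.reshape(-1, 3).astype(float) + 1e-6  # avoid division by zero
    ratios = pixels / pixels.sum(axis=1, keepdims=True)
    achromatic = np.all(np.abs(ratios - 1.0 / 3.0) <= tol, axis=1)
    return achromatic.mean() >= occupancy_threshold
```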
  • in step 412, the image processing unit 28 executes the chromatic color addition processing.
  • FIG. 16 shows an example of the flow of chromatic color imparting processing.
  • in step 412A shown in FIG. 16, the image processing unit 28 determines whether or not the target pixel set in step 406 is the first pixel to be processed. If the target pixel is the first pixel to be processed, the determination is affirmative and the process proceeds to step 412B; if not, the determination is negative and the process proceeds to step 412C.
  • in a predetermined storage area, a composition ratio of predetermined R, G, and B values that can give a chromatic color to the target pixel when multiplied by the R, G, and B pixel values of the target pixel (hereinafter referred to as the "predetermined composition ratio") is stored in advance.
  • in step 412B, the image processing unit 28 acquires the predetermined composition ratio from the predetermined storage area.
  • in step 412C, the image processing unit 28 multiplies the pixel value of the same color of the target pixel by each color component of the predetermined composition ratio acquired in step 412B, thereby correcting the R, G, and B pixel values of the target pixel.
  • in step 412D, the image processing unit 28 stores the R, G, and B pixel values of the target pixel corrected in step 412C in a predetermined storage area (for example, the memory 26), and then the process proceeds to step 418.
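  • A sketch of the chromatic color addition described in steps 412B to 412D above: each pixel of the achromatic split image is multiplied, color by color, by a predetermined composition ratio that gives it a chromatic tint. The concrete ratio used below is an assumed, illustrative value.

```python
import numpy as np

# Illustrative "predetermined composition ratio" giving a bluish tint
# (stands in for the ratio stored in advance in a predetermined storage area).
PREDETERMINED_RATIO = np.array([0.25, 0.35, 0.40])  # R, G, B

def add_chromatic_color(split_image):
    """Multiply each pixel's R, G, B values by the predetermined composition ratio."""
    tinted = split_image.astype(float) * PREDETERMINED_RATIO
    return np.clip(tinted, 0, 255).astype(split_image.dtype)
```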
  • in step 410, the image processing unit 28 determines whether or not a display color assimilation condition, which is a condition for executing the display color assimilation process, is satisfied.
  • the "display color assimilation condition" referred to here includes, for example, a condition that an instruction to start execution of the display color assimilation process has been input via the operation unit 14. If the display color assimilation condition is not satisfied in step 410, the process proceeds to step 414; if the display color assimilation condition is satisfied, the process proceeds to step 416.
  • in the first embodiment, the process proceeds to step 414 when the display color assimilation condition is not satisfied in step 410; however, a step of determining whether or not a condition for executing the display color differentiation process is satisfied may be inserted between step 410 and step 414. In this case, if the condition for executing the display color differentiation process is satisfied, the process proceeds to step 414; if it is not satisfied, the process proceeds to step 418.
  • an example of the condition for executing the display color differentiation process is a condition that an instruction to start execution of the display color differentiation process has been input via the operation unit 14.
  • another example of the condition is that the average values (or composition ratios) of the R, G, and B pixel values of the normal image match the average values (or composition ratios) of the R, G, and B pixel values of the split image for each of the R, G, and B colors within a predetermined error.
  • in step 414, the image processing unit 28 executes the display color differentiation processing.
  • FIG. 17 shows an example of the flow of display color differentiation processing.
  • in step 414A shown in FIG. 17, it is determined whether or not the target pixel set in step 406 is the first pixel to be processed. If the target pixel is the first pixel to be processed, the determination is affirmative and the process proceeds to step 414B; if not, the determination is negative and the process proceeds to step 414C.
  • in step 414B, the image processing unit 28 acquires R, G, and B pixel values (for example, the pixel values of all pixels included in the specific image) from the specific image.
  • in the first embodiment, a part of the normal image generated in step 400 is adopted as the "specific image"; however, the present invention is not limited to this, and the normal image together with the left-eye image and the right-eye image generated in step 402 may be used.
  • in step 414B, as the "part of the normal image", the image output from the edge portion on the boundary side with the split image imaging area in the normal imaging area, among the normal image generated in step 400, is used. The size of the "edge portion" here refers to, for example, one third of the width of the normal image area. This size of one third of the normal image area width is merely an example; any area sufficient to obtain effective pixel values (an example of color information) may be used, and it goes without saying that the size of the "edge portion" required to acquire effective pixel values differs depending on the sizes of the normal image area and the split image. Moreover, not only the edge portion but also another region in the normal image may be used; the other region in the normal image refers to a region other than the "edge portion" in the normal image.
  • the “specific image” is not necessarily a part of the normal image, and may be an image of the entire area of the normal image.
  • the "normal imaging area" described above is an area of the imaging element 20 used for capturing the normal image, and refers to, for example, the area of the imaging element 20 corresponding to the "normal image display area" illustrated in FIG. 10.
  • the "split image capturing area" described above refers to an area of the imaging element 20 that is used for capturing the split image and is adjacent to the normal imaging area, and corresponds to, for example, the "split image display area" illustrated in FIG. 10.
  • in step 414D, the image processing unit 28 calculates the average value for each color of the R, G, and B pixel values acquired in step 414B.
  • in the next step 414E, the image processing unit 28 calculates the composition ratio of the R, G, and B pixel values based on the averages for each color calculated in step 414D. For example, the image processing unit 28 normalizes the averages for each color of the R, G, and B pixel values calculated in step 414D so that their sum is "1", thereby calculating the composition ratio.
  • next, the image processing unit 28 acquires the first input/output function F(W) from a predetermined storage area.
  • in step 414G, the image processing unit 28 calculates the composition ratio of the R, G, and B pixel values in the split image using the first input/output function F(W). That is, the composition ratio of the R, G, and B pixel values calculated in step 414E is converted into the composition ratio of the R, G, and B pixel values in the split image by the first input/output function F(W).
  • in step 414H, the image processing unit 28 determines whether or not a different color application condition is satisfied.
  • the “different color application condition” refers to a condition for executing a process (different color application process) of applying different chromatic colors to each of the image by the left eye image and the image by the right eye image in the split image.
  • examples of the different color imparting condition include a condition that an instruction to start execution of the different color imparting process is given via the operation unit 14.
  • the “different color imparting process” referred to here indicates, for example, steps 414I, 414K, 414L, 414M, and 414N described later. If the different color application condition is satisfied in step 414H, the determination is affirmed and the routine proceeds to step 414I. If it is determined in step 414H that the different color imparting conditions are not satisfied, the determination is negative and the routine proceeds to step 414J.
  • in step 414I, the image processing unit 28 determines whether or not the target pixel set in step 406 is a pixel included in the right-eye image. If the target pixel is not a pixel included in the right-eye image in this step 414I (that is, if the target pixel is a pixel included in the left-eye image), the determination is negative and the process proceeds to step 414L. If the target pixel is a pixel included in the right-eye image in step 414I, the determination is affirmative and the process proceeds to step 414K.
  • in step 414K, the image processing unit 28 calculates the right-eye image composition ratio by adding a correction term for the right-eye image to the composition ratio calculated in step 414G. For example, if the composition ratios of the R, G, and B pixel values calculated in step 414G are F(Wr), F(Wg), and F(Wb), a correction term ΔR is added to F(Wr) to obtain F(Wr)+ΔR as the R component of the right-eye image composition ratio, a correction term ΔG is added to F(Wg) to obtain F(Wg)+ΔG as the G component of the right-eye image composition ratio, and a correction term ΔB is added to F(Wb) to obtain F(Wb)+ΔB as the B component of the right-eye image composition ratio.
  • in step 414M, the image processing unit 28 multiplies the pixel value of the same color of the target pixel by each color component of the right-eye image composition ratio calculated in step 414K, thereby correcting the R, G, and B pixel values of the target pixel, and then the process proceeds to step 414P.
  • by this step 414M being executed by the image processing unit 28, a color having a color characteristic different from the color characteristic of the normal image is determined as the display color of the target pixel.
  • in step 414L, the image processing unit 28 calculates the left-eye image composition ratio by applying a correction term for the left-eye image to the composition ratio calculated in step 414G. For example, if the composition ratios of the R, G, and B pixel values calculated in step 414G are F(Wr), F(Wg), and F(Wb), the correction term ΔR is subtracted from F(Wr) to obtain F(Wr)-ΔR as the R component of the left-eye image composition ratio, the correction term ΔG is subtracted from F(Wg) to obtain F(Wg)-ΔG as the G component of the left-eye image composition ratio, and the correction term ΔB is subtracted from F(Wb) to obtain F(Wb)-ΔB as the B component of the left-eye image composition ratio.
  • the correction terms ΔR, ΔG, and ΔB may be any values that allow the difference in color between the left-eye image and the right-eye image to be visually recognized. Further, instead of adding and subtracting the correction terms, values obtained by simply multiplying F(Wr), F(Wg), and F(Wb) by linear coefficients may be adopted.
  • in step 414N, the image processing unit 28 multiplies the pixel value of the same color of the target pixel by each color component of the left-eye image composition ratio calculated in step 414L, thereby correcting the R, G, and B pixel values of the target pixel, and then the process proceeds to step 414P.
  • by this step 414N being executed by the image processing unit 28, a color having a color characteristic different from the color characteristic of the normal image is determined as the display color of the target pixel.
  • by steps 414M and 414N being executed by the image processing unit 28 in this way, different chromatic colors are determined as the display colors of the left-eye image and the right-eye image in the split image.
  • in step 414J, the image processing unit 28 multiplies the pixel value of the same color of the target pixel by each color component of the composition ratio calculated in step 414G, thereby correcting the R, G, and B pixel values of the target pixel, and then the process proceeds to step 414P.
  • by this step 414J being executed by the image processing unit 28 in this way, a color having a color characteristic different from the color characteristic of the normal image is determined as the display color of the target pixel.
  • in step 414P, the image processing unit 28 stores the R, G, and B pixel values of the target pixel corrected in any of steps 414J, 414M, and 414N in a predetermined storage area (for example, the memory 26), and then the process proceeds to step 418.
  • in step 414C, it is determined whether or not the different color imparting process has been executed for the previous pixel to be processed. If the different color imparting process has been executed for the previous pixel to be processed, the determination is affirmative and the process proceeds to step 414Q; if it has not been executed, the determination is negative and the process proceeds to step 414J.
  • in step 414Q, the image processing unit 28 determines whether or not the target pixel set in step 406 is a pixel included in the right-eye image. If the target pixel is not a pixel included in the right-eye image (that is, if it is a pixel included in the left-eye image), the determination is negative and the process proceeds to step 414N. If the target pixel is a pixel included in the right-eye image, the determination is affirmative and the process proceeds to step 414M.
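  • Putting the flow of FIG. 17 together, the following sketch computes the split-image composition ratio from the specific image, applies the ±Δ correction terms to the right-eye and left-eye images, and corrects each target pixel (roughly steps 414D, 414E, 414G, and 414J to 414N). The function names, the Δ values, and the use of the illustrative F(W) sketch defined earlier (passed in as io_function) are assumptions.

```python
import numpy as np

DELTA = np.array([0.05, 0.05, 0.05])  # correction terms dR, dG, dB (assumed values)

def split_image_ratio(specific_image, io_function):
    """Composition ratio of the split-image display color (steps 414D, 414E, 414G).

    io_function plays the role of the first input/output function F(W).
    """
    means = specific_image.reshape(-1, 3).astype(float).mean(axis=0)  # step 414D
    w = means / means.sum()                                           # step 414E
    return np.array([io_function(c) for c in w])                      # step 414G

def correct_split_pixel(pixel, ratio, eye=None):
    """Correct one split-image pixel; eye is 'right', 'left', or None (steps 414J/414M/414N)."""
    if eye == "right":
        ratio = ratio + DELTA   # step 414K: F(W) + delta
    elif eye == "left":
        ratio = ratio - DELTA   # step 414L: F(W) - delta
    return np.clip(np.asarray(pixel, dtype=float) * ratio, 0, 255)
```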
  • in step 416, the image processing unit 28 executes the display color assimilation processing.
  • FIG. 18 shows an example of the flow of display color assimilation processing.
  • in step 416A shown in FIG. 18, it is determined whether or not the target pixel set in step 406 is the first pixel to be processed. If the target pixel is the first pixel to be processed, the determination is affirmative and the process proceeds to step 416B; if not, the determination is negative and the process proceeds to step 416G.
  • in step 416B, the image processing unit 28 acquires R, G, and B pixel values (for example, the pixel values of all pixels included in the specific image) from the specific image.
  • the “specific image” refers to an image corresponding to the “specific image” used in step 414B, for example.
  • in step 416C, the image processing unit 28 calculates the average value for each color of the R, G, and B pixel values acquired in step 416B.
  • in the next step 416D, the image processing unit 28 calculates the composition ratio of the R, G, and B pixel values based on the averages for each color calculated in step 416C. For example, the image processing unit 28 normalizes the averages for each color of the R, G, and B pixel values calculated in step 416C so that their sum is "1", thereby calculating the composition ratio.
  • the image processing unit 28 acquires the second input / output function G (W) from a predetermined storage area.
  • in step 416F, the image processing unit 28 calculates the composition ratio of the R, G, and B pixel values in the split image using the second input/output function G(W). That is, the composition ratio of the R, G, and B pixel values calculated in step 416D is converted into the composition ratio of the R, G, and B pixel values in the split image by the second input/output function G(W).
  • in step 416G, the image processing unit 28 multiplies the pixel value of the same color of the target pixel by each color component of the composition ratio calculated in step 416F, thereby correcting the R, G, and B pixel values of the target pixel, and then the process proceeds to step 416H. By this step 416G being executed, a color having a color characteristic that approaches the color characteristic of the normal image to a predetermined degree is determined as the display color of the target pixel.
  • in step 416H, the image processing unit 28 stores the R, G, and B pixel values of the target pixel corrected in step 416G in a predetermined storage area (for example, the memory 26), and then the process proceeds to step 418.
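  • The display color assimilation processing of FIG. 18 differs from the differentiation sketch above only in that the increasing function G(W) is used, so that the split-image composition ratio approaches that of the normal image. A sketch under the same illustrative assumptions (io_function plays the role of G(W)):

```python
import numpy as np

def assimilation_ratio(specific_image, io_function):
    """Composition ratio that brings the split-image color toward the normal image."""
    means = specific_image.reshape(-1, 3).astype(float).mean(axis=0)  # step 416C
    w = means / means.sum()                                           # step 416D
    return np.array([io_function(c) for c in w])                      # step 416F

def assimilate_split_pixel(pixel, ratio):
    return np.clip(np.asarray(pixel, dtype=float) * ratio, 0, 255)    # step 416G
```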
  • in step 418, image processing (for example, gamma correction and synchronization processing) other than the WB processing, the chromatic color addition processing, the display color differentiation processing, and the display color assimilation processing is performed on the target pixel (i, j) set in step 406.
  • in step 420, the image processing unit 28 stores, for each pixel, the pixel values obtained by the image processing performed in step 418 in a predetermined storage area (for example, the memory 26), and then the process proceeds to step 422.
  • in step 422, the image processing unit 28 determines whether the chromatic color providing process, the display color differentiation process, or the display color assimilation process has been completed for all pixels. If it has been completed, the determination is affirmative and the process proceeds to step 424; if not, the determination is negative and the process returns to step 406.
  • in step 424, the image processing unit 28 outputs the images stored in the predetermined storage area in steps 400 and 420 to a predetermined output destination, and then ends the image output processing.
  • the above-mentioned “predetermined output destination” includes, for example, the display control unit 36.
  • the external I/F 40 may also be applied as the "predetermined output destination".
  • when the image output processing is performed in this way, a live view image is displayed on the display device, as shown in FIG. 19, FIG. 20, and the like as examples.
  • the live view image includes a normal image and a split image.
  • Split images are roughly classified into left-eye images and right-eye images (see, for example, FIG. 20).
  • when the image is out of focus, the left-eye image and the right-eye image in the split image are shifted in the parallax generation direction (for example, the horizontal direction).
  • the image at the boundary between the normal image and the split image is also shifted in the parallax generation direction. This indicates that a phase difference has occurred, and the photographer can visually recognize the phase difference and the parallax generation direction through the split image.
  • when the image is in focus, the left-eye image and the right-eye image in the split image are not shifted.
  • the image at the boundary between the normal image and the split image is not shifted either. This indicates that no phase difference has occurred, and the photographer can visually recognize through the split image that no phase difference has occurred.
  • the photographer can check the in-focus state of the photographing lens 16 by the split image displayed on the display device.
  • the focus shift amount (defocus amount) can be reduced to zero by manually operating the focus ring 302 of the photographing lens 16.
  • the normal image and the split image can be displayed as color images without color misregistration, and the manual focus adjustment by the photographer can be supported by the color split image.
  • when the display color differentiation processing is executed by the image processing unit 28, a normal image is displayed on the display device, and the split image is displayed in a color having a color characteristic different from the color characteristic of the normal image.
  • for example, the display color of the normal image is a red-based color (here, orange), and the display color of the split image is a blue-based color. That is, when the color of the normal image is a red-based color, a blue-based color is adopted as the split image display color, and when the color of the normal image is a blue-based color, a red-based color is adopted as the split image display color. Therefore, it is possible to make the split image stand out from the normal image.
  • further, a split image in which the color characteristics of the left-eye image and the right-eye image differ from each other may be displayed. For example, the left-eye image in the split image has a blue-based color and the right-eye image has a green-based color, while the display color of the normal image is a red-based color (here, orange) as in the example described above. Thereby, the split image can be emphasized with respect to the normal image, and the left-eye image and the right-eye image can be easily distinguished within the split image.
  • further, when the normal image is an achromatic image, the image processing unit 28 applies a chromatic color to the split image as illustrated in FIG. 22 by executing the chromatic color applying processing shown in FIG. 16. In the example shown in FIG. 22, a blue-based color is adopted as the display color of the split image. As a result, the split image can be emphasized with respect to the achromatic normal image.
  • in the example shown in FIG. 22, the display color of the split image is set to a blue-based color; however, when a blue-based color is dominant in the estimated light source color, the display color of the split image may be set to a red-based color instead.
  • when the display color assimilation processing shown in FIG. 18 is executed by the image processing unit 28, a normal image and a split image whose display colors are assimilated are displayed, as shown in FIG. 23 as an example.
  • in the example shown in FIG. 23, since the display color of the normal image is a red-based color (an orange color as an example), a red-based color is also adopted as the display color of the split image, so that the split image is displayed so as to melt into the normal image. Therefore, it is possible to make it difficult to visually recognize the boundary between the normal image and the split image.
  • as described above, the imaging apparatus 100 according to the first embodiment includes first and second pixel groups on which the subject image that has passed through the first and second regions on the left and right sides of the optical axis of the light beam passing through the exit pupil of the photographing lens 16 is pupil-divided and formed, respectively. The image processing unit 28, which is an example of a first display image generation unit, generates a normal image as an example of a first display image (step 400).
  • further, the image processing unit 28, which is an example of a second display image generation unit, generates a split image as an example of a second display image based on the first and second images output from the first and second pixel groups (step 402).
  • then, the image processing unit 28, which is an example of an acquisition unit, acquires color information of the generated normal image (step 414B).
  • the image processing unit 28, which is an example of a determination unit, determines a color having a color characteristic different from the color characteristic of the normal image as a split image display color based on the acquired color information (steps 414J, 414M, 414N). Thereby, the visibility of the boundary line between the normal image and the split image can be improved as compared with the case where this configuration is not provided.
  • further, the imaging apparatus 100 employs, as the color having a color characteristic different from that of the normal image, a color that can be visually recognized as having a color characteristic different from that of the normal image. Thereby, compared with the case without this configuration, the normal image and the split image can be visually distinguished.
  • the image processing unit 28 acquires color information from the normal image. Therefore, compared with the case where this configuration is not provided, the color characteristics of the split image and the color characteristics of the normal image can be differentiated with high accuracy with a simple configuration.
  • the image processing unit 28 may also acquire color information based on the normal image and the split image. Thereby, compared with the case without this configuration, the color characteristics of the split image and the color characteristics of the normal image can be differentiated with high accuracy. Even if the normal image is overexposed, the first and second pixel groups have low sensitivity, so that color information can still be acquired. Note that the color information may be acquired based on the normal image and the first image, based on the normal image and the second image, or based on the normal image, the first image, and the second image.
  • in the imaging apparatus 100 according to the first embodiment, the imaging element 20 has a normal imaging area used for capturing the normal image and a split image imaging area that is used for capturing the split image and is adjacent to the normal imaging area. The image processing unit 28 then acquires color information from the image output from the edge portion of the normal imaging area on the boundary side with the split image imaging area. As a result, the boundary between the normal image and the split image can be made clearer than in the case without this configuration.
  • the imaging apparatus 100 employs color information indicating the color characteristics of the object color of the normal image as the color information. Therefore, the visibility of the boundary line between the normal image and the split image can be improved even when the object color is more dominant than the light source color in the normal image, compared to the case without this configuration.
  • further, the imaging apparatus 100 employs, as the color characteristic different from the color characteristic of the normal image, a color characteristic different from the color characteristic of the object color of the normal image.
  • the imaging device 100 applies hue as the color characteristic. Accordingly, the visibility of the boundary line between the normal image and the split image can be further improved with a simple configuration as compared with the case where the present configuration is not provided.
  • further, the imaging apparatus 100 employs a color that has a color characteristic different from the color characteristic of the normal image and that is separated from the object color of the normal image by a predetermined angle in the hue circle. Accordingly, a color suitable as a color having a color characteristic different from the color characteristic of the normal image can be easily determined as compared with the case without this configuration.
  • further, the imaging apparatus 100 employs, as the predetermined angle, an angle at which a color that can be visually recognized as having a color characteristic different from the color characteristic of the object color of the normal image is obtained. Thereby, compared with the case without this configuration, the split image can be clearly shown with respect to the normal image.
  • further, the imaging apparatus 100 employs the composition ratio of the primary color components in the normal image as the color characteristic of the object color of the normal image. As a result, compared with the case without this configuration, a color characteristic suitable for improving the visibility of the boundary line between the normal image and the split image can be easily specified as the color characteristic of the object color of the normal image.
  • the imaging apparatus 100 employs the composition ratio of the primary color components in the normal image after the white balance is implemented as the composition ratio of the primary color components in the normal image. Therefore, compared with the case where this configuration is not provided, the boundary line between the normal image and the split image in which the white balance is performed can be made clear.
  • the image processing unit 28, which is an example of an estimation unit, estimates the light source color based on the normal image. Then, the gains for the primary color components used in the white balance are determined according to the light source color estimated by the image processing unit 28. Thereby, the visibility of the boundary line between the normal image and the split image on which the white balance is performed can be improved in consideration of the light source color as compared with the case where the present configuration is not provided.
  • the composition ratio of the primary color components in the normal image is determined based on the average value of the pixel values for each of the primary color components of the object color in the normal image.
  • further, in the imaging apparatus 100, the converted composition ratio obtained by converting the composition ratio of the primary color components in the normal image using an input/output function including a decreasing function is adopted as a color characteristic different from the color characteristic of the normal image. Thereby, the visibility of the boundary line between the normal image and the split image can be improved with a simple configuration as compared with the case where the present configuration is not provided.
  • further, in the imaging apparatus 100, each of the first and second pixel groups is a single-color pixel group, and the image processing unit 28 generates an achromatic image as the split image based on the first and second images output from the first and second pixel groups. In the first embodiment, a green pixel group is employed as the single-color pixel group.
  • further, when the split image is an achromatic image, the image processing unit 28 determines, based on the acquired color information, a chromatic color having a color characteristic different from the color characteristic of the normal image as the display color of the split image.
  • further, when both the normal image and the split image are achromatic images, the image processing unit 28 determines, based on the acquired color information, a chromatic color having a color characteristic different from the color characteristic of the normal image as the display color of the split image.
  • further, the image processing unit 28 determines, as the split image display color, chromatic colors that have a color characteristic different from the color characteristic of the normal image and that differ between the left-eye image and the right-eye image in the split image. This makes it easier to visually distinguish the left-eye image and the right-eye image in the split image compared to the case without this configuration.
  • further, as the split image, an image in which the left-eye image and the right-eye image are alternately arranged in a direction intersecting the parallax generation direction is employed. This makes it easier to visually distinguish the left-eye image and the right-eye image in the split image.
  • further, when the display color assimilation condition is satisfied, the image processing unit 28 determines, based on the acquired color information, a color having a color characteristic that approaches the color characteristic of the normal image to a predetermined degree as the display color of the split image. As a result, it is possible to make it difficult to visually recognize the boundary between the normal image and the split image as compared with the case without this configuration.
  • further, the imaging apparatus 100 includes the display control unit 36 that performs control to display the normal image and the split image whose display color has been corrected by the image processing unit 28 together on the same screen of the display device. Thereby, the boundary line between the normal image and the split image can be visually recognized with a simple configuration as compared with the case without this configuration.
  • further, in the first embodiment, the imaging element 20 includes a third pixel group, and the image processing unit 28 generates the normal image based on the third image output from the third pixel group. Thereby, compared with the case without this configuration, the image quality of the normal image can be improved.
  • in the first embodiment, the composition ratio of the primary color components in the normal image after the implementation of WB is adopted; however, the composition ratio is not limited to this. The composition ratio of the primary color components in the normal image before the implementation of WB may be used, or the composition ratio of the primary color components in the normal image after gamma correction may be used.
  • further, the present invention is not limited to this, and the R, G, and B pixel values of the light source color may be adopted as the color information. Thereby, even when the light source color is more dominant than the object color in the normal image, the visibility of the boundary line between the normal image and the split image can be improved.
  • in the first embodiment, the pixel values are corrected in the chromatic color providing process and the display color differentiation process with a composition ratio that does not depend on the type of the display device; however, the pixel values may be corrected with a composition ratio determined according to the type of the display device. In that case, for example, the pixel values may be corrected using a composition ratio obtained by adding a further correction term (for example, a positive correction term) determined for the display device.
  • in the first embodiment, the maximum output value of the first input/output function F(W) is "1"; however, it may be a value exceeding "1".
  • when the output value of the first input/output function F(W) exceeds "1", the value may be corrected to "1" or less by performing a normalization process afterwards. The same applies to the output value of the second input/output function G(W).
  • further, in the first embodiment, the composition ratio of the R, G, and B pixel values is calculated from the average values of the R, G, and B pixel values acquired from the specific image; however, the composition ratio of the R, G, and B pixel values may instead be calculated from representative values of the R, G, and B pixel values acquired from the specific image. Examples of such representative values include the mode and the median of the R, G, and B pixel values acquired from the specific image.
  • the G pixel is exemplified as the phase difference pixel.
  • the present invention is not limited to this, and any of R, G, and B pixels may be used.
  • a monochromatic pixel is exemplified as the phase difference pixel.
  • the present invention is not limited to this, and two or more of the R, G, and B pixels may be used as the phase difference pixels. In this case, a color image is displayed as the split image, and even in this case, the present invention holds.
  • note that, when single-color phase difference pixels are employed, the time required for the pixel value correction processing is shorter than when phase difference pixels of a plurality of colors are employed, and the split image is easier to visually distinguish from the background color (for example, the display color of the normal image).
  • in the first embodiment, the interchangeable lens type digital camera in which the reflex mirror is omitted is illustrated; however, the present invention is not limited to this and can also be applied to an imaging apparatus having an integrated lens.
  • further, the imaging apparatus 100 including both the display unit 213 and the EVF 248 is illustrated; however, the present invention is not limited thereto, and only one of the display unit 213 and the EVF 248 may be provided.
  • the split image divided into a plurality of parts in the vertical direction is exemplified.
  • the present invention is not limited to this, and an image divided into a plurality of parts in the left-right direction or the oblique direction may be applied as the split image.
  • further, the focus confirmation image is not limited to the split image; another focus confirmation image may be generated from the left-eye image and the right-eye image and displayed. For example, the left-eye image and the right-eye image may be superimposed and displayed as a composite image, which is displayed as a double image when the image is out of focus and is displayed clearly when the image is in focus.
  • in the first embodiment, the case where the display color for output is determined according to the first input/output function F(W) shown in FIG. 12 has been exemplified; however, the display color for output may be determined using a color conversion table stored in advance in a predetermined storage area.
  • an example of such a color conversion table is a color conversion table in which the output value decreases as the input value increases.
  • the data relating to the input and output of the color conversion table may be data created based on the input/output function F(W) shown in FIG. 12, or data created based on the hue circle. Thereby, the visibility of the boundary line between the normal image and the split image can be improved with a simple configuration.
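  • A color conversion table of this kind can be created, for example, by sampling an input/output function at a fixed number of input levels and looking up the output at run time; the sketch below passes in the illustrative F(W) (or G(W)) defined earlier, and the table size of 256 is an assumption.

```python
import numpy as np

def build_color_table(io_function, size=256):
    """Sample an input/output function into a lookup table."""
    return np.array([io_function(i / (size - 1)) for i in range(size)])

def lookup_output_ratio(table, w):
    """Look up the output component for an input composition-ratio component w in [0, 1]."""
    index = int(round(w * (len(table) - 1)))
    return table[index]
```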
  • similarly, the display color for output that is determined according to the second input/output function G(W) shown in FIG. 13 may also be determined using a color conversion table stored in advance.
  • the image output processing flow (see FIG. 14), the white balance processing flow (see FIG. 15), the chromatic color application processing flow (see FIG. 16), the display color differentiation processing flow (see FIG. 17), and the display color assimilation processing flow (see FIG. 18) described in the first embodiment are merely examples. Therefore, it goes without saying that unnecessary steps may be deleted, new steps may be added, and the processing order may be changed within a range not departing from the gist.
  • Each process included in the image output process may be realized by a software configuration using a computer by executing a program, or may be realized by a hardware configuration. Further, it may be realized by a combination of a hardware configuration and a software configuration. The same applies to the white balance processing, chromatic color imparting processing, display color differentiation processing, and display color assimilation processing.
  • for example, the programs may be stored in a predetermined storage area (for example, the memory 26) in advance; however, the programs for the white balance processing, chromatic color imparting processing, display color differentiation processing, and display color assimilation processing do not necessarily have to be stored in the memory 26 from the beginning.
  • for example, a program may first be stored in an arbitrary "portable storage medium" connected to the computer, such as a flexible disk (so-called FD), a CD-ROM, a DVD disc, a magneto-optical disk, or an IC card.
  • the computer may acquire the program from these portable storage media and execute it.
  • each program may be stored in another computer or server device connected to the computer via the Internet, LAN (Local Area Network), etc., and the computer may acquire and execute the program from these. Good.
  • in the first embodiment, the imaging apparatus 100 is illustrated; however, examples of a mobile terminal device that is a modification of the imaging apparatus 100 include a mobile phone or smartphone having a camera function, a PDA (Personal Digital Assistant), and a portable game machine.
  • in the following, a smartphone will be described as an example in detail with reference to the drawings.
  • FIG. 24 is a perspective view showing an example of the appearance of the smartphone 500.
  • a smartphone 500 illustrated in FIG. 24 includes a flat housing 502, and a display input unit 520, in which a display panel 521 serving as a display unit and an operation panel 522 serving as an input unit are integrated, is provided on one surface of the housing 502.
  • the housing 502 includes a speaker 531, a microphone 532, an operation unit 540, and a camera unit 541.
  • the configuration of the housing 502 is not limited thereto, and for example, a configuration in which the display unit and the input unit are independent, or a configuration having a folding structure or a slide structure may be employed.
  • FIG. 25 is a block diagram showing an example of the configuration of the smartphone 500 shown in FIG.
  • the main components of the smartphone 500 include a wireless communication unit 510, a display input unit 520, a call unit 530, an operation unit 540, a camera unit 541, a storage unit 550, and an external input/output unit 560.
  • the smartphone 500 includes a GPS (Global Positioning System) receiving unit 570, a motion sensor unit 580, a power supply unit 590, and a main control unit 501.
  • a wireless communication function for performing mobile wireless communication via the base station device BS and the mobile communication network NW is provided as a main function of the smartphone 500.
  • the wireless communication unit 510 performs wireless communication with the base station apparatus BS accommodated in the mobile communication network NW according to an instruction from the main control unit 501. Using this wireless communication, transmission and reception of various file data such as audio data and image data, e-mail data, and reception of Web data and streaming data are performed.
  • the display input unit 520 is a so-called touch panel, and includes a display panel 521 and an operation panel 522. Under the control of the main control unit 501, the display input unit 520 displays images (still images and moving images), character information, and the like to visually convey information to the user, and detects user operations on the displayed information. Note that when viewing a generated 3D image, the display panel 521 is preferably a 3D display panel.
  • the display panel 521 uses an LCD, OELD (Organic Electro-Luminescence Display), or the like as a display device.
  • the operation panel 522 is a device that is placed so that an image displayed on the display surface of the display panel 521 is visible and detects one or a plurality of coordinates operated by a user's finger or stylus. When such a device is operated by a user's finger or stylus, a detection signal generated due to the operation is output to the main control unit 501. Next, the main control unit 501 detects an operation position (coordinates) on the display panel 521 based on the received detection signal.
  • the display panel 521 and the operation panel 522 of the smartphone 500 integrally constitute the display input unit 520, and the operation panel 522 is disposed so as to completely cover the display panel 521.
  • the operation panel 522 may have a function of detecting a user operation even in an area outside the display panel 521.
  • in other words, the operation panel 522 may include a detection area for the overlapping portion that overlaps the display panel 521 (hereinafter referred to as the display area) and a detection area for the outer edge portion that does not overlap the display panel 521 (hereinafter referred to as the non-display area).
  • the operation panel 522 may include two sensitive regions of the outer edge portion and the other inner portion. Further, the width of the outer edge portion is appropriately designed according to the size of the housing 502 and the like.
  • examples of the position detection method employed in the operation panel 522 include a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method, and any of these methods may be adopted.
  • the call unit 530 includes a speaker 531 and a microphone 532.
  • the call unit 530 converts the user's voice input through the microphone 532 into voice data that can be processed by the main control unit 501 and outputs the voice data to the main control unit 501. Further, the call unit 530 decodes the audio data received by the wireless communication unit 510 or the external input / output unit 560 and outputs it from the speaker 531.
  • the speaker 531 can be mounted on the same surface as the display input unit 520 and the microphone 532 can be mounted on the side surface of the housing 502.
  • the operation unit 540 is a hardware key using a key switch or the like, and receives an instruction from the user.
  • the operation unit 540 is, for example, a push-button type switch that is mounted on the side surface of the housing 502 of the smartphone 500, is turned on when pressed with a finger or the like, and is turned off by the restoring force of a spring or the like when the finger is released.
  • the storage unit 550 stores the control program and control data of the main control unit 501, application software, address data that associates the name and telephone number of the communication partner, and transmitted / received e-mail data.
  • the storage unit 550 stores Web data downloaded by Web browsing and downloaded content data.
  • the storage unit 550 temporarily stores streaming data and the like.
  • the storage unit 550 includes an external storage unit 552 having an internal storage unit 551 built in the smartphone and a removable external memory slot.
  • each of the internal storage unit 551 and the external storage unit 552 constituting the storage unit 550 is realized using a storage medium such as a flash memory type or hard disk type storage medium.
  • other examples of the storage medium include a multimedia card micro type memory, a card type memory (for example, a MicroSD (registered trademark) memory), a RAM (Random Access Memory), and a ROM (Read Only Memory).
  • the external input/output unit 560 serves as an interface with all external devices connected to the smartphone 500, and is used to connect directly or indirectly to other external devices through communication or the like, or through a network. Examples of communication with other external devices include universal serial bus (USB) and IEEE 1394. Examples of the network include the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), and infrared communication (Infrared Data Association: IrDA (registered trademark)). Other examples of the network include UWB (Ultra Wideband (registered trademark)) and ZigBee (registered trademark).
  • Examples of the external device connected to the smartphone 500 include a wired / wireless headset, wired / wireless external charger, wired / wireless data port, and a memory card connected via a card socket.
  • other examples of external devices include a SIM (Subscriber Identity Module) card / UIM (User Identity Module) card, and external audio/video devices connected via audio/video I/O (Input/Output) terminals.
  • an external audio / video device that is wirelessly connected can be used.
  • the external input/output unit 560 can transmit data received from such external devices to each component inside the smartphone 500, and can transmit data inside the smartphone 500 to the external devices.
  • the GPS receiving unit 570 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with an instruction from the main control unit 501, performs positioning calculation processing based on the plurality of received GPS signals, and detects the position of the smartphone 500 in terms of latitude, longitude, and altitude.
  • when the GPS receiving unit 570 can acquire position information from the wireless communication unit 510 or the external input/output unit 560 (for example, via a wireless LAN), it can also detect the position using that position information.
  • the motion sensor unit 580 includes a triaxial acceleration sensor, for example, and detects the physical movement of the smartphone 500 in accordance with an instruction from the main control unit 501. By detecting the physical movement of the smartphone 500, the moving direction and acceleration of the smartphone 500 are detected. This detection result is output to the main control unit 501.
  • the power supply unit 590 supplies power stored in a battery (not shown) to each unit of the smartphone 500 in accordance with an instruction from the main control unit 501.
  • the main control unit 501 includes a microprocessor, operates according to a control program and control data stored in the storage unit 550, and controls each unit of the smartphone 500 in an integrated manner. Further, the main control unit 501 includes a mobile communication control function for controlling each unit of the communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 510.
  • the application processing function is realized by the main control unit 501 operating according to the application software stored in the storage unit 550.
  • examples of the application processing function include an infrared communication function that controls the external input/output unit 560 to perform data communication with a counterpart device, an e-mail function that transmits and receives e-mails, and a web browsing function that browses web pages.
  • the main control unit 501 has an image processing function such as displaying video on the display input unit 520 based on image data (still image data or moving image data) such as received data or downloaded streaming data.
  • the image processing function is a function in which the main control unit 501 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 520.
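  • Purely as a sketch of such an image processing function, the snippet below decodes compressed image data and fits it to a display panel using the Pillow library; the panel size and the function name are hypothetical.

```python
import io

from PIL import Image  # Pillow


def decode_for_display(image_bytes: bytes, panel_size=(1280, 720)) -> Image.Image:
    """Decode received or downloaded image data and scale it to fit the panel."""
    img = Image.open(io.BytesIO(image_bytes)).convert("RGB")  # decode and normalise
    img.thumbnail(panel_size)                                 # fit, preserving aspect ratio
    return img


# Hypothetical usage with a synthetic JPEG:
buf = io.BytesIO()
Image.new("RGB", (4000, 3000)).save(buf, format="JPEG")
print(decode_for_display(buf.getvalue()).size)
```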
  • the main control unit 501 executes display control for the display panel 521 and operation detection control for detecting a user operation through the operation unit 540 and the operation panel 522.
  • By executing the display control, the main control unit 501 displays icons for starting application software, software keys such as a scroll bar, and windows for creating e-mails.
  • The scroll bar is a software key for accepting an instruction to move the displayed portion of an image, such as a large image that does not fit in the display area of the display panel 521.
  • By executing the operation detection control, the main control unit 501 detects a user operation through the operation unit 540, accepts an operation on an icon or an input of a character string into the input field of a window through the operation panel 522, and accepts a request to scroll the displayed image through the scroll bar.
  • Furthermore, by executing the operation detection control, the main control unit 501 determines whether an operation position on the operation panel 522 is in the portion overlapping the display panel 521 (display area) or in the outer edge portion not overlapping the display panel 521 (non-display area), and provides a touch panel control function for controlling the sensitive area of the operation panel 522 and the display positions of the software keys.
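  • The following sketch illustrates the kind of test such a touch panel control function could perform, classifying an operation position as belonging to the display area or to the outer, non-display edge; the coordinates and names are hypothetical.

```python
from typing import Tuple


def classify_touch(point: Tuple[int, int],
                   display_rect: Tuple[int, int, int, int]) -> str:
    """display_rect is (left, top, width, height) of the display panel within
    the operation panel; returns 'display' or 'edge'."""
    x, y = point
    left, top, width, height = display_rect
    inside = left <= x < left + width and top <= y < top + height
    return "display" if inside else "edge"


print(classify_touch((10, 500), (0, 0, 1080, 1920)))    # -> display
print(classify_touch((1100, 500), (0, 0, 1080, 1920)))  # -> edge
```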
  • the main control unit 501 can also detect a gesture operation on the operation panel 522 and execute a preset function according to the detected gesture operation.
  • A gesture operation is not a conventional simple touch operation, but an operation that draws a trajectory with a finger or the like, designates a plurality of positions at the same time, or, as a combination of these, draws a trajectory from at least one of a plurality of positions.
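  • As an illustrative sketch, a gesture of this kind could be distinguished from a simple touch as follows; the 20-pixel threshold and the stroke representation are assumptions made for the example.

```python
def classify_gesture(strokes):
    """strokes: one list of (x, y) samples per finger.  Several strokes form a
    multi-position gesture; a single stroke is a trajectory if the finger
    moved, otherwise a simple tap."""
    if len(strokes) > 1:
        return "multi-touch"
    (x0, y0), (x1, y1) = strokes[0][0], strokes[0][-1]
    return "trajectory" if abs(x1 - x0) + abs(y1 - y0) > 20 else "tap"


print(classify_gesture([[(100, 100), (103, 101)]]))    # tap
print(classify_gesture([[(100, 100), (320, 100)]]))    # trajectory
print(classify_gesture([[(100, 100)], [(400, 300)]]))  # multi-touch
```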
  • the camera unit 541 is a digital camera that captures an image using an image sensor such as a CMOS or a CCD, and has the same function as the image capturing apparatus 100 shown in FIG.
  • the camera unit 541 can switch between a manual focus mode and an autofocus mode.
  • the photographing lens of the camera unit 541 can be focused by operating a focus icon button or the like displayed on the operation unit 540 or the display input unit 520.
  • In the manual focus mode, a live view image obtained by combining the split image is displayed on the display panel 521 so that the in-focus state during manual focus can be confirmed.
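  • A minimal sketch of how a split image could be composed from the two phase-difference images and embedded in the live view frame is given below (NumPy arrays of equal size assumed); it illustrates the idea only and is not the embodiment's actual generation process.

```python
import numpy as np


def compose_split_image(first_img: np.ndarray, second_img: np.ndarray) -> np.ndarray:
    """Upper half from one phase-difference image, lower half from the other;
    the halves line up at the boundary when the subject is in focus."""
    h = first_img.shape[0]
    return np.vstack([first_img[: h // 2], second_img[h // 2:]])


def embed_in_live_view(live_view: np.ndarray, split: np.ndarray) -> np.ndarray:
    """Paste the split image into the centre of the live-view frame."""
    out = live_view.copy()
    h, w = split.shape[:2]
    top, left = (out.shape[0] - h) // 2, (out.shape[1] - w) // 2
    out[top:top + h, left:left + w] = split
    return out
```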
  • Under the control of the main control unit 501, the camera unit 541 converts the image data obtained by imaging into compressed image data such as JPEG (Joint Photographic Experts Group) data.
  • The converted image data can be recorded in the storage unit 550, or output through the external input/output unit 560 or the wireless communication unit 510.
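  • For illustration only, compressing a captured RGB frame into JPEG data that could then be recorded or transmitted might look like the following (Pillow and NumPy; the quality value is an assumption).

```python
import io

import numpy as np
from PIL import Image  # Pillow


def encode_jpeg(frame: np.ndarray, quality: int = 90) -> bytes:
    """Compress an H x W x 3 uint8 RGB frame into JPEG-encoded bytes."""
    buf = io.BytesIO()
    Image.fromarray(frame, mode="RGB").save(buf, format="JPEG", quality=quality)
    return buf.getvalue()


print(len(encode_jpeg(np.zeros((480, 640, 3), dtype=np.uint8))), "bytes")
```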
  • the camera unit 541 is mounted on the same surface as the display input unit 520, but the mounting position of the camera unit 541 is not limited to this, and the camera unit 541 may be mounted on the back surface of the display input unit 520.
  • Alternatively, a plurality of camera units 541 may be mounted. When a plurality of camera units 541 are mounted, imaging may be performed by switching to the camera unit 541 to be used, or a plurality of camera units 541 may be used simultaneously for imaging.
  • the camera unit 541 can be used for various functions of the smartphone 500.
  • an image acquired by the camera unit 541 can be displayed on the display panel 521, or the image of the camera unit 541 can be used as one of operation inputs of the operation panel 522.
  • When the GPS receiving unit 570 detects the position, the position can also be detected with reference to an image from the camera unit 541.
  • Furthermore, by referring to an image from the camera unit 541, the optical axis direction of the camera unit 541 of the smartphone 500 can be determined without using the triaxial acceleration sensor, or in combination with the triaxial acceleration sensor, and the current usage environment can also be determined.
  • the image from the camera unit 541 can be used in the application software.
  • Various information can be added to still image or moving image data and recorded in the storage unit 550, or output through the external input/output unit 560 or the wireless communication unit 510.
  • Examples of the “various information” mentioned here include position information acquired by the GPS receiving unit 570 for the still image or moving image, and audio information acquired by the microphone 532 (which may be converted into text information by speech-to-text conversion performed by the main control unit or the like).
  • Posture information acquired by the motion sensor unit 580 may also be added.
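  • One simple way of keeping such additional information together with the recorded image, shown only as a sketch, is a JSON sidecar file; the file names and field names below are hypothetical.

```python
import json
from pathlib import Path


def record_with_metadata(image_path: str, jpeg_bytes: bytes, metadata: dict) -> None:
    """Write the compressed image and a JSON sidecar holding the extra information."""
    path = Path(image_path)
    path.write_bytes(jpeg_bytes)
    path.with_suffix(".json").write_text(
        json.dumps(metadata, ensure_ascii=False, indent=2), encoding="utf-8")


record_with_metadata(
    "IMG_0001.jpg",
    b"\xff\xd8\xff\xd9",  # placeholder bytes, not a real photograph
    {"position": {"lat": 35.68, "lon": 139.76},
     "audio_text": "sample transcription",
     "posture": {"pitch": 0.0, "roll": 1.5}},
)
```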
  • In each of the above embodiments, the image pickup device 20 having the first to third pixel groups is illustrated, but the present invention is not limited to this; an image sensor consisting only of the first pixel group and the second pixel group may be used. A digital camera having this type of image sensor can generate a three-dimensional image (3D image) based on the first image output from the first pixel group and the second image output from the second pixel group, and can also generate a two-dimensional image (2D image). In this case, the two-dimensional image is generated, for example, by performing interpolation processing between pixels of the same color in the first image and the second image.
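  • Since interpolation between pixels of the same color is only given as one example above, the following NumPy sketch uses a simple stand-in: averaging corresponding pixels of the first and second images (which carry the same color) to obtain a 2D image. It is an illustration, not the embodiment's actual interpolation.

```python
import numpy as np


def merge_to_2d(first_img: np.ndarray, second_img: np.ndarray) -> np.ndarray:
    """Average the first and second images pixel by pixel (uint8 arrays of the
    same shape); since corresponding pixels share the same color, the average
    is one simple same-color interpolation yielding a 2D image."""
    summed = first_img.astype(np.uint16) + second_img.astype(np.uint16)
    return (summed // 2).astype(np.uint8)


a = np.full((4, 4), 100, dtype=np.uint8)
b = np.full((4, 4), 120, dtype=np.uint8)
print(merge_to_2d(a, b)[0, 0])  # -> 110
```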
  • Moreover, a split image based on images output from the phase difference pixel group (for example, the first image output from the first pixel group and the second image output from the second pixel group) may be employed both in a case where an image sensor consisting only of a phase difference pixel group (for example, the first pixel group and the second pixel group) is used and in a case where an image sensor in which phase difference pixels are arranged at a predetermined ratio with respect to normal pixels is used.
  • In addition, the present invention is not limited to a mode in which both the normal image and the split image are simultaneously displayed on the same screen of the display device; when display of the split image is instructed, the display control unit 36 may perform control to display the split image without displaying the normal image on the display device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)
  • Focusing (AREA)
  • Automatic Focus Adjustment (AREA)
  • Indication In Cameras, And Counting Of Exposures (AREA)

Abstract

The present invention relates to an image processing device capable of improving the recognizability of the boundary between a first display image and a second display image to be used for checking focus, as well as to an image forming device, an image processing method, and an image processing program. An image processing unit obtains the R, G, and B pixel values of a normal image in a step (414B). The image processing unit (28) determines a split-image display color whose properties differ from those of the normal image, on the basis of the R, G, and B pixel values of the normal image, in steps (414J, 414M, 414N).
PCT/JP2013/071180 2012-09-19 2013-08-05 Dispositif de traitement d'image, dispositif de formation d'image, procédé de traitement d'image et programme de traitement d'image WO2014045738A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2014536664A JP5901780B2 (ja) 2012-09-19 2013-08-05 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
CN201380048302.XA CN104737527B (zh) 2012-09-19 2013-08-05 图像处理装置、摄像装置及图像处理方法
US14/642,774 US9712752B2 (en) 2012-09-19 2015-03-10 Image processing device, imaging device, image processing method, and computer readable medium, capable of improving visibility of a boundary line between a first display image and a second display image for use in focus-checking

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-205905 2012-09-19
JP2012205905 2012-09-19

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US14/642,774 Continuation US9712752B2 (en) 2012-09-19 2015-03-10 Image processing device, imaging device, image processing method, and computer readable medium, capable of improving visibility of a boundary line between a first display image and a second display image for use in focus-checking

Publications (1)

Publication Number Publication Date
WO2014045738A1 true WO2014045738A1 (fr) 2014-03-27

Family

ID=50341062

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/071180 WO2014045738A1 (fr) 2012-09-19 2013-08-05 Dispositif de traitement d'image, dispositif de formation d'image, procédé de traitement d'image et programme de traitement d'image

Country Status (4)

Country Link
US (1) US9712752B2 (fr)
JP (1) JP5901780B2 (fr)
CN (1) CN104737527B (fr)
WO (1) WO2014045738A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110874829B (zh) * 2018-08-31 2022-10-14 北京小米移动软件有限公司 图像处理方法及装置、电子设备及存储介质
CN110675404B (zh) * 2019-09-03 2023-03-21 RealMe重庆移动通信有限公司 图像处理方法、图像处理装置、存储介质与终端设备

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008116848A (ja) * 2006-11-07 2008-05-22 Canon Inc 撮像装置
JP2009147665A (ja) * 2007-12-13 2009-07-02 Canon Inc 撮像装置

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3690470B2 (ja) 1998-09-02 2005-08-31 富士ゼロックス株式会社 画像処理装置および画像形成装置、画像処理方法
US7391532B2 (en) * 2000-02-01 2008-06-24 Canon Kabushiki Kaisha Image correction in image sensing system including image sensing apparatus and image processing apparatus
WO2002073538A1 (fr) * 2001-03-13 2002-09-19 Ecchandes Inc. Dispositif visuel, compteur asservi et capteur d'images
JP4740135B2 (ja) * 2003-09-17 2011-08-03 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ 3次元画像ディスプレイの画面に3次元画像を描画するシステム及び方法
JP5308792B2 (ja) * 2008-11-28 2013-10-09 オリンパス株式会社 ホワイトバランス調整装置、ホワイトバランス調整方法、ホワイトバランス調整プログラム、および、撮像装置

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008116848A (ja) * 2006-11-07 2008-05-22 Canon Inc 撮像装置
JP2009147665A (ja) * 2007-12-13 2009-07-02 Canon Inc 撮像装置

Also Published As

Publication number Publication date
CN104737527A (zh) 2015-06-24
CN104737527B (zh) 2018-06-22
US9712752B2 (en) 2017-07-18
JPWO2014045738A1 (ja) 2016-08-18
JP5901780B2 (ja) 2016-04-13
US20150181127A1 (en) 2015-06-25

Similar Documents

Publication Publication Date Title
JP6033454B2 (ja) 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
JP5931206B2 (ja) 画像処理装置、撮像装置、プログラム及び画像処理方法
JP5681329B2 (ja) 撮像装置及び画像表示方法
JP5697801B2 (ja) 撮像装置及び自動焦点調節方法
JP5901782B2 (ja) 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
JP5960286B2 (ja) 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
JP5901781B2 (ja) 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
JP6086975B2 (ja) 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
JP5889441B2 (ja) 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
US9609224B2 (en) Imaging device and image display method
JP6000446B2 (ja) 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
JP5903528B2 (ja) 撮像装置及び画像処理方法
JP5833254B2 (ja) 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
JP5901780B2 (ja) 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
JP5972485B2 (ja) 画像処理装置、撮像装置、画像処理方法及び画像処理プログラム
WO2014045741A1 (fr) Dispositif de traitement d'image, dispositif d'imagerie, procédé de traitement d'image et programme de traitement d'image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13839173

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2014536664

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13839173

Country of ref document: EP

Kind code of ref document: A1