WO2014045741A1 - Image processing device, imaging device, image processing method, and image processing program - Google Patents

Image processing device, imaging device, image processing method, and image processing program Download PDF

Info

Publication number
WO2014045741A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
unit
pixel
instruction
Prior art date
Application number
PCT/JP2013/071183
Other languages
English (en)
Japanese (ja)
Inventor
智行 河合
沖川 満
林 淳司
Original Assignee
富士フイルム株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士フイルム株式会社 filed Critical 富士フイルム株式会社
Publication of WO2014045741A1

Links

Images

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/34 Systems for automatic generation of focusing signals using different areas in a pupil plane
    • G02B7/346 Systems for automatic generation of focusing signals using different areas in a pupil plane using horizontal and vertical areas in the pupil plane, i.e. wide area autofocusing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633 Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/635 Region indicators; Field of view indicators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 Camera processing pipelines; Components thereof
    • H04N23/84 Camera processing pipelines; Components thereof for processing colour signals
    • H04N23/843 Demosaicing, e.g. interpolating colour pixel values
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00 Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/10 Circuitry of solid-state image sensors [SSIS]; Control thereof for transforming different wavelengths into image signals
    • H04N25/11 Arrangement of colour filter arrays [CFA]; Filter mosaics
    • H04N25/13 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements
    • H04N25/134 Arrangement of colour filter arrays [CFA]; Filter mosaics characterised by the spectral characteristics of the filter elements based on three different wavelength filter elements
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32 Means for focusing
    • G03B13/34 Power focusing
    • G03B13/36 Autofocus systems
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B17/18 Signals indicating condition of a camera member or suitability of light

Definitions

  • the present invention relates to an image processing device, an imaging device, an image processing method, and an image processing program.
  • Digital cameras are widely known that have, in addition to autofocus using a phase difference detection method or a contrast detection method, a so-called manual focus mode in which the user can perform focus adjustment manually.
  • As a method for performing focus adjustment while checking the subject, a camera is known that is provided with a reflex mirror and uses a split microprism screen to display the phase difference visually.
  • A method for visually confirming contrast is also known.
  • a split image is displayed in a live view image (also referred to as a through image) in order to make it easier for the operator to focus on the subject in the manual focus mode.
  • The split image is, for example, an image divided into two (for example, divided in the vertical direction) whose divided parts are shifted relative to each other in the parallax generation direction (for example, the horizontal direction) according to the shift of the focus; in the in-focus state, the shift of the divided parts in the parallax generation direction is eliminated.
  • The operator (for example, a photographer) adjusts the focus by operating the manual focus ring so that the parts of the split image (for example, the images divided in the vertical direction) are no longer displaced.
  • For example, an imaging apparatus described in Patent Document 1 includes aperture moving means for moving an aperture stop in the subject optical path within a plane perpendicular to the optical axis, storage means for storing two subject images captured at two distance measuring positions as the aperture stop moves, and a display unit that outputs a split image in which the two subject images are combined so that it can be confirmed whether the focus state is appropriate.
  • An imaging apparatus described in Japanese Patent Laid-Open No. 2009-147665 (hereinafter referred to as Patent Document 2) photoelectrically converts a first subject image and a second subject image, formed by light beams divided by a pupil dividing unit from among the light beams from the imaging optical system, to generate a first image and a second image. A split image is generated using the first and second images, and a third image is generated by photoelectric conversion of a third subject image formed by the light beam that is not divided by the pupil dividing unit. The third image is displayed on the display unit, the generated split image is displayed within the third image, and color information extracted from the third image is added to the split image. By adding the color information extracted from the third image to the split image in this way, the visibility of the split image can be improved.
  • An imaging apparatus described in Japanese Patent Laid-Open No. 2009-163220 (hereinafter referred to as Patent Document 3) includes processing means for displaying a superimposed image obtained by superimposing a first image and a second image obtained by a pupil dividing unit.
  • A digital camera described in Japanese Patent Laid-Open No. 2001-309210 (hereinafter referred to as Patent Document 4) includes means for detecting the shift amount between the focus position and the subject position and changing the display content of the split image according to the shift amount.
  • The imaging apparatus described in Japanese Patent Application Laid-Open No. 2009-237214 (hereinafter referred to as Patent Document 5) is configured so that, in the manual focus mode, the split image is erased from the screen when the aperture button is operated and is displayed when the aperture button is not operated.
  • With these techniques, however, the split image may remain displayed even after the time has come when the split image is unnecessary for the user.
  • Conversely, the split image may not be displayed even when the time has arrived when the user needs it.
  • That is, all of the techniques described in Patent Documents 1 to 5 have the problem that it is difficult to switch between the split image display state and the non-display state at an appropriate time.
  • The present invention has been proposed in view of such circumstances, and an object thereof is to provide an image processing apparatus, an imaging apparatus, an image processing method, and an image processing program that can switch between display and non-display of an image used for in-focus confirmation at an appropriate time.
  • An image processing apparatus according to an aspect of the present invention includes: a generation unit that generates a first display image based on an image signal output from an imaging element including first and second pixel groups on which a subject image that has passed through first and second regions of a photographing lens is formed after being pupil-divided, and that generates, based on first and second image signals output from the first and second pixel groups, a second display image used for in-focus confirmation; a non-holding-type first instruction unit that instructs display of the first display image; a second instruction unit that instructs display of the second display image; a display unit that displays images; and a display control unit that, in a state where display of the second display image is instructed by the second instruction unit, performs control to display on the display unit the first display image generated by the generation unit without displaying the second display image when display of the first display image is instructed by the first instruction unit, and that, when the display instruction of the first display image by the first instruction unit is canceled in the state where display of the second display image is instructed, performs control to display on the display unit the first display image generated by the generation unit and to display the second display image generated by the generation unit in a display area of the first display image.
  • the instruction by the second instruction unit may be a holding instruction.
  • In the image processing apparatus, the imaging element may further include a third pixel group on which the subject image that has passed through the photographing lens is formed without being pupil-divided, and which outputs a third image; the generation unit may generate the first display image based on the third image output from the third pixel group.
  • In the image processing apparatus, the generation unit may interpolate the pixels of the first and second images based on the third image and generate the first display image based on the interpolated third image. Thereby, the image quality of the first display image can be further improved with a simple configuration.
  • An imaging device according to an aspect of the present invention includes the image processing device according to any one of the first to fourth aspects of the present invention, an imaging element having the first and second pixel groups, and a storage unit that stores an image output from the imaging element. Thereby, display and non-display of the image used for in-focus confirmation can be switched at an appropriate time as compared with the case where this configuration is not provided.
  • In the imaging device, the first instruction unit may be a release switch that can be moved to a first instruction position for instructing adjustment of imaging conditions and to a second instruction position for instructing the start of imaging; display of the first display image may be instructed while the release switch is held at the first instruction position, and the display instruction may be canceled when the state in which the release switch is held at the first instruction position is released. Accordingly, the second display image can be quickly displayed at an appropriate time as compared with the case where this configuration is not provided.
  • In the imaging device, an imaging operation may be started when the release switch is held at the second instruction position, and the display control unit may perform control such that, when the release switch is held at the second instruction position, the display instruction of the second display image by the second instruction unit is canceled and the first display image is displayed on the display unit after the imaging operation is finished.
  • Alternatively, in the imaging device, an imaging operation may be started when the release switch is held at the second instruction position, and the case where the release switch is held at the second instruction position may be the case where the display instruction of the first display image is canceled.
  • In the imaging device, the first instruction unit may be a setting unit that sets the aperture value of the photographing lens; the case where display of the first display image is instructed may be the case where a specific aperture value is set by the setting unit, and the case where the specific aperture value is not set by the setting unit may be the case where the display instruction of the first display image is canceled. Accordingly, the second display image can be displayed at an appropriate time according to the aperture value as compared with the case where this configuration is not provided.
  • The imaging device may further include an electronic viewfinder capable of displaying the first display image and the second display image, and a detection unit that detects use of the electronic viewfinder; the case where use of the electronic viewfinder is not detected by the detection unit may be the case where display of the first display image is instructed, and the case where use of the electronic viewfinder is detected may be the case where the display instruction of the first display image is canceled. Accordingly, the second display image can be displayed at an appropriate time according to the usage state of the electronic viewfinder as compared with the case where this configuration is not provided.
  • In the imaging device, the imaging element may further include a third pixel group on which the subject image that has passed through the photographing lens is formed without pupil division.
  • Each pixel included in the first and second pixel groups may be arranged so that the pixels of the first pixel group and of the second pixel group are adjacent to each other within a predetermined direction, and may be arranged between pixels included in the third pixel group.
  • A color filter may further be included that is provided over the first to third pixel groups and that assigns a specific primary color to each pixel included in the first and second pixel groups and to the pixels of the third pixel group adjacent to them.
  • the specific primary color may be green in the thirteenth aspect of the present invention.
  • the primary color array of the color filter may be a Bayer array in the thirteenth or fourteenth aspect of the present invention.
  • In the imaging device, each pixel included in the first and second pixel groups may be arranged, together with the third pixel group, in a matrix shifted by half a pixel between the pixels included in the third pixel group; and a color filter may further be included that is provided over fourth and fifth pixel groups, obtained by classifying the constituent pixels of the image sensor into two types whose constituent pixels are alternately shifted by half a pixel in the row direction and the column direction, and that assigns a primary color of the Bayer array to each of them.
  • In the imaging device, the first and second pixel groups may be arranged at positions corresponding to green filters of the color filter. Thereby, the image quality can be further improved with a simple configuration as compared with the case without this configuration.
  • An image processing method according to an aspect of the present invention includes: a generation step of generating a first display image based on an image signal output from an imaging element including first and second pixel groups on which a subject image that has passed through first and second regions of a photographing lens is formed after being pupil-divided, and of generating, based on first and second image signals output from the first and second pixel groups, a second display image used for in-focus confirmation; a non-holding-type first instruction step of instructing display of the first display image; a second instruction step of instructing display of the second display image; and a display control step of, in a state where display of the second display image is instructed by the second instruction step, performing control to display on a display unit the first display image generated by the generation step without displaying the second display image when display of the first display image is instructed by the first instruction step, and, when the display instruction of the first display image by the first instruction step is canceled in a state where display of the second display image is instructed, performing control to display on the display unit the first display image generated by the generation step and to display the second display image generated by the generation step in a display area of the first display image.
  • An image processing program according to an aspect of the present invention causes a computer to function as the generation unit and the display control unit in the image processing device according to any one of claims 1 to 4. Thereby, display and non-display of the image used for in-focus confirmation can be switched at an appropriate time as compared with the case where this configuration is not provided.
  • FIG. 2 is a schematic layout diagram illustrating an example of the layout of the color filters provided in the image sensor included in the imaging device according to the first embodiment.
  • FIG. 5 is a diagram for explaining a method of determining a correlation direction from the pixel values of 2 × 2 G pixels included in the color filter illustrated in FIG. 4.
  • A diagram for explaining the concept of a basic array pattern.
  • A diagram for explaining the concept of the basic array pattern included in the color filter shown in FIG. 4.
  • A schematic configuration diagram showing an example of an arrangement of pixels in the image sensor.
  • FIG. 6 is a screen diagram illustrating an example of a live view image, displayed on the display unit of the imaging apparatus according to the first embodiment, that is out of focus.
  • A screen diagram illustrating an example of a live view image, displayed on the display unit of the imaging apparatus according to the first embodiment, that is in focus.
  • A schematic configuration diagram showing a modification of the pixel arrangement.
  • FIG. 20 is a schematic configuration diagram illustrating an example of the configuration of the phase difference pixels (first pixel and second pixel) included in the imaging element illustrated in FIG. 19.
  • FIG. 20 is a schematic configuration diagram illustrating a modified example of the arrangement of the phase difference pixels included in the image sensor illustrated in FIG. 19.
  • FIG. 6 is a schematic diagram illustrating an example of a split image according to the first embodiment in which the first image and the second image are divided into odd lines and even lines and arranged alternately.
  • FIG. 9 is a schematic diagram showing an example of a split image according to the first embodiment that is divided by an oblique dividing line inclined with respect to the horizontal direction.
  • FIG. 10 is a schematic diagram illustrating an example of a split image according to the first embodiment that is divided by grid-like dividing lines.
  • FIG. 10 is a schematic diagram illustrating a modification of the split image according to the first embodiment that is formed in a checkered pattern.
  • A flowchart showing an example of the flow of the image output process according to the second embodiment.
  • A flowchart showing an example of the flow of the image output process according to the third embodiment.
  • A flowchart showing an example of the flow of the image output process according to the fourth embodiment.
  • A perspective view showing an example of the appearance of the smartphone according to the fifth embodiment.
  • A block diagram showing an example of the configuration of the main part of the electrical system of the smartphone according to the fifth embodiment.
  • FIG. 1 is a perspective view illustrating an example of an appearance of the imaging apparatus 100 according to the first embodiment
  • FIG. 2 is a rear view of the imaging apparatus 100 illustrated in FIG.
  • The imaging apparatus 100 is an interchangeable-lens camera, namely a digital camera that includes a camera body 200 and an interchangeable lens 300 (a photographing lens including a focus lens 302) replaceably attached to the camera body 200, and that omits a reflex mirror.
  • the camera body 200 is provided with a hybrid finder (registered trademark) 220.
  • the hybrid viewfinder 220 here refers to a viewfinder in which, for example, an optical viewfinder (hereinafter referred to as “OVF”) and an electronic viewfinder (hereinafter referred to as “EVF”) are selectively used.
  • the camera body 200 and the interchangeable lens 300 are interchangeably mounted by combining a mount 256 provided in the camera body 200 and a mount 346 (see FIG. 3) on the interchangeable lens 300 side corresponding to the mount 256.
  • The lens barrel of the interchangeable lens 300 is provided with a focus ring; when the focus ring is rotated, the focus lens 302 is moved in the optical axis direction, so that subject light can be imaged on an imaging element 20 (described later; see FIG. 3) at a focus position corresponding to the subject distance.
  • The front surface of the camera body 200 is provided with the OVF viewfinder window 241 included in the hybrid viewfinder 220.
  • a finder switching lever (finder switching unit) 214 is provided on the front surface of the camera body 200. When the viewfinder switching lever 214 is rotated in the direction of the arrow SW, it switches between an optical image that can be viewed with OVF and an electronic image (live view image) that can be viewed with EVF (described later).
  • the optical axis L2 of the OVF is an optical axis different from the optical axis L1 of the interchangeable lens 300.
  • a release switch 211 and a dial 212 for setting a shooting mode, a playback mode, and the like are mainly provided on the upper surface of the camera body 200.
  • The release switch 211 can be pressed in two stages: from the standby position to an intermediate position (half-pressed position), which is an example of the first instruction position, and beyond the intermediate position to a final pressed position (fully-pressed position), which is an example of the second instruction position.
  • a state where the button is pressed from the standby position to the half-pressed position is referred to as “half-pressed state”
  • “a state where the button is pressed from the standby position to the fully-pressed position” is referred to as “full-pressed state”.
  • The shooting conditions are adjusted by pressing the release switch 211 halfway, and then exposure (shooting) is performed when the release switch 211 is subsequently fully pressed.
  • the “imaging condition” referred to here indicates, for example, at least one of an exposure state and a focused state.
  • the exposure state and the focus state are adjusted. In other words, by pressing the release button 211 halfway, the AE (Automatic Exposure) function is activated and the exposure state (shutter speed, aperture state) is set, and then the AF function is activated and the focus is controlled.
  • On the back of the camera body 200, an OVF viewfinder eyepiece 242, a display unit 213, a cross key 222, a MENU/OK key 224, and a BACK/DISP button 225 are provided.
  • the cross key 222 functions as a multi-function key that outputs various command signals such as menu selection, zoom and frame advance.
  • The MENU/OK key 224 is an operation key having both a function as a menu button for instructing display of a menu on the screen of the display unit 213 and a function as an OK button for instructing confirmation and execution of the selected content.
  • the BACK / DISP button 225 is used for deleting a desired object such as a selection item, canceling a designated content, or returning to the previous operation state.
  • the display unit 213 is realized by, for example, an LCD, and is used to display a live view image (through image) that is an example of a continuous frame image obtained by capturing a continuous frame in the shooting mode.
  • the display unit 213 is also used to display a still image that is an example of a single frame image obtained by capturing a single frame when a still image shooting instruction is given.
  • the display unit 213 is also used for displaying a playback image and a menu screen in the playback mode.
  • FIG. 3 is a block diagram showing an example of the electrical configuration (internal configuration) of the imaging apparatus 100 according to the first embodiment.
  • the imaging device 100 is a digital camera that records captured still images and moving images, and the overall operation of the camera is controlled by a CPU (central processing unit) 12.
  • the imaging apparatus 100 includes an operation unit 14 that is an example of a first instruction unit, a second instruction unit, and a setting unit according to the present invention.
  • The imaging apparatus 100 includes an interface unit 24, a memory 26, and an encoder 34.
  • The imaging apparatus 100 includes display control units 36A and 36B, which are examples of the display control unit according to the present invention.
  • The imaging apparatus 100 includes an eyepiece detection unit 37, which is an example of the detection unit according to the present invention.
  • the imaging apparatus 100 includes an image processing unit 28 that is an example of a generation unit according to the present invention.
  • In the first embodiment, the display control unit 36 is provided as a hardware configuration separate from the image processing unit 28.
  • However, the present invention is not limited to this; the image processing unit 28 may have the same function as the display control unit 36, in which case the display control unit 36 is unnecessary.
  • The CPU 12, the operation unit 14, the interface unit 24, the memory 26 (an example of a storage unit), the image processing unit 28, the encoder 34, the display control units 36A and 36B, the eyepiece detection unit 37, and the external interface (I/F) 39 are connected to one another via a bus 40.
  • the memory 26 includes a non-volatile storage area (such as an EEPROM) that stores parameters, programs, and the like, and a volatile storage area (such as an SDRAM) that temporarily stores various information such as images.
  • the CPU 12 performs focusing control by driving and controlling the focus adjustment motor so that the contrast value of the image obtained by imaging is maximized. Further, the CPU 12 calculates AE information that is a physical quantity indicating the brightness of an image obtained by imaging. When the release switch 211 is half-pressed, the CPU 12 derives the shutter speed and F value corresponding to the brightness of the image indicated by the AE information. Then, the exposure state is set by controlling each related part so that the derived shutter speed and F value are obtained.
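  • As an illustrative sketch of this contrast-maximizing focusing control (not code from the patent; the capture interface, contrast metric, step size, and scan strategy are all assumptions), a simple hill-climb over focus motor positions could look like this:

```python
import numpy as np

def contrast_value(img):
    # Contrast metric: mean squared horizontal/vertical differences
    # (one common choice; the patent only speaks of a "contrast value").
    gx = np.diff(img.astype(np.float64), axis=1)
    gy = np.diff(img.astype(np.float64), axis=0)
    return (gx ** 2).mean() + (gy ** 2).mean()

def contrast_af(capture_frame, lo=0, hi=1000, step=50):
    """Hill-climb the focus motor position to maximize contrast.

    capture_frame(pos) -> 2-D luminance array is a hypothetical interface
    standing in for "drive the focus adjustment motor, then read an image
    from the sensor".
    """
    best_pos, best_c = lo, -1.0
    pos = lo
    while pos <= hi:
        c = contrast_value(capture_frame(pos))
        if c > best_c:
            best_pos, best_c = pos, c
        pos += step
    # A real implementation would refine around best_pos with smaller steps.
    return best_pos
```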
  • the operation unit 14 is a user interface operated by the operator when giving various instructions to the imaging apparatus 100. Various instructions received by the operation unit 14 are output as operation signals to the CPU 12, and the CPU 12 executes processing according to the operation signals input from the operation unit 14.
  • the operation unit 14 includes a release switch 211, a dial (focus mode switching unit) 212 for selecting a shooting mode, a display unit 213, a viewfinder switching lever 214, a cross key 222, a MENU / OK key 224, and a BACK / DISP button 225. .
  • the operation unit 14 also includes a touch panel that accepts various types of information. This touch panel is overlaid on the display screen of the display unit 213, for example.
  • the operation unit 14 also includes a depth-of-field confirmation button that is pressed when confirming the depth of field on the screen of the display unit 213, for example.
  • When the depth-of-field confirmation button is pressed, the aperture of the photographing lens is adjusted so that the F value set for use when capturing a still image is applied.
  • Image light representing the subject is formed on the light receiving surface of a color imaging element (for example, a CMOS sensor) 20 via the photographing lens 16, which includes a manually movable focus lens, and the shutter 18.
  • the signal charge accumulated in the image sensor 20 is sequentially read out as a digital signal corresponding to the signal charge (voltage) by a read signal applied from the device control unit 22.
  • the imaging element 20 has a so-called electronic shutter function, and controls the charge accumulation time (shutter speed) of each photosensor according to the timing of the readout signal by using the electronic shutter function.
  • the image sensor 20 according to the first embodiment is a CMOS image sensor, but is not limited thereto, and may be a CCD image sensor.
  • the image sensor 20 is provided with a color filter 21 shown in FIG. 4 as an example.
  • FIG. 4 schematically shows an example of the arrangement of the color filters 21.
  • (4896 × 3264) pixels are adopted as an example of the number of pixels, and 3:2 is adopted as the aspect ratio.
  • the number of pixels and the aspect ratio are not limited thereto.
  • The color filter 21 includes a first filter G corresponding to G (green), which contributes most to obtaining a luminance signal, a second filter R corresponding to R (red), and a third filter B corresponding to B (blue).
  • The arrangement pattern of the first filter G (hereinafter referred to as the G filter), the second filter R (hereinafter referred to as the R filter), and the third filter B (hereinafter referred to as the B filter) is classified into a first array pattern A and a second array pattern B.
  • In the first array pattern A, the G filters are arranged at the four corners and the center of the 3 × 3 pixel square array.
  • In the first array pattern A, the R filters are arranged on the central column of the square array (the vertical direction is an example of the column direction).
  • In the first array pattern A, the B filters are arranged on the central row of the square array (the horizontal direction is an example of the row direction).
  • The second array pattern B is a pattern in which the arrangement of the G filters is the same as in the first array pattern A and the positions of the R filters and the B filters are interchanged.
  • the color filter 21 includes a basic array pattern C composed of a square array pattern corresponding to 6 ⁇ 6 pixels.
  • The basic array pattern C is a 6 × 6 pixel pattern in which the first array pattern A and the second array pattern B are arranged point-symmetrically, and the basic array pattern C is repeatedly arranged in the horizontal direction and the vertical direction.
  • Accordingly, the color filter array of a reduced image after thinning processing can be made the same as the color filter array before the thinning processing, and a common processing circuit can be used.
  • In the color filter 21, the G filter, corresponding to the color that contributes most to obtaining a luminance signal (the G color in the first embodiment), is arranged in every horizontal, vertical, and diagonal line of the color filter array. Therefore, the reproduction accuracy of the synchronization processing in the high-frequency region can be improved regardless of the direction of the high frequency.
  • In the color filter 21, the R filter and the B filter, corresponding to two or more colors other than the G color (the R and B colors in the first embodiment), are arranged within every horizontal and vertical line of the color filter array. For this reason, the occurrence of color moire (false color) is suppressed, so that an optical low-pass filter for suppressing false color need not be arranged in the optical path from the incident surface of the optical system to the imaging surface. Even when an optical low-pass filter is applied, one with a weak high-frequency-cutting function for preventing false colors can be used, so the resolution is not impaired.
  • The basic array pattern C can also be understood as an array in which a 3 × 3 pixel first array pattern A, surrounded by a broken-line frame in the figure, and a 3 × 3 pixel second array pattern B, surrounded by a one-dot chain line, are arranged alternately.
  • G filters which are luminance pixels, are arranged at the four corners and the center, and are arranged on both diagonal lines.
  • the B filter is arranged in the horizontal direction and the R filter is arranged in the vertical direction with the central G filter interposed therebetween.
  • the R filters are arranged in the horizontal direction and the B filters are arranged in the vertical direction across the center G filter. That is, in the first arrangement pattern A and the second arrangement pattern B, the positional relationship between the R filter and the B filter is reversed, but the other arrangements are the same.
  • As shown in FIG. 5 as an example, since the first array pattern A and the second array pattern B are arranged alternately in the horizontal and vertical directions, the G filters at their four corners form a square array of G filters corresponding to 2 × 2 pixels.
  • Using this 2 × 2 pixel G block, the absolute difference of the G pixel values in the horizontal direction, the absolute difference of the G pixel values in the vertical direction, and the absolute differences of the G pixel values in the diagonal directions (upper-right and upper-left diagonals) are calculated, and it can be determined that there is a correlation in the direction with the smallest absolute difference.
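  • As a concrete illustration (not code from the patent), this determination can be sketched as follows in Python; the rule that the direction with the smallest absolute difference is taken as the correlation direction, and the tie-breaking order, are assumptions of this sketch.

```python
import numpy as np

def correlation_direction(g):
    """g: 2x2 array of G pixel values [[g00, g01], [g10, g11]].

    Returns the direction in which the absolute difference of the
    G pixel values is smallest, i.e. the assumed correlation direction.
    """
    g = np.asarray(g, dtype=np.float64)
    diffs = {
        "horizontal":           abs(g[0, 0] - g[0, 1]) + abs(g[1, 0] - g[1, 1]),
        "vertical":             abs(g[0, 0] - g[1, 0]) + abs(g[0, 1] - g[1, 1]),
        "upper-right diagonal": abs(g[1, 0] - g[0, 1]),
        "upper-left diagonal":  abs(g[0, 0] - g[1, 1]),
    }
    return min(diffs, key=diffs.get)

# Example: across a vertical edge the vertical differences stay smallest.
print(correlation_direction([[10, 200], [12, 198]]))  # -> "vertical"
```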
  • the basic array pattern C of the color filter 21 is arranged point-symmetrically with respect to the center of the basic array pattern C (the centers of the four G filters).
  • The first array pattern A and the second array pattern B in the basic array pattern C are also arranged point-symmetrically with respect to their central G filters, so the circuit scale of subsequent processing circuits can be reduced or simplified.
  • the color filter array of the first and third lines of the first to sixth lines in the horizontal direction is GRGGBG.
  • the color filter array of the second line is BGBRGR.
  • the color filter array of the fourth and sixth lines is GBGGRG.
  • the color filter array of the fifth line is RGRBGB.
  • basic array patterns C, C ′, and C ′′ are shown.
  • the basic array pattern C ′ indicates a pattern obtained by shifting the basic array pattern C by one pixel in the horizontal direction and the vertical direction
  • the basic array pattern C ′′ indicates a pattern obtained by shifting the basic array pattern C by two pixels each in the horizontal direction and the vertical direction.
  • the color filter 21 has the same color filter array even if the basic array patterns C ′ and C ′′ are repeatedly arranged in the horizontal direction and the vertical direction.
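  • The basic array pattern C described above can be reconstructed from the 3 × 3 array patterns A and B. The following sketch (illustrative Python, not from the patent) builds the 6 × 6 pattern, prints the per-line color strings, and checks that the shifted patterns C′ and C′′ are cyclic shifts of C, so repeating them yields the same color filter array.

```python
import numpy as np

A = np.array([["G", "R", "G"],
              ["B", "G", "B"],
              ["G", "R", "G"]])                          # first array pattern A
B = np.where(A == "R", "B", np.where(A == "B", "R", A))  # R and B interchanged

# Basic array pattern C: A and B arranged alternately (point-symmetric).
C = np.block([[A, B], [B, A]])
for row in C:
    print("".join(row))  # GRGGBG / BGBRGR / GRGGBG / GBGGRG / RGRBGB / GBGGRG

# C' (shifted by 1 pixel) and C'' (shifted by 2 pixels) are cyclic shifts
# of C, so tiling them reproduces the same color filter array.
tile = np.tile(C, (3, 3))
for s in (1, 2):
    C_shift = tile[s:s + 6, s:s + 6]   # basic array pattern C' / C''
    assert np.array_equal(C_shift, np.roll(C, (-s, -s), axis=(0, 1)))
```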
  • the imaging apparatus 100 has a phase difference AF function.
  • the image sensor 20 includes a plurality of phase difference detection pixels used when the phase difference AF function is activated.
  • the plurality of phase difference detection pixels are arranged in a predetermined pattern.
  • FIG. 7 schematically shows an example of a correspondence relationship between a part of the color filter 21 and a part of the pixels for detecting the phase difference.
  • Each phase difference detection pixel is either a first pixel L, in which the left half of the pixel in the horizontal direction is shielded, or a second pixel R, in which the right half in the horizontal direction is shielded.
  • Hereinafter, when it is not necessary to distinguish between the first pixel L and the second pixel R, they are referred to as "phase difference pixels".
  • FIG. 8 shows an example of the first pixel L and the second pixel R arranged in the image sensor 20.
  • the first pixel L has a light shielding member 20A
  • the second pixel R has a light shielding member 20B.
  • the light shielding member 20A is provided on the front side (microlens L side) of the photodiode PD, and shields the left half of the light receiving surface.
  • the light shielding member 20B is provided on the front side of the photodiode PD and shields the right half of the light receiving surface.
  • the microlens L and the light shielding members 20A and 20B function as a pupil dividing unit, the first pixel L receives only the left side of the optical axis of the light beam passing through the exit pupil of the photographing lens 16, and the second pixel R is Only the right side of the optical axis of the light beam passing through the exit pupil of the photographing lens 16 is received. In this way, the light beam passing through the exit pupil is divided into the left and right by the microlens L and the light shielding members 20A and 20B, which are pupil dividing portions, and enter the first pixel L and the second pixel R, respectively.
  • The subject image corresponding to the left half of the light beam passing through the exit pupil of the photographing lens 16 and the subject image corresponding to the right half of the light beam are formed at the same position on the image sensor 20 in an in-focus portion, whereas in a front-focused or rear-focused portion they are incident on different positions on the image sensor 20 (the phase is shifted).
  • the subject image corresponding to the left half light beam and the subject image corresponding to the right half light beam can be acquired as parallax images (left eye image and right eye image) having different parallaxes.
  • the imaging apparatus 100 detects a phase shift amount based on the pixel value of the first pixel L and the pixel value of the second pixel R by using the phase difference AF function. Then, the focal position of the photographing lens is adjusted based on the detected phase shift amount.
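  • A minimal sketch of the phase shift detection (illustrative Python; the patent does not prescribe a particular matching method): the displacement between a line of first-pixel values and the corresponding line of second-pixel values is found by minimizing the sum of absolute differences over a search range. The detected shift would then be converted into a defocus amount and a lens drive direction.

```python
import numpy as np

def phase_shift(left_line, right_line, max_shift=16):
    """Estimate the horizontal phase shift between a line of first-pixel
    (left-pupil) values and a line of second-pixel (right-pupil) values.

    Both lines must have the same length, longer than max_shift.
    Returns the displacement (in pixels) minimizing the mean absolute
    difference; 0 means in focus, the sign tells front/rear focus.
    """
    left = np.asarray(left_line, dtype=np.float64)
    right = np.asarray(right_line, dtype=np.float64)
    best_d, best_cost = 0, np.inf
    for d in range(-max_shift, max_shift + 1):
        a = left[max(0, d):len(left) + min(0, d)]
        b = right[max(0, -d):len(right) + min(0, -d)]
        cost = np.abs(a - b).mean()
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```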
  • the light shielding members 20A and 20B are referred to as “light shielding members” without reference numerals.
  • the image sensor 20 is classified into a first pixel group, a second pixel group, and a third pixel group.
  • the first pixel group refers to a plurality of first pixels L, for example.
  • the second pixel group refers to a plurality of second pixels R, for example.
  • the third pixel group refers to, for example, a plurality of normal pixels (an example of a third pixel).
  • the “normal pixel” here refers to, for example, a pixel other than the phase difference pixel (for example, a pixel having no light shielding members 20A and 20B).
  • the RAW image output from the first pixel group is referred to as a “first image”
  • the RAW image output from the second pixel group is referred to as a “second image”
  • the third image The RAW image output from the pixel group is referred to as a “third image”.
  • Each pixel included in the first and second pixel groups is arranged at a position where the positions in the horizontal direction are aligned within one pixel between the first pixel group and the second pixel group.
  • each pixel included in the first and second pixel groups is arranged at a position where the positions in the vertical direction are aligned within the one pixel between the first pixel group and the second pixel group.
  • The first pixels L and the second pixels R are alternately arranged linearly in the horizontal direction and the vertical direction at intervals of a plurality of pixels.
  • In the example shown in FIG. 7, the positions of the pixels included in the first and second pixel groups are aligned within one pixel in each of the horizontal direction and the vertical direction, but it is sufficient that they be aligned within a predetermined number of pixels (for example, within two pixels) in at least one of the horizontal direction and the vertical direction. Nevertheless, it is preferable that the positions of the pixels included in the first and second pixel groups be aligned within one pixel in each of the horizontal and vertical directions, as shown in FIG. 7 as an example.
  • The phase difference pixels are provided at pixels of the G filter in the square array corresponding to 2 × 2 pixels. That is, in the example shown in FIG. 7, the pixel at the upper right corner (in the front view of the figure) of each 2 × 2 pixel G filter block is assigned to a phase difference pixel, normal pixels are arranged between the phase difference pixels, and the remaining pixels of the 2 × 2 pixel G filter block are assigned to normal pixels. Further, in the example shown in FIG. 7, the rows of phase difference pixels, in which the first pixels L and the second pixels R are alternately arranged in the horizontal direction, are grouped into sets of two rows, and the sets are arranged in the vertical direction at intervals of a predetermined number of pixels (eight pixels in the example shown in FIG. 7).
  • In this way, the light shielding member is provided at the pixel in the upper right corner of each 2 × 2 pixel G filter block, and the phase difference pixels are spaced apart by a plurality of pixels in both the vertical direction and the horizontal direction.
  • the interpolation accuracy in the case of interpolating the pixel values of the phase difference pixels from the pixel values of the normal pixels can be improved.
  • Since the pixels included in the first to third pixel groups are arranged so that the normal pixels used for interpolation do not overlap between phase difference pixels, a further improvement in interpolation accuracy can be expected.
  • the image sensor 20 outputs a first image (a digital signal indicating the pixel value of each first pixel) from the first pixel group, and outputs a second image (from the second pixel group). A digital signal indicating the pixel value of each second pixel). Further, the image sensor 20 outputs a third image (a digital signal indicating the pixel value of each normal pixel) from the third pixel group. Note that the third image output from the third pixel group is a chromatic image, for example, a color image having the same color array as the normal pixel array.
  • the first image, the second image, and the third image output from the image sensor 20 are temporarily stored in a volatile storage area in the memory 26 via the interface unit 24.
  • the image processing unit 28 includes a normal processing unit 30.
  • the normal processing unit 30 processes the R, G, and B signals corresponding to the third pixel group to generate a chromatic color normal image that is an example of the first display image.
  • the image processing unit 28 includes a split image processing unit 32.
  • the split image processing unit 32 processes the G signal corresponding to the first pixel group and the second pixel group to generate an achromatic split image that is an example of a second display image.
  • The image processing unit 28 according to the first embodiment is realized by an ASIC (Application Specific Integrated Circuit), which is an integrated circuit in which a plurality of functions related to image processing are integrated into one.
  • the hardware configuration is not limited to this, and may be another hardware configuration such as a programmable logic device, a computer including a CPU, a ROM, and a RAM.
  • the hybrid viewfinder 220 has an LCD 247 that displays an electronic image.
  • the number of pixels in a predetermined direction on the LCD 247 (for example, the number of pixels in the horizontal direction, which is a parallax generation direction) is smaller than the number of pixels in the same direction on the display unit 213.
  • The display control unit 36A is connected to the display unit 213, and the display control unit 36B is connected to the LCD 247; by selectively controlling the LCD 247 and the display unit 213, an image is displayed on the LCD 247 or on the display unit 213.
  • the display unit 213 and the LCD 247 are referred to as “display devices” when it is not necessary to distinguish between them.
  • the imaging apparatus 100 is configured to be able to switch between a manual focus mode and an autofocus mode by a dial 212 (focus mode switching unit).
  • the display control unit 36 causes the display device to display a live view image obtained by combining the split images.
  • the CPU 12 operates as a phase difference detection unit and an automatic focus adjustment unit.
  • the phase difference detection unit detects a phase difference between the first image output from the first pixel group and the second image output from the second pixel group.
  • the automatic focus adjustment unit controls the lens driving unit (not shown) from the device control unit 22 via the mounts 256 and 346 so that the defocus amount of the photographing lens 16 is zero based on the detected phase difference. Then, the photographing lens 16 is moved to the in-focus position.
  • the above “defocus amount” refers to, for example, the amount of phase shift between the first image and the second image.
  • the eyepiece detection unit 37 detects that a person (for example, a photographer) has looked into the viewfinder eyepiece 242 and outputs the detection result to the CPU 12. Therefore, the CPU 12 can grasp whether or not the finder eyepiece unit 242 is used based on the detection result of the eyepiece detection unit 37.
  • the external I / F 39 is connected to a communication network such as a LAN (Local Area Network) or the Internet, and controls transmission / reception of various information between the external device (for example, a printer) and the CPU 12 via the communication network. Therefore, when a printer is connected as an external device, the imaging apparatus 100 can output a captured still image to the printer for printing. Further, when a display is connected as an external device, the imaging apparatus 100 can output and display a captured still image or live view image on the display.
  • FIG. 9 is a functional block diagram illustrating an example of main functions of the imaging apparatus 100 according to the first embodiment.
  • Components identical to those described above are denoted by the same reference numerals.
  • The normal processing unit 30 and the split image processing unit 32 each have a WB gain unit, a gamma correction unit, and a synchronization processing unit (not shown), and each processing unit performs signal processing in sequence on the original digital signal (RAW image) temporarily stored in the memory 26. That is, the WB gain unit executes white balance (WB) correction by adjusting the gains of the R, G, and B signals.
  • the gamma correction unit performs gamma correction on each of the R, G, and B signals that have been subjected to WB by the WB gain unit.
  • the synchronization processing unit performs color interpolation processing corresponding to the color filter array of the image sensor 20, and generates synchronized R, G, B signals.
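  • The chain of processing units just described can be pictured with the following sketch (illustrative Python; the gains, gamma value, and nearest-neighbor color interpolation are stand-ins for the actual circuits, which are not specified here). The mask argument is an array of "R"/"G"/"B" labels, such as a tiling of the basic array pattern C built earlier.

```python
import numpy as np

def wb_gain(raw, mask, gains):
    # White balance: apply a per-color gain to the single-channel RAW mosaic.
    out = raw.astype(np.float64)
    for color, g in gains.items():
        out[mask == color] *= g
    return out

def gamma_correct(x, gamma=2.2, white=4095.0):
    # Gamma correction of the WB-corrected signal (12-bit white level assumed).
    return white * (np.clip(x, 0, white) / white) ** (1.0 / gamma)

def synchronize(raw, mask):
    """Very crude color interpolation (demo only): fill each of R, G, B
    everywhere from the nearest pixel of that color. Real synchronization
    would exploit the correlation direction discussed earlier."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    yy, xx = np.mgrid[0:h, 0:w]
    for ch, color in enumerate("RGB"):
        ys, xs = np.nonzero(mask == color)
        d = (yy.ravel()[:, None] - ys) ** 2 + (xx.ravel()[:, None] - xs) ** 2
        rgb[..., ch] = raw[ys, xs][np.argmin(d, axis=1)].reshape(h, w)
    return rgb

# Example usage (hypothetical gains, small mosaic):
#   wb  = wb_gain(raw, mask, {"R": 1.9, "G": 1.0, "B": 1.6})
#   rgb = synchronize(gamma_correct(wb), mask)
```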
  • the normal processing unit 30 and the split image processing unit 32 perform image processing on the RAW image in parallel every time a RAW image for one screen is acquired by the image sensor 20.
  • The normal processing unit 30 receives the R, G, and B RAW images from the interface unit 24, and generates the pixel values at the positions of the pixels of the first and second pixel groups by interpolating them from peripheral pixels of the same color in the third pixel group (for example, adjacent G pixels), as shown in the figure as an example. Thereby, a normal image for recording can be generated based on the third image output from the third pixel group.
  • the normal processing unit 30 outputs the generated image data of the normal image for recording to the encoder 34.
  • the R, G, B signals processed by the normal processing unit 30 are converted (encoded) into recording signals by the encoder 34 and recorded in the recording unit 40 (see FIG. 7).
  • a normal image for display that is an image based on the third image processed by the normal processing unit 30 is output to the display control unit 36.
  • Hereinafter, when it is not necessary to use the terms "for recording" and "for display" to distinguish them, the recording normal image and the display normal image are each referred to simply as a "normal image".
  • The image sensor 20 can change the exposure condition (for example, the shutter speed of the electronic shutter) of the first pixel group and of the second pixel group independently, and can thereby acquire images with different exposure conditions simultaneously. Therefore, the image processing unit 28 can generate an image with a wide dynamic range based on the images with different exposure conditions. In addition, a plurality of images can be acquired simultaneously under the same exposure condition, and by adding these images a high-sensitivity image with little noise can be generated, or a high-resolution image can be generated.
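  • The two uses mentioned above can be sketched as follows (illustrative Python; the blending weight near saturation and the 12-bit white level are assumptions): fusing a short and a long exposure captured simultaneously by the two pixel groups into a wider-dynamic-range image, and averaging same-exposure frames to reduce noise.

```python
import numpy as np

def fuse_exposures(short_exp, long_exp, gain, white=4095.0):
    """Combine a short and a long exposure of the same scene.

    short_exp, long_exp: arrays from the two pixel groups;
    gain: exposure ratio (long / short). Near-saturated long-exposure
    pixels are faded over to scaled short-exposure values.
    """
    long_f = long_exp.astype(np.float64)
    short_f = short_exp.astype(np.float64) * gain
    w = np.clip((white - long_f) / (0.1 * white), 0.0, 1.0)  # fade near saturation
    return w * long_f + (1.0 - w) * short_f

def average_frames(frames):
    # Same exposure condition: averaging N frames reduces noise ~ 1/sqrt(N).
    return np.mean(np.stack(frames), axis=0)
```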
  • the split image processing unit 32 extracts the G signal of the first pixel group and the second pixel group from the RAW image once stored in the memory 26, and the G of the first pixel group and the second pixel group. An achromatic split image is generated based on the signal.
  • Each of the first pixel group and the second pixel group extracted from the RAW image is a pixel group including G filter pixels as described above. Therefore, the split image processing unit 32 can generate an achromatic left parallax image and an achromatic right parallax image based on the G signals of the first pixel group and the second pixel group.
  • the above “achromatic left parallax image” is referred to as a “left eye image”
  • the above “achromatic right parallax image” is referred to as a “right eye image”.
  • the split image processing unit 32 combines the left-eye image based on the first image output from the first pixel group and the right-eye image based on the second image output from the second pixel group to split the image. Generate an image.
  • The generated split image data is output to the display control unit 36.
  • The display control unit 36 generates display image data based on the recording image data corresponding to the third pixel group input from the normal processing unit 30 and on the split image data corresponding to the first and second pixel groups input from the split image processing unit 32. For example, the display control unit 36 combines the split image indicated by the image data input from the split image processing unit 32 into the display area of the normal image indicated by the recording image data corresponding to the third pixel group input from the normal processing unit 30. The image data obtained by the combination is then output to the display device; that is, the display control unit 36A outputs the image data to the display unit 213, and the display control unit 36B outputs the image data to the LCD 247.
  • the split image generated by the split image processing unit 32 is a multi-divided image obtained by combining a part of the left eye image and a part of the right eye image.
  • Examples of the “multi-divided image” mentioned here include split images shown in FIGS. 13A and 13B.
  • The split image shown in FIG. 13 is an image obtained by combining the upper half of the left-eye image and the lower half of the right-eye image; the two images divided in the vertical direction are shifted relative to each other in a predetermined direction (for example, the parallax generation direction) according to the in-focus state.
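  • A sketch of how such a split image can be assembled (illustrative Python; the function and variable names are not from the patent): the upper half is taken from the left-eye image and the lower half from the right-eye image, so residual parallax shows up as a horizontal break at the dividing line.

```python
import numpy as np

def make_split_image(left_eye, right_eye):
    """left_eye, right_eye: 2-D achromatic (G-based) parallax images of
    equal shape. Returns the two-way split image: when the scene is in
    focus the halves line up; when defocused they are displaced from
    each other in the parallax generation direction (horizontal)."""
    assert left_eye.shape == right_eye.shape
    h = left_eye.shape[0]
    split = np.empty_like(left_eye)
    split[: h // 2] = left_eye[: h // 2]    # upper half: left-eye image
    split[h // 2:] = right_eye[h // 2:]     # lower half: right-eye image
    return split
```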
  • the split image mode is not limited to the example shown in FIGS.
  • The split image may be an image obtained by combining a part of the left-eye image and a part of the right-eye image at positions corresponding to the position of a predetermined area of the display unit 213. In this case, for example, an image divided into four in the vertical direction is shifted in a predetermined direction (for example, the parallax generation direction) according to the in-focus state.
  • the method of combining the split image with the normal image is not limited to the combining method of fitting the split image in place of a part of the normal image.
  • a synthesis method in which a split image is superimposed on a normal image may be used.
  • a combining method may be used in which the transmittance of a part of the normal image on which the split image is superimposed and the split image are appropriately adjusted and superimposed.
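  • The combining methods above, together with the display areas of FIG. 11 (a rectangular split-image area at the center of the screen with the normal image around it), can be sketched as follows (illustrative Python; the size of the central area is an arbitrary choice, and the two images are assumed to have the same shape).

```python
import numpy as np

def center_rect(shape, frac=0.5):
    # Rectangular split-image display area at the center of the screen.
    h, w = shape[:2]
    dh, dw = int(h * frac), int(w * frac)
    y0, x0 = (h - dh) // 2, (w - dw) // 2
    return slice(y0, y0 + dh), slice(x0, x0 + dw)

def combine_fit(normal, split):
    """Fit the split image in place of part of the normal image."""
    out = normal.copy()
    ys, xs = center_rect(normal.shape)
    out[ys, xs] = split[ys, xs]
    return out

def combine_overlay(normal, split, alpha=1.0):
    """Superimpose the split image; alpha < 1 keeps the normal image
    partly visible through it (adjusted transmittance)."""
    out = normal.astype(np.float64)
    ys, xs = center_rect(normal.shape)
    out[ys, xs] = alpha * split[ys, xs] + (1.0 - alpha) * out[ys, xs]
    return out
```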
  • the hybrid finder 220 includes an OVF 240 and an EVF 248.
  • the OVF 240 is an inverse Galileo finder having an objective lens 244 and an eyepiece 246, and the EVF 248 has an LCD 247, a prism 245, and an eyepiece 246.
  • a liquid crystal shutter 243 is disposed in front of the objective lens 244, and the liquid crystal shutter 243 shields light so that an optical image does not enter the objective lens 244 when the EVF 248 is used.
  • the prism 245 reflects an electronic image or various information displayed on the LCD 247 and guides it to the eyepiece 246, and combines the optical image and information (electronic image and various information) displayed on the LCD 247.
  • Each time the finder switching lever 214 is rotated, the mode is switched alternately between an OVF mode, in which an optical image can be visually recognized with the OVF 240, and an EVF mode, in which an electronic image can be visually recognized with the EVF 248.
  • In the OVF mode, the display control unit 36B controls the liquid crystal shutter 243 to be in a non-light-shielding state so that the optical image can be visually recognized from the eyepiece unit, and displays only the split image on the LCD 247. Thereby, a finder image in which the split image is superimposed on a part of the optical image can be displayed.
  • In the EVF mode, the display control unit 36B controls the liquid crystal shutter 243 to be in a light-shielding state so that only the electronic image displayed on the LCD 247 can be visually recognized from the eyepiece unit.
  • Image data equivalent to the image data combined with the split image that is output to the display unit 213 is input to the LCD 247, so that an electronic image in which the split image is combined with a part of the normal image can be displayed in the same manner as on the display unit 213.
  • FIG. 11 shows an example of display areas of the normal image and the split image in the display device.
  • the display device displays the input split image within a rectangular frame (a split image display area) at the center of the screen.
  • the display device displays the input normal image in the outer peripheral area of the split image (normal image display area).
  • the display device displays the input normal image in the entire screen area (full screen display).
  • When only the split image is input, the display device displays the input split image in the rectangular frame at the center of the screen shown in FIG. 11 as an example, and leaves the outer peripheral area blank. Note that the edge line representing the rectangular frame at the center of the screen is not actually displayed, but is shown in FIG. 11 for convenience of explanation.
  • Next, an image output process performed by the CPU 12 each time a normal image and a split image are generated by the image processing unit 28 (for example, every time a live view image for one frame is generated) will be described with reference to FIG. 12.
  • The image output process is performed by the imaging apparatus 100 when the CPU 12 executes the image output processing program.
  • The image output processing program is stored in a predetermined storage area (for example, the memory 26).
  • The image processing unit 28 may execute the image output process instead.
  • First, in step 300, the CPU 12 determines whether or not display of the split image generated by the image processing unit 28 has been instructed via the operation unit 14.
  • The instruction via the operation unit 14 is preferably a holding type instruction.
  • The instruction via the operation unit 14 may be given by a hard key or by a soft key.
  • When a hard key is used, an alternate operation type switch (holding type switch) is preferably applied; this refers to a switch that, when pushed to a predetermined position, is maintained in an operating state (here, a state in which display of the split image is instructed) until a release operation is performed.
  • The alternate operation type switch may be a lock type that is held in the pushed-in position when the pressing operation is released and returns to the original state when pressed again, or a non-lock type that returns to the free position when the pressing operation is released.
  • When a soft key is used, the CPU 12 causes the display unit 213 to display a soft key (an example of a second instruction unit) that is pressed when instructing display of the split image, and it is determined that split image display is instructed when the soft key is operated.
  • If split image display is instructed via the operation unit 14 in step 300, the determination is affirmative and the routine proceeds to step 302; otherwise the determination is negative and the process proceeds to step 312.
  • In step 302, the CPU 12 determines whether or not display of the normal image generated by the image processing unit 28 has been instructed via the operation unit 14.
  • The instruction via the operation unit 14 is preferably a non-holding type instruction.
  • The instruction via the operation unit 14 may be given by a hard key or by a soft key.
  • When a hard key is used, a momentary operation type switch (non-holding type switch) is preferably applied; this refers to a switch that maintains an operating state (here, as an example, a state instructing display of the normal image) only while pushed to a predetermined position.
  • The momentary operation type switch may be a push-pull type that is pressed when instructing display of the normal image and pulled out when canceling the instruction, or may be of another type.
  • A switch that realizes the same function as a hard key in a software configuration may also be applied.
  • In that case, the CPU 12 causes the display unit 213 to display a soft key (an example of a first instruction unit) that is pressed when instructing display of the normal image, and it is determined that display of the normal image is instructed when the soft key is operated. If display of the normal image is instructed via the operation unit 14 in step 302, the determination is affirmative and the process proceeds to step 306; otherwise the determination is negative and the process proceeds to step 308.
  • In step 306, the CPU 12 controls the image processing unit 28 to output the normal image to the display control units 36A and 36B and to discard the split image, and then proceeds to step 310.
  • When this step 306 is executed, the image processing unit 28 outputs the generated normal image to the display control units 36A and 36B and discards the generated split image.
  • the display control unit 36A outputs the input normal image to the display unit 213, thereby causing the display unit 213 to display the normal image. In this case, the display unit 213 displays a normal image in the entire area of the screen.
  • the display control unit 36B outputs the input normal image to the LCD 247 to display the normal image on the LCD 247.
  • the LCD 247 displays a normal image in the entire area of the screen.
  • In step 308, the CPU 12 controls the image processing unit 28 to output the normal image and the split image to the display control units 36A and 36B, and then proceeds to step 310.
  • the image processing unit 28 outputs the generated normal image and split image to the display control units 36A and 36B.
  • the display control unit 36A causes the display unit 213 to display the input normal image and split image.
  • the display unit 213 displays a normal image in the normal image display area shown in FIG. 11 as an example, and displays the split image in the split image display area shown in FIG. 11 as an example.
  • the display control unit 36B causes the LCD 247 to display the input normal image and split image.
  • the LCD 247 displays the normal image in the normal image display area shown in FIG. 11 as an example, and displays the split image in the split image display area shown in FIG. 11 as an example.
  • In step 312, the CPU 12 performs the same processing as in step 306, and then proceeds to step 310.
  • In step 310, the CPU 12 determines whether or not a condition for ending the image output process (end condition) is satisfied.
  • Examples of the end condition include an instruction to end the image output process being given via the operation unit 14, and the normal image and split image no longer being generated by the image processing unit 28. If the end condition is not satisfied in step 310, the determination is negative and the process returns to step 300; if it is satisfied, the determination is affirmative and the image output process ends. The branching of this process is summarized in the sketch below.
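The following sketch reduces the branching of steps 300 to 312 to plain Python; the function and argument names are hypothetical, and only the control flow is taken from the description above.

```python
def image_output_step(split_instructed: bool, normal_instructed: bool) -> str:
    """One pass of the image output process (steps 300-312), reduced to its branches."""
    # Step 300: has split-image display been instructed via the operation unit 14?
    if not split_instructed:
        return "step 312: output normal image only (split image discarded)"
    # Step 302: has normal-image display been instructed while split display is held?
    if normal_instructed:
        return "step 306: output normal image only (split image discarded)"
    return "step 308: output normal image and split image"

# Example: split display held, normal display instruction canceled -> both images shown.
print(image_output_step(split_instructed=True, normal_instructed=False))
```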
  • When step 308 is executed by the CPU 12, a live view image is displayed on the display unit 213 and the hybrid viewfinder 220 as shown in FIGS. 13A and 13B as an example.
  • The split image is displayed in the inner area of the frame 60 at the center of the screen, corresponding to the split image display area shown in FIG. 11, and the normal image is displayed in the area outside the frame 60, corresponding to the normal image display area.
  • The split image consists of the image (parallax image) of the upper half 60A of the frame 60 taken from the left-eye image corresponding to the first image output from the first pixel group, and the image (parallax image) of the lower half 60B of the frame 60 taken from the right-eye image corresponding to the second image output from the second pixel group.
  • When the photographing lens 16 is not focused on the subject corresponding to the image in the frame 60, as shown in FIG. 13A, the images at the boundary between the parallax image of the upper half 60A and the parallax image of the lower half 60B of the split image are shifted in the parallax generation direction (for example, the horizontal direction).
  • The images at the boundary between the normal image and the split image are also shifted in the parallax generation direction. This indicates that a phase difference has occurred, and the photographer can visually recognize the phase difference and the parallax generation direction through the split image.
  • When the photographing lens 16 is focused on the subject corresponding to the image in the frame 60, as shown in FIG. 13B, the images at the boundary between the parallax image of the upper half 60A and the parallax image of the lower half 60B match.
  • The images at the boundary between the normal image and the split image also match. This indicates that no phase difference has occurred, and the photographer can visually recognize through the split image that no phase difference has occurred.
  • the photographer can check the in-focus state of the photographing lens 16 by the split image displayed on the display device.
  • The focus shift amount (defocus amount) can then be reduced to zero by manually operating the focus ring 302 of the photographing lens 16 (a rough sketch of measuring the boundary shift follows below).
  • Moreover, since the normal image and the split image can each be displayed as color images without color misregistration, manual focus adjustment by the photographer can be supported by the color split image.
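The boundary shift described above can be made quantitative. The sketch below estimates, assuming grayscale rows taken just above and just below the split boundary, the horizontal displacement between the upper (left-eye) and lower (right-eye) halves; a result of zero corresponds to the matched boundary of FIG. 13B. The function name and the sum-of-absolute-differences criterion are illustrative choices, not the disclosed method.

```python
import numpy as np

def estimate_parallax_shift(upper_row, lower_row, max_shift=16):
    """Estimate the horizontal shift between the boundary rows of the two halves.

    Rows must be longer than 2 * max_shift. Returns the shift (in pixels) that
    minimizes the mean absolute difference; 0 means the boundary images match.
    """
    upper = np.asarray(upper_row, dtype=float)
    lower = np.asarray(lower_row, dtype=float)
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        # trim both ends so np.roll wrap-around never enters the comparison
        err = np.mean(np.abs(upper[max_shift:-max_shift] -
                             np.roll(lower, s)[max_shift:-max_shift]))
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift
```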
  • As described above, the display device does not display the split image when display of the normal image is instructed while display of the split image is instructed, and displays both the normal image and the split image when display of the normal image is not instructed.
  • The case where display of the normal image is not instructed refers, for example, to the case where the display instruction for the normal image has been canceled. Therefore, the imaging apparatus 100 according to the present embodiment can switch between displaying and not displaying the split image at a more appropriate time than a configuration without this feature.
  • In the above example, the normal image and the split image are output from the image processing unit 28 to the display control unit 36, but the normal image and the split image may instead be output via the external I/F 39 to an external device connected to the external I/F 39.
  • If the external device is a storage device (for example, a server device), the normal image and the split image can be stored in the storage device.
  • If the external device is an external display, the normal image and the split image can be displayed on the external display in the same manner as on the display device described above.
  • As described above, the imaging apparatus 100 includes, as an example, first and second pixel groups on which the subject images that have passed through the first and second regions on the left and right sides of the optical axis of the light beam passing through the exit pupil of the photographing lens 16 are formed by pupil division.
  • The image processing unit 28, which is an example of a generation unit, generates a normal image, which is an example of a first display image, based on the image output from the image sensor 20 having the first and second pixel groups. Further, the operation unit 14 instructs display of the normal image and display of the split image (steps 300 and 302).
  • When display of the normal image is instructed by the operation unit 14 in a state where display of the split image is instructed by the operation unit 14, the image processing unit 28 outputs the generated normal image without outputting the generated split image (step 306).
  • When the normal image display instruction is canceled by the operation unit 14 while display of the split image is instructed by the operation unit 14, the image processing unit 28 outputs the generated split image and normal image (step 308).
  • The display control unit 36 controls the display device to display the normal image and to display the split image within the display area of the normal image. Thereby, the display and non-display of the split image can be switched at a more appropriate time than without this configuration. Further, since the split image is output when the instruction to display the normal image via the operation unit 14 is canceled (step 302: N), the split image can be displayed more quickly than without this configuration.
  • In the imaging apparatus 100, a non-holding type instruction is applied as the instruction to display the normal image. Thereby, the display and non-display of the split image can be switched quickly compared to a configuration without this feature.
  • In the imaging apparatus 100, a holding type instruction is applied as the instruction to display the split image.
  • The imaging device 20 further includes a third pixel group that outputs a third image by receiving a subject image that has passed through the photographing lens 16 without being pupil-divided.
  • The image processing unit 28 generates the normal image based on the third image output from the third pixel group. Thereby, the image quality of the normal image can be improved compared to a configuration without this feature.
  • In the above embodiments, a single phase difference pixel is arranged with respect to a 2 × 2 pixel G filter, but the present invention is not limited to this; a pair of a first pixel L and a second pixel R may be arranged with respect to a 2 × 2 pixel G filter.
  • For example, a pair of first and second pixels L and R adjacent to each other in the horizontal direction may be arranged with respect to a 2 × 2 pixel G filter.
  • Alternatively, a pair of first and second pixels L and R adjacent to each other in the vertical direction may be arranged with respect to the 2 × 2 pixel G filter.
  • In either case, it is preferable that the positions of the first pixel L and the second pixel R be aligned, between the first pixel group and the second pixel group, within a predetermined number of pixels in at least one of the vertical direction and the horizontal direction. FIGS. 14 and 15 show an example in which the first pixel L and the second pixel R are arranged at positions aligned within one pixel in each of the vertical direction and the horizontal direction between the first pixel group and the second pixel group.
  • In the above embodiments, the color filter 21 having the basic arrangement pattern C is exemplified, but the present invention is not limited to this; the arrangement of the primary colors (R filter, G filter, B filter) of the color filter may be a Bayer arrangement, in which case phase difference pixels are arranged with respect to the G filter.
  • In the color filter 21A shown in FIG. 16, as an example, phase difference pixels are disposed at the center of an arrangement pattern G1 in which G filters are placed at the four corners and the center of a 3 × 3 pixel square matrix.
  • The first pixel L and the second pixel R are arranged alternately in each of the horizontal direction and the vertical direction, skipping the G filter for one pixel (with one pixel's worth of G filter in between).
  • The first pixel L and the second pixel R are thereby arranged at positions aligned within one pixel in each of the vertical direction and the horizontal direction between the first pixel group and the second pixel group.
  • Since an image based on the phase difference pixel at the center of the arrangement pattern G1 can be interpolated using images based on the normal pixels at the four corners of the arrangement pattern G1, interpolation accuracy can be improved compared to a configuration without this feature (see the interpolation sketch below).
  • Moreover, the arrangement patterns G1 do not overlap one another in position. That is, the pixels used to interpolate the first image (pixels of the third pixel group adjacent to pixels of the first pixel group) and the pixels used to interpolate the second image (pixels of the third pixel group adjacent to pixels of the second pixel group) are arranged so as not to overlap in pixel units.
  • Therefore, an image based on one phase difference pixel is not interpolated using an image based on a normal pixel that is also used to interpolate an image based on another phase difference pixel, so a further improvement in interpolation accuracy can be expected.
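A minimal sketch of the corner-based interpolation described for the arrangement pattern G1: the value at a phase difference pixel is replaced, for the normal image, by the mean of the four normal G pixels at the corners of its 3 × 3 pattern. Treating the raw data as a single-channel array and ignoring border handling are simplifying assumptions.

```python
import numpy as np

def interpolate_phase_pixel(raw: np.ndarray, y: int, x: int) -> float:
    """Interpolate the phase difference pixel at (y, x) from the normal G pixels
    at the four corners of its 3x3 arrangement pattern G1 (borders not handled)."""
    corners = [(y - 1, x - 1), (y - 1, x + 1), (y + 1, x - 1), (y + 1, x + 1)]
    return float(np.mean([raw[cy, cx] for cy, cx in corners]))
```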
  • In the color filter 21B shown in FIG. 17, as an example, phase difference pixels are arranged at the center of an arrangement pattern G1 and at its lower right corner in front view of the drawing. The first pixel L and the second pixel R are arranged alternately in each of the horizontal direction and the vertical direction, skipping the G filters for two pixels (with two pixels' worth of G filters in between). Thereby, the first pixel L and the second pixel R are arranged at positions aligned within one pixel in each of the vertical direction and the horizontal direction between the first pixel group and the second pixel group, and the first pixel L and the second pixel R can be adjacent to each other. Therefore, the occurrence of image shift caused by factors other than focus shift can be suppressed.
  • Here too, the arrangement patterns G1 do not overlap one another in position. That is, the pixels used to interpolate the first image and the pixels used to interpolate the second image are arranged so as not to overlap in units of pixel pairs.
  • The "pair of pixels" here refers to, for example, the first pixel L and the second pixel R (a pair of phase difference pixels) included in one arrangement pattern G1.
  • In the example shown in FIG. 18, the first pixel L is arranged skipping the G filters for two pixels in each of the horizontal direction and the vertical direction, and the second pixel R is likewise arranged skipping the G filters for two pixels in each of the horizontal direction and the vertical direction.
  • Thereby, the first pixel L and the second pixel R are arranged at positions aligned within two pixels in each of the vertical direction and the horizontal direction between the first pixel group and the second pixel group, and the first pixel L and the second pixel R can be adjacent to each other. Therefore, the occurrence of image shift caused by factors other than focus shift can be suppressed.
  • In the example shown in FIG. 18 as well, as in the example shown in FIG. 17, the arrangement patterns G1 do not overlap one another in position. Therefore, an image based on one phase difference pixel is not interpolated using an image based on a normal pixel that is also used to interpolate an image based on another phase difference pixel, so a further improvement in interpolation accuracy can be expected.
  • FIG. 19 schematically shows an example of the arrangement of primary colors (R filter, G filter, B filter) of the color filter 21D provided in the image sensor 20 and the arrangement of the light shielding members.
  • the constituent pixels of the image sensor 20 are classified into two types of pixel groups.
  • a pixel group on the A surface that is an example of the fourth pixel group and a pixel group on the B surface that is an example of the fifth pixel group can be cited.
  • the pixel group on the A plane includes a first pixel group
  • the pixel group on the B plane includes a second pixel group.
  • Each of the pixel group on the A plane and the pixel group on the B plane includes a third pixel group.
  • The pixel group on the A plane and the pixel group on the B plane are pixel groups whose constituent pixels are arranged alternately in the horizontal direction and the vertical direction in the image sensor 20.
  • Each of the A-plane pixel group and the B-plane pixel group is assigned Bayer-array primary colors by the color filter 21D.
  • The A-plane pixel group and the B-plane pixel group are shifted from each other by half a pixel (half pitch) in each of the horizontal direction and the vertical direction; a rough sketch of this checkerboard-style grouping follows below.
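On one reading of this description, the A-plane and B-plane pixel groups form a checkerboard over the sensor. The sketch below separates a single-channel readout into the two groups under that assumption; real sensor geometry, color assignment, and the half-pitch offset are not modeled beyond index parity.

```python
import numpy as np

def split_ab_planes(sensor: np.ndarray):
    """Separate a sensor readout into A-plane and B-plane pixel groups whose
    pixels alternate in both the horizontal and vertical directions."""
    H, W = sensor.shape
    assert W % 2 == 0, "even width assumed so each row has W // 2 pixels per plane"
    yy, xx = np.mgrid[0:H, 0:W]
    a_mask = (yy + xx) % 2 == 0             # A-plane pixels
    a = sensor[a_mask].reshape(H, W // 2)   # row-major selection keeps row structure
    b = sensor[~a_mask].reshape(H, W // 2)  # B-plane, offset by half a pitch
    return a, b
```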
  • FIG. 20 schematically shows an example of the first pixel group and the second pixel group in the image sensor 20 shown in FIG. 19.
  • the arrangement of the R filter, G filter, and B filter of the phase difference pixels in each of the first pixel group and the second pixel group is a Bayer arrangement.
  • The first pixel L and the second pixel R are disposed adjacent to each other (at the minimum pitch) in pairs. Thereby, the phase difference between the first pixel group and the second pixel group can be calculated with higher accuracy than without this configuration.
  • In the example shown in FIG. 20, the arrangement of the R, G, and B filters of the phase difference pixels in each of the first pixel group and the second pixel group is a Bayer arrangement, but the invention is not limited to this; it suffices that at least a part of the first pixels L in the first pixel group and a part of the second pixels R in the second pixel group are arranged adjacent to each other.
  • Since a pixel provided with a G filter has higher sensitivity than pixels provided with filters of other colors, interpolation accuracy can be increased, and since G filters have continuity, interpolation for pixels provided with G filters is easier than for pixels provided with filters of other colors.
  • In the above embodiments, a split image divided in the vertical direction is exemplified, but the present invention is not limited to this; an image divided into a plurality of parts in the horizontal direction or a diagonal direction may be applied as the split image.
  • For example, the split image 66a shown in FIG. 22 is divided into odd lines and even lines by a plurality of dividing lines 63a parallel to the horizontal direction.
  • In the split image 66a, a line-shaped (for example, strip-shaped) phase difference image 66La generated based on the output signal from the first pixel group is displayed on the odd lines (the even lines are also acceptable), and a line-shaped (for example, strip-shaped) phase difference image 66Ra generated based on the output signal from the second pixel group is displayed on the even lines.
  • The split image 66b shown in FIG. 23 is divided into two by a dividing line 63b inclined with respect to the horizontal direction (for example, a diagonal line of the split image 66b).
  • In the split image 66b, the phase difference image 66Lb generated based on the output signal from the first pixel group is displayed in one region, and the phase difference image 66Rb generated based on the output signal from the second pixel group is displayed in the other region.
  • The split image 66c shown in FIGS. 24A and 24B is divided by grid-like dividing lines 63c parallel to the horizontal direction and the vertical direction.
  • In the split image 66c, the phase difference image 66Lc generated based on the output signal from the first pixel group and the phase difference image 66Rc generated based on the output signal from the second pixel group are displayed in a checkered (checker) pattern.
  • Further, another focus confirmation image may be generated from the two phase difference images and displayed as the focus confirmation image.
  • For example, the two phase difference images may be superimposed and displayed as a composite image; in this case the image is displayed as a double image when out of focus and is displayed clearly when in focus. A rough sketch of generating these split-image variants follows below.
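The division styles above amount to choosing a binary mask that selects, at each display position, either the left-eye or the right-eye parallax image. A hedged sketch follows; the strip count and checker cell size are arbitrary parameters, not values from the disclosure.

```python
import numpy as np

def make_split_image(left_eye, right_eye, style="strips", num_strips=4):
    """Build a split image from two parallax images of identical shape.

    style="strips":  alternating horizontal bands (FIG. 22-like division 63a)
    style="checker": checkered pattern (FIGS. 24A/24B-like division 63c)
    """
    assert left_eye.shape == right_eye.shape
    H, W = left_eye.shape[:2]
    if style == "strips":
        band = np.arange(H) * num_strips // H          # band index per row
        mask = (band % 2 == 0)[:, None]                # alternate bands
    else:  # "checker"
        cell = 32                                      # checker cell size (assumed)
        mask = ((np.arange(H)[:, None] // cell +
                 np.arange(W)[None, :] // cell) % 2 == 0)
    if left_eye.ndim == 3:                             # broadcast over color channels
        mask = mask[..., None]
    return np.where(mask, left_eye, right_eye)
```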
  • In the first embodiment, the output content of the image processing unit 28 is changed depending on whether display of the normal image is instructed. In the second embodiment, a case will be described where the output content of the image processing unit 28 is changed according to the operation state of the release switch 211.
  • In the second embodiment, a non-holding type switch is applied as an example of the release switch 211; an example of the non-holding type switch is a momentary operation type switch.
  • A software key that realizes the same function as the release switch 211 may also be displayed on the display unit 213 by a software configuration, and the displayed software key may be operated by the user via the touch panel.
  • In the following, the same components as those in the first embodiment are denoted by the same reference numerals and their description is omitted.
  • FIG. 25 is a flowchart showing an example of the flow of image output processing according to the second embodiment.
  • steps different from the flowchart shown in FIG. 12 will be described, and the same steps will be denoted by the same step numbers and description thereof will be omitted.
  • the flowchart shown in FIG. 25 differs from the flowchart shown in FIG. 12 in that step 302A is applied instead of step 302 and that steps 350 to 360 are newly provided.
  • In step 302A, the CPU 12 determines whether or not the release switch 211 is in a half-pressed state (for example, held at the half-pressed position). If the release switch 211 is half-pressed in step 302A, the determination is affirmative and the routine proceeds to step 306. If the release switch 211 is not half-pressed (for example, if it is held at the standby position), the determination is negative and the routine proceeds to step 308.
  • "When the release switch 211 is not half-pressed" includes, for example, the case where the release switch 211 is not pressed at all.
  • The determination is not limited to whether the release switch 211 is half-pressed; it may instead be determined whether the release switch 211 has been half-pressed and an in-focus state has been reached. In that case, the case where the release switch 211 is half-pressed and brought into the in-focus state corresponds to the case where display of the first display image is instructed by the first instruction unit according to the present invention.
  • In step 350, the CPU 12 determines whether or not the release switch 211 is fully pressed (for example, held at the fully-pressed position). If the release switch 211 is fully pressed in step 350, the determination is affirmative and the routine proceeds to step 352. If the release switch 211 is not fully pressed (for example, if the half-pressed state is maintained or has been released), the determination is negative and the routine returns to step 302A.
  • In step 352, the CPU 12 performs control to start capturing a still image; when the imaging is completed, the process proceeds to step 354.
  • In step 354, the CPU 12 determines whether or not the split image display instruction given via the operation unit 14 (an instruction unit different from the release switch 211, as an example) has been canceled. If the instruction has been canceled, the determination is affirmative and the process proceeds to step 356; if it has not been canceled, the determination is negative and the routine proceeds to step 360.
  • In step 356, the CPU 12 performs the same processing as in step 306, and then proceeds to step 310.
  • In step 360, the CPU 12 performs the same processing as in step 308, and then proceeds to step 310.
  • As described above, when the release switch 211 is half-pressed while display of the split image is instructed, the split image is not displayed on the display device and the normal image is displayed.
  • Further, since the split image is displayed when the operation state of the release switch 211 is released while the split image is not displayed (step 358: Y), the split image can be displayed quickly at an appropriate time compared to a configuration without this feature.
  • In the example shown in FIG. 25, the display is switched when the state in which the release switch 211 is held at the half-pressed position is released. That is, when the release switch 211 is held at the fully-pressed position, imaging is performed (step 352), and then, if the split image display instruction given via the operation unit 14 is canceled, the CPU 12 controls the display device to display the normal image (step 356).
  • In the example shown in FIG. 25, the process proceeds to step 356 when the determination in step 354 is affirmative and proceeds to step 360 when the determination is negative, but the invention is not limited to this.
  • A step (additional step A) in which the CPU 12 determines whether or not the operation on the release switch 211 has been released may be inserted between step 354 and step 356, and a step performing similar processing (additional step B) may be inserted between step 354 and step 360.
  • The "operation on the release switch 211" refers, for example, to full-pressing and half-pressing of the release switch 211. That is, the state in which the operation on the release switch 211 has been released means, for example, a state in which the release switch 211 is neither fully pressed nor half pressed.
  • In additional step A, if the release switch 211 is fully pressed, the determination is negative and the process proceeds to step 354; when the release switch 211 shifts from the fully-pressed state to a state in which it is not operated, the determination is affirmative and the routine proceeds to step 356.
  • Thereby, when the operation state of the release switch 211 is released, a live view image based on the normal image can be displayed on the display device.
  • In additional step B, when the release switch 211 shifts from the fully-pressed state to a state in which it is not operated, the determination is affirmative and the routine proceeds to step 360.
  • Thereby, the split image can be displayed on the display device together with the live view image based on the normal image.
  • The present invention is not limited to the image output process described above; the CPU 12 may execute an image output process in which steps 354 and 356 are omitted from the image output process according to the second embodiment.
  • Further, the CPU 12 may perform control such that, when the release switch 211 is held at the fully-pressed position while the split image display instruction is being issued, the normal image display instruction is treated as canceled and the split image is accordingly displayed on the display device together with the normal image. In this case, for example, in a scene where a plurality of images of the same subject are captured one after another, the setting of performing manual focus adjustment using the split image can be maintained even after imaging, which contributes to improved convenience for the user.
  • As described above, by displaying the normal image and displaying the split image within the display area of the normal image when the release switch 211 is held at the fully-pressed position (an example of a second instruction position) or at the standby position (when the release switch 211 is not operated, at the time of capturing a live view image), the split image can be displayed at an appropriate time according to the operation state of the release switch 211. A rough sketch of this release-switch logic follows below.
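The following sketch condenses the FIG. 25 flow into one function. The state strings and the two helper functions are stand-ins for camera-side operations, not names from the disclosure.

```python
def image_output_step_v2(split_instructed: bool, release_state: str) -> str:
    """One pass of the FIG. 25 flow (second embodiment). release_state is one of
    "standby", "half" (held half-pressed), or "full" (held fully pressed)."""
    if not split_instructed:                      # step 300: negative
        return "normal image only"
    if release_state == "half":                   # step 302A: affirmative
        return "normal image only (split discarded while half-pressed)"
    if release_state == "full":                   # step 350: affirmative
        capture_still_image()                     # step 352
        if split_instruction_cancelled():         # step 354
            return "normal image only"            # step 356
        return "normal image + split image"       # step 360
    return "normal image + split image"           # step 308

def capture_still_image() -> None:                # placeholder for the capture path
    pass

def split_instruction_cancelled() -> bool:        # placeholder for operation-unit state
    return False
```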
  • In the first embodiment, the output content of the image processing unit 28 is changed depending on whether display of the normal image is instructed. In the third embodiment, a case will be described where the output content of the image processing unit 28 is changed according to the F value. In the following description, differences from the first embodiment will be described; the same components are denoted by the same reference numerals and their description is omitted.
  • FIG. 26 is a flowchart showing an example of the flow of image output processing according to the third embodiment.
  • steps different from the flowchart shown in FIG. 12 will be described, and the same steps will be denoted by the same step numbers and description thereof will be omitted.
  • the flowchart shown in FIG. 26 is different from the flowchart shown in FIG. 12 in that step 302B is applied instead of step 302.
  • In step 302B, the CPU 12 determines whether or not a predetermined F value (an example of a specific aperture value) has been set via the operation unit 14 (for example, a depth-of-field confirmation button) as the F value used when capturing a still image. If the predetermined F value has been set via the operation unit 14 in step 302B, the determination is affirmative and the routine proceeds to step 306; otherwise the determination is negative and the routine proceeds to step 308.
  • When the image output process according to the third embodiment is performed by the CPU 12, if an F value predetermined as the F value used when capturing a still image is set via the operation unit 14, the normal image is displayed on the display device without displaying the split image.
  • If such an F value has not been set via the operation unit 14, the normal image and the split image are displayed on the display device.
  • In the first embodiment, the output content of the image processing unit 28 is changed depending on whether display of the normal image is instructed. In the fourth embodiment, a case will be described where the output content of the image processing unit 28 is changed according to the usage state of the EVF 248.
  • FIG. 27 shows a flowchart showing an example of the flow of image output processing according to the fourth embodiment.
  • steps different from the flowchart shown in FIG. 12 will be described, and the same steps will be denoted by the same step numbers and description thereof will be omitted.
  • The flowchart shown in FIG. 27 differs from the flowchart shown in FIG. 12 in that step 302C is applied instead of step 302.
  • In step 302C, the CPU 12 determines whether or not the user is using the EVF 248. Whether the user is using the EVF 248 is determined, for example, according to whether the finder eyepiece unit 242 is determined to be in use based on the detection result of the eyepiece detection unit 37. That is, when it is determined that the finder eyepiece unit 242 is in use, it is determined that the user is using the EVF 248; when it is determined that the finder eyepiece unit 242 is not in use, it is determined that the user is not using the EVF 248. If the user is using the EVF 248 in step 302C, the determination is affirmative and the routine proceeds to step 306; otherwise the determination is negative and the process proceeds to step 308.
  • When the image output process according to the fourth embodiment is performed by the CPU 12, the normal image is displayed on the display device without displaying the split image while the EVF 248 is in use, and the normal image and the split image are displayed while the EVF 248 is not in use.
  • Thereby, the split image can be displayed at an appropriate time according to the usage state of the EVF 248, compared to a configuration without this feature.
  • The image output process according to the fourth embodiment and at least one of the image output processes according to the first to third embodiments may be performed in parallel, or at least two of the image output processes according to the first to third embodiments may be performed in parallel. In this case, for example, when at least one of the determinations in steps 302, 302A, 302B, and 302C is affirmative, the normal image is displayed and the split image is discarded; when all of the determinations are negative, the normal image and the split image are displayed. This combination is sketched below.
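When the embodiments run in parallel, the four determinations reduce to a logical OR, as in this short sketch; the parameter names are illustrative.

```python
def display_normal_only(normal_instructed: bool, half_pressed: bool,
                        preset_f_value_set: bool, evf_in_use: bool) -> bool:
    """Parallel combination of steps 302/302A/302B/302C: if at least one
    determination is affirmative, display the normal image and discard the
    split image; if all are negative, display both images."""
    return normal_instructed or half_pressed or preset_f_value_set or evf_in_use
```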
  • Each process included in the image output processes described in the first to fourth embodiments may be realized by a software configuration using a computer executing a program, by a hardware configuration, or by a combination of the two.
  • The program may be stored in a predetermined storage area (for example, the memory 26) in advance, but it is not always necessary to store it in the memory 26 from the beginning.
  • For example, the program may first be stored in an arbitrary "portable storage medium" connected to a computer, such as a flexible disk (so-called FD), a CD-ROM, a DVD disc, a magneto-optical disc, or an IC card, and the computer may then acquire the program from the portable storage medium and execute it.
  • Alternatively, the program may be stored in another computer or a server device connected to the computer via the Internet, a LAN (Local Area Network), or the like, and the computer may acquire the program from these and execute it.
  • In the above embodiments, the imaging apparatus 100 is exemplified, but the present invention is not limited to this.
  • Examples of a mobile terminal device that is a modification of the imaging apparatus 100 include a mobile phone or smartphone having a camera function, a PDA (Personal Digital Assistant), and a portable game machine.
  • In the following, a smartphone is taken as an example and described in detail with reference to the drawings.
  • FIG. 28 is a perspective view showing an example of the appearance of the smartphone 500.
  • The smartphone 500 illustrated in FIG. 28 includes a flat housing 502 and, on one surface of the housing 502, a display input unit 520 in which a display panel 521 as a display unit and an operation panel 522 as an input unit are integrated.
  • The housing 502 also includes a speaker 531, a microphone 532, an operation unit 540, and a camera unit 541. Note that the configuration of the housing 502 is not limited to this; for example, a configuration in which the display unit and the input unit are independent, or a configuration having a folding structure or a slide structure, may be employed.
  • FIG. 29 is a block diagram showing an example of the configuration of the smartphone 500 shown in FIG.
  • As shown in FIG. 29, the main components of the smartphone 500 include a wireless communication unit 510, a display input unit 520, a call unit 530, an operation unit 540, a camera unit 541, a storage unit 550, and an external input/output unit 560.
  • The smartphone 500 also includes a GPS (Global Positioning System) receiving unit 570, a motion sensor unit 580, a power supply unit 590, and a main control unit 501.
  • As a main function, the smartphone 500 has a wireless communication function for performing mobile wireless communication via a base station device BS and a mobile communication network NW.
  • the wireless communication unit 510 performs wireless communication with the base station apparatus BS accommodated in the mobile communication network NW according to an instruction from the main control unit 501. Using this wireless communication, transmission and reception of various file data such as audio data and image data, e-mail data, and reception of Web data and streaming data are performed.
  • The display input unit 520 is a so-called touch panel comprising a display panel 521 and an operation panel 522. Under the control of the main control unit 501, the display input unit 520 displays images (still images and moving images), character information, and the like to convey information to the user visually, and detects user operations on the displayed information. When viewing generated 3D images, the display panel 521 is preferably a 3D display panel.
  • the display panel 521 uses an LCD, OELD (Organic Electro-Luminescence Display), or the like as a display device.
  • The operation panel 522 is a device that is placed so that an image displayed on the display surface of the display panel 521 is visible, and that detects one or more coordinates operated by a user's finger or a stylus. When the device is operated by a user's finger or stylus, a detection signal generated by the operation is output to the main control unit 501, and the main control unit 501 then detects the operation position (coordinates) on the display panel 521 based on the received detection signal.
  • The display panel 521 and the operation panel 522 of the smartphone 500 integrally constitute the display input unit 520, with the operation panel 522 disposed so as to completely cover the display panel 521.
  • In this arrangement, the operation panel 522 may have a function of detecting user operations even in an area outside the display panel 521. In other words, the operation panel 522 may include a detection area for the overlapping portion that overlaps the display panel 521 (hereinafter referred to as a display area) and a detection area for the outer edge portion that does not overlap the display panel 521 (hereinafter referred to as a non-display area).
  • That is, the operation panel 522 may include two sensitive regions: the outer edge portion and the inner portion. The width of the outer edge portion is designed as appropriate according to the size of the housing 502 and the like.
  • Examples of the position detection method employed in the operation panel 522 include a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method; any of these can be adopted.
  • the call unit 530 includes a speaker 531 and a microphone 532.
  • the call unit 530 converts the user's voice input through the microphone 532 into voice data that can be processed by the main control unit 501 and outputs the voice data to the main control unit 501. Further, the call unit 530 decodes the audio data received by the wireless communication unit 510 or the external input / output unit 560 and outputs it from the speaker 531.
  • the speaker 531 can be mounted on the same surface as the surface on which the display input unit 520 is provided, and the microphone 532 can be mounted on the side surface of the housing 502.
  • the operation unit 540 is a hardware key using a key switch or the like, and receives an instruction from the user.
  • the operation unit 540 is mounted on the side surface of the housing 502 of the smartphone 500 and is turned on when pressed with a finger or the like, and is turned off by a restoring force such as a spring when the finger is released. It is a push button type switch.
  • the storage unit 550 stores the control program and control data of the main control unit 501, application software, address data that associates the name and telephone number of the communication partner, and transmitted / received e-mail data.
  • the storage unit 550 stores Web data downloaded by Web browsing and downloaded content data.
  • the storage unit 550 temporarily stores streaming data and the like.
  • The storage unit 550 includes an internal storage unit 551 built into the smartphone and an external storage unit 552 having a removable external memory slot.
  • Each of the internal storage unit 551 and the external storage unit 552 constituting the storage unit 550 is realized using a storage medium such as a flash memory type, hard disk type, multimedia card micro type, or card type memory (for example, MicroSD (registered trademark) memory), or a RAM (Random Access Memory) or ROM (Read Only Memory).
  • The external input/output unit 560 serves as an interface with all external devices connected to the smartphone 500, and is used to connect directly or indirectly to other external devices through communication or the like, or through a network. Examples of communication with other external devices include universal serial bus (USB) and IEEE 1394. Examples of the network include the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), and infrared communication (Infrared Data Association: IrDA (registered trademark)); other examples include UWB (Ultra Wideband (registered trademark)) and ZigBee (registered trademark).
  • Examples of the external device connected to the smartphone 500 include a wired / wireless headset, wired / wireless external charger, wired / wireless data port, and a memory card connected via a card socket.
  • Other examples of external devices include SIM (Subscriber Identity Module Card) / UIM (User Identity Module Card) cards, and external audio/video devices connected via audio/video I/O (Input/Output) terminals.
  • An external audio/video device connected wirelessly can also be used.
  • The external input/output unit can transmit data received from such external devices to each component inside the smartphone 500, and can transmit data inside the smartphone 500 to external devices.
  • The GPS receiving unit 570 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with instructions from the main control unit 501, performs positioning calculation processing based on the received GPS signals, and detects the position of the smartphone 500 in terms of latitude, longitude, and altitude.
  • When the GPS receiving unit 570 can acquire position information from the wireless communication unit 510 or the external input/output unit 560 (for example, a wireless LAN), it can also detect the position using that position information.
  • The motion sensor unit 580 includes, for example, a triaxial acceleration sensor, and detects the physical movement of the smartphone 500 in accordance with instructions from the main control unit 501. By detecting the physical movement of the smartphone 500, the moving direction and acceleration of the smartphone 500 are detected, and the detection result is output to the main control unit 501.
  • the power supply unit 590 supplies power stored in a battery (not shown) to each unit of the smartphone 500 in accordance with an instruction from the main control unit 501.
  • the main control unit 501 includes a microprocessor, operates according to a control program and control data stored in the storage unit 550, and controls each unit of the smartphone 500 in an integrated manner. Further, the main control unit 501 includes a mobile communication control function for controlling each unit of the communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 510.
  • the application processing function is realized by the main control unit 501 operating according to the application software stored in the storage unit 550.
  • Examples of the application processing function include an infrared communication function for performing data communication with a counterpart device by controlling the external input/output unit 560, an e-mail function for sending and receiving e-mails, and a web browsing function for browsing web pages.
  • the main control unit 501 has an image processing function such as displaying video on the display input unit 520 based on image data (still image data or moving image data) such as received data or downloaded streaming data.
  • the image processing function is a function in which the main control unit 501 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 520.
  • the main control unit 501 executes display control for the display panel 521 and operation detection control for detecting a user operation through the operation unit 540 and the operation panel 522.
  • By executing the display control, the main control unit 501 displays icons for starting application software, software keys such as a scroll bar, and windows for creating e-mail.
  • The scroll bar refers to a software key for accepting an instruction to move the displayed portion of an image, such as a large image that does not fit in the display area of the display panel 521.
  • By executing the operation detection control, the main control unit 501 detects user operations through the operation unit 540, accepts operations on the icons and input of character strings in the input fields of the windows through the operation panel 522, and accepts display image scroll requests through the scroll bar.
  • Furthermore, by executing the operation detection control, the main control unit 501 determines whether the operation position on the operation panel 522 is in the overlapping portion that overlaps the display panel 521 (display area) or in the outer edge portion that does not overlap the display panel 521 (non-display area), and has a touch panel control function for controlling the sensitive area of the operation panel 522 and the display positions of the software keys.
  • the main control unit 501 can also detect a gesture operation on the operation panel 522 and execute a preset function according to the detected gesture operation.
  • A gesture operation is not a conventional simple touch operation, but an operation of drawing a trajectory with a finger or the like, designating a plurality of positions simultaneously, or, by combining these, drawing a trajectory from at least one of a plurality of positions.
  • The camera unit 541 is a digital camera that captures images using an image sensor such as a CMOS or a CCD, and has functions equivalent to those of the imaging apparatus 100 described above.
  • the camera unit 541 can switch between a manual focus mode and an autofocus mode.
  • In the manual focus mode, the photographing lens of the camera unit 541 can be focused by operating a focus icon button or the like displayed on the operation unit 540 or the display input unit 520.
  • In the manual focus mode, a live view image combined with the split image is displayed on the display panel 521 so that the in-focus state during manual focusing can be confirmed.
  • Under the control of the main control unit 501, the camera unit 541 converts image data obtained by imaging into compressed image data such as JPEG (Joint Photographic Experts Group) data.
  • The converted image data can be recorded in the storage unit 550, or output through the external input/output unit 560 or the wireless communication unit 510.
  • In the smartphone 500, the camera unit 541 is mounted on the same surface as the display input unit 520, but the mounting position of the camera unit 541 is not limited to this; it may be mounted on the back surface of the display input unit 520, or a plurality of camera units 541 may be mounted.
  • When a plurality of camera units 541 are mounted, imaging may be performed with a single camera unit 541 selected for imaging, or with the plurality of camera units 541 used simultaneously.
  • the camera unit 541 can be used for various functions of the smartphone 500.
  • an image acquired by the camera unit 541 can be displayed on the display panel 521, or the image of the camera unit 541 can be used as one of operation inputs of the operation panel 522.
  • When the GPS receiving unit 570 detects the position, the position can also be detected with reference to an image from the camera unit 541.
  • Furthermore, the optical axis direction of the camera unit 541 of the smartphone 500 can be determined without using the triaxial acceleration sensor, or in combination with the triaxial acceleration sensor, and the current usage environment can also be determined.
  • the image from the camera unit 541 can be used in the application software.
  • Various information can be added to still image or moving image data and recorded in the storage unit 550, or output through the external input/output unit 560 or the wireless communication unit 510.
  • Examples of the "various information" here include position information acquired by the GPS receiving unit 570 and audio information acquired by the microphone 532 (which may be converted into text information by speech-to-text conversion performed by the main control unit or the like) for the still image or moving image data.
  • Posture information acquired by the motion sensor unit 580 may also be added.
  • In the above embodiments, the imaging device 20 having the first to third pixel groups is exemplified, but the present invention is not limited to this; an image sensor consisting only of a first pixel group and a second pixel group may be used.
  • A digital camera having this type of image sensor can generate a three-dimensional image (3D image) based on the first image output from the first pixel group and the second image output from the second pixel group, and can also generate a two-dimensional image (2D image). In this case, the two-dimensional image is generated, for example, by performing interpolation processing between pixels of the same color in the first image and the second image (see the sketch below); the first image or the second image may also be adopted as the two-dimensional image without performing interpolation processing.
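The 2D-image generation is described here only as interpolation between same-color pixels of the first and second images. The sketch below shows the simplest such reading, a plain average of co-located pixels; this is an assumption for illustration, not the disclosed algorithm.

```python
import numpy as np

def make_2d_image(first_img: np.ndarray, second_img: np.ndarray) -> np.ndarray:
    """One simple reading of 'interpolation processing between pixels of the
    same color': average same-position pixels of the two parallax images."""
    return 0.5 * (first_img.astype(float) + second_img.astype(float))
```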
  • The split image in each of the above embodiments may be a split image based on the images output from the phase difference pixel groups (for example, the first image output from the first pixel group and the second image output from the second pixel group), both in the case where an image sensor including only phase difference pixel groups (for example, the first pixel group and the second pixel group) is used and in the case where an image sensor having phase difference pixels at a predetermined ratio with respect to normal pixels is used.
  • Moreover, the display mode is not limited to one in which both the normal image and the split image are simultaneously displayed on the same screen of the display device; when display of the normal image is canceled in a state where split image display is instructed, the display control unit 36 may perform control to display the split image without displaying the normal image on the display device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

The present invention provides an image processing device, an imaging device, an image processing method, and an image processing program capable of switching, at an appropriate time, between displaying and not displaying an image used to confirm focus. According to the invention, an operation unit instructs output of a normal image and instructs output of a composite image (steps 300, 302). When the instruction to display the normal image is canceled by the operation unit while the operation unit is instructing output of the composite image, an image processing unit outputs the generated composite image (step 308). Furthermore, when output of the normal image is instructed by the operation unit while the operation unit is instructing output of the composite image, the image processing unit outputs the generated normal image without outputting the generated composite image (step 306).
PCT/JP2013/071183 2012-09-19 2013-08-05 Image processing device, imaging device, image processing method, and image processing program WO2014045741A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-205908 2012-09-19
JP2012205908 2012-09-19

Publications (1)

Publication Number Publication Date
WO2014045741A1 true WO2014045741A1 (fr) 2014-03-27

Family

ID=50341065

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/071183 WO2014045741A1 (fr) 2012-09-19 2013-08-05 Image processing device, imaging device, image processing method, and image processing program

Country Status (1)

Country Link
WO (1) WO2014045741A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004040740A (ja) * 2002-07-08 2004-02-05 Fuji Photo Film Co Ltd Manual focus device
JP2005025055A (ja) * 2003-07-04 2005-01-27 Olympus Corp Digital single-lens reflex camera
JP2007248852A (ja) * 2006-03-16 2007-09-27 Olympus Imaging Corp Camera focus adjustment device
JP2009147665A (ja) * 2007-12-13 2009-07-02 Canon Inc Imaging device
JP2009237214A (ja) * 2008-03-27 2009-10-15 Canon Inc Imaging device
WO2012002297A1 (fr) * 2010-06-30 2012-01-05 富士フイルム株式会社 Imaging method and device


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112640437A (zh) * 2018-08-31 2021-04-09 富士胶片株式会社 Imaging element, imaging device, image data processing method, and program
CN112640437B (zh) * 2018-08-31 2024-05-14 富士胶片株式会社 Imaging element, imaging device, image data processing method, and storage medium

Similar Documents

Publication Publication Date Title
JP6033454B2 (ja) Image processing device, imaging device, image processing method, and image processing program
JP5681329B2 (ja) Imaging device and image display method
JP5931206B2 (ja) Image processing device, imaging device, program, and image processing method
JP6158340B2 (ja) Image processing device, imaging device, image processing method, and image processing program
JP5960286B2 (ja) Image processing device, imaging device, image processing method, and image processing program
JP5889441B2 (ja) Image processing device, imaging device, image processing method, and image processing program
JP5901801B2 (ja) Image processing device, imaging device, program, and image processing method
JP5901782B2 (ja) Image processing device, imaging device, image processing method, and image processing program
JP5901781B2 (ja) Image processing device, imaging device, image processing method, and image processing program
JP5753323B2 (ja) Imaging device and image display method
JP5833254B2 (ja) Image processing device, imaging device, image processing method, and image processing program
JP6086975B2 (ja) Image processing device, imaging device, image processing method, and image processing program
JP6000446B2 (ja) Image processing device, imaging device, image processing method, and image processing program
JP5955417B2 (ja) Image processing device, imaging device, program, and image processing method
JP5972485B2 (ja) Image processing device, imaging device, image processing method, and image processing program
JP5901780B2 (ja) Image processing device, imaging device, image processing method, and image processing program
WO2014045741A1 (fr) Image processing device, imaging device, image processing method, and image processing program
JP5934844B2 (ja) Image processing device, imaging device, image processing method, and image processing program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13839134

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13839134

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP