US9667888B2 - Image capturing apparatus and control method thereof - Google Patents
- Publication number
- US9667888B2 (application US14/575,163, US201414575163A)
- Authority
- US
- United States
- Prior art keywords
- image
- information
- live image
- live
- displayed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/45—Generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control based on recognised objects
- H04N23/611—Control based on recognised objects where the recognised objects include parts of the human body
- H04N23/63—Control by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
- H04N23/633—Control by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/635—Region indicators; Field of view indicators
- H04N5/2258—
- H04N5/23293—
- H04N5/23219—
Definitions
- the present invention relates to an image capturing apparatus having a plurality of image capturing units and a control method thereof.
- main camera: a camera module that shoots an object as seen from the photographer
- sub camera: a camera module that shoots the photographer himself/herself or an object on the photographer side
- some image capturing apparatuses mount both a main camera and a sub camera
- a simultaneous shooting function is realized in which shooting is performed by the main camera and the sub camera at the same time and the images shot by the two cameras are composited and recorded.
- with the simultaneous shooting function, while viewing the live view images of the main camera and the sub camera, the user can check the expression and the like of the photographer in conjunction with the composition of the object and the like, and perform shooting.
- An image obtained using the simultaneous shooting function is generally constituted by superimposing the image shot by the sub camera at a small size on a background region that is the image shot by the main camera.
- Japanese Patent Laid-Open No. 2005-094741 discloses a technique of performing face detection on the image shot by the main camera and compositing the image of the photographer shot by the sub camera in a region not including a face, so that the object in the image shot by the main camera is not hidden.
- a composite image obtained by arranging the live view image from the sub camera on the background region that is the live view image from the main camera is subjected to through-the-lens display on an electronic view finder such as a liquid crystal panel. For this reason, the photographer can perform shooting while checking not only the object but also his/her own expression and the like.
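The picture-in-picture superimposition described above can be sketched in a few lines. The example below is a minimal illustration, not the apparatus's actual compositing circuitry; images are assumed to be nested lists of pixel values, and the `scale` and `margin` parameters are hypothetical.

```python
def composite_pip(main_img, sub_img, scale=0.25, margin=2):
    """Superimpose a reduced sub camera image on the main camera image.

    The sub image is shrunk so its width is `scale` of the main image's
    width (nearest-neighbour sampling, for simplicity) and pasted near the
    top-left corner, offset by `margin` pixels, as a small foreground inset.
    """
    H, W = len(main_img), len(main_img[0])
    sh, sw = len(sub_img), len(sub_img[0])
    pip_w = max(1, int(W * scale))
    pip_h = max(1, sh * pip_w // sw)          # keep the sub image's aspect
    out = [row[:] for row in main_img]        # background = main camera image
    for y in range(pip_h):
        src_row = sub_img[y * sh // pip_h]
        for x in range(pip_w):
            out[margin + y][margin + x] = src_row[x * sw // pip_w]
    return out
```

A real implementation would also clip the inset against the frame and handle colour channels, but the layering idea is the same.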
- the present invention has been made in consideration of the aforementioned problems, and realizes a technique according to which, by switching the vertical (stacking) positional relationship between an image and information which are superimposed and displayed, according to the shooting state, an image can be displayed without being hidden by another image, an information display, or the like.
- the present invention provides an image capturing apparatus comprising: a first image capturing unit; a second image capturing unit that is different from the first image capturing unit; and a control unit configured to perform control such that a second live image captured by the second image capturing unit is superimposed and displayed on a first live image captured by the first image capturing unit, wherein if first information is to be displayed on a display unit along with the first live image and the second live image, the control unit performs control to switch between a display state in which the first information is superimposed on the second live image and a display state in which the first information is displayed behind the second live image, according to a state of a predetermined function for shooting.
- the present invention provides a control method of a display control apparatus for displaying an image captured by a first image capturing unit and an image captured by a second image capturing unit that captures an image in a direction different from that of the first image capturing unit, the method comprising: a control step of performing control such that a first live image captured by the first image capturing unit, a second live image captured by the second image capturing unit, and first information are displayed on a display unit, wherein in the control step, the second live image is superimposed and displayed on the first live image, and, according to a state of a predetermined function related to shooting, the display is switched between a state in which the first information is superimposed on the second live image and a state in which the first information is displayed behind the second live image.
- by switching the vertical (stacking) positional relationship between a plurality of images and information which are superimposed and displayed, according to a shooting state, an image can be displayed such that it is not hidden by another image, an information display, or the like.
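The claimed switching can be pictured as a layer-ordering rule. The sketch below is illustrative only: the layer names and the `function_active` flag (standing for the "state of a predetermined function for shooting", such as face detection being active) are assumptions, not identifiers from the patent.

```python
def layer_order(show_info, function_active):
    """Return the draw order, back to front, for the composite live view.

    The main live image is always the background and the sub live image is
    always drawn over it; the information layer is placed either behind or
    in front of the sub live image depending on the function state.
    """
    layers = ["main_live", "sub_live"]
    if show_info:
        if function_active:
            layers.insert(1, "info")   # info hidden behind the sub live image
        else:
            layers.append("info")      # info superimposed on the sub live image
    return layers
```

A renderer would then draw `layers` in order, so later entries win wherever regions overlap.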
- FIG. 1 is a diagram showing the external appearance of an image capturing apparatus according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing a configuration of an image capturing apparatus according to an embodiment.
- FIG. 3 is a flowchart showing a basic operation of an image capturing apparatus according to an embodiment.
- FIG. 4 is a flowchart showing an operation in a still image recording mode of an image capturing apparatus according to an embodiment.
- FIG. 5A is a flowchart showing face detection processing in step S 407 in FIG. 4 .
- FIG. 5B is a flowchart showing shooting processing in step S 414 in FIG. 4 .
- FIG. 5C is a flowchart showing recording processing in step S 416 in FIG. 4 .
- FIG. 6 is a flowchart showing through-the-lens display control processing in a simultaneous shooting mode according to an embodiment.
- FIGS. 7A to 7C are diagrams illustrating a live view screen in a simultaneous shooting mode according to an embodiment.
- FIGS. 8A and 8B are diagrams illustrating a live view screen in a simultaneous shooting mode according to an embodiment.
- FIGS. 9A and 9B are diagrams illustrating a layer priority setting screen in a simultaneous shooting mode according to an embodiment.
- FIGS. 10A and 10B are diagrams illustrating a layer configuration in a simultaneous shooting mode according to an embodiment.
- Embodiments of the present invention will be described in detail below.
- the following embodiments are merely examples for practicing the present invention.
- the embodiments should be properly modified or changed depending on various conditions and the structure of an apparatus to which the present invention is applied.
- the present invention should not be limited to the following embodiments. Also, parts of the embodiments to be described later may be properly combined.
- FIG. 1 shows an external appearance of a digital camera 100 according to the present embodiment.
- a display unit 101 is a display device such as an LCD panel which displays images and various information.
- a shutter button 102 is an operation unit for a shooting instruction.
- a mode switching button 103 is an operation unit for changing over among various modes.
- a connector 107 is an interface that connects a connection cable 108 to the digital camera 100 .
- Operation units 104 comprise operation members such as various switches, buttons and a touch panel operated in various ways by the user.
- a controller wheel 106 is a rotatable operation member included among the operation units 104 .
- a power switch 105 switches between power on and power off.
- a recording medium 109 is a medium such as a memory card or hard disk.
- a recording medium slot 110 is for accommodating the recording medium 109 .
- the recording medium 109 , accommodated in the recording medium slot 110 , can communicate with the digital camera 100 .
- a cover 111 covers the recording medium slot 110 .
- a sub camera 112 is a camera module for shooting a photographer himself/herself or an object on the photographer side.
- a viewfinder 113 is provided for the photographer to shoot an object while seeing through the viewfinder, and includes an eye detection unit for detecting whether the photographer's eye is in contact with the viewfinder 113 .
- FIG. 2 shows an internal configuration of a digital camera 100 according to the present embodiment.
- the digital camera 100 includes a main camera module (to be referred to as a main camera hereinafter) for shooting an object seen from the photographer, and a sub-camera module (to be referred to as a sub-camera hereinafter) for shooting the photographer himself/herself or an object on the photographer side.
- the main camera includes a first imaging optical system formed by a photographing lens 203 , a shutter 204 , and an image capturing unit 205 .
- the sub camera includes a second imaging optical system formed by a photographing lens 233 , a shutter 234 , and an image capturing unit 235 .
- Each of photographing lenses 203 , 233 includes a zoom lens and a focusing lens.
- Each of shutters 204 , 234 has a diaphragm function.
- Each of image capturing units 205 , 235 is an image sensor, which is constituted by a CCD or CMOS or the like, for converting the optical image of an object to an electric signal.
- An A/D converter 206 converts the analog signal output from each of the image capturing units 205 , 235 to a digital signal.
- a barrier 202 covers the imaging optical system, which includes the photographing lens 203 , of the main camera part of the digital camera 100 , thereby preventing contamination of and damage to the image capturing system that includes the photographing lens 203 , shutter 204 , and image capturing unit 205 .
- An image processing unit 207 performs resizing processing, such as predetermined pixel interpolation and reduction, and color conversion processing, with respect to data from the A/D converter 206 or data from a memory control unit 209 . Further, the image processing unit 207 performs predetermined calculation processing using the captured image data, and the system control unit 201 performs exposure control and distance measuring control based on the calculation results. Thus, AF (Automatic Focus) processing, AE (Automatic Exposure) processing, and EF (flash pre-emission) processing of TTL (Through the Lens) type are performed. Furthermore, the image processing unit 207 performs predetermined calculation processing using the captured image data, and AWB (Automatic White Balance) processing of TTL type is performed on the basis of the calculation results.
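As a rough picture of the AE side of this calculation processing, the sketch below compares the mean luminance of captured samples against a mid-grey target and returns an exposure correction in stops. This is a generic illustration, not the patent's TTL metering; the target value and tolerance are assumptions.

```python
import math

def simple_ae(samples, target=118.0, tolerance=8.0):
    """Return an exposure correction in stops (EV) from luminance samples.

    A positive result means the scene metered darker than the target and
    needs more exposure; within `tolerance` no correction is applied.
    """
    mean = sum(samples) / len(samples)
    if abs(mean - target) <= tolerance:
        return 0.0                       # close enough: leave exposure alone
    return math.log2(target / max(mean, 1e-6))
```

The system control unit would translate such a correction into shutter speed and diaphragm adjustments before the next frame.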
- A/D converter 206 and image processing unit 207 can be provided in each of the image capturing units 205 and 235 .
- the data from the A/D converter 206 is written into a memory 210 via both the image processing unit 207 and the memory control unit 209 , or directly via the memory control unit 209 .
- the memory 210 stores the image data obtained from the image capturing unit 205 and the A/D converter 206 , and image display data to be displayed on the display unit 101 .
- the memory 210 has a storage capacity that is sufficient for storing a predetermined number of still images as well as moving images and audio for a predetermined time period.
- a compression/decompression unit 217 compresses image data using an adaptive discrete cosine transform (ADCT) or the like, or decompresses compressed image data. Using the shutters 204 , 234 as triggers, the compression/decompression unit 217 loads image data stored in the memory 210 , performs compression processing thereon, and when the processing ends, writes the resulting image data in the memory 210 .
- the memory control unit 209 composites a first image shot by the main camera (to be referred to as a main camera image hereinafter) and a second image shot by the sub camera (to be referred to as a sub camera image hereinafter) that are stored in the memory 210 , and when the compression processing by the compression/decompression unit 217 ends, the resulting composite image data is written in the memory 210 .
- the compression/decompression unit 217 performs decompression processing on image data that has been recorded in the recording medium 109 as a file and writes the image data in the memory 210 when the processing ends.
- the image data written in the memory 210 by the compression/decompression unit 217 is written in the recording medium 109 as a file by the system control unit 201 via the recording medium I/F 216 .
- the memory 210 also functions as a memory for image display (video memory).
- a D/A converter 208 converts the image display data stored in the memory 210 into an analog signal and supplies the display unit 101 with the analog signal.
- the image display data that was written into the memory 210 is displayed by the display unit 101 via the D/A converter 208 .
- the display unit 101 performs, on a display device such as an LCD, display in accordance with the analog signal from the D/A converter 208 .
- the digital signal once converted by the A/D converter 206 and stored in the memory 210 is converted into analog signals by the D/A converter 208 , and the analog signals are successively transmitted to the display unit 101 so as to be displayed thereon, making it possible to realize an electronic view finder (EVF) functionality and to perform through-the-lens display of a live view image (hereinafter, live image).
- a nonvolatile memory 213 is, for example, an EEPROM, which is electrically erasable and recordable.
- in the nonvolatile memory 213 , constants, programs, and the like for operating the system control unit 201 are stored.
- "programs" here may refer to programs for executing the various flowcharts that will be described later.
- the system control unit 201 controls the entire digital camera 100 , and realizes, by executing the programs stored in the nonvolatile memory 213 , the processes of the flowcharts that will be described later.
- the system memory 212 is, for example, a RAM on which constants and variables for operating the system control unit 201 , and the programs read out from the nonvolatile memory 213 are expanded.
- the system control unit 201 controls the memory 210 , the D/A converter 208 , the display unit 101 , and the like, so as to perform display control.
- a system timer 211 is a timer circuit for measuring time periods for various types of controls and the time of an integrated clock.
- a mode switching button 103 , a first shutter switch 102 a , a second shutter switch 102 b , and the operation units 104 are operation members for inputting various types of instructions into the system control unit 201 .
- the mode switching button 103 switches the operation mode of the system control unit 201 to any of a still image recording mode, a moving image recording mode, and a reproduction mode.
- the still image recording mode includes an automatic shooting mode, an automatic scene determination mode, a manual mode, various types of scene modes in which different settings are configured for individual shooting scenes, a program AE mode, a custom mode, a simultaneous shooting mode in which the main camera and sub camera can simultaneously perform shooting, and the like.
- using the mode switching button 103 , the mode is directly switched to any of the plurality of modes included in the still image recording mode.
- the moving image recording mode may include a plurality of modes.
- while the shutter-release button 102 provided on the camera 100 is being operated, that is, pressed halfway (a shooting preparation instruction), the first shutter switch 102 a is turned on and generates a first shutter switch signal SW 1 .
- upon receiving the first shutter switch signal SW 1 , the system control unit 201 causes the main camera and the sub camera to start the AF (Automatic Focus) processing, the AE (Automatic Exposure) processing, the AWB (Automatic White Balance) processing, the EF (flash pre-emission) processing, and the like.
- when the operation of the shutter-release button 102 is completed, that is, the button is pressed fully (a shooting instruction), the second shutter switch 102 b is turned on and generates a second shutter switch signal SW 2 .
- upon receiving the second shutter switch signal SW 2 , the system control unit 201 starts a series of shooting processing, from reading out the signal from each of the image capturing units 205 , 235 to writing image data to the recording medium 109 .
- the operation units 104 include buttons such as an end button, a back button, an image scrolling button, a jump button, a narrow-down button, and an attribute change button.
- a menu screen that enables various settings to be made is displayed on the display unit 101 by pressing a menu button. The user can make various settings intuitively by using the menu screen displayed on the display unit 101 , the four-direction (up, down, left, right) buttons, and a SET button.
- the controller wheel 106 , which is a rotatable operation member included among the operation units 104 , is used together with the direction buttons, such as when a selection item is specified.
- when the controller wheel 106 is turned, an electrical pulse signal is generated in accordance with the amount of rotation, and the system control unit 201 controls each unit of the digital camera 100 based upon the pulse signal.
- the angle through which the controller wheel 106 has been turned and how many times it has been turned can be determined from the pulse signal.
- the controller wheel 106 can be any operating member so long as it is an operating member whose rotation can be detected.
- it can be a dial operating member in which the controller wheel 106 itself is rotated to generate the pulse signal in accordance with a turning operation by the user.
- it can be a device (a so-called touch wheel) that detects an operation such as the revolution of the user's finger on the controller wheel 106 without the controller wheel 106 itself being rotated.
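The pulse bookkeeping described for the controller wheel 106 reduces to a small calculation. The sketch below assumes, hypothetically, one pulse per detent and 24 detents per revolution; direction would come from the sign of the accumulated count.

```python
def wheel_state(pulse_count, pulses_per_rev=24):
    """Derive full turns and the current angle from an accumulated pulse
    count, as the system control unit might when interpreting the wheel."""
    turns, rem = divmod(abs(pulse_count), pulses_per_rev)
    angle = rem * 360.0 / pulses_per_rev    # degrees within the current turn
    return turns, angle
```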
- a power control unit 214 is constituted by, for example, a battery detection circuit, a DC-DC converter, a switch circuit for changing over the block to be supplied with power, and detects whether a battery has been inserted or not, the type of the battery, and the residual capacity thereof. Further, the power control unit 214 controls the DC-DC converter in accordance with the detection results and an instruction of the system control unit 201 , and supplies a necessary voltage for a necessary length of time to each of the units including the recording medium 109 .
- a power supply unit 215 comprises a primary battery such as an alkaline battery or a lithium battery, a secondary battery such as an NiCd battery, an NiMH battery, or an Li battery, or an AC adaptor.
- the recording medium interface (I/F) 216 is for interfacing with the recording medium 109 such as the memory card or hard disk.
- the recording medium 109 is a recording medium such as a memory card for recording shot images, and constituted by a semiconductor memory, a magnetic disk, or the like.
- the basic operation of the digital camera 100 according to this embodiment, from start to end, will be described with reference to FIG. 3 .
- processing shown in FIG. 3 is implemented when a program recorded in the nonvolatile memory 213 is read out into the system memory 212 , and executed by the system control unit 201 .
- since the sub camera performs operations that are basically the same as those of the main camera as will be described later, the description below focuses on the main camera, and processing that is unique to the sub camera is supplemented where necessary.
- the system control unit 201 initializes the flags, control variables, and the like in step S 301 and starts management processing for files recorded in the recording medium 109 in step S 302 .
- in step S 303 , the system control unit 201 determines the setting position of the mode switching button 103 . If it has been set to the still image recording mode in step S 303 , the process advances to step S 304 , and the still image recording mode processing that will be described later with FIG. 4 is executed. If it has been set to the moving image recording mode in step S 305 , the process advances to step S 306 , and the moving image recording mode processing is executed. If it has been set to the reproduction mode in step S 307 , the process advances to step S 308 , and the reproduction mode processing is executed.
- the process advances to step S 309 , and the system control unit 201 executes other mode processing.
- the other mode processing is, for example, transmission mode processing in which transmission of a file recorded in the recording medium 109 is performed, reception mode processing in which a file is received from an external device and recorded in the recording medium 109 , or the like.
- in step S 310 , the system control unit 201 determines the setting of the power switch 105 ; if it is set to on, the process returns to step S 303 , and if it is set to off, the process advances to step S 311 .
- in step S 311 , the system control unit 201 performs end processing.
- End processing includes, for example, processing for changing the display of the display unit 101 to an end state and closing the barrier 202 to protect the image capturing unit 205 , and processing for recording setting values, setting modes, and parameters including flags, control variables, and the like in the nonvolatile memory 213 and shutting off the power to units that do not require a supply of power.
- upon completion of the end processing in step S 311 , a transition to the power-off state is made.
- the still image recording mode processing of step S 304 in FIG. 3 will be described with reference to FIGS. 4 and 5A-5C .
- the processing in FIG. 4 ends due to interruption processing and the like in the case where a switch to another operation mode is instructed using the mode switching button 103 or in the case where the power supply switch 105 is switched off.
- in step S 401 , the system control unit 201 establishes the shooting mode. The shooting mode is established by acquiring, from the nonvolatile memory 213 , the shooting mode at the time the previous instance of still image recording mode processing ended, and storing it in the system memory 212 .
- a shooting mode is a mode for shooting a still image.
- the digital camera 100 of the present embodiment has the following shooting modes.
- Automatic shooting mode: a mode in which various parameters of the camera are automatically determined by a program installed in the digital camera 100 , based on an exposure value obtained by photometry.
- Manual mode: a mode in which the user can freely change various parameters of the camera.
- Scene mode: a mode in which the combination of the shutter speed, diaphragm value, strobe light state, sensitivity setting, and the like appropriate for the shooting scene is set.
- Simultaneous shooting mode: a mode in which shooting with the sub camera is performed at the same time as shooting with the main camera, and one composite image is generated by superimposing the sub camera image on the main camera image.
- in step S 402 , the system control unit 201 performs through-the-lens display, on the display unit 101 , of image data shot using the image capturing unit 205 .
- the main camera image and the sub camera image shot by the image capturing units 205 , 235 are composited by the memory control unit 209 , and an almost real-time live image thereof is subjected to through-the-lens display without being stored in the recording medium 109 .
- the photographer can check the angle of view and adjust the timing of shooting, and can check his/her own expression in conjunction therewith.
- in step S 403 , the system control unit 201 uses the power control unit 214 to determine whether or not the residual capacity of the power supply unit 215 , the existence or non-existence of the recording medium 109 , or the residual capacity thereof is problematic for the operation of the digital camera 100 . If it is problematic, the process advances to step S 404 , a predetermined warning is displayed on the display unit 101 using an image or audio, and the process returns to step S 401 ; if it is not problematic, the process advances to step S 405 .
- in step S 405 , the system control unit 201 expands, to the system memory 212 , the shooting settings stored in the nonvolatile memory 213 in accordance with user settings, and determines whether or not shooting parameters have been set. If they have not been set, the process advances to step S 407 ; if they have been set, the process advances to step S 406 and processing is performed in accordance with the set shooting parameters.
- in step S 407 , the system control unit 201 detects a face of a person in the live image being subjected to through-the-lens display (face detection processing). If the face of a person is detected in the face detection processing, the system control unit 201 stores the positional coordinates, size (width, height), number detected, reliability coefficient, and the like of the detected face in the system memory 212 as face information. If a face is not detected in the face detection processing, 0 is set in the regions of the system memory 212 for the positional coordinates, size (width, height), number detected, reliability coefficient, and the like.
- the face detection processing in step S 407 will be described with reference to FIG. 5A .
- the system control unit 201 reads out the live image data stored in the memory 210 (step S 531 ), applies band-pass filters in the horizontal direction and the vertical direction (steps S 532 and S 533 ), and detects edge components. Next, the system control unit 201 performs pattern matching on the detected edge components (step S 534 ) so as to extract a group of candidates for eyes, nose, mouth, and ears.
- the system control unit 201 determines that among the group of candidates for eyes extracted using pattern matching, candidates that satisfy a predetermined condition (e.g., distance between two eyes, inclination, etc.) are a pair of eyes, and the group of candidates for eyes is narrowed down to only those that are a pair of eyes (step S 535 ).
- the system control unit 201 detects a face by associating the narrowed-down group of candidates for eyes with other parts forming a face (nose, mouth, ears) that correspond thereto and passing them through a pre-set non-face condition filter (step S 536 ), and face information is generated in accordance with the detection result (step S 537 ).
- Object information can be generated by extracting feature amounts of an object such as a person's face from the live image data being subjected to through-the-lens display in this way.
- a face frame and the like may be displayed on the display unit 101 in accordance with the positional coordinates and the size of the face by superimposing them on the live image, for example.
- face information is illustrated as object information, but the present invention is not limited to this, and it is possible to detect a moving object such as a vehicle or a train for example as an object being trailed.
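The flow of steps S 531 to S 537 can be caricatured in code. In the sketch below, first differences stand in for the band-pass filters, and a same-row spacing test stands in for pattern matching and the eye-pair condition; the edge threshold, `eye_gap` range, and face-box heuristic are all assumptions rather than values from the patent.

```python
def detect_faces(img, eye_gap=(2, 6), threshold=100):
    """Toy version of the face detection pipeline on a 2-D grayscale image.

    S 532/S 533: band-pass in both directions (first differences here).
    S 534/S 535: treat strong edge points as eye candidates and keep
    same-row pairs whose spacing is plausible for a pair of eyes.
    S 536/S 537: emit face information (position and size) per kept pair.
    """
    H, W = len(img), len(img[0])
    edges = []
    for y in range(H):
        for x in range(W):
            dx = abs(img[y][x] - img[y][x - 1]) if x else 0
            dy = abs(img[y][x] - img[y - 1][x]) if y else 0
            if dx + dy > threshold:
                edges.append((x, y))
    faces = []
    for i, (x1, y1) in enumerate(edges):
        for x2, y2 in edges[i + 1:]:
            if y1 == y2 and eye_gap[0] <= abs(x2 - x1) <= eye_gap[1]:
                size = 2 * abs(x2 - x1)   # crude face box around the eye pair
                faces.append({"x": min(x1, x2), "y": y1, "w": size, "h": size})
    return faces
```

A real implementation would, as the text describes, also check inclination, run a non-face condition filter, and attach a reliability coefficient to each detection.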
- in step S 409 , the system control unit 201 determines whether or not the first shutter switch signal SW 1 is on (i.e., whether a shooting preparation instruction has been received). If the first shutter switch signal SW 1 is off, the process returns to step S 405 ; if it is on, the process advances to step S 410 .
- in step S 410 , a shooting preparation operation is performed.
- the system control unit 201 performs distance measuring processing so as to focus the photographing lens 203 on the object (AF processing) and performs photometry processing so as to determine the diaphragm value and shutter speed (AE processing). Note that in the photometry processing, flash settings are also performed if necessary. Also, if a face is detected in step S 407 , it is possible to perform face AF in which distance measurement is performed using the detected range of the face, and to display an AF frame on a specific face indicating the focus position. Also, in the case of simultaneous shooting mode, a shooting preparation operation is performed similarly on the sub camera side as well.
- in steps S 411 and S 412 , the system control unit 201 determines the on/off states of the first shutter switch signal SW 1 and the second shutter switch signal SW 2 . If the second shutter switch signal SW 2 switches on while the first shutter switch signal SW 1 is on, the process advances to step S 413 . If the first shutter switch signal SW 1 switches off without the second shutter switch signal SW 2 having switched on, the process returns to step S 405 . While the first shutter switch signal SW 1 is on and the second shutter switch signal SW 2 is off, the processing of steps S 411 and S 412 is repeated.
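The two-switch logic of steps S 409 through S 414 amounts to a small state machine. The sketch below walks through `(sw1, sw2)` samples; the state and log names are illustrative, not the patent's.

```python
def shutter_fsm(events):
    """Walk half-press/full-press samples through the shooting states.

    SW1 on starts shooting preparation (AF/AE); SW2 on while SW1 is held
    triggers shooting; releasing SW1 without SW2 cancels back to live view.
    """
    state, log = "live_view", []
    for sw1, sw2 in events:
        if state == "live_view" and sw1:
            state = "prepared"
            log.append("AF/AE")                 # S 410: preparation
        elif state == "prepared":
            if sw2:
                state = "shooting"
                log.append("shoot")             # S 413/S 414
            elif not sw1:
                state = "live_view"
                log.append("cancel")            # back to S 405
        elif state == "shooting":
            state = "live_view"
            log.append("record")                # S 416: recording
    return log
```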
- in step S 413 , the system control unit 201 sets the display unit 101 from the through-the-lens display state to a fixed color display state (e.g., a solid black display).
- in step S 414 , the system control unit 201 executes shooting processing, which includes exposure processing and developing processing.
- image data obtained from the image capturing unit 205 and the A/D converter 206 is written in the memory 210 via both the image processing unit 207 and the memory control unit 209 , or from the A/D converter 206 via the memory control unit 209 directly.
- the system control unit 201 performs various processing by reading out image data written in the memory 210 using the memory control unit 209 and, when necessary, the image processing unit 207 .
- The shooting processing in step S414 will be described with reference to FIG. 5B.
- The system control unit 201 acquires the shooting date/time from the system timer 211 and stores it in the system memory 212 (step S541).
- The system control unit 201 then adjusts the diaphragm function of the shutter 204 according to the diaphragm value that was determined in step S410 and stored in the system memory 212, and starts exposure of the image capturing unit 205 (steps S542 and S543).
- After the exposure, the system control unit 201 closes the shutter 204 (step S545), reads out the charge signal from the image capturing unit 205, and writes the image data in the memory 210 via the A/D converter 206, the image processing unit 207, and the memory control unit 209, or directly from the A/D converter 206 via the memory control unit 209 (step S546).
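The exposure sequence of steps S542 to S546 can be summarized in a short sketch. The function names (open_shutter, close_shutter, read_out) are assumptions for illustration, not the patent's actual control interfaces.

```python
# Sketch of one exposure cycle (steps S542-S546): set the diaphragm,
# expose for the determined shutter speed, close the shutter, read out
# the charge signal as image data.
import time

def expose(aperture, shutter_speed, open_shutter, close_shutter, read_out):
    """Run one exposure and return the read-out image data."""
    open_shutter(aperture)       # steps S542/S543: adjust diaphragm, start exposure
    time.sleep(shutter_speed)    # wait out the exposure time (shutter speed from AE)
    close_shutter()              # step S545: close the shutter
    return read_out()            # step S546: charge signal -> image data
```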
- The system control unit 201 reads out the image data written in the memory 210 via the image processing unit 207 as needed, performs image processing such as compression processing using the compression/decompression unit 217 (step S547), and, when that processing ends, sequentially writes the image data in the memory 210.
- In the simultaneous shooting mode, after the compression processing, the main camera image and the sub camera image are composited and written in the memory 210.
- In step S548, the system control unit 201 reads out the image data from the memory 210 using the memory control unit 209 and transfers the image data for display, which has been converted into an analog signal by the D/A converter 208, to the display unit 101.
- The image data that was written in the memory 210 is thus displayed on the display unit 101 via the D/A converter 208.
- In step S415, the system control unit 201 performs quick review display of the image data obtained by the shooting processing of step S414.
- Quick review display is processing for displaying image data on the display unit 101 for a predetermined period of time (review time) after shooting the object and before recording it in the recording medium, so that the shot image can be checked.
- In step S416, the system control unit 201 writes the image data obtained by the shooting processing of step S414 in the recording medium 109 as a file.
- In the simultaneous shooting mode, the sub camera image data shot by the sub camera is recorded in the recording medium 109 in association with the main camera image data.
- The processing of step S416 will be described with reference to FIG. 5C.
- The system control unit 201 generates a file name in accordance with a predetermined file name generation rule (step S551) and acquires the shooting date/time information stored in the system memory 212 during the shooting processing of step S414 (step S552).
- The system control unit 201 then acquires the file size of the image data (step S553) and determines whether or not the directory in which the file is to be stored exists in the recording medium 109 (step S554). If it is determined that the directory does not exist, the process advances to step S555, and if it does exist, the process advances to step S556.
- In step S555, the system control unit 201 generates a directory (e.g., 100GANON) in the recording medium 109.
- In step S556, the system control unit 201 generates a file header constituted by the shooting date/time, file size, shooting conditions, and the like of the image data obtained by the shooting processing of step S414.
- In step S557, the system control unit 201 generates a directory entry using the file name, shooting date/time, and data amount generated and/or acquired in steps S551 to S553 and writes the image data in the recording medium 109 as a file.
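The recording flow of steps S551 to S557 can be sketched as below. The file naming rule (`IMG_0001.JPG`), the `DCIM` path, and the JSON header are assumptions made for the illustration; only the directory name 100GANON and the ordering of the steps come from the text.

```python
# Sketch of the recording processing (steps S551-S557): generate a file name,
# create the directory if absent, build a header from the shooting date/time,
# file size, and shooting conditions, then write the file.
import json
import os
import tempfile

def record_image(root, seq_no, image_data, shot_datetime, conditions):
    directory = os.path.join(root, 'DCIM', '100GANON')   # steps S554/S555
    os.makedirs(directory, exist_ok=True)                # create only if missing
    file_name = 'IMG_%04d.JPG' % seq_no                  # step S551 (assumed rule)
    header = {'datetime': shot_datetime,                 # step S556: file header
              'size': len(image_data),
              'conditions': conditions}
    path = os.path.join(directory, file_name)
    with open(path, 'wb') as f:                          # step S557: write the file
        f.write(json.dumps(header).encode('utf-8') + b'\n' + image_data)
    return path

# Demo run against a temporary directory standing in for the recording medium.
path = record_image(tempfile.mkdtemp(), 1, b'\xff\xd8jpegdata',
                    '2013-12-24T10:00:00', {'iso': 100, 'shutter': '1/60'})
```

A real camera would of course write an EXIF/JPEG header rather than JSON; the point is the order of operations, not the container format.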
- In step S417, the system control unit 201 determines the on/off state of the second shutter switch signal SW2 and waits for the second shutter switch signal SW2 to switch off. The quick review display on the display unit 101 is continued until the second shutter switch signal SW2 switches off. Accordingly, the user can carefully check the shot image data in the quick review display by continuing to fully press the shutter-release button 102. When the second shutter switch signal SW2 switches off, the process advances to step S418.
- In step S418, the system control unit 201 determines whether or not the predetermined review time for the quick review display has elapsed. If the review time has not elapsed, the system control unit 201 waits for it to elapse, and upon the review time elapsing, the process advances to step S419.
- In step S419, the system control unit 201 returns the display unit 101 from the quick review display to the through-the-lens display state.
- The display state of the display unit 101 thus automatically switches to the through-the-lens display state, in which the image data captured by the image capturing unit 205 is successively displayed, in preparation for the next shooting.
- In step S420, the system control unit 201 determines the on/off state of the first shutter switch signal SW1; if it is on, the process returns to step S411, and if it is off, the process returns to step S405.
- That is, if the first shutter switch signal SW1 remains on, the system control unit 201 prepares for the next shooting (step S411). If the shutter-release button 102 is released (the first shutter switch signal SW1 is off), the series of shooting operations ends and the process returns to the shooting standby state (step S405).
- Next, the live image through-the-lens display processing during the simultaneous shooting mode in step S402 of FIG. 4 will be described with reference to FIGS. 6 to 10A and 10B.
- In the simultaneous shooting mode, as shown in FIG. 7A, a first live image (referred to as a main live image hereinafter) 711 shot by the main camera and a second live image (referred to as a sub live image hereinafter) 712 shot by the sub camera are displayed on a live view screen 710.
- The system control unit 201 uses the image processing unit 207 to generate the main live image 711 and the sub live image 712 based on the main camera image data and the sub camera image data stored in the memory 210.
- In step S603, if a face was detected in the face detection of step S407 in FIG. 4 with respect to the main camera image data, the system control unit 201 moves to step S604, and if a face was not detected, it moves to step S605.
- In step S604, according to the positional coordinates and the size (width and height) of the detected face, the system control unit 201 generates frame information (first information) for displaying the face frame 713 superimposed on the main live image 711 of the live view screen 710.
- In step S605, the system control unit 201 generates OSD data (second information), composed of icons, character strings, and the like, for displaying an OSD (on-screen display) 714 for shooting parameters and the like on the live view screen 710.
- The shooting mode, remaining battery amount, shutter speed, exposure value, ISO sensitivity, white balance, flash on/off, and the like are examples of OSD data.
- The system control unit 201 then determines the priority levels of the layers used when displaying these items on the display unit 101.
- The priority levels of the layers indicate the vertical relationship of the main live image 711, the sub live image 712, the face frame 713, and the OSD 714, which are superimposed and displayed; a layer with a higher priority level is arranged further upward.
- In steps S606 and S607, the system control unit 201 sets the priority level of the layer for the OSD 714 such that it is at the top, and sets the priority level of the layer for the main live image 711 such that it is at the bottom. This is because the main live image 711 is displayed over almost the entirety of the live view screen 710; if it were not on the bottom layer, the sub live image 712 and the OSD 714 would be hidden from view.
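The priority-ordered superimposition can be expressed compactly: layers are painted from the lowest priority up, so the bottom layer is drawn first and the topmost layer last. The sketch below is an assumed simplification in which layers are mere names; real compositing would blend pixel buffers.

```python
# Sketch of priority-ordered layer compositing: priority 1 is the topmost
# layer, so the draw order is lowest priority first, highest priority last.

def composite(layers):
    """layers: list of (priority, name) pairs. Returns names in draw order."""
    # Sort descending by priority number so the bottom layer comes first.
    return [name for _, name in sorted(layers, key=lambda t: -t[0])]

# The shooting-standby configuration of FIG. 10A: OSD on top (priority 1),
# main live image on the bottom (priority 4).
order = composite([(1, 'OSD'), (2, 'sub live image'),
                   (3, 'face frame'), (4, 'main live image')])
```

With these priorities, the main live image is painted first and the OSD last, matching the rule that the OSD layer is at the top and the main live image layer at the bottom.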
- In step S608, the system control unit 201 determines the on/off state of the first shutter switch signal SW1; if it is on (during a shooting preparation operation), the process advances to step S609, and if it is off (during shooting standby), the process advances to step S610.
- In step S609, the system control unit 201 sets the layer priority levels for the shooting preparation operation such that the layer priority level of the face frame 723 is higher than that of the sub live image 722.
- That is, the layer priority levels are set such that the face frame 723 is displayed over the sub live image 722.
- FIG. 10B illustrates the layer configuration for displaying the live view screen 720 in FIG. 7B.
- The priority level of the layer 1022 for the face frame is set to be second, and the priority level of the layer 1023 for the sub live image is set to be third.
- In step S610, the system control unit 201 sets the layer priority levels for shooting standby such that the layer priority level of the sub live image 732 is higher than that of the face frame 733.
- That is, the layer priority levels are set such that the layer for the face frame 733 is displayed behind that of the sub live image 732.
- FIG. 10A illustrates the layer configuration for displaying the live view screen 730 in FIG. 7C.
- The priority level of the layer 1012 for the sub live image is set to be second, and the priority level of the layer 1013 for the face frame is set to be third.
- In step S611, the system control unit 201 superimposes and displays the images of all the layers and the OSD on the display unit 101 in accordance with the layer priority levels that were set in step S609 or step S610.
- In accordance with steps S606 and S607, the priority levels of the layers 1011 and 1021 for the OSD are set to be the highest, and those of the layers 1014 and 1024 for the main live images are set to be the lowest.
- As the method for setting the layer priority levels in step S610, various methods are conceivable, such as (1) fixing the priority levels, (2) determining them according to the coordinate position, size, and the like of the face in the main live image, and (3) presenting the choice selectably to the user.
- For example, the layer priority levels may be determined according to the range in which the sub live image 812 and the face region 811 in the main live image overlap. As shown in FIG. 8B, if the area of the region in which the face 821 to be displayed in the face frame 823 and the sub live image 822 overlap is less than a predetermined size, the layer priority level of the face frame 823 is made higher than that of the sub live image 822. On the other hand, as shown in FIG. 8A, if the overlapping area is greater than or equal to the predetermined size, the layer priority level of the face frame is made lower than that of the sub live image 812.
- Alternatively, the layer priority level may be determined according to at least one of the coordinate position and the size of the face in the main live image. In such a case, for example, if the size of the face in the main live image is large enough to cover the entire image, the face frame can be set in a layer below that of the sub live image regardless of the face position. Also, for example, if the position of the face being displayed in the face frame in the main live image is far away from the sub live image, the face frame may be set above the sub live image regardless of the face size.
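The overlap-based decision can be expressed as a rectangle-intersection test. The (x, y, w, h) rectangle representation and the function names below are assumptions for the sketch; the threshold corresponds to the "predetermined size" in the text.

```python
# Sketch of the overlap-based priority decision: if the area where the face
# region and the sub live image overlap is below a threshold, the face frame
# layer is raised above the sub live image; otherwise it is lowered.

def overlap_area(a, b):
    """Intersection area of two rectangles given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    w = min(ax + aw, bx + bw) - max(ax, bx)
    h = min(ay + ah, by + bh) - max(ay, by)
    return max(w, 0) * max(h, 0)   # zero when the rectangles do not intersect

def face_frame_layer(face_rect, sub_rect, threshold):
    """Return 'above_sub' or 'below_sub' for the face frame layer."""
    if overlap_area(face_rect, sub_rect) < threshold:
        return 'above_sub'   # small overlap: keep the face frame visible
    return 'below_sub'       # large overlap: let the sub live image win
```

For instance, a face at (0, 0, 10, 10) and a sub image at (20, 20, 5, 5) do not overlap at all, so the face frame stays on top.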
- As another example, the user selects menu item 911 or 912 from the layer priority level setting screen 910 shown in FIG. 9A and sets in advance whether to give the higher layer priority level to the face frame or to the sub live image. If the user selects the face frame layer 911 on the layer priority level setting screen 910 in FIG. 9A, the layer priority level of the face frame will always be higher than that of the sub live image during shooting standby. On the other hand, if the sub live image layer 912 is selected, the layer priority level of the sub live image is raised. Thus, the range of selection can be increased according to the user's preference.
- Alternatively, the user may select menu item 921, 922, or 923 from the layer priority level setting screen 920 shown in FIG. 9B so as to raise the layer priority level of either the face frame or the sub live image, or to determine the layer whose priority level is to be raised according to a condition. If the user selects conditional priority 922 on the layer priority level setting screen 920 in FIG. 9B, the camera automatically selects whether the layer priority level of the face frame or that of the sub live image is to be raised according to conditions during a shooting operation. On the other hand, if the face frame layer priority 921 or the sub live image layer 923 is selected, the user can arbitrarily determine which layer priority level to raise. Other methods for determining the layer priority levels are also conceivable, and any of them may be used.
- As described above, the vertical positional relationship between the main live image, the face frame, the sub live image, and the OSD, which are superimposed and displayed, can be switched according to the shooting state.
- Accordingly, the photographer can check the object being focused on and, along with the object, his/her own expression and the like, so that shooting can be performed at an appropriate timing.
- Note that the present invention may also be carried out using an electronic device such as a camera-equipped mobile phone, a mobile game device, or the like.
- The present invention may also be implemented using a system in which a plurality of apparatuses are connected, for example, an image capturing apparatus having a plurality of image capturing units, a display apparatus having a display unit, and a control apparatus that has a control unit and controls the image capturing apparatus and the display apparatus.
- Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
- The computer may comprise one or more processors (e.g., a central processing unit (CPU) or micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
- The computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
- The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-266131 | 2013-12-24 | ||
JP2013266131A JP6247527B2 (ja) | 2013-12-24 | 2013-12-24 | Image capturing apparatus, control method, program, and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
US20150181135A1 US20150181135A1 (en) | 2015-06-25 |
US9667888B2 true US9667888B2 (en) | 2017-05-30 |
Family
ID=53401515
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/575,163 Active 2035-07-23 US9667888B2 (en) | 2013-12-24 | 2014-12-18 | Image capturing apparatus and control method thereof |
Country Status (4)
Country | Link |
---|---|
US (1) | US9667888B2 (ja) |
JP (1) | JP6247527B2 (ja) |
KR (1) | KR101719590B1 (ja) |
CN (1) | CN104735317B (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10129336B2 (en) * | 2014-03-12 | 2018-11-13 | Samsung Electronic Co., Ltd. | Content management method and cloud server therefor |
Families Citing this family (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106605196B (zh) | 2014-09-02 | 2018-11-09 | Apple Inc. | Remote camera user interface |
US9979890B2 (en) | 2015-04-23 | 2018-05-22 | Apple Inc. | Digital viewfinder user interface for multiple cameras |
CN106683601B (zh) * | 2015-11-10 | 2020-07-14 | Canon Inc. | Display control apparatus and control method thereof |
US9854156B1 (en) | 2016-06-12 | 2017-12-26 | Apple Inc. | User interface for camera effects |
DK180859B1 (en) | 2017-06-04 | 2022-05-23 | Apple Inc | USER INTERFACE CAMERA EFFECTS |
US11112964B2 (en) | 2018-02-09 | 2021-09-07 | Apple Inc. | Media capture lock affordance for graphical user interface |
US10375313B1 (en) | 2018-05-07 | 2019-08-06 | Apple Inc. | Creative camera |
US11722764B2 (en) | 2018-05-07 | 2023-08-08 | Apple Inc. | Creative camera |
DK201870623A1 (en) | 2018-09-11 | 2020-04-15 | Apple Inc. | USER INTERFACES FOR SIMULATED DEPTH EFFECTS |
US11770601B2 (en) | 2019-05-06 | 2023-09-26 | Apple Inc. | User interfaces for capturing and managing visual media |
US10645294B1 (en) | 2019-05-06 | 2020-05-05 | Apple Inc. | User interfaces for capturing and managing visual media |
US11128792B2 (en) | 2018-09-28 | 2021-09-21 | Apple Inc. | Capturing and displaying images with multiple focal planes |
US11321857B2 (en) | 2018-09-28 | 2022-05-03 | Apple Inc. | Displaying and editing images with depth information |
JP7280698B2 (ja) | 2019-01-11 | 2023-05-24 | Canon Inc. | Image capturing apparatus, control method thereof, and program |
KR20200117562A (ko) * | 2019-04-04 | 2020-10-14 | Samsung Electronics Co., Ltd. | Electronic device, method, and computer-readable medium for providing a bokeh effect in video |
US11706521B2 (en) | 2019-05-06 | 2023-07-18 | Apple Inc. | User interfaces for capturing and managing visual media |
US11039074B1 (en) | 2020-06-01 | 2021-06-15 | Apple Inc. | User interfaces for managing media |
US11212449B1 (en) | 2020-09-25 | 2021-12-28 | Apple Inc. | User interfaces for media capture and management |
US11539876B2 (en) | 2021-04-30 | 2022-12-27 | Apple Inc. | User interfaces for altering visual media |
US11778339B2 (en) | 2021-04-30 | 2023-10-03 | Apple Inc. | User interfaces for altering visual media |
US12112024B2 (en) | 2021-06-01 | 2024-10-08 | Apple Inc. | User interfaces for managing media styles |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0746564A (ja) | 1993-07-27 | 1995-02-14 | Mitsubishi Electric Corp | Video conference apparatus |
JP2005094741A (ja) | 2003-08-14 | 2005-04-07 | Fuji Photo Film Co Ltd | Image capturing apparatus and image compositing method |
KR20070028244A (ko) | 2005-09-07 | 2007-03-12 | Casio Computer Co., Ltd. | Camera apparatus provided with a plurality of image sensors |
JP2007129332A (ja) | 2005-11-01 | 2007-05-24 | Canon Inc | Operation method and apparatus for each image on a multi-screen |
US20100157084A1 (en) * | 2008-12-18 | 2010-06-24 | Olympus Imaging Corp. | Imaging apparatus and image processing method used in imaging device |
JP2011013243A (ja) | 2009-06-30 | 2011-01-20 | Sanyo Electric Co Ltd | Imaging apparatus |
US20140055656A1 (en) * | 2012-08-21 | 2014-02-27 | Canon Kabushiki Kaisha | Image processing apparatus having display device, control method therefor, and storage medium |
US20140118600A1 (en) * | 2012-10-30 | 2014-05-01 | Samsung Electronics Co., Ltd. | Method for controlling camera of device and device thereof |
US20140192245A1 (en) * | 2013-01-07 | 2014-07-10 | Samsung Electronics Co., Ltd | Method and mobile terminal for implementing preview control |
US20150124125A1 (en) * | 2013-11-06 | 2015-05-07 | Lg Electronics Inc. | Mobile terminal and control method thereof |
US20150163409A1 (en) * | 2013-12-06 | 2015-06-11 | Panasonic Intellectual Property Management Co., Ltd. | Imaging device and imaging system |
US20150172552A1 (en) * | 2013-12-17 | 2015-06-18 | Samsung Electronics Co., Ltd. | Method of performing previewing and electronic device for implementing the same |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003134358A (ja) * | 2001-10-19 | 2003-05-09 | Minolta Co Ltd | Digital camera |
JP3948387B2 (ja) * | 2002-10-24 | 2007-07-25 | Matsushita Electric Industrial Co., Ltd. | Digital camera and mobile phone apparatus with digital camera |
JP2005073161A (ja) * | 2003-08-27 | 2005-03-17 | Canon Inc | Processing apparatus and image recording method |
JP5116514B2 (ja) * | 2008-03-11 | 2013-01-09 | Canon Inc | Image capturing apparatus and display control method |
CN102055834B (zh) * | 2009-10-30 | 2013-12-11 | TCL Corp. | Dual-camera photographing method for a mobile terminal |
JP2013017125A (ja) * | 2011-07-06 | 2013-01-24 | Ricoh Co Ltd | Imaging apparatus and method for displaying a monitoring image of the imaging apparatus |
CN102984355A (zh) * | 2012-11-08 | 2013-03-20 | Shenzhen Sang Fei Consumer Communications Co., Ltd. | Mobile phone and dual-camera implementation method thereof |
2013
- 2013-12-24 JP JP2013266131A patent/JP6247527B2/ja active Active
2014
- 2014-12-18 US US14/575,163 patent/US9667888B2/en active Active
- 2014-12-19 KR KR1020140183712A patent/KR101719590B1/ko active IP Right Grant
- 2014-12-24 CN CN201410817300.9A patent/CN104735317B/zh not_active Expired - Fee Related
Non-Patent Citations (1)
Title |
---|
The above foreign patent documents were cited in the Sep. 8, 2016 Korean Office Action issued in Korean Patent Application No. 10-2014-0183712, which is enclosed without an English translation.
Also Published As
Publication number | Publication date |
---|---|
KR101719590B1 (ko) | 2017-03-24 |
KR20150075032A (ko) | 2015-07-02 |
CN104735317B (zh) | 2018-09-25 |
US20150181135A1 (en) | 2015-06-25 |
JP6247527B2 (ja) | 2017-12-13 |
JP2015122670A (ja) | 2015-07-02 |
CN104735317A (zh) | 2015-06-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9667888B2 (en) | Image capturing apparatus and control method thereof | |
US9848159B2 (en) | Image reproducing apparatus and method for controlling same | |
US9838609B2 (en) | Image capturing apparatus, control apparatus and control method for controlling zooming function | |
JP2007318225A (ja) | Photographing apparatus and photographing method | |
JP6124700B2 (ja) | Image capturing apparatus, control method thereof, program, and storage medium | |
US9692971B2 (en) | Image capturing apparatus capable of automatically switching from reproduction mode to shooting mode and displaying live view image and control method thereof | |
US9992405B2 (en) | Image capture control apparatus and control method of the same | |
US9177395B2 (en) | Display device and display method for providing image display in first color mode and second color mode | |
US10511761B2 (en) | Image capturing control apparatus, control method, and storage medium | |
US9232133B2 (en) | Image capturing apparatus for prioritizing shooting parameter settings and control method thereof | |
US9635281B2 (en) | Imaging apparatus method for controlling imaging apparatus and storage medium | |
US9538097B2 (en) | Image pickup apparatus including a plurality of image pickup units and method of controlling the same | |
US10530981B2 (en) | Image capturing apparatus, control method, and storage medium for not producing a notification sound | |
JP2015095082A (ja) | Image display apparatus, control method thereof, and control program | |
US11625948B2 (en) | Imaging control apparatus capable of selecting detected subject and method for the same | |
JP6200194B2 (ja) | Image capturing apparatus, control method thereof, and program | |
JP5755035B2 (ja) | Image capturing apparatus and control method thereof | |
US10194082B2 (en) | Image pickup apparatus that shoots moving image for predetermined time period at the time of shooting still image, control method for the image pickup apparatus, and storage medium | |
JP6124658B2 (ja) | Image processing apparatus and control method of image processing apparatus | |
JP2017034473A (ja) | Photographing apparatus and control method thereof | |
JP6354377B2 (ja) | Image capturing apparatus, control method thereof, and program | |
JP2016012845A (ja) | Image capturing apparatus, control method thereof, and control program | |
JP2017085430A (ja) | Imaging apparatus | |
JP2014216800A (ja) | Image processing apparatus | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CANON KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHIMOSATO, JIRO;REEL/FRAME:035782/0106 Effective date: 20141212 |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |