CN109076156B - Electronic device and control method thereof - Google Patents

Electronic device and control method thereof

Info

Publication number
CN109076156B
CN109076156B (application CN201780019550.XA)
Authority
CN
China
Prior art keywords
touch
detected
image
display
control unit
Prior art date
Legal status
Active
Application number
CN201780019550.XA
Other languages
Chinese (zh)
Other versions
CN109076156A (en)
Inventor
太田知宏 (Tomohiro Ota)
Current Assignee
Canon Inc
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority claimed from PCT/JP2017/011080 (WO2017164140A1)
Publication of CN109076156A
Application granted
Publication of CN109076156B

Classifications

    • G03B 13/36: Viewfinders; focusing aids for cameras; autofocus systems for cameras
    • G03B 7/091: Control of exposure; digital circuits for built-in light-sensitive devices
    • G06F 3/0412: Digitisers, e.g. for touch screens or touch pads; structurally integrated in a display
    • H04N 23/53: Constructional details of electronic viewfinders, e.g. rotatable or detachable
    • H04N 23/611: Control of cameras based on recognised objects, where the recognised objects include parts of the human body
    • H04N 23/62: Control of camera parameters via user interfaces
    • H04N 23/63: Control of cameras by using electronic viewfinders
    • H04N 23/633: Electronic viewfinders displaying additional information relating to control or operation of the camera
    • H04N 23/635: Region indicators; field-of-view indicators
    • H04N 23/667: Camera operation mode switching, e.g. between still and video, sport and normal, or high- and low-resolution modes
    • H04N 23/67: Focus control based on electronic image sensor signals
    • H04N 23/675: Focus control comprising setting of focusing regions
    • H04N 23/69: Control of means for changing the angle of the field of view, e.g. optical zoom objectives or electronic zooming

Abstract

Disclosed are an electronic apparatus that balances the convenience of touch operation with accurate position designation, and a method for controlling the same. When an area satisfying a predetermined condition is detected in an image, the electronic apparatus executes a function corresponding to the position at which the touch detection means detects a touch. When no such area is detected, or area detection is not performed, the electronic apparatus does not execute processing corresponding to the position at which the touch detection means detects a touch, but instead executes processing corresponding to movement of the touched position.

Description

Electronic device and control method thereof
Technical Field
The present invention relates to an electronic apparatus and a method for controlling the same.
Background
A digital camera is known that includes a touch display for displaying a live view image and enables a user to designate, at the touched position on the touch display, a focus detection area (AF frame) and an area to be appropriately exposed by automatic exposure (AE) control (patent document 1).
Documents of the prior art
Patent document
Patent document 1: japanese patent laid-open No. 2012-
Disclosure of Invention
Problems to be solved by the invention
Many modern digital cameras have image sensors of 16 to 20 megapixels or more, whereas the resolution of the touch display is only about 0.4 to 1.2 megapixels and its size is no greater than about 3 inches. While some smartphones and tablet devices feature larger, higher-resolution touch displays, even those displays have no more than about 4 megapixels and are no larger than about 10 inches.
The captured image displayed on the touch display is therefore shown at a greatly reduced size, which makes it difficult to correctly designate an intended position through a touch operation on the touch display. In particular, when designating an AF frame or AE area for shooting, the user must either touch the screen with a finger of one hand while holding the electronic apparatus with the other hand, or touch the screen with a finger of the very hand that is holding the apparatus. This makes it even more difficult to correctly designate the intended position.
In view of these problems with the conventional art, it is an object of the present invention to provide an electronic apparatus, and a method for controlling the same, that strike a balance between the convenience of touch operation and accurate position designation.
Means for solving the problems
The above object is achieved by an electronic device comprising: setting means for setting a first operation mode when an area satisfying a predetermined condition is detected in an image, and setting a second operation mode when no such area is detected or detection of the area is not performed; touch detection means for detecting operations on a touch sensing surface; and control means for performing control such that, when the first operation mode is set, a function corresponding to the position at which the touch detection means detects a touch is executed, and, when the second operation mode is set, processing corresponding to the position at which the touch detection means detects a touch is not executed, and processing based on movement of the touched position is executed instead.
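As an illustration of this control, the mode selection and touch dispatch can be sketched as follows. This is a minimal sketch under assumed names (TouchModeController, executeFunctionAt, and the like are not terms from the disclosure):

```cpp
#include <optional>

// Hypothetical types; none of these names come from the patent itself.
struct Point  { int x = 0, y = 0; };
struct Region { Point center; int width = 0, height = 0; };

enum class OperationMode { AbsolutePosition, RelativePosition };

class TouchModeController {
public:
    // "Setting means": choose the mode from the area detection result.
    void updateMode(const std::optional<Region>& detectedArea, bool detectionEnabled) {
        mode_ = (detectionEnabled && detectedArea.has_value())
                    ? OperationMode::AbsolutePosition    // first operation mode
                    : OperationMode::RelativePosition;   // second operation mode
    }

    // "Control means": dispatch a touch event according to the mode.
    void onTouch(const Point& position, const Point& delta) {
        if (mode_ == OperationMode::AbsolutePosition) {
            executeFunctionAt(position);   // the touched position itself is used
        } else {
            processMovement(delta);        // only the movement of the touch is used
        }
    }

private:
    OperationMode mode_ = OperationMode::RelativePosition;
    void executeFunctionAt(const Point&) { /* e.g. select the face at this position */ }
    void processMovement(const Point&)   { /* e.g. move the AF frame by this delta */ }
};
```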
Advantageous Effects of Invention
According to the present invention, it is possible to provide an electronic apparatus that achieves both convenience of touch operation and accurate position designation, as well as a method for controlling such an apparatus.
Other features and advantages of the present invention will become apparent from the following description taken in conjunction with the accompanying drawings. Note that the same reference numerals denote the same or similar components throughout the drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
Fig. 1A is a perspective view showing an example of an appearance of a digital camera according to an embodiment of the present invention.
Fig. 1B is a perspective view showing an example of the appearance of a digital camera according to an embodiment of the present invention.
Fig. 2 is a perspective view illustrating an example of an appearance of a remote controller according to an embodiment.
Fig. 3A is a block diagram showing an example of a functional structure of a digital camera according to the embodiment.
Fig. 3B is a block diagram showing an example of a functional structure of a remote controller according to the embodiment.
Fig. 4AA is a flowchart relating to the operation of the digital camera according to the embodiment.
Fig. 4AB is a flowchart relating to the operation of the digital camera according to the embodiment.
Fig. 4BA is a flowchart relating to the operation of the digital camera according to the embodiment.
Fig. 4BB is a flowchart relating to the operation of the digital camera according to the embodiment.
Fig. 4CA is a flowchart relating to the operation of the digital camera according to the embodiment.
Fig. 4CB is a flowchart relating to the operation of the digital camera according to the embodiment.
Fig. 5A is a diagram illustrating an example of screen display in a remote controller according to the embodiment.
Fig. 5B is a diagram illustrating an example of screen display in the remote controller according to the embodiment.
Fig. 5C is a diagram showing an example of screen display in the remote controller according to the embodiment.
Fig. 5D is a diagram illustrating an example of screen display in the remote controller according to the embodiment.
Fig. 5E is a diagram illustrating an example of screen display in the remote controller according to the embodiment.
Fig. 5F is a diagram illustrating an example of screen display in the remote controller according to the embodiment.
Fig. 6A is a diagram illustrating an example of screen display in the digital camera according to the embodiment.
Fig. 6B is a diagram illustrating an example of screen display in the digital camera according to the embodiment.
Fig. 6C is a diagram illustrating an example of screen display in the digital camera according to the embodiment.
Fig. 6D is a diagram illustrating an example of screen display in the digital camera according to the embodiment.
Fig. 6E is a diagram illustrating an example of screen display in the digital camera according to the embodiment.
Fig. 7 is a flowchart related to an operation of the remote controller according to the embodiment.
Detailed Description
Exemplary embodiments of the present invention will now be described with reference to the accompanying drawings.
● (digital camera 100)
Fig. 1A and 1B are perspective views showing the appearance of a digital single-lens reflex camera 100 (hereinafter simply "camera 100"), an example of an electronic apparatus according to the present embodiment, viewed diagonally from the front top and from the rear bottom, respectively.
The display unit 128 provided on the back of the camera 100 is a touch display. The display unit 128 is used to display captured images and various information such as the setting values of the camera 100, and also functions as an operation unit for accepting touch operations. If the apparatus has a separate viewfinder, as the camera 100 according to the present embodiment does, a touch sensor without a display function, such as a touch pad, may be used instead of the display unit 128, or a touch sensor may be provided at a position different from that of the display unit 128. The viewfinder may be optical or electronic. If the apparatus does not have the display unit 128, operations that would be performed on the touch display can be replaced with operations performed on the touch sensor while looking through the viewfinder (that is, while viewing the display unit visible through the eyepiece portion of the viewfinder). The same applies to an apparatus that detects when the user is looking through the viewfinder and disables the display of the display unit 128 while keeping its touch sensor function active.
The shutter button 161 is an operation unit for inputting a shooting preparation instruction, a shooting start instruction, and the like. The mode change switch 160 is an operation unit for changing a shooting mode of the camera 100 and the like. The terminal cover 191 protects a connector of a cable for communication with an external device. The dial 171 is used to change setting values such as a shutter speed and an aperture. The power switch 172 switches the power of the camera 100 on and off.
The dial 173 is used to move a selection frame, advance and rewind displayed images, and the like. The dial 173 can also be pressed in the up, down, left, and right directions, allowing it to double as the direction key 174; furthermore, a setting button 175 is provided at the rotation axis of the dial 173. The direction key 174 is used to move a cursor, and the setting button 175 is used to accept a selected item, input a confirmation response, and the like.
A Live View (LV) button 176 is used to switch live view display in the display unit 128 on and off when the camera 100 is in the still image shooting mode. The LV button 176 is used to instruct start and stop of moving image shooting (recording) when the camera 100 is in a moving image shooting mode.
The enlarge button 177 is used to switch the enlarged mode of the live view display on and off and to change the magnification. When the camera 100 is in the playback mode, the enlarge button 177 is used to increase the display magnification of the played-back image. The playback button 178 is used to switch the camera 100 between the shooting mode and the playback mode. When the playback button 178 is pressed in the shooting mode, the camera 100 shifts to the playback mode, and the latest image recorded on the recording medium is displayed in the display unit 128.
The openable/closable cover 120 protects the recording medium and the member that holds it from dust and the like. The grip 190 is shaped to make it easy for the user to hold the camera 100 with one hand, and contains a battery inside. The mirror 112 is flipped up and down by an actuator controlled by the system control unit 150 (fig. 3A), thereby switching the traveling direction of the light beam from the lens unit. The terminal 110 contacts a terminal provided on a lens unit that can be attached to and detached from the camera 100, and is used to supply power from the camera 100 to the lens unit, to carry out communication between the camera 100 and the lens unit, and so on.
The eyepiece 116 is the eyepiece of an optical viewfinder. By looking through the eyepiece 116, the user can view the image of the subject formed by the lens unit, as well as setting values such as the shutter speed, aperture, and ISO speed, and the AF frame.
● (remote controller 200)
Fig. 2 is a perspective view showing an example of an appearance of a remote controller 200 of the camera 100 (hereinafter simply referred to as "remote controller 200"). The remote controller 200 can remotely control the camera 100 by communicating with the camera 100. Although the remote controller 200 and the camera 100 can communicate through a wired medium or a wireless medium, the present embodiment assumes that wired communication is established using the connection cable 291. The connector of the connection cable 291 is connected to a connector in the terminal cover 191 of the camera 100. The connection cable 291 enables bidirectional communication between the camera 100 and the remote controller 200.
The power switch 272 switches the power of the remote controller 200 on and off. The display unit 228 is a touch display. Although the display unit 228 has the same aspect ratio as the display unit 128 of the camera 100, in the present embodiment, the camera 100 does not transmit the live view image to the remote controller 200, and thus the live view image is not displayed in the display unit 228.
The functions of the operation buttons 271, three of which are arranged above and three below the display unit 228, change according to what is displayed in the display unit 228. The direction key 274 includes keys corresponding to up, down, left, and right, and a setting button 275 is arranged at the center of the direction key 274. The direction key 274 is mainly used to input instructions for moving a cursor, and the setting button 275 is mainly used to input instructions for executing and accepting a selected item. The home (HOME) button 276 is used to instruct the remote controller 200 to switch to a home screen. The menu button 277 is used to instruct the remote controller 200 to switch to a menu screen. The return button 278 is used to input a cancel instruction, an instruction to return to the previous screen, and the like. The various keys and buttons 271 and 273 to 278 other than the power switch 272 are collectively included in the operation unit 270 (fig. 3B) of the remote controller 200.
● (function structure of camera and remote controller)
Fig. 3A and 3B are block diagrams showing examples of functional structures of the camera 100 and the remote controller 200, respectively, in which the same reference numerals are attached to constituent elements shown in fig. 1A to 2. The lens unit 140 (hereinafter simply referred to as "lens 140") is an interchangeable lens that can be attached to the camera 100 and detached from the camera 100. The lens 140 has a lens group 103 including movable lenses such as a focus lens and a zoom lens. The terminal 146 is in contact with the terminal 110 when the lens 140 is attached to the camera 100, and enables bidirectional communication between the lens 140 and the camera 100, power supply from the camera 100 to the lens 140, and the like.
The lens control unit 144 is a programmable processor such as a CPU or MPU, and controls the operation of the lens 140 by loading a program stored in the nonvolatile memory 145 into the system memory 147 and executing it. By communicating with the system control unit 150 via the terminals 146 and 110, the lens control unit 144 controls the diaphragm mechanism 102 via the diaphragm drive circuit 142 and controls the position of the movable lenses in the lens group 103 via the AF drive circuit 143. The AF/MF switch (SW) 148 is a switch for selecting whether or not the autofocus function of the lens 140 is to be used, and its setting is communicated from the lens control unit 144 to the system control unit 150. Note that some lenses 140 do not have the AF/MF switch 148.
In the camera 100, a focusing screen 113, a pentaprism 114, and an eyepiece 116 constitute an optical finder. The user can confirm the photographing range, the focus position, and the like by observing the object image formed on the focusing screen 113 of the optical finder through the pentaprism 114 and the eyepiece 116. An AE sensor 117 provided in the optical viewfinder measures the brightness of the object image formed by the lens 140.
The focus detection unit 111 has an AF sensor that performs pupil division on the light beam from the lens 140, and outputs defocus information (direction and amount) to the system control unit 150. Based on the defocus information, the system control unit 150 communicates the movement amount and movement direction for the focus lens to the lens control unit 144. The lens control unit 144 uses the AF drive circuit 143 to move the focus lens of the lens 140 in the communicated direction by the communicated amount.
The quick return mirror 112 (hereinafter simply referred to as "mirror 112") switches the traveling direction of the light beam incident from the lens 140 between the direction of the optical viewfinder and the direction of the image pickup unit 122. Fig. 3A shows a state where the mirror 112 is lowered, and thus the light beam travels in the direction of the optical finder. At the time of exposure to the image pickup unit 122 (at the time of shooting a still image, shooting a live view, or shooting a moving image to be recorded), the mirror 112 is turned upward, and the light beam travels in the direction of the image pickup unit 122. The mirror 112 has a half-mirror region so that the focus detection unit 111 can receive a part of the light beam in the case where the mirror 112 is lowered.
The mechanical shutter 101 is opened and closed under the control of the system control unit 150, and exposes the image pickup unit 122 for a prescribed amount of time. Note that the mechanical shutter 101 is continuously opened during the use of the electronic shutter, such as when shooting a moving image.
The image pickup unit 122 includes an image sensor having a plurality of pixels for converting an optical image into an electrical signal by photoelectric conversion. The a/D converter 123 converts an analog signal generated by the image pickup unit 122 into digital data.
The image processing unit 124 applies prescribed image processing to data from the a/D converter 123 or from the memory control unit 115. Demosaicing, white balance adjustment, tone correction, scaling (resizing), color conversion, encoding, decoding, region detection, and the like are examples of image processing, but the image processing is not limited thereto. The "region detection" is a process for detecting a region satisfying a predetermined condition (for example, a region having a specific pattern or feature such as a face of a human or an animal and a region having a specific color or brightness, etc.) in an image, and is also referred to as "subject detection". The image processing unit 124 may also be used to perform processing for identifying a person, and detecting facial expressions, eye closure, red eye, and the like. It is assumed below that the image processing unit 124 performs face detection processing for detecting a face of a person as an example of object detection.
The image processing unit 124 can also generate control information from an image obtained by shooting (a captured image). For example, when contrast AF is used, the image processing unit 124 can generate an AF evaluation value for the focus detection area and supply that value to the system control unit 150. The image processing unit 124 can also generate luminance evaluation values, luminance histograms, and so on. Furthermore, the image processing unit 124 can detect the destination to which a predetermined image area has moved among a plurality of images obtained by sequential shooting (for example, a plurality of moving image frames). A known method, such as motion vector detection using template matching, can be used to find the moving direction and moving amount of a specific region between images, but the method is not limited thereto. For example, setting the focus detection area (AF frame) as the template so that the AF frame can be tracked enables the tracking AF described later.
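As a hedged illustration of motion detection by template matching, the region set as the template can be searched for in a later frame by minimizing a block-matching cost such as the sum of absolute differences (SAD). The types and function names below are assumptions, not part of the disclosure:

```cpp
#include <cstdint>
#include <cstdlib>
#include <climits>
#include <vector>

// Minimal grayscale image; the names here are illustrative assumptions.
struct Image {
    int width = 0, height = 0;
    std::vector<uint8_t> pixels;   // row-major
    uint8_t at(int x, int y) const { return pixels[y * width + x]; }
};

struct Point { int x = 0, y = 0; };

// Find where a template (e.g. the contents of the AF frame) has moved to in a
// later frame by minimizing the sum of absolute differences over a search
// window centered on the previous position.
Point trackBySad(const Image& frame, const Image& tmpl, Point prev, int radius) {
    long bestCost = LONG_MAX;
    Point best = prev;
    for (int dy = -radius; dy <= radius; ++dy) {
        for (int dx = -radius; dx <= radius; ++dx) {
            const int ox = prev.x + dx, oy = prev.y + dy;
            if (ox < 0 || oy < 0 ||
                ox + tmpl.width > frame.width || oy + tmpl.height > frame.height)
                continue;   // candidate window falls outside the frame
            long cost = 0;
            for (int y = 0; y < tmpl.height; ++y)
                for (int x = 0; x < tmpl.width; ++x)
                    cost += std::abs(int(frame.at(ox + x, oy + y)) - int(tmpl.at(x, y)));
            if (cost < bestCost) { bestCost = cost; best = {ox, oy}; }
        }
    }
    return best;   // new top-left corner of the tracked region
}
```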
The digital data output by the A/D converter 123 is written into the memory 132 via the image processing unit 124 and the memory control unit 115, or directly via the memory control unit 115. In addition to the digital data output from the A/D converter 123, the memory 132 stores image data read out from the recording medium 185 and the like. The memory 132 has a storage capacity sufficient to serve as a buffer memory during continuous shooting and the like.
The memory 132 also functions as a video memory, storing the images displayed in the display unit 128, on-screen display (OSD) data representing various information such as camera setting values, and so on. The D/A converter 125 converts the display image data (a composite of the captured image and the OSD) stored in the memory 132 into an analog signal and supplies it to the display unit 128, causing the display image to be displayed in the display unit 128.
The display unit 128 is, for example, a touch display provided with a touch panel 128a on a screen. The position detection system of the touch panel 128a is not limited, and for example, an electrostatic capacitance type touch panel may be used. Although the touch panel 128a is shown in the drawings as a separate entity from the display unit 128 for simplicity, it should be noted that touch panel functionality may be built into the display unit 128 as in an in-cell touch display. By displaying a moving image in the display unit 128 while the moving image is being shot, the display unit 128 can be made to function as an EVF. An image displayed when the display unit 128 is caused to function as an EVF is referred to as a "live view image".
The nonvolatile memory 156 is a memory such as an EEPROM that can be electrically erased and recorded. The nonvolatile memory 156 stores programs executed by the system control unit 150, various setting values, GUI data, and the like.
The system control unit 150 is a programmable processor such as a CPU or MPU, and controls the operation of the camera 100 by loading a program stored in the nonvolatile memory 156 into the system memory 152 and executing the program. The system control unit 150 also performs display control by controlling the memory 132, the D/a converter 125, and the display unit 128.
The system control unit 150 performs automatic exposure control (AE) processing based on luminance information obtained from the AE sensor 117 and automatic focus detection (AF) processing based on defocus information obtained from the focus detection unit 111. The system control unit 150 also uses information obtained by the image processing unit 124 in AE processing, AF processing, and the like as necessary. For example, by using the result of the face area detection from the image processing unit 124, the system control unit 150 can perform AF processing of setting a focus detection area (AF frame) to the face of a person, AE processing of appropriately exposing the face of a person, flash adjustment, or the like. The system control unit 150 can also perform AF (tracking AF) that tracks the movement of a face by causing the image processing unit 124 to track an area corresponding to a specific face. Further, when contrast AF is performed, the system control unit 150 may perform shooting while changing the position of the focus lens to search for a focus lens position at which the AF evaluation value obtained by the image processing unit 124 is highest.
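The contrast AF search mentioned above can be sketched as a simple peak search over focus lens positions. This is an assumption-laden illustration rather than the camera's actual algorithm; the two callables stand in for the lens drive and the image processing unit 124:

```cpp
#include <functional>

// A minimal contrast-AF sketch: step the focus lens through a range and keep
// the position where the AF evaluation value (contrast in the focus detection
// area) peaks. All names are illustrative.
int searchPeakFocusPosition(const std::function<void(int)>& moveFocusLens,
                            const std::function<double()>& afEvaluationValue,
                            int nearEnd, int farEnd, int step) {
    int bestPos = nearEnd;
    double bestVal = -1.0;
    for (int pos = nearEnd; pos <= farEnd; pos += step) {
        moveFocusLens(pos);               // shoot with the lens at this position
        const double v = afEvaluationValue();
        if (v > bestVal) { bestVal = v; bestPos = pos; }
    }
    moveFocusLens(bestPos);               // settle on the sharpest position found
    return bestPos;
}
```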
The system timer 153 measures time used for various controls, measures time of an internal clock, and the like.
The mode change switch 160 switches the operation mode of the camera 100 among a still image recording mode, a moving image recording mode, a playback mode, and the like. The still image recording mode includes an automatic shooting mode, an automatic scene determination mode, a manual mode, an aperture priority mode (Av mode), a shutter speed priority mode (Tv mode), a scene mode, a program AE mode, a custom mode, and the like. The scene mode automatically sets parameters based on the type of shooting scene, such as sports, snow scenes, or fireworks. The mode change switch 160 may be capable of selecting at least one of these modes directly. Alternatively, the still image shooting mode may first be selected using the mode change switch 160, after which one of the plurality of modes included in the still image shooting mode may be selected using another operation member. The moving image shooting mode may likewise include a plurality of modes.
When the shutter button 161 is pressed halfway, the first shutter switch 162 turns on and issues a first shutter switch signal SW1. Taking the first shutter switch signal SW1 as an instruction to prepare for shooting, the system control unit 150 starts operations such as AF processing and AE processing. When the shutter button 161 is pressed fully, the second shutter switch 164 turns on and issues a second shutter switch signal SW2. Taking the second shutter switch signal SW2 as an instruction to start shooting, the system control unit 150 starts a series of shooting and recording processes, from exposure and readout of the image pickup unit 122 through generation of image data and writing of that data to the recording medium 185.
The operation unit 170 represents the set of operation members (the dial 171 through the enlarge button 177) shown in fig. 1A and 1B. The functions of these operation members may be dynamically assigned according to what is displayed in the display unit 128: for example, the functions of a menu button, an end button, a return button, a next image button, a jump button, a sort button, an attribute change button, and the like may be assigned. When, for example, the menu button is operated, the system control unit 150 causes a menu screen for making various settings to be displayed in the display unit 128. The user can make various settings, input instructions, and so on from the menu screen by using the direction key 174 and the setting button 175 together, or by operating the touch panel 128a, for example.
Note that operations to be performed in response to the operation of a button or key included in the operation unit 170 may be set in advance according to a screen displayed in the display unit 128 and stored in the nonvolatile memory 156. These settings may be changeable by a user.
The power supply control unit 180 includes a battery detection circuit, a DC-DC converter, a switching circuit for switching blocks to be energized, and the like, and detects whether a power supply is connected, the type of power supply, the remaining battery level, and the like. The power supply control unit 180 also controls the DC-DC converter based on these detection results and under the control of the system control unit 150, and supplies a required voltage to various units including the recording medium 185 for a required period of time.
The power supply unit 130 may be a primary battery, a secondary battery, or an AC adapter. The recording medium I/F 118 is an interface for the recording medium 185, which is a memory card, a hard disk, or the like. The recording medium 185 may be removable or built in.
The external I/F193 is a communication interface with an external device, and includes a connector for connecting a cable. The external I/F193 includes, for example, a USB interface for connecting to a personal computer or the like, HDMI (registered trademark) for connecting to an external monitor, an interface for the remote controller 200, and a corresponding connector. The communication I/F192 is a communication interface with an external device, and includes an antenna for wireless communication, a modulation/demodulation circuit, and the like. The communication I/F192 includes an antenna and a circuit for wireless communication based on standards such as wireless LAN and Bluetooth (registered trademark).
In the present embodiment, the system control unit 150 can set the focus detection area (AF frame) to a fixed position, a subject area, or a desired position designated by the user, and perform automatic focus detection (AF). The fixed position may be, for example, the center of the screen. The subject area may be, for example, a face region detected by the image processing unit 124 in the live view image. The desired position designated by the user may be a position in the image specified, for example, by using the direction key 174 and the setting button 175 together, or through a touch operation performed on the touch panel 128a. Setting the AF frame to a position designated by a touch operation and then performing AF will hereinafter be referred to as "touch AF".
The system control unit 150 serving as a touch detection means can detect the following operation performed on a touch sensing surface (a display surface of the display unit 128) of the touch panel 128a, or a state of the touch sensing surface.
A finger or pen that was not touching the touch panel 128a newly touches it; in other words, a touch begins (hereinafter, "touch-down").
A finger or pen is touching the touch panel 128a (hereinafter, "touch-on").
A finger or pen moves while touching the touch panel 128a (hereinafter, "touch-move").
A finger or pen that had been touching the touch panel 128a is lifted off; in other words, the touch ends (hereinafter, "touch-up").
Nothing is touching the touch panel 128a (hereinafter, "touch-off").
When a touch-down is detected, a touch-on is detected at the same time. After a touch-down, a touch-on normally continues to be detected as long as no touch-up is detected. A touch-move is also detected in a state where a touch-on is detected; however, even if a touch-on is detected, a touch-move is not detected unless the touched position moves. After a touch-up has been detected for all fingers or pens that had been touching, the state becomes touch-off.
These operations/states, position coordinates on the touch panel 128a that are constantly touched by a finger or a pen, and the like are communicated to the system control unit 150 through the internal bus. The system control unit 150 determines what type of operation (touch operation) is performed on the touch panel 128a based on the communicated information. Regarding the touch movement, the moving direction of the finger or pen moving on the touch panel 128a can be determined based on the change in the position coordinates for each of the vertical component and the horizontal component on the touch panel 128 a.
If a touch-move of a prescribed distance or more is detected, it is determined that a slide operation (drag operation) has been performed. An operation in which a finger touching the touch panel 128a is moved quickly over a short distance and then lifted is called a "flick"; in other words, a flick is an operation of quickly flicking a finger across the touch panel 128a. If a touch-move of a predetermined speed or more over a predetermined distance or more is detected, followed by a touch-up, it can be determined that a flick has been performed (it can be determined that a flick occurred following a slide operation).
Furthermore, a touch operation in which a plurality of positions (for example, two points) are touched simultaneously and the touched positions are brought together is called a "pinch-in", and one in which the touched positions are moved apart is called a "pinch-out". Pinch-out and pinch-in are collectively referred to as a pinch operation (or simply "pinching"). Any of various types of touch panels, such as resistive film, electrostatic capacitance, surface acoustic wave, infrared, electromagnetic induction, image recognition, and photodetector types, may be used as the touch panel 128a. Depending on the type, a touch is detected when contact is made with the touch panel, or when a finger or pen merely approaches the touch panel; either type may be used.
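As a rough illustration, the distinction among tap, slide, and flick described above might be implemented as follows. The thresholds are invented placeholders; the disclosure specifies only "a prescribed distance" and "a predetermined speed":

```cpp
// Illustrative classification of a completed touch stroke. The struct and
// threshold names are assumptions, not values from the patent.
struct TouchStroke {
    float distance = 0.0f;   // total movement while touching, in pixels
    float speed    = 0.0f;   // speed just before touch-up, in pixels/second
};

enum class Gesture { Tap, Slide, Flick };

Gesture classifyOnTouchUp(const TouchStroke& s) {
    const float kSlideDistance = 20.0f;    // prescribed distance for a slide
    const float kFlickSpeed    = 800.0f;   // predetermined speed for a flick
    if (s.distance < kSlideDistance) return Gesture::Tap;
    if (s.speed >= kFlickSpeed)      return Gesture::Flick;   // quick flicking motion
    return Gesture::Slide;                                    // drag operation
}
```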
Next, the remote controller 200 will be explained.
The nonvolatile memory 256 is a memory such as an EEPROM that can be electrically erased and recorded. The nonvolatile memory 256 stores programs executed by the remote controller control unit 250, various setting values, GUI data, and the like.
The remote controller control unit 250 is a programmable processor such as a CPU or MPU, and controls the operation of the remote controller 200 by loading a program stored in the nonvolatile memory 256 into the system memory 252 and executing the program.
The memory 232 functions as a video memory of the display unit 228, and the remote controller control unit 250 internally stores image data for display. The display image is displayed in the display unit 228 by converting the display image data stored in the memory 232 into an analog signal using the D/a converter 213 and supplying the analog signal to the display unit 228. The remote controller control unit 250 also performs display control by controlling the memory 232, the D/a converter 213, and the display unit 228.
The display unit 228 is a touch display provided with a touch panel 228a on a screen. The type of the touch panel 228a is not limited, as is the touch panel 128a of the camera 100. In addition, the function of the touch panel may be built in the display unit 228.
The system timer 253 measures time used for various types of control, measures time of an internal clock, and the like.
The operation unit 270 collectively represents the various keys and buttons 271 and 273 to 278 shown in fig. 2, other than the power switch 272. The functions of the operation buttons 271 may be dynamically assigned, and the name or the like of the currently assigned function may be displayed in the display unit 228. An end button, a return button, a next image button, a jump button, a sort button, an attribute change button, and the like may be given as examples of assigned functions, but the functions are not limited thereto.
When the home button 276 is pressed, the remote controller control unit 250 reads out GUI data from the nonvolatile memory 256 and causes a predetermined home screen to be displayed in the display unit 228. Likewise, when the menu button 277 is pressed, the remote controller control unit 250 reads out GUI data from the nonvolatile memory 256 and causes a predetermined menu screen to be displayed in the display unit 228.
The remote controller control unit 250 also controls the display according to the screen, for example, moves the focus (highlight of the selected item) in the GUI screen in response to the operation of the direction key 274. Upon operating the setting button 275, the remote controller control unit 250 performs processing or changes a screen or the like according to the item selected at the point in time. Operations to be performed in response to the operation of the button or key included in the operation unit 270 may be set in advance according to the screen displayed in the display unit 228 and stored in the nonvolatile memory 256. These settings may be changeable by a user.
The power supply control unit 280 includes a battery detection circuit, a DC-DC converter, a switching circuit for switching the blocks to be energized, and so on, and detects whether a power supply is connected, the type of power supply, the remaining battery level, and the like. The power supply control unit 280 also controls the DC-DC converter based on these detection results and under the control of the remote controller control unit 250, supplying the required voltage to each unit for the required period.
The power supply unit 230 may be a primary battery, a secondary battery, or an AC adapter. The external I/F293 is a communication interface with the camera 100, and the cable 291 is connected to the external I/F293. Communication with the camera 100 is performed via the external I/F293 under the control of the remote controller control unit 250. The communication I/F292 is a communication interface with an external device, and includes an antenna for wireless communication, a modulation/demodulation circuit, and the like. The communication I/F292 includes an antenna and a circuit for wireless communication based on standards such as wireless LAN and Bluetooth (registered trademark).
The remote controller control unit 250 can detect whether or not the touch sensing surface of the touch panel 228a (the display surface of the display unit 228) is touched, a change (speed) of the touched position with the passage of time, and the like, and thus can recognize various touch operations. It is assumed here that the remote controller control unit 250 can recognize the same touch operation as the system control unit 150 described above.
● (instructions for operation-Camera 100)
Next, operations performed while the camera 100 is capturing and displaying a live view image (such as during shooting standby or while recording a moving image) in the image capturing system shown in fig. 3A and 3B will be described using fig. 4AA to 4CB. The operations shown in the flowcharts of fig. 4AA to 4CB are implemented by the system control unit 150 loading a program stored in the nonvolatile memory 156 into the system memory 152 and executing it to control the units of the camera 100.
In the operations described below, input from the remote controller 200 may be replaced with input from the operation unit 170 of the camera 100 (including the touch panel 128a) or from the AF/MF switch 148 of the lens 140. Furthermore, if the remote controller 200 is not connected, or if the remote controller 200 is disabled, the processing related to output to the remote controller 200 need not be executed.
In S4001, the system control unit 150 determines whether a request to change the AF mode is received from the remote controller 200 (or from the operation unit 170) via the external I/F193. If the request is received, the process proceeds to S4002, and if the request is not received, the process proceeds to S4003. In the present embodiment, "AF mode" means an operation state of Auto Focus (AF). "request to change AF mode" means a request to affect the operational state of AF, e.g., a request to change a parameter value. In the present embodiment, as specific examples of a request to change the AF mode, setting whether or not to enable face detection, setting an object of subject tracking, instructing start and stop of tracking, instructing switching between Manual Focusing (MF) and autofocus, and the like can be given, but the request is not limited thereto.
In the present embodiment, if the face detection is effective, the user can select a subject to be focused on. The user can also select the position of the area (AF frame) where focus detection is to be performed. Further, tracking AF of AF while tracking a main subject (a subject at the position of the face area or the AF frame) selected by the user may be performed, and the user may start and stop the tracking AF, change an area to be tracked, and the like.
In S4002, the system control unit 150 changes the AF mode in response to the AF mode change request. For example, the system control unit 150 may change the AF mode by changing the parameter value corresponding to the current AF mode held in the system memory 152.
In S4003, the system control unit 150 determines whether the current AF mode is the tracking standby state by referring to the system memory 152; if the mode is the tracking standby state, the process proceeds to S4016, and if the mode is not the tracking standby state, the process proceeds to S4004.
In S4004, the system control unit 150 determines whether the current AF mode is the tracking-in-progress state by referring to the system memory 152; if the mode is the tracking-in-progress state, the process proceeds to S4014 of fig. 4AB, and if not, the process proceeds to S4005.
In S4005, the system control unit 150 determines whether face detection is enabled in the current AF mode by referring to the system memory 152; if face detection is enabled, the process proceeds to S4006, and if it is disabled, the process proceeds to S4007 of fig. 4AB.
In S4006, the system control unit 150 determines whether a face is detected in an image frame captured by the imaging unit 122; if a face is detected, the process proceeds to S4008, and if no face is detected, the process proceeds to S4011 of fig. 4 AB.
In S4007, the system control unit 150 determines whether the current AF mode is AF or MF by referring to the system memory 152; if the mode is AF, the process proceeds to S4011 of fig. 4AB, and if the mode is MF, the process proceeds to S4013 of fig. 4 AB.
In S4008, the system control unit 150 determines whether the relative position timer, implemented using the system timer 153, is currently running. The relative position timer is an example of a means for measuring a set amount of time; it starts measurement (counting) in S4204, described later, and stops when the set amount of time has elapsed (when the count reaches a prescribed value, e.g., 0). The system control unit 150 advances the process to S4011 if the relative position timer is currently running, and to S4009 if it is not.
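A minimal sketch of such a relative position timer, assuming a std::chrono-based implementation (the class and method names are illustrative):

```cpp
#include <chrono>

// Once started, the timer reports itself as running until the set amount of
// time has elapsed, which is what S4008 checks before sending face information.
class RelativePositionTimer {
public:
    void start(std::chrono::milliseconds duration) {
        deadline_ = Clock::now() + duration;
        running_  = true;
    }
    // Returns true while the set amount of time has not yet elapsed.
    bool isRunning() {
        if (running_ && Clock::now() >= deadline_) running_ = false;  // expired
        return running_;
    }
private:
    using Clock = std::chrono::steady_clock;
    Clock::time_point deadline_{};
    bool running_ = false;
};
```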
In S4009, the system control unit 150 communicates the type of display screen (the area selection screen) and information on each detected face (e.g., identification information, position, and size) to the remote controller 200 via the external I/F 193. If a plurality of faces is detected, the system control unit 150 sets one of them as the main face according to predetermined conditions and communicates to the remote controller 200 which face is the main face. Although the conditions for selecting the main face are not particularly limited, one or more conditions, such as the largest face or the face closest to the center of the screen, may be used. If a face previously designated as the main face by the user is detected, the system control unit 150 sets that face as the main face. Instead of the type of display screen, the display data itself may be transmitted.
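One possible main-face selection rule combining the conditions above (face size and closeness to the screen center) is sketched below; the scoring weights and all names are assumptions:

```cpp
#include <cmath>
#include <vector>

// Illustrative main-face selector; none of these names come from the patent.
struct Face {
    int   id = -1;
    float cx = 0.0f, cy = 0.0f;     // face center, normalized to [0, 1]
    float size = 0.0f;              // face width, normalized to [0, 1]
    bool  userDesignated = false;   // previously designated as main by the user
};

int selectMainFace(const std::vector<Face>& faces) {
    int bestId = -1;
    float bestScore = -1.0f;
    for (const Face& f : faces) {
        if (f.userDesignated) return f.id;           // the user's choice wins
        const float centerDist = std::hypot(f.cx - 0.5f, f.cy - 0.5f);
        const float score = f.size - 0.5f * centerDist;  // bigger and more central
        if (score > bestScore) { bestScore = score; bestId = f.id; }
    }
    return bestId;                                   // -1 when no face is detected
}
```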
Fig. 5A shows an example of an area selection screen displayed in the display unit 228 by the remote controller control unit 250 in response to the communication of S4009.
5001a is a guidance display showing that a function for instructing transition to the tracking standby state is assigned to one of the devices included in the operation unit 270.
5002a is a guidance display representing that a function for instructing switching between AF and MF is assigned to one of the devices included in the operation unit 270.
5003a is a display indicating the current focus mode of the camera 100, and in the example of fig. 5A, indicates that the current focus mode is AF.
5004 is a guidance display representing that a function for instructing switching between enabling and disabling face detection is assigned to one of the devices included in the operation unit 270.
5005A is a display indicating whether the current face detection function of the camera 100 is active or inactive, and in the example of fig. 5A, indicates that the face detection function is currently active.
6003, 6004a, and 6004b are face frames displayed by the remote controller control unit 250 based on the communicated face information, and are examples of marker displays serving as indicators of the positions and sizes of the detected faces. Here, three faces have been detected, and the face frame 6003 corresponding to the main face is displayed differently from the other face frames 6004a and 6004b. Note that the remote controller control unit 250 displays the face frames after converting the position, size, and the like of each face area, expressed in image coordinates in the face information, to match the resolution of the display unit 228.
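The coordinate conversion mentioned in the note above can be sketched as a simple per-axis scaling, assuming (as stated elsewhere for the display units 128 and 228) that the aspect ratios match. Names are illustrative:

```cpp
// Convert a face area from image coordinates to the resolution of the remote
// controller's display unit 228. The Rect type and parameter names are assumed.
struct Rect { float x = 0, y = 0, w = 0, h = 0; };

Rect toDisplayCoordinates(const Rect& faceInImage,
                          float imageWidth,  float imageHeight,
                          float displayWidth, float displayHeight) {
    // Equal aspect ratios are assumed, so a per-axis scale is sufficient.
    const float sx = displayWidth  / imageWidth;
    const float sy = displayHeight / imageHeight;
    return { faceInImage.x * sx, faceInImage.y * sy,
             faceInImage.w * sx, faceInImage.h * sy };
}
```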
Note that the input devices included in the operation unit 270 are not limited to devices corresponding to the guidance displays 5001a to 5003a shown at the edges of the display unit 228. The structure may also be such that the guidance displays 5001a to 5003a can be operated by touching them directly via the touch panel 228a.
Thus, when the screen shown in fig. 5A is displayed, the user can perform the following operations from the remote controller 200:
Select a subject (face)
Instruct a transition to the tracking standby state
Switch between AF and MF
Switch face detection between enabled and disabled
In the present embodiment, the live view image is not displayed in the display unit 228 of the remote controller 200; instead, face frames are displayed based on the face information so that the user can easily designate a desired face from the remote controller 200. As shown in fig. 5F, the remote controller control unit 250 may instead display an area selection screen on which no face frames are displayed. In that case, a guidance display 5006a notifying the user of the functions assigned to the touch panel 228a and the corresponding operation methods can be shown.
When a screen from which an area detected by the camera 100 can be selected, such as the area selection screen of fig. 5A, is displayed in the display unit 228, absolute position operations on the touch panel 228a of the remote controller 200 are enabled (that is, the position information of the operation is used).
Therefore, in S4010, the system control unit 150 executes processing of an absolute position operation, which will be described later using fig. 4BA, and returns the processing to S4003.
In S4011, the system control unit 150 communicates the type of display screen (the AF frame position change screen) to the remote controller 200 via the external I/F 193. Instead of the type of display screen, the display data itself may be transmitted.
Fig. 5B shows an example of an AF frame position change screen displayed in the display unit 228 by the remote controller control unit 250 in response to the communication of S4011. This screen indicates that the camera 100 is in a state of accepting an operation to move the AF frame. The same display elements as those in fig. 5A are given the same reference numerals, and the description thereof will be omitted.
5006b is a guidance display indicating that, by performing a drag operation on the touch panel 228a, the AF frame can be moved from its current position to a position corresponding to the direction and distance of the drag operation.
Thus, with respect to changing the position of the AF frame, relative position operations on the touch panel 228a are enabled (the direction and amount of the operation are used) instead of absolute position operations.
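A sketch of such a relative position operation, in which the AF frame is shifted by the drag's direction and distance rather than jumping to the touched coordinates (all names are illustrative assumptions):

```cpp
#include <algorithm>

// The AF frame position is kept in normalized screen coordinates and nudged
// by each touch-move delta; std::clamp keeps it on screen (C++17).
struct Vec2 { float x = 0.0f, y = 0.0f; };

class AfFrame {
public:
    // Called on each touch-move with the change in touched position.
    void onDrag(const Vec2& delta, float sensitivity = 1.0f) {
        pos_.x = std::clamp(pos_.x + delta.x * sensitivity, 0.0f, 1.0f);
        pos_.y = std::clamp(pos_.y + delta.y * sensitivity, 0.0f, 1.0f);
    }
    Vec2 position() const { return pos_; }   // normalized screen coordinates
private:
    Vec2 pos_{0.5f, 0.5f};   // start at the center of the screen
};
```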
In S4012, the system control unit 150 executes processing of a relative position operation which will be described later using fig. 4BB, and returns the processing to S4003.
In S4013, the system control unit 150 communicates the type of display screen (the touch operation invalidation screen) to the remote controller 200 via the external I/F 193, and advances the process to S4015. Instead of the type of display screen, the display data itself may be transmitted.
Fig. 5C illustrates an example of the touch operation invalidation screen displayed by the remote controller control unit 250 in the display unit 228 in response to the communication of S4013. This screen indicates that the camera 100 is in a state of not accepting an operation related to the AF frame or the face frame. The same display elements as those in fig. 5A are given the same reference numerals, and the description thereof will be omitted.
In the example of fig. 5C, the function for instructing a transition to the tracking standby state is not available, and the guidance display 5001b is therefore displayed in, for example, a different color from the one used when the function is available, as shown in fig. 5A.
5003b indicates that the current focus mode is MF.
5005b indicates that the current face detection function is not valid.
5007c is a guidance display indicating that operations on the touch panel 228a (other than operations on the guidance displays 5002a and 5004) are invalid.
In S4014, the system control unit 150 communicates the type of display screen (the tracking-in-progress screen) to the remote controller 200 via the external I/F 193, and advances the process to S4015. Instead of the type of display screen, the display data itself may be transmitted.
Fig. 5D shows an example of the tracking-in-progress screen displayed in the display unit 228 by the remote controller control unit 250 in response to the communication of S4014. The tracking-in-progress screen indicates that the camera 100 is currently performing an AF operation while tracking the selected subject or AF frame (tracking AF), and that touch operations (other than the operation on the guidance display 5001c) are disabled.
5001c is a guidance display showing that a function for instructing the end of tracking is assigned to one of the devices included in the operation unit 270.
5007d is a guidance display indicating that the AF mode of the camera 100 is tracking AF and that operations on the touch panel 228a are invalid.
In S4015, the system control unit 150 executes the processing for the invalid position operation, which will be described later using fig. 4CA, and returns the process to S4003.
In S4016, the system control unit 150 communicates the type of display screen (the tracking standby screen) to the remote controller 200 via the external I/F 193, and advances the process to S4017. Instead of the type of display screen, the display data itself may be transmitted.
Fig. 5E shows an example of the tracking standby screen displayed in the display unit 228 by the remote controller control unit 250 in response to the communication of S4016. The tracking standby screen indicates that the AF mode of the camera 100 is the tracking standby state and that the tracking standby frame can be moved by a drag operation.
5001d is a guidance display showing that a function for instructing the start of tracking is assigned to one of the devices included in the operation unit 270.
5002b is a guidance display showing that a function for instructing that tracking is to be canceled is assigned to one of the devices included in the operation unit 270.
5006e is a guidance display indicating that the AF mode of the camera 100 is the tracking standby state, and that by performing a drag operation on the touch panel 228a, the tracking standby frame can be moved from its current position to a position corresponding to the direction and distance of the drag operation. The tracking standby frame indicates the face or AF frame on which tracking AF is to be performed; when an instruction to start tracking is made, the camera 100 starts tracking AF on the tracking standby frame as it stands at the point in time of the instruction.
In S4017, the system control unit 150 executes processing of a tracking standby operation, which will be described later using fig. 4CB, and then returns the processing to S4003.
Fig. 4BA is a flowchart showing details of the processing of the absolute position operation performed by the system control unit 150 in S4010.
In S4101, the system control unit 150 communicates to the remote controller 200 via the external I/F193 that the touch panel 228a is to operate in the absolute position operation mode. The absolute position operation mode is a mode in which the position (coordinates) of the touch operation is valid.
In S4102, the system control unit 150 updates the face detection state as necessary.
In S4103, the system control unit 150 transmits information (identification information, position, and size) of each currently detected face to the remote controller 200 via the external I/F193. Because the face information has already been transmitted to the remote controller 200 in S4009, the processes of S4102 and S4103 may be omitted. Alternatively, the process of S4103 may be performed only in the case where the face detection result in S4102 differs from the face detection result obtained when S4009 was performed.
In S4104, the system control unit 150 determines whether identification information of the selected face is received from the remote controller 200; if the information is received, the process proceeds to S4105, and if the information is not received, the process proceeds to S4107.
In S4105, the system control unit 150 sets a face corresponding to the face identification information received in S4104 as a main face, and advances the process to S4106.
In S4106, the system control unit 150 sets the AF frame to, for example, an area containing the main face, performs focus control (focus detection), and then advances the process to S4107.
In S4107, the system control unit 150 displays a live view screen, for example, including a mark display (face frame) indicating the detected face, as shown in fig. 6A, in the display unit 128, and then ends the processing of the absolute position operation.
6001 is a status display showing that moving image shooting is in progress or currently paused. The display 6001 is not shown during the shooting standby state. Here, an example indicating that a moving image is being shot is illustrated.
6002 is a display of a time code indicating a moving image being shot.
6003 is a frame display representing the main face among the detected faces.
6004a and 6004b are frame displays representing detected faces other than the main face, and are displayed in a form different from that of the main face frame 6003.
6005a, 6005b, and 6005c are subjects present in an image captured by the image capturing unit 122, which correspond to detection objects of subject detection processing (here, faces of people).
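As an illustrative aid only, the flow of fig. 4BA can be sketched in Python as follows. All identifiers here (FaceInfo, camera, remote, and their methods) are hypothetical names assumed for the sketch, not part of the embodiment:

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class FaceInfo:
    face_id: int   # identification information of the detected face
    x: float       # center position of the face in the captured image
    y: float
    size: float    # size of the face region

def absolute_position_operation(camera, remote) -> None:
    # S4101: tell the remote controller to use the absolute position
    # operation mode, in which the touch coordinates themselves are valid.
    remote.send_mode("absolute")

    # S4102/S4103: refresh face detection and send the current face list.
    faces: List[FaceInfo] = camera.detect_faces()
    remote.send_face_list(faces)

    # S4104: check whether the remote controller reported a selected face.
    selected_id: Optional[int] = remote.receive_selected_face_id()
    if selected_id is not None:
        # S4105: the face matching the received ID becomes the main face.
        main_face = next((f for f in faces if f.face_id == selected_id), None)
        if main_face is not None:
            camera.set_main_face(main_face)
            # S4106: set the AF frame to an area containing the main face
            # and perform focus control (focus detection).
            camera.set_af_frame(main_face.x, main_face.y, main_face.size)
            camera.run_focus_control()

    # S4107: display the live view with face frames, as in fig. 6A.
    camera.display_live_view(face_frames=faces)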
Fig. 4BB is a flowchart showing details of processing of the relative position operation performed by the system control unit 150 in S4012.
In S4201, the system control unit 150 communicates to the remote controller 200 via the external I/F193 that the touch panel 228a is to operate in the relative position operation mode. The relative position operation mode is a mode in which the movement amount and movement direction of the touch position are validated, rather than the touch position itself.
In S4202, the system control unit 150 determines whether information (movement information) about the movement amount and the movement direction of the touch operation is received from the remote controller 200; if the information is received, the process proceeds to S4203, and if the information is not received, the process proceeds to S4205.
In S4203, the system control unit 150 sets the movement destination position of the AF frame based on the current position of the AF frame and the movement information received in S4202, and advances the process to S4204.
In S4204, the system control unit 150 resets the relative position timer, starts counting, and advances the process to S4205. As described above, the relative position timer is a timer for measuring a set amount of time, and is implemented using the system timer 153. The relative position timer may be an up-count timer or a down-count timer.
If the mode is frequently switched between the relative position operation mode and the absolute position operation mode, the position of the AF frame will be moved frequently, resulting in focus control not intended by the user. The set amount of time measured by the relative position timer is the amount of time required before switching (transitioning) between the relative position operation and the absolute position operation, and a few seconds or so is considered appropriate for switching that does not give an unnatural feeling. The system control unit 150 sets an initial value and a target value of the relative position timer and starts counting. Instead of measuring the set amount of time, the focus control in S4205 having entered a steady state may be used as the condition for switching. Focus control that does not produce an unnatural feeling can be achieved by ensuring that the touch panel 228a does not frequently switch between the relative position operation mode and the absolute position operation mode.
In S4205, the system control unit 150 executes focus control based on the current position of the AF frame, and then advances the process to S4206.
In S4206, the system control unit 150 displays, for example, a live view screen including a mark display indicating the current AF frame as shown in fig. 6B in the display unit 128, and then ends the processing of the relative position operation. In fig. 6B, the same display elements as those in fig. 6A are given the same reference numerals, and a repetitive description thereof will be omitted.
6006 is a display showing the current position of the AF frame.
6007 is a subject in the image.
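As with the absolute case, the relative position operation of fig. 4BB can be sketched as follows; the relative position timer described for S4204 is shown here as a simple monotonic-clock deadline. The hold time of 3.0 seconds and all identifiers are assumptions made for the sketch:

import time

RELATIVE_MODE_HOLD_SEC = 3.0   # "a few seconds or so" per the description

def relative_position_operation(camera, remote) -> None:
    # S4201: the remote controller reports only the movement amount and
    # movement direction of the touch, not the absolute coordinates.
    remote.send_mode("relative")

    # S4202: poll for movement information (dx, dy) from the remote.
    move = remote.receive_move_info()
    if move is not None:
        dx, dy = move
        # S4203: move the AF frame relative to its current position.
        x, y = camera.af_frame_position()
        camera.set_af_frame_position(x + dx, y + dy)
        # S4204: restart the relative position timer; while it runs, the
        # mode is not switched back to the absolute position operation.
        camera.relative_timer_deadline = time.monotonic() + RELATIVE_MODE_HOLD_SEC

    # S4205/S4206: focus on the current AF frame and update the live view.
    camera.run_focus_control()
    camera.display_live_view(af_frame=camera.af_frame_position())

def relative_timer_running(camera) -> bool:
    # Used by the mode-selection logic: remain in the relative position
    # mode until the timer expires, which suppresses the frequent mode
    # switching described above. (camera.relative_timer_deadline is
    # assumed to be initialized to 0.0.)
    return time.monotonic() < camera.relative_timer_deadline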
Fig. 4CA is a flowchart showing the details of the processing executed by the system control unit 150 in S4015 to invalidate the position operation.
In S4301, the system control unit 150 communicates to the remote controller 200 via the external I/F193 that the absolute position operation and the relative position operation of the touch panel 228a are invalid.
In S4302, the system control unit 150 determines whether the AF mode is currently tracking in the same manner as S4004; if the AF mode is tracking, the process proceeds to S4303, and if the AF mode is not tracking, the process proceeds to S4305.
In S4305, the system control unit 150 displays, for example, a live view screen without frame display as shown in fig. 6C in the display unit 128, and then ends the processing of invalidating the position operation. In fig. 6C, the same display elements as those in fig. 6A are given the same reference numerals, and a repetitive description thereof will be omitted. If the AF mode is not currently being tracked, frames 6003, 6004a, and 6004B (fig. 6A) representing the detected subject region, and an AF frame 6006 (fig. 6B) are not displayed in the live view screen.
In S4303, the system control unit 150 determines whether the object is lost in the tracking process (whether tracking has failed); if the tracking fails, the process proceeds to S4304, and if the tracking does not fail, the process proceeds to S4306.
In S4304, the system control unit 150 ends the tracking process, changes the parameter value corresponding to the AF mode held in the system memory 152 to a non-tracking state, and ends the process of invalidating the position operation. By so doing, the non-tracking focus control is executed until the tracking setting is performed again.
In S4306, the system control unit 150 updates the AF frame to the moved position of the object being tracked, as detected by the image processing unit 124.
In S4307, the system control unit 150 executes focus control on the AF frame updated in S4306.
In S4308, the system control unit 150 displays, for example, a live view screen including a mark display (tracking frame) indicating an object to be tracked as shown in fig. 6D in the display unit 128, and then ends the processing of invalidating the position operation. In fig. 6D, the same display elements as those in fig. 6A are given the same reference numerals, and a repetitive description thereof will be omitted.
6008 is a display of a frame (tracking frame) indicating the subject being tracked.
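The processing of fig. 4CA, which disables both position operations and services tracking AF, can likewise be sketched; again, every identifier is a hypothetical stand-in:

def invalidate_position_operation(camera, remote) -> None:
    # S4301: both the absolute and relative position operations of the
    # touch panel are invalid.
    remote.send_mode("disabled")

    # S4302: branch on whether the AF mode is currently tracking.
    if not camera.af_mode_is_tracking():
        # S4305: live view without any frame display (fig. 6C).
        camera.display_live_view()
        return

    # S4303/S4304: if the subject was lost, end tracking and record the
    # non-tracking state so that non-tracking focus control is used.
    if camera.tracking_failed():
        camera.end_tracking()
        camera.set_af_mode_non_tracking()
        return

    # S4306/S4307: follow the moved position of the subject and refocus.
    camera.set_af_frame_position(*camera.tracked_subject_position())
    camera.run_focus_control()
    # S4308: live view with the tracking frame (fig. 6D).
    camera.display_live_view(tracking_frame=camera.af_frame_position())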
Fig. 4CB is a flowchart showing details of the processing of the tracking standby operation performed by the system control unit 150 in S4017.
In S4401, the system control unit 150 communicates to the remote controller 200 via the external I/F193 that the touch panel 228a is to operate in the relative position operation mode.
In S4402, the system control unit 150 determines whether information (movement information) about the movement amount and the movement direction of the touch operation is received from the remote controller 200; if the information is received, the process proceeds to S4403, and if the information is not received, the process proceeds to S4404.
In S4403, the system control unit 150 sets the movement destination position of the tracking frame based on the current position of the tracking frame (AF frame) and the movement information received in S4402, and advances the process to S4404. In S4404, the system control unit 150 displays, for example, a live view screen including a display indicating the object to be tracked at the start of tracking (tracking standby frame) as shown in fig. 6E in the display unit 128, and then ends the processing of the tracking standby operation. In fig. 6E, the same display elements as those in fig. 6A are given the same reference numerals, and a repetitive description thereof will be omitted.
In fig. 6E, 6009 is a display showing the position of the tracking frame (tracking standby frame) after the movement. Note that the tracking operation has not yet started at this point in time. Upon receiving an instruction to start tracking from the remote controller 200, the system control unit 150 starts tracking processing using the tracking standby frame at that point in time as a tracking frame.
Fig. 7 is a flowchart showing operations performed by the remote controller 200 in cooperation with operations performed by the camera 100 as described using fig. 4AA to 4 CB. The operation shown in the flowchart of fig. 7 is realized by the remote controller control unit 250 controlling the units in the remote controller 200 based on the program stored in the nonvolatile memory 256.
In S7001, the remote controller control unit 250 receives information of a screen to be displayed in the display unit 228, an operation mode of the touch panel 228a, and information of detected faces from the camera 100 via the external I/F293. Here, the information of the screen is information that the camera 100 communicates in S4009, S4011, S4013, S4014, or S4016. The operation mode of the touch panel 228a is information that the camera 100 communicates in S4101, S4201, S4301, or S4401. The information of the faces is information that the camera 100 communicates in S4009 or S4103. The touch panel 228a has a plurality of operation modes including an absolute position operation mode (first operation mode) and a relative position operation mode (second operation mode). The remote controller control unit 250 sets one of the plurality of operation modes in the touch panel 228a according to the operation mode received from the camera 100.
In S7002, the remote controller control unit 250 displays the screen illustrated in any one of fig. 5A to 5F on the display unit 228 based on the screen identification information (or screen data) received in S7001.
In S7003, the remote controller control unit 250 determines whether an effective device operation (an operation other than a touch operation) based on the currently displayed screen is detected; if the operation is detected, the process proceeds to S7004, and if the operation is not detected, the process proceeds to S7005. For example, if the screen shown in fig. 5A is currently displayed, an operation on a device (a button or a key included in the operation unit 270) assigned to the guidance display 5001a, 5002a, or 5004 is included in the valid operations. The display screen and the valid operations are associated with each other and held in, for example, the nonvolatile memory 256 of the remote controller 200, and the remote controller control unit 250 can determine whether the detected operation is a valid operation for the current display screen by referring to the nonvolatile memory 256.
In S7004, the remote controller control unit 250 generates an AF mode change request based on the valid operation detected in S7003, transmits the request to the camera 100 via the external I/F293, and returns the process to S7001.
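On the remote controller side, S7001 through S7004 amount to a receive-display-dispatch loop. A minimal sketch follows, with hypothetical identifiers throughout; the touch handling branched to in S7005 onward is shown in the later sketches:

def remote_main_loop(remote, camera_link) -> None:
    while True:
        # S7001: receive the screen type, the touch panel operation mode,
        # and the detected-face information from the camera.
        screen, mode, faces = camera_link.receive_state()
        remote.touch_mode = mode

        # S7002: display the screen (figs. 5A to 5F) for the received type.
        remote.show_screen(screen, faces)

        # S7003/S7004: a non-touch device operation that is valid for the
        # current screen becomes an AF mode change request to the camera.
        op = remote.poll_device_operation()
        if op is not None and remote.is_valid_for(screen, op):
            camera_link.send_af_mode_change_request(op)
            continue

        # S7005 onward: touch handling dispatched by the operation mode.
        remote.handle_touch(camera_link, faces)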
In S7005, the remote controller control unit 250 determines whether the currently set operation mode is an absolute position operation mode; if the mode is the absolute position operation mode, the process proceeds to S7006, and if the mode is not the absolute position operation mode, the process proceeds to S7009.
In S7006, the remote controller control unit 250 determines whether a touch-down operation on the touch panel 228a performed at a position within the area of the display unit 228 other than the guidance displays 5001a, 5002a, and 5004 is detected. Although it is assumed here that the main face is set at the position where the touch-down operation is detected, a position where another touch operation (e.g., a touch-up or a double-tap operation) is detected may be used instead. If the remote controller control unit 250 determines that the specific touch operation (here, a touch-down operation) is detected, the process proceeds to S7007, and if the specific touch operation is not detected, the process returns to S7001.
In S7007, the remote controller control unit 250 determines the face specified by the touch operation based on the position (coordinates) at which the touch operation was detected in S7006 and the face information (or the display position and size of each face frame) received in S7001, and then advances the process to S7008. More specifically, if there is face information corresponding to the position (coordinates) where the touch operation is performed (if the position where the touch operation is performed is within the face frame), the corresponding face information is determined as the specified face. If no face is present at the position where the touch operation is performed (touch position), no face is selected in the processing of S7007 and S7008.
In S7008, the remote controller control unit 250 transmits the identification information of the face determined in S7007 as the information of the selected face to the camera 100 via the external I/F293, and returns the process to S7001.
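The hit test of S7006 to S7008 can be expressed as follows, assuming (as the description does) that each face is reported with a position and size, so that a face frame can be treated as a square around the face center; all names remain hypothetical:

def handle_absolute_touch(remote, camera_link, faces) -> None:
    # S7006: wait for a touch-down outside the guidance display areas.
    touch = remote.poll_touch_down(exclude=remote.guidance_areas())
    if touch is None:
        return

    # S7007: hit-test the touch coordinates against the face frames
    # received from the camera; a face is specified only when the touch
    # position falls inside its frame.
    for face in faces:
        half = face.size / 2
        if abs(touch.x - face.x) <= half and abs(touch.y - face.y) <= half:
            # S7008: send the identification information of the specified
            # face to the camera as the selected face.
            camera_link.send_selected_face_id(face.face_id)
            return
    # No face at the touch position: nothing is selected (S7007/S7008).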
In S7009, the remote controller control unit 250 determines whether the currently set operation mode is a relative position operation mode; if the mode is the relative position operation mode, the process proceeds to S7010, and if the mode is not the relative position operation mode, the process returns to S7001.
In S7010, the remote controller control unit 250 determines whether or not a drag operation (a touch movement in which the movement distance of the touch position is greater than or equal to a prescribed distance) performed on the touch panel 228a is detected. If the remote controller control unit 250 determines that the drag operation is detected, the process proceeds to S7011, and if the drag operation is not detected, the process returns to S7001. Here, the drag operation is one example of an effective touch operation associated with the display screen to which the relative position operation mode is designated, and a different touch operation such as a flick operation or the like may be used instead.
In S7011, the remote controller control unit 250 converts the direction and distance of the drag operation detected in S7010 into frame movement information. In converting the distance of the drag operation detected on the touch panel 228a into a distance on the captured image, an appropriate magnification is applied so that the movement amount of the AF frame or the tracking standby frame does not become excessive relative to the drag operation on the touch panel 228a. For example, assume that the movement distance is expressed as a number of pixels, and that the number of pixels in the display unit 228 is 1/10 of the number of pixels corresponding to the resolution of the captured image in both the horizontal and vertical directions. In this case, simply converting the movement amount of the drag operation on the touch panel 228a would result in a movement amount 10 times that of the drag operation in the horizontal and vertical directions, which makes it difficult to finely adjust the position. The remote controller control unit 250 therefore converts the distance of the drag operation using a magnification lower than the ratio of the number of pixels in the captured image to the number of pixels in the display unit 228 in the horizontal or vertical direction. The system control unit 150 and the remote controller control unit 250 can exchange information and determine an initial value of the magnification when the remote controller 200 and the camera 100 are connected. Alternatively, the operation amount itself may be transmitted from the remote controller control unit 250 to the camera 100, and the conversion process of S7011 may be performed by the system control unit 150.
If the distance of the drag operation is obtained not as a number of pixels but as an actual movement distance, the ratio of the converted movement distance to the size of the captured image in the horizontal (vertical) direction may be set smaller than the ratio of the movement distance to the size of the touch panel in the horizontal (vertical) direction. However, it is assumed here that the size of the captured image is a size converted at a prescribed ppi (for example, 72 ppi or 96 ppi).
If the pixel pitch of the display unit 228 is extremely small, the distance of the drag operation may first be converted into a number of pixels at 72 or 96 ppi, and then further converted into a distance in the captured image. A function for dynamically adjusting the sensitivity of the movement amount of the AF frame, the tracking standby frame, or the like with respect to the drag operation performed on the touch panel 228a may be assigned to any device included in the operation unit 270.
In S7012, the remote controller control unit 250 transmits the frame movement information generated in S7011 to the camera 100 via the external I/F293, and returns the process to S7001.
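The conversion of S7010 to S7012 is essentially a gain applied to the drag vector, with the gain deliberately kept below the resolution ratio between the captured image and the touch panel. A sketch follows; the drag threshold of 10 pixels and the gain cap of 4.0 are assumed values, not taken from the embodiment:

def convert_drag_to_frame_move(remote, camera_link) -> None:
    # S7010: detect a drag (a touch move of at least a minimum distance).
    drag = remote.poll_drag(min_distance_px=10)
    if drag is None:
        return

    # S7011: convert the drag vector from touch panel pixels into
    # captured-image pixels. The naive factor is the resolution ratio
    # (10x in the example above), which would make fine adjustment
    # difficult, so a smaller magnification is applied instead.
    naive_gain = remote.image_width_px / remote.panel_width_px
    gain = min(naive_gain, 4.0)
    dx = drag.dx * gain
    dy = drag.dy * gain

    # S7012: transmit the frame movement information to the camera.
    camera_link.send_frame_move(dx, dy)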
According to the present embodiment described so far, in a case where an area detected using the function of the camera is selected (such as in a case where a specific object is selected), absolute position designation in a touch sensing device such as a touch panel or the like is effective. On the other hand, if no area is detected using the function of the camera, or the movement of the AF frame or the tracking frame is specified, the absolute position specification in the touch sensing device is invalid, and the relative position specification is valid. In other words, whether the position of the touch operation performed on the touch sensing device is validated or the direction or amount of the operation is validated is dynamically switched depending on whether or not the specific area is detected by the camera.
If an area detected by the camera is selected, by designating a desired position within or near the area, the position can be correctly designated even on a small touch sensing device or a touch sensing device that does not display an image. The position of the touch operation on the touch sensing device is therefore validated. In other cases, at least one of the operation direction and the operation amount is validated instead of the position of the touch operation. This makes it possible to retain the convenience of touch operation while preventing unintended results caused by specifying an absolute position in an image using a small touch sensing device or a touch sensing device that does not display an image. The camera performs this switching automatically, which further improves convenience. In addition, when tracking an object, tracking can be started from the tracking standby state and completed through a common operation sequence regardless of whether the object to be tracked has been detected.
More specifically, if a touch-down (start of touch) is detected while the absolute position designation is valid (while the absolute position operation mode is set), the remote controller control unit 250 determines whether the position where the touch-down was detected (touch-down position) corresponds to any face information (whether the touch-down was made within any face frame). If the touch-down position corresponds to face information, the remote controller control unit 250 transmits the identification information of the corresponding face to the camera 100 as information of the selected face. In other words, if the absolute position designation is valid, the remote controller control unit 250 executes processing corresponding to the touched position (processing of selecting the face corresponding to the touched position).
In this case, the remote controller control unit 250 treats the face corresponding to the touched position as selected, regardless of which face was selected before the touch was detected. Further, even if no movement of the touched position (touch move) or touch-up is detected, if the touched position is within a face frame, the remote controller control unit 250 treats the face corresponding to that face frame as selected. In other words, if the absolute position designation is valid, a function uniquely corresponding to the touch position is executed. The processing is not based on the set value from before the touch operation was detected (the position of the AF frame or the tracking frame before the touch operation was detected). Instead of selecting a face upon the touch-down, a face corresponding to the last touch position detected before a touch-up may be selected upon the touch-up.
On the other hand, if the relative position designation is valid (if the relative position operation mode is set), the remote controller control unit 250 does not move the AF frame, the tracking frame, or the like even if a touch-down (start of touch) is detected, but instead maintains the position from before the start of the touch. Upon detecting a touch-down followed by a movement of the touched position (touch move), the remote controller control unit 250 performs processing for moving the AF frame or the tracking frame by generating frame movement information including the movement amount and movement direction of the touched position and transmitting that information to the camera 100. The AF frame or the tracking frame is moved, in the direction and by the movement amount based on the frame movement information, from the position set before the touch that triggered generation of the frame movement information was made.
In other words, in the case where the relative position designation is valid, the touched position is used to calculate the movement amount and movement direction, but does not directly correspond to the movement destination of the AF frame or the tracking frame. Regardless of the position at which the touch move starts, the movement direction and movement amount of the AF frame or the tracking frame are the same as long as the movement amount and movement direction of the touch position are the same. In other words, if the relative position designation is valid, processing corresponding to the movement amount and movement direction of the touch position is executed, but a function uniquely corresponding to the touch position is not executed. If the relative position designation is valid, the processing is based on the set value from before the touch operation was detected (the position of the AF frame or the tracking frame before the touch operation was detected). Specifically, the set value from before the touch operation is changed by an amount corresponding to the movement amount and movement direction of the touch position.
Meanwhile, the live view image is displayed in the display unit of the camera together with information such as the area detection result and the current AF frame, and thus a more accurate operation can be performed by operating the camera's external remote controller while viewing the camera's display unit.
Further, after the absolute position designation becomes invalid, it does not become valid again until a predetermined condition is satisfied. This makes it possible to suppress a situation in which, when detection by the camera's area detection function is unstable, the absolute position designation in the touch sensing device is repeatedly enabled and disabled within a short period of time.
(other embodiments)
The present invention can also be used for functions other than focus control. For example, the position of the area determined according to the touch operation (AF frame or tracking standby frame), information of the selected area, and the like may be used in the Automatic Exposure (AE) process, the white balance control process, and the like.
In addition, the region detected with the function of the camera is not limited to an object region such as a face region. For example, at least one of a high luminance area (e.g., a highlight overflow area) whose luminance exceeds a first prescribed value and a low luminance area (e.g., a dark portion missing area) whose luminance is less than a second prescribed value may be detected. Alternatively, a region having a specific color (e.g., a white region) may be detected.
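For instance, detecting such luminance regions reduces to thresholding a luminance image. A minimal sketch follows, with illustrative threshold values that are not taken from the embodiment:

import numpy as np

def detect_luminance_regions(luma, high_thresh=250, low_thresh=5):
    # luma: 2-D array of 8-bit luminance values for the captured image.
    luma = np.asarray(luma)
    high_mask = luma > high_thresh  # e.g. highlight-overflow areas
    low_mask = luma < low_thresh    # e.g. dark-portion-missing areas
    return high_mask, low_mask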
In addition, a part of the flow shown in figs. 4AA to 4CB may be executed by the remote controller control unit 250 of the remote controller 200, and a part of the flow shown in fig. 7 may be executed by the system control unit 150 of the camera 100. For example, the coordinate conversion processing of S7007 and S7011 may be performed by the camera 100. In this case, instead of performing S7008 and S7012, the remote controller 200 transmits to the camera 100 event information indicating the type of touch operation (touch-down operation, drag operation, or the like), the operation position, the operation direction, and the operation amount. The same effects as those of the above-described embodiment can be achieved in this case as well.
Alternatively, the camera 100 and the remote controller 200 may be incorporated into a single housing as an integrated device. In this case, the external I/F193 and the external I/F293 are replaced with bus connections within the device. Equivalent modules, such as the system control unit 150 and the remote controller control unit 250, may then be combined into a single module. The same effects as in the present embodiment can be achieved with this structure.
The camera 100 and the remote controller 200 may also be incorporated into a single housing as an integrated device while a separate remote controller corresponding to the remote controller 200 is made connectable to that device. In this way, if a photographer and an assistant are shooting as a team, both the photographer operating the camera 100 and the assistant operating the separate remote controller 200 can perform the operations using the same procedure.
Note that the system control unit 150, the remote controller control unit 250, and the like may each be implemented as a single piece of hardware, or the processing may be shared among a plurality of pieces of hardware. The hardware implementing the system control unit 150, the remote controller control unit 250, and the like may be a programmable processor such as a CPU, or dedicated hardware such as an ASIC.
Further, although the above-described embodiment mainly describes operations on the touch panel 228a of the remote controller 200 connected to the camera 100, the use of the remote controller 200 is not a necessary condition of the present invention, and the present invention may be carried out using the camera 100 alone.
In carrying out the present invention with the camera 100 alone, the operations described above as performed by the remote controller control unit 250 (including setting an operation mode based on the detection state of the object region, operations performed according to the operation mode, and the like) are performed by the system control unit 150, for example, in the following cases:
When a proximity sensor provided near the viewfinder (optical viewfinder or electronic viewfinder) detects that the user is using the viewfinder
When display is being performed on the display unit within the viewfinder
When the user gives an instruction through the operation unit 170
Specifically, the system control unit 150 can perform the same control operation on the display unit 128 and the touch panel 128a as the control operation performed on the display unit 228 and the touch panel 228a by the remote controller control unit 250. In this case, if the viewfinder is an electronic viewfinder, in the same manner as the control performed when the system control unit 150 performs display in the display unit 128 described in the above-described embodiment, a mark display such as a face frame, an AF frame, a tracking frame, or the like may be displayed in the display unit of the viewfinder together with the live view. If the viewfinder is an optical viewfinder, live view display may be omitted, and in a display unit within the viewfinder, a mark display such as a face frame, an AF frame, or a tracking frame may be visually superimposed on the optical image. The operation performed with respect to the touch panel 128a is validated even in the case where live view display is not performed in the display unit 128, which makes it possible to use the touch panel 128a in the same manner as the touch panel 228a of the remote controller 200 described above.
Although the foregoing has described preferred embodiments of the present invention in detail, the present invention is not limited to those specific embodiments, and all variations that do not depart from the gist of the invention are intended to fall within its scope. Furthermore, the above-described embodiments are merely examples of the present invention, and the various structures described in these embodiments may be combined as appropriate.
The foregoing embodiments describe examples in which the present invention is applied to a digital still camera, as an example of an electronic apparatus, or to an image pickup system using such a camera. However, the present invention can be applied to any electronic apparatus having an image pickup function and a touch sensor (touch panel); the touch sensor need not be built into or arranged on the display device. Personal computers, tablet terminals, personal digital assistants (PDAs), mobile phones, image viewers, media players, game machines, electronic book readers, and the like may be given as examples of electronic apparatuses to which the present invention can be applied, but the present invention is not limited thereto. The present invention is particularly useful in portable electronic devices in which the area where a touch sensor can be mounted is small.
The present invention can also be realized by a process in which a program that implements one or more functions of the above-described embodiments is supplied to a system or apparatus via a network or a storage medium, and one or more processors in a computer of the system or apparatus read out and execute the program. The present invention can also be implemented using a circuit (e.g., an ASIC) that implements one or more of those functions.
The present invention is not limited to the above-described embodiments, and various changes and modifications can be made within the spirit and scope of the present invention. Therefore, to apprise the public of the scope of the present invention, the appended claims are made.
The present application claims priority from Japanese Patent Application No. 2016-057556, filed March 22, 2016, and Japanese Patent Application No. 2017-023475, filed February 10, 2017.

Claims (14)

1. An electronic device, comprising:
a touch detection section for detecting an operation on the touch sensing surface;
setting means for setting a first operation mode in a case where an area satisfying a predetermined condition is detected in an image, and setting a second operation mode in a case where the area is not detected or the detection of the area is not performed; and
control means for controlling to execute:
executing a function corresponding to a position at which the touch detection section detects a touch in a case where the first operation mode is set, an
In a case where the second operation mode is set, a process corresponding to a position at which the touch detection section detects a touch is not performed, and a process of a moving operation based on the touched position is performed,
wherein, in a case where the second operation mode is set, the control means performs control such that, even if the touch detection means detects a touch, a mark representing a part of the image is not moved from a position where the mark was before the touch was detected, and in a case where the movement of the touched position is detected, the mark is moved in accordance with a movement amount and a movement direction of the touched position as the processing of the movement operation based on the touched position.
2. The electronic device of claim 1,
the image is an image captured by an image capturing means, and the region satisfying the predetermined condition is a region of a specific object.
3. The electronic device of claim 2,
the specific object is a face of a person.
4. The electronic device of claim 1,
the function corresponding to the position where the touch is detected is a function for selecting an area corresponding to the position where the touch is detected from the detected areas.
5. The electronic device of claim 4,
the selected region is set as a main object and is regarded as an object to be focused in AF in the case where the image pickup means picks up an image.
6. The electronic device of claim 1,
in a case where the first operation mode is set, the control section performs control such that, in response to the touch detection section detecting a touch, a function corresponding to a position at which the touch is detected is executed.
7. The electronic device of claim 1,
the mark is a mark indicating a position to be focused in AF in the case where the image pickup means performs image pickup.
8. The electronic device of claim 1,
the touch sensing surface is different from a display surface of the image.
9. The electronic device of claim 1, further comprising:
an image pickup unit;
a viewfinder;
a first display component viewable via the viewfinder; and
a second display section capable of being viewed without going through the viewfinder,
wherein the image is an image captured by the image capturing means,
the touch sensing surface is a display surface of the second display part, an
In a case where the image is being displayed in the first display section, the control section performs control such that a function corresponding to a position where a touch is detected is executed or the processing of a moving operation based on the touch position is executed in accordance with a detection state of the area.
10. The electronic device of claim 1,
the detection of the area is performed by an image pickup apparatus communicatively connected to the electronic apparatus.
11. The electronic device of claim 1,
in a case where the region is changed from an undetected state to a detected state, the setting means does not change from the second operation mode to the first operation mode until a prescribed condition is satisfied.
12. The electronic device of claim 11,
the prescribed condition is that a certain time has elapsed after the processing is performed.
13. A method of controlling an electronic device, comprising:
a setting step of setting a first operation mode in a case where an area satisfying a predetermined condition is detected in an image, and setting a second operation mode in a case where the area is not detected or the detection of the area is not performed;
a touch detection step of detecting an operation on a touch sensing surface; and
a control step of performing control including:
executing a function corresponding to a position at which the touch detection step detects a touch in a case where the first operation mode is set, an
Not performing, in a case where the second operation mode is set, processing corresponding to a position at which the touch is detected by the touch detecting step, but performing processing of a moving operation based on the touched position,
wherein, in a case where the second operation mode is set, the controlling includes: performing control such that, even if a touch is detected, a mark representing a part of the image is not moved from a position where the mark was before the touch was detected, and in a case where the movement of the touched position is detected, the mark is moved in accordance with a movement amount and a movement direction of the touched position as the processing of the movement operation based on the touched position.
14. A computer-readable storage medium storing a program for causing a computer of an electronic apparatus including a touch detection section for detecting an operation on a touch sensing surface to execute the method of controlling the electronic apparatus according to claim 13.
CN201780019550.XA 2016-03-22 2017-03-21 Electronic device and control method thereof Active CN109076156B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2016057556 2016-03-22
JP2016-057556 2016-03-22
JP2017023475A JP6788518B2 (en) 2016-03-22 2017-02-10 Electronic devices and their control methods, as well as imaging devices
JP2017-023475 2017-02-10
PCT/JP2017/011080 WO2017164140A1 (en) 2016-03-22 2017-03-21 Electronic device and method for controlling same

Publications (2)

Publication Number Publication Date
CN109076156A CN109076156A (en) 2018-12-21
CN109076156B true CN109076156B (en) 2021-01-26

Family

ID=59972507

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780019550.XA Active CN109076156B (en) 2016-03-22 2017-03-21 Electronic device and control method thereof

Country Status (3)

Country Link
US (1) US10649313B2 (en)
JP (1) JP6788518B2 (en)
CN (1) CN109076156B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3509291B1 (en) * 2016-08-31 2022-09-07 Canon Kabushiki Kaisha Imaging control apparatus and method therefor
JP7224828B2 (en) * 2018-09-27 2023-02-20 キヤノン株式会社 Imaging device
JP7361251B2 (en) 2018-09-28 2023-10-16 パナソニックIpマネジメント株式会社 Measuring device and method
KR102238939B1 (en) * 2018-11-29 2021-04-14 오지큐 주식회사 Method for Protecting Portrait Right by Using Mobile Device
JPWO2020137602A1 (en) * 2018-12-28 2021-11-18 ソニーグループ株式会社 Imaging equipment, imaging methods, and programs
CN112118380B (en) * 2019-06-19 2022-10-25 北京小米移动软件有限公司 Camera control method, device, equipment and storage medium
JP7379083B2 (en) * 2019-10-24 2023-11-14 キヤノン株式会社 Control device, imaging device, control method, and program
JP7277392B2 (en) * 2020-01-21 2023-05-18 株式会社東海理化電機製作所 Remote controllers, processors and computer programs
JP2021144121A (en) * 2020-03-11 2021-09-24 キヤノン株式会社 Display control apparatus, control method thereof, program, and storage medium
WO2021193987A1 (en) * 2020-03-24 2021-09-30 주식회사 삼양옵틱스 Camera lens assembly having user-designated functions
US11665425B2 (en) * 2020-12-08 2023-05-30 Lenovo (Singapore) Pte. Ltd. Apparatus, method, and program product for operating a camera sense mode
JP2023095508A (en) * 2021-12-24 2023-07-06 キヤノン株式会社 Imaging apparatus and method for controlling the same, and program

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090075072A (en) * 2008-01-03 2009-07-08 상명대학교 산학협력단 Method for personal identification using finger-vein
CN101963860A (en) * 2009-07-22 2011-02-02 奥林巴斯映像株式会社 Condition changing device
CN103873759A (en) * 2012-12-17 2014-06-18 联想(北京)有限公司 Shooting method and electronic equipment
CN104917958A (en) * 2015-04-30 2015-09-16 深圳市万普拉斯科技有限公司 Picture taking method and device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US523879A (en) * 1894-07-31 Charles braaf
JP4929630B2 (en) * 2005-07-06 2012-05-09 ソニー株式会社 Imaging apparatus, control method, and program
JP4991621B2 (en) * 2008-04-17 2012-08-01 キヤノン株式会社 Imaging device
JP5398570B2 (en) * 2010-02-10 2014-01-29 キヤノン株式会社 Information processing apparatus and control method thereof
JP5849389B2 (en) * 2010-11-04 2016-01-27 株式会社リコー Imaging apparatus and imaging method
JP5872783B2 (en) 2011-03-24 2016-03-01 オリンパス株式会社 Display device, display method, and display program
JP5907738B2 (en) * 2012-01-23 2016-04-26 オリンパス株式会社 Imaging apparatus, display method, and program
JP5806623B2 (en) * 2012-02-07 2015-11-10 オリンパス株式会社 Imaging apparatus, imaging method, and program
JP6137936B2 (en) * 2013-05-10 2017-05-31 オリンパス株式会社 Imaging device, image processing apparatus, and image correction method
EP3509291B1 (en) * 2016-08-31 2022-09-07 Canon Kabushiki Kaisha Imaging control apparatus and method therefor
JP6757268B2 (en) * 2017-01-30 2020-09-16 キヤノン株式会社 Imaging device and its control method
US10375313B1 (en) * 2018-05-07 2019-08-06 Apple Inc. Creative camera


Also Published As

Publication number Publication date
CN109076156A (en) 2018-12-21
US10649313B2 (en) 2020-05-12
JP2017175606A (en) 2017-09-28
US20190011805A1 (en) 2019-01-10
JP6788518B2 (en) 2020-11-25

Similar Documents

Publication Publication Date Title
CN109076156B (en) Electronic device and control method thereof
JP6748582B2 (en) Imaging device, control method thereof, program, and recording medium
JP5306266B2 (en) Imaging apparatus and control method thereof
US9036073B2 (en) Imaging apparatus and for controlling an automatic focus (AF) area and an enlargement area in a live view
US11039073B2 (en) Electronic apparatus and method for controlling the same
US10911663B2 (en) Electronic apparatus and method for controlling the same
JP2013143644A (en) Imaging apparatus, imaging apparatus control method, program, and storage medium
JP2018125612A (en) Imaging apparatus and control method thereof
KR20170063388A (en) Electronic device and method for controlling the same
US9992405B2 (en) Image capture control apparatus and control method of the same
JP2018107709A (en) Imaging control device and control method thereof
JP2018036802A (en) Electronic apparatus and method for controlling the same
US10120496B2 (en) Display control apparatus and control method thereof
CN112015266A (en) Electronic device, control method of electronic device, and computer-readable storage medium
US11184528B2 (en) Electronic apparatus, control method therefor, and non-transitory computer readable medium
JP2018107712A (en) Imaging control device and control method thereof
US20210173536A1 (en) Electronic apparatus and control method thereof
JP6393296B2 (en) IMAGING DEVICE AND ITS CONTROL METHOD, IMAGING CONTROL DEVICE, PROGRAM, AND STORAGE MEDIUM
JP2020166199A (en) Electronic apparatus and method of controlling the same
WO2017164140A1 (en) Electronic device and method for controlling same
US11457150B2 (en) Electronic device capable of performing control based on a touch operation and control method thereof
JP7446845B2 (en) Imaging control device, control method for the imaging control device, program, and storage medium
US20240089598A1 (en) Image capturing device and method of controlling thereof
JP7134730B2 (en) Imaging device and its control method
JP7210158B2 (en) ELECTRONIC DEVICE, ELECTRONIC DEVICE CONTROL METHOD, PROGRAM, AND RECORDING MEDIUM

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant