US9979881B2 - Imaging control apparatus and method for controlling the same - Google Patents

Imaging control apparatus and method for controlling the same

Info

Publication number: US9979881B2
Authority: US (United States)
Prior art keywords: touch, moving, control unit, display, response
Legal status: Active, expires (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: US15/360,338
Other languages: English (en)
Other versions: US20170155825A1 (en)
Inventor: Akio Yoshikawa
Current assignee: Canon Inc (the listed assignee may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Canon Inc
Priority date: (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Application filed by Canon Inc
Assigned to CANON KABUSHIKI KAISHA; assignor: YOSHIKAWA, AKIO
Publication of application US20170155825A1
Application granted; publication of US9979881B2

Classifications

    • H — ELECTRICITY
        • H04 — ELECTRIC COMMUNICATION TECHNIQUE
            • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N5/00 — Details of television systems
                    • H04N5/76 — Television signal recording
                        • H04N5/765 — Interface circuits between an apparatus for recording and another apparatus
                            • H04N5/77 — Interface circuits between a recording apparatus and a television camera
                                • H04N5/772 — Interface circuits between a recording apparatus and a television camera placed in the same enclosure
                • H04N23/00 — Cameras or camera modules comprising electronic image sensors; control thereof
                    • H04N23/60 — Control of cameras or camera modules
                        • H04N23/61 — Control based on recognised objects
                        • H04N23/62 — Control of parameters via user interfaces
                        • H04N23/63 — Control by using electronic viewfinders
                            • H04N23/631 — Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
                            • H04N23/633 — Displaying additional information relating to control or operation of the camera
                                • H04N23/635 — Region indicators; field of view indicators
                        • H04N23/66 — Remote control of cameras or camera parts, e.g. by remote control devices
                            • H04N23/661 — Transmitting camera control signals through networks, e.g. control via the Internet
                        • H04N23/667 — Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
                        • H04N23/67 — Focus control based on electronic image sensor signals
                            • H04N23/672 — Focus control based on the phase difference signals
                            • H04N23/673 — Focus control based on contrast or high frequency components of image signals, e.g. hill climbing method
                    • H04N23/80 — Camera processing pipelines; components thereof
                        • H04N23/84 — Processing colour signals
                            • H04N23/88 — Colour balance, e.g. white-balance circuits or colour temperature control
                • H04N9/00 — Details of colour television systems
                    • H04N9/79 — Processing of colour television signals in connection with recording
                        • H04N9/80 — Transformation of the television signal for recording, e.g. modulation, frequency changing; inverse transformation for playback
                            • H04N9/804 — Transformation involving pulse code modulation of the colour picture signal components
                                • H04N9/8042 — Transformation involving data reduction
                • H04N5/23206, H04N5/23212, H04N5/23216, H04N5/23293 (codes listed without definitions)
    • G — PHYSICS
        • G03 — PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
            • G03B — APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
                • G03B13/00 — Viewfinders; focusing aids for cameras; means for focusing for cameras; autofocus systems for cameras
                    • G03B13/02 — Viewfinders
        • G06 — COMPUTING; CALCULATING OR COUNTING
            • G06F — ELECTRIC DIGITAL DATA PROCESSING
                • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
                    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
                        • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
                            • G06F3/0487 — Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or tap gestures based on pressure sensed by a digitiser
                                • G06F3/0488 — Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present disclosure relates to an imaging control apparatus and a method for controlling the image control apparatus, and particularly to a technique for setting a position where specific processing relating to image capturing is performed.
  • Japanese Patent Application Laid-Open No. 2012-203143 discusses a technique in which a user moves an AF target frame by performing a touch operation on a touch panel while looking into an in-finder display unit, and determines the display position of the AF target frame as an AF position through a double-tap.
  • AF processing, automatic exposure (AE) processing, white balance (WB) processing, and the like are performed in both still image capturing and moving image capturing.
  • the AF processing is performed as follows.
  • In still image capturing, the user specifies the AF position before capturing an image.
  • In moving image capturing, the user can specify an AF position before starting to record a moving image, and can change the AF position in the middle of moving image recording. If the method discussed in Japanese Patent Application Laid-Open No. 2012-203143 is applied to still image capturing, the user needs to perform a double-tap operation, after moving the AF target frame, to specify the AF position, and may therefore miss image capturing opportunities. To avoid this, the AF position could instead be moved in step with the movement of the AF target frame. In this case, however, an image automatically focused at a position other than the desired position may be recorded during moving image recording.
  • the present disclosure is directed to an imaging control apparatus that improves operability for setting a specified position relating to specific processing performed based on a specified position during moving image recording and at timings other than moving image recording.
  • an imaging control apparatus includes a touch detection unit configured to detect a touch operation on a touch panel; a display control unit configured to control, in response to the touch detection unit detecting a moving operation that moves a touch position, a first mark to be displayed at a position on a display unit moved by an amount corresponding to the moving operation; and a control unit. During recording of a captured moving image, the control unit performs control not to perform specific processing relating to image capturing for at least a predetermined time period, even if the touch detection unit detects a touch-up operation (a release of the touch) performed following the moving operation. In a predetermined state other than during recording of a captured moving image, the control unit performs control to perform, within the predetermined time period, the specific processing based on the position where the touch-up operation is detected, in response to the touch detection unit detecting a touch-up operation performed following the moving operation.
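  • The control behaviour just summarized can be sketched as follows. This is an illustrative reconstruction, not the patent's actual implementation; the class and method names (`AfController`, `on_touch_move`, `on_touch_up`) and the one-second wait period are assumptions for the sketch.

```python
class AfController:
    """Sketch of the touch-up handling summarized above."""

    def __init__(self, wait_period=1.0):
        self.recording = False          # True while a moving image is being recorded
        self.wait_period = wait_period  # the "predetermined time period" (assumed value)
        self.mark_pos = (0, 0)          # display position of the first mark (AF frame)
        self.executed_af = []           # positions where the specific processing (AF) ran

    def on_touch_move(self, dx, dy):
        # Move the mark by an amount corresponding to the moving operation.
        x, y = self.mark_pos
        self.mark_pos = (x + dx, y + dy)

    def on_touch_up(self):
        if self.recording:
            # During moving-image recording: do not perform AF for at least
            # the predetermined time period, even though a touch-up occurred.
            return None
        # Otherwise: perform AF based on the position at touch-up.
        self.executed_af.append(self.mark_pos)
        return self.mark_pos
```

  • The point of the split is visible in the two branches of `on_touch_up`: the same gesture ends with immediate AF when shooting stills, but is deferred while a moving image is being recorded so an unwanted refocus is not captured.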
  • FIGS. 1A and 1B are external views illustrating a digital camera as an example of an apparatus to which a configuration of a first exemplary embodiment is applicable.
  • FIG. 2 is a block diagram illustrating an example of a configuration of the digital camera as an example of the apparatus to which the configuration of the first exemplary embodiment is applicable.
  • FIGS. 3A, 3B, and 3C are flowcharts illustrating automatic focusing (AF) moving processing according to the first exemplary embodiment.
  • FIGS. 4A, 4B, 4C, and 4D illustrate states where an AF position is moved.
  • FIGS. 5A and 5B are external views illustrating a digital camera as an example of an apparatus to which a configuration of a second exemplary embodiment is applicable.
  • FIGS. 6A and 6B are flowcharts illustrating AF moving processing according to the second exemplary embodiment.
  • FIGS. 7A and 7B are flowcharts illustrating AF position moving instruction processing of a remote control according to a modification of the second exemplary embodiment.
  • FIGS. 8A and 8B are flowcharts illustrating AF position movement reception processing of a digital camera according to the modification of the second exemplary embodiment.
  • FIGS. 1A and 1B are external views of a digital camera as an example of an imaging control apparatus to which a first exemplary embodiment is applicable.
  • FIG. 1A is a perspective view illustrating the front side of a digital camera 100
  • FIG. 1B is a perspective view illustrating the back side of the digital camera 100 .
  • a display unit 28 displays an image and various kinds of information.
  • a touch panel 70 a capable of receiving a touch operation (capable of touch detection) is overlaid on the display unit 28 .
  • a shutter button 61 is an operation member for issuing an image capturing instruction.
  • a mode selection switch 60 is a (selectable) operation member for selecting various modes.
  • a terminal cover 40 protects connectors (not illustrated) such as a connecting cable for connecting the digital camera 100 and an external apparatus.
  • a main electronic dial 71 is a rotary operation member included in the operation unit 70 . Turning the main electronic dial 71 enables changing setting values such as a shutter speed and diaphragm settings.
  • a power switch 72 is an operation member for turning power of the digital camera 100 ON and OFF.
  • a sub electronic dial 73 is a rotary operation member included in the operation unit 70 and enables moving a selection frame and image feeding.
  • a cross key 74 is a four-way operation key included in the operation unit 70 . The left, right, top and bottom portions of the cross key 74 can be pressed. Operations corresponding to pressed portions of the cross key 74 are possible.
  • a SET button 75 is a push button included in the operation unit 70 . The SET button 75 is mainly used to determine a selection item.
  • a live view (LV) button 78 is included in the operation unit 70 .
  • the LV button 78 turns live view display on the display unit 28 ON and OFF.
  • the LV button 78 is used to give an instruction to start and stop moving image capturing (recording).
  • a playback button 79 is an operation button included in the operation unit 70 . The playback button 79 switches between the image capturing mode and the reproducing mode. When the user presses the playback button 79 in the image capturing mode, the digital camera 100 enters the reproducing mode and displays on the display unit 28 the last captured image out of images recorded in a recording medium 200 .
  • a grip portion 90 is a supporting portion (grasping portion) for operating the digital camera 100 while supporting it.
  • the operation unit 70 is provided on the side of the grip portion 90 .
  • An eyepiece sensor 77 is an object detection unit for detecting an object approaching closer than a predetermined distance, such as 1 or 2 centimeters (eye contact detection). For example, when the user brings an eye to the finder 16 (looks into an eyepiece portion 16 a ) to observe the in-finder display unit 76 and the eyepiece sensor 77 detects the approaching object (eye), the display of the in-finder display unit 76 is superimposed on the subject observed through the OVF. Further, when the eyepiece sensor 77 detects that the object (eye) has moved away beyond the predetermined distance, displayed items disappear from the in-finder display unit 76 .
  • When the user looks into the finder 16 , nothing is displayed on the display unit 28 , but a touch operation on the touch panel 70 a for setting the AF position can still be received. In this case, if the user performs a touch operation on the touch panel 70 a while grasping the grip portion 90 with a finger resting on the shutter button 61 , the user can quickly perform an AF position moving operation and issue an image capturing instruction while observing the display on the in-finder display unit 76 (and the subject observed through the OVF).
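  • The eyepiece-sensor behaviour described above can be sketched as a small routing function. This is a hypothetical illustration (the function name `route_display` and the 2 cm threshold are assumptions; the text only mentions "1 or 2 centimeters"), not Canon's implementation.

```python
EYE_NEAR_CM = 2.0  # assumed detection threshold; the text mentions "1 or 2 centimeters"

def route_display(distance_cm, in_finder_items):
    """Decide display routing from the eyepiece-sensor distance.

    When an eye is near the finder, the in-finder display unit shows its
    items and the back monitor is blank, while the touch panel keeps
    accepting AF-position operations (the "touch pad function").
    """
    eye_detected = distance_cm < EYE_NEAR_CM
    return {
        "in_finder_display": in_finder_items if eye_detected else [],
        "back_monitor_on": not eye_detected,
        "touch_pad_active": eye_detected,
    }
```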
  • the display of the finder 16 may be an electronic view finder (EVF) instead of an OVF.
  • a lens unit 150 is detachably attached to the digital camera 100 .
  • FIG. 2 is a block diagram illustrating an example of a configuration of the digital camera 100 according to the present exemplary embodiment.
  • the lens unit 150 includes an exchangeable imaging lens. Although a lens 103 is composed of a plurality of lenses, it is simply illustrated as one lens.
  • a communication terminal 6 is used by the lens unit 150 to communicate with the digital camera 100 .
  • a communication terminal 10 is used by the digital camera 100 to communicate with the lens unit 150 .
  • An AE sensor 17 measures the luminance of light of a subject (image) formed on a focusing screen 13 through the lens unit 150 and a quick return mirror 12 .
  • the quick return mirror 12 (hereinafter referred to as a mirror 12 ) is moved up and down by an actuator (not illustrated) through an instruction from a system control unit 50 .
  • the mirror 12 is used to switch the travelling direction of a light flux incident from the lens 103 between the side of the finder and the side of the imaging unit 22 .
  • the mirror 12 is arranged so as to reflect the light flux to the finder 16 .
  • the mirror 12 flips upward to retract from the light flux so as to lead the light flux to the imaging unit 22 (mirror up).
  • the mirror 12 is partially a half mirror of which the central part transmits a part of light to enter a focus detection unit 11 for performing focus detection.
  • a photographer can confirm the focusing state and composition of the optical image of the subject obtained through the lens unit 150 by observing the image formed on the focusing screen 13 through a pentaprism 14 and the finder 16 .
  • the in-finder display unit 76 is a display unit arranged on the optical path from the lens 103 to the finder 16 .
  • the in-finder display unit 76 displays a frame indicating the current AF position and icons, e.g., items, marks, and signs, indicating setting statuses of the digital camera 100 through the system control unit 50 .
  • the focusing screen 13 and the in-finder display unit 76 are arranged at adjacent positions in an overlapped manner so that respective displays can be confirmed at one time.
  • the focus detection unit 11 (AF sensor) employs the phase difference detection method which outputs defocus amount information based on the captured image to the system control unit 50 .
  • the system control unit 50 can control the lens unit 150 via the communication terminals 6 and 10 .
  • the system control unit 50 performs phase difference AF by moving the position of the lens 103 via an AF drive circuit 3 based on the defocus amount information (AF execution is possible).
  • the AF method may be contrast AF instead of phase difference AF.
  • the imaging unit 22 is an image sensor such as a charge coupled device (CCD) sensor or complementary metal-oxide semiconductor (CMOS) sensor for converting an optical image into an electrical signal.
  • An analog-to-digital (A/D) converter 23 converts an analog signal into a digital signal.
  • the A/D converter 23 is used to convert an analog signal output from the imaging unit 22 into a digital signal.
  • An image processing unit 24 performs predetermined pixel interpolation, resizing processing such as reduction of size, and color conversion processing on data from the A/D converter 23 or data from the memory control unit 15 .
  • the image processing unit 24 performs predetermined calculation processing by using captured image data.
  • the system control unit 50 performs exposure control and range finding control based on an obtained calculation result. This enables performing the AF processing, AE processing, and electronic flash preliminary emission (EF) processing using the Through The Lens (TTL) method.
  • the image processing unit 24 further performs Automatic White Balance (AWB) processing using the TTL method based on a calculation result obtained by performing predetermined calculation processing by using captured image data.
  • the display unit 28 is a back monitor for displaying an image.
  • the display unit 28 is not limited to a liquid crystal display, and may be other displays such as an organic Electro Luminescence (EL) display.
  • Output data from the A/D converter 23 is written in the memory 32 via the image processing unit 24 and the memory control unit 15 or directly via the memory control unit 15 .
  • the memory 32 stores image data captured by the imaging unit 22 and converted into digital data by the A/D converter 23 , and stores image data to be displayed on the display unit 28 .
  • the memory 32 is provided with a sufficient storage capacity for storing a predetermined number of still images and moving images and sound for a predetermined time period.
  • the memory 32 also serves as an image display memory (video memory).
  • a digital-to-analog (D/A) converter 19 converts image display data stored in the memory 32 into an analog signal and supplies the analog signal to the display unit 28 and the in-finder display unit 76 .
  • the image display data written in the memory 32 is displayed on the display unit 28 and the in-finder display unit 76 via the D/A converter 19 .
  • the display unit 28 and the in-finder display unit 76 display images corresponding to the analog signal from the D/A converter 19 on a display such as a liquid crystal display (LCD).
  • the digital signal generated through A/D conversion by the A/D converter 23 and stored in the memory 32 is converted into an analog signal by the D/A converter 19 .
  • Successively transferring images to the display unit 28 (or to the in-finder display unit 76 when it is an EVF) and displaying them enables the display to function as an electronic view finder providing live view display.
  • a nonvolatile memory 56 is an electrically erasable recordable memory such as an electrically erasable programmable read only memory (EEPROM). Constants and programs for operations of the system control unit 50 are stored in the nonvolatile memory 56 .
  • the programs stored in the nonvolatile memory 56 refer to programs for executing various flowcharts (described below) according to the present exemplary embodiment.
  • the system control unit 50 controls the entire digital camera 100 . Each piece of processing of the present exemplary embodiment (described below) is implemented when the system control unit 50 executes a program recorded in the above-described nonvolatile memory 56 . Constants and variables for operations of the system control unit 50 and programs read from the nonvolatile memory 56 are loaded into the system memory 52 . A random access memory (RAM) is used as the system memory 52 .
  • the system control unit 50 also performs display control by controlling the memory 32 , the D/A converter 19 , the display unit 28 , and the in-finder display unit 76 .
  • a system timer 53 is a time measurement unit for measuring time used for various control and time of a built-in clock.
  • the mode selection switch 60 , the shutter button 61 , and the operation unit 70 are operation units for inputting various operation instructions to the system control unit 50 .
  • the mode selection switch 60 switches the operation mode of the system control unit 50 between the image capturing mode and the reproducing mode. Further, the digital camera 100 has various scene modes serving as image capturing settings for each image capturing scene, the programmed AE mode, and a custom mode. The mode selection switch 60 enables the user to directly select one of these modes included in the menu screen. Alternatively, after first displaying the menu screen, the user may select one of these modes by using another operation member.
  • the first shutter switch 62 turns ON in the middle of an operation (i.e., a half-stroke press) of the shutter button 61 provided on the digital camera 100 to generate a first shutter switch signal SW 1 (a half-stroke press represents an image capturing preparation instruction).
  • the first shutter switch signal SW 1 causes the system control unit 50 to start operations including the AF processing, AE processing, AWB processing, and EF processing.
  • a second shutter switch 64 turns ON upon completion of an operation (full-stroke press) of the shutter button 61 to generate a second shutter switch signal SW 2 (full-stroke press means an image capturing instruction).
  • the second shutter switch signal SW 2 causes the system control unit 50 to start a series of image capturing operations ranging from signal reading from the imaging unit 22 to image data writing in the recording medium 200 .
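  • The two-stage shutter behaviour of SW1 and SW2 can be sketched as follows. The function name, the stroke representation, and the thresholds are assumptions for illustration; the source only defines SW1 as half-stroke and SW2 as full-stroke.

```python
def shutter_events(stroke):
    """Map shutter-button stroke depth (0.0 to 1.0) to switch signals.

    SW1 (half-stroke press) starts preparation: AF, AE, AWB, and EF
    processing. SW2 (full-stroke press) starts the capture sequence, from
    reading the imaging unit to writing image data to the recording medium.
    """
    HALF, FULL = 0.5, 1.0  # illustrative thresholds
    events = []
    if stroke >= HALF:
        events.append("SW1: start AF/AE/AWB/EF")
    if stroke >= FULL:
        events.append("SW2: capture and write to recording medium")
    return events
```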
  • each operation member of the operation unit 70 is suitably assigned a function and serves as a function button for each scene.
  • the operation unit 70 includes at least the following operation members: the shutter button 61 , the main electronic dial 71 , the power switch 72 , the sub electronic dial 73 , the cross key 74 , the SET button 75 , the LV button 78 , and the playback button 79 .
  • the user is able to intuitively perform various settings by using the menu screen displayed on the display unit 28 , the cross key (four-direction operation key), and the SET button.
  • a power supply control unit 80 includes a battery detection circuit, a direct current-to-direct current (DC-DC) converter, and a switch circuit for selecting a block to be supplied with power, and detects the presence of a battery, the battery type, and the remaining battery capacity.
  • the power supply control unit 80 also controls the DC-DC converter based on the result of detection and an instruction of the system control unit 50 to supply a required voltage to the recording medium 200 and other components for a required time period.
  • the power switch 72 receives a power ON/OFF operation.
  • a power source unit 30 includes a primary battery (such as an alkaline battery and a lithium battery), a secondary battery (such as a NiCd battery, a NiMH battery, and a Li battery), and an alternating current (AC) adaptor.
  • a recording medium interface (I/F) 18 is an interface with the recording medium 200 such as a memory card and a hard disk.
  • the recording medium 200 is a recording medium such as a memory card for recording captured images, including a semiconductor memory and a magnetic disk.
  • a communication unit 54 establishes a wireless or wired connection to transmit and receive image signals and audio signals.
  • the communication unit 54 can also connect to a wireless Local Area Network (LAN) and the Internet.
  • the communication unit 54 can transmit images (including a live view) captured by the imaging unit 22 and images recorded in the recording medium 200 and receive image data and other various kinds of information from an external apparatus.
  • the digital camera 100 is provided with the touch panel 70 a that can detect contact on the display unit 28 .
  • the touch panel 70 a and the display unit 28 can be integrally formed.
  • the touch panel 70 a is configured so that its light transmissivity does not disturb the display of the display unit 28 , and is attached to the upper layer of the display screen of the display unit 28 .
  • the input coordinates of the touch panel 70 a are associated with the display coordinates on the display unit 28 .
  • This enables configuring such a graphical user interface (GUI) that enables the user to directly operate the screen displayed on the display unit 28 .
  • a setting in which the position where a touch operation is performed is associated with the position on the display unit 28 and an instruction is received is referred to as an absolute coordinate setting.
  • a setting in which an instructed position is moved from a predetermined position on the display unit 28 according to the moving amount and moving direction of a touch operation (to a position resulting from a movement by an amount corresponding to the moving operation), rather than to the touch coordinates themselves, is referred to as a relative coordinate setting.
  • When performing an operation while observing the in-finder display unit 76 , if the user performs a touch operation with an absolute coordinate setting, the user touches the touch panel 70 a without being able to see it (or the display unit 28 ), and is therefore highly likely to perform an erroneous touch operation at a position different from the desired position.
  • the user when the user performs a touch operation with a relative coordinate setting, the user issues a moving instruction based not on the position but on the moving amount of the touch operation. Therefore, when the user performs an operation for moving the operation target to a desired position while observing the position of the operation target displayed on the in-finder display unit 76 , the user is able to issue an instruction for moving it to a desired position.
  • the absolute coordinate setting and the relative coordinate setting can be selected in a touch pad setting of a menu screen.
  • With an absolute coordinate setting used while the user looks into the finder, the touch position and the instructed position may differ, and operability may degrade. Therefore, if the touch panel 70 a receives touch operations with a relative coordinate setting while the eyepiece sensor 77 detects an approaching object, favorable operability is provided both when the user is observing the display unit 28 and when the user is observing the in-finder display unit 76 .
  • the function by which the touch panel 70 a receives a touch operation while no image is displayed on the display unit 28 is referred to as a touch pad function.
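  • The difference between the two coordinate settings can be made concrete with a small sketch. This is an illustrative function under assumed names (`map_touch` and its parameters are not from the patent); it only demonstrates the mapping rule described above.

```python
def map_touch(setting, touch_pos, current_pos, move_delta):
    """Compute the instructed position under the two coordinate settings.

    - "absolute": the instructed position is the touched position itself.
    - "relative": the instructed position moves from the current position by
      the amount and direction of the touch-move, regardless of where on
      the panel the touch occurred.
    """
    if setting == "absolute":
        return touch_pos
    if setting == "relative":
        cx, cy = current_pos
        dx, dy = move_delta
        return (cx + dx, cy + dy)
    raise ValueError(f"unknown setting: {setting}")
```

  • With "relative", touching anywhere on the panel and dragging by (5, -10) nudges the AF frame by that amount, which is why it suits operation done blind while looking into the finder.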
  • the system control unit 50 is able to detect the following operations and states on the touch panel 70 a.
  • When a touch-down is detected, a touch-on is detected at the same time. After a touch-down, a touch-on normally keeps being detected until a touch-up is detected. A touch-move is also detected in a state where a touch-on is being detected. Even if a touch-on is detected, a touch-move is not detected unless the touch position moves. After a touch-up is detected for all fingers and the pen, a touch-off is detected.
  • a flick is an operation in which the user quickly moves the user's finger, while the finger is in contact with the touch panel 70 a , over a predetermined distance or longer and then subsequently removes the finger therefrom. In other words, the user quickly flips the surface of the touch panel 70 a with the finger. After detecting a touch-move at a predetermined speed or higher over a predetermined distance or longer, when a touch-up is subsequently detected, it can be determined that a flick has been performed.
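The flick criterion described above (a touch-move at a predetermined speed or higher over a predetermined distance or longer, followed by a touch-up) can be sketched as follows. The threshold values are illustrative assumptions; the patent does not specify numbers.

```python
# Illustrative flick classifier. A touch-up following a move that covers at
# least MIN_DISTANCE at a speed of at least MIN_SPEED is treated as a flick.
# The threshold constants are made-up example values.
import math

MIN_DISTANCE = 30.0   # pixels (assumed "predetermined distance")
MIN_SPEED = 200.0     # pixels per second (assumed "predetermined speed")

def is_flick(start, end, duration_s):
    """Classify a completed touch as a flick based on distance and speed."""
    if duration_s <= 0:
        return False
    dist = math.hypot(end[0] - start[0], end[1] - start[1])
    speed = dist / duration_s
    return dist >= MIN_DISTANCE and speed >= MIN_SPEED
```

A slow drag over the same distance, or a fast but short movement, both fail the test, matching the "quickly flips the surface" description.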
  • the touch panel 70 a may be of any one of various types including a resistance film type, a capacitance type, a surface elastic wave type, an infrared type, an electromagnetic induction type, an image recognition type, and an optical sensor type. Depending on the type, a touch is detected when a finger or a pen comes in contact with the touch panel 70 a or when the finger or pen comes close to the touch panel 70 a . Either type is applicable.
  • FIGS. 3A to 3C illustrate AF moving processing according to the first exemplary embodiment. This processing is implemented when a program recorded in the nonvolatile memory 56 is loaded into the system memory 52 and then executed by the system control unit 50 . This processing is started when power of the digital camera 100 is turned ON and the image capturing mode is selected.
  • step S 300 the system control unit 50 initializes the AF coordinates B and records the AF coordinates B in the system memory 52 .
  • the position of the coordinates to be initialized is set to the position of a person's face detected as illustrated in FIG. 4A or the center position of the imaging region.
  • the in-finder display unit 76 actually displays an AF frame 401 or a changing AF frame 402 , and the subject is visually recognizable through the OVF.
  • the system control unit 50 superimposes the AF frame 401 on the captured image captured by the imaging unit 22 .
  • FIG. 4A illustrates a state where the AF frame 401 is displayed at the AF coordinates B (Xb, Yb).
  • FIGS. 4A to 4C illustrate examples of displays on the in-finder display unit 76 during AF position moving processing and touch operations on the touch panel 70 a corresponding to each display on the in-finder display unit 76 .
  • step S 301 the system control unit 50 determines whether the eyepiece sensor 77 detects (is detecting) an eye contact (approach detection). When the system control unit 50 determines that the eyepiece sensor has detected an eye contact (YES in step S 301 ), the processing proceeds to step S 302 . On the other hand, when the system control unit 50 determines that the eyepiece sensor 77 has not detected an eye contact (NO in step S 301 ), the processing proceeds to step S 303 .
  • In step S 302 , the system control unit 50 displays the AF frame 401 at the current AF position on the in-finder display unit 76 (the AF coordinates B recorded in step S 300 or the AF coordinates B set in step S 334 or S 333 (described below)). More specifically, in a case where an AF position has already been set in the AF position moving processing, the set position is the AF coordinates B when the processing returns to step S 302 ; when the image capturing mode is selected for the first time after turning power ON, the detected face position or the center position of the imaging region is set as the AF coordinates B. The system control unit 50 then displays the AF frame 401 (mark or index) at the AF coordinates B.
  • step S 303 the system control unit 50 displays the AF frame 401 at the position of the AF coordinates B of the display unit 28 (not illustrated).
  • step S 304 the system control unit 50 performs processing for performing AF on the position of the AF coordinates B.
  • the system control unit 50 does not always need to perform the AF processing in step S 304 , and may perform the AF processing in step S 304 only when the user sets continuous AF.
  • step S 305 the system control unit 50 determines whether an instruction for starting moving image capturing is issued.
  • the instruction for starting moving image capturing can also be performed by pressing the LV button 78 .
  • When the system control unit 50 determines that the instruction for starting moving image capturing has been issued (YES in step S 305 ), the processing proceeds to step S 306 . When the system control unit 50 determines that the instruction has not been issued (NO in step S 305 ), the processing proceeds to step S 308 .
  • step S 306 the system control unit 50 starts moving image recording (start of recording) and records the image captured by the imaging unit 22 in the recording medium 200 as a moving image.
  • After the start of recording, if the AF processing is performed at the AF position set by the user, a moving image in which the subject desired by the user is in focus is recorded.
  • step S 307 the system control unit 50 determines whether a touch-down has been performed on the touch panel 70 a .
  • When the system control unit 50 determines that a touch-down has been performed (YES in step S 307 ), the processing proceeds to step S 310 . When the system control unit 50 determines that a touch-down has not been performed (NO in step S 307 ), the processing returns to step S 301 .
  • step S 308 the system control unit 50 determines whether an instruction for capturing a still image has been performed.
  • the instruction for capturing a still image can be performed by pressing the shutter button 61 .
  • When the system control unit 50 determines that the instruction for capturing a still image has been performed (YES in step S 308 ), the processing proceeds to step S 309 . When the system control unit 50 determines that the instruction for capturing a still image has not been performed (NO in step S 308 ), the processing proceeds to step S 307 .
  • step S 309 the system control unit 50 records the image captured by the imaging unit 22 in the recording medium 200 .
  • the system control unit 50 performs the AF processing on the set AF coordinates B according to the detection of the press of SW 1 , and records the captured image having undergone the AF processing at the AF coordinates B according to the detection of the depression of SW 2 . More specifically, the system control unit 50 records the captured image in which the object desired by the user is focused.
  • step S 310 the system control unit 50 determines whether a set touch pad setting is a relative coordinate setting.
  • When the system control unit 50 determines that the touch pad setting is a relative coordinate setting (YES in step S 310 ), the processing proceeds to step S 311 . When the touch pad setting is an absolute coordinate setting (NO in step S 310 ), the processing proceeds to step S 332 .
  • step S 311 the system control unit 50 determines whether the eyepiece sensor 77 has detected (is detecting) an eye contact.
  • When the system control unit 50 determines that an eye contact has been detected (YES in step S 311 ), the processing proceeds to step S 312 . When the system control unit 50 determines that an eye contact has not been detected (NO in step S 311 ), the processing proceeds to step S 332 .
  • the system control unit 50 proceeds to the AF position moving processing with a relative coordinate input in steps S 312 to S 325 .
  • the system control unit 50 proceeds to the AF position moving processing with an absolute coordinate input in steps S 332 and S 333 .
  • step S 312 the system control unit 50 starts measuring the time period T by using the system timer 53 .
  • The time period T is used to determine whether the touch operation detected on the touch panel 70 a is a tap operation with a short touch time. More specifically, the system control unit 50 measures the time period from when a touch-down is detected until a touch-up is detected without a touch-move being performed.
  • step S 313 the system control unit 50 acquires touch-down coordinates O and records the touch-down coordinates O in the system memory 52 .
  • The x-axis is provided in the horizontal direction, the y-axis is provided in the vertical direction, and the origin is set at the upper left corner, as illustrated in FIG. 4A .
  • The touch-down coordinates O are set to (X0, Y0).
  • step S 314 the system control unit 50 determines whether a touch-move has been performed.
  • the system control unit 50 determines that a touch-move has been performed.
  • The predetermined distance is, for example, 1 or 0.5 centimeters. If the touch moves a distance shorter than the predetermined distance, the system control unit 50 treats the operation not as a touch-move but as part of a tap operation in which the touch position moved unintentionally.
  • When the system control unit 50 determines that a touch-move has been performed (YES in step S 314 ), the processing proceeds to step S 315 . When the system control unit 50 determines that a touch-move has not been performed (NO in step S 314 ), the processing proceeds to step S 328 .
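The distance test that separates a genuine touch-move from tap jitter can be sketched as follows. The 0.5 cm threshold follows the example above; the function name and the pixel-density parameter are illustrative assumptions.

```python
# Sketch of the touch-move test in step S314: a drag shorter than a
# predetermined distance (0.5 cm here, per the example in the text) is treated
# as jitter within a tap, not as a touch-move. Names are hypothetical.
import math

MOVE_THRESHOLD_CM = 0.5

def is_touch_move(origin, current, pixels_per_cm):
    """Return True when the touch has moved far enough to count as a touch-move."""
    moved_px = math.hypot(current[0] - origin[0], current[1] - origin[1])
    return (moved_px / pixels_per_cm) >= MOVE_THRESHOLD_CM
```

A panel reporting 100 pixels per centimeter would therefore require at least a 50-pixel drag before the AF-moving branch (step S315 onward) is entered.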
  • step S 316 the system control unit 50 changes the AF frame currently displayed at the AF coordinates B to the changing AF frame 402 indicating that the AF position is currently being changed.
  • FIG. 4B illustrates an example of display of the in-finder display unit 76 after a touch-move is detected, and an example of a touch operation on the touch panel 70 a corresponding to the display.
  • The system control unit 50 changes the AF frame 401 having been displayed at the AF coordinates B to the changing AF frame 402 indicating that the AF position is currently being changed. More specifically, the system control unit 50 displays the changing AF frame 402 at the AF coordinates B. In this case, the position subjected to the AF processing remains at the position where the changing AF frame 402 is displayed, so the AF position remains unchanged from that in step S 302 or S 303 .
  • step S 317 the system control unit 50 acquires coordinates of the current touch position (touch coordinates M (Xm, Ym)).
  • The relative coordinates C of the touch movement illustrated in FIG. 4B are ((X1−X0), (Y1−Y0)).
  • The distance from the touch-down coordinates O to the touch coordinates M is denoted L.
  • step S 319 the system control unit 50 calculates temporary AF coordinates P based on the AF coordinates B and the relative coordinates C calculated in step S 318 , and displays the temporary AF frame 403 at the temporary AF coordinates P.
  • The temporary AF coordinates P are obtained by adding the relative coordinates C to the AF coordinates B, i.e., ((Xb+Xc), (Yb+Yc)).
  • the system control unit 50 displays the temporary AF frame 403 at the position of the calculated temporary AF coordinates P.
  • Although the AF processing has not yet been performed at the position where the temporary AF frame 403 is displayed, it will be performed at that position in response to detection of a touch-up (described below) or a tap.
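The calculation of the temporary AF coordinates P in step S319 can be sketched as follows. The clamping of P to the display area is an added assumption (the patent text only gives P = B + C); the names are illustrative.

```python
# Sketch of step S319: the temporary AF coordinates P are the AF coordinates B
# plus the relative coordinates C of the touch (C = M - O). The clamp to the
# display bounds is an assumption added for robustness, not from the patent.

def temporary_af_coords(b, o, m, width, height):
    """Compute P = B + C, with C = M - O, clamped to the display area."""
    cx, cy = m[0] - o[0], m[1] - o[1]   # relative coordinates C
    px, py = b[0] + cx, b[1] + cy       # P = B + C = ((Xb+Xc), (Yb+Yc))
    return (min(max(px, 0), width - 1), min(max(py, 0), height - 1))

print(temporary_af_coords((100, 80), (10, 10), (40, 30), 640, 480))  # (130, 100)
```

Note that P depends only on how far the touch has moved, not on where it started, which is the essence of the relative coordinate setting.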
  • step S 320 the system control unit 50 determines whether a touch-up from the touch panel 70 a has been performed.
  • When the system control unit 50 determines that a touch-up has been performed (YES in step S 320 ), the processing proceeds to step S 321 . When the system control unit 50 determines that a touch-up has not been performed (NO in step S 320 ), the processing returns to step S 317 .
  • step S 321 the system control unit 50 determines whether a moving image is currently being captured (currently being recorded). When the system control unit 50 determines that a moving image is currently being captured (currently being recorded) (YES in step S 321 ), the processing proceeds to step S 322 . On the other hand, when the system control unit determines that a moving image is not currently being captured (currently not being recorded) (NO in step S 321 ), the processing proceeds to step S 334 .
  • step S 322 the system control unit 50 starts measuring a time period S by using the system timer 53 .
  • The time period S is used to determine whether a predetermined time has elapsed without an operation for determining the AF position (described below) after a touch-move for moving the temporary AF position and a subsequent touch-up have been performed.
  • When the predetermined time has elapsed, the coordinates at which the temporary AF frame 403 is displayed are reset and the original state (step S 301 ) is restored, on the assumption that an AF position moving operation was not intended.
  • a guide may be displayed on the in-finder display unit 76 to indicate that the AF position can be determined by performing a tap operation.
  • In step S 323 , the system control unit 50 determines whether the elapsed time period S is longer than a predetermined time period α.
  • When the system control unit 50 determines that the time period S is longer than the predetermined time period α (YES in step S 323 ), the processing proceeds to step S 324 . When the time period S is not longer than the predetermined time period α (NO in step S 323 ), the processing proceeds to step S 326 .
  • The predetermined time period α is a time length such as 10 or 12 seconds.
  • step S 324 the system control unit 50 changes the changing AF frame 402 having been displayed at the AF coordinates B illustrated in FIG. 4B to the AF frame 401 and displays the AF frame 401 at the AF coordinates B as illustrated in FIG. 4A . Then, the system control unit 50 hides the temporary AF frame 403 .
  • the system control unit 50 leaves the AF position at the AF coordinates B unchanged without moving the AF position to the position of the temporary AF coordinates P.
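The timeout behavior of steps S322 to S324 can be sketched as follows. The function name, the `tapped` flag, and the 10-second default are illustrative assumptions (the description gives 10 or 12 seconds only as examples of the predetermined time period, called alpha here).

```python
# Sketch of steps S322-S324: after a touch-up, the camera waits for a
# determining tap. If the wait exceeds alpha, the temporary position P is
# discarded and the AF position stays at the original coordinates B.
# All names and the default value are hypothetical.

def position_after_wait(elapsed_s, tapped, b, p, alpha_s=10.0):
    """Return the AF coordinates after the post-touch-up waiting period.

    elapsed_s -- time period S since the touch-up
    tapped    -- whether a determining tap was received
    b, p      -- original AF coordinates B and temporary AF coordinates P
    """
    if tapped and elapsed_s <= alpha_s:
        return p   # tap within alpha: AF position moves to P (step S334)
    return b       # timeout or no tap: AF position stays at B (step S324)
```

This mirrors the text: the move is only committed by an explicit determining operation, never by the mere passage of time.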
  • step S 326 the system control unit 50 determines whether a touch-down has been performed on the touch panel 70 a .
  • When the system control unit 50 determines that a touch-down has been performed (YES in step S 326 ), the processing proceeds to step S 327 . When a touch-down has not been performed (NO in step S 326 ), the processing returns to step S 323 .
  • step S 328 the system control unit 50 determines whether a touch-up from the touch panel 70 a has been performed.
  • When the system control unit 50 determines that a touch-up has been performed (YES in step S 328 ), the processing proceeds to step S 329 . When a touch-up has not been performed (NO in step S 328 ), the processing returns to step S 314 .
  • In step S 329 , the system control unit 50 determines whether the elapsed time period T is longer than a predetermined time period β.
  • When the time period T is longer than the predetermined time period β (YES in step S 329 ), the processing proceeds to step S 331 . When the time period T is not longer than the predetermined time period β (NO in step S 329 ), the processing proceeds to step S 330 .
  • The predetermined time period β is a time length such as 1 or 0.5 seconds.
  • In step S 331 , the system control unit 50 changes the changing AF frame 402 having been displayed at the AF coordinates B illustrated in FIG. 4B to the AF frame 401 , and displays the AF frame 401 at the AF coordinates B as illustrated in FIG. 4A . Then, the system control unit 50 hides the temporary AF frame 403 . More specifically, in the case of a touch lasting longer than the predetermined time period β (a long-time depression), rather than a tap operation shorter than the predetermined time period β, the system control unit 50 leaves the AF position at the AF coordinates B before the touch operation (the detection of the movement of the touch position) unchanged without moving the AF position to the temporary AF coordinates P.
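The tap test of steps S328 to S331 can be sketched as follows. The 0.5-second value follows the examples in the description (given there for the predetermined time period, called beta here); the function name is an assumption.

```python
# Sketch of the tap test in steps S328-S331: the time T from touch-down to
# touch-up (with no touch-move) is compared with a predetermined period beta.
# A short touch (a tap) confirms the AF position; a longer touch (a long-time
# depression) leaves it unchanged. Names and the exact value are hypothetical.

BETA_S = 0.5

def tap_confirms_af(touch_down_time, touch_up_time):
    """Return True when the touch duration T qualifies as a confirming tap."""
    t = touch_up_time - touch_down_time
    return t <= BETA_S
```

Thus a 0.3-second touch confirms the move while a 1.2-second touch is rejected as a long-time depression, keeping the AF position at the coordinates B.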
  • The processing in steps S 332 and S 333 is performed in a case where the touch pad setting is an absolute coordinate setting, or in a case where, although the touch pad setting is a relative coordinate setting, a touch operation is received with an absolute coordinate setting because the eyepiece sensor 77 does not detect an eye contact.
  • the system control unit 50 performs the AF processing on the position where a touch-down has been performed.
  • FIG. 4D illustrates a state of the AF frame movement in a case where an image is displayed on the display unit 28 with an absolute coordinate setting. In a state where the eye does not contact the eyepiece portion 16 a , the user may perform a touch operation at the position on the display unit 28 where an object subjected to the AF processing is displayed.
  • step S 333 the system control unit 50 performs the AF processing on the position of the AF coordinates B recorded in the system memory 52 in step S 332 .
  • the processing in steps S 334 to S 336 indicates processing relating to the AF processing on the position as a result of the movement in a case where an AF position moving operation with a relative coordinate setting has been received.
  • step S 334 the system control unit 50 changes the temporary AF frame 403 having been displayed at the temporary AF coordinates P to the AF frame 401 . Further, the system control unit 50 updates the coordinates to be recorded in the system memory 52 as the AF coordinates B from the coordinates recorded in step S 300 or S 334 to the temporary AF frame coordinates P.
  • step S 335 the system control unit 50 hides the changing AF frame 402 currently being displayed at the AF coordinates B.
  • FIG. 4C illustrates the states of the in-finder display unit 76 and the touch panel 70 a after the AF position is changed, in which the AF coordinates have been moved from the AF coordinates B illustrated in FIG. 4A .
  • During moving image recording, the AF position is determined when the user moves the temporary AF frame 403 and then performs a tap operation (an operation for determining the AF position), as illustrated in FIG. 4B .
  • When a moving image is not being recorded, the AF position is determined when the user moves the temporary AF frame 403 and then performs a touch-up, as illustrated in FIG. 4B .
  • step S 336 the system control unit 50 performs the AF processing on the position of the AF coordinates B recorded in the system memory 52 in step S 334 .
  • step S 337 the system control unit 50 determines whether an instruction for ending moving image capturing is issued.
  • the instruction for ending moving image capturing can be issued by pressing the LV button 78 during moving image capturing.
  • the system control unit 50 does not perform the determination in step S 337 at timings other than moving image capturing.
  • When the system control unit 50 determines that the instruction for ending moving image capturing has been issued (YES in step S 337 ), the processing proceeds to step S 338 . When the system control unit 50 determines that the instruction for ending moving image capturing has not been issued (NO in step S 337 ), the processing proceeds to step S 339 .
  • step S 338 the system control unit 50 performs moving image end processing.
  • The moving image end processing refers to processing that makes a generated moving image file reproducible, for example, by generating attribute information for the captured moving image file, inter-frame information, and a thumbnail, and by performing compression processing. These pieces of information are stored in the recording medium 200 together with the moving image file.
  • step S 339 the system control unit 50 determines whether an end operation has been performed. End operations include an operation for selecting the reproducing mode and an operation for turning power OFF. When the system control unit 50 determines that an end operation has been performed (YES in step S 339 ), the system control unit 50 ends the AF position moving processing. When the system control unit 50 determines that an end operation has not been performed (NO in step S 339 ), the processing returns to step S 301 .
  • the AF position can be changed with favorable operability during moving image recording and at other timings.
  • the system control unit 50 does not perform the AF processing immediately after performing a moving operation but changes the position to be subjected to the AF processing according to the determination operation.
  • this processing reduces the possibility that the AF position is changed at an unintended timing in the middle of moving image recording and the possibility that the AF processing is performed on an unintended position in the middle of a moving operation.
  • Performing a tap operation after a moving operation enables performing the AF processing on the position where the temporary AF frame 403 is displayed at a desired timing. This makes it easier to obtain a moving image in which the AF position is changed at a desired timing and at a desired position.
  • the system control unit 50 performs the AF processing on the changed position according to an AF position moving operation and then a touch-up from the touch panel 70 a (steps S 334 to S 336 ). More specifically, immediately after moving the temporary AF frame 403 and then performing a touch-up, the AF processing is performed on the position of the temporary AF frame 403 (the temporary AF frame 403 changes to the AF frame 401 ), enabling quickly changing the AF position.
  • the ability to quickly change the AF position reduces the possibility of missing imaging opportunities during AF position setting.
  • Since the system control unit 50 changes the timing of changing the AF position depending on whether a moving image is being recorded in this way, the AF position changing operation can be performed with favorable operability.
  • the system control unit 50 also changes the display formats of the frames for indicating positions relating to the AF position, such as the AF frame 401 , the changing AF frame 402 , and the temporary AF frame 403 . This enables the user to easily grasp the position (the AF frame 401 and the changing AF frame 402 ) where the AF processing is actually being performed, and the position (the temporary AF frame 403 ) which is to be set as the AF position.
  • Although the system control unit 50 leaves the AF position at the AF coordinates B unchanged when the predetermined time period α has elapsed in step S 323 , a cancelling operation may also be received. If the position where the temporary AF frame is currently displayed is a position unintended by the user, the system control unit 50 enables the user to quickly cancel the position of the temporary AF frame (cancel processing) and set a new AF frame, or to return to the position before the moving operation.
  • Alternatively, the system control unit 50 may determine that the user approves the position where the temporary AF frame is currently displayed and, after the predetermined time period α has elapsed, perform the processing (the processing in step S 334 and subsequent steps) for determining the AF coordinates.
  • Although the system control unit 50 hides the temporary AF frame 403 , the processing is not limited thereto; the system control unit 50 may instead change the color or size of the frame. When the display format of the temporary AF frame 403 is changed, the user recognizes either that the AF position is being determined rather than moved, or that the temporary AF position being moved has been canceled and the AF frame has returned to the original position. The position where the AF frame is displayed, however, remains unchanged even if the display format has been changed.
  • the user performs a touch operation on the display unit 28 (touch panel 70 a ) of the digital camera 100 while looking into the in-finder display unit 76 to perform the AF position moving processing.
  • a second exemplary embodiment will be described below focusing on a case where an instruction is received from a remote control at a separate position and the AF position is changed.
  • the user will perform an operation while observing the display unit on the digital camera at a separate position or the display panel of the remote control.
  • the use of a remote controller 504 makes it easier to perform each operation.
  • FIGS. 5A and 5B illustrate a digital camera 300 as an example of an imaging control apparatus to which the second exemplary embodiment is applicable.
  • FIG. 5A illustrates a state where the remote controller 504 and the digital camera 300 according to the present exemplary embodiment are wirelessly communicating with each other.
  • the configuration of the digital camera 300 is similar to that according to the first exemplary embodiment illustrated in FIG. 2 .
  • the digital camera 100 according to the first exemplary embodiment illustrated in FIGS. 1A and 1B is provided with the display unit 28 on its back side.
  • the digital camera 300 according to the present exemplary embodiment is provided with a display unit 501 and a touch panel 502 a on its side surface.
  • the digital camera 300 is further provided with not an OVF but an EVF 512 at a position away from the main body of the digital camera 300 . Captured images captured by an imaging unit 22 on the main body can be displayed on the EVF 512 .
  • the digital camera 300 is provided with an operation unit 502 for inputting various operation instructions to the system control unit 50 similar to the operation unit 70 according to the first exemplary embodiment.
  • a communication unit 506 performs wireless (or wired) communication with a remote controller 504 as with the communication unit 54 according to the first exemplary embodiment.
  • FIG. 5A illustrates the remote controller 504 .
  • FIG. 5B illustrates an example of a configuration of the remote controller 504 .
  • a touch panel 508 a on the remote controller 504 is able to detect touch operations. Similar to the touch panel 70 a according to the first exemplary embodiment, the remote controller 504 is able to detect touch operations and coordinates of touch positions. Similar to the operation unit 70 according to the first exemplary embodiment, an operation unit 508 is provided on the remote controller 504 for inputting various operation instructions to a system control unit 509 .
  • the communication unit 503 enables performing wired or wireless communication with the digital camera 300 . Touch operations and touch position coordinates detected by the touch panel 508 a can be transmitted to the digital camera 300 via the communication unit 503 (transmission control).
  • Constants and variables for operations of the system control unit 509 and programs read from a nonvolatile memory 511 are loaded into the system memory 510 (RAM).
  • the nonvolatile memory 511 is an electrically erasable recordable memory such as an EEPROM. Constants and programs for operations of the system control unit 509 are stored in the nonvolatile memory 511 .
  • the touch panel 508 a does not display a captured image captured by the imaging unit 22 of the digital camera 300 .
  • To display an image also on the remote controller 504 , a large amount of data would need to be exchanged between the digital camera 300 and the remote controller 504 . A smaller data amount therefore causes less delay in data transmission and reception.
  • the remote controller 504 transmits and receives not the absolute coordinates but the relative coordinates of the touch position detected on the touch panel 508 a of the remote controller 504 .
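The reason the relative coordinate scheme keeps the data volume small can be sketched as follows: the remote controller 504 only needs to send the movement delta of a touch, not image data or absolute screen positions. The message format and field names below are assumptions for illustration; the patent does not specify a wire format.

```python
# Illustrative sketch of a compact relative-move message from the remote
# controller. The JSON format, command string, and field names are all
# hypothetical; only the idea (send deltas, not images) is from the text.
import json

def move_message(prev, current):
    """Build a relative-move message from two consecutive touch samples."""
    return json.dumps({"cmd": "move_af",
                       "dx": current[0] - prev[0],
                       "dy": current[1] - prev[1]})

msg = move_message((120, 200), (150, 180))
print(msg)  # {"cmd": "move_af", "dx": 30, "dy": -20}
```

A message like this is a few dozen bytes, whereas streaming captured frames for an absolute-coordinate touch would require orders of magnitude more bandwidth, which is the delay argument the text makes.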
  • This processing is implemented when a program recorded in the nonvolatile memory 56 is loaded into the system memory 52 and then executed by the system control unit 50 .
  • This processing is started when power of the digital camera 300 and the remote controller 504 is turned ON, the image capturing mode is selected, and communication between the remote controller 504 and the digital camera 300 is established. More specifically, the processing is started when the touch panel 508 a of the remote controller 504 becomes ready to receive a touch operation and the digital camera 300 becomes ready to receive an instruction from the remote controller 504 and perform the AF processing in response to an instruction from the remote controller 504 .
  • the system control unit 50 may receive an instruction on the display unit 501 (touch panel 502 a ) of the digital camera 300 , as described in the first exemplary embodiment.
  • the second exemplary embodiment handles the in-finder display unit 76 according to the first exemplary embodiment and the EVF 512 according to the second exemplary embodiment in a similar way.
  • step S 600 the system control unit 50 initializes the AF coordinates B like in step S 300 illustrated in FIG. 3A and records the AF coordinates B in the system memory 52 .
  • step S 601 the system control unit 50 displays the AF frame 401 at the position of the AF coordinates B on the display unit 501 and the EVF 512 .
  • In step S 602 , the system control unit 50 performs processing for automatically focusing on the AF coordinates B recorded in step S 600 .
  • the processing in steps S 603 to S 604 is similar to the processing in steps S 305 to S 306 illustrated in FIG. 3A .
  • steps S 605 and S 606 is similar to the processing in steps S 308 and S 309 illustrated in FIG. 3A .
  • step S 607 the system control unit 50 determines whether an instruction has been received from the remote controller 504 via the communication unit 503 .
  • When the system control unit 50 determines that an instruction has been received (YES in step S 607 ), the processing proceeds to step S 608 . When no instruction has been received (NO in step S 607 ), the processing returns to step S 603 .
  • the remote controller 504 transmits an instruction to the digital camera 300 via the communication unit 503 .
  • step S 608 the system control unit 50 determines whether the instruction received from the remote controller 504 is an instruction for moving the AF position.
  • the instruction for moving the AF position is transmitted to the digital camera 300 via the communication unit 503 when the movement of a touch position (touch-move) is detected on the touch panel 508 a of the remote controller 504 .
  • When the system control unit 50 determines that the instruction for moving the AF position has been received (YES in step S 608 ), the processing proceeds to step S 609 . When the system control unit 50 determines that the instruction for moving the AF position has not been received (NO in step S 608 ), the processing proceeds to step S 620 .
  • step S 609 similar to step S 316 illustrated in FIG. 3B , the system control unit 50 changes the AF frame 401 currently being displayed at the AF coordinates B to the changing AF frame 402 indicating that the AF position is currently being changed.
  • step S 610 the system control unit 50 acquires via the communication unit 503 the coordinate moving amount C′ (moving distance and direction) of the touch-move on the remote controller 504 .
  • the system control unit 50 may receive information about the moving amount of a touch-move in step S 607 or receive the moving amount of a touch-move each time a touch-move is detected by the remote controller 504 .
  • step S 611 the system control unit 50 calculates the temporary AF coordinates P based on the AF coordinates B and the coordinate moving amount C′ received in step S 610 and displays the temporary AF frame 403 .
  • The temporary AF coordinates P are the sum of the AF coordinates B and the coordinate moving amount C′.
  • the system control unit 50 displays the temporary AF frame 403 at the position of the calculated temporary AF coordinates P.
  • step S 612 the system control unit 50 determines whether an instruction for ending AF position movement has been received.
  • the instruction for ending AF position movement is a touch-up from the touch panel 508 a of the remote controller 504 .
  • This instruction is transmitted to the digital camera 300 via the communication unit 503 upon reception of an operation for ending a touch-move, i.e., the instruction for moving the AF position received in step S 608 .
  • step S 613 the system control unit 50 determines whether a moving image is currently being captured (a moving image is currently being recorded).
  • When the system control unit 50 determines that a moving image is currently being captured (YES in step S 613 ), the processing proceeds to step S 614 . When a moving image is not currently being captured (NO in step S 613 ), the processing proceeds to step S 621 .
  • steps S 614 to S 617 is similar to the processing in steps S 322 to S 325 illustrated in FIG. 3B .
  • step S 618 the system control unit 50 determines whether an instruction has been received from the remote controller 504 via the communication unit 503 .
  • When the system control unit 50 determines that an instruction has been received (YES in step S 618 ), the processing proceeds to step S 619 . When no instruction has been received (NO in step S 618 ), the processing returns to step S 615 .
  • More specifically, the system control unit 50 waits for the next instruction from the remote controller 504 from the time when the instruction for ending AF position movement is received until the predetermined time period α has elapsed.
  • step S 620 the system control unit 50 determines whether the instruction received from the remote controller 504 in step S 607 is an instruction for determining the AF position.
  • the instruction for determining the AF position is transmitted to the digital camera 300 via the communication unit 503 in response to the detection of a tap operation on the touch panel 508 a of the remote controller 504 .
  • When the system control unit 50 determines that the instruction for determining the AF position has been received (YES in step S 620 ), the processing proceeds to step S 621 . When the system control unit 50 determines that the instruction for determining the AF position has not been received (NO in step S 620 ), the processing proceeds to step S 626 .
  • steps S 621 to S 626 is similar to the processing in steps S 334 to S 339 illustrated in FIG. 3C .
  • As described above, the AF position can be changed with favorable operability both during moving image recording and at other timings. During moving image recording, the system control unit 50 does not perform the AF processing immediately after reception of the instruction for moving the AF position but performs the AF processing at the position changed in response to the determination operation. More specifically, upon reception of the instruction for moving the AF position, the system control unit 50 performs the AF position moving operation, and upon reception of the instruction for determining the AF position (YES in step S 620), the system control unit 50 performs the AF processing at the changed position (steps S 621 to S 623).
  • This processing eliminates the need to transmit and receive captured images, and therefore involves a smaller amount of communication data than processing that successively transmits captured images to the display unit 505 of the remote controller 504 and specifies the AF position with absolute coordinates through a touch operation on the subject in each displayed captured image. Therefore, timing delays due to data communication delays hardly occur. Receiving instructions on the remote controller 504 enables the user to issue an instruction for moving the AF position at a suitable timing while observing the state of the subject displayed on the display unit 501 at a separate position.
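To illustrate the scale of the savings, a relative-move instruction can be packed into a handful of bytes, whereas streaming captured frames for absolute-coordinate selection costs kilobytes per frame. The command id and byte layout below are invented for this sketch and are not taken from the patent:

```python
import struct

CMD_RELATIVE_MOVE = 0x01  # hypothetical command id

def encode_relative_move(dx, dy):
    """Pack one relative AF-move instruction into a 9-byte frame:
    a 1-byte command id followed by two signed 32-bit pixel deltas."""
    return struct.pack(">Bii", CMD_RELATIVE_MOVE, dx, dy)

def decode_relative_move(frame):
    """Unpack a frame produced by encode_relative_move."""
    cmd, dx, dy = struct.unpack(">Bii", frame)
    if cmd != CMD_RELATIVE_MOVE:
        raise ValueError("unexpected command id")
    return dx, dy

frame = encode_relative_move(-42, 17)  # 9 bytes, versus kilobytes per streamed image
```

A fixed-size binary frame like this also makes the camera-side parsing trivial, which keeps the instruction latency low.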
  • A modification of the second exemplary embodiment likewise performs processing for instructing the digital camera to perform the AF processing through a remote control operation. The modification differs from the second exemplary embodiment in the operating entity that performs the determination according to a performed touch operation: in the modification, the remote controller 504 determines the processing corresponding to the touch operation and transmits an instruction for performing that processing to the digital camera 300, and the digital camera 300 receives a command relating to the operation detected by the remote controller 504 and performs the processing corresponding to the received command.
  • FIGS. 5A and 5B illustrate the digital camera 300 and the remote controller 504 as examples of imaging control apparatuses to which the modification of the second exemplary embodiment is applicable.
  • The configurations of the digital camera 300 and the remote controller 504 are similar to those in the second exemplary embodiment.
  • The processing according to the modification is illustrated in FIGS. 7A, 7B, 8A, and 8B. The processing illustrated in FIGS. 7A and 7B is implemented when a program recorded in the nonvolatile memory 511 is loaded into the system memory 510 and then executed by the system control unit 509.
  • FIGS. 7A and 7B illustrate the processing of the remote controller 504 .
  • FIGS. 8A and 8B illustrate the processing of the digital camera 300.
  • The AF position moving instruction processing illustrated in FIGS. 7A and 7B performed by the remote controller 504 will be described below.
  • This processing is started when the digital camera 300 and the remote controller 504 are powered on, the image capturing mode is selected, and communication between the remote controller 504 and the digital camera 300 is established. More specifically, the processing is started when the touch panel 508 a of the remote controller 504 becomes ready to receive a touch operation and the digital camera 300 becomes ready to receive an instruction from the remote controller 504 and to perform the AF processing in response to an instruction from the remote controller 504.
  • In step S 701, the system control unit 509 determines whether a touch-down on the touch panel 508 a has been performed. If a touch-down has been performed (YES in step S 701), the processing proceeds to step S 702; otherwise (NO in step S 701), the processing proceeds to step S 724.
  • The processing in steps S 702 to S 705 is similar to the processing in steps S 312 to S 315 illustrated in FIG. 3B.
  • In step S 706, the system control unit 509 transmits an AF frame changing command to the digital camera 300 via the communication unit 503.
  • The processing in steps S 707 and S 708 is similar to the processing in steps S 317 and S 318 illustrated in FIG. 3B.
  • In step S 709, the system control unit 509 transmits the relative coordinates C calculated in step S 708 to the digital camera 300 via the communication unit 503.
  • In step S 710, the system control unit 509 determines whether a touch-up from the touch panel 508 a has been performed. If a touch-up has been performed (YES in step S 710), the processing proceeds to step S 711; otherwise (NO in step S 710), the processing returns to step S 707.
  • In step S 711, the system control unit 509 acquires information about the recording status of the digital camera 300. The recording status is any one of “moving image recording”, “waiting for moving image recording”, and “still image capturing.”
  • In step S 712, the system control unit 509 determines whether the recording status of the digital camera 300 is “moving image recording” based on the information about the recording status acquired in step S 711. If the recording status is “moving image recording” (YES in step S 712), the processing proceeds to step S 713. If the system control unit 509 determines that the recording status is not “moving image recording” (i.e., “waiting for moving image recording” or “still image capturing”) (NO in step S 712), the processing proceeds to step S 723.
  • The processing in steps S 713 and S 714 is similar to the processing in steps S 322 and S 323 illustrated in FIG. 3B.
  • In step S 715, the system control unit 509 transmits a reset command to the digital camera 300 via the communication unit 503. The reset command is a command for treating the operation relating to the relative coordinates C transmitted in step S 709 as not having been performed.
  • The processing in steps S 716 to S 718 is similar to the processing in steps S 325 to S 327 illustrated in FIG. 3B.
  • The processing in steps S 719 to S 721 is similar to the processing in steps S 328 to S 330 illustrated in FIG. 3C.
  • In step S 722, the system control unit 509 transmits a reset command to the digital camera 300 via the communication unit 503, similarly to step S 715.
  • In step S 723, the system control unit 509 transmits an AF coordinates B update command (specific command) to the digital camera 300 via the communication unit 503. More specifically, the system control unit 509 instructs the digital camera 300 to update the AF coordinates B from the coordinates at the time of step S 701 to the coordinates calculated according to the relative coordinates C transmitted in step S 709. In other words, the system control unit 509 transmits a command indicating that an instruction for changing the AF coordinates has been issued by a user operation.
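On the camera side, the handling of these commands (steps S 811 to S 820 below) reduces to dispatching on four command types. A minimal sketch, with invented command names and state layout; the patent defines the flow, not this representation:

```python
class CameraAfState:
    """Sketch of the digital camera 300 side of the protocol.

    Commands mirror the flow of FIGS. 8A and 8B: AF frame changing
    (S 811/S 812), relative coordinates C (S 813/S 814), reset
    (S 815/S 816), and AF coordinates B update (S 817 onward).
    Names and string command ids are illustrative assumptions."""

    def __init__(self, af_b):
        self.af_b = af_b             # confirmed AF coordinates B
        self.temporary_p = None      # temporary AF coordinates P, if any
        self.frame_style = "normal"  # display style of the AF frame

    def handle(self, command, payload=None):
        if command == "af_frame_change":
            self.frame_style = "changing"          # show changing AF frame 402
        elif command == "relative_coords":
            xb, yb = self.af_b
            xc, yc = payload
            self.temporary_p = (xb + xc, yb + yc)  # show temporary AF frame 403 at P
        elif command == "reset":
            self.temporary_p = None                # treat the move as not performed
            self.frame_style = "normal"
        elif command == "update_af_b":
            if self.temporary_p is not None:       # commit P as the new B
                self.af_b = self.temporary_p
                self.temporary_p = None
            self.frame_style = "normal"

cam = CameraAfState(af_b=(100, 80))
cam.handle("af_frame_change")
cam.handle("relative_coords", (25, -10))
cam.handle("update_af_b")
```

Keeping B and P separate is what lets the reset command cheaply undo a move: it discards P without touching the confirmed position.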
  • The processing in step S 724 is similar to the processing in step S 339 illustrated in FIG. 3C.
  • The AF position moving instruction processing illustrated in FIGS. 8A and 8B performed by the digital camera 300 will be described below.
  • This processing is started when the digital camera 300 and the remote controller 504 are powered on, the image capturing mode is selected, and communication between the remote controller 504 and the digital camera 300 is established. More specifically, the processing is started when the touch panel 508 a of the remote controller 504 becomes ready to receive a touch operation and the digital camera 300 becomes ready to receive an instruction from the remote controller 504 and to perform the AF processing in response to an instruction from the remote controller 504.
  • This processing is implemented when a program recorded in the nonvolatile memory 56 is loaded into the system memory 52 and then executed by the system control unit 50 .
  • The processing in steps S 801 to S 804 is similar to the processing in steps S 300 to S 303 illustrated in FIG. 3A.
  • The processing in steps S 805 to S 807 is similar to the processing in steps S 304 to S 306 illustrated in FIG. 3A.
  • The processing in steps S 808 and S 809 is similar to the processing in steps S 308 and S 309 illustrated in FIG. 3A.
  • In step S 810, the system control unit 50 transmits information about the recording status to the remote controller 504 via the communication unit 503.
  • In step S 811, the system control unit 50 determines whether an AF frame changing command has been transmitted from the remote controller 504. If the command has been received (YES in step S 811), the processing proceeds to step S 812; otherwise (NO in step S 811), the processing proceeds to step S 813.
  • In step S 812, similarly to step S 316 illustrated in FIG. 3B, the system control unit 50 changes the AF frame 401 currently displayed at the AF coordinates B to the changing AF frame 402 indicating that the AF position is currently being changed.
  • In step S 813, the system control unit 50 determines whether information about the relative coordinates C has been received. If the information has been received (YES in step S 813), the processing proceeds to step S 814; otherwise (NO in step S 813), the processing proceeds to step S 815.
  • In step S 814, similarly to step S 319 illustrated in FIG. 3B, the system control unit 50 calculates the temporary AF coordinates P based on the AF coordinates B and the relative coordinates C received in step S 813, and displays the temporary AF frame 403 at the temporary AF coordinates P. The temporary AF coordinates P are obtained by adding the relative coordinates C to the AF coordinates B, i.e., ((Xb+Xc), (Yb+Yc)).
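The coordinate arithmetic of this step is component-wise addition. The sketch below also clamps the result to the display area, which is an added assumption for the sake of a self-contained example rather than something this step specifies:

```python
def temporary_af_coordinates(b, c, width, height):
    """Compute temporary AF coordinates P = ((Xb+Xc), (Yb+Yc)).

    The clamping to the display bounds is an illustrative assumption;
    the patent text only defines the addition itself."""
    xb, yb = b
    xc, yc = c
    xp = min(max(xb + xc, 0), width - 1)
    yp = min(max(yb + yc, 0), height - 1)
    return (xp, yp)

# B = (100, 80) with a relative move C = (25, -10) on a 640x480 display.
p = temporary_af_coordinates(b=(100, 80), c=(25, -10), width=640, height=480)
```

Because C is relative, repeated touch-moves simply accumulate into successive P values until a reset or update command arrives.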
  • In step S 815, the system control unit 50 determines whether a reset command has been received. If a reset command has been received (YES in step S 815), the processing proceeds to step S 816; otherwise (NO in step S 815), the processing proceeds to step S 817.
  • In step S 816, similarly to step S 324 or S 331 illustrated in FIG. 3B, the system control unit 50 changes the changing AF frame 402 having been displayed at the AF coordinates B as illustrated in FIG. 4B back to the AF frame 401 and displays the AF frame 401 at the AF coordinates B as illustrated in FIG. 4A. Then, the system control unit 50 hides the temporary AF frame 403.
  • In step S 817, the system control unit 50 determines whether an AF coordinates B update command has been received. If the command has been received (YES in step S 817), the processing proceeds to step S 818; otherwise (NO in step S 817), the processing proceeds to step S 821.
  • The processing in steps S 818 to S 820 is similar to the processing in steps S 334 to S 336 illustrated in FIG. 3C.
  • The processing in steps S 821 to S 823 is similar to the processing in steps S 337 to S 339 illustrated in FIG. 3C.
  • As described above, also in this modification, the AF position can be changed with favorable operability both during moving image recording and at other timings. During moving image recording, the system control unit 50 does not perform the AF processing immediately after reception of the instruction for moving the AF position but performs the AF processing at the position changed in response to the determination operation. More specifically, upon reception of the instruction for moving the AF position, the system control unit 50 performs the AF position moving operation, and upon reception of the AF coordinates B update command (YES in step S 817), the system control unit 50 performs the AF processing at the changed position (steps S 818 to S 820).
  • This processing eliminates the need to transmit and receive captured images, and therefore involves a smaller amount of communication data than processing that successively transmits captured images to the display unit 505 of the remote controller 504 and specifies the AF position with absolute coordinates through a touch operation on the subject in each displayed captured image. Therefore, timing delays due to data communication delays hardly occur. Receiving instructions on the remote controller 504 enables the user to issue an instruction for moving the AF position at a suitable timing while observing the state of the subject displayed on the display unit 501 at a separate position.
  • The display formats of the AF frame 401, the changing AF frame 402, and the temporary AF frame 403 are not limited to those described in the present exemplary embodiment; they need only be displayed distinguishably. More specifically, the frame color may be changed, the frame may blink, the frame line thickness may be changed, the frame may be filled with a color, or a position may be indicated with a dot or an arrow instead of a frame.
  • In the above-described exemplary embodiments, an AF position moving operation is performed with a touch-move and a determination operation is performed with a tap. However, these operations are not limited thereto and may be other touch operations or operations on buttons included in the operation unit 70. For example, the AF position moving operation may be a press of the cross key 74 or an operation for rotating the main electronic dial 71 or the sub electronic dial 73, and the determination operation may be a press of the SET button 75, a double-tap on the touch panel 70 a, or a multi-touch thereon.
  • The exemplary embodiments are not limited to the AF processing and are applicable to the setting (selection) of a position (region) where various processing is performed. For example, the above-described exemplary embodiments are applicable to the selection of a face AF frame for focusing on a specific person's face and to the selection of a position subjected to enlargement and reduction (i.e., zoom-in and zoom-out). They are also applicable to the selection of a subject, a position, or a region to be used as a reference when performing such processing as AE and WB.
  • With an absolute coordinate setting, the system control unit 50 may perform the AF processing at the touch-down coordinates O in response to a touch-down at timings other than moving image capturing, and may perform the AF processing at the touch-up coordinates (the touch coordinates immediately before a touch-up) in response to a touch-up during moving image capturing. In other words, during moving image capturing, the system control unit 50 may set the AF coordinates B through a touch-up, whereas at other timings it may set the AF coordinates B at the touch-down position. Further, the system control unit 50 may determine whether to set the AF coordinates B through a touch-down or a touch-up depending on the above-described display condition of the display unit 28 during moving image capturing, and may set the AF coordinates B through a touch-down regardless of the display condition at other timings.
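These variants boil down to a per-state choice of which touch event fixes the AF coordinates B. A sketch of that decision table, with illustrative status strings and mode names that are not taken from the patent:

```python
def af_trigger_event(recording_status, coordinate_mode="absolute"):
    """Return which touch event should set the AF coordinates B.

    With an absolute coordinate setting, AF can follow the touch-down
    outside moving image capturing, but waits for the touch-up while a
    moving image is being captured, so a stray touch does not move the
    focus position mid-recording. Status strings are illustrative."""
    if coordinate_mode != "absolute":
        return "touch-up"  # relative setting: commit only on release
    if recording_status == "moving image recording":
        return "touch-up"
    return "touch-down"

event = af_trigger_event("waiting for moving image recording")
```

A further refinement, as the text notes, could consult the display condition of the display unit 28 during moving image capturing instead of using the recording status alone.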
  • Each of the above-described flowcharts to be executed by the system control units 50 and 509 may be controlled by one piece of hardware, or a plurality of pieces of hardware may share the processing to control the entire apparatus.
  • Although the above-described exemplary embodiments have been described using the digital cameras 100 and 300 and the remote controller 504 as examples, they are also applicable to any imaging control apparatus that performs control to change the position subjected to processing based on a specified position, such as the AF processing, the AE processing, and the AWB processing. Examples of such apparatuses include a personal computer (PC), a mobile phone terminal, a portable image viewer, a digital photo frame, a music player, a game machine, an electronic book reader, a tablet PC, a smart phone, household appliances, etc.
  • The exemplary embodiments are also applicable to a smart phone, a tablet PC, a desktop PC, and the like that receive and display AF information of a digital camera or the like via wired or wireless communication and remotely control the digital camera (including a network camera).
  • Exemplary embodiments can also be implemented by performing the following processing: software (a program) for implementing the functions of the above-described exemplary embodiments is supplied to a system or apparatus via a network or various types of storage media, and a computer (or a central processing unit (CPU) or micro processing unit (MPU)) of the system or apparatus reads and executes the program. In this case, the program and the storage medium storing the program are included in the exemplary embodiments.
  • An imaging control apparatus includes a touch detection unit configured to detect a touch operation on a touch panel, a display control unit configured to control, in response to the touch detection unit detecting a moving operation of moving a touch position, a first mark to be displayed at a position on a display unit moved by an amount corresponding to the moving operation, and a control unit configured to perform control, during recording of a captured moving image, not to perform specific processing relating to image capturing even if the touch detection unit detects a touch-up operation which is releasing of the touch performed following the moving operation, and perform control, in a predetermined state other than during recording of a captured moving image, to perform the specific processing based on a position where the touch-up operation is detected, in response to the touch detection unit detecting a touch-up operation performed following the moving operation.
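The gating described above (running the specific processing on touch-up only outside moving image recording) can be sketched as follows; `perform_processing` is a hypothetical callback standing in for the specific processing such as AF:

```python
def on_touch_up(recording_movie, touch_up_position, perform_processing):
    """Gate the specific processing (e.g. AF) on the recording state.

    During recording of a captured moving image, the touch-up that
    follows a moving operation ends the move without triggering the
    processing, which instead waits for a separate determination
    operation (e.g. a tap). Outside recording, the processing runs
    based on the touch-up position."""
    if recording_movie:
        return None  # keep the moved mark; do not run the processing yet
    return perform_processing(touch_up_position)

result = on_touch_up(False, (120, 90), lambda pos: ("af", pos))
blocked = on_touch_up(True, (120, 90), lambda pos: ("af", pos))
```

This split is what prevents an accidental finger release from refocusing mid-recording while still giving one-gesture focusing at other timings.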
  • An imaging control apparatus includes a receiving unit configured to receive a command relating to a touch operation received by an external apparatus capable of receiving a touch operation, a display control unit configured to control, in response to a reception of a command relating to a moving operation of moving a touch position from the external apparatus, a first mark to be displayed at a position, on a display unit, moved by an amount corresponding to the moving operation, and a control unit configured to perform control, during recording of a captured moving image, not to perform specific processing relating to image capturing even if a command relating to a touch-up operation, which is releasing of the touch performed following the moving operation, is received, and to perform control, in a predetermined state other than during recording of a captured moving image, to perform the specific processing based on a position where the touch-up operation is detected, in response to a reception of the command relating to the touch-up operation from the external apparatus.
  • An imaging control apparatus includes a touch detection unit configured to detect a touch operation, a transmission unit configured to transmit a command to an imaging apparatus capable of executing specific processing relating to image capturing, and a transmission control unit configured to, in response to the touch detection unit detecting a moving operation of moving a touch position, control the transmission unit to transmit a command relating to a moving amount of the moving operation to the imaging apparatus, and configured to, in response to the touch detection unit detecting a touch-up operation which is releasing of the touch performed following the moving operation, control the transmission unit not to transmit a specific command for performing specific processing relating to image capturing during recording of a captured moving image in the imaging apparatus and to transmit the specific command in a predetermined state other than during recording of a captured moving image in the imaging apparatus.
  • Embodiment(s) can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
  • The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
  • The computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
  • The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.

US15/360,338 2015-11-30 2016-11-23 Imaging control apparatus and method for controlling the same Active 2036-11-29 US9979881B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015234301A JP6614943B2 (ja) 2015-11-30 2015-11-30 Imaging control apparatus and control method therefor
JP2015-234301 2015-11-30

Publications (2)

Publication Number Publication Date
US20170155825A1 US20170155825A1 (en) 2017-06-01
US9979881B2 true US9979881B2 (en) 2018-05-22

Family

ID=58778364

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/360,338 Active 2036-11-29 US9979881B2 (en) 2015-11-30 2016-11-23 Imaging control apparatus and method for controlling the same

Country Status (4)

Country Link
US (1) US9979881B2 (en)
JP (1) JP6614943B2 (ja)
KR (1) KR102020373B1 (ko)
CN (1) CN106817536B (zh)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108928334B (zh) 2017-05-23 2021-03-19 Mando Corporation Electronic brake system
JP7009096B2 (ja) * 2017-07-06 2022-01-25 Canon Inc. Electronic device and control method therefor
JP7143098B2 (ja) * 2018-03-23 2022-09-28 Canon Inc. Electronic device and control method therefor
WO2020137524A1 (ja) * 2018-12-25 2020-07-02 Canon Inc. Electronic device
JP7353811B2 (ja) * 2019-06-17 2023-10-02 Canon Inc. Electronic device and control method therefor
JP7321786B2 (ja) * 2019-06-17 2023-08-07 Canon Inc. Electronic device and control method therefor
CN114902104B (zh) * 2019-12-27 2024-06-25 Canon Inc. Electronic device, control method of electronic device, and computer-readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120146929A1 (en) * 2009-08-18 2012-06-14 Canon Kabushiki Kaisha Information processing apparatus, control method therefor, program, and recording medium
JP2012203143A (ja) 2011-03-24 2012-10-22 Olympus Imaging Corp Display device
US20130063645A1 (en) * 2011-09-09 2013-03-14 Canon Kabushiki Kaisha Imaging apparatus, control method for the same, and recording medium
US20160224179A1 (en) * 2015-02-04 2016-08-04 Canon Kabushiki Kaisha Electronic apparatus and control method of the same
US20170155826A1 (en) * 2015-11-30 2017-06-01 Canon Kabushiki Kaisha Electronic device and method for controlling the same
US20170195553A1 (en) * 2016-01-05 2017-07-06 Canon Kabushiki Kaisha Electronic apparatus and method for controlling the same
US20170195552A1 (en) * 2016-01-05 2017-07-06 Canon Kabushiki Kaisha Electronic apparatus and control method therefor

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4929630B2 (ja) * 2005-07-06 2012-05-09 Sony Corporation Imaging apparatus, control method, and program
JP2009273033A (ja) * 2008-05-09 2009-11-19 Olympus Imaging Corp Camera system, controller control method, and controller program
KR101505681B1 (ko) * 2008-09-05 2015-03-30 LG Electronics Inc. Mobile terminal having touch screen and image capturing method using the same
US8957981B2 (en) * 2010-03-03 2015-02-17 Intellectual Ventures Fund 83 Llc Imaging device for capturing self-portrait images
JP5613005B2 (ja) * 2010-10-18 2014-10-22 Olympus Imaging Corp Camera
JP2012147100A (ja) * 2011-01-07 2012-08-02 Nikon Corp Imaging apparatus
JP5532033B2 (ja) * 2011-09-16 2014-06-25 Casio Computer Co., Ltd. Imaging apparatus, imaging method, and program
JP5863418B2 (ja) 2011-11-28 2016-02-16 Canon Inc. Imaging apparatus and control method therefor
JP5652408B2 (ja) * 2012-01-16 2015-01-14 Casio Computer Co., Ltd. Imaging apparatus, imaging method, and program
JP5940394B2 (ja) * 2012-07-02 2016-06-29 Olympus Corporation Imaging apparatus
US20140320698A1 (en) * 2013-04-29 2014-10-30 Microsoft Corporation Systems and methods for capturing photo sequences with a camera
JP6231869B2 (ja) * 2013-12-12 2017-11-15 Canon Inc. Imaging apparatus, communication device, control methods therefor, and program
CN104580907B (zh) * 2014-12-31 2018-11-23 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Anti-shake photographing method and device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10560625B2 (en) * 2014-12-24 2020-02-11 Casio Computer Co., Ltd. Image shooting apparatus for setting image shooting condition easily and method thereof
US20180234621A1 (en) * 2017-02-10 2018-08-16 Canon Kabushiki Kaisha Image capturing apparatus and control method thereof
US10715719B2 (en) * 2017-02-10 2020-07-14 Canon Kabushiki Kaisha Image capturing apparatus and control method thereof

Also Published As

Publication number Publication date
KR20170063368A (ko) 2017-06-08
KR102020373B1 (ko) 2019-09-10
CN106817536B (zh) 2019-07-26
JP6614943B2 (ja) 2019-12-04
JP2017103566A (ja) 2017-06-08
US20170155825A1 (en) 2017-06-01
CN106817536A (zh) 2017-06-09

Similar Documents

Publication Publication Date Title
US9979881B2 (en) Imaging control apparatus and method for controlling the same
JP5306266B2 (ja) Imaging apparatus and control method therefor
JP6039328B2 (ja) Imaging control apparatus and method for controlling imaging apparatus
US10212335B2 (en) Electronic apparatus and control method therefor
US10911663B2 (en) Electronic apparatus and method for controlling the same
US11039073B2 (en) Electronic apparatus and method for controlling the same
JP6590666B2 (ja) Electronic device and control method therefor
US20180234621A1 (en) Image capturing apparatus and control method thereof
US10313580B2 (en) Electronic apparatus, control method therefor, and storage medium
US9325902B2 (en) Image capturing apparatus and control method therefor
US10750081B2 (en) Electronic apparatus and method for selecting an organ from an image
US10527911B2 (en) Electronic apparatus configured to select positions on a display unit by touch operation and control method thereof
US10419678B2 (en) Imaging control apparatus and method for controlling the same
JP5575290B2 (ja) Imaging apparatus and control method therefor
JP6611575B2 (ja) Imaging control apparatus and control method therefor
CN112040095A (zh) Electronic apparatus, control method of electronic apparatus, and storage medium
US20170180633A1 (en) Electronic apparatus and control method thereof
JP6855317B2 (ja) Imaging apparatus, control method of imaging apparatus, program, and recording medium
JP6878189B2 (ja) Imaging control apparatus
JP2021141462A (ja) Electronic device, control method therefor, program, and storage medium
JP2021012238A (ja) Electronic device and control method therefor, program, and storage medium

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YOSHIKAWA, AKIO;REEL/FRAME:041174/0745

Effective date: 20161107

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4