WO2017164140A1 - Electronic device and method for controlling same - Google Patents

Electronic device and method for controlling same Download PDF

Info

Publication number
WO2017164140A1
WO2017164140A1 PCT/JP2017/011080 JP2017011080W WO2017164140A1 WO 2017164140 A1 WO2017164140 A1 WO 2017164140A1 JP 2017011080 W JP2017011080 W JP 2017011080W WO 2017164140 A1 WO2017164140 A1 WO 2017164140A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
detected
display
control unit
electronic device
Prior art date
Application number
PCT/JP2017/011080
Other languages
French (fr)
Japanese (ja)
Inventor
知宏 太田
Original Assignee
キヤノン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2017023475A external-priority patent/JP6788518B2/en
Application filed by キヤノン株式会社 filed Critical キヤノン株式会社
Priority to CN201780019550.XA priority Critical patent/CN109076156B/en
Publication of WO2017164140A1 publication Critical patent/WO2017164140A1/en
Priority to US16/131,135 priority patent/US10649313B2/en

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00Special procedures for taking photographs; Apparatus therefor
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/18Signals indicating condition of a camera member or suitability of light
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/38Releasing-devices separate from shutter

Definitions

  • the present invention relates to an electronic device and a control method thereof.
  • a digital camera that includes a touch display that displays a live view image and can specify a focus detection area (AF frame) and an area (AE area) that should be properly exposed by automatic exposure control (AE) as a touch position of the touch display.
  • AF frame focus detection area
  • AE area area that should be properly exposed by automatic exposure control
  • the number of pixels of many digital cameras is over 16 to 20 million pixels, while the resolution of the touch display is about 400,000 to 1.2 million pixels and the size is only about 3 type.
  • Some smartphones and tablet terminals have a higher resolution and larger touch display, but are limited to 4 million pixels and 10 type at most.
  • the captured image displayed on the touch display is considerably reduced, it is not easy to accurately specify the intended position by the touch operation.
  • the AF frame or AE area is specified at the time of shooting, it is necessary to touch the screen by holding the electronic device with one hand while using the finger of the other hand or extending the finger of the hand holding the device. Therefore, it is more difficult to specify the intended position accurately.
  • the present invention has been made in view of the problems of the prior art, and an object of the present invention is to provide an electronic device that realizes both convenience of touch operation and accurate position designation and a control method thereof.
  • the above-described object is to set the first operation mode when an area corresponding to a predetermined condition is detected from the image, and to set the first operation mode when the area is not detected or when the area is not detected.
  • a setting means for setting the operation mode 2 a touch detection means for detecting an operation on the touch detection surface, and a control means.
  • the touch detection means touches
  • the process corresponding to the touch position moving operation is performed instead of the process corresponding to the position where the touch is detected by the touch detection means. It is achieved by an electronic device characterized by having control means for performing control so as to execute processing.
  • the accompanying drawings are included in the specification, constitute a part thereof, show an embodiment of the present invention, and are used to explain the principle of the present invention together with the description.
  • the perspective view which shows the external appearance example of the digital camera which concerns on embodiment of this invention The perspective view which shows the external appearance example of the digital camera which concerns on embodiment of this invention
  • the perspective view which shows the external appearance example of the remote control which concerns on embodiment 1 is a block diagram showing a functional configuration example of a digital camera according to an embodiment.
  • FIG. 1 is a block diagram showing an example of a functional configuration of a remote control according to an embodiment
  • the flowchart regarding the operation of the digital camera according to the embodiment The flowchart regarding the operation of the digital camera according to the embodiment.
  • the flowchart regarding the operation of the digital camera according to the embodiment The flowchart regarding the operation of the digital camera according to the embodiment
  • the flowchart regarding the operation of the digital camera according to the embodiment The flowchart regarding the operation of the digital camera according to the embodiment
  • the figure which shows the example of a screen display in the remote control which concerns on embodiment The figure which shows the example of a screen display in the remote control which concerns on embodiment
  • the figure which shows the example of a screen display in the remote control which concerns on embodiment The figure which shows the example of a screen display in the remote control which concerns on embodiment
  • the figure which shows the example of a screen display in the remote control which concerns on embodiment The figure which shows the example of a screen display
  • ⁇ (Digital camera 100) 1A and 1B are perspective views of an external appearance of a single-lens reflex digital camera 100 (hereinafter simply referred to as a camera 100), which is an example of an electronic apparatus according to the present embodiment, viewed from above the front and below the back, respectively. .
  • the display unit 128 provided on the back of the camera 100 is a touch display.
  • the display unit 128 is used to display various information such as a captured image and a setting value of the camera 100.
  • the display unit 128 also functions as an operation unit that receives a touch operation.
  • a touch sensor that does not have a display function such as a touch pad may be used instead of the display unit 128, or a device other than the display unit 128 may be used.
  • a touch sensor may be arranged at the position.
  • the viewfinder may be optical or electronic.
  • the operation on the touch display is replaced by operating the touch sensor while looking through the viewfinder (while visually recognizing the display unit visible through the eyepiece of the viewfinder). Can do.
  • the shutter button 161 is an operation unit for inputting a shooting preparation instruction and a shooting start instruction.
  • the mode switch 160 is an operation unit for switching the shooting mode of the camera 100 and the like.
  • the terminal cover 191 protects a connector of a cable used for communication with an external device. Dial 171 is used to change setting values such as shutter speed and aperture.
  • a power switch 172 switches the power of the camera 100 on and off.
  • the dial 173 is used for moving a frame to be selected and for sending / returning a displayed image.
  • the dial 173 can be pressed in the vertical and horizontal directions, functions as a cross key 174, and is provided with a SET button 175 on the rotating shaft portion.
  • the cross key 174 is used for moving the cursor, and the SET button 175 is used for determining a selection item, inputting a confirmation response, and the like.
  • the live view (LV) button 176 is used to switch on and off live view display on the display unit 128 in the still image shooting mode of the camera 100.
  • the LV button 176 is used to start / stop moving image shooting (recording) in the moving image shooting mode of the camera 100.
  • the enlarge button 177 is used to switch the enlargement mode of live view display on and off, and to change the enlargement ratio.
  • the enlarge button 177 is used to increase the display magnification of the playback image in the playback mode of the camera 100.
  • the playback button 178 is used for switching between the shooting mode and the playback mode of the camera 100. When the playback button 178 is pressed in the shooting mode, the camera 100 shifts to the playback mode and displays the latest image among the images recorded on the recording medium on the display unit 128.
  • the lid 120 that can be opened and closed protects the storage portion of the recording medium and the stored recording medium from dust and the like.
  • the grip 190 is formed in a shape that allows the user to easily hold the camera 100 with one hand, and a battery is accommodated therein.
  • the mirror 112 is moved up and down by an actuator controlled by the system control unit 150 (FIG. 3A) to switch the traveling direction of the light beam from the lens unit.
  • the terminal 110 is in contact with a terminal provided on a lens unit to which the camera 100 can be attached and detached, and is used for supplying power from the camera 100 to the lens unit and for the camera 100 to communicate with the lens unit.
  • the eyepiece 116 is an optical viewfinder eyepiece. By looking into the eyepiece 116, the user can observe the subject image formed by the lens unit and the set values such as the shutter speed, aperture value, photographing sensitivity, and AF frame.
  • FIG. 2 is a perspective view showing an external appearance example of a remote controller 200 (hereinafter simply referred to as a remote controller 200) of the camera 100.
  • the remote controller 200 enables remote operation of the camera 100 by communicating with the camera 100.
  • the remote controller 200 and the camera 100 can communicate with each other through a wired medium or a wireless medium, but in this embodiment, wired communication using the connection cable 291 is performed.
  • the connector of the connection cable 291 is connected to the connector in the terminal cover 191 of the camera 100.
  • the camera 100 and the remote controller 200 can perform two-way communication with a connection cable 291.
  • the power switch 272 switches the power of the remote control 200 on and off.
  • the display unit 228 is a touch display. Although the display unit 228 has the same aspect ratio as the display unit 128 of the camera 100, in this embodiment, the camera 100 does not transmit the live view image to the remote controller 200, and the live view image is not displayed on the display unit 228. .
  • the function of the operation buttons 271 arranged three by three above and below the display unit 228 changes according to the display content of the display unit 228.
  • the cross key 274 has keys corresponding to up, down, left, and right, and a SET button 275 is disposed at the center of the cross key 274.
  • the cross key 274 is mainly used for inputting an instruction for moving a cursor, and the SET key 275 is mainly used for inputting an instruction for executing or determining a selected item.
  • the HOME button 276 is used to instruct a transition to the HOME screen of the remote controller 200.
  • the MENU button 277 is used for instructing the transition of the remote control 200 to the MENU screen.
  • the BACK button 278 is used to instruct a cancel instruction or a transition to the previous screen.
  • Various key buttons 271, 273 to 278 other than the power switch 272 are included in the operation unit 270 (FIG. 3B) of the remote controller 200.
  • FIG. 3A is a block diagram showing an example of the functional configuration of the camera 100
  • FIG. 3B is a block diagram showing an example of the functional configuration of the remote controller 200.
  • the lens unit 140 (hereinafter simply referred to as the lens 140) is an interchangeable lens that can be attached to and detached from the camera 100.
  • the lens 140 includes a lens group 103 including movable lenses such as a focus lens and a variable power lens.
  • the terminal 146 contacts the terminal 110 when the lens 140 is attached to the camera 100, and enables bidirectional communication between the lens 140 and the camera 100, power supply from the camera 100 to the lens 140, and the like.
  • the lens control unit 144 is a programmable processor such as a CPU or MPU, and controls the operation of the lens 140 by reading the program stored in the nonvolatile memory 145 into the system memory 147 and executing it.
  • the lens control unit 144 controls the aperture mechanism 102 via the aperture drive circuit 142 through communication with the system control unit 150 via the terminals 146 and 110, and controls the movable lens of the lens group 103 via the AF drive circuit 143. Control the position.
  • the AF / MF switch (SW) 148 is a switch for switching whether to use the autofocus function of the lens 140, and the setting value is notified from the lens control unit 144 to the system control unit 150. There is also a lens 140 that does not have the AF / MF switch 148.
  • the focusing screen 113, the pentaprism 114, and the eyepiece lens 116 constitute an optical viewfinder.
  • the user can confirm the photographing range, the in-focus position, and the like.
  • the AE sensor 117 provided in the optical viewfinder measures the luminance of the subject image formed by the lens 140.
  • the focus detection unit 111 has an AF sensor that divides the lens 140 into pupils, and outputs defocus information (direction and amount) to the system control unit 150.
  • the system control unit 150 notifies the lens control unit 144 of the movement amount and movement direction of the focus lens based on the defocus information.
  • the lens control unit 144 moves the focus lens 140 via the notified direction and amount and the AF driving circuit 143.
  • the quick return mirror 112 (hereinafter simply referred to as the mirror 112) switches the traveling direction of the light beam incident from the lens 140 between the optical finder direction and the imaging unit 122 direction.
  • FIG. 3A shows a state in which the mirror 112 is down, and the light beam travels in the direction of the optical viewfinder.
  • the mirror 112 When exposing the imaging unit 122 (during still image shooting, live view shooting, and recording moving image shooting), the mirror 112 is raised, and the light beam travels in the direction of the imaging unit 122.
  • the mirror 112 has a half mirror region so that the focus detection unit 111 can receive a part of the light beam in a state where the mirror 112 is down.
  • the mechanical shutter 101 opens and closes under the control of the system control unit 150 to expose the imaging unit 122 for a predetermined time. Note that the mechanical shutter 101 is kept open during a period in which the electronic shutter is used, such as during moving image shooting.
  • the imaging unit 122 includes a plurality of pixels and includes an imaging element that converts an optical image into an electrical signal by photoelectric conversion.
  • the A / D converter 123 converts the analog signal generated by the imaging unit 122 into digital data.
  • the image processing unit 124 applies predetermined image processing to the data from the A / D converter 123 or the memory control unit 115.
  • image processing include, but are not limited to, demosaic, white balance adjustment, gradation correction, scaling (resizing), color conversion, encoding, decoding, and area detection.
  • the area detection is a process for detecting an area that satisfies a predetermined condition, such as an area having a specific pattern or feature (for example, a face of a person or an animal, or an area having a specific color or brightness) from an image. Also called subject detection.
  • the image processing unit 124 can also perform recognition processing of a person and detection of facial expressions, eyebrows, red eyes, and the like. In the following, it is assumed that the image processing unit 124 executes face detection processing for detecting a human face as an example of subject detection.
  • the image processing unit 124 can generate control information from an image (captured image) acquired by shooting. For example, when contrast AF is performed, the image processing unit 124 can generate an AF evaluation value in the focus detection area and provide it to the system control unit 150. The image processing unit 124 may generate a brightness evaluation value or a brightness histogram. Further, the image processing unit 124 can detect a destination of a predetermined image region in a sense of a plurality of images (for example, a plurality of moving image frames) acquired by time-series shooting. The detection of the movement direction and the movement amount of the specific area between the images can be performed using a known method as a motion vector detection process using template matching, for example, but is not limited thereto. For example, by setting a focus detection area (AF frame) as a template, the AF frame can be tracked, and tracking AF described later can be realized.
  • AF frame focus detection area
  • Digital data output from the A / D converter 123 is written into the memory 132 through at least the memory control unit 115 of the image processing unit 124 and the memory control unit 115.
  • the memory 132 stores image data read from the recording medium 185 in addition to the digital data output from the A / D converter 123.
  • the memory 132 has a storage capacity for functioning as a buffer memory during continuous shooting.
  • the memory 132 also serves as a video memory, and stores images to be displayed on the display unit 128 and image (OSD) data representing various information such as camera setting values.
  • the D / A converter 125 converts the display image data (captured image and OSD composite image) stored in the memory 132 into an analog signal and supplies the analog signal to the display unit 128, so that the display image is displayed. Displayed in the section 128.
  • the display unit 128 is, for example, a touch display provided with a touch panel 128a on the screen.
  • a capacitive touch panel can be used.
  • the touch panel 128a is described as a separate configuration from the display unit 128 for convenience, but the function of the touch panel may be incorporated in the display unit 128 like an in-cell touch display.
  • the display unit 128 can function as an EVF.
  • An image displayed for causing the display unit 128 to function as an EVF is referred to as a live view image.
  • the nonvolatile memory 156 is a memory that can be electrically erased and recorded, such as an EEPROM.
  • the nonvolatile memory 156 stores a program to be executed by the system control unit 150, various setting values, GUI data, and the like.
  • the system control unit 150 is a programmable processor such as a CPU or MPU, and controls the operation of the camera 100 by reading the program stored in the nonvolatile memory 156 into the system memory 152 and executing it.
  • the system control unit 150 also performs display control by controlling the memory 132, the D / A converter 125, and the display unit 128.
  • the system control unit 150 performs automatic exposure control (AE) processing based on luminance information obtained from the AE sensor 117 and automatic focus detection (AF) processing based on defocus information obtained from the focus detection unit 111. Further, the system control unit 150 uses information obtained by the image processing unit 124 for AE processing and AF processing as necessary. For example, the system control unit 150 performs an AF process in which a focus detection area (AF frame) is set on a person's face using the detection result of the face area in the image processing unit 124, or the person's face is properly exposed. Thus, AE processing and flash light control can be performed.
  • AE automatic exposure control
  • AF automatic focus detection
  • the system control unit 150 can also perform AF (tracking AF) following the movement of the face by causing the image processing unit 124 to track the area corresponding to the specific face. Further, when executing contrast AF, the system control unit 150 can perform imaging while changing the position of the focus lens, and can search for a focus lens position where the AF evaluation value obtained by the image processing unit 124 is maximized.
  • AF tracking AF
  • the system timer 153 measures the time used for various controls and the time of the built-in clock.
  • the mode switch 160 switches the operation mode of the camera 100 to one of a still image recording mode, a moving image recording mode, a reproduction mode, and the like.
  • Still image recording modes include auto shooting mode, auto scene discrimination mode, manual mode, aperture priority mode (Av mode), shutter speed priority mode (Tv mode), scene mode, program AE mode, custom mode, and the like.
  • the scene mode is a mode in which parameters are automatically set according to the type of shooting scene, such as sports, snow scenes, and fireworks. At least one of these modes may be selectable by the changeover switch 160. Alternatively, after selecting the still image shooting mode with the mode switch 160, any of a plurality of modes included in the still image shooting mode may be selected using another operation member.
  • the video shooting mode may also include a plurality of modes.
  • the first shutter switch 162 is turned on when the shutter button 161 is half-pressed to generate the first shutter switch signal SW1.
  • the system control unit 150 starts operations such as AF processing and AE processing using the first shutter switch signal SW1 as an instruction for shooting preparation.
  • the second shutter switch 164 is turned on when the shutter button 161 is fully pressed, and generates a second shutter switch signal SW2.
  • the system control unit 150 uses the second shutter switch signal SW2 as an instruction to start shooting, and performs a series of shooting and recording processes from exposure and reading of the imaging unit 122 to generation of image data and writing of image data to the recording medium 185. Start.
  • the operation unit 170 collectively shows the operation members (the dial 171 to the enlargement button 177) shown in FIGS. 1A to 1B.
  • the functions of these operation members may be dynamically assigned according to the display content of the display unit 128, for example. For example, functions such as a menu button, an end button, a return button, an image advance button, a jump button, a narrowing button, and an attribute change button may be assigned.
  • the system control unit 150 causes the display unit 128 to display a menu screen for the user to perform various settings.
  • the user can input various settings and instructions from the menu screen by combining the operations of the cross key 174 and the SET button 175 or by operating the touch panel 128a.
  • the user can input various settings and instructions from the menu screen by combining the operations of the cross key 174 and the SET button 175 or by operating the touch panel 128a.
  • an operation by operating a button or key included in the operation unit 170 can be set in advance and stored in the nonvolatile memory 156. This setting may be changeable by the user.
  • the power supply control unit 180 includes a battery detection circuit, a DC-DC converter, a switch circuit that switches a block to be energized, and the like, and detects whether or not a power supply is installed, the type of power supply, and the remaining battery level. Further, the power supply control unit 180 controls the DC-DC converter based on these detection results and the control of the system control unit 150, and supplies a necessary voltage to each unit including the recording medium 185 for a necessary period.
  • the power supply unit 130 may be a primary battery, a secondary battery, or an AC adapter.
  • the recording medium I / F 118 is an interface with the recording medium 185.
  • the recording medium 185 may be a memory card or a hard disk. There may be a detachable recording medium and a fixed recording medium.
  • the external I / F 193 is a communication interface with an external device, and has a connector for connecting a cable.
  • a USB interface for connecting a personal computer or the like, an HDMI (registered trademark) for connecting an external monitor, an interface for the remote control 200, and a corresponding connector are provided.
  • the communication I / F 192 is a communication interface with an external device, and includes an antenna for wireless communication, a modulation / demodulation circuit, and the like.
  • the communication I / F 192 includes, for example, an antenna and a circuit for performing wireless communication in accordance with a standard such as a wireless LAN or Bluetooth (registered trademark).
  • the system control unit 150 can perform automatic focus detection (AF) by setting a focus detection area (AF frame) at a fixed position, a subject area, or an arbitrary position designated by the user. It is.
  • the fixed position may be, for example, the center of the screen.
  • the subject area may be a face area detected by the image processing unit 124 from the live view image, for example.
  • the arbitrary position designated by the user may be, for example, a position in the image designated by a combination of the cross key 174 and the SET button 175 or a touch operation of the touch panel 128a.
  • touch AF setting AF frames at positions designated by touch operations and executing AF.
  • the system control unit 150 serving as a touch detection unit can detect the following operation or state on the touch detection surface (display surface of the display unit 128) of the touch panel 128a.
  • -A finger or pen that has not touched the touch panel 128a touches the touch panel 128a anew. That is, the start of touch (hereinafter referred to as touch-down).
  • the touch panel 128a is being touched with a finger or a pen (hereinafter referred to as touch-on).
  • touch panel 128a is moved while being touched with a finger or a pen (hereinafter referred to as touch-move).
  • touch-up A state in which nothing touches the touch panel 128a (hereinafter referred to as touch-off).
  • touch-on When touchdown is detected, it is also detected that touch-on is performed at the same time. After touch-down, unless touch-up is detected, normally touch-on continues to be detected. The touch move is detected in a state where touch-on is detected. Even if the touch-on is detected, the touch move is not detected unless the touch position is moved. After it is detected that all fingers or pens that have been touched are touched up, the touch is turned off.
  • These operations / states and the position coordinates where the finger or pen touches the touch panel 128a are notified to the system control unit 150 through the internal bus.
  • the system control unit 150 determines what operation (touch operation) has been performed on the touch panel 128a based on the notified information.
  • the moving direction of the finger or pen moving on the touch panel 128a can also be determined for each vertical component / horizontal component on the touch panel 128a based on the change of the position coordinates.
  • a slide operation drag operation
  • An operation in which a finger is touched on the touch panel 128a, quickly moved by a certain distance, and then released is called a flick.
  • the flick is an operation of quickly tracing the touch panel 128a with a finger.
  • the touch panel 128a may be any of various types of touch panels such as a resistive film method, a capacitance method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, an image recognition method, and an optical sensor method. good.
  • a method for detecting that there is a touch due to contact with the touch panel there are a method for detecting that there is a touch due to the approach of a finger or pen to the touch panel, but either method may be used.
  • the nonvolatile memory 256 is an electrically erasable / recordable memory such as an EEPROM.
  • the nonvolatile memory 256 stores a program to be executed by the remote control unit 250, various setting values, GUI data, and the like.
  • the remote controller control unit 250 is a programmable processor such as a CPU or MPU, and controls the operation of the remote controller 200 by reading a program stored in the nonvolatile memory 256 into the system memory 252 and executing it.
  • the memory 232 is used as a video memory of the display unit 228, and the remote control unit 250 stores display image data.
  • the display image data stored in the memory 232 is converted into an analog signal by the D / A converter 213 and supplied to the display unit 228, whereby the display image is displayed on the display unit 228.
  • the remote control unit 250 also performs display control by controlling the memory 232, the D / A converter 213, and the display unit 228.
  • the display unit 228 is a touch display provided with a touch panel 228a on a screen. Similar to the touch panel 128a of the camera 100, the method of the touch panel 228a is not limited. Further, the function of the touch panel may be incorporated in the display portion 228.
  • the system timer 253 measures the time used for various controls and the time of the built-in clock.
  • the operation unit 270 collectively shows various key buttons 271, 273 to 278 other than the power switch 272 in FIG.
  • the function of the operation button 271 may be dynamically assigned, and the name of the currently assigned function may be displayed on the display unit 228.
  • Examples of assigned functions include, but are not limited to, an end button, a return button, an image advance button, a jump button, a narrowing button, and an attribute change button.
  • the remote control unit 250 When the HOME button 276 is pressed, the remote control unit 250 reads the GUI data from the nonvolatile memory 256 and displays a predetermined HOME screen on the display unit 228. Similarly, when the MENU button 277 is pressed, the remote control control unit 250 reads the GUI data from the nonvolatile memory 256 and causes the display unit 228 to display a predetermined MENU screen.
  • the remote control unit 250 performs display control corresponding to the screen, such as moving the focus (highlighting of the selected item) on the GUI screen according to the operation of the cross key 274.
  • the remote controller control unit 250 executes processing corresponding to the item selected at that time, or changes the screen.
  • an operation by operating a button or key included in the operation unit 270 can be set in advance and stored in the nonvolatile memory 256. This setting may be changeable by the user.
  • the power supply control unit 280 includes a battery detection circuit, a DC-DC converter, a switch circuit that switches a block to be energized, and the like, and detects whether or not a power supply is installed, the type of power supply, and the remaining battery level. Further, the power supply control unit 280 controls the DC-DC converter based on the detection result and the control of the system control unit 150, and supplies a necessary voltage to each unit for a necessary period.
  • the power supply unit 230 may be a primary battery, a secondary battery, or an AC adapter.
  • the external I / F 293 is a communication interface with the camera 100, and a cable 291 is connected thereto. Communication with the camera 100 is performed under the control of the remote control unit 250 via the external I / F 293.
  • the communication I / F 292 is a communication interface with an external device, and includes an antenna for wireless communication, a modulation / demodulation circuit, and the like.
  • the communication I / F 292 includes an antenna and a circuit for performing wireless communication conforming to a standard such as a wireless LAN or Blurtooth (registered trademark), for example.
  • the remote controller 250 can detect various touch operations by detecting the presence or absence of a touch on the touch detection surface of the touch panel 228a (the display surface of the display unit 228) and the temporal change (speed) of the touched position.
  • the same touch operation as that of the system control unit 150 described above can be recognized.
  • input from the remote control 200 may be input from the operation unit 170 of the camera 100 (including the touch panel 128a and the AF / MF switch 148 of the lens 140). Further, the output process to the remote controller 200 may not be performed when the remote controller 200 is not connected or when the remote controller 200 is invalid.
  • step S4001 the system control unit 150 determines whether an AF mode change request has been received from the remote controller 200 (or the operation unit 170) via the external I / F 193. If it is determined that the AF mode change request has been received, the processing is received in step S4002. If it is not determined that the process has been performed, the process advances to step S4003.
  • the “AF mode” means an operating state of autofocus (AF).
  • the “AF mode change request” means a request that affects the AF operation state, for example, a parameter value change request.
  • setting of whether to enable face detection, subject tracking target setting, tracking start, tracking end instruction, manual focus (MF) and autofocus switching instruction are set in the AF mode. Although illustrated as a change request, it is not limited to these.
  • the user when face detection is effective, the user can select a subject to be focused. Also, the user can select the position of the area (AF frame) where focus detection is performed. In addition, tracking AF that performs AF while tracking the main subject selected by the user (the subject at the position of the face area or the AF frame) is possible, and the user can start and end tracking AF or change the tracking target area. It can be carried out.
  • the system control unit 150 changes the AF mode in accordance with the AF mode change request.
  • the system control unit 150 can change the AF mode by changing a parameter value stored in the system memory 152 and corresponding to the current AF mode.
  • the system control unit 150 refers to the system memory 152 to determine whether or not the current AF mode is the tracking standby state. If it is determined that the tracking standby state is determined, the process proceeds to S4016. The process proceeds to S4004. In step S4004, the system control unit 150 refers to the system memory 152 to determine whether or not the current AF mode is in the tracking state. If it is determined that the tracking state is in progress, the process proceeds to step S4014 in FIG. If not, the process proceeds to S4005.
  • step S4005 the system control unit 150 refers to the system memory 152 to determine whether or not the current AF mode is valid for face detection. If it is determined that face detection is valid, the process proceeds to step S4006. The process proceeds to S4007 in FIG. 4AB. In S4006, the system control unit 150 determines whether or not a face is detected in the image frame captured by the imaging unit 122. If it is determined that the face is detected, the system control unit 150 determines that the face is detected (S4008). For example, the process proceeds to S4011 in FIG.
  • step S4007 the system control unit 150 refers to the system memory 152 to determine whether the current AF mode is AF or MF. If it is determined to be AF, the process proceeds to step S4011 in FIG. 4AB. If it is determined to be MF, the system control unit 150 in FIG. The process proceeds to S4013.
  • step S4008 the system control unit 150 determines whether the relative position timer realized by the system timer 153 is operating.
  • the relative position timer is an example of a unit that measures a certain time. Measurement (counting) starts in S4204, which will be described later, and stops when a certain time elapses (when the count reaches a predetermined value (eg, 0)). If it is determined that the relative position timer is in operation, the system control unit 150 proceeds to S4011, and if it is not determined to be in operation, the system control unit 150 proceeds to S4009.
  • step S4009 the system control unit 150 notifies the remote control 200 via the external I / F 193 of the display screen type (area selection screen) and detected individual face information (for example, identification information, position, and size). .
  • the system control unit 150 determines one of the plurality of faces as a main face according to a predetermined condition, and notifies the remote controller 200 which face is the main face.
  • the main face selection condition is not particularly limited, but for example, one or more of the conditions such as the face closest to the center of the screen, the largest face, and the face closest to the distance can be used. Note that if the face designated by the user as the main face is detected, the system control unit 150 determines that face as the main face. Instead of notifying the type of display screen, the display screen data itself may be transmitted.
  • FIG. 5A shows an example of an area selection screen that the remote control unit 250 displays on the display unit 228 in response to the notification in S4009.
  • Reference numeral 5001a denotes a guide display indicating that one of the devices included in the operation unit 270 is assigned the function of instructing the transition to the tracking standby state.
  • Reference numeral 5002a denotes a guide display indicating that an AF / MF switching instruction function is assigned to one of the devices included in the operation unit 270.
  • Reference numeral 5003a denotes a display indicating the current focus mode of the camera 100. In the example of FIG. 5A, the current focus mode is AF.
  • Reference numeral 5004 denotes a guide display indicating that the face detection valid / invalid switching function is assigned to one of the devices included in the operation unit 270.
  • Reference numeral 5005a denotes a display indicating whether the current face detection function of the camera 100 is valid or invalid. In the example of FIG. 5A, the face detection function is currently valid.
  • Reference numerals 6003, 6004a, and 6004b are face frames displayed by the remote control unit 250 based on the notified face information, and are examples of mark display that is a guide for the position and size of the detected face.
  • three faces are detected, and the display of the face frame 6003 corresponding to the main face is different from the other face frames 6004a and 6004b.
  • the remote control unit 250 displays the face frame by converting the position and size of the face area indicated by the image coordinates in the face information according to the resolution of the display unit 228.
  • the input devices included in the operation unit 270 there is no particular limitation on devices corresponding to the guide displays 5001a to 5003a displayed on the edge of the display unit 228. Further, the guide displays 5001a to 5003a may be configured to be touched as they are using the touch panel 228a.
  • the subject (face) selection, the transition instruction to the tracking standby state, the AF / MF switching, and the face detection valid / invalid switching can be performed from the remote controller 200.
  • the live view image is not displayed on the display unit 228 of the remote controller 200, but the user can easily designate a desired face from the remote controller 200 by displaying a face frame based on the face information.
  • the remote control unit 250 may display an area selection screen that does not display a face frame, as shown in FIG. 5F.
  • a guide display 5006a for informing the user of the function and operation method assigned to the touch panel 228a can be performed.
  • the system control unit 150 notifies the remote controller 200 of the type of display screen (AF frame position change screen) via the external I / F 193. Instead of notifying the type of display screen, the display screen data itself may be transmitted.
  • FIG. 5B shows an example of an AF frame position change screen displayed on the display unit 228 by the remote control unit 250 in response to the notification in S4011. This screen indicates that the camera 100 is in a state of accepting an AF frame moving operation.
  • the same display elements as those in FIG. 5A are given the same numbers, and descriptions thereof are omitted.
  • Reference numeral 5006b denotes a guide display indicating that the AF frame can be moved from the current position to a position corresponding to the direction and distance of the drag operation by dragging the touch panel 228a.
  • the relative position operation is enabled instead of the absolute position operation of the touch panel 228a (operation direction and amount information is enabled).
  • the system control unit 150 performs a relative position operation process, which will be described later with reference to FIG. 4BB, and returns the process to S4003.
  • the system control unit 150 notifies the remote controller 200 of the type of display screen (touch operation invalid screen) via the external I / F 193, and the process proceeds to S4015.
  • the display screen data itself may be transmitted.
  • FIG. 5C shows an example of a touch operation invalidation screen that the remote control unit 250 displays on the display unit 228 in response to the notification in S4013.
  • This screen indicates that the camera 100 is in a state where it cannot accept the operation of the AF frame and the face frame.
  • the same display elements as those in FIG. 5A are given the same numbers, and descriptions thereof are omitted.
  • the guide display 5001b is displayed in a color different from the valid state shown in FIG. 5A, for example.
  • 5003b indicates that the current focus mode is MF.
  • 5005b indicates that the current face detection function is invalid.
  • Reference numeral 5007c denotes a guide display indicating that an operation on the touch panel 228a (except for operations on the guide displays 5002a and 5004) is invalid.
  • the system control unit 150 notifies the remote controller 200 of the display screen type (tracking screen) via the external I / F 193, and the process proceeds to S4015.
  • the display screen data itself may be transmitted.
  • FIG. 5D shows an example of a tracking screen that the remote control unit 250 displays on the display unit 228 in response to the notification in S4014.
  • the tracking screen indicates that the camera 100 is currently performing an AF operation while tracking the selected subject or AF frame (during tracking AF), and touch operations (except for operations on the guide display 5001c) are invalid. It is a screen which notifies that there is.
  • a guide display 5001c indicates that one of the devices included in the operation unit 270 is assigned a tracking end instruction function.
  • Reference numeral 5007d denotes a guide display indicating that the AF mode of the camera 100 is in tracking AF and the operation of the touch panel 228a is invalid.
  • the system control unit 150 performs position operation invalidation processing, which will be described later, using FIG. 4CA, and returns the processing to S4003.
  • the system control unit 150 notifies the remote controller 200 of the type of display screen (tracking standby screen) via the external I / F 193, and the process proceeds to S4017.
  • the display screen data itself may be transmitted.
  • FIG. 5E shows an example of a tracking standby screen displayed on the display unit 228 by the remote control unit 250 in response to the notification in S4016.
  • the tracking standby screen is a screen for notifying that the AF mode of the camera 100 is in the tracking standby state and that the tracking standby frame can be moved by a drag operation.
  • a guide display 5001d indicates that the tracking start instruction function is assigned to one of the devices included in the operation unit 270.
  • Reference numeral 5002b denotes a guide display indicating that the tracking cancel instruction function is assigned to one of the devices included in the operation unit 270.
  • a guide display 5006e indicates that the AF mode of the camera 100 is in the tracking standby state, and that the tracking standby frame can be moved from the current position to a position corresponding to the direction and distance of the drag operation by dragging the touch panel 228a. is there.
  • the tracking standby frame is a target face or AF frame for which tracking AF is performed. When an instruction to start tracking is given, the camera 100 starts tracking AF with the tracking standby frame at the time of the instruction as a tracking target. To do.
  • the system control unit 150 performs a tracking standby operation process to be described later with reference to FIG. 4CB, and returns the process to S4003.
  • FIG. 4BA is a flowchart illustrating details of the absolute position operation process executed by the system control unit 150 in step S4010.
  • the system control unit 150 notifies the remote controller 200 via the external I / F 193 to operate the touch panel 228a in the absolute position operation mode.
  • the absolute position operation mode is a mode in which the position (coordinates) of the touch operation is validated.
  • step S4102 the system control unit 150 updates the face detection state as necessary.
  • step S4103 the system control unit 150 transmits information (identification information, position, and size) of each currently detected face to the remote control 200 via the external I / F 193. Note that since the face information is transmitted to the remote controller 200 in S4009, the processing in S4102 and S4103 may not be performed. Further, the process of S4103 may be executed only when the face detection result is different from the execution of S4009 in S4102.
  • step S4104 the system control unit 150 determines whether or not the identification information of the selected face has been received from the remote controller 200. If it is determined that it has been received, the process proceeds to S4105. If it is not determined to be received, the process proceeds to S4107. Proceed with the process.
  • step S4105 the system control unit 150 determines the face corresponding to the face identification information received in step S4104 as the main face, and advances the process to step S4106.
  • step S4106 for example, the system control unit 150 sets an AF frame in an area including the main face, performs focus control (focus detection), and advances the process to step S4107.
  • step S4107 the system control unit 150 displays a live view screen including a mark display (face frame) indicating the detected face on the display unit 128 as illustrated in FIG. 6A, for example, and ends the absolute position operation process.
  • a status display 6001 indicates that moving image shooting is being performed or that moving image shooting is being paused. It is not displayed in shooting standby mode.
  • Reference numeral 6002 denotes a display showing the time code of the moving image being shot.
  • Reference numeral 6003 denotes a frame display indicating the main face among the detected faces.
  • Both 6004a and 6004b are frame displays indicating faces other than the main face among the detected faces, and are displayed in a manner different from the main face frame 6003.
  • Reference numerals 6005a, 6005b, and 6005c are subjects corresponding to detection targets (here, human faces) of the subject detection process included in the image captured by the imaging unit 122.
  • FIG. 4BB is a flowchart illustrating details of the relative position operation process executed by the system control unit 150 in step S4012.
  • the system control unit 150 notifies the remote controller 200 via the external I / F 193 to operate the touch panel 228a in the relative position operation mode.
  • the relative position operation mode is a mode in which not the touch operation position but the movement amount and the movement direction of the touch operation position are validated.
  • step S4202 the system control unit 150 determines whether or not information (movement information) regarding the movement amount and direction of the touch operation has been received from the remote controller 200. If it is determined that the information has been received, the system control unit 150 determines that the information has been received. If not, the process proceeds to S4205. In step S4203, the system control unit 150 determines the movement destination position of the AF frame from the current position of the AF frame and the movement information received in step S4202, and advances the process to step S4204. In step S4204, the system control unit 150 resets the relative position timer, starts counting, and advances the process to step S4205. As described above, the relative position timer is a timer for measuring a certain time, and is realized using the system timer 153. The relative position timer may be a count-up timer or a count-down timer.
  • the fixed time measured by the relative position timer is a time for switching (transitioning) between the relative position operation and the absolute position operation, and about several seconds is considered appropriate for switching without a sense of incongruity.
  • the system control unit 150 sets an initial value and a target value of the relative position timer, and starts counting. Instead of measuring the fixed time, the condition for starting or ending counting may be that the focus control in S4205 is in a stable state.
  • step S4205 the system control unit 150 performs focus control based on the current AF frame position, and advances the process to step S4206.
  • step S4206 the system control unit 150 displays a live view screen including a mark display indicating the current AF frame on the display unit 128 as illustrated in FIG. 6B, for example, and ends the relative position operation processing.
  • FIG. 6B the same reference numerals are assigned to the same display elements as in FIG. 6A, and duplicate descriptions are omitted.
  • Reference numeral 6006 denotes a display indicating the current AF frame position.
  • Reference numeral 6007 denotes a subject in the image.
  • FIG. 4CA is a flowchart illustrating details of the position operation invalidation process executed by the system control unit 150 in step S4015.
  • the system control unit 150 notifies the remote control 200 via the external I / F 193 that the absolute position operation and the relative position operation of the touch panel 228a are invalid.
  • the system control unit 150 determines whether the AF mode is being tracked. If it is determined that tracking is in progress, the process proceeds to S4303. If not determined to be tracking, the process proceeds to S4305.
  • the system control unit 150 displays a live view screen without a frame display on the display unit 128 as shown in FIG. 6C, for example, and ends the position operation invalidation process.
  • FIG. 6C the same reference numerals are assigned to the same display elements as in FIG. 6A, and duplicate descriptions are omitted.
  • frames 6003, 6004a, 6004b (FIG. 6A) and AF frame 6006 (FIG. 6B) indicating the detected subject area are not displayed on the live view screen.
  • the system control unit 150 determines whether or not the subject has been lost in the tracking process (tracking has failed). If it is determined that tracking has failed, the process proceeds to S4304. If tracking is not determined to have failed, S4306 is determined. And proceed with each process. In step S4304, the system control unit 150 ends the tracking process, changes the parameter value corresponding to the AF mode stored in the system memory 152 to a state without tracking, and ends the position operation invalidation process. Thus, focus control without tracking is performed until tracking is set again.
  • step S4306 the system control unit 150 updates the AF frame setting to the movement position of the subject that is being detected, which is detected by the image processing unit 124.
  • step S4307 the system control unit 150 performs focus control on the AF frame updated in step S4306.
  • FIG. 6D the same reference numerals are assigned to the same display elements as in FIG. 6A, and duplicate descriptions are omitted.
  • Reference numeral 6008 denotes a frame (tracking frame) display indicating the subject being tracked.
  • FIG. 4CB is a flowchart showing details of the tracking standby operation process executed by the system control unit 150 in S4017.
  • the system control unit 150 notifies the remote controller 200 via the external I / F 193 to operate the touch panel 228a in the relative position operation mode.
  • the system control unit 150 determines whether or not information (movement information) regarding the movement amount and movement direction of the touch operation has been received from the remote controller 200. If it is determined that the information has been received, the system control unit 150 determines that the information has been received. If not, the process proceeds to S4404.
  • the system control unit 150 determines the movement destination position of the tracking frame from the current position of the tracking frame (AF frame) and the movement information received in S4402, and advances the process to S4404.
  • the system control unit 150 displays a live view screen including a display (tracking standby frame) indicating a subject to be tracked at the start of tracking on the display unit 128 as illustrated in FIG. 6D, for example.
  • the process ends.
  • FIG. 6B the same number is assigned to the same display element as in FIG. 6E in FIG. 6E is a display showing the position of the tracking frame (tracking standby frame) after movement. At this time, the tracking operation has not started.
  • the system control unit 150 receives an instruction to start tracking from the remote controller 200, the system control unit 150 starts tracking processing using the tracking standby frame at that time as the tracking frame.
  • FIG. 7 is a flowchart showing an operation performed by the remote controller 200 in conjunction with the operation of the camera 100 described with reference to FIGS. 4AA to 4CB.
  • the operation shown in the flowchart of FIG. 7 is realized by the remote controller control unit 250 controlling each unit of the remote controller 200 based on a program stored in the nonvolatile memory 256.
  • step S7001 the remote control unit 250 receives information on the screen to be displayed on the display unit 228, the operation mode of the touch panel 228a, and information on the detected face from the camera 100 via the external I / F 293.
  • the screen information is information that the camera 100 notifies in any one of S4009, S4011, S4013, S4014, and S4016.
  • the mode of the touch panel 228a is information that the camera 100 notifies in any of S4101, S4201, S4301, and S4401.
  • the face information is information notified by the camera 100 in steps S4009 and S4103.
  • the touch panel 228a has a plurality of operation modes including an absolute position operation mode (first operation mode) and a relative position operation mode (second operation mode).
  • Remote control unit 250 sets one of a plurality of operation modes on touch panel 228a in accordance with the operation mode received from camera 100.
  • step S7002 the remote control unit 250 displays the screen shown in any of FIGS. 5A to 5F on the display unit 228 based on the screen identification information (or screen data) received in step S7001.
  • step S7003 the remote control control unit 250 determines whether an effective device operation (operation other than the touch operation) corresponding to the screen being displayed has been detected. If it is determined that the detection has been detected, the control proceeds to step S7004. If not, the process proceeds to S7005. For example, when the screen of FIG. 5A is being displayed, an operation of a device (any of buttons and keys included in the operation unit 270) assigned to the guide displays 5001a, 5002a, and 5004 is included in the effective operation.
  • a device any of buttons and keys included in the operation unit 270
  • the display screen and the effective operation are associated with each other and stored in, for example, the non-volatile memory 256 of the remote controller 200.
  • the remote control unit 250 determines whether the detected operation is an effective operation for the current display screen. The determination can be made with reference to the nonvolatile memory 256.
  • step S7004 the remote control control unit 250 generates an AF mode change request corresponding to the valid operation detected in step S7003, transmits the AF mode change request to the camera 100 via the external I / F 293, and returns the process to step S7001.
  • the remote controller control unit 250 determines whether or not the currently set operation mode is the absolute position operation mode. If it is determined to be the absolute position operation mode, the process proceeds to S7006, and if it is not determined to be the absolute position operation mode. The process proceeds to S7009.
  • the remote control control unit 250 determines whether a touchdown operation of the touch panel 228a with respect to a position in the display unit 228 other than the guide displays 5001a, 5002a, and 5004 has been detected. Here, the main face is set according to the position where the touch-down operation is detected, but a position where another touch operation such as touch-up or double-tap operation is detected may be used. If it is determined that a specific touch operation (here, a touch-down operation) has been detected, remote control control unit 250 advances the process to S7007, and if not detected, returns the process to S7001.
  • a specific touch operation here, a touch-down operation
  • the remote control unit 250 is designated by the touch operation from the position (coordinates) where the touch operation detected in S7006 is performed and the face information (or display position and size of each face frame) received in S7001. The face is determined, and the process proceeds to S7008. More specifically, when face information corresponding to the position (coordinates) where the touch operation is performed exists (when the position where the touch operation is performed is within the face frame), the corresponding face information is specified. Judge as a face. If no face exists at the position where the touch operation is performed (touch position), no face is selected in the processes of S7007 and S7008. In step S7008, the remote control control unit 250 transmits the face identification information determined in step S7007 to the camera 100 via the external I / F 293 as the selected face information, and the process returns to step S7001.
  • step S7009 the remote control unit 250 determines whether the currently set operation mode is the relative position operation mode. If it is determined that the operation mode is the relative position operation mode, the process advances to step S7010 to determine the relative position operation mode. If not, the process returns to S7001.
  • step S7010 the remote control control unit 250 determines whether a drag operation on the touch panel 228a (a touch move in which the moving distance of the touch position is a predetermined distance or more) is detected. If it is determined that the remote control unit 250 has detected, the process proceeds to step S7011, and if it is not determined that it has been detected, the process returns to step S7001.
  • the drag operation is an example of an effective touch operation associated with the display screen in which the relative position operation mode is specified, and may be another touch operation such as a flick operation.
  • the remote control unit 250 converts the direction and distance of the drag operation detected in S7010 into frame movement information.
  • an appropriate magnification is set so that the movement amount of the AF frame and the tracking standby frame with respect to the drag operation of the touch panel 228a does not become excessively large.
  • the moving distance is represented by the number of pixels, and the number of pixels of the display unit 228 is set to 1/10 of the number of pixels of resolution of the captured image in the horizontal and vertical directions.
  • the remote control unit 250 converts the distance of the drag operation using a magnification smaller than the number of pixels of the captured image / the number of pixels of the display unit 228 in the horizontal or vertical direction.
  • the remote controller 200 and the camera 100 information may be exchanged between the system control unit 150 and the remote control unit 250 to determine the initial value of the magnification. Therefore, the operation amount may be transmitted as it is from the remote controller 250 to the camera 100, and the system controller 150 may perform the conversion process of S7011.
  • the moving distance in the horizontal (vertical) direction / the moving distance after conversion in the horizontal (vertical) direction / imaging rather than the moving distance in the horizontal (vertical) direction / touch panel size is just to make the magnitude
  • the size of the captured image is assumed to be a size converted by a predetermined ppi (for example, 72 ppi or 96 ppi).
  • the distance of the drag operation may be converted into the number of pixels when 72 ppi or 96 ppi is set, and further converted into the distance on the captured image.
  • a function of dynamically adjusting the sensitivity of the movement amount of the AF frame or the tracking standby frame with respect to the drag operation of the touch panel 228a may be assigned to any of the devices included in the operation unit 270.
  • the remote control unit 250 transmits the frame movement information generated in S7011 to the camera 100 via the external I / F 293, and returns the process to S7001.
  • a touch sensing device such as a touch panel is used.
  • the absolute position specification is valid.
  • the absolute position designation with respect to the touch sensing device is invalidated and the relative position designation is valid. I did it. In other words, whether the position of the touch operation on the touch sensing device is valid or whether the operation direction or amount is valid is dynamically switched depending on whether or not a specific area is detected by the camera.
  • an accurate position can be specified even with a small or touch-sensitive device that does not display an image by specifying an arbitrary position within or near the area. Therefore, the position of the touch operation with respect to the touch sensing device is validated. On the other hand, in other cases, not the position of the touch operation but at least one of the operation direction and the operation amount is validated. Thereby, the convenience of the touch operation can be enjoyed while preventing an unintended result by designating the absolute position of the image with a small size or a touch-sensitive device on which no image is displayed. Since the camera automatically switches between these, it is even more convenient. Further, when subject tracking is performed, a common operation procedure can be used to start and end tracking from a tracking standby state regardless of whether or not a tracking target subject is detected.
  • the remote control unit 250 detects the position where the touchdown is detected. It is determined whether (touch down position) is a position corresponding to any face information (whether it is a touch down with respect to any face frame). When the touchdown position is a position corresponding to any face information, the remote controller control unit 250 transmits the face of the corresponding face information to the camera 100 as the selected face information. That is, when the absolute position designation is valid, the remote control unit 250 executes a process corresponding to the touchdown position (a process of selecting a face corresponding to the touchdown position).
  • the remote control unit 250 selects the face corresponding to the touch-down position regardless of which face has been selected before detecting the touch-down. Further, it is assumed that even if no movement (touch move) or touch-up of the touch position is detected, if the touch-down position is within the face frame, remote control unit 250 selects the face corresponding to the face frame. In other words, when the absolute position designation is valid, a function uniquely corresponding to the touch position is executed. Further, the processing is not based on the set value before the touch operation is detected (the position of the AF frame or the tracking frame before the touch operation is detected). Instead of selecting a face by touchdown, when a touchup occurs, a face corresponding to the touch position detected last before the touchup may be selected.
  • the remote control unit 250 does not move the AF frame and the tracking frame only by detecting the touchdown (start of touch), and touches it. Maintain the position before the start of.
  • the remote controller control unit 250 generates frame movement information (including the movement amount and movement direction of the touch position) and sends it to the camera 100.
  • a process for moving the AF frame or the tracking frame is performed.
  • the AF frame or the tracking frame is moved in the direction and the movement amount based on the frame movement information from the position set before the touch that is the generation factor of the frame movement information is performed.
  • the touchdown position is used to calculate the moving amount and moving direction of the touch position, but does not directly correspond to the movement destination of the AF frame or the tracking frame. Regardless of the start position of the touch move, if the movement amount and movement direction of the touch position are the same, the movement direction and movement amount of the AF frame or tracking frame are the same. In other words, when the relative position designation is valid, processing corresponding to the movement amount and movement direction of the touch position is executed, and the function uniquely corresponding to the touch position is not executed.
  • the process is based on the setting value before the touch operation is detected (the position of the AF frame or the tracking frame before the touch operation is detected). Specifically, the set value before the touch operation is performed is changed by the change amount corresponding to the movement amount and the movement direction of the touch position.
  • the Live View image is displayed on the camera display along with information such as the area detection results and the current AF frame, more accurate operation is possible by operating the camera's external remote control while viewing the camera display. It can be performed.
  • the absolute position designation is not validated until a predetermined condition is satisfied after the absolute position designation becomes invalid. Therefore, when the detection by the area detection function of the camera is not stable, it is possible to prevent the valid / invalid designation of the absolute position designation for the touch sensitive device from being switched in a short time.
  • the present invention can also be used for functions other than focus control.
  • the position of the area (AF frame or tracking standby frame) determined according to the touch operation or information on the selected area may be used for the exposure control (AE) process or the white balance control process.
  • the area detected by the camera function is not limited to the subject area such as the face area.
  • a high luminance region for example, a whiteout region
  • a low luminance region for example, a blackout region
  • an area having a specific color for example, an empty area
  • a part of the flow in FIGS. 4AA to 4CB may be performed by the remote control unit 250 in the remote controller 200, or a part of the flow in FIG. 7 may be performed by the system control unit 150 in the camera 100.
  • the coordinate conversion processing in S7007 and S7011 may be performed in the camera 100.
  • the remote controller 200 transmits the event information of the type of touch operation (touch down operation or drag operation), the operation position, the operation direction, and the operation amount to the camera 100 instead of S7008 and S7012. Even if it does in this way, the effect equivalent to the above-mentioned embodiment can be acquired.
  • the camera 100 and the remote controller 200 may be an integrated device incorporated in one housing.
  • the external I / F 193 and the external I / F 293 are replaced with a bus connection inside the apparatus.
  • equivalent modules such as the system control unit 150 and the remote control unit 250 can be integrated into one module. Even with such a configuration, an effect equivalent to that of the present embodiment can be obtained.
  • the camera 100 and the remote controller 200 may be integrated into one housing, and a remote controller corresponding to a separate remote controller 200 may be connected. In this way, when the cameraman and the assistant take a picture as a team, both the cameraman who handles the camera 100 and the assistant who handles the separate remote controller 200 can perform the frame operation in the same procedure.
  • each of the system control unit 150 and the remote control unit 250 may be realized by one piece of hardware, or may be realized by sharing a plurality of pieces of hardware.
  • the hardware that implements the system control unit 150 and the remote control unit 250 may be a programmable processor such as a CPU or dedicated hardware such as an ASIC.
  • the operation of the touch panel 228a included in the remote controller 200 connected to the camera 100 has been mainly described.
  • the use of the remote controller 200 is not essential to the present invention, and the present invention can be implemented with the camera 100 alone. It is.
  • the camera 100 when the camera 100 is used alone, for example, -When the proximity sensor provided near the viewfinder detects that the user is using the viewfinder (optical viewfinder or electronic viewfinder)-When the display is displayed on the display in the viewfinder-Operation of the operation unit 170
  • the system control unit 150 executes the above-described operation of the remote control unit 250 including setting of the operation mode according to the detection state of the subject area, the operation according to the operation mode, and the like. .
  • the system control unit 150 may perform the same control operation for the display unit 128 and the touch panel 128a as those performed by the remote control unit 250 for the display unit 228 and the touch panel 228a. Further, in this case, in the same manner as the control described as being displayed on the display unit 128 by the system control unit 150 in each embodiment described above, when the finder is an electronic view finder, a live view is displayed on the display unit in the finder. In addition, mark display such as a face frame, an AF frame, and a tracking frame may be performed. When the viewfinder is an optical viewfinder, live view display is omitted, and the face frame, AF frame, tracking frame, etc. are displayed on the display unit in the viewfinder so that they can be seen superimposed on the optical image. Just do it. Even when live view display on the display unit 128 is not performed, the touch panel 128a can be used in the same manner as the touch panel 228a of the remote controller 200 described above by enabling the operation on the touch panel 128a.
  • the present invention can be applied to any electronic device having an imaging function and a touch sensor (touch panel, touch pad).
  • the touch sensor need not be incorporated in the display device or disposed on the display device.
  • Electronic devices to which the present invention can be applied include, but are not limited to, personal computers, tablet terminals, portable electronic devices (PDAs), mobile phones, image viewers, media players, game machines, electronic book readers, and the like.
  • PDAs portable electronic devices
  • the present invention can be applied to these electronic devices, but is particularly useful in portable electronic devices in which the touch sensor installation area is small.
  • the present invention supplies a program that realizes one or more functions of the above-described embodiments to a system or apparatus via a network or a storage medium, and one or more processors in a computer of the system or apparatus read and execute the program
  • This process can be realized. It can also be realized by a circuit (for example, ASIC) that realizes one or more functions.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Studio Devices (AREA)

Abstract

Disclosed are: an electronic device that achieves both the convenience of a touch operation and an accurate position specification; and a method for controlling the same. When an area corresponding to a predetermined condition has been detected from an image, the electronic device performs a function corresponding to a position where a touch has been detected by a touch detection means. In addition, when the area corresponding to the predetermined condition has not been detected or area detection is not performed, the electronic device does not perform a process corresponding to the position where the touch has been detected by the touch detection means, but performs a process corresponding to a touch position movement operation.

Description

電子機器およびその制御方法Electronic device and control method thereof
 本発明は電子機器およびその制御方法に関する。 The present invention relates to an electronic device and a control method thereof.
 ライブビュー画像を表示するタッチディスプレイを備え、焦点検出領域(AF枠)や、自動露出制御(AE)によって適正露出にすべき領域(AE領域)を、タッチディスプレイのタッチ位置に指定できるデジタルカメラが知られている(特許文献1)。 A digital camera that includes a touch display that displays a live view image and can specify a focus detection area (AF frame) and an area (AE area) that should be properly exposed by automatic exposure control (AE) as a touch position of the touch display. Known (Patent Document 1).
特開2012-203143号公報JP 2012-203143 A
 現在、多くのデジタルカメラの画素数は1600~2000万画素超であるのに対し、タッチディスプレイの解像度は40~120万画素程度で、大きさも3型程度に留まる。スマートフォンやタブレット端末ではより高解像度かつより大型のタッチディスプレイを備えるものもあるが、せいぜい400万画素、10型に留まっている。 At present, the number of pixels of many digital cameras is over 16 to 20 million pixels, while the resolution of the touch display is about 400,000 to 1.2 million pixels and the size is only about 3 type. Some smartphones and tablet terminals have a higher resolution and larger touch display, but are limited to 4 million pixels and 10 type at most.
 従って、タッチディスプレイに表示されている撮像画像はかなり縮小されているため、意図した位置をタッチ操作で正確に指定することは容易でない。特に、AF枠やAE領域を撮影時に指定する場合、電子機器を片手で保持しながらもう片手の指を用いたり、機器を保持している手の指を伸ばしてたりして画面をタッチする必要があるため、意図した位置を正確に指定することは一層困難になる。 Therefore, since the captured image displayed on the touch display is considerably reduced, it is not easy to accurately specify the intended position by the touch operation. In particular, when the AF frame or AE area is specified at the time of shooting, it is necessary to touch the screen by holding the electronic device with one hand while using the finger of the other hand or extending the finger of the hand holding the device. Therefore, it is more difficult to specify the intended position accurately.
 本発明はこのような従来技術の課題に鑑みてなされたもので、タッチ操作の利便性と、正確な位置指定との両立を実現する電子機器およびその制御方法の提供を目的の1つとする。 The present invention has been made in view of the problems of the prior art, and an object of the present invention is to provide an electronic device that realizes both convenience of touch operation and accurate position designation and a control method thereof.
 上述の目的は、画像から予め定められた条件に該当する領域が検出されている場合に第1の操作モードを設定し、領域が検出されていない場合または領域の検出が行われない場合に第2の操作モードを設定する設定手段と、タッチ検知面への操作を検出するタッチ検出手段と、制御手段であって、第1の操作モードが設定されている場合には、タッチ検出手段でタッチを検出した位置に対応する機能を実行し、第2の操作モードが設定されている場合には、タッチ検出手段でタッチを検出した位置に対応する処理ではなく、タッチ位置の移動操作に応じた処理を実行する、ように制御する制御手段と、を有することを特徴とする電子機器によって達成される。 The above-described object is to set the first operation mode when an area corresponding to a predetermined condition is detected from the image, and to set the first operation mode when the area is not detected or when the area is not detected. A setting means for setting the operation mode 2, a touch detection means for detecting an operation on the touch detection surface, and a control means. When the first operation mode is set, the touch detection means touches When the function corresponding to the position where the touch is detected is executed and the second operation mode is set, the process corresponding to the touch position moving operation is performed instead of the process corresponding to the position where the touch is detected by the touch detection means. It is achieved by an electronic device characterized by having control means for performing control so as to execute processing.
 本発明によれば、タッチ操作の利便性と、正確な位置指定との両立を実現する電子機器およびその制御方法を提供できる。 According to the present invention, it is possible to provide an electronic device that realizes both convenience of touch operation and accurate position designation and a control method thereof.
 本発明のその他の特徴及び利点は、添付図面を参照とした以下の説明により明らかになるであろう。なお、添付図面においては、同じ若しくは同様の構成には、同じ参照番号を付す。 Other features and advantages of the present invention will become apparent from the following description with reference to the accompanying drawings. In the accompanying drawings, the same or similar components are denoted by the same reference numerals.
 添付図面は明細書に含まれ、その一部を構成し、本発明の実施の形態を示し、その記述と共に本発明の原理を説明するために用いられる。
本発明の実施形態に係るデジタルカメラの外観例を示す斜視図 本発明の実施形態に係るデジタルカメラの外観例を示す斜視図 実施形態に係るリモコンの外観例を示す斜視図 実施形態に係るデジタルカメラの機能構成例を示すブロック図 実施形態に係るリモコンの機能構成例を示すブロック図 実施形態に係るデジタルカメラの動作に関するフローチャート 実施形態に係るデジタルカメラの動作に関するフローチャート 実施形態に係るデジタルカメラの動作に関するフローチャート 実施形態に係るデジタルカメラの動作に関するフローチャート 実施形態に係るデジタルカメラの動作に関するフローチャート 実施形態に係るデジタルカメラの動作に関するフローチャート 実施形態に係るリモコンにおける画面表示例を示す図 実施形態に係るリモコンにおける画面表示例を示す図 実施形態に係るリモコンにおける画面表示例を示す図 実施形態に係るリモコンにおける画面表示例を示す図 実施形態に係るリモコンにおける画面表示例を示す図 実施形態に係るリモコンにおける画面表示例を示す図 実施形態に係るデジタルカメラにおける画面表示例を示す図 実施形態に係るデジタルカメラにおける画面表示例を示す図 実施形態に係るデジタルカメラにおける画面表示例を示す図 実施形態に係るデジタルカメラにおける画面表示例を示す図 実施形態に係るデジタルカメラにおける画面表示例を示す図 実施形態に係るリモコンの動作に関するフローチャート
The accompanying drawings are included in the specification, constitute a part thereof, show an embodiment of the present invention, and are used to explain the principle of the present invention together with the description.
The perspective view which shows the external appearance example of the digital camera which concerns on embodiment of this invention The perspective view which shows the external appearance example of the digital camera which concerns on embodiment of this invention The perspective view which shows the external appearance example of the remote control which concerns on embodiment 1 is a block diagram showing a functional configuration example of a digital camera according to an embodiment. 1 is a block diagram showing an example of a functional configuration of a remote control according to an embodiment The flowchart regarding the operation of the digital camera according to the embodiment The flowchart regarding the operation of the digital camera according to the embodiment The flowchart regarding the operation of the digital camera according to the embodiment The flowchart regarding the operation of the digital camera according to the embodiment The flowchart regarding the operation of the digital camera according to the embodiment The flowchart regarding the operation of the digital camera according to the embodiment The figure which shows the example of a screen display in the remote control which concerns on embodiment The figure which shows the example of a screen display in the remote control which concerns on embodiment The figure which shows the example of a screen display in the remote control which concerns on embodiment The figure which shows the example of a screen display in the remote control which concerns on embodiment The figure which shows the example of a screen display in the remote control which concerns on embodiment The figure which shows the example of a screen display in the remote control which concerns on embodiment The figure which shows the example of a screen display in the digital camera which concerns on embodiment The figure which shows the example of a screen display in the digital camera which concerns on embodiment The figure which shows the example of a screen display in the digital camera which concerns on embodiment The figure which shows the example of a screen display in the digital camera which concerns on embodiment The figure which shows the example of a screen display in the digital camera which concerns on embodiment The flowchart regarding the operation of the remote control according to the embodiment
 以下、図面を参照して本発明の例示的な実施形態について説明する。
●(デジタルカメラ100)
 図1Aおよび図1Bは、本実施形態に係る電子機器の一例である、一眼レフデジタルカメラ100(以下、単にカメラ100と呼ぶ)の外観を、それぞれ前面上方および背面下方から見た斜視図である。
Hereinafter, exemplary embodiments of the present invention will be described with reference to the drawings.
● (Digital camera 100)
1A and 1B are perspective views of an external appearance of a single-lens reflex digital camera 100 (hereinafter simply referred to as a camera 100), which is an example of an electronic apparatus according to the present embodiment, viewed from above the front and below the back, respectively. .
 カメラ100の背面に設けられた表示部128はタッチディスプレイである。表示部128は、撮影された画像や、カメラ100の設定値などの各種情報を表示するために用いられる。また、表示部128は、タッチ操作を受け付ける操作部としても機能する。なお、本実施形態のカメラ100のように独立したファインダーを有する機器の場合、表示部128に代えて、タッチパッドのように表示機能を持たないタッチセンサを用いたり、表示部128とは別の位置にタッチセンサを配置してもよい。ここで、ファインダーは光学式であっても電子式であってもよい。表示部128を有さない機器の場合、タッチディスプレイに対する操作を、ファインダーを覗きながら(ファインダーの接眼部を介して視認可能な表示部を視認しながら)、タッチセンサを操作することに置き換えることができる。ユーザーがファインダーを覗いていることを検出し、表示部128のタッチセンサ機能は有効のまま、表示を無効とする機器においても同様である。 The display unit 128 provided on the back of the camera 100 is a touch display. The display unit 128 is used to display various information such as a captured image and a setting value of the camera 100. The display unit 128 also functions as an operation unit that receives a touch operation. In the case of a device having an independent viewfinder such as the camera 100 of the present embodiment, a touch sensor that does not have a display function such as a touch pad may be used instead of the display unit 128, or a device other than the display unit 128 may be used. A touch sensor may be arranged at the position. Here, the viewfinder may be optical or electronic. In the case of a device that does not have the display unit 128, the operation on the touch display is replaced by operating the touch sensor while looking through the viewfinder (while visually recognizing the display unit visible through the eyepiece of the viewfinder). Can do. The same applies to a device that detects that the user is looking through the finder and disables the display while the touch sensor function of the display unit 128 remains valid.
 シャッターボタン161は撮影準備指示や撮影開始指示を入力するための操作部である。モード切替スイッチ160は、カメラ100の撮影モードなどを切り替えるための操作部である。端子カバー191は、外部機器との通信に用いるケーブルのコネクタを保護する。ダイヤル171は、シャッター速度や絞りなどの設定値の変更に用いられる。電源スイッチ172はカメラ100の電源のON及びOFFを切り替える。 The shutter button 161 is an operation unit for inputting a shooting preparation instruction and a shooting start instruction. The mode switch 160 is an operation unit for switching the shooting mode of the camera 100 and the like. The terminal cover 191 protects a connector of a cable used for communication with an external device. Dial 171 is used to change setting values such as shutter speed and aperture. A power switch 172 switches the power of the camera 100 on and off.
 ダイヤル173は選択する枠を移動したり、表示される画像の送り/戻しなどに用いられる。ダイヤル173は上下左右方向に押下可能であり、十字キー174としても機能するほか、回転軸部分にSETボタン175が設けられている。十字キー174はカーソル移動に、SETボタン175は選択項目の決定や、確認応答の入力などに用いられる。 The dial 173 is used for moving a frame to be selected and for sending / returning a displayed image. The dial 173 can be pressed in the vertical and horizontal directions, functions as a cross key 174, and is provided with a SET button 175 on the rotating shaft portion. The cross key 174 is used for moving the cursor, and the SET button 175 is used for determining a selection item, inputting a confirmation response, and the like.
 ライブビュー(LV)ボタン176は、カメラ100の静止画撮影モードにおいて、表示部128に対するライブビュー表示のONとOFFの切り替えに用いられる。LVボタン176は、カメラ100の動画撮影モードでは動画撮影(記録)の開始、停止の指示に用いられる。 The live view (LV) button 176 is used to switch on and off live view display on the display unit 128 in the still image shooting mode of the camera 100. The LV button 176 is used to start / stop moving image shooting (recording) in the moving image shooting mode of the camera 100.
 拡大ボタン177はライブビュー表示の拡大モードのON、OFFの切り替えや、拡大率の変更に用いられる。また、拡大ボタン177は、カメラ100の再生モードでは再生画像の表示倍率を増加させるために用いられる。再生ボタン178は、カメラ100の撮影モードと再生モードとの切り替えに用いられる。カメラ100は、撮影モードで再生ボタン178が押下されると再生モードに移行し、記録媒体に記録されている画像のうち、最新の画像を表示部128に表示する。 The enlarge button 177 is used to switch the enlargement mode of live view display on and off, and to change the enlargement ratio. The enlarge button 177 is used to increase the display magnification of the playback image in the playback mode of the camera 100. The playback button 178 is used for switching between the shooting mode and the playback mode of the camera 100. When the playback button 178 is pressed in the shooting mode, the camera 100 shifts to the playback mode and displays the latest image among the images recorded on the recording medium on the display unit 128.
 開閉可能な蓋120は、記録媒体の収容部や、収容された記録媒体をほこりなどから保護する。グリップ部190は、ユーザーがカメラ100を片手で保持しやすい形状に形成され、内部には電池が収容される。ミラー112は、システム制御部150(図3A)によって制御されるアクチュエータによりアップダウンし、レンズユニットからの光束の進行方向を切り替える。端子110はカメラ100が着脱可能なレンズユニットに設けられた端子と接触し、カメラ100からレンズユニットに電源を供給したり、カメラ100がレンズユニットと通信したりするために用いられる。 The lid 120 that can be opened and closed protects the storage portion of the recording medium and the stored recording medium from dust and the like. The grip 190 is formed in a shape that allows the user to easily hold the camera 100 with one hand, and a battery is accommodated therein. The mirror 112 is moved up and down by an actuator controlled by the system control unit 150 (FIG. 3A) to switch the traveling direction of the light beam from the lens unit. The terminal 110 is in contact with a terminal provided on a lens unit to which the camera 100 can be attached and detached, and is used for supplying power from the camera 100 to the lens unit and for the camera 100 to communicate with the lens unit.
 アイピース116は、光学ファインダーの接眼レンズである。アイピース116を覗くことにより、ユーザーはレンズユニットが形成する被写体像や、シャッタースピード、絞り値、撮影感度、AF枠などの設定値を観察することができる。 The eyepiece 116 is an optical viewfinder eyepiece. By looking into the eyepiece 116, the user can observe the subject image formed by the lens unit and the set values such as the shutter speed, aperture value, photographing sensitivity, and AF frame.
●(リモートコントローラ200)
 図2は、カメラ100のリモートコントローラ200(以下、単にリモコン200と呼ぶ)の外観例を示す斜視図である。リモコン200は、カメラ100と通信することにより、カメラ100の遠隔操作を可能にする。リモコン200とカメラ100とは有線媒体もしくは無線媒体を通じて通信可能であるが、本実施形態では接続ケーブル291を用いた有線通信を行うものとする。接続ケーブル291のコネクタは、カメラ100の端子カバー191内のコネクタに接続される。カメラ100とリモコン200は接続ケーブル291により双方向通信が可能である。
● (Remote controller 200)
FIG. 2 is a perspective view showing an external appearance example of a remote controller 200 (hereinafter simply referred to as a remote controller 200) of the camera 100. FIG. The remote controller 200 enables remote operation of the camera 100 by communicating with the camera 100. The remote controller 200 and the camera 100 can communicate with each other through a wired medium or a wireless medium, but in this embodiment, wired communication using the connection cable 291 is performed. The connector of the connection cable 291 is connected to the connector in the terminal cover 191 of the camera 100. The camera 100 and the remote controller 200 can perform two-way communication with a connection cable 291.
 電源スイッチ272は、リモコン200の電源のON及びOFFを切り替える。表示部228はタッチディスプレイである。表示部228はカメラ100の表示部128と同じアスペクト比を有するが、本実施形態では、カメラ100がライブビュー画像をリモコン200に送信することはなく、表示部228にはライブビュー画像は表示されない。 The power switch 272 switches the power of the remote control 200 on and off. The display unit 228 is a touch display. Although the display unit 228 has the same aspect ratio as the display unit 128 of the camera 100, in this embodiment, the camera 100 does not transmit the live view image to the remote controller 200, and the live view image is not displayed on the display unit 228. .
 表示部228の上下に3つずつ配置された操作ボタン271の機能は、表示部228の表示内容に応じて変化する。十字キー274は上、下、左、右に対応したキーを有し、十字キー274の中央にはSETボタン275が配置されている。十字キー274はカーソル移動の指示入力などに、SETキー275は選択された項目の実行や決定の指示入力に主に用いられる。HOMEボタン276は、リモコン200のHOME画面への遷移を指示するために用いられる。MENUボタン277はリモコン200のMENU画面への遷移を指示するために用いられる。BACKボタン278はキャンセル指示や一つ前の画面への遷移を指示するために用いられる。電源スイッチ272以外の各種のキー・ボタン271,273~278は、リモコン200の操作部270(図3B)に含まれる。 The function of the operation buttons 271 arranged three by three above and below the display unit 228 changes according to the display content of the display unit 228. The cross key 274 has keys corresponding to up, down, left, and right, and a SET button 275 is disposed at the center of the cross key 274. The cross key 274 is mainly used for inputting an instruction for moving a cursor, and the SET key 275 is mainly used for inputting an instruction for executing or determining a selected item. The HOME button 276 is used to instruct a transition to the HOME screen of the remote controller 200. The MENU button 277 is used for instructing the transition of the remote control 200 to the MENU screen. The BACK button 278 is used to instruct a cancel instruction or a transition to the previous screen. Various key buttons 271, 273 to 278 other than the power switch 272 are included in the operation unit 270 (FIG. 3B) of the remote controller 200.
●(カメラおよびリモコンの機能構成)
 図3Aはカメラ100の、図3Bはリモコン200の機能構成例を示すブロック図であり、図1A~図2に示した構成要素については共通の参照数字を付してある。レンズユニット140(以下、単にレンズ140と呼ぶ)は、カメラ100に着脱可能な交換レンズである。レンズ140はフォーカスレンズや変倍レンズなどの可動レンズを含むレンズ群103を有する。端子146はレンズ140がカメラ100に装着された際に端子110と接触し、レンズ140-カメラ100間の双方向通信や、カメラ100からレンズ140への電源供給などを可能にする。
● (Functional configuration of camera and remote control)
3A is a block diagram showing an example of the functional configuration of the camera 100, and FIG. 3B is a block diagram showing an example of the functional configuration of the remote controller 200. The components shown in FIGS. The lens unit 140 (hereinafter simply referred to as the lens 140) is an interchangeable lens that can be attached to and detached from the camera 100. The lens 140 includes a lens group 103 including movable lenses such as a focus lens and a variable power lens. The terminal 146 contacts the terminal 110 when the lens 140 is attached to the camera 100, and enables bidirectional communication between the lens 140 and the camera 100, power supply from the camera 100 to the lens 140, and the like.
 レンズ制御部144はCPU、MPUなどのプログラマブルプロセッサであり、不揮発性メモリ145に記憶されたプログラムを、システムメモリ147に読み込んで実行することにより、レンズ140の動作を制御する。レンズ制御部144は端子146,110を介したシステム制御部150との通信を通じて、絞り駆動回路142を介して絞り機構102を制御し、またAF駆動回路143を介してレンズ群103の可動レンズの位置を制御する。AF/MFスイッチ(SW)148は、レンズ140のオートフォーカス機能を利用するか否かを切り替えるスイッチであり、設定値はレンズ制御部144からシステム制御部150に通知される。なお、AF/MFスイッチ148を有さないレンズ140も存在する。 The lens control unit 144 is a programmable processor such as a CPU or MPU, and controls the operation of the lens 140 by reading the program stored in the nonvolatile memory 145 into the system memory 147 and executing it. The lens control unit 144 controls the aperture mechanism 102 via the aperture drive circuit 142 through communication with the system control unit 150 via the terminals 146 and 110, and controls the movable lens of the lens group 103 via the AF drive circuit 143. Control the position. The AF / MF switch (SW) 148 is a switch for switching whether to use the autofocus function of the lens 140, and the setting value is notified from the lens control unit 144 to the system control unit 150. There is also a lens 140 that does not have the AF / MF switch 148.
 カメラ100において、フォーカシングスクリーン113、ペンタプリズム114、および接眼レンズ116は光学ファインダーを構成する。光学ファインダーのフォーカシングスクリーン113に結像された被写体像を、ペンタプリズム114および接眼レンズ116を介して観察することで、ユーザーは撮影範囲や合焦位置などを確認することができる。光学ファインダー内に設けられたAEセンサー117は、レンズ140が形成する被写体像の輝度を測定する。 In the camera 100, the focusing screen 113, the pentaprism 114, and the eyepiece lens 116 constitute an optical viewfinder. By observing the subject image formed on the focusing screen 113 of the optical viewfinder through the pentaprism 114 and the eyepiece lens 116, the user can confirm the photographing range, the in-focus position, and the like. The AE sensor 117 provided in the optical viewfinder measures the luminance of the subject image formed by the lens 140.
 焦点検出部111はレンズ140を瞳分割するAFセンサーを有し、デフォーカス情報(方向および量)をシステム制御部150に出力する。システム制御部150は、デフォーカス情報に基づいてフォーカスレンズの移動量および移動方向をレンズ制御部144に通知する。レンズ制御部144は、通知された方向および量、AF駆動回路143を介してフォーカスレンズ140を移動させる。 The focus detection unit 111 has an AF sensor that divides the lens 140 into pupils, and outputs defocus information (direction and amount) to the system control unit 150. The system control unit 150 notifies the lens control unit 144 of the movement amount and movement direction of the focus lens based on the defocus information. The lens control unit 144 moves the focus lens 140 via the notified direction and amount and the AF driving circuit 143.
 クイックリターンミラー112(以下、単にミラー112と呼ぶ)は、レンズ140から入射した光束の進行方向を、光学ファインダー方向と撮像部122方向との間で切替える。図3Aはミラー112がダウンした状態を示し、光束は光学ファインダー方向に進行する。撮像部122を露光する際(静止画撮影、ライブビュー撮影、記録用の動画撮影の際)にはミラー112がアップされ、光束が撮像部122方向に進行する。なお、ミラー112がダウンした状態で焦点検出部111が光束の一部を受光できるよう、ミラー112はハーフミラーの領域を有する。 The quick return mirror 112 (hereinafter simply referred to as the mirror 112) switches the traveling direction of the light beam incident from the lens 140 between the optical finder direction and the imaging unit 122 direction. FIG. 3A shows a state in which the mirror 112 is down, and the light beam travels in the direction of the optical viewfinder. When exposing the imaging unit 122 (during still image shooting, live view shooting, and recording moving image shooting), the mirror 112 is raised, and the light beam travels in the direction of the imaging unit 122. Note that the mirror 112 has a half mirror region so that the focus detection unit 111 can receive a part of the light beam in a state where the mirror 112 is down.
 メカニカルシャッター101は、システム制御部150の制御に従って開閉し、撮像部122を所定時間露光する。なお、動画撮影時など電子シャッターを用いる期間、メカニカルシャッター101は継続して開いた状態とされる。
 撮像部122は、複数の画素を備え、光電変換によって光学像を電気信号に変換する撮像素子を有する。A/D変換器123は、撮像部122が生成するアナログ信号をデジタルデータに変換する。
The mechanical shutter 101 opens and closes under the control of the system control unit 150 to expose the imaging unit 122 for a predetermined time. Note that the mechanical shutter 101 is kept open during a period in which the electronic shutter is used, such as during moving image shooting.
The imaging unit 122 includes a plurality of pixels and includes an imaging element that converts an optical image into an electrical signal by photoelectric conversion. The A / D converter 123 converts the analog signal generated by the imaging unit 122 into digital data.
 画像処理部124は、A/D変換器123もしくはメモリ制御部115からのデータに対して所定の画像処理を適用する。例示的な画像処理としては、デモザイク、ホワイトバランス調整、階調補正、スケーリング(リサイズ)、色変換、符号化、復号、領域検出などがあるが、これらに限定されない。なお、領域検出は、特定のパターンや特徴を有する領域(例えば人物や動物の顔、特定の色や輝度を有する領域)など、予め定められた条件に該当する領域を画像から検出する処理であり、被写体検出とも呼ばれる。また、人物の認識処理や、顔の表情、目つむり、赤目などの検出も画像処理部124で行うことができる。以下では、被写体検出の一例として人間の顔を検出する顔検出処理を画像処理部124が実行するものとする。 The image processing unit 124 applies predetermined image processing to the data from the A / D converter 123 or the memory control unit 115. Examples of image processing include, but are not limited to, demosaic, white balance adjustment, gradation correction, scaling (resizing), color conversion, encoding, decoding, and area detection. The area detection is a process for detecting an area that satisfies a predetermined condition, such as an area having a specific pattern or feature (for example, a face of a person or an animal, or an area having a specific color or brightness) from an image. Also called subject detection. In addition, the image processing unit 124 can also perform recognition processing of a person and detection of facial expressions, eyebrows, red eyes, and the like. In the following, it is assumed that the image processing unit 124 executes face detection processing for detecting a human face as an example of subject detection.
 また、画像処理部124は、撮影によって取得された画像(撮像画像)から制御用の情報を生成することもできる。例えば、コントラストAFを実施する場合、画像処理部124は焦点検出領域におけるAF評価値を生成し、システム制御部150に提供することができる。また、画像処理部124は、輝度評価値を生成したり、輝度ヒストグラムを生成したりしてもよい。さらに、画像処理部124は、時系列的な撮影によって取得された複数の画像(例えば複数の動画フレーム)感において、予め定められた画像領域の移動先を検出することができる。画像間における特定領域の移動方向および移動量の検出は、例えばテンプレートマッチングを用いた動きベクトルの検出処理として公知な方法を用いることができるが、それに限定されない。例えば焦点検出領域(AF枠)をテンプレートとして設定することにより、AF枠を追尾することが可能になり、後述する追尾AFが実現できる。 Also, the image processing unit 124 can generate control information from an image (captured image) acquired by shooting. For example, when contrast AF is performed, the image processing unit 124 can generate an AF evaluation value in the focus detection area and provide it to the system control unit 150. The image processing unit 124 may generate a brightness evaluation value or a brightness histogram. Further, the image processing unit 124 can detect a destination of a predetermined image region in a sense of a plurality of images (for example, a plurality of moving image frames) acquired by time-series shooting. The detection of the movement direction and the movement amount of the specific area between the images can be performed using a known method as a motion vector detection process using template matching, for example, but is not limited thereto. For example, by setting a focus detection area (AF frame) as a template, the AF frame can be tracked, and tracking AF described later can be realized.
 A/D変換器123が出力するデジタルデータは、画像処理部124とメモリ制御部115のうち、少なくともメモリ制御部115を介してメモリ132に書き込まれる。メモリ132は、A/D変換器123が出力するデジタルデータのほか、記録媒体185から読み出した画像データなどを格納する。メモリ132は連写時などのバッファメモリとして機能するための記憶容量を有する。 Digital data output from the A / D converter 123 is written into the memory 132 through at least the memory control unit 115 of the image processing unit 124 and the memory control unit 115. The memory 132 stores image data read from the recording medium 185 in addition to the digital data output from the A / D converter 123. The memory 132 has a storage capacity for functioning as a buffer memory during continuous shooting.
 また、メモリ132はビデオメモリを兼ねており、表示部128に表示する画像や、カメラの設定値などの各種情報を表す画像(OSD)のデータなども格納する。D/A変換器125は、メモリ132に格納されている表示用の画像(撮像画像とOSDの合成画像)のデータをアナログ信号に変換して表示部128に供給し、表示用の画像が表示部128に表示される。 The memory 132 also serves as a video memory, and stores images to be displayed on the display unit 128 and image (OSD) data representing various information such as camera setting values. The D / A converter 125 converts the display image data (captured image and OSD composite image) stored in the memory 132 into an analog signal and supplies the analog signal to the display unit 128, so that the display image is displayed. Displayed in the section 128.
 表示部128は例えば画面上にタッチパネル128aが設けられたタッチディスプレイである。タッチパネル128aの位置検出方式に制限は無く、例えば静電容量方式のタッチパネルを用いることができる。なお、図では便宜上タッチパネル128aを表示部128と別個の構成として記載しているが、インセル型タッチディスプレイのように、タッチパネルの機能が表示部128に組み込まれていてもよい。動画を撮影しながら表示部128に表示することで、表示部128をEVFとして機能させることができる。表示部128をEVFとして機能させるために表示する画像を、ライブビュー画像と呼ぶ。 The display unit 128 is, for example, a touch display provided with a touch panel 128a on the screen. There is no limitation on the position detection method of the touch panel 128a, and for example, a capacitive touch panel can be used. In the figure, the touch panel 128a is described as a separate configuration from the display unit 128 for convenience, but the function of the touch panel may be incorporated in the display unit 128 like an in-cell touch display. By displaying the moving image on the display unit 128 while shooting, the display unit 128 can function as an EVF. An image displayed for causing the display unit 128 to function as an EVF is referred to as a live view image.
 不揮発性メモリ156は、例えばEEPROMのように電気的に消去・記録可能なメモリである。不揮発性メモリ156は、システム制御部150が実行するためのプログラム、各種の設定値、GUIデータなどを記憶する。 The nonvolatile memory 156 is a memory that can be electrically erased and recorded, such as an EEPROM. The nonvolatile memory 156 stores a program to be executed by the system control unit 150, various setting values, GUI data, and the like.
 システム制御部150はCPU、MPUなどのプログラマブルプロセッサであり、不揮発性メモリ156に記憶されたプログラムを、システムメモリ152に読み込んで実行することにより、カメラ100の動作を制御する。システム制御部150はメモリ132、D/A変換器125、表示部128を制御することにより、表示制御も行う。 The system control unit 150 is a programmable processor such as a CPU or MPU, and controls the operation of the camera 100 by reading the program stored in the nonvolatile memory 156 into the system memory 152 and executing it. The system control unit 150 also performs display control by controlling the memory 132, the D / A converter 125, and the display unit 128.
 また、システム制御部150は、AEセンサー117から得られる輝度情報に基づく自動露出制御(AE)処理、焦点検出部111から得られるデフォーカス情報に基づく自動焦点検出(AF)処理を行う。また、システム制御部150は、AE処理やAF処理に、画像処理部124で得られる情報を必要に応じて用いる。例えば、システム制御部150は、画像処理部124における顔領域の検出結果を用いて、人物の顔に焦点検出領域(AF枠)を設定したAF処理を行ったり、人物の顔が適正露出になるようにAE処理やフラッシュの調光を行ったりすることができる。また、システム制御部150は、特定の顔に対応する領域を画像処理部124に追尾させることにより、顔の移動に追従したAF(追尾AF)を行うこともできる。また、コントラストAFを実行する場合、システム制御部150は、フォーカスレンズの位置を変えながら撮影を行い、画像処理部124で得られるAF評価値が最大となるフォーカスレンズ位置を探索することができる。 Further, the system control unit 150 performs automatic exposure control (AE) processing based on luminance information obtained from the AE sensor 117 and automatic focus detection (AF) processing based on defocus information obtained from the focus detection unit 111. Further, the system control unit 150 uses information obtained by the image processing unit 124 for AE processing and AF processing as necessary. For example, the system control unit 150 performs an AF process in which a focus detection area (AF frame) is set on a person's face using the detection result of the face area in the image processing unit 124, or the person's face is properly exposed. Thus, AE processing and flash light control can be performed. The system control unit 150 can also perform AF (tracking AF) following the movement of the face by causing the image processing unit 124 to track the area corresponding to the specific face. Further, when executing contrast AF, the system control unit 150 can perform imaging while changing the position of the focus lens, and can search for a focus lens position where the AF evaluation value obtained by the image processing unit 124 is maximized.
 システムタイマー153は各種制御に用いる時間や、内蔵時計の時間を計測する。
 モード切替スイッチ160は、カメラ100の動作モードを静止画記録モード、動画記録モード、再生モード等のいずれかに切り替える。静止画記録モードには、オート撮影モード、オートシーン判別モード、マニュアルモード、絞り優先モード(Avモード)、シャッター速度優先モード(Tvモード)、シーンモード、プログラムAEモード、カスタムモード等がある。シーンモードは、スポーツ、雪景色、花火など、撮影シーンの種類に応じたパラメータの設定が自動的に行われるモードである。これらのモードの少なくとも1つは切り替えスイッチ160によって選択可能であってよい。あるいは、モード切り替えスイッチ160で静止画撮影モードを選択したのち、他の操作部材を用いて静止画撮影モードに含まれる複数のモードのいずれかを選択するようにしてもよい。動画撮影モードにも複数のモードが含まれてもよい。
The system timer 153 measures the time used for various controls and the time of the built-in clock.
The mode switch 160 switches the operation mode of the camera 100 to one of a still image recording mode, a moving image recording mode, a reproduction mode, and the like. Still image recording modes include auto shooting mode, auto scene discrimination mode, manual mode, aperture priority mode (Av mode), shutter speed priority mode (Tv mode), scene mode, program AE mode, custom mode, and the like. The scene mode is a mode in which parameters are automatically set according to the type of shooting scene, such as sports, snow scenes, and fireworks. At least one of these modes may be selectable by the changeover switch 160. Alternatively, after selecting the still image shooting mode with the mode switch 160, any of a plurality of modes included in the still image shooting mode may be selected using another operation member. The video shooting mode may also include a plurality of modes.
 第1シャッタースイッチ162は、シャッターボタン161の半押しでONとなり、第1シャッタースイッチ信号SW1を発生する。システム制御部150は、第1シャッタースイッチ信号SW1を撮影準備の指示として、AF処理、AE処理などの動作を開始する。第2シャッタースイッチ164は、シャッターボタン161の全押しでONとなり、第2シャッタースイッチ信号SW2を発生する。システム制御部150は、第2シャッタースイッチ信号SW2を撮影開始の指示として、撮像部122の露光、読み出しから、画像データの生成、記録媒体185への画像データ書き込みまでの一連の撮影・記録処理を開始する。 The first shutter switch 162 is turned on when the shutter button 161 is half-pressed to generate the first shutter switch signal SW1. The system control unit 150 starts operations such as AF processing and AE processing using the first shutter switch signal SW1 as an instruction for shooting preparation. The second shutter switch 164 is turned on when the shutter button 161 is fully pressed, and generates a second shutter switch signal SW2. The system control unit 150 uses the second shutter switch signal SW2 as an instruction to start shooting, and performs a series of shooting and recording processes from exposure and reading of the imaging unit 122 to generation of image data and writing of image data to the recording medium 185. Start.
 操作部170は、図1A~図1Bに示した操作部材(ダイヤル171~拡大ボタン177)をまとめて示している。これらの操作部材の機能は、例えば、表示部128の表示内容に応じて動的に割り当てされてよい。例えば、メニューボタン、終了ボタン、戻るボタン、画像送りボタン、ジャンプボタン、絞込みボタン、属性変更ボタン等の機能が割り当てられてよい。例えば、メニューボタンが操作される、システム制御部150はユーザーが各種の設定を行うためのメニュー画面を表示部128に表示させる。ユーザーは、例えば十字キー174およびSETボタン175の操作を組み合わせて、あるいはタッチパネル128aの操作により、メニュー画面から各種の設定や指示を入力することができる。なお、 The operation unit 170 collectively shows the operation members (the dial 171 to the enlargement button 177) shown in FIGS. 1A to 1B. The functions of these operation members may be dynamically assigned according to the display content of the display unit 128, for example. For example, functions such as a menu button, an end button, a return button, an image advance button, a jump button, a narrowing button, and an attribute change button may be assigned. For example, when the menu button is operated, the system control unit 150 causes the display unit 128 to display a menu screen for the user to perform various settings. For example, the user can input various settings and instructions from the menu screen by combining the operations of the cross key 174 and the SET button 175 or by operating the touch panel 128a. In addition,
 表示部128に表示する画面に応じて、操作部170に含まれるボタンやキーの操作による動作を予め設定して不揮発性メモリ156に記憶しておくことができる。この設定はユーザーが変更可能であってもよい。 Depending on the screen displayed on the display unit 128, an operation by operating a button or key included in the operation unit 170 can be set in advance and stored in the nonvolatile memory 156. This setting may be changeable by the user.
 電源制御部180は、電池検出回路、DC-DCコンバータ、通電するブロックを切り替えるスイッチ回路等により構成され、電源の装着の有無、電源の種類、電池残量の検出を行う。また、電源制御部180は、これら検出結果およびシステム制御部150の制御に基づいてDC-DCコンバータを制御し、必要な電圧を必要な期間、記録媒体185を含む各部へ供給する。 The power supply control unit 180 includes a battery detection circuit, a DC-DC converter, a switch circuit that switches a block to be energized, and the like, and detects whether or not a power supply is installed, the type of power supply, and the remaining battery level. Further, the power supply control unit 180 controls the DC-DC converter based on these detection results and the control of the system control unit 150, and supplies a necessary voltage to each unit including the recording medium 185 for a necessary period.
 電源部130は、一次電池、二次電池、もしくはACアダプタであってよい。記録媒体I/F118は、記録媒体185とのインターフェースである。記録媒体185は、メモリカードやハードディスク等であってよい。また、着脱可能な記録媒体と固定された記録媒体とが存在してもよい。 The power supply unit 130 may be a primary battery, a secondary battery, or an AC adapter. The recording medium I / F 118 is an interface with the recording medium 185. The recording medium 185 may be a memory card or a hard disk. There may be a detachable recording medium and a fixed recording medium.
 外部I/F193は、外部機器との通信インタフェースであり、ケーブルを接続するためのコネクタを有する。例えば、パーソナルコンピュータなどを接続するためのUSBインタフェース、外部モニタなどを接続するためのHDMI(登録商標)、リモコン200用のインタフェースと、対応するコネクタを備える。通信I/F192は、外部機器との通信インタフェースであり、無線通信用のアンテナや変復調回路などを含む。通信I/F192は、例えば無線LANやBlurtooth(登録商標)などの規格に準じた無線通信を行うためのアンテナ並びに回路を有する。 The external I / F 193 is a communication interface with an external device, and has a connector for connecting a cable. For example, a USB interface for connecting a personal computer or the like, an HDMI (registered trademark) for connecting an external monitor, an interface for the remote control 200, and a corresponding connector are provided. The communication I / F 192 is a communication interface with an external device, and includes an antenna for wireless communication, a modulation / demodulation circuit, and the like. The communication I / F 192 includes, for example, an antenna and a circuit for performing wireless communication in accordance with a standard such as a wireless LAN or Bluetooth (registered trademark).
 本実施形態において、システム制御部150は、固定位置、被写体領域、またはユーザーが指定した任意の位置、に焦点検出領域(AF枠)を設定して自動焦点検出(AF)を実行することが可能である。固定位置は例えば画面の中央であってよい。また、被写体領域とは例えば画像処理部124がライブビュー画像で検出した顔領域であってよい。また、ユーザーが指定した任意の位置とは、例えば十字キー174とSETボタン175の組み合わせ、あるいはタッチパネル128aのタッチ操作によって指定された画像中の位置であってよい。タッチ操作で指定された位置にAF枠を設定してAFを実行することを以下ではタッチAFと呼ぶ。 In the present embodiment, the system control unit 150 can perform automatic focus detection (AF) by setting a focus detection area (AF frame) at a fixed position, a subject area, or an arbitrary position designated by the user. It is. The fixed position may be, for example, the center of the screen. The subject area may be a face area detected by the image processing unit 124 from the live view image, for example. The arbitrary position designated by the user may be, for example, a position in the image designated by a combination of the cross key 174 and the SET button 175 or a touch operation of the touch panel 128a. Hereinafter, setting AF frames at positions designated by touch operations and executing AF is referred to as touch AF.
 タッチ検出手段としてのシステム制御部150はタッチパネル128aのタッチ検知面(表示部128の表示面)への以下の操作、あるいは状態を検出できる。
・タッチパネル128aにタッチしていなかった指やペンが新たにタッチパネル128aにタッチしたこと。すなわち、タッチの開始(以下、タッチダウン(Touch-Down)と称する)。
・タッチパネル128aを指やペンでタッチしている状態であること(以下、タッチオン(Touch-On)と称する)。
・タッチパネル128aを指やペンでタッチしたまま移動したこと(以下、タッチムーブ(Touch-Move)と称する)。
・タッチパネル128aへタッチしていた指やペンを離したこと。すなわち、タッチの終了(以下、タッチアップ(Touch-Up)と称する)。
・タッチパネル128aに何もタッチしていない状態(以下、タッチオフ(Touch-Off)と称する)。
The system control unit 150 serving as a touch detection unit can detect the following operation or state on the touch detection surface (display surface of the display unit 128) of the touch panel 128a.
-A finger or pen that has not touched the touch panel 128a touches the touch panel 128a anew. That is, the start of touch (hereinafter referred to as touch-down).
The touch panel 128a is being touched with a finger or a pen (hereinafter referred to as touch-on).
The touch panel 128a is moved while being touched with a finger or a pen (hereinafter referred to as touch-move).
-The finger or pen that touched the touch panel 128a is released. That is, the end of touch (hereinafter referred to as touch-up).
A state in which nothing touches the touch panel 128a (hereinafter referred to as touch-off).
 タッチダウンが検出されると、同時にタッチオンであることも検出される。タッチダウンの後、タッチアップが検出されない限りは、通常はタッチオンが検出され続ける。タッチムーブが検出されるのもタッチオンが検出されている状態である。タッチオンが検出されていても、タッチ位置が移動していなければタッチムーブは検出されない。タッチしていた全ての指やペンがタッチアップしたことが検出された後は、タッチオフとなる。 When touchdown is detected, it is also detected that touch-on is performed at the same time. After touch-down, unless touch-up is detected, normally touch-on continues to be detected. The touch move is detected in a state where touch-on is detected. Even if the touch-on is detected, the touch move is not detected unless the touch position is moved. After it is detected that all fingers or pens that have been touched are touched up, the touch is turned off.
 これらの操作・状態や、タッチパネル128a上に指やペンがタッチしている位置座標は内部バスを通じてシステム制御部150に通知される。システム制御部150は通知された情報に基づいてタッチパネル128a上にどのような操作(タッチ操作)が行なわれたかを判定する。タッチムーブについてはタッチパネル128a上で移動する指やペンの移動方向についても、位置座標の変化に基づいて、タッチパネル128a上の垂直成分・水平成分毎に判定できる。 These operations / states and the position coordinates where the finger or pen touches the touch panel 128a are notified to the system control unit 150 through the internal bus. The system control unit 150 determines what operation (touch operation) has been performed on the touch panel 128a based on the notified information. Regarding the touch move, the moving direction of the finger or pen moving on the touch panel 128a can also be determined for each vertical component / horizontal component on the touch panel 128a based on the change of the position coordinates.
 所定距離以上をタッチムーブしたことが検出された場合はスライド操作(ドラッグ操作)が行なわれたと判定するものとする。タッチパネル128a上に指をタッチしたまま、ある程度の距離だけ素早く動かして、そのまま離すといった操作をフリックと呼ぶ。フリックは、言い換えればタッチパネル128a上を指ではじくように素早くなぞる操作である。所定距離以上を、所定速度以上でタッチムーブしたことが検出され、そのままタッチアップが検出されるとフリックが行なわれたと判定できる(スライド操作に続いてフリックがあったものと判定できる)。 When it is detected that a touch move has been performed over a predetermined distance, it is determined that a slide operation (drag operation) has been performed. An operation in which a finger is touched on the touch panel 128a, quickly moved by a certain distance, and then released is called a flick. In other words, the flick is an operation of quickly tracing the touch panel 128a with a finger. When it is detected that a touch move is performed at a predetermined speed or more at a predetermined distance or more and a touch-up is detected as it is, it can be determined that a flick has been performed (it can be determined that a flick has occurred following the slide operation).
 更に、複数箇所(例えば2点)を同時にタッチして、互いのタッチ位置を近づけるタッチ操作をピンチイン、互いのタッチ位置を遠ざけるタッチ操作をピンチアウトと称する。ピンチアウトとピンチインを総称してピンチ操作(あるいは単にピンチ)と称する。タッチパネル128aは、抵抗膜方式や静電容量方式、表面弾性波方式、赤外線方式、電磁誘導方式、画像認識方式、光センサ方式等、様々な方式のタッチパネルのうちいずれの方式のものを用いても良い。方式によって、タッチパネルに対する接触があったことでタッチがあったと検出する方式や、タッチパネルに対する指やペンの接近があったことでタッチがあったと検出する方式があるが、いずれの方式でもよい。 Furthermore, a touch operation in which a plurality of locations (for example, two points) are simultaneously touched to bring the touch positions closer to each other is referred to as pinch-in, and a touch operation that moves away from each other is referred to as pinch-out. Pinch-out and pinch-in are collectively referred to as pinch operation (or simply pinch). The touch panel 128a may be any of various types of touch panels such as a resistive film method, a capacitance method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, an image recognition method, and an optical sensor method. good. Depending on the method, there are a method for detecting that there is a touch due to contact with the touch panel, and a method for detecting that there is a touch due to the approach of a finger or pen to the touch panel, but either method may be used.
 次に、リモコン200について説明する。
 不揮発性メモリ256は、例えばEEPROMのように電気的に消去・記録可能なメモリである。不揮発性メモリ256は、リモコン制御部250が実行するためのプログラム、各種の設定値、GUIデータなどを記憶する。
Next, the remote controller 200 will be described.
The nonvolatile memory 256 is an electrically erasable / recordable memory such as an EEPROM. The nonvolatile memory 256 stores a program to be executed by the remote control unit 250, various setting values, GUI data, and the like.
 リモコン制御部250はCPU、MPUなどのプログラマブルプロセッサであり、不揮発性メモリ256に記憶されたプログラムを、システムメモリ252に読み込んで実行することにより、リモコン200の動作を制御する。
 メモリ232は、表示部228のビデオメモリとして用いられ、リモコン制御部250が表示用画像データを格納する。メモリ232に格納されている表示用画像データをD/A変換器213がアナログ信号に変換して表示部228に供給ことで、表示部228で表示用画像が表示される。リモコン制御部250はメモリ232、D/A変換器213、表示部228を制御することにより、表示制御も行う。
The remote controller control unit 250 is a programmable processor such as a CPU or MPU, and controls the operation of the remote controller 200 by reading a program stored in the nonvolatile memory 256 into the system memory 252 and executing it.
The memory 232 is used as a video memory of the display unit 228, and the remote control unit 250 stores display image data. The display image data stored in the memory 232 is converted into an analog signal by the D / A converter 213 and supplied to the display unit 228, whereby the display image is displayed on the display unit 228. The remote control unit 250 also performs display control by controlling the memory 232, the D / A converter 213, and the display unit 228.
 表示部228は画面上にタッチパネル228aが設けられたタッチディスプレイである。カメラ100のタッチパネル128aと同様、タッチパネル228aの方式に制限はない。またタッチパネルの機能が表示部228に組み込まれていてもよい。
 システムタイマー253は各種制御に用いる時間や、内蔵時計の時間を計測する。
The display unit 228 is a touch display provided with a touch panel 228a on a screen. Similar to the touch panel 128a of the camera 100, the method of the touch panel 228a is not limited. Further, the function of the touch panel may be incorporated in the display portion 228.
The system timer 253 measures the time used for various controls and the time of the built-in clock.
 操作部270は、図2における電源スイッチ272以外の各種のキー・ボタン271,273~278をまとめて示している。操作ボタン271の機能は動的に割り当てられてよく、また、現在割り当てられている機能の名称などが表示部228に表示されてもよい。割り当てられる機能の例としては、終了ボタン、戻るボタン、画像送りボタン、ジャンプボタン、絞込みボタン、属性変更ボタンなどがあるが、これらに限定されない。 The operation unit 270 collectively shows various key buttons 271, 273 to 278 other than the power switch 272 in FIG. The function of the operation button 271 may be dynamically assigned, and the name of the currently assigned function may be displayed on the display unit 228. Examples of assigned functions include, but are not limited to, an end button, a return button, an image advance button, a jump button, a narrowing button, and an attribute change button.
 リモコン制御部250は、HOMEボタン276が押下されると、不揮発性メモリ256からGUIデータを読み出し、予め定められたHOME画面を表示部228に表示させる。同様に、リモコン制御部250は、MENUボタン277が押下されると、不揮発性メモリ256からGUIデータを読み出し、予め定められたMENU画面を表示部228に表示させる。 When the HOME button 276 is pressed, the remote control unit 250 reads the GUI data from the nonvolatile memory 256 and displays a predetermined HOME screen on the display unit 228. Similarly, when the MENU button 277 is pressed, the remote control control unit 250 reads the GUI data from the nonvolatile memory 256 and causes the display unit 228 to display a predetermined MENU screen.
 また、リモコン制御部250は、十字キー274の操作に応じてGUI画面におけるフォーカス(選択されている項目の強調表示)を移動させるなど、画面に応じた表示制御を行う。SETボタン275が操作されると、リモコン制御部250はその時点で選択されていた項目に応じた処理を実行したり、画面を遷移させたりする。表示部228に表示する画面に応じて、操作部270に含まれるボタンやキーの操作による動作を予め設定して不揮発性メモリ256に記憶しておくことができる。この設定はユーザーが変更可能であってもよい。 In addition, the remote control unit 250 performs display control corresponding to the screen, such as moving the focus (highlighting of the selected item) on the GUI screen according to the operation of the cross key 274. When the SET button 275 is operated, the remote controller control unit 250 executes processing corresponding to the item selected at that time, or changes the screen. Depending on the screen displayed on the display unit 228, an operation by operating a button or key included in the operation unit 270 can be set in advance and stored in the nonvolatile memory 256. This setting may be changeable by the user.
 電源制御部280は、電池検出回路、DC-DCコンバータ、通電するブロックを切り替えるスイッチ回路等により構成され、電源の装着の有無、電源の種類、電池残量の検出を行う。また、電源制御部280は、これら検出結果およびシステム制御部150の制御に基づいてDC-DCコンバータを制御し、必要な電圧を必要な期間、各部へ供給する。  The power supply control unit 280 includes a battery detection circuit, a DC-DC converter, a switch circuit that switches a block to be energized, and the like, and detects whether or not a power supply is installed, the type of power supply, and the remaining battery level. Further, the power supply control unit 280 controls the DC-DC converter based on the detection result and the control of the system control unit 150, and supplies a necessary voltage to each unit for a necessary period.
 電源部230は、一次電池、二次電池、もしくはACアダプタであってよい。外部I/F293は、カメラ100との通信インタフェースであり、ケーブル291が接続される。カメラ100との通信は、外部I/F293を介し、リモコン制御部250の制御に従って実施される。通信I/F292は、外部機器との通信インタフェースであり、無線通信用のアンテナや変復調回路などを含む。通信I/F292は、例えば無線LANやBlurtooth(登録商標)などの規格に準じた無線通信を行うためのアンテナ並びに回路を有する。 The power supply unit 230 may be a primary battery, a secondary battery, or an AC adapter. The external I / F 293 is a communication interface with the camera 100, and a cable 291 is connected thereto. Communication with the camera 100 is performed under the control of the remote control unit 250 via the external I / F 293. The communication I / F 292 is a communication interface with an external device, and includes an antenna for wireless communication, a modulation / demodulation circuit, and the like. The communication I / F 292 includes an antenna and a circuit for performing wireless communication conforming to a standard such as a wireless LAN or Blurtooth (registered trademark), for example.
 リモコン制御部250は、タッチパネル228aのタッチ検知面(表示部228の表示面)に対するタッチの有無やタッチされた位置の時間変化(速度)を検出し、様々なタッチ操作を認識することができる。ここでは、上述したシステム制御部150と同様のタッチ操作を認識可能であるものとする。 The remote controller 250 can detect various touch operations by detecting the presence or absence of a touch on the touch detection surface of the touch panel 228a (the display surface of the display unit 228) and the temporal change (speed) of the touched position. Here, it is assumed that the same touch operation as that of the system control unit 150 described above can be recognized.
●(動作の説明-カメラ100)
 次に、図4AA~図4CBを用いて、図3Aおよび図3Bに示した撮影システムにおいて、撮影スタンバイ状態や動画記録中のようにカメラ100がライブビュー画像の撮影および表示を行いながら実行する動作について説明する。図4AA~図4CBのフローチャートに示す動作は、システム制御部150が不揮発性メモリ156に記憶されたプログラムをシステムメモリ132に読み込んで実行し、カメラ100の各部を制御することによって実現される。
● (Description of operation-Camera 100)
Next, with reference to FIGS. 4AA to 4CB, in the photographing system shown in FIGS. 3A and 3B, an operation executed while the camera 100 captures and displays a live view image in a photographing standby state or during moving image recording. Will be described. The operations shown in the flowcharts of FIGS. 4AA to 4CB are realized by the system control unit 150 reading the program stored in the nonvolatile memory 156 into the system memory 132 and executing the program, and controlling each unit of the camera 100.
 なお、以下に説明する動作のうち、リモコン200からの入力は、カメラ100の操作部170(タッチパネル128aおよびレンズ140のAF/MFスイッチ148を含む)からの入力であってもよい。また、リモコン200への出力処理はリモコン200が接続されていない場合や、リモコン200が無効な場合などには行わなくてもよい。 Of the operations described below, input from the remote control 200 may be input from the operation unit 170 of the camera 100 (including the touch panel 128a and the AF / MF switch 148 of the lens 140). Further, the output process to the remote controller 200 may not be performed when the remote controller 200 is not connected or when the remote controller 200 is invalid.
 S4001でシステム制御部150は、外部I/F193を介してリモコン200(または操作部170)からAFモードの変更要求を受信したか否かを判定し、受信したと判定されればS4002へ、受信したと判定されなければS4003へ処理を進める。なお、本実施形態において「AFモード」は、オートフォーカス(AF)の動作状態を意味する。また、「AFモードの変更要求」は、AFの動作状態に影響を与える要求、例えばパラメータの値の変更要求などを意味する。本実施形態では、具体例として、顔検出を有効とするか否かの設定、被写体追尾の対象設定、追尾開始、追尾終了の指示、マニュアルフォーカス(MF)とオートフォーカスの切り替え指示をAFモードの変更要求として例示するが、これらに限定されない。 In step S4001, the system control unit 150 determines whether an AF mode change request has been received from the remote controller 200 (or the operation unit 170) via the external I / F 193. If it is determined that the AF mode change request has been received, the processing is received in step S4002. If it is not determined that the process has been performed, the process advances to step S4003. In the present embodiment, the “AF mode” means an operating state of autofocus (AF). The “AF mode change request” means a request that affects the AF operation state, for example, a parameter value change request. In the present embodiment, as specific examples, setting of whether to enable face detection, subject tracking target setting, tracking start, tracking end instruction, manual focus (MF) and autofocus switching instruction are set in the AF mode. Although illustrated as a change request, it is not limited to these.
 本実施形態においては、顔検出が有効である場合、合焦させる被写体をユーザーが選択することができる。また、焦点検出を行う領域(AF枠)の位置を、ユーザーが選択することができる。また、ユーザーが選択した主被写体(顔領域またはAF枠の位置の被写体)を追尾しながらAFを行う追尾AFが可能であり、ユーザーは追尾AFの開始および終了や、追尾対象の領域の変更を行うことができる。 In this embodiment, when face detection is effective, the user can select a subject to be focused. Also, the user can select the position of the area (AF frame) where focus detection is performed. In addition, tracking AF that performs AF while tracking the main subject selected by the user (the subject at the position of the face area or the AF frame) is possible, and the user can start and end tracking AF or change the tracking target area. It can be carried out.
 S4002でシステム制御部150は、AFモード変更要求に従って、AFモードを変更する。システム制御部150は、例えばシステムメモリ152に保持された、現在のAFモードに対応するパラメータ値を変更することによってAFモードを変更することができる。 In S4002, the system control unit 150 changes the AF mode in accordance with the AF mode change request. For example, the system control unit 150 can change the AF mode by changing a parameter value stored in the system memory 152 and corresponding to the current AF mode.
 S4003でシステム制御部150は、システムメモリ152を参照して、現在のAFモードが追尾待機状態か否かを判定し、追尾待機状態と判定されればS4016へ、追尾待機状態と判定されなければS4004へ、それぞれ処理を進める。
 S4004でシステム制御部150は、システムメモリ152を参照して、現在のAFモードが追尾中状態か否かを判定し、追尾中状態と判定されれば図4ABのS4014へ、追尾中状態と判定されなければS4005へ、それぞれ処理を進める。
In S4003, the system control unit 150 refers to the system memory 152 to determine whether or not the current AF mode is the tracking standby state. If it is determined that the tracking standby state is determined, the process proceeds to S4016. The process proceeds to S4004.
In step S4004, the system control unit 150 refers to the system memory 152 to determine whether or not the current AF mode is in the tracking state. If it is determined that the tracking state is in progress, the process proceeds to step S4014 in FIG. If not, the process proceeds to S4005.
 S4005でシステム制御部150は、システムメモリ152を参照して、現在のAFモードが顔検出有効か否かを判定し、顔検出有効と判定されればS4006へ、顔検出有効と判定されなければ図4ABのS4007へ、それぞれ処理を進める。
 S4006でシステム制御部150は、撮像部122で撮像された画像フレームにおいて顔が検出されているか否かを判定し、検出されていると判定されればS4008へ、検出されていると判定されなければ図4ABのS4011へ、それぞれ処理を進める。
In step S4005, the system control unit 150 refers to the system memory 152 to determine whether or not the current AF mode is valid for face detection. If it is determined that face detection is valid, the process proceeds to step S4006. The process proceeds to S4007 in FIG. 4AB.
In S4006, the system control unit 150 determines whether or not a face is detected in the image frame captured by the imaging unit 122. If it is determined that the face is detected, the system control unit 150 determines that the face is detected (S4008). For example, the process proceeds to S4011 in FIG.
 S4007でシステム制御部150は、システムメモリ152を参照して、現在のAFモードがAFかMFかを判定し、AFと判定されれば図4ABのS4011へ、MFと判定されれば図4ABのS4013へ、それぞれ処理を進める。 In step S4007, the system control unit 150 refers to the system memory 152 to determine whether the current AF mode is AF or MF. If it is determined to be AF, the process proceeds to step S4011 in FIG. 4AB. If it is determined to be MF, the system control unit 150 in FIG. The process proceeds to S4013.
 S4008でシステム制御部150は、システムタイマー153によって実現される相対位置タイマーが動作中か否かを判定する。相対位置タイマーは一定時間を計測する手段の一例であり、後述のS4204で計測(カウント)をスタートし、一定時間経過すると(カウントが所定値(例えば0)になると)停止する。システム制御部150は、相対位置タイマーが動作中と判定されればS4011へ、動作中と判定されなければS4009へ、それぞれ処理を進める。 In step S4008, the system control unit 150 determines whether the relative position timer realized by the system timer 153 is operating. The relative position timer is an example of a unit that measures a certain time. Measurement (counting) starts in S4204, which will be described later, and stops when a certain time elapses (when the count reaches a predetermined value (eg, 0)). If it is determined that the relative position timer is in operation, the system control unit 150 proceeds to S4011, and if it is not determined to be in operation, the system control unit 150 proceeds to S4009.
 S4009でシステム制御部150は、外部I/F193を介してリモコン200へ、表示画面の種別(領域選択画面)ならびに検出された個々の顔の情報(例えば識別情報、位置、大きさ)を通知する。なお、複数の顔が検出されている場合、システム制御部150は予め定められた条件に従って複数の顔の1つを主顔として決定し、どの顔が主顔であるかについてもリモコン200へ通知する。主顔の選択条件に特に制限はないが、例えば、画面の中央に最も近い顔、最も大きな顔、距離が近い顔、といった条件の1つ以上を用いることができる。なお、システム制御部150は、ユーザーが主顔として指定した顔が検出されていれば、その顔を主顔として決定する。表示画面の種別を通知する代わりに、表示画面データそのものを送信してもよい。 In step S4009, the system control unit 150 notifies the remote control 200 via the external I / F 193 of the display screen type (area selection screen) and detected individual face information (for example, identification information, position, and size). . When a plurality of faces are detected, the system control unit 150 determines one of the plurality of faces as a main face according to a predetermined condition, and notifies the remote controller 200 which face is the main face. To do. The main face selection condition is not particularly limited, but for example, one or more of the conditions such as the face closest to the center of the screen, the largest face, and the face closest to the distance can be used. Note that if the face designated by the user as the main face is detected, the system control unit 150 determines that face as the main face. Instead of notifying the type of display screen, the display screen data itself may be transmitted.
 図5Aに、S4009の通知に応答してリモコン制御部250が表示部228に表示させる領域選択画面の例を示す。
 5001aは、操作部270に含まれるデバイスの1つに、追尾待機状態への遷移指示機能が割り当てられていることを示すガイド表示である。
 5002aは、操作部270に含まれるデバイスの1つに、AF/MF切替指示機能が割り当てられていることを示すガイド表示である。
 5003aは、カメラ100の現在のフォーカスモードを示す表示であり、図5Aの例では現在のフォーカスモードがAFであることを示している。
FIG. 5A shows an example of an area selection screen that the remote control unit 250 displays on the display unit 228 in response to the notification in S4009.
Reference numeral 5001a denotes a guide display indicating that one of the devices included in the operation unit 270 is assigned the function of instructing the transition to the tracking standby state.
Reference numeral 5002a denotes a guide display indicating that an AF / MF switching instruction function is assigned to one of the devices included in the operation unit 270.
Reference numeral 5003a denotes a display indicating the current focus mode of the camera 100. In the example of FIG. 5A, the current focus mode is AF.
 5004は、操作部270に含まれるデバイスの1つに顔検出の有効・無効切替機能が割り当てられていることを示すガイド表示である。
 5005aは、カメラ100の現在の顔検出機能が有効か無効かを示す表示であり、図5Aの例では、顔検出機能が現在有効であることを示している。
Reference numeral 5004 denotes a guide display indicating that the face detection valid / invalid switching function is assigned to one of the devices included in the operation unit 270.
Reference numeral 5005a denotes a display indicating whether the current face detection function of the camera 100 is valid or invalid. In the example of FIG. 5A, the face detection function is currently valid.
 6003,6004a,6004bは、通知された顔情報に基づいてリモコン制御部250が表示する顔枠であり、検出されている顔の位置と大きさの目安となるマーク表示の一例である。ここでは3つの顔が検出されており、主顔に対応する顔枠6003の表示を他の顔枠6004a,6004bと異ならせている。なお、リモコン制御部250は、顔情報において画像座標で示される顔領域の位置や大きさを、表示部228の解像度に応じて変換して顔枠を表示する。 Reference numerals 6003, 6004a, and 6004b are face frames displayed by the remote control unit 250 based on the notified face information, and are examples of mark display that is a guide for the position and size of the detected face. Here, three faces are detected, and the display of the face frame 6003 corresponding to the main face is different from the other face frames 6004a and 6004b. The remote control unit 250 displays the face frame by converting the position and size of the face area indicated by the image coordinates in the face information according to the resolution of the display unit 228.
 なお、操作部270に含まれる入力デバイスのうち、表示部228の縁部に表示されるガイド表示5001a~5003aと対応するデバイスに特に制限はない。また、タッチパネル228aを用いて、ガイド表示5001a~5003aをそのままタッチ操作できるように構成してもよい。 Of the input devices included in the operation unit 270, there is no particular limitation on devices corresponding to the guide displays 5001a to 5003a displayed on the edge of the display unit 228. Further, the guide displays 5001a to 5003a may be configured to be touched as they are using the touch panel 228a.
 このように、図5Aの画面が表示された状態でユーザーは、
・被写体(顔)選択
・追尾待機状態への遷移指示
・AF/MFの切り替え
・顔検出の有効/無効の切り替え
をリモコン200から行うことができる。
In this way, the user can see the screen of FIG.
The subject (face) selection, the transition instruction to the tracking standby state, the AF / MF switching, and the face detection valid / invalid switching can be performed from the remote controller 200.
 本実施形態においてリモコン200の表示部228にはライブビュー画像を表示しないが、顔情報に基づく顔枠を表示することで、ユーザーはリモコン200から所望の顔を容易に指定することができる。なお、リモコン制御部250は、図5Fに示すように、顔枠の表示を行わない領域選択画面を表示してもよい。この場合、タッチパネル228aに割り当てられている機能と操作方法をユーザーに知らせるためのガイド表示5006aを行うことができる。 In this embodiment, the live view image is not displayed on the display unit 228 of the remote controller 200, but the user can easily designate a desired face from the remote controller 200 by displaying a face frame based on the face information. Note that the remote control unit 250 may display an area selection screen that does not display a face frame, as shown in FIG. 5F. In this case, a guide display 5006a for informing the user of the function and operation method assigned to the touch panel 228a can be performed.
 図5Aの領域選択画面のように、カメラ100が検出した領域を選択可能な画面を表示部228に表示している場合、リモコン200のタッチパネル228aに対する絶対位置操作を有効とする(操作位置の情報を有効とする)。
 そのため、S4010でシステム制御部150は、図4BAを用いて後述する絶対位置操作の処理を行い、処理をS4003に戻す。
When a screen on which the area detected by the camera 100 can be selected is displayed on the display unit 228 as in the area selection screen of FIG. 5A, the absolute position operation on the touch panel 228a of the remote control 200 is validated (operation position information). Is valid).
Therefore, in S4010, the system control unit 150 performs an absolute position operation process to be described later using FIG. 4BA, and returns the process to S4003.
 S4011でシステム制御部150は、外部I/F193を介してリモコン200へ、表示画面の種別(AF枠位置変更画面)を通知する。表示画面の種別を通知する代わりに、表示画面データそのものを送信してもよい。 In S4011, the system control unit 150 notifies the remote controller 200 of the type of display screen (AF frame position change screen) via the external I / F 193. Instead of notifying the type of display screen, the display screen data itself may be transmitted.
 図5Bに、S4011の通知に応答してリモコン制御部250が表示部228に表示させるAF枠位置変更画面の例を示す。この画面は、カメラ100がAF枠の移動操作を受けつける状態であることを示している。図5Aと同一の表示要素には同じ番号を付与し、説明は省略する。 FIG. 5B shows an example of an AF frame position change screen displayed on the display unit 228 by the remote control unit 250 in response to the notification in S4011. This screen indicates that the camera 100 is in a state of accepting an AF frame moving operation. The same display elements as those in FIG. 5A are given the same numbers, and descriptions thereof are omitted.
 5006bは、タッチパネル228aをドラッグ操作することで、AF枠を現在の位置からドラッグ操作の方向と距離に対応した位置へ移動できることを表すガイド表示である。
 このように、AF枠の位置変更に関しては、タッチパネル228aの絶対位置操作ではなく、相対位置操作を有効とする(操作の方向と量の情報を有効とする)。
Reference numeral 5006b denotes a guide display indicating that the AF frame can be moved from the current position to a position corresponding to the direction and distance of the drag operation by dragging the touch panel 228a.
As described above, regarding the position change of the AF frame, the relative position operation is enabled instead of the absolute position operation of the touch panel 228a (operation direction and amount information is enabled).
 S4012でシステム制御部150は、図4BBを用いて後述する相対位置操作の処理を行い、処理をS4003に戻す。 In S4012, the system control unit 150 performs a relative position operation process, which will be described later with reference to FIG. 4BB, and returns the process to S4003.
 S4013でシステム制御部150は、外部I/F193を介してリモコン200へ、表示画面の種別(タッチ操作無効画面)を通知し、処理をS4015に進める。表示画面の種別を通知する代わりに、表示画面データそのものを送信してもよい。 In S4013, the system control unit 150 notifies the remote controller 200 of the type of display screen (touch operation invalid screen) via the external I / F 193, and the process proceeds to S4015. Instead of notifying the type of display screen, the display screen data itself may be transmitted.
 図5Cに、S4013の通知に応答してリモコン制御部250が表示部228に表示させるタッチ操作無効画面の例を示す。この画面は、カメラ100がAF枠や顔枠の操作を受けつけない状態であることを示している。図5Aと同一の表示要素には同じ番号を付与し、説明は省略する。 FIG. 5C shows an example of a touch operation invalidation screen that the remote control unit 250 displays on the display unit 228 in response to the notification in S4013. This screen indicates that the camera 100 is in a state where it cannot accept the operation of the AF frame and the face frame. The same display elements as those in FIG. 5A are given the same numbers, and descriptions thereof are omitted.
 図5Cの例では、追尾待機状態への遷移指示機能が無効であるため、ガイド表示5001bを例えば図5Aに示した有効時とは異なる色で表示している。
 5003bは、現在のフォーカスモードがMFであることを示している。
 5005bは、現在の顔検出機能が無効であることを示している。
 5007cは、タッチパネル228aの操作(ガイド表示5002aおよび5004に対する操作をのぞく)が無効であることを表すガイド表示である。
In the example of FIG. 5C, since the function of instructing the transition to the tracking standby state is invalid, the guide display 5001b is displayed in a color different from the valid state shown in FIG. 5A, for example.
5003b indicates that the current focus mode is MF.
5005b indicates that the current face detection function is invalid.
Reference numeral 5007c denotes a guide display indicating that an operation on the touch panel 228a (except for operations on the guide displays 5002a and 5004) is invalid.
 また、S4014でシステム制御部150は、外部I/F193を介してリモコン200へ表示画面の種別(追尾中画面)を通知し、処理をS4015に進める。表示画面の種別を通知する代わりに、表示画面データそのものを送信してもよい。 In S4014, the system control unit 150 notifies the remote controller 200 of the display screen type (tracking screen) via the external I / F 193, and the process proceeds to S4015. Instead of notifying the type of display screen, the display screen data itself may be transmitted.
 図5Dに、S4014の通知に応答してリモコン制御部250が表示部228に表示させる追尾中画面の例を示す。追尾中画面は、カメラ100が現在、選択された被写体またはAF枠を追尾しながらAF動作を行っている(追尾AF中)であること、タッチ操作(ガイド表示5001cに対する操作をのぞく)が無効であることを報知する画面である。 FIG. 5D shows an example of a tracking screen that the remote control unit 250 displays on the display unit 228 in response to the notification in S4014. The tracking screen indicates that the camera 100 is currently performing an AF operation while tracking the selected subject or AF frame (during tracking AF), and touch operations (except for operations on the guide display 5001c) are invalid. It is a screen which notifies that there is.
 5001cは、操作部270に含まれるデバイスの1つに追尾終了の指示機能が割り当てられていることを示すガイド表示である。
 5007dは、カメラ100のAFモードが追尾AF中であり、タッチパネル228aの操作が無効であることを表すガイド表示である。
A guide display 5001c indicates that one of the devices included in the operation unit 270 is assigned a tracking end instruction function.
Reference numeral 5007d denotes a guide display indicating that the AF mode of the camera 100 is in tracking AF and the operation of the touch panel 228a is invalid.
 S4015でシステム制御部150は、図4CAを用いて後述する位置操作無効の処理を行い、処理をS4003に戻す。 In S4015, the system control unit 150 performs position operation invalidation processing, which will be described later, using FIG. 4CA, and returns the processing to S4003.
 S4016でシステム制御部150は、外部I/F193を介してリモコン200へ表示画面の種別(追尾待機画面)を通知し、処理をS4017に進める。表示画面の種別を通知する代わりに、表示画面データそのものを送信してもよい。 In S4016, the system control unit 150 notifies the remote controller 200 of the type of display screen (tracking standby screen) via the external I / F 193, and the process proceeds to S4017. Instead of notifying the type of display screen, the display screen data itself may be transmitted.
 図5Eに、S4016の通知に応答してリモコン制御部250が表示部228に表示させる追尾待機画面の例を示す。追尾待機画面は、カメラ100のAFモードが追尾待機状態であること、ドラッグ操作によって追尾待機枠を移動可能であることを報知する画面である。 FIG. 5E shows an example of a tracking standby screen displayed on the display unit 228 by the remote control unit 250 in response to the notification in S4016. The tracking standby screen is a screen for notifying that the AF mode of the camera 100 is in the tracking standby state and that the tracking standby frame can be moved by a drag operation.
 5001dは、操作部270に含まれるデバイスの1つに追尾開始指示機能が割り当てられていることを示すガイド表示である。
 5002bは、操作部270に含まれるデバイスの1つに追尾のキャンセル指示機能が割り当てられていることを示すガイド表示である。
 5006eは、カメラ100のAFモードが追尾待機状態であり、タッチパネル228aをドラッグ操作することで、追尾待機枠を現在の位置からドラッグ操作の方向と距離に対応した位置へ移動できることを表すガイド表示である。なお、追尾待機枠は、追尾AFを行う対象の顔またはAF枠であり、追尾開始の指示がなされると、カメラ100は指示があった時点における追尾待機枠を追尾対象とした追尾AFを開始する。
A guide display 5001d indicates that the tracking start instruction function is assigned to one of the devices included in the operation unit 270.
Reference numeral 5002b denotes a guide display indicating that the tracking cancel instruction function is assigned to one of the devices included in the operation unit 270.
A guide display 5006e indicates that the AF mode of the camera 100 is in the tracking standby state, and that the tracking standby frame can be moved from the current position to a position corresponding to the direction and distance of the drag operation by dragging the touch panel 228a. is there. The tracking standby frame is a target face or AF frame for which tracking AF is performed. When an instruction to start tracking is given, the camera 100 starts tracking AF with the tracking standby frame at the time of the instruction as a tracking target. To do.
 S4017でシステム制御部150は、図4CBを用いて後述する追尾待機操作の処理を行い、処理をS4003に戻す。 In S4017, the system control unit 150 performs a tracking standby operation process to be described later with reference to FIG. 4CB, and returns the process to S4003.
 図4BAは、システム制御部150がS4010で実行する絶対位置操作の処理の詳細を示すフローチャートである。
 S4101でシステム制御部150は、外部I/F193を介してリモコン200へ、絶対位置操作モードでタッチパネル228aを動作させるよう通知する。絶対位置操作モードは、タッチ操作の位置(座標)を有効とするモードである。
FIG. 4BA is a flowchart illustrating details of the absolute position operation process executed by the system control unit 150 in step S4010.
In step S4101, the system control unit 150 notifies the remote controller 200 via the external I / F 193 to operate the touch panel 228a in the absolute position operation mode. The absolute position operation mode is a mode in which the position (coordinates) of the touch operation is validated.
 S4102でシステム制御部150は、必要に応じて顔の検出状態を更新する。
 S4103でシステム制御部150は、現在検出されている個々の顔の情報(識別情報、位置、大きさ)を外部I/F193を介してリモコン200へ送信する。なお、S4009で顔の情報をリモコン200に送信しているため、S4102およびS4103の処理は行わなくてもよい。また、S4103の処理はS4102でS4009の実行時から顔検出結果が異なる場合のみ実行するようにしてもよい。
In step S4102, the system control unit 150 updates the face detection state as necessary.
In step S4103, the system control unit 150 transmits information (identification information, position, and size) of each currently detected face to the remote control 200 via the external I / F 193. Note that since the face information is transmitted to the remote controller 200 in S4009, the processing in S4102 and S4103 may not be performed. Further, the process of S4103 may be executed only when the face detection result is different from the execution of S4009 in S4102.
 S4104でシステム制御部150は、リモコン200から、選択された顔の識別情報を受信したか否かを判定し、受信したと判定されればS4105へ、受信したと判定されなければS4107へ、それぞれ処理を進める。
 S4105でシステム制御部150は、S4104で受信した顔の識別情報に対応する顔を主顔として決定し、処理をS4106に進める。
 S4106でシステム制御部150は、例えば主顔を含む領域にAF枠を設定してフォーカス制御(焦点検出)を行い、処理をS4107に進める。
In S4104, the system control unit 150 determines whether or not the identification information of the selected face has been received from the remote controller 200. If it is determined that it has been received, the process proceeds to S4105. If it is not determined to be received, the process proceeds to S4107. Proceed with the process.
In step S4105, the system control unit 150 determines the face corresponding to the face identification information received in step S4104 as the main face, and advances the process to step S4106.
In step S4106, for example, the system control unit 150 sets an AF frame in an area including the main face, performs focus control (focus detection), and advances the process to step S4107.
 S4107でシステム制御部150は、例えば図6Aに示すような、検出された顔を示すマーク表示(顔枠)を含んだライブビュー画面を表示部128へ表示し、絶対位置操作の処理を終了する。
 6001は、動画撮影中もしくは動画撮影の一時停止中であることを示す状態表示である。撮影スタンバイ状態の場合には表示されない。ここでは動画撮影中の表示例を示している。
 6002は、撮影中の動画のタイムコードを示す表示である。
 6003は、検出された顔のうち、主顔を示す枠表示である。
 6004aおよび6004bはいずれも、検出された顔のうち、主顔以外の顔を示す枠表示であり、主顔枠6003とは異なる態様で表示されている。
 6005aおよび6005bおよび6005cは、撮像部122にて撮像された画像に含まれる、被写体検出処理の検出対象(ここでは人間の顔)に対応する被写体である。
In step S4107, the system control unit 150 displays a live view screen including a mark display (face frame) indicating the detected face on the display unit 128 as illustrated in FIG. 6A, for example, and ends the absolute position operation process. .
A status display 6001 indicates that moving image shooting is being performed or that moving image shooting is being paused. It is not displayed in shooting standby mode. Here, a display example during moving image shooting is shown.
Reference numeral 6002 denotes a display showing the time code of the moving image being shot.
Reference numeral 6003 denotes a frame display indicating the main face among the detected faces.
Both 6004a and 6004b are frame displays indicating faces other than the main face among the detected faces, and are displayed in a manner different from the main face frame 6003.
Reference numerals 6005a, 6005b, and 6005c are subjects corresponding to detection targets (here, human faces) of the subject detection process included in the image captured by the imaging unit 122.
 図4BBは、システム制御部150がS4012で実行する相対位置操作の処理の詳細を示すフローチャートである。
 S4201でシステム制御部150は、外部I/F193を介してリモコン200へ、相対位置操作モードでタッチパネル228aを動作させるよう通知する。相対位置操作モードは、タッチ操作の位置ではなく、タッチ操作の位置の移動量、移動方向を有効とするモードである。
FIG. 4BB is a flowchart illustrating details of the relative position operation process executed by the system control unit 150 in step S4012.
In step S4201, the system control unit 150 notifies the remote controller 200 via the external I / F 193 to operate the touch panel 228a in the relative position operation mode. The relative position operation mode is a mode in which not the touch operation position but the movement amount and the movement direction of the touch operation position are validated.
 S4202でシステム制御部150は、リモコン200からタッチ操作の移動量および移動方向に関する情報(移動情報)を受信したか否かを判定し、受信したと判定されればS4203へ、受信したと判定されなければS4205へ、それぞれ処理を進める。
 S4203でシステム制御部150は、現在のAF枠の位置と、S4202で受信した移動情報とから、AF枠の移動先位置を決定し、処理をS4204に進める。
 S4204でシステム制御部150は、相対位置タイマーをリセットし、カウントをスタートし、処理をS4205に進める。上述したように、相対位置タイマーは一定時間を計測するためのタイマーであり、システムタイマー153を用いて実現される。相対位置タイマーはカウントアップタイマーであってもカウントダウンタイマーであってもよい。
In step S4202, the system control unit 150 determines whether or not information (movement information) regarding the movement amount and direction of the touch operation has been received from the remote controller 200. If it is determined that the information has been received, the system control unit 150 determines that the information has been received. If not, the process proceeds to S4205.
In step S4203, the system control unit 150 determines the movement destination position of the AF frame from the current position of the AF frame and the movement information received in step S4202, and advances the process to step S4204.
In step S4204, the system control unit 150 resets the relative position timer, starts counting, and advances the process to step S4205. As described above, the relative position timer is a timer for measuring a certain time, and is realized using the system timer 153. The relative position timer may be a count-up timer or a count-down timer.
 相対位置操作モードと絶対位置操作モードが頻繁に切り替わると、AF枠の位置が頻繁に移動し、ユーザーの意図しないフォーカス制御が行われる原因になる。相対位置タイマーで計測する一定時間は、相対位置操作と絶対位置操作とを切り替える(遷移させる)ための時間であり、違和感のない切り替えには数秒程度が適切と考えられる。システム制御部150は、相対位置タイマーの初期値と目標値とを設定し、カウントを開始させる。一定時間を計測する代わりに、S4205によるフォーカス制御が安定した状態になったことをカウントの開始または終了の条件としてもよい。タッチパネル228aの相対位置操作モードと絶対位置操作モードとの切り替えが頻繁に発生しないようにすることで、違和感のないフォーカス制御を実現することができる。 When the relative position operation mode and the absolute position operation mode are frequently switched, the position of the AF frame is frequently moved, which causes focus control unintended by the user. The fixed time measured by the relative position timer is a time for switching (transitioning) between the relative position operation and the absolute position operation, and about several seconds is considered appropriate for switching without a sense of incongruity. The system control unit 150 sets an initial value and a target value of the relative position timer, and starts counting. Instead of measuring the fixed time, the condition for starting or ending counting may be that the focus control in S4205 is in a stable state. By preventing frequent switching between the relative position operation mode and the absolute position operation mode of the touch panel 228a, it is possible to realize focus control without a sense of incongruity.
 S4205でシステム制御部150は、現在のAF枠位置に基づいてフォーカス制御を行い、処理をS4206に進める。
 S4206でシステム制御部150は、例えば図6Bに示すような、現在のAF枠を示すマーク表示を含んだライブビュー画面を表示部128へ表示し、相対位置操作の処理を終了する。なお、図6Bにおいて、図6Aと同一の表示要素には同じ参照番号を付与し、重複する説明は省略する。
 6006は、現在のAF枠位置を示す表示である。
 6007は、画像中の被写体である。
In step S4205, the system control unit 150 performs focus control based on the current AF frame position, and advances the process to step S4206.
In step S4206, the system control unit 150 displays a live view screen including a mark display indicating the current AF frame on the display unit 128 as illustrated in FIG. 6B, for example, and ends the relative position operation processing. In FIG. 6B, the same reference numerals are assigned to the same display elements as in FIG. 6A, and duplicate descriptions are omitted.
Reference numeral 6006 denotes a display indicating the current AF frame position.
Reference numeral 6007 denotes a subject in the image.
 図4CAは、システム制御部150がS4015で実行する位置操作無効の処理の詳細を示すフローチャートである。
 S4301でシステム制御部150は、外部I/F193を介してリモコン200へ、タッチパネル228aの絶対位置操作および相対位置操作が無効である旨を通知する。
 S4302でシステム制御部150は、S4004と同様に、AFモードが追尾中か否かを判定し、追尾中と判定されればS4303へ、追尾中と判定されなければS4305へ、それぞれ処理を進める。
FIG. 4CA is a flowchart illustrating details of the position operation invalidation process executed by the system control unit 150 in step S4015.
In step S4301, the system control unit 150 notifies the remote control 200 via the external I / F 193 that the absolute position operation and the relative position operation of the touch panel 228a are invalid.
In S4302, similarly to S4004, the system control unit 150 determines whether the AF mode is being tracked. If it is determined that tracking is in progress, the process proceeds to S4303. If not determined to be tracking, the process proceeds to S4305.
 S4305でシステム制御部150は、例えば図6Cに示すような、枠表示のないライブビュー画面を表示部128へ表示し、位置操作無効の処理を終了する。図6Cにおいて、図6Aと同一の表示要素には同じ参照番号を付与し、重複する説明は省略する。追尾中でない場合、ライブビュー画面には、検出された被写体領域を示す枠6003、6004a、6004b(図6A)およびAF枠6006(図6B)は表示されない。 In S4305, the system control unit 150 displays a live view screen without a frame display on the display unit 128 as shown in FIG. 6C, for example, and ends the position operation invalidation process. In FIG. 6C, the same reference numerals are assigned to the same display elements as in FIG. 6A, and duplicate descriptions are omitted. When tracking is not in progress, frames 6003, 6004a, 6004b (FIG. 6A) and AF frame 6006 (FIG. 6B) indicating the detected subject area are not displayed on the live view screen.
 S4303でシステム制御部150は、追尾処理において被写体を見失った(追尾に失敗した)か否かを判定し、追尾に失敗したと判定されればS4304へ、追尾に失敗したと判定されなければS4306へ、それぞれ処理を進める。
 S4304でシステム制御部150は、追尾処理を終了し、システムメモリ152に保持された、AFモードに対応するパラメータ値を追尾なしの状態へ変更し、位置操作無効の処理を終了する。これにより、再度追尾の設定がなされるまで、追尾なしのフォーカス制御が実施される。
In S4303, the system control unit 150 determines whether or not the subject has been lost in the tracking process (tracking has failed). If it is determined that tracking has failed, the process proceeds to S4304. If tracking is not determined to have failed, S4306 is determined. And proceed with each process.
In step S4304, the system control unit 150 ends the tracking process, changes the parameter value corresponding to the AF mode stored in the system memory 152 to a state without tracking, and ends the position operation invalidation process. Thus, focus control without tracking is performed until tracking is set again.
 S4306でシステム制御部150は、画像処理部124によって検出された、追尾中の被写体の移動位置にAF枠の設定を更新する。
 S4307でシステム制御部150は、S4306で更新したAF枠に対してフォーカス制御を行う。
 S4308でシステム制御部150は、例えば図6Dに示すような、追尾対象の被写体を示すマーク表示(追尾枠)を含んだライブビュー画面を表示部128へ表示して位置操作無効の処理を終了する。図6Dにおいて、図6Aと同一の表示要素には同じ参照番号を付与し、重複する説明は省略する。
 6008は、追尾中の被写体を示す枠(追尾枠)表示である。
In step S4306, the system control unit 150 updates the AF frame setting to the movement position of the subject that is being detected, which is detected by the image processing unit 124.
In step S4307, the system control unit 150 performs focus control on the AF frame updated in step S4306.
In step S4308, the system control unit 150 displays, on the display unit 128, a live view screen including a mark display (tracking frame) indicating the tracking target subject as illustrated in FIG. . In FIG. 6D, the same reference numerals are assigned to the same display elements as in FIG. 6A, and duplicate descriptions are omitted.
Reference numeral 6008 denotes a frame (tracking frame) display indicating the subject being tracked.
 図4CBは、システム制御部150がS4017で実行する追尾待機操作の処理の詳細を示すフローチャートである。
 S4401でシステム制御部150は、外部I/F193を介してリモコン200へ、相対位置操作モードでタッチパネル228aを動作させるよう通知する。
 S4402でシステム制御部150は、リモコン200からタッチ操作の移動量および移動方向に関する情報(移動情報)を受信したか否かを判定し、受信したと判定されればS4403へ、受信したと判定されなければS4404へ、それぞれ処理を進める。
FIG. 4CB is a flowchart showing details of the tracking standby operation process executed by the system control unit 150 in S4017.
In step S4401, the system control unit 150 notifies the remote controller 200 via the external I / F 193 to operate the touch panel 228a in the relative position operation mode.
In S4402, the system control unit 150 determines whether or not information (movement information) regarding the movement amount and movement direction of the touch operation has been received from the remote controller 200. If it is determined that the information has been received, the system control unit 150 determines that the information has been received. If not, the process proceeds to S4404.
 S4403でシステム制御部150は、現在の追尾枠(AF枠)の位置と、S4402で受信した移動情報とから、追尾枠の移動先位置を決定し、処理をS4404に進める。S4404でシステム制御部150は、例えば図6Dに示すような、追尾開始時に追尾対象となる被写体を示す表示(追尾待機枠)を含んだライブビュー画面を表示部128へ表示し、追尾待機操作の処理を終了する。図6Bにおいて、図6Aと同一の表示要素には同じ番号を付与しているため、説明は省略する。
 図6Eの6009は、移動後の追尾枠(追尾待機枠)の位置を示す表示である。なお、この時点では追尾動作は開始されていない。システム制御部150は、リモコン200から追尾開始の指示を受信すると、その時点における追尾待機枠を追尾枠とした追尾処理を開始する。
In S4403, the system control unit 150 determines the movement destination position of the tracking frame from the current position of the tracking frame (AF frame) and the movement information received in S4402, and advances the process to S4404. In step S4404, the system control unit 150 displays a live view screen including a display (tracking standby frame) indicating a subject to be tracked at the start of tracking on the display unit 128 as illustrated in FIG. 6D, for example. The process ends. In FIG. 6B, the same number is assigned to the same display element as in FIG.
6E in FIG. 6E is a display showing the position of the tracking frame (tracking standby frame) after movement. At this time, the tracking operation has not started. When the system control unit 150 receives an instruction to start tracking from the remote controller 200, the system control unit 150 starts tracking processing using the tracking standby frame at that time as the tracking frame.
 図7は、図4AA~図4CBを用いて説明したカメラ100の動作と連動してリモコン200が実行する動作を示すフローチャートである。図7のフローチャートに示す動作は、リモコン制御部250が不揮発性メモリ256に記憶されたプログラムに基づいて、リモコン200の各部を制御することによって実現される。 FIG. 7 is a flowchart showing an operation performed by the remote controller 200 in conjunction with the operation of the camera 100 described with reference to FIGS. 4AA to 4CB. The operation shown in the flowchart of FIG. 7 is realized by the remote controller control unit 250 controlling each unit of the remote controller 200 based on a program stored in the nonvolatile memory 256.
 S7001でリモコン制御部250は、外部I/F293を介してカメラ100から表示部228に表示すべき画面の情報、タッチパネル228aの操作モード、検出されている顔の情報を受信する。ここで、画面の情報はカメラ100がS4009、S4011、S4013、S4014、S4016のいずれかで通知する情報である。また、タッチパネル228aのモードはカメラ100がS4101、S4201、S4301、S4401のいずれかで通知する情報である。また、顔の情報はカメラ100がS4009、S4103で通知する情報である。タッチパネル228aは、絶対位置操作モード(第1の操作モード)、相対位置操作モード(第2の操作モード)を含む複数の操作モードを有する。リモコン制御部250は、カメラ100から受信した操作モードに応じて、複数の操作モードのいずれかをタッチパネル228aに設定する。 In step S7001, the remote control unit 250 receives information on the screen to be displayed on the display unit 228, the operation mode of the touch panel 228a, and information on the detected face from the camera 100 via the external I / F 293. Here, the screen information is information that the camera 100 notifies in any one of S4009, S4011, S4013, S4014, and S4016. The mode of the touch panel 228a is information that the camera 100 notifies in any of S4101, S4201, S4301, and S4401. The face information is information notified by the camera 100 in steps S4009 and S4103. The touch panel 228a has a plurality of operation modes including an absolute position operation mode (first operation mode) and a relative position operation mode (second operation mode). Remote control unit 250 sets one of a plurality of operation modes on touch panel 228a in accordance with the operation mode received from camera 100.
 S7002でリモコン制御部250は、S7001で受信した画面の識別情報(または画面データ)に基づいて、図5A~図5Fのいずれかに示した画面を表示部228に表示する。
 S7003でリモコン制御部250は、表示中の画面に応じた有効なデバイス操作(タッチ操作以外の操作)が検知されたか否かを判定し、検知されたと判定されればS7004へ、検知されたと判定されなければS7005へ、それぞれ処理を進める。例えば図5Aの画面を表示中であれば、ガイド表示5001a、5002a、5004に割り当てられたデバイス(操作部270に含まれるボタン、キーのいずれか)の操作が、有効な操作に含まれる。表示画面と有効な操作とが関連付けられて、リモコン200の例えば不揮発性メモリ256に保存されており、リモコン制御部250は検知された操作が現在の表示画面に対して有効な操作かどうかを、不揮発性メモリ256を参照して判定することができる。 
In step S7002, the remote control unit 250 displays the screen shown in any of FIGS. 5A to 5F on the display unit 228 based on the screen identification information (or screen data) received in step S7001.
In step S7003, the remote control control unit 250 determines whether an effective device operation (operation other than the touch operation) corresponding to the screen being displayed has been detected. If it is determined that the detection has been detected, the control proceeds to step S7004. If not, the process proceeds to S7005. For example, when the screen of FIG. 5A is being displayed, an operation of a device (any of buttons and keys included in the operation unit 270) assigned to the guide displays 5001a, 5002a, and 5004 is included in the effective operation. The display screen and the effective operation are associated with each other and stored in, for example, the non-volatile memory 256 of the remote controller 200. The remote control unit 250 determines whether the detected operation is an effective operation for the current display screen. The determination can be made with reference to the nonvolatile memory 256.
 S7004でリモコン制御部250は、S7003で検知された有効な操作に応じたAFモード変更要求を生成して外部I/F293を介してカメラ100へ送信し、処理をS7001に戻す。 In step S7004, the remote control control unit 250 generates an AF mode change request corresponding to the valid operation detected in step S7003, transmits the AF mode change request to the camera 100 via the external I / F 293, and returns the process to step S7001.
 S7005でリモコン制御部250は、現在設定されている操作モードが絶対位置操作モードであるか否かを判定し、絶対位置操作モードと判定されればS7006へ、絶対位置操作モードと判定されなければS7009へ、それぞれ処理を進める。
 S7006でリモコン制御部250は、ガイド表示5001a、5002a、5004以外の表示部228の領域内の位置に対するタッチパネル228aのタッチダウン操作を検知したか否かを判定する。なお、ここではタッチダウン操作が検知された位置によって主顔を設定するものとしているが、タッチアップ、ダブルタップ操作など、他のタッチ操作が検知された位置を用いてもよい。リモコン制御部250は、特定のタッチ操作(ここではタッチダウン操作)を検知したと判定されればS7007へ処理を進め、検知したと判定されなければS7001へ処理を戻す。
In S7005, the remote controller control unit 250 determines whether or not the currently set operation mode is the absolute position operation mode. If it is determined to be the absolute position operation mode, the process proceeds to S7006, and if it is not determined to be the absolute position operation mode. The process proceeds to S7009.
In step S7006, the remote control control unit 250 determines whether a touchdown operation of the touch panel 228a with respect to a position in the display unit 228 other than the guide displays 5001a, 5002a, and 5004 has been detected. Here, the main face is set according to the position where the touch-down operation is detected, but a position where another touch operation such as touch-up or double-tap operation is detected may be used. If it is determined that a specific touch operation (here, a touch-down operation) has been detected, remote control control unit 250 advances the process to S7007, and if not detected, returns the process to S7001.
 S7007でリモコン制御部250は、S7006で検知したタッチ操作が行われた位置(座標)と、S7001で受信した顔情報(もしくは各顔枠の表示位置と大きさ)とから、タッチ操作によって指定された顔を判定し、処理をS7008に進める。より具体的には、タッチ操作が行われた位置(座標)に対応する顔情報が存在する(タッチ操作が行われた位置が顔枠内である)場合に、対応する顔情報を指定された顔と判定する。タッチ操作が行われた位置(タッチ位置)にいずれの顔も存在しない場合は、S7007、S7008の処理では顔を選択しない。
 S7008でリモコン制御部250は、S7007で判定した顔の識別情報を、選択された顔の情報として外部I/F293を介してカメラ100へ送信し、処理をS7001に戻す。
In S7007, the remote control unit 250 is designated by the touch operation from the position (coordinates) where the touch operation detected in S7006 is performed and the face information (or display position and size of each face frame) received in S7001. The face is determined, and the process proceeds to S7008. More specifically, when face information corresponding to the position (coordinates) where the touch operation is performed exists (when the position where the touch operation is performed is within the face frame), the corresponding face information is specified. Judge as a face. If no face exists at the position where the touch operation is performed (touch position), no face is selected in the processes of S7007 and S7008.
In step S7008, the remote control control unit 250 transmits the face identification information determined in step S7007 to the camera 100 via the external I / F 293 as the selected face information, and the process returns to step S7001.
 S7009でリモコン制御部250は、現在設定されている操作モードが相対位置操作モードであるか否かを判定し、相対位置操作モードと判定されればS7010へ処理を進め、相対位置操作モードと判定されなければS7001へ処理を戻す。
 S7010でリモコン制御部250は、タッチパネル228aに対するドラッグ操作(タッチ位置の移動距離が所定距離以上であるタッチムーブ)を検知したか否かを判定する。そして、リモコン制御部250は、検知したと判定されればS7011へ処理を進め、検知したと判定されなければS7001へ処理を戻す。ここで、ドラッグ操作は、相対位置操作モードが指定される表示画面に関連付けられた有効なタッチ操作の一例であり、フリック操作など他のタッチ操作であってもよい。
In step S7009, the remote control unit 250 determines whether the currently set operation mode is the relative position operation mode. If it is determined that the operation mode is the relative position operation mode, the process advances to step S7010 to determine the relative position operation mode. If not, the process returns to S7001.
In step S7010, the remote control control unit 250 determines whether a drag operation on the touch panel 228a (a touch move in which the moving distance of the touch position is a predetermined distance or more) is detected. If it is determined that the remote control unit 250 has detected, the process proceeds to step S7011, and if it is not determined that it has been detected, the process returns to step S7001. Here, the drag operation is an example of an effective touch operation associated with the display screen in which the relative position operation mode is specified, and may be another touch operation such as a flick operation.
 S7011でリモコン制御部250は、S7010で検知したドラッグ操作の方向と距離を枠移動情報に変換する。なお、タッチパネル228aで検知されたドラッグ操作の距離を撮像画像上の距離に変換する際には、タッチパネル228aのドラッグ操作に対するAF枠や追尾待機枠の移動量が過度に大きくならないよう、適切な倍率を適用する。例えば、移動距離が画素数で表され、表示部228の画素数が水平および垂直方向で撮像画像の解像の画素数の1/10とする。この場合、タッチパネル228aに対するドラッグ操作の移動量を単純に変換すると、水平方向および垂直方向においてドラッグ操作量の10倍の移動量に変換されるため、位置の微調整は容易でない。そのため、リモコン制御部250は、水平または垂直方向における、撮像画像の画素数/表示部228の画素数よりも小さい倍率を用いてドラッグ操作の距離を変換するようにする。リモコン200とカメラ100とが接続された際に、システム制御部150とリモコン制御部250との間で情報の交換を行って倍率の初期値を決定するようにしてもよい。従って、リモコン制御部250からは操作量をそのままカメラ100に送信し、S7011の変換処理をシステム制御部150が行ってもよい。 In S7011, the remote control unit 250 converts the direction and distance of the drag operation detected in S7010 into frame movement information. When the distance of the drag operation detected by the touch panel 228a is converted into a distance on the captured image, an appropriate magnification is set so that the movement amount of the AF frame and the tracking standby frame with respect to the drag operation of the touch panel 228a does not become excessively large. Apply. For example, the moving distance is represented by the number of pixels, and the number of pixels of the display unit 228 is set to 1/10 of the number of pixels of resolution of the captured image in the horizontal and vertical directions. In this case, if the movement amount of the drag operation with respect to the touch panel 228a is simply converted, it is converted into a movement amount that is 10 times the drag operation amount in the horizontal direction and the vertical direction, so that fine adjustment of the position is not easy. Therefore, the remote control unit 250 converts the distance of the drag operation using a magnification smaller than the number of pixels of the captured image / the number of pixels of the display unit 228 in the horizontal or vertical direction. When the remote controller 200 and the camera 100 are connected, information may be exchanged between the system control unit 150 and the remote control unit 250 to determine the initial value of the magnification. Therefore, the operation amount may be transmitted as it is from the remote controller 250 to the camera 100, and the system controller 150 may perform the conversion process of S7011.
 なお、ドラッグ操作の距離が画素数でなく、実際の移動距離で得られる場合は、水平(垂直)方向における移動距離/タッチパネルの大きさよりも、水平(垂直)方向における変換後の移動距離/撮像画像の大きさが小さくなるようにすればよい。ただし、撮像画像の大きさは、所定のppi(例えば72ppiや96ppi)で換算した大きさとする。  When the distance of the drag operation is not the number of pixels but the actual moving distance, the moving distance in the horizontal (vertical) direction / the moving distance after conversion in the horizontal (vertical) direction / imaging rather than the moving distance in the horizontal (vertical) direction / touch panel size. What is necessary is just to make the magnitude | size of an image small. However, the size of the captured image is assumed to be a size converted by a predetermined ppi (for example, 72 ppi or 96 ppi).
 また、表示部228の画素ピッチが非常に小さい場合には、72ppiや96ppiとした場合の画素数にドラッグ操作の距離を変換してから、さらに撮像画像上の距離に変換してもよい。なお、タッチパネル228aのドラッグ操作に対するAF枠や追尾待機枠の移動量の感度を動的に調整する機能を操作部270に含まれるデバイスのいずれかに割り当ててもよい。 In addition, when the pixel pitch of the display unit 228 is very small, the distance of the drag operation may be converted into the number of pixels when 72 ppi or 96 ppi is set, and further converted into the distance on the captured image. Note that a function of dynamically adjusting the sensitivity of the movement amount of the AF frame or the tracking standby frame with respect to the drag operation of the touch panel 228a may be assigned to any of the devices included in the operation unit 270.
 S7012でリモコン制御部250は、S7011で生成した枠移動情報を、外部I/F293を介してカメラ100へ送信し、処理をS7001に戻す。 In S7012, the remote control unit 250 transmits the frame movement information generated in S7011 to the camera 100 via the external I / F 293, and returns the process to S7001.
 以上説明したように本実施形態によれば、例えば特定の被写体を選択する場合のように、カメラの機能を用いて検出された領域の選択を行う場合には、タッチパネルのような接触感知デバイスに対する絶対位置指定を有効とする。一方で、カメラの機能を用いて領域が検出されていない場合や、AF枠や追尾枠の移動を指示する場合については、接触感知デバイスに対する絶対位置指定を無効とし、相対位置指定を有効とするようにした。換言すれば、接触感知デバイスに対するタッチ操作の位置を有効とするか、操作方向や量を有効とするかを、カメラによって特定の領域が検出されているか否かによって動的に切り替えるようにした。 As described above, according to the present embodiment, when a region detected using a camera function is selected, for example, when a specific subject is selected, a touch sensing device such as a touch panel is used. The absolute position specification is valid. On the other hand, when the area is not detected using the camera function or when the movement of the AF frame or the tracking frame is instructed, the absolute position designation with respect to the touch sensing device is invalidated and the relative position designation is valid. I did it. In other words, whether the position of the touch operation on the touch sensing device is valid or whether the operation direction or amount is valid is dynamically switched depending on whether or not a specific area is detected by the camera.
 カメラが検出した領域を選択する場合には、領域内もしくは領域近傍の任意の位置の指定によって、小型もしくは画像が表示されない接触感知デバイスでも正確な位置が指定できる。そのため、接触感知デバイスに対するタッチ操作の位置を有効とする。一方、それ以外の場合には、タッチ操作の位置ではなく、操作方向や操作量の少なくとも一方を有効とする。それにより、小型もしくは画像が表示されない接触感知デバイスで画像の絶対位置を指定することによる意図しない結果を防止しつつ、タッチ操作の利便性を享受することができる。これらの切り替えをカメラが自動的に行うため、さらに利便性が高い。また、被写体追尾を行う場合、追尾対象の被写体が検出されているか否かに関わらず、追尾待機状態から追尾を開始し、終了するまでを共通の動作手順で実現することができる。 When selecting an area detected by the camera, an accurate position can be specified even with a small or touch-sensitive device that does not display an image by specifying an arbitrary position within or near the area. Therefore, the position of the touch operation with respect to the touch sensing device is validated. On the other hand, in other cases, not the position of the touch operation but at least one of the operation direction and the operation amount is validated. Thereby, the convenience of the touch operation can be enjoyed while preventing an unintended result by designating the absolute position of the image with a small size or a touch-sensitive device on which no image is displayed. Since the camera automatically switches between these, it is even more convenient. Further, when subject tracking is performed, a common operation procedure can be used to start and end tracking from a tracking standby state regardless of whether or not a tracking target subject is detected.
 より詳しくは、絶対位置指定が有効な(絶対位置操作モードが設定されている)場合に、タッチダウン(タッチの開始)が検出されると、リモコン制御部250は、タッチダウンが検出された位置(タッチダウン位置)が何れかの顔情報に対応する位置であるか(いずれかの顔枠内に対するタッチダウンであったか否か)を判定する。タッチダウン位置が何れかの顔情報に対応する位置であった場合、リモコン制御部250は、対応する顔情報の顔を選択された顔の情報としてカメラ100へ送信する。すなわち、絶対位置指定が有効となっている場合、リモコン制御部250はタッチダウン位置に対応する処理(タッチダウン位置に対応する顔を選択する処理)を実行する。 More specifically, when the absolute position designation is valid (the absolute position operation mode is set) and the touchdown (start of touch) is detected, the remote control unit 250 detects the position where the touchdown is detected. It is determined whether (touch down position) is a position corresponding to any face information (whether it is a touch down with respect to any face frame). When the touchdown position is a position corresponding to any face information, the remote controller control unit 250 transmits the face of the corresponding face information to the camera 100 as the selected face information. That is, when the absolute position designation is valid, the remote control unit 250 executes a process corresponding to the touchdown position (a process of selecting a face corresponding to the touchdown position).
 この場合、リモコン制御部250は、タッチダウンを検出する前にどの顔が選択されていたかは関係なく、タッチダウン位置に対応する顔が選択されたものとする。また、タッチ位置の移動(タッチムーブ)やタッチアップが検出され無くとも、タッチダウン位置が顔枠内であればリモコン制御部250はその顔枠に対応する顔が選択されたものとする。言い換えれば、絶対位置指定が有効な場合は、タッチ位置に一意に対応する機能が実行される。また、タッチ操作の検出前の設定値(タッチ操作の検出前のAF枠または追尾枠の位置)には基づかない処理となる。なお、タッチダウンで顔を選択するのではなく、タッチアップがあった際に、タッチアップがなされる前の最後に検出されていたタッチ位置に対応する顔を選択するようにしてもよい。 In this case, it is assumed that the remote control unit 250 selects the face corresponding to the touch-down position regardless of which face has been selected before detecting the touch-down. Further, it is assumed that even if no movement (touch move) or touch-up of the touch position is detected, if the touch-down position is within the face frame, remote control unit 250 selects the face corresponding to the face frame. In other words, when the absolute position designation is valid, a function uniquely corresponding to the touch position is executed. Further, the processing is not based on the set value before the touch operation is detected (the position of the AF frame or the tracking frame before the touch operation is detected). Instead of selecting a face by touchdown, when a touchup occurs, a face corresponding to the touch position detected last before the touchup may be selected.
 一方、相対位置指定が有効な(相対位置操作モードが設定されている)場合、リモコン制御部250は、タッチダウン(タッチの開始)が検出されだけではAF枠や追尾枠を移動させず、タッチの開始前の位置を維持する。タッチダウンが検出され、引き続いてタッチ位置の移動(タッチムーブ)が検出されると、リモコン制御部250は、枠移動情報(タッチ位置の移動量と移動方向を含む)を生成してカメラ100に送信するという、AF枠または追尾枠を移動するための処理を行う。AF枠または追尾枠は、枠移動情報の生成要因となったタッチが行われる前に設定されていた位置から、枠移動情報に基づいた方向と移動量で移動させられる。 On the other hand, when the relative position designation is valid (the relative position operation mode is set), the remote control unit 250 does not move the AF frame and the tracking frame only by detecting the touchdown (start of touch), and touches it. Maintain the position before the start of. When the touchdown is detected and subsequently the movement of the touch position (touch move) is detected, the remote controller control unit 250 generates frame movement information (including the movement amount and movement direction of the touch position) and sends it to the camera 100. A process for moving the AF frame or the tracking frame is performed. The AF frame or the tracking frame is moved in the direction and the movement amount based on the frame movement information from the position set before the touch that is the generation factor of the frame movement information is performed.
 すなわち、相対位置指定が有効な場合、タッチダウン位置はタッチ位置の移動量と移動方向を算出するために用いられるが、AF枠または追尾枠の移動先には直接は対応しない。タッチムーブの開始位置にかかわらず、タッチ位置の移動量と移動方向が同じであれば、AF枠または追尾枠の移動方向および移動量は等しい。言い換えれば、相対位置指定が有効となっている場合は、タッチ位置の移動量と移動方向に対応する処理が実行されるのであって、タッチ位置に一意に対応する機能は実行されない。また、相対位置指定が有効となっている場合は、タッチ操作の検出前の設定値(タッチ操作の検出前のAF枠または追尾枠の位置)に基づいた処理となる。具体的には、タッチ操作が行われる前の設定値を、タッチ位置の移動量と移動方向に対応する変化量だけ変化させた設定値となる。 That is, when the relative position designation is valid, the touchdown position is used to calculate the moving amount and moving direction of the touch position, but does not directly correspond to the movement destination of the AF frame or the tracking frame. Regardless of the start position of the touch move, if the movement amount and movement direction of the touch position are the same, the movement direction and movement amount of the AF frame or tracking frame are the same. In other words, when the relative position designation is valid, processing corresponding to the movement amount and movement direction of the touch position is executed, and the function uniquely corresponding to the touch position is not executed. When the relative position designation is valid, the process is based on the setting value before the touch operation is detected (the position of the AF frame or the tracking frame before the touch operation is detected). Specifically, the set value before the touch operation is performed is changed by the change amount corresponding to the movement amount and the movement direction of the touch position.
 また、カメラの表示部にライブビュー画像を領域の検出結果や現在のAF枠などの情報とともに表示するため、カメラの表示部を見ながらカメラの外付けリモコンを操作することで、より正確な操作を行うことができる。
 また、絶対位置指定が無効になってから所定の条件が満たされるまでは絶対位置指定を有効にしないようにした。そのため、カメラの領域検出機能による検出が安定しない場合に接触感知デバイスに対する絶対位置指定の有効/無効が短時間に切り替わることを抑制することができる。
In addition, since the Live View image is displayed on the camera display along with information such as the area detection results and the current AF frame, more accurate operation is possible by operating the camera's external remote control while viewing the camera display. It can be performed.
Also, the absolute position designation is not validated until a predetermined condition is satisfied after the absolute position designation becomes invalid. Therefore, when the detection by the area detection function of the camera is not stable, it is possible to prevent the valid / invalid designation of the absolute position designation for the touch sensitive device from being switched in a short time.
(他の実施形態)
 本発明はフォーカス制御以外の機能にも利用することができる。例えば、タッチ操作に応じて決定した領域(AF枠や追尾待機枠)の位置や、選択した領域の情報を露出制御(AE)処理やホワイトバランス制御処理に利用してもよい。
(Other embodiments)
The present invention can also be used for functions other than focus control. For example, the position of the area (AF frame or tracking standby frame) determined according to the touch operation or information on the selected area may be used for the exposure control (AE) process or the white balance control process.
 また、カメラの機能によって検出する領域は顔領域のような被写体領域に限定されない。例えば、輝度が第1の所定値を超える高輝度領域(例えば白飛び領域)や、第2の所定値未満の低輝度領域(例えば黒つぶれ領域)の少なくとも一方を検出してもよい。また、特定の色を有する領域(例えば空領域)を検出してもよい。 Also, the area detected by the camera function is not limited to the subject area such as the face area. For example, at least one of a high luminance region (for example, a whiteout region) whose luminance exceeds a first predetermined value and a low luminance region (for example, a blackout region) whose luminance is lower than a second predetermined value may be detected. Further, an area having a specific color (for example, an empty area) may be detected.
 また、図4AA~図4CBのフロー内の一部をリモコン200内のリモコン制御部250が行ったり、図7のフロー内の一部をカメラ100内のシステム制御部150が行ったりしてもよい。例えば、S7007およびS7011の座標変換処理はカメラ100内で行ってもよい。その場合、リモコン200は、S7008やS7012の代わりに、タッチ操作の種別(タッチダウン操作やドラッグ操作)のイベント情報と、操作位置、操作方向、操作量をカメラ100へ送信する。このようにしても、上述の実施形態と等価な効果を得ることができる。 Also, a part of the flow in FIGS. 4AA to 4CB may be performed by the remote control unit 250 in the remote controller 200, or a part of the flow in FIG. 7 may be performed by the system control unit 150 in the camera 100. . For example, the coordinate conversion processing in S7007 and S7011 may be performed in the camera 100. In that case, the remote controller 200 transmits the event information of the type of touch operation (touch down operation or drag operation), the operation position, the operation direction, and the operation amount to the camera 100 instead of S7008 and S7012. Even if it does in this way, the effect equivalent to the above-mentioned embodiment can be acquired.
 あるいは、カメラ100とリモコン200は一つの筺体内に組み込まれた一体の装置であっても構わない。その場合は、外部I/F193および外部I/F293は装置内部のバス接続に代替される。また、システム制御部150およびリモコン制御部250のように等価なモジュール同士は、一つのモジュールに統合可能である。このような構成であっても、やはり本実施形態と等価な効果を得ることができる。 Alternatively, the camera 100 and the remote controller 200 may be an integrated device incorporated in one housing. In that case, the external I / F 193 and the external I / F 293 are replaced with a bus connection inside the apparatus. Further, equivalent modules such as the system control unit 150 and the remote control unit 250 can be integrated into one module. Even with such a configuration, an effect equivalent to that of the present embodiment can be obtained.
 カメラ100とリモコン200が一つの筺体内に組み込まれた一体の装置であって、さらに別体のリモコン200に相当するリモコンが接続可能であっても構わない。このようにすれば、カメラマンとアシスタントがチームで撮影を行う場合に、カメラ100を扱うカメラマンと、別体のリモコン200を扱うアシスタントの両方が同じ手順で枠操作を行うことができる。 The camera 100 and the remote controller 200 may be integrated into one housing, and a remote controller corresponding to a separate remote controller 200 may be connected. In this way, when the cameraman and the assistant take a picture as a team, both the cameraman who handles the camera 100 and the assistant who handles the separate remote controller 200 can perform the frame operation in the same procedure.
 なお、システム制御部150やリモコン制御部250はそれぞれ、1つのハードウェアで実現されてもよいし、複数のハードウェアが処理を分担することで実現されてもよい。システム制御部150やリモコン制御部250を実現するハードウェアは、例えばCPUなどのプログラマブルプロセッサやASICのような専用ハードウェアであってよい。 Note that each of the system control unit 150 and the remote control unit 250 may be realized by one piece of hardware, or may be realized by sharing a plurality of pieces of hardware. The hardware that implements the system control unit 150 and the remote control unit 250 may be a programmable processor such as a CPU or dedicated hardware such as an ASIC.
 さらに、上述の実施形態では、カメラ100に接続されたリモコン200が有するタッチパネル228aの操作について主に述べたが、リモコン200の使用は本発明に必須ではなく、カメラ100単体でも本発明は実施可能である。
 カメラ100単体で実施する場合、例えば、
・ファインダー(光学ファインダーまたは電子ビューファインダー)をユーザーが使用していることをファインダー付近に設けた近接センサーで検出した場合
・ファインダー内の表示部に表示が行われている場合
・操作部170の操作によってユーザーから指示がなされた場合
に、被写体領域の検出状況に応じた操作モードの設定や、操作モードに応じた動作などを含む、上述したリモコン制御部250の動作をシステム制御部150が実行する。
Furthermore, in the above-described embodiment, the operation of the touch panel 228a included in the remote controller 200 connected to the camera 100 has been mainly described. However, the use of the remote controller 200 is not essential to the present invention, and the present invention can be implemented with the camera 100 alone. It is.
For example, when the camera 100 is used alone, for example,
-When the proximity sensor provided near the viewfinder detects that the user is using the viewfinder (optical viewfinder or electronic viewfinder)-When the display is displayed on the display in the viewfinder-Operation of the operation unit 170 The system control unit 150 executes the above-described operation of the remote control unit 250 including setting of the operation mode according to the detection state of the subject area, the operation according to the operation mode, and the like. .
 具体的には、システム制御部150は、リモコン制御部250が表示部228とタッチパネル228aに対して実施するものと同様の制御動作を、表示部128とタッチパネル128aに対して実施すればよい。またこの場合、上述した各実施形態でシステム制御部150が表示部128に表示するものとして説明した制御と同様に、ファインダーが電子ビューファインダーである場合にはファインダー内の表示部にライブビューを表示するとともに顔枠、AF枠、追尾枠などのマーク表示を行えばよい。また、ファインダーが光学ファインダーである場合には、ライブビュー表示は省き、ファインダー内の表示部において、光学像に重畳して視認されるように顔枠、AF枠、追尾枠などのマーク表示を行えばよい。表示部128へのライブビュー表示を行わない場合もタッチパネル128aに対する操作は有効とすることで、タッチパネル128aを上述したリモコン200のタッチパネル228aと同様に利用することができる。 Specifically, the system control unit 150 may perform the same control operation for the display unit 128 and the touch panel 128a as those performed by the remote control unit 250 for the display unit 228 and the touch panel 228a. Further, in this case, in the same manner as the control described as being displayed on the display unit 128 by the system control unit 150 in each embodiment described above, when the finder is an electronic view finder, a live view is displayed on the display unit in the finder. In addition, mark display such as a face frame, an AF frame, and a tracking frame may be performed. When the viewfinder is an optical viewfinder, live view display is omitted, and the face frame, AF frame, tracking frame, etc. are displayed on the display unit in the viewfinder so that they can be seen superimposed on the optical image. Just do it. Even when live view display on the display unit 128 is not performed, the touch panel 128a can be used in the same manner as the touch panel 228a of the remote controller 200 described above by enabling the operation on the touch panel 128a.
 また、本発明をその好適な実施形態に基づいて詳述してきたが、本発明は説明した特定の実施形態に限られるものではなく、この発明の要旨を逸脱しない範囲の様々な形態も本発明に含まれる。さらに、上述した実施形態は本発明の一実施形態を示すものにすぎない。さらに実施形態で説明した様々な態様を適宜組み合わせることも可能である。 Further, although the present invention has been described in detail based on the preferred embodiments thereof, the present invention is not limited to the specific embodiments described, and various forms within the scope of the present invention are not limited thereto. include. Further, the above-described embodiment is merely an embodiment of the present invention. Furthermore, various aspects described in the embodiments can be appropriately combined.
 また、上述した実施形態においては、電子機器の一例としてのデジタルスチルカメラまたはカメラを用いた撮影システムに本発明を適用した実施形態について説明する。しかし、本発明は撮像機能ならびにタッチセンサ(タッチパネル、タッチパッド)を備える任意の電子機器に適用可能である。なお、タッチセンサは表示装置に組み込まれたり、表示装置上に配置されたりする必要は無い。本発明を適用可能な電子機器には、パーソナルコンピュータ、タブレット端末、携帯電子機器(PDA)、携帯電話機、画像ビューワ、メディアプレーヤ、ゲーム機、電子ブックリーダなどが含まれるが、これらに限定されない。また、本発明はこれらの電子機器に適用可能であるが、タッチセンサの設置面積が小さい、携帯型電子機器において特に有用である。 In the above-described embodiment, an embodiment in which the present invention is applied to a digital still camera as an example of an electronic device or a photographing system using the camera will be described. However, the present invention can be applied to any electronic device having an imaging function and a touch sensor (touch panel, touch pad). Note that the touch sensor need not be incorporated in the display device or disposed on the display device. Electronic devices to which the present invention can be applied include, but are not limited to, personal computers, tablet terminals, portable electronic devices (PDAs), mobile phones, image viewers, media players, game machines, electronic book readers, and the like. In addition, the present invention can be applied to these electronic devices, but is particularly useful in portable electronic devices in which the touch sensor installation area is small.
 本発明は、上述の実施形態の1以上の機能を実現するプログラムを、ネットワーク又は記憶媒体を介してシステム又は装置に供給し、そのシステム又は装置のコンピュータにおける1つ以上のプロセッサーがプログラムを読出し実行する処理でも実現可能である。また、1以上の機能を実現する回路(例えば、ASIC)によっても実現可能である。 The present invention supplies a program that realizes one or more functions of the above-described embodiments to a system or apparatus via a network or a storage medium, and one or more processors in a computer of the system or apparatus read and execute the program This process can be realized. It can also be realized by a circuit (for example, ASIC) that realizes one or more functions.
 本発明は上記実施の形態に制限されるものではなく、本発明の精神及び範囲から離脱することなく、様々な変更及び変形が可能である。従って、本発明の範囲を公にするために、以下の請求項を添付する。 The present invention is not limited to the above embodiment, and various changes and modifications can be made without departing from the spirit and scope of the present invention. Therefore, in order to make the scope of the present invention public, the following claims are attached.
 本願は、2016年3月22日提出の日本国特許出願特願2016-57556および、2017年2月10日提出の日本国特許出願特願2017-23475を基礎として優先権を主張するものであり、その記載内容の全てを、ここに援用する。 This application claims priority on the basis of Japanese Patent Application No. 2016-57556 filed on Mar. 22, 2016 and Japanese Patent Application No. 2017-23475 filed on Feb. 10, 2017. , The entire contents of which are incorporated herein by reference.

Claims (18)

  1.  画像から予め定められた条件に該当する領域が検出されている場合に第1の操作モードを設定し、前記領域が検出されていない場合または前記領域の検出が行われない場合に第2の操作モードを設定する設定手段と、
     タッチ検知面への操作を検出するタッチ検出手段と、
     制御手段であって、
      前記第1の操作モードが設定されている場合には、前記タッチ検出手段でタッチを検出した位置に対応する機能を実行し、
      前記第2の操作モードが設定されている場合には、前記タッチ検出手段でタッチを検出した位置に対応する処理ではなく、タッチ位置の移動操作に応じた処理を実行する、
    ように制御する制御手段と、
    を有することを特徴とする電子機器。
    The first operation mode is set when an area corresponding to a predetermined condition is detected from the image, and the second operation is performed when the area is not detected or when the area is not detected. Setting means for setting the mode;
    Touch detection means for detecting an operation on the touch detection surface;
    Control means,
    When the first operation mode is set, a function corresponding to the position where the touch is detected by the touch detection unit is executed,
    When the second operation mode is set, a process corresponding to a movement operation of the touch position is executed instead of a process corresponding to the position where the touch is detected by the touch detection unit.
    Control means for controlling
    An electronic device comprising:
  2.  前記画像が撮像手段で撮像された画像であり、前記予め定められた条件に該当する領域が、特定の被写体の領域であることを特徴とする請求項1に記載の電子機器。 2. The electronic apparatus according to claim 1, wherein the image is an image captured by an imaging unit, and the region corresponding to the predetermined condition is a region of a specific subject.
  3.  前記特定の被写体は人物の顔であることを特徴とする請求項2に記載の電子機器。 3. The electronic device according to claim 2, wherein the specific subject is a human face.
  4.  タッチを検出した位置に対応する前記機能は、検出された前記領域のうちタッチを検出した位置に対応する領域を選択する機能であることを特徴とする請求項1乃至3のいずれか1項に記載の電子機器。 4. The function according to claim 1, wherein the function corresponding to the position where the touch is detected is a function of selecting an area corresponding to the position where the touch is detected among the detected areas. 5. The electronic device described.
  5.  前記選択された前記領域は主被写体として設定され、撮像手段での撮影の際にAFで焦点を合わせる対象となることを特徴とする請求項4に記載の電子機器。 5. The electronic apparatus according to claim 4, wherein the selected area is set as a main subject, and is a target to be focused by AF when photographing with an imaging unit.
  6.  前記制御手段は、前記第1の操作モードが設定されている場合には、前記タッチ検出手段がタッチを検出したことに応じて、該タッチを検出した位置に対応する機能を実行するように制御することを特徴とする請求項1乃至5のいずれか1項に記載の電子機器。 When the first operation mode is set, the control means controls to execute a function corresponding to the position where the touch is detected in response to the touch detection means detecting a touch. The electronic device according to claim 1, wherein the electronic device is an electronic device.
  7.  前記制御手段は、前記第2の操作モードが設定されている場合には、前記タッチ検出手段がタッチを検出しても前記画像の一部を指定するマークを当該タッチの検出前から移動せず、前記タッチの位置の移動が検出されると、前記タッチの位置の移動に応じた前記処理として、前記マークを前記タッチの位置の移動量と移動方向に従って移動するように制御することを特徴とする請求項1乃至6のいずれか1項に記載の電子機器。 When the second operation mode is set, the control means does not move a mark designating a part of the image from before the touch is detected even if the touch detection means detects a touch. When the movement of the touch position is detected, the processing is performed according to the movement amount and the moving direction of the touch position as the processing according to the movement of the touch position. The electronic device according to any one of claims 1 to 6.
  8.  前記マークは、撮像手段での撮影の際にAFで焦点を合わせる対象となる位置を示すマークであることを特徴とする請求項7に記載の電子機器。 8. The electronic apparatus according to claim 7, wherein the mark is a mark indicating a position to be focused by AF at the time of photographing with an imaging means.
  9.  前記タッチ検知面は、前記画像の表示面とは異なることを特徴とする請求項1乃至8のいずれか1項に記載の電子機器。 9. The electronic device according to claim 1, wherein the touch detection surface is different from a display surface of the image.
  10.  撮像手段と、
     ファインダーと、
     前記ファインダーを介して視認可能な第1の表示手段と、
     前記ファインダーを介さずに視認可能な第2の表示手段と、をさらに有し、
     前記画像は前記撮像手段で撮像された画像であり、
     前記タッチ検知面は前記第2の表示手段の表示面であり、
     前記制御手段は、前記画像が前記第1の表示手段に表示されている場合、前記領域の検出状況に応じて、タッチを検出した位置に対応する前記機能、または、タッチ位置の移動操作に応じた前記処理を行うように制御することを特徴とする請求項1乃至9のいずれか1項に記載の電子機器。
    Imaging means;
    With the viewfinder,
    First display means visible through the viewfinder;
    A second display means that is visible without going through the viewfinder,
    The image is an image captured by the imaging means;
    The touch detection surface is a display surface of the second display means;
    When the image is displayed on the first display unit, the control unit responds to the function corresponding to the position where the touch is detected or the movement operation of the touch position according to the detection state of the region. The electronic apparatus according to claim 1, wherein the electronic apparatus is controlled to perform the process.
  11.  前記領域の検出が、前記電子機器に通信可能に接続された撮像装置で行われることを特徴とする請求項1乃至10のいずれか1項に記載の電子機器。 11. The electronic device according to claim 1, wherein the detection of the area is performed by an imaging device connected to the electronic device so as to be able to communicate.
  12.  前記設定手段は、前記領域が検出されない状態から検出された状態になった場合、予め定められた条件を満たすまでは、前記第2の操作モードから前記第1の操作モードに変更しないことを特徴とする請求項1乃至11のいずれか1項に記載の電子機器。 The setting means does not change from the second operation mode to the first operation mode until a predetermined condition is satisfied when the region is detected from a state where the region is not detected. The electronic device according to any one of claims 1 to 11.
  13.  前記予め定められた条件が、前記処理の実行からの一定時間の経過であることを特徴とする請求項12に記載の電子機器。 13. The electronic apparatus according to claim 12, wherein the predetermined condition is that a predetermined time has elapsed since the execution of the process.
  14.  画像から予め定められた条件に該当する領域を検出する検出手段と、
     前記検出手段によって前記領域が検出されたか否かに応じて接触感知デバイスの動作を制御する制御手段と、を有し、
     前記制御手段は、前記検出手段によって前記領域が検出された場合には前記接触感知デバイスに対する絶対位置操作を有効とし、前記検出手段によって前記領域が検出されない場合には前記接触感知デバイスに対する絶対位置操作を無効、相対位置操作を有効とする、ことを特徴とする電子機器。
    Detecting means for detecting a region corresponding to a predetermined condition from the image;
    Control means for controlling the operation of the touch sensing device according to whether or not the region is detected by the detection means,
    The control means validates the absolute position operation on the touch sensing device when the area is detected by the detection means, and operates the absolute position operation on the touch sensing device when the area is not detected by the detection means. An electronic device characterized by invalidating and enabling relative position operation.
  15.  画像から予め定められた条件に該当する領域が検出されている場合に第1の操作モードを設定し、前記領域が検出されていない場合または前記領域の検出が行われない場合に第2の操作モードを設定する設定工程と、
     タッチ検知面への操作を検出するタッチ検出工程と、
     制御工程であって、
      前記第1の操作モードが設定されている場合には、前記タッチ検出工程でタッチを検出した位置に対応する機能を実行し、
      前記第2の操作モードが設定されている場合には、前記タッチ検出工程でタッチを検出した位置に対応する処理ではなく、タッチ位置の移動操作に応じた処理を実行する、
    ように制御する制御工程と、
    を有することを特徴とする電子機器の制御方法。
    The first operation mode is set when an area corresponding to a predetermined condition is detected from the image, and the second operation is performed when the area is not detected or when the area is not detected. A setting process for setting the mode;
    A touch detection process for detecting an operation on the touch detection surface;
    A control process,
    When the first operation mode is set, the function corresponding to the position where the touch is detected in the touch detection step is executed,
    When the second operation mode is set, a process corresponding to a movement operation of the touch position is executed instead of a process corresponding to the position where the touch is detected in the touch detection step.
    A control process to control,
    A method for controlling an electronic device, comprising:
  16.  画像から予め定められた条件に該当する領域を検出する検出工程と、
     前記検出工程において前記領域が検出された場合に、接触感知デバイスに対する絶対位置操作を有効とし、前記検出工程において前記領域が検出されない場合には前記接触感知デバイスに対する絶対位置操作を無効、相対位置操作を有効とする制御工程と、
    を有することを特徴とする電子機器の制御方法。
    A detection step of detecting a region corresponding to a predetermined condition from the image;
    When the area is detected in the detection step, the absolute position operation for the touch sensing device is enabled, and when the area is not detected in the detection step, the absolute position operation for the touch sensing device is disabled, and the relative position operation is performed. A control process for enabling
    A method for controlling an electronic device, comprising:
  17.  コンピュータを、請求項1から請求項14のいずれか1項に記載の電子機器の各手段として機能させるためのプログラム。 The program for functioning a computer as each means of the electronic device of any one of Claims 1-14.
  18.  コンピュータを、請求項1から請求項14のいずれか1項に記載の電子機器の各手段として機能させるためのプログラムを格納したコンピュータが読み取り可能な記憶媒体。 A computer-readable storage medium storing a program for causing a computer to function as each unit of the electronic device according to any one of claims 1 to 14.
PCT/JP2017/011080 2016-03-22 2017-03-21 Electronic device and method for controlling same WO2017164140A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780019550.XA CN109076156B (en) 2016-03-22 2017-03-21 Electronic device and control method thereof
US16/131,135 US10649313B2 (en) 2016-03-22 2018-09-14 Electronic apparatus and method for controlling same

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016-057556 2016-03-22
JP2016057556 2016-03-22
JP2017023475A JP6788518B2 (en) 2016-03-22 2017-02-10 Electronic devices and their control methods, as well as imaging devices
JP2017-023475 2017-02-10

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/131,135 Continuation US10649313B2 (en) 2016-03-22 2018-09-14 Electronic apparatus and method for controlling same

Publications (1)

Publication Number Publication Date
WO2017164140A1 true WO2017164140A1 (en) 2017-09-28

Family

ID=59899416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/011080 WO2017164140A1 (en) 2016-03-22 2017-03-21 Electronic device and method for controlling same

Country Status (1)

Country Link
WO (1) WO2017164140A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007019685A (en) * 2005-07-06 2007-01-25 Sony Corp Imaging apparatus, control method, and program
JP2011164949A (en) * 2010-02-10 2011-08-25 Canon Inc Information processor, retrieval condition setting method, program, and recording medium
JP2012098594A (en) * 2010-11-04 2012-05-24 Ricoh Co Ltd Imaging apparatus and imaging method
JP2013150265A (en) * 2012-01-23 2013-08-01 Olympus Imaging Corp Imaging apparatus, display method, and program
JP2013162425A (en) * 2012-02-07 2013-08-19 Olympus Imaging Corp Imaging apparatus, imaging method, and program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007019685A (en) * 2005-07-06 2007-01-25 Sony Corp Imaging apparatus, control method, and program
JP2011164949A (en) * 2010-02-10 2011-08-25 Canon Inc Information processor, retrieval condition setting method, program, and recording medium
JP2012098594A (en) * 2010-11-04 2012-05-24 Ricoh Co Ltd Imaging apparatus and imaging method
JP2013150265A (en) * 2012-01-23 2013-08-01 Olympus Imaging Corp Imaging apparatus, display method, and program
JP2013162425A (en) * 2012-02-07 2013-08-19 Olympus Imaging Corp Imaging apparatus, imaging method, and program

Similar Documents

Publication Publication Date Title
JP6788518B2 (en) Electronic devices and their control methods, as well as imaging devices
JP6748582B2 (en) Imaging device, control method thereof, program, and recording medium
US9036073B2 (en) Imaging apparatus and for controlling an automatic focus (AF) area and an enlargement area in a live view
JP6757268B2 (en) Imaging device and its control method
CN109218608B (en) Electronic device, control method thereof, and storage medium
JP2019074928A (en) Electronic apparatus and control method thereof
KR20170063388A (en) Electronic device and method for controlling the same
JP2018013745A (en) Electronic equipment and control method therefor
JP6701033B2 (en) Electronic device and control method thereof
JP2018107709A (en) Imaging control device and control method thereof
JP7467071B2 (en) Electronic device, electronic device control method, program, and storage medium
JP6833535B2 (en) Imaging device, control method and program of imaging device
JP2019075699A (en) Electronic device and control method of the same
US11381736B2 (en) Image capture apparatus and control method
US11175737B2 (en) Electronic device for receiving line of sight input, method of controlling electronic device, and non-transitory computer readable medium
JP2019016299A (en) Electronic apparatus having operation member arranged on different surfaces, control method thereof, program and storage medium
JP2018107712A (en) Imaging control device and control method thereof
US10924680B2 (en) Image capture control apparatus and method of controlling the same
WO2017164140A1 (en) Electronic device and method for controlling same
JP6393296B2 (en) IMAGING DEVICE AND ITS CONTROL METHOD, IMAGING CONTROL DEVICE, PROGRAM, AND STORAGE MEDIUM
JP2020197976A (en) Electronic apparatus, control method for electronic apparatus, program, and recording medium
JP2020166199A (en) Electronic apparatus and method of controlling the same
JP7446845B2 (en) Imaging control device, control method for the imaging control device, program, and storage medium
JP7034601B2 (en) Imaging control device and its control method
JP6855317B2 (en) Imaging device, control method of imaging device, program, and recording medium

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17770182

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17770182

Country of ref document: EP

Kind code of ref document: A1