WO2020255675A1 - Electronic device and control method thereof - Google Patents
- Publication number
- WO2020255675A1 (PCT/JP2020/021324)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- line
- sight
- electronic device
- selected position
- movement
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/02—Viewfinders
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/02—Bodies
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/18—Signals indicating condition of a camera member or suitability of light
- G03B17/20—Signals indicating condition of a camera member or suitability of light visible in viewfinder
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/675—Focus control based on electronic image sensor signals comprising setting of focusing regions
Definitions
- the present invention relates to an electronic device that can be operated by a line of sight, and a method of controlling the electronic device.
- Patent Document 1 discloses that the line-of-sight position of a user is detected while looking through a finder, and an AF frame is displayed at the line-of-sight position.
- The displayed AF frame can be moved by operating an operation member on the camera body that can be operated in eight directions.
- In Patent Document 1, when the AF frame displayed at the line-of-sight position is moved by operating an operation member such as an arrow key, many operations are required when there are many AF points, so it takes time to move the frame to the position desired by the user.
- an object of the present invention is to enable the selected position to be moved to a position desired by the user more quickly and more accurately.
- The electronic device of the present invention comprises: an imaging means that captures a subject; an eyepiece for visually recognizing an image of the subject; a display means that can be viewed through the eyepiece; a reception means that accepts gaze input based on the line of sight of a user looking into the eyepiece; and a control means that, when a movement operation of touching and moving on the operation surface of an operation means is performed in a state where a selected position is specified based on the gaze input, controls the selected position displayed on the display means so as to move from the position based on the gaze input to a position corresponding to the direction and amount of the movement operation.
- the selected position can be moved to the position desired by the user more quickly and more accurately.
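The claimed control can be illustrated with a small sketch (the names, units, and the sensitivity factor are hypothetical, not taken from the patent): the selected position starts at the gaze-based position and is then offset by the direction and amount of the touch move.

```python
from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

def move_selected_position(gaze: Point, touch_delta: Point,
                           sensitivity: float = 1.0) -> Point:
    """Start from the gaze-based selected position and offset it by the
    touch-move direction and amount (scaled by a sensitivity factor)."""
    return Point(gaze.x + touch_delta.x * sensitivity,
                 gaze.y + touch_delta.y * sensitivity)
```

Combining a coarse gaze jump with a fine touch offset is what allows the selection to be both fast and accurate.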
- FIGS. 1A and 1B show external views of the digital camera 100 as an example of a device to which the present invention can be applied.
- FIG. 1A is a front perspective view of the digital camera 100
- FIG. 1B is a rear perspective view of the digital camera 100.
- the display unit 28 is a display unit provided on the back surface of the camera for displaying images and various information.
- the touch panel 70a can detect a touch operation on the display surface (operation surface) of the display unit 28.
- the display unit 43 outside the viewfinder is a display unit provided on the upper surface of the camera, and various setting values of the camera such as a shutter speed and an aperture are displayed.
- the shutter button 61 is an operation unit for giving a shooting instruction.
- the mode changeover switch 60 is an operation unit for switching various modes.
- The terminal cover 40 is a cover that protects a connector (not shown) for a connection cable connecting the digital camera 100 to an external device.
- the main electronic dial 71 is a rotation operation member included in the operation unit 70, and by turning the main electronic dial 71, it is possible to change set values such as a shutter speed and an aperture.
- the power switch 72 is an operating member that switches the power of the digital camera 100 on and off.
- the sub electronic dial 73 is a rotation operation member included in the operation unit 70, and can move the selection frame, feed an image, and the like.
- The cross key 74 is a cross-shaped key included in the operation unit 70 with push buttons that can be pressed in four directions; an operation corresponding to the pressed portion of the cross key 74 can be performed.
- the SET button 75 is included in the operation unit 70, is a push button, and is mainly used for determining a selection item or the like.
- the moving image button 76 is used to instruct the start and stop of moving image shooting (recording).
- the AE lock button 77 is included in the operation unit 70, and the exposure state can be fixed by pressing the AE lock button 77 in the shooting standby state.
- the enlargement button 78 is included in the operation unit 70, and is an operation button for turning on / off the enlargement mode in the live view display of the shooting mode. By operating the main electronic dial 71 after turning on the enlargement mode, the live view image can be enlarged or reduced.
- the play button 79 is included in the operation unit 70 and is an operation button for switching between the shooting mode and the play mode. By pressing the playback button 79 during the shooting mode, the playback mode can be entered, and the latest image among the images recorded on the recording medium 200 can be displayed on the display unit 28.
- the menu button 81 is included in the operation unit 70, and when pressed, various settable menu screens are displayed on the display unit 28. The user can intuitively make various settings by using the menu screen displayed on the display unit 28 and the cross key 74 and the SET button 75.
- the communication terminal 10 is a communication terminal for the digital camera 100 to communicate with the lens unit 150 (detachable) described later.
- The eyepiece 16 is the eyepiece of an eyepiece finder (look-in type finder); through it, the user can visually recognize an electronic image displayed on the internal EVF (Electronic Viewfinder) 29.
- the eyepiece detection unit 57 is an eyepiece detection sensor that detects whether or not the photographer is in contact with the eyepiece unit 16.
- the lid 202 is a lid of a slot that stores the recording medium 200.
- the grip portion 90 is a holding portion having a shape that makes it easy for the user to hold the digital camera 100 with his / her right hand.
- The shutter button 61 and the main electronic dial 71 are arranged at positions that can be operated with the index finger of the right hand while the digital camera is held by gripping the grip portion 90 with the little, ring, and middle fingers of the right hand. In the same state, the sub electronic dial 73 is arranged at a position that can be operated with the right thumb.
- FIG. 2 is a block diagram showing a configuration example of the digital camera 100 according to the present embodiment.
- the lens unit 150 is a lens unit equipped with an interchangeable photographing lens.
- the lens 103 is usually composed of a plurality of lenses, but here, it is simply shown by only one lens.
- the communication terminal 6 is a communication terminal for the lens unit 150 to communicate with the digital camera 100.
- The lens unit 150 communicates with the system control unit 50 via the communication terminal 6 and the above-mentioned communication terminal 10; the internal lens system control circuit 4 controls the aperture 1 via the aperture drive circuit 2 and achieves focus by shifting the lens 103 via the AF drive circuit 3.
- the shutter 101 is a focal plane shutter that can freely control the exposure time of the imaging unit 22 under the control of the system control unit 50.
- the image pickup unit 22 is an image pickup device composed of a CCD, a CMOS element, or the like that converts an optical image into an electric signal.
- the A / D converter 23 is used to convert an analog signal output from the imaging unit 22 into a digital signal.
- the image processing unit 24 performs resizing processing such as predetermined pixel interpolation and reduction and color conversion processing on the data from the A / D converter 23 or the data from the memory control unit 15 described later. In addition, the image processing unit 24 performs predetermined arithmetic processing using the captured image data.
- the system control unit 50 performs exposure control and distance measurement control based on the calculation result obtained by the image processing unit 24. As a result, TTL (through-the-lens) AF (autofocus) processing, AE (autoexposure) processing, and EF (flash pre-flash) processing are performed. Further, the image processing unit 24 performs a predetermined calculation process using the captured image data, and performs a TTL method AWB (auto white balance) process based on the obtained calculation result.
- the memory control unit 15 controls data transmission / reception between the A / D converter 23, the image processing unit 24, and the memory 32.
- the output data from the A / D converter 23 is written directly to the memory 32 via the image processing unit 24 and the memory control unit 15 or via the memory control unit 15.
- the memory 32 stores the image data obtained by the image pickup unit 22 and converted into digital data by the A / D converter 23, and the image data to be displayed on the display unit 28 and the EVF 29.
- the memory 32 has a storage capacity sufficient to store a predetermined number of still images, moving images for a predetermined time, and audio.
- the memory 32 also serves as a memory (video memory) for displaying an image.
- the image data for display written in the memory 32 is displayed by the display unit 28 and the EVF 29 via the memory control unit 15.
- the display unit 28 and the EVF 29 display on a display such as an LCD or an organic EL according to a signal from the memory control unit 15.
- Live view display can be performed by sequentially transferring and displaying the data that has been A / D converted by the A / D converter 23 and stored in the memory 32 to the display unit 28 or the EVF 29.
- the image displayed in the live view is referred to as a live view image (LV image).
- the infrared light emitting diode 166 is a light emitting element for detecting the position of the user's line of sight in the finder screen, and irradiates the user's eyeball (eye) 161 with the eyepiece 16 with infrared light.
- the infrared light emitted from the infrared light emitting diode 166 is reflected by the eyeball (eye) 161 and the infrared reflected light reaches the dichroic mirror 162.
- the dichroic mirror 162 reflects only infrared light and transmits visible light.
- the infrared reflected light whose optical path has been changed is imaged on the imaging surface of the line-of-sight detection sensor 164 via the imaging lens 163.
- the imaging lens 163 is an optical member that constitutes the line-of-sight detection optical system.
- the line-of-sight detection sensor 164 includes an imaging device such as a CCD type image sensor.
- the line-of-sight detection sensor 164 photoelectrically converts the incident infrared reflected light into an electric signal and outputs it to the line-of-sight detection circuit 165.
- the line-of-sight detection circuit 165 includes at least one processor, detects the user's line-of-sight position from the image or movement of the user's eyeball (eye) 161 based on the output signal of the line-of-sight detection sensor 164, and transmits the detection information to the system control unit 50. Output.
- The dichroic mirror 162, the imaging lens 163, the line-of-sight detection sensor 164, the infrared light emitting diode 166, and the line-of-sight detection circuit 165 constitute the line-of-sight detection block 160.
- the line-of-sight detection block 160 is used to detect the line of sight by a method called the corneal reflex method.
- The corneal reflex method is a method of detecting the direction and position of the line of sight based on the positional relationship between the pupil of the eyeball (eye) 161 and the reflection, on the eyeball (eye) 161 (in particular the cornea), of the light emitted from the infrared light emitting diode 166.
- There are also various other methods for detecting the direction and position of the line of sight, such as the so-called scleral reflection method, which utilizes the difference in light reflectance between the iris (dark part) and the sclera (white part) of the eye.
- a method of line-of-sight detection means other than the above may be used as long as the method can detect the direction and position of the line of sight.
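As an illustration of the corneal reflex principle described above, the following sketch (hypothetical and simplified to 2D image coordinates) estimates a gaze offset from the vector between the pupil centre and the corneal glint of the infrared LED:

```python
def estimate_gaze_offset(pupil_center, glint, gain=(1.0, 1.0)):
    """Estimate a gaze offset (in arbitrary screen units) from the
    glint-to-pupil-centre vector in the eye image; `gain` stands in
    for a per-axis calibration factor obtained beforehand."""
    dx = (pupil_center[0] - glint[0]) * gain[0]
    dy = (pupil_center[1] - glint[1]) * gain[1]
    return (dx, dy)
```

A real implementation would add per-user calibration and head-movement compensation; the core signal is this pupil-glint vector.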
- The liquid crystal display unit 43 outside the viewfinder displays various camera settings, such as shutter speed and aperture, via the drive circuit 44 for the display unit outside the viewfinder.
- the non-volatile memory 56 is a memory that can be electrically erased and recorded, and for example, a Flash-ROM or the like is used.
- the non-volatile memory 56 stores constants, programs, and the like for the operation of the system control unit 50.
- the program referred to here is a program for executing various flowcharts described later in the present embodiment.
- the system control unit 50 is a control unit including at least one processor or circuit, and controls the entire digital camera 100. By executing the program recorded in the non-volatile memory 56 described above, each process of the present embodiment described later is realized.
- A RAM is used as the system memory 52, into which constants and variables for the operation of the system control unit 50, programs read from the non-volatile memory 56, and the like are loaded.
- the system control unit 50 also controls the display by controlling the memory 32, the display unit 28, and the like.
- the system timer 53 is a time measuring unit that measures the time used for various controls and the time of the built-in clock.
- the mode changeover switch 60, the first shutter switch 62, the second shutter switch 64, and the operation unit 70 are operation means for inputting various operation instructions to the system control unit 50.
- the mode changeover switch 60 switches the operation mode of the system control unit 50 to any one of a still image shooting mode, a moving image shooting mode, and the like.
- the modes included in the still image shooting mode include an auto shooting mode, an auto scene discrimination mode, a manual mode, an aperture priority mode (Av mode), a shutter speed priority mode (Tv mode), and a program AE mode (P mode).
- With the mode changeover switch 60, the user can switch directly to any of these modes. Alternatively, after once switching to a shooting mode list screen with the mode changeover switch 60, one of the plurality of displayed modes may be selected and switched to using another operation member.
- The moving image shooting mode may likewise include a plurality of modes.
- the first shutter switch 62 is turned on by a so-called half-press (shooting preparation instruction) during the operation of the shutter button 61 provided on the digital camera 100, and the first shutter switch signal SW1 is generated.
- the first shutter switch signal SW1 starts shooting preparation operations such as AF (autofocus) processing, AE (autoexposure) processing, AWB (auto white balance) processing, and EF (flash pre-flash) processing.
- the second shutter switch 64 is turned on when the operation of the shutter button 61 is completed, so-called full pressing (shooting instruction), and the second shutter switch signal SW2 is generated.
- the system control unit 50 starts a series of shooting processes from reading the signal from the image pickup unit 22 to writing the captured image as an image file on the recording medium 200 by the second shutter switch signal SW2.
- the operation unit 70 is various operation members as an input unit that receives operations from the user.
- the operation unit 70 includes at least the following operation units. Shutter button 61, touch panel 70a, main electronic dial 71, power switch 72, sub electronic dial 73, cross key 74, SET button 75, video button 76, AE lock button 77, enlargement button 78, play button 79, menu button 81.
- the power supply control unit 80 is composed of a battery detection circuit, a DC-DC converter, a switch circuit for switching a block to be energized, and the like, and detects whether or not a battery is installed, the type of battery, and the remaining battery level. Further, the power supply control unit 80 controls the DC-DC converter based on the detection result and the instruction of the system control unit 50, and supplies a necessary voltage to each unit including the recording medium 200 for a necessary period.
- the power supply unit 30 includes a primary battery such as an alkaline battery or a lithium battery, a secondary battery such as a NiCd battery, a NiMH battery, or a Li battery, an AC adapter, or the like.
- the recording medium I / F18 is an interface with a recording medium 200 such as a memory card or a hard disk.
- the recording medium 200 is a recording medium such as a memory card for recording a captured image, and is composed of a semiconductor memory, a magnetic disk, or the like.
- The communication unit 54 transmits and receives video signals and audio signals to and from an external device connected wirelessly or by a wired cable.
- the communication unit 54 can also be connected to a wireless LAN (Local Area Network) and the Internet.
- the communication unit 54 can also communicate with an external device using Bluetooth (registered trademark) or Bluetooth Low Energy.
- The communication unit 54 can transmit an image captured by the image pickup unit 22 (including a live view image) and an image recorded on the recording medium 200, and can receive images and various other information from an external device.
- The posture detection unit 55 detects the posture of the digital camera 100 with respect to the direction of gravity. Based on the posture detected by the posture detection unit 55, it can be determined whether an image taken by the imaging unit 22 was taken with the digital camera 100 held horizontally or vertically.
- The system control unit 50 can add orientation information corresponding to the posture detected by the posture detection unit 55 to the image file of the image captured by the image pickup unit 22, or can rotate the image before recording it.
- As the posture detection unit 55, an acceleration sensor, a gyro sensor, or the like can be used. Using the acceleration sensor or gyro sensor serving as the posture detection unit 55, it is also possible to detect movement of the digital camera 100 (pan, tilt, lift, whether or not it is stationary, etc.).
- The eyepiece detection unit 57 is an eyepiece detection sensor that detects the approach (eye-on) and separation (eye-off) of the eye (object) 161 with respect to the eyepiece 16 of the finder (approach detection).
- The system control unit 50 switches the display unit 28 and the EVF 29 between display (display state) and non-display (non-display state) according to the state detected by the eyepiece detection unit 57. More specifically, at least while the digital camera 100 is in the shooting standby state and the display-destination switching setting for the image captured through the imaging unit 22 is set to automatic switching, the display destination during non-eye-contact is the display unit 28, whose display is turned on, while the EVF 29 is hidden.
- the eyepiece detection unit 57 can use, for example, an infrared proximity sensor, and can detect the approach of some object to the eyepiece unit 16 of the finder containing the EVF29.
- Infrared rays projected from the light projecting unit (not shown) of the eyepiece detection unit 57 are reflected by an approaching object and received by the light receiving unit (not shown) of the infrared proximity sensor. How close the object is to the eyepiece 16 (eyepiece distance) can be determined from the amount of infrared rays received.
- the eyepiece detection unit 57 performs eyepiece detection that detects the close distance of the object to the eyepiece unit 16.
- the light emitting unit and the light receiving unit of the eyepiece detection unit 57 are devices that are different from the infrared light emitting diode 166 and the line of sight detection sensor 164 described above.
- the infrared light emitting diode 166 may also serve as the light projecting unit of the eyepiece detection unit 57.
- Further, the light receiving unit may also serve as the line-of-sight detection sensor 164.
- The threshold value for detecting eye approach and the threshold value for detecting eye separation may be different, for example, by providing hysteresis.
- After eye approach is detected, the eye-on state is assumed to continue until eye separation is detected.
- After eye separation is detected, the non-eye-on state is assumed to continue until eye approach is detected.
- The infrared proximity sensor is an example; another sensor may be used for the eyepiece detection unit 57 as long as it can detect the approach of an eye or an object that can be regarded as eye contact.
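The hysteresis mentioned above can be sketched as follows (the threshold values are hypothetical placeholders): the approach threshold is set higher than the separation threshold so that small fluctuations in the received infrared amount do not toggle the eye-on state.

```python
class EyepieceDetector:
    """Hysteresis-based eye-on/eye-off detection from received IR amount."""
    def __init__(self, approach_threshold=60, separation_threshold=40):
        self.approach_threshold = approach_threshold
        self.separation_threshold = separation_threshold
        self.eye_on = False

    def update(self, ir_amount: int) -> bool:
        if not self.eye_on and ir_amount >= self.approach_threshold:
            self.eye_on = True       # eye approach detected
        elif self.eye_on and ir_amount <= self.separation_threshold:
            self.eye_on = False      # eye separation detected
        return self.eye_on
```

Between the two thresholds the previous state is simply kept, which is what prevents display flicker near the detection boundary.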
- The system control unit 50 can detect the following operations or states based on the output from the line-of-sight detection block 160:
- The line of sight of a user with an eye on the eyepiece 16 is newly input (detected), that is, the start of line-of-sight input.
- The eyepiece 16 is in a state where the user's line of sight is being input.
- The user with an eye on the eyepiece 16 is gazing.
- The line of sight input by the user with an eye on the eyepiece 16 is removed, that is, the end of line-of-sight input.
- A state in which the user with an eye on the eyepiece 16 is not inputting any line of sight.
- the gaze described here refers to the case where the user's line-of-sight position does not exceed a predetermined amount of movement within a predetermined time.
- the predetermined time may be a time that can be set by the user, may be a fixed time, or may be changed depending on the distance relationship between the immediately preceding line-of-sight position and the current line-of-sight position.
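The gazing condition described here can be sketched as a fixation check over recent gaze samples (the function name and thresholds are hypothetical stand-ins for the "predetermined" values):

```python
def is_gazing(samples, max_movement=10.0, window=0.5):
    """samples: list of (t, x, y) gaze samples in time order, newest last.
    Returns True if every sample within `window` seconds of the newest
    stays within `max_movement` of the newest position."""
    if not samples:
        return False
    t_now, x_now, y_now = samples[-1]
    recent = [s for s in samples if t_now - s[0] <= window]
    return all(((x - x_now) ** 2 + (y - y_now) ** 2) ** 0.5 <= max_movement
               for _, x, y in recent)
```

As the text notes, `window` could itself be user-settable or vary with how far the gaze has just moved.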
- the touch panel 70a and the display unit 28 can be integrally configured.
- the touch panel 70a is configured so that the light transmittance does not interfere with the display of the display unit 28, and is attached to the upper layer of the display surface of the display unit 28. Then, the input coordinates on the touch panel 70a are associated with the display coordinates on the display screen of the display unit 28.
- The system control unit 50 can detect the following operations or states on the touch panel 70a:
- A finger or pen that was not touching the touch panel 70a newly touches the touch panel 70a, that is, the start of a touch (hereinafter referred to as touch-down (Touch-Down)).
- A state in which the touch panel 70a is being touched with a finger or a pen (hereinafter referred to as touch-on (Touch-On)).
- The finger or pen is moved while touching the touch panel 70a (hereinafter referred to as a touch move (Touch-Move)).
- The finger or pen that was touching the touch panel 70a is released, that is, the end of the touch (hereinafter referred to as touch-up (Touch-Up)).
- A state in which nothing is touching the touch panel 70a (hereinafter referred to as touch-off (Touch-Off)).
- When touch-down is detected, touch-on is also detected at the same time. After touch-down, touch-on usually continues to be detected until touch-up is detected. A touch move is detected while touch-on is detected; even if touch-on is detected, a touch move is not detected unless the touch position moves. After it is detected that all touching fingers and pens have touched up, the state becomes touch-off.
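The touch states above can be summarised as a small state machine (a hypothetical sketch, not the camera's implementation), where each new position sample (`None` meaning no touch) yields the events it produces:

```python
class TouchTracker:
    """Tracks touch-off / touch-on and reports touch-down, touch-move,
    and touch-up events from successive position samples."""
    def __init__(self):
        self.pos = None          # None while touch-off

    def feed(self, pos):
        events = []
        if pos is not None and self.pos is None:
            events.append("touch-down")      # start of touch (touch-on begins)
        elif pos is not None and pos != self.pos:
            events.append("touch-move")      # moved while touch-on
        elif pos is None and self.pos is not None:
            events.append("touch-up")        # end of touch (touch-off begins)
        self.pos = pos
        return events
```

Note that an unchanged position during touch-on produces no event, matching the rule that a touch move requires the touch position to actually move.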
- the touch panel 70a is a device capable of inputting position coordinates.
- the system control unit 50 determines what kind of operation (touch operation) has been performed on the touch panel 70a based on the notified information.
- For a touch move, the moving direction of the finger or pen on the touch panel 70a can also be determined for each vertical and horizontal component based on the change in the position coordinates. When a touch move over a predetermined distance is detected, it is determined that a slide operation has been performed.
- The operation of quickly moving a finger over a certain distance on the touch panel and then releasing it is called a flick.
- In other words, a flick is an operation of quickly tracing the touch panel 70a as if flicking it with a finger. When a touch-move at a predetermined speed or more over a predetermined distance is detected and a touch-up is detected immediately afterward, it can be determined that a flick has been performed (it can be determined that there was a flick following a slide operation). Further, a touch operation in which a plurality of points (for example, two points) are touched at the same time and the touch positions are brought closer together is called a pinch-in, and a touch operation in which the touch positions are moved apart is called a pinch-out. Pinch-out and pinch-in are collectively called a pinch operation (or simply a pinch).
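The distinction among tap, slide, and flick described above can be illustrated with a small classifier evaluated at touch-up. This is a hedged sketch: the function name and all threshold values are assumptions, not values from the original.

```python
def classify_on_touch_up(distance_px, duration_ms,
                         slide_min_px=16, flick_speed_px_per_ms=1.0):
    """Classify a completed touch based on how far and how fast the finger
    moved before touch-up. Thresholds are illustrative assumptions."""
    if distance_px < slide_min_px:
        return "tap"        # touch-down then touch-up with little movement
    speed = distance_px / max(duration_ms, 1)
    if speed >= flick_speed_px_per_ms:
        return "flick"      # fast trace over a distance, then release
    return "slide"          # touch-move over a predetermined distance
```

For example, a short contact classifies as a tap, a long slow drag as a slide, and a quick trace followed immediately by release as a flick, mirroring the determinations in the text.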
- The touch panel 70a may be any of various types of touch panels, such as the resistive film type, capacitance type, surface acoustic wave type, infrared type, electromagnetic induction type, image recognition type, and optical sensor type.
- Depending on the type, there is a method of detecting a touch when there is contact with the touch panel and a method of detecting a touch when a finger or pen approaches the touch panel; either method may be used.
- The user can set the method of specifying the position of the position index in response to a touch-move operation to either absolute position specification or relative position specification.
- For example, assume that the position index is an AF frame.
- In the case of absolute position specification, when the touch panel 70a is touched, the AF position associated with the touched position is set. That is, the position coordinates where the touch operation is performed are associated with the position coordinates of the display unit 28.
- In the case of relative position specification, the position coordinates where the touch operation is performed are not associated with the position coordinates of the display unit 28.
- In relative position specification, the AF position is moved from the currently set AF position in the movement direction of the touch-move by a distance corresponding to the movement amount of the touch-move, regardless of the touch-down position on the touch panel 70a. That is, the AF position does not move at the time of touch-down.
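The difference between absolute and relative position specification can be sketched as follows (illustrative Python; the function names and the tuple coordinate representation are assumptions):

```python
def absolute_af_position(touch_pos):
    """Absolute specification: the AF position becomes the touched
    position itself, so touch and display coordinates correspond 1:1."""
    return touch_pos

def relative_af_position(current_af, prev_touch, new_touch):
    """Relative specification: the AF position moves by the touch-move
    delta from its current position, regardless of where the touch-down
    occurred. The AF position does not jump at touch-down."""
    dx = new_touch[0] - prev_touch[0]
    dy = new_touch[1] - prev_touch[1]
    return (current_af[0] + dx, current_af[1] + dy)
```

With relative specification, touching down at any corner of the panel leaves the AF frame where it is; only the subsequent finger movement displaces it.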
- the cursor display / movement control process by the line-of-sight input operation and the touch input operation in the digital camera 100 will be described.
- FIG. 3 is a flowchart in which the movement of the cursor position on the setting menu screen is controlled by line-of-sight input and touch operation.
- This control process is realized by the system control unit 50 expanding the program stored in the non-volatile memory 56 into the system memory 52 and executing the program.
- the flowchart of FIG. 3 is started when the digital camera 100 is activated in the shooting mode and the user is looking into the finder in the shooting standby state, that is, when the eyepiece portion 16 is in the eyepiece state.
- the user operates the touch panel 70a while looking through the finder to select / change the setting menu item.
- In S302, the system control unit 50 determines whether or not the user has performed an operation to switch the setting related to line-of-sight input in the digital camera 100. If the setting has been switched, the process proceeds to S303; otherwise, the process proceeds to S304. Specifically, as shown in FIG. 4, it is determined whether or not there has been an operation to display the menu screen of the digital camera 100 and a switching operation on the setting item 403 related to line-of-sight input.
- FIG. 4 is a setting menu screen related to shooting displayed on the EVF 29 or the display unit 28. Setting items 401 to 404 are displayed on the setting menu screen. Of these, the setting item 403 is a setting item related to line-of-sight input.
- the line-of-sight input refers to a function in which the user moves the cursor or the AF frame by the line of sight.
- the line-of-sight input can be set to either valid or invalid.
- When line-of-sight input is enabled, the cursor and the AF frame can be moved by the line of sight. When line-of-sight input is disabled, neither the cursor nor the AF frame can be moved by the line of sight.
- the line-of-sight input setting can be enabled by selecting the setting item 403a, and the line-of-sight input setting can be disabled by selecting the setting item 403b.
- FIG. 4 shows a state in which the line-of-sight input setting is enabled. It further shows that the setting item 403 is selected by a cursor of the form represented by the cursor 500 shown in FIG. 5.
- In S303, the system control unit 50 saves the setting contents changed in S302 in the non-volatile memory 56.
- In S304, the system control unit 50 refers to the non-volatile memory 56 and determines whether or not the line-of-sight input setting saved in S303 is enabled. If the line-of-sight input setting is enabled, the process proceeds to S305; if it is disabled, the process proceeds to S324.
- In S305, the system control unit 50 determines whether or not there is a line-of-sight input. When there is a line-of-sight input, that is, when the user's line of sight is detected by the line-of-sight detection block 160, the process proceeds to S306. When there is no line-of-sight input, that is, when the user's line of sight is not detected by the line-of-sight detection block 160, the process proceeds to S308. When there is a line-of-sight input, the system control unit 50 starts measuring time from the moment the line-of-sight input starts.
- As for the line-of-sight position detected by the line-of-sight detection block 160, for example, the line-of-sight position is detected every 30 milliseconds and sent to the system control unit 50.
- From the line-of-sight position and the measured time, the system control unit 50 determines whether the user is moving the line of sight significantly or staring at a specific position (gazing). Gazing will be described in S306.
- In S306, the system control unit 50 determines whether or not the user is gazing. From the line-of-sight position and the measured time, the system control unit 50 determines that there is gazing when the amount of movement of the line-of-sight position within a predetermined time is equal to or less than a predetermined threshold value. For example, if the amount of movement of the line-of-sight position within 120 milliseconds is equal to or less than the predetermined threshold value, it is determined that there is gazing. If there is gazing, the process proceeds to S307.
- If there is no gazing, the process proceeds to S308.
- In the present embodiment, gazing is taken up as the condition for determining that the cursor display position should be moved to the line-of-sight position intended by the user, but the user's blinking or a voice instruction may be used as the condition instead.
- Alternatively, the cursor may be moved according to the detected line-of-sight position without requiring gazing (that is, S306 to S308 and S311 may be eliminated).
- Although the time measured for determining gazing is set to 120 milliseconds here, this measuring time may be set in advance or may be freely set by the user. It may also be changed according to the positional relationship between the position the user is currently gazing at and the gaze position detected 120 milliseconds earlier. The change of the measuring time according to the positional relationship of the gaze positions will be described later with reference to FIG. 8.
- In S307, the system control unit 50 stores 1 as the gaze flag in the system memory 52 (sets the gaze flag). If it is determined in S306 that there was gazing, the gaze flag is set to 1.
- In S308, the system control unit 50 stores 0 as the gaze flag in the system memory 52. If it is determined in S306 that there was no gazing, the gaze flag is set to 0.
- In S309, the system control unit 50 determines whether or not there is a touch-down on the touch panel 70a. If there is a touch-down, the process proceeds to S310; otherwise, the process returns to S305.
- In S310, the system control unit 50 temporarily prohibits/restricts the movement of the cursor by line-of-sight input. This is because, during a touch-on by the user, it is assumed that the user is performing the touch operation in order to finely adjust the cursor that was moved by line-of-sight input (to bring it to the position desired by the user). Therefore, after the start of the touch-down operation on the touch panel 70a, while the touch operation continues, the cursor is not moved based on the line of sight even if the gazing described in S306 occurs. As a result, it is possible to prevent a cursor that the user has finely adjusted by touch operation and moved to the desired position from moving to another position due to the line of sight.
- In S311, the system control unit 50 determines whether or not the gaze flag stored in the system memory 52 is 1. If the gaze flag is 1, that is, if there was gazing in S306, the process proceeds to S312. If the gaze flag is not 1, that is, if there was no gazing in S306, the process proceeds to S325.
- In S312, the system control unit 50 displays a cursor (indicator) in the line-of-sight selection display mode on the EVF 29 at the line-of-sight (gaze) position detected before or at the start of the touch operation on the touch panel 70a.
- The cursor of the line-of-sight selection display mode (hereinafter, line-of-sight cursor) is displayed so that the user can visually recognize that it is a cursor (indicator) different from the cursor of the touch selection display mode (hereinafter, touch cursor) described later.
- FIG. 5 is an example of the setting menu screen.
- In FIG. 5A, the line-of-sight cursor 501 makes the entire color of the selected area different from the color of the unselected areas. This is to prevent the user from confusing the line-of-sight cursor 501 with the touch cursor 502 described later. Accordingly, a display form different from that shown in FIG. 5A may be used as long as the user can identify the line-of-sight cursor 501 as distinct from the touch cursor 502. For example, a pointer may be displayed at the line-of-sight position.
- On the other hand, the cursor 500 is displayed so that the color of the outer frame of the selected area is different from the color of the outer frames of the unselected areas.
- the display form of the cursor 500 may be different from that of FIG. 5A as long as the display is not confused with the line-of-sight cursor 501.
- In S313, the system control unit 50 stores 0 as the slide flag in the system memory 52.
- In S314, the system control unit 50 determines whether or not there is a touch-move on the touch panel 70a. If there is a touch-move, the process proceeds to S315; if not, the process proceeds to S317.
- In S315, the system control unit 50 stores 1 as the slide flag in the system memory 52 (sets the slide flag).
- In S316, when there is a touch-move on the touch panel 70a in S314, the system control unit 50 displays the touch cursor 502 in place of the line-of-sight cursor 501 displayed in S312.
- The touch cursor 502 is moved relatively within the EVF 29 by a second amount obtained by multiplying the touch-move operation amount on the touch panel 70a by a second coefficient.
- The touch cursor 502 has a display form different from that of the line-of-sight cursor 501 displayed in S312 so that the user does not confuse the position selected by the line of sight with the position selected by touch.
- Specifically, the color of the outer frame of the selected area is made different from the color of the outer frames of the unselected areas, as shown by the touch cursor 502 in FIG. 5B.
- the display form of the cursor may be different from the above-mentioned display form as long as the user does not confuse the touch cursor 502 and the line-of-sight cursor 501.
- Although the display forms of the cursor 500 and the touch cursor 502 differ here, the cursor 500 and the touch cursor 502 may be displayed in the same display form as long as they can be recognized as cursors different from the line-of-sight cursor 501.
- the user can visually recognize the cursor displayed on the EVF 29 without confusion.
- the cursor is relatively moved according to the touch move operation.
- the touch operation position of the touch panel 70a and the cursor position displayed in the EVF 29 are not associated with each other.
- The second coefficient is smaller than the first coefficient described later. That is, the cursor does not move significantly even with a large touch-move operation amount, and for the same touch-move operation amount, the second amount obtained by multiplying the touch-move amount by the second coefficient yields a smaller cursor movement amount in the EVF 29 than the first amount obtained by multiplying the touch-move amount by the first coefficient described later. That is, the cursor can be moved more finely.
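The effect of the two coefficients can be sketched numerically. The coefficient values below are illustrative assumptions (the original gives no concrete numbers); the point is only that the second coefficient, used after a gaze has roughly positioned the cursor, is smaller than the first.

```python
# Illustrative values; the original specifies only that the
# second coefficient is smaller than the first.
FIRST_COEFFICIENT = 1.0    # without line-of-sight input: coarse, fast movement
SECOND_COEFFICIENT = 0.25  # after gaze positioning: fine adjustment

def cursor_movement(touch_move_amount, gaze_positioned):
    """Cursor movement amount in the EVF for a given touch-move amount.
    With gaze positioning, the smaller second coefficient applies, so the
    same finger movement yields a smaller (finer) cursor movement."""
    k = SECOND_COEFFICIENT if gaze_positioned else FIRST_COEFFICIENT
    return touch_move_amount * k
```

So the same finger travel moves the cursor a quarter as far during fine adjustment as it does when the touch operation alone must cover the whole screen.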
- In S317, the system control unit 50 determines whether or not there is a touch-up from the touch panel 70a. If there is a touch-up, the process proceeds to S318; if not, the process returns to S313.
- The touch cursor 502 displayed on the EVF 29 at the time of S317 may be changed to the display form of the cursor 500 in response to the touch-up operation.
- In S318, the system control unit 50 refers to the system memory 52 and determines whether or not the slide flag is 0. That is, it confirms whether or not there was a touch-move operation on the touch panel 70a in S314. If the slide flag is 0, that is, if there was no touch-move operation in S314, the process proceeds to S319; if the slide flag is 1, that is, if there was a touch-move operation in S314, the process proceeds to S321.
- the fact that the slide flag is 0 in S318 means that there was a touch-up on the touch panel 70a without a touch-move operation after the touch-down.
- In other words, it can be considered that, by the tap operation (a series of operations of touching down and then touching up) after a menu item has been selected by the line of sight, the user intends to confirm the menu item selected by the line-of-sight cursor 501.
- On the other hand, if the slide flag is 1, it is probable that the user wanted to select another menu item instead of the item selected by the line-of-sight cursor 501.
- In this case, the touch cursor 502 moves relatively by the second amount obtained by multiplying the touch-move operation amount by the second coefficient, which is smaller than the first coefficient.
- The touch cursor 502 therefore moves finely. As a result, the user can finely adjust the cursor from the menu item selected by the line-of-sight cursor 501 to the menu item desired by the user.
- In S319, the system control unit 50 performs a process of determining the item at the displayed cursor position. Specifically, an example of determining the item related to the picture style selected by the line-of-sight cursor 501 in FIG. 5 will be described.
- When this item is determined, the setting contents can be changed from the standard mode, which is a general image-quality setting for captured images, to image-quality settings such as auto and portrait.
- In S320, the system control unit 50 saves the selection process of the item performed in S319, that is, the changed contents of the setting item, in the non-volatile memory 56, and ends this control flow.
- In S321, the system control unit 50 determines whether or not a predetermined time T1 has elapsed since the touch-up in S317. If the predetermined time T1 has elapsed, the process proceeds to S322; if not, the process proceeds to S323.
- The predetermined time T1 is assumed to be about 300 milliseconds, but it may be a fixed predetermined value or may be arbitrarily set by the user.
- In S322, the system control unit 50 permits the movement of the line-of-sight cursor 501 by line-of-sight input, which was temporarily prohibited/restricted in S310, and the process returns to S302.
- In S323, the system control unit 50 determines whether or not there is a touch-down on the touch panel 70a. If there is a touch-down, the process returns to S314; if there is no touch-down, the process returns to S321. If the touch-down operation is performed again before the time T1 elapses from the touch-up in S317, it is considered that the user is trying to repeat the touch-move operation on the touch panel 70a. That is, there is a high possibility that the user is repeating touch-up and touch-down operations in order to move the touch cursor 502 further. Therefore, if the touch panel 70a is touched down again within the time T1 from the touch-up in S317, the temporary restriction of line-of-sight input is continued even though there was a touch-up.
- Then, while continuing the temporary restriction of line-of-sight input and maintaining the gaze position determined in S306, the presence or absence of a touch-move operation is determined in S314. As a result, it is possible to reduce cursor movement by the line of sight that runs contrary to the intention of a user who is trying to finely adjust the touch cursor 502 by repeating touch-move operations while maintaining the gaze position.
- In S324, the system control unit 50 determines whether or not there is a touch-down on the touch panel 70a, as in S309. If there is a touch-down, the process proceeds to S325; otherwise, the process returns to S302.
- In S325, the system control unit 50 displays the touch cursor 502 on an item on the setting menu screen in response to the touch-down on the touch panel 70a.
- When the line-of-sight input setting is invalid in S304, the line-of-sight cursor 501 is not displayed, and the touch cursor 502 is displayed in response to the touch-down.
- the cursor 500 in FIG. 5A is an initial position cursor displayed when the user displays the setting menu screen. The cursor 500 does not depend on either the line of sight or the touch operation.
- The touch cursor 502 displayed in S325 refers to the touch cursor 502 in FIG. 5B.
- The touch cursor 502 displayed in response to the touch-down appears at the cursor position that was displayed on the EVF 29 before the touch-down operation. That is, when a touch-down is performed while the cursor 500 is displayed as shown in FIG. 5A, the touch cursor 502 is displayed at the position of the cursor 500 in place of the cursor 500, and the touch cursor 502 is then moved relatively.
- In S326, the system control unit 50 stores 0 as the slide flag in the system memory 52, as in S313.
- In S327, the system control unit 50 determines whether or not there is a touch-move on the touch panel 70a. If there is a touch-move, the process proceeds to S328; if not, the process proceeds to S330.
- In S328, the system control unit 50 stores 1 as the slide flag in the system memory 52 (sets the slide flag), in the same manner as in S315.
- In S329, the system control unit 50 relatively moves the touch cursor 502 displayed in S325 within the EVF 29 by a first amount obtained by multiplying the touch-move operation amount by a first coefficient.
- The first coefficient is larger than the above-mentioned second coefficient.
- When there is no line-of-sight input, the touch cursor 502 is moved by the first amount obtained by multiplying the touch-move operation amount by the first coefficient. If the first coefficient is reasonably large, the number of repeated finger operations can be reduced, and the cursor can be moved to a desired position quickly even with a small touch-move operation amount.
- In S330, the system control unit 50 determines whether or not there is a touch-up from the touch panel 70a. If there is a touch-up, the process proceeds to S331; if not, the process returns to S326.
- In S331, the system control unit 50 refers to the system memory 52 and determines whether or not the slide flag is 0. That is, it confirms whether or not there was a touch-move operation on the touch panel 70a in S327. If the slide flag is 0, that is, if there was no touch-move in S327, the process proceeds to S319; if the slide flag is 1, that is, if there was a touch-move in S327, the process returns to S325.
- As described above, when line-of-sight input is enabled, the setting menu item selected by line-of-sight input is indicated by the line-of-sight cursor 501.
- When a touch-move operation is performed, the display form is changed from the line-of-sight cursor 501 to the touch cursor 502, and the touch cursor 502 is moved according to the touch-move operation.
- As a result, the user can roughly move the cursor to a desired position by the line of sight and can finely adjust it by a touch-move operation as needed.
- When there is no line-of-sight input, the cursor moves relatively by an amount larger than the amount applied to the same touch-move operation amount when there is a line-of-sight input, so that the cursor can be moved to a desired position quickly even without line-of-sight input. That is, the user can move the cursor to the desired position more quickly and accurately without confusion.
- In the present embodiment, the position designation by touch operation is defined as relative position designation. Therefore, there is no one-to-one correspondence between the position coordinates of the user's touch operation on the touch panel 70a and the position coordinates of the display unit 28. That is, since the cursor does not move to the touch position on the touch panel 70a, the user does not need to visually confirm the touch position. As a result, even when the user tries to select or change a setting menu item while looking through the finder, the user does not need to take their eyes off the finder to check the display unit 28. Since the operation can be performed without taking the eyes off the viewfinder, the possibility that the user feels annoyed and misses a photo opportunity can be reduced.
- the touch panel 70a (display unit 28) occupies a large area on the back surface of the digital camera 100, and is easier to access than a specific operation button.
- In FIGS. 3 to 5, an example of moving the cursor position on the setting menu screen has been described, but the present application is not limited to this.
- For example, the present application can also be applied to selecting an AF point to be used for AF from a plurality of AF points in the shooting mode. Because of the increase in the number of focus detection points (focus detection areas) that can be arbitrarily selected by the user, a large number of operations would be required to make fine adjustments from the line-of-sight position using the cross key 74.
- FIG. 6 is a display example of AF point selection in the finder
- FIG. 6A is a diagram showing that there are a plurality of AF points 600 that can be arbitrarily selected by the user.
- In contrast, according to the present application, the AF point can be moved by the movement amount 610 with a single touch-move operation on the touch panel 70a, so the AF point can be moved to a desired position quickly.
- Although FIG. 6 shows an example in which the cross key 74 is pressed only twice, if the AF point position desired by the user is diagonally below the AF point 602, the number of operations of the cross key 74 increases further. If the user's touch-move operation amount is large, the movement amount 610 is large, and if the touch-move operation amount is small, the movement amount 610 is small. Therefore, even if the AF point 603 desired by the user is somewhat far from the AF point 602, the AF point can be moved to the desired position more quickly and accurately with a single touch-move operation.
- In the present embodiment, the item at the displayed cursor position is determined in response to the tap operation.
- In general, some decision instruction from the user is required in order to execute an item. For example, when gazing is used as the instruction for determining an item, if the determination time for gazing is lengthened, it takes time to issue the decision instruction, and a photo opportunity may be missed. On the other hand, if the gazing determination time is shortened, a decision instruction contrary to the user's intention may be issued, which may annoy the user. Therefore, a decision instruction that more clearly reflects the user's intention is needed.
- If the tap operation is used as the decision instruction, as described in the present embodiment, it can be clearly recognized as the user's decision instruction to execute the item at the position of the line-of-sight cursor.
- the decision instruction can be given by tapping on the spot without further moving the finger.
- the touch panel 70a that performs the touch operation occupies a wider range than the operation buttons in the digital camera 100, and is easy to access.
- Since the user's decision instruction and fine adjustment (correction) instruction can both be performed by touch operation alone, it is not necessary to move the finger to various positions, making the operation easier for the user. For these reasons, by using a tap as the decision instruction, the user can move the cursor more quickly and accurately by line of sight and touch operation, and can issue the decision instruction more quickly by touch operation.
- If a direction key such as the cross key 74 or a multi-controller (not shown) that can be operated in eight directions is operated during the control flow of FIG. 3, it is assumed that the cursor is moved, according to the key operation, from the cursor position displayed at the time of the operation. Specifically, when the line-of-sight cursor 501 produced by line-of-sight input is displayed and a key operation is performed instead of a touch-move, the cursor moves from the position of the line-of-sight cursor 501. When line-of-sight input is invalid, the cursor 500 in FIG. 5A moves according to the key operation.
- The cursor moved in response to the operation of the direction keys may be a cursor in a display mode different from the line-of-sight cursor 501 of the line-of-sight selection display mode; it may have the same display form as the touch cursor 502 of the touch operation display mode, or another cursor display may be used.
- If the first shutter switch 62 is turned on during the control flow, the screen transitions from the setting menu screen to the shooting standby state, and the shooting preparation operation is started.
- If the second shutter switch 64 is pressed, the screen transitions from the setting menu screen to the shooting standby state, and the shooting process is performed.
- If the mode changeover switch 60 is operated, the mode is switched according to the operation, and the state then transitions to the shooting standby state without returning to the menu screen.
- In the present embodiment, the selected menu item is determined by a tap operation on the touch panel 70a, but the determination process does not have to be a simple tap operation.
- For example, instead of a simple tap operation, the determination process may be performed by pressing an operation button mounted on the digital camera 100, such as the SET button 75, after the touch-up operation in S317 or S330.
- Alternatively, pressing on the touch panel 70a may be detected, and the determination process may be performed in response to the pressing on the touch panel 70a. That is, a pressure sensor (not shown) that detects the strength of the touch operation may be provided to detect the pressing force on the operation surface of the display unit 28 (the operation surface of the touch panel 70a).
- the pressure sensor can continuously detect the strength of the pressing force when the display unit 28 is pressed by a touch operation.
- As the pressure sensor, one or a plurality of strain gauge sensors may be installed at portions that are distorted by the pressing force on the operation surface of the display unit 28, and the pressing force on the operation surface of the touch panel 70a may be detected from the output values of the strain gauge sensors.
- Alternatively, a capacitance sensor provided in parallel with the touch panel 70a may calculate, from the capacitance value, the distance between the finger on the operation surface and the capacitance sensor, which shrinks when the operation surface is distorted by the pressing force on the operation surface of the display unit 28. The pressure may then be calculated based on this distance, or the distance itself may be treated in the same way as the pressure. The pressure sensor may also be of another type as long as it can detect the pressing force on the operation surface of the touch panel 70a. For example, if a stylus is used on the operation surface, a sensor on the stylus side that detects the pressure applied to the tip of the stylus may be used, and the strength of the touch operation (pressing force) may be detected based on the output from that sensor.
- The strength (pressure) of the touch operation may be detected using various methods and various sensors, or a combination of a plurality of sensors (for example, by a weighted average).
- the pressure sensor may be integrally configured with the touch panel 70a.
- Hereinafter, a pressing operation on the operation surface of the display unit 28 is referred to as a touch-push. If the digital camera 100 is equipped with a touch panel 70a provided with such a pressure sensor, whether or not to perform the determination process of S319 may be decided depending on whether or not there was a touch-push before the touch-up in S317 or S330.
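The touch-push variant of the decision instruction can be sketched as follows; the function name, the threshold value, and the pressure units are assumptions for illustration only:

```python
PRESS_THRESHOLD = 0.5  # illustrative pressure threshold (arbitrary units)

def should_execute_determination(pressure_samples):
    """Sketch of using a touch-push as the decision instruction: the item
    at the cursor is determined only if the pressing force exceeded a
    threshold at some point before the touch-up. Threshold is an assumption."""
    return any(p >= PRESS_THRESHOLD for p in pressure_samples)
```

A light tap whose pressure trace never crosses the threshold would then not trigger the determination process, while a deliberate press before release would.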
- FIG. 7 is a control flowchart of the determination of the presence or absence of gazing performed in S306 of FIG. 3.
- This control process is realized by the system control unit 50 expanding the program stored in the non-volatile memory 56 into the system memory 52 and executing the program.
- The flowchart of FIG. 7 is started when the digital camera 100 is activated in the shooting mode, the setting related to line-of-sight input is enabled, and there is a line-of-sight input (the line-of-sight detection block 160 is in a state where the user's line of sight can be detected).
- In S701, the system control unit 50 calculates the difference between the line-of-sight detection position P(K) detected by the line-of-sight detection block 160 and the line-of-sight detection position P(K-1) detected immediately before P(K).
- This difference is referred to as the movement amount ΔP of the line-of-sight detection position (hereinafter, line-of-sight movement amount ΔP).
- K is a count value representing the number of times the control flow of FIG. 7 has been processed, and is stored in the system memory 52.
- In S702, the system control unit 50 determines whether or not the movement amount ΔP of the line-of-sight detection position calculated in S701 is smaller than the default movement amount threshold value Lo. That is, it is determined whether or not the line-of-sight position stays at a specific position. If ΔP is smaller than Lo (ΔP < Lo), that is, if the line-of-sight position stays within a specific position range, the process proceeds to S703. If ΔP is equal to or larger than Lo (ΔP ≥ Lo), that is, if the line-of-sight position does not stay within a specific position range but moves significantly, the process proceeds to S704.
- The default movement amount threshold value Lo represents a movement amount of the line-of-sight detection position in the EVF 29; if the line-of-sight movement amount ΔP is smaller than Lo, it is considered that the user is gazing at a specific position.
- When the system control unit 50 determines that ΔP is smaller than Lo (ΔP < Lo), the system timer 53 measures the duration of the state in which ΔP is smaller than Lo. For example, when the line-of-sight position is detected every 30 milliseconds, whether ΔP is smaller than Lo can be determined every 30 milliseconds.
- The default movement amount threshold value Lo may be a predetermined fixed value, a value that can be arbitrarily set by the user, or a value changed according to some condition or state.
- In S703, the system control unit 50 uses the system timer 53 to increment the timer count value T corresponding to the time measurement started in S702, and proceeds to S705.
- In S704, the system control unit 50 sets (clears) the timer count value T to 0 and proceeds to S705.
- The timer count value T is stored in the system memory 52. S704 corresponds to the case where it is determined in S702 that ΔP is equal to or larger than Lo (ΔP ≥ Lo), that is, the line of sight is moving significantly.
- The predetermined timer count threshold Tth may be a fixed value, a value that the user can set arbitrarily, or a value made larger when ΔP is small and smaller when ΔP is large.
- The system control unit 50 determines that the user has gazed when the timer count value T for the time counting started in S702 is larger than the threshold Tth (T > Tth), and records the determination result in the system memory 52.
- The system control unit 50 increments the count value K, which indicates the number of times the control flow of FIG. 7 has been processed, and stores it in the system memory 52.
- The count values T and K are initialized when the power of the digital camera 100 is turned off by the power switch.
- Likewise, when the digital camera 100 is started after the power has been turned off by the auto power off function (a function that turns off the power of the digital camera 100 when no operation is performed for a predetermined time, for example about one minute), T and K are initialized at startup.
- If the timer count value T, which corresponds to the duration of the ΔP < Lo state, exceeds the predetermined time threshold Tth, it is determined that a gaze has occurred.
- As for the value of the predetermined time threshold Tth, the larger the difference between the currently selected cursor display position Po and the current line-of-sight detection position P(K), the smaller Tth may be made.
- Alternatively, the configuration shown in FIG. 8 may be used.
- With such control, the line-of-sight position is not displayed every time the user unconsciously moves the line of sight or moves it by a large amount, which reduces annoyance.
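As an illustration only, the gaze determination steps S701 to S705 above can be sketched as follows. The 30 ms sampling interval follows the example in the description, while the threshold values Lo and Tth and all names here are assumptions, not values from this disclosure.

```python
import math

SAMPLE_MS = 30   # line-of-sight position sampled every 30 ms (per the description)
LO = 20.0        # movement amount threshold Lo (assumed units: pixels on the EVF 29)
TTH = 10         # timer count threshold Tth (assumed: 10 samples = 300 ms of dwell)

def gaze_determination(positions):
    """Return the sample index K at which a gaze is determined, or None.

    positions holds one (x, y) line-of-sight detection position P(K) per
    sample. Each step computes the movement amount dP (S701), compares it
    with Lo (S702), increments the timer count T while dP < Lo (S703) or
    clears T when the line of sight moves significantly (S704), and
    determines a gaze once T exceeds Tth (S705).
    """
    t = 0  # timer count value T
    for k in range(1, len(positions)):
        (x0, y0), (x1, y1) = positions[k - 1], positions[k]
        dp = math.hypot(x1 - x0, y1 - y0)  # movement amount dP (S701)
        if dp < LO:     # S702: line of sight stays within a specific range
            t += 1      # S703: increment the timer count T
        else:
            t = 0       # S704: clear the timer count T
        if t > TTH:     # S705: dwell long enough, so a gaze is determined
            return k
    return None
```

With these assumed values, a steady line of sight is confirmed after slightly more than Tth samples, while continuous large movements never trigger a determination.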
- FIG. 8 is an explanatory diagram of the threshold value change control of the gaze determination time in the present embodiment.
- Position 800 indicates the selected position displayed on the EVF 29; this position is designated as the selected position Po.
- The graph of FIG. 8 shows the relationship between the movement amount L, measured as the distance from the selected position Po (taken as the origin), and the timer count value T.
- When the line-of-sight detection position is position 801, it is denoted P1(K), and the time threshold Tth at this time is T1.
- When the line-of-sight detection position is position 802, it is denoted P2(K), and the time threshold Tth at this time is T2.
- The movement amount L and the time threshold Tth are related such that Tth decreases as L, measured from the selected position Po, increases. That is, the slope A2 of the straight line connecting Po and P2(K) shown in FIG. 8 and the slope A1 of the straight line connecting Po and P1(K) express this relationship.
- In other words, when the line of sight moves far from the selected position, the gaze determination time is shortened; when it moves only a short distance, the determination time is lengthened.
- As a result, the line-of-sight position can be specified more quickly when the line of sight is moved by a large amount, and more accurately when it is moved finely.
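For illustration, the distance-dependent threshold of FIG. 8 could be realized as a clamped linear decrease; the constants T_MAX, T_MIN, and SLOPE below are assumed values, not values disclosed in the description.

```python
import math

T_MAX = 600.0  # assumed gaze time threshold (ms) when P(K) is at the selected position Po
T_MIN = 100.0  # assumed floor so that even a large jump still requires a brief dwell
SLOPE = 2.0    # assumed decrease of Tth, in ms per unit of movement amount L

def time_threshold(po, pk):
    """Time threshold Tth for the current line-of-sight detection position P(K).

    The farther P(K) is from the selected position Po, the smaller Tth:
    a large line-of-sight jump is confirmed quickly, while a fine
    adjustment near Po demands a longer, more deliberate gaze.
    """
    l = math.hypot(pk[0] - po[0], pk[1] - po[1])  # movement amount L from Po
    return max(T_MIN, T_MAX - SLOPE * l)
```

Any monotonically decreasing function of L would serve; the linear form simply mirrors the straight lines drawn through Po in the figure.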
- The display methods shown in FIGS. 5A and 5B are examples of how selection targets are displayed in the "line-of-sight selection display mode" and the "touch selection display mode".
- However, the display method is not limited to these; the indicators shown as the various cursors may be displayed in any manner, for example with an arrow-shaped icon or a circle-shaped icon. For example, by changing the icon color for each display mode, the user can visually recognize which display mode is in use, so that erroneous operation is unlikely to occur, as in the embodiment shown in FIG.
- The user can set either absolute position designation or relative position designation as the method of specifying a position by a touch move operation in the eyepiece state; in the present embodiment, however, position designation by a touch move operation is treated as relative position designation regardless of the user's setting.
- In absolute position designation, the cursor is displayed at the position on the EVF 29 corresponding to the position at which the user touches the touch panel 70a. For a user looking into the EVF 29 to confirm the touch position accurately, he or she would have to take the eye off the EVF 29 and visually check the touch panel 70a.
- For this reason, position designation by a touch move operation is treated as relative position designation regardless of the user's setting.
- The selection cursor determination process is performed only when the slide flag is 1 (the slide flag is set). Since the slide flag is cleared only when no touch move is detected, it is not cleared when a touch-up is detected while a touch move is being detected.
- This configuration is suitable for the operation of sequentially moving the cursor frame position using the touch panel.
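The slide-flag handling above can be sketched as a small state update; the event names and the 0/1 flag representation here are assumptions made for illustration.

```python
def update_slide_flag(flag, event):
    """Update the slide flag (0 or 1) for one touch event.

    The flag is set when a touch move is detected and cleared only when no
    touch move is detected; a touch-up that arrives while a touch move is
    still being detected leaves the flag set, so the selection cursor
    determination process (performed only while the flag is 1) still runs.
    """
    if event == "touch_move":
        return 1       # touch move detected: set the slide flag
    if event == "no_touch_move":
        return 0       # no touch move detected: clear the slide flag
    if event == "touch_up_during_move":
        return flag    # touch-up during a touch move: the flag is not cleared
    return flag        # any other event leaves the flag unchanged
```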
- FIG. 9 shows an embodiment in which the EVF 29 of the present embodiment is not used.
- FIG. 9 is an example of moving the mouse pointer using the touch pad of a notebook computer.
- The touch pad 905 corresponds to the touch panel 70a described in the present embodiment.
- An independent line-of-sight detection device 910 is connected to the notebook computer, and the line-of-sight position is determined using the camera 911 mounted on the line-of-sight detection device 910 or the built-in camera 912 of the notebook computer.
- the mouse pointer moves to the line-of-sight position in response to the line-of-sight input.
- In response to a touch move operation on the touch pad 905, the mouse pointer displayed at the line-of-sight position moves by a second amount obtained by multiplying the touch move operation amount by a second coefficient (< first coefficient); for the same touch move amount, the second amount is smaller than the first amount. As a result, the mouse pointer can be finely adjusted even on the touch pad.
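As a sketch under assumed coefficient values (the description only requires that the second coefficient be smaller than the first), the two-coefficient pointer movement can be written as:

```python
FIRST_COEFF = 1.0   # assumed first coefficient for an ordinary relative movement
SECOND_COEFF = 0.3  # assumed second coefficient (< first coefficient)

def pointer_delta(touch_move_amount, fine_adjustment):
    """Relative pointer movement produced by a given touch move amount.

    When the pointer has just been placed by line-of-sight input, the touch
    move amount is multiplied by the second coefficient, yielding a second
    amount that is smaller than the first amount for the same touch move,
    so the pointer can be finely adjusted even on a touch pad.
    """
    coeff = SECOND_COEFF if fine_adjustment else FIRST_COEFF
    return touch_move_amount * coeff
```

The same scaling applies whether the input device is a touch pad, mouse, or joystick; only the source of the movement amount changes.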
- This embodiment can be implemented not only with the touch pad of a notebook computer but also with a mouse, a pointing device, a joystick, or the like.
- The touch pad, mouse, pointing device, joystick, and the like need not be built into the notebook computer and may be externally attached.
- A touch pad, joystick, or the like mounted on a remote controller such as a pointing remote can likewise be used for a touch move operation, or for a position movement instruction operation corresponding to a touch move, as in the present embodiment.
- In that case, the line-of-sight detection block 160 is mounted on, or connected to, the external monitor or projector. The line-of-sight detection sensor may also exist independently of the external monitor, projector, remote controller, and the like.
- The setting menu item selected by line-of-sight input is indicated by the line-of-sight cursor 501.
- When a touch move operation is performed, the display form changes from the line-of-sight cursor 501 to the touch cursor 502, and the touch cursor 502 moves according to the touch move operation.
- In this way, the user can roughly move the cursor to a desired position by line of sight and, as needed, fine-tune the position designated by the line of sight with a touch move operation. That is, the cursor can be moved to the user's desired position more quickly and more accurately.
- The various controls described above as being performed by the system control unit 50 may be performed by a single piece of hardware, or the processing may be shared among a plurality of pieces of hardware (for example, a plurality of processors and circuits) to control the entire device.
- Although the present invention has been described in detail based on its preferred embodiments, the present invention is not limited to these specific embodiments; various forms within the scope of the gist of the present invention are also included in the present invention.
- Although the touch panel 70a has been described as an example of a position movement instruction member used in combination with line-of-sight input, other operation means such as buttons and dials may be used.
- Although the display position has been described as an AF frame, it may be an icon frame, a parameter setting frame, or an indicator display other than an AF frame, such as a mouse pointer.
- Although the gaze determination criterion is the time accumulated after line-of-sight input to the line-of-sight detection block 160 is started, the accumulated time may be a preset time.
- Instead of the gaze-based criterion, the determination may be made solely on whether the line-of-sight input setting (line-of-sight AF of setting item 403 in FIG. 4) is enabled or disabled.
- Although the case where the present invention is applied to a digital camera has been described as an example, the present invention is not limited to this example and is applicable to any electronic device having receiving means capable of accepting line-of-sight input. The embodiments may also be combined as appropriate.
- Although the EVF 29 and line-of-sight detection are used here, the present embodiment can also be implemented in a configuration that combines a display device with line-of-sight detection. That is, the present invention is applicable to personal computers, PDAs, mobile phone terminals, portable image viewers, printer devices equipped with displays, digital photo frames, music players, game machines, and electronic book readers, as well as to wearable devices such as head-mounted displays.
- The present invention is also realized by executing the following processing: software (a program) that realizes the functions of the above-described embodiments is supplied to a system or device via a network or various storage media, and a computer (or CPU, MPU, or the like) of the system or device reads out and executes the program code. In this case, the program and the storage medium storing the program constitute the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- Studio Devices (AREA)
- Details Of Cameras Including Film Mechanisms (AREA)
- Camera Bodies And Camera Details Or Accessories (AREA)
- Indication In Cameras, And Counting Of Exposures (AREA)
- Automatic Focus Adjustment (AREA)
- User Interface Of Digital Computer (AREA)
- Focusing (AREA)
- Viewfinders (AREA)
- Position Input By Displaying (AREA)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202080044551.1A CN113994659B (zh) | 2019-06-17 | 2020-05-29 | Electronic device, control method therefor, program, and storage medium |
| US17/550,873 US11910081B2 (en) | 2019-06-17 | 2021-12-14 | Electronic apparatus, method for controlling the same, and storage medium |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019112315A JP7353811B2 (ja) | 2019-06-17 | 2019-06-17 | Electronic device and control method therefor |
| JP2019-112315 | 2019-06-17 |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/550,873 Continuation US11910081B2 (en) | 2019-06-17 | 2021-12-14 | Electronic apparatus, method for controlling the same, and storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2020255675A1 (ja) | 2020-12-24 |
Family
ID=73837405
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2020/021324 Ceased WO2020255675A1 (ja) | 2020-05-29 | Electronic device and control method therefor |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US11910081B2 (en) |
| JP (1) | JP7353811B2 (en) |
| CN (1) | CN113994659B (en) |
| WO (1) | WO2020255675A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2022165239A (ja) * | 2021-04-19 | 2022-10-31 | Canon Inc. | Imaging device, control method therefor, and program |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021235156A1 (ja) * | 2020-05-19 | 2021-11-25 | Sony Group Corporation | Imaging device, control method of imaging device, and program |
| USD985655S1 (en) * | 2021-01-19 | 2023-05-09 | Blackmagic Design Pty Ltd | Camera |
| CN118159928A (zh) * | 2022-01-26 | 2024-06-07 | Wacom Co., Ltd. | Position input device |
| JP1743924S (ja) * | 2022-08-18 | 2023-05-11 | Digital camera | |
| JP1744145S (ja) * | 2022-08-18 | 2024-05-16 | Digital camera |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002301030A (ja) * | 2001-04-09 | 2002-10-15 | Canon Inc | Device with line-of-sight detection function |
| JP2015022208A (ja) * | 2013-07-22 | 2015-02-02 | Canon Inc. | Optical device, control method therefor, and control program |
| JP2017103566A (ja) * | 2015-11-30 | 2017-06-08 | Canon Inc. | Imaging control device and control method therefor |
| JP2018023068A (ja) * | 2016-08-05 | 2018-02-08 | Canon Inc. | Electronic device, control method therefor, and program |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9445005B2 (en) * | 2013-07-22 | 2016-09-13 | Canon Kabushiki Kaisha | Optical device capable of selecting focus detection point, method of controlling the same, and storage medium |
| CN105338192A (zh) * | 2015-11-25 | 2016-02-17 | Nubia Technology Co., Ltd. | Mobile terminal and operation processing method therefor |
| US10093311B2 (en) * | 2016-07-06 | 2018-10-09 | Waymo Llc | Testing predictions for autonomous vehicles |
| JP6799063B2 (ja) * | 2016-07-20 | 2020-12-09 | FUJIFILM Corporation | Attention position recognition device, imaging device, display device, attention position recognition method, and program |
| WO2021131562A1 (ja) * | 2019-12-27 | 2021-07-01 | Canon Inc. | Electronic device, control method of electronic device, program, and storage medium |
| JP2021140590A (ja) * | 2020-03-06 | 2021-09-16 | Canon Inc. | Electronic device, control method of electronic device, program, and storage medium |
| JP2022129747A (ja) * | 2021-02-25 | 2022-09-06 | Canon Inc. | Electronic apparatus and control method therefor |
- 2019-06-17: JP2019112315A filed in Japan; granted as JP7353811B2 (active)
- 2020-05-29: PCT/JP2020/021324 filed as WO2020255675A1 (ceased)
- 2020-05-29: CN202080044551.1A filed in China; granted as CN113994659B (active)
- 2021-12-14: US 17/550,873 filed in the United States; granted as US11910081B2 (active)
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2002301030A (ja) * | 2001-04-09 | 2002-10-15 | Canon Inc | Device with line-of-sight detection function |
| JP2015022208A (ja) * | 2013-07-22 | 2015-02-02 | Canon Inc. | Optical device, control method therefor, and control program |
| JP2017103566A (ja) * | 2015-11-30 | 2017-06-08 | Canon Inc. | Imaging control device and control method therefor |
| JP2018023068A (ja) * | 2016-08-05 | 2018-02-08 | Canon Inc. | Electronic device, control method therefor, and program |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2022165239A (ja) * | 2021-04-19 | 2022-10-31 | Canon Inc. | Imaging device, control method therefor, and program |
| JP7739026B2 (ja) | 2021-04-19 | 2025-09-16 | Canon Inc. | Imaging device, control method therefor, and program |
Also Published As
| Publication number | Publication date |
|---|---|
| CN113994659A (zh) | 2022-01-28 |
| JP7353811B2 (ja) | 2023-10-02 |
| CN113994659B (zh) | 2023-09-26 |
| JP2020204710A (ja) | 2020-12-24 |
| US20220109797A1 (en) | 2022-04-07 |
| US11910081B2 (en) | 2024-02-20 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7353811B2 (ja) | Electronic device and control method therefor | |
| CN112099618B (zh) | Electronic device, control method of electronic device, and storage medium | |
| CN112104809B (zh) | Electronic apparatus, control method of electronic apparatus, and storage medium | |
| WO2018021165A4 (ja) | Electronic device and control method therefor | |
| US20250343985A1 (en) | Electronic device, control method therefor, program, and storage medium | |
| US11240419B2 (en) | Electronic device that can execute function in accordance with line of sight of user, method of controlling electronic device, and non-transitory computer readable medium | |
| JP2022018244A (ja) | Electronic device and control method therefor | |
| CN113364945B (zh) | Electronic apparatus, control method, and computer-readable medium | |
| CN112312008B (zh) | Electronic apparatus, control method of electronic apparatus, and storage medium | |
| CN112702507B (zh) | Electronic apparatus, control method of electronic apparatus, and storage medium | |
| JP7383552B2 (ja) | Electronic device and control method therefor | |
| JP2023160103A (ja) | Electronic device | |
| CN112040095B (zh) | Electronic apparatus, control method of electronic apparatus, and storage medium | |
| JP2021018634A (ja) | Electronic device and control method therefor | |
| JP7387493B2 (ja) | Electronic device, control method of electronic device, program, and storage medium | |
| JP2022129065A (ja) | Display control device, control method therefor, program, and recording medium | |
| JP7451255B2 (ja) | Electronic device and control method therefor | |
| WO2024095888A1 (ja) | Imaging device, control method, and program | |
| JP2022172840A (ja) | Electronic device, control method of electronic device, program, and storage medium | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20826797 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 20826797 Country of ref document: EP Kind code of ref document: A1 |