WO2014084224A1 - 電子機器および視線入力方法 - Google Patents
- Publication number
- WO2014084224A1 (application PCT/JP2013/081837)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- line
- sight
- user
- input
- processor
- Prior art date
- 2012-11-27
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/002—Specific input/output arrangements not covered by G06F3/01 - G06F3/16
- G06F3/005—Input arrangements through a video camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Definitions
- the present invention relates to an electronic apparatus and a line-of-sight input method, and more particularly, to an electronic apparatus and a line-of-sight input method for detecting line-of-sight input, for example.
- a data input device, which is an example of the background art, displays a group of input data such as a menu or a keyboard on a display device, images the eye portion of the device's user with a camera, determines the user's line-of-sight direction from the captured image, determines the input data located in that line-of-sight direction, and outputs the determined input data to an external device or the like.
- a gaze detection apparatus detects the gaze of the subject by detecting the center of the pupil and the corneal reflection point from the captured image.
- gaze input devices tend to be larger in proportion to the distance between the sensor and the eyeball. Accordingly, in consideration of mounting on a relatively small electronic device such as a portable terminal, for example, the above-described two background technologies are large and unsuitable.
- a main object of the present invention is to provide a novel electronic device and line-of-sight input method.
- Another object of the present invention is to provide an electronic device and a line-of-sight input method capable of suppressing power consumption when detecting the line of sight of a user.
- a first aspect of this invention is an electronic device that detects a line-of-sight input, which is an input based on a user's gaze point, and executes an operation based on the line-of-sight input. The electronic device comprises: a processor that executes a detection process for detecting the line-of-sight input; a display unit that displays a screen including a specific region and an object display area for displaying operation objects; a detection unit that detects the gaze point; a first setting unit that sets the detection accuracy of the line-of-sight input to a first accuracy mode when the user's gaze point is included in the specific region; and a second setting unit that sets the detection accuracy to a second accuracy mode, whose detection accuracy is higher than that of the first accuracy mode, when the user's gaze point is included in the object display area.
- a second aspect of this invention is a line-of-sight input method in an electronic device that detects a line-of-sight input, which is an input based on a user's gaze point, executes an operation based on the line-of-sight input, and has a processor that executes a detection process for detecting the line-of-sight input and a display unit that displays a screen including a specific region and an object display area for displaying operation objects. The method includes a first setting step of setting the detection accuracy of the line-of-sight input to a first accuracy mode when the user's gaze point is included in the specific region, and a second setting step of setting the detection accuracy to a second accuracy mode, whose detection accuracy is higher than that of the first accuracy mode, when the user's gaze point is included in the object display area.
- FIG. 1 is an external view showing a mobile phone according to an embodiment of the present invention.
- FIG. 2 is a block diagram showing an electrical configuration of the mobile phone shown in FIG.
- FIG. 3 is an illustrative view showing one example of a gazing point detected on the display surface of the display shown in FIG. 1.
- FIG. 4 is an illustrative view showing one example of a pupil and a Purkinje image photographed by the infrared camera shown in FIG. 1.
- FIG. 5 is an illustrative view showing an example of the line-of-sight vector calculated by the processor shown in FIG. 2; FIG. 5(A) shows an example of the first center position and the second center position, and FIG. 5(B) shows an example of the line-of-sight vector.
- FIG. 6 is an illustrative view showing a display example of objects displayed on the display shown in FIG.
- FIG. 7 is an illustrative view showing one example of a memory map of the RAM shown in FIG.
- FIG. 8 is a flowchart showing an example of part of the line-of-sight input process of the processor shown in FIG.
- FIG. 9 shows another part of the line-of-sight input process of the processor shown in FIG. 2, and is a flowchart subsequent to FIG. 8.
- FIG. 10 is an illustrative view showing one example of an operation state when the high accuracy mode shown in FIG. 9 is set.
- FIG. 11 is an illustrative view showing one example of an operation state when the low accuracy mode of the specific example 1 shown in FIG. 9 is set.
- FIG. 12 is an illustrative view showing one example of an operation state when the low accuracy mode of the specific example 2 shown in FIG. 9 is set.
- FIG. 13 is an illustrative view showing one example of an operation state when the low accuracy mode of the specific example 3 shown in FIG. 9 is set.
- FIG. 14 is a flowchart showing another example of the line-of-sight input process of the processor shown in FIG.
- FIG. 15 is an illustrative view showing a display example of an object of another embodiment displayed on the display shown in FIG.
- a mobile phone 10 is a so-called smartphone, and includes a vertically long flat rectangular housing 12.
- the main surface (front surface) of the housing 12 is provided with a display 14 that functions as a display unit and is formed of, for example, liquid crystal or organic EL.
- a touch panel 16 is provided on the display 14.
- a speaker 18 is built in the surface side of one end in the vertical direction of the housing 12, and a microphone 20 is built in the surface side of the other end in the vertical direction.
- a call key 22, an end call key 24, and a menu key 26 are provided as hardware keys.
- an infrared LED 30 and an infrared camera 32 are provided on the left side of the microphone 20, and a proximity sensor 34 is provided on the right side of the speaker 18.
- the light emitting surface of the infrared LED 30, the imaging surface of the infrared camera 32, and the detection surface of the proximity sensor 34 are provided so as to be exposed from the housing 12, and other portions are built in the housing 12.
- the user can input a telephone number by touching, via the touch panel 16, the dial keys displayed on the display 14, and can start a voice call by operating the call key 22. The voice call can be ended by operating the end call key 24. Further, the power of the mobile phone 10 can be turned on and off by holding down the end call key 24.
- a menu screen is displayed on the display 14.
- the user can perform a selection operation on the software keys and icons by performing a touch operation on the touch panel 16 with respect to the software keys and menu icons displayed on the display 14 in that state.
- a mobile phone such as a smartphone will be described as an example of the electronic device.
- the present invention can be applied to various electronic devices including a display device.
- examples of other electronic devices include a feature phone, an electronic book terminal, a tablet terminal, a PDA, an arbitrary electronic device such as a notebook PC and a display device.
- the mobile phone 10 shown in FIG. 1 includes a processor 40.
- to the processor 40 are connected the infrared camera 32, the proximity sensor 34, a wireless communication circuit 42, an A/D converter 46, a D/A converter 48, an input device 50, a display driver 52, a flash memory 54, a RAM 56, a touch panel control circuit 58, an LED driver 60, a captured image processing circuit 62, and the like.
- the processor 40, which may be called a computer or a CPU, controls the entire mobile phone 10.
- the processor 40 includes an RTC 40a, and the RTC 40a measures the date and time.
- all or part of the program stored in advance in the flash memory 54 is expanded (loaded) into the RAM 56 when used, and the processor 40 executes various processes according to the program expanded in the RAM 56. At this time, the RAM 56 is used as a working area or buffer area of the processor 40.
- the input device 50 includes the hardware keys (22, 24, 26) shown in FIG. 1, and functions as an operation unit or an input unit together with the touch panel 16 and the touch panel control circuit 58.
- Information on the hardware key operated by the user is input to the processor 40.
- hereinafter, an operation using a hardware key is referred to as a “key operation”.
- the wireless communication circuit 42 is a circuit for transmitting and receiving radio waves for voice calls and mails through the antenna 44.
- the wireless communication circuit 42 performs wireless communication by the CDMA method. For example, when the user operates the input device 50 to instruct a telephone call (calling), the wireless communication circuit 42 executes telephone call processing under the instruction of the processor 40 and outputs a telephone call signal via the antenna 44.
- the telephone call signal is transmitted to the other party's telephone through the base station and the communication network.
- when the other party answers the call, the processor 40 executes the call processing.
- the microphone 20 shown in FIG. 1 is connected to the A / D converter 46, and the audio signal from the microphone 20 is input to the processor 40 as digital audio data through the A / D converter 46.
- the speaker 18 is connected to the D / A converter 48.
- the D / A converter 48 converts digital audio data into an audio signal and supplies the audio signal to the speaker 18 through an amplifier. Therefore, the sound data is output from the speaker 18.
- the sound collected by the microphone 20 is transmitted to the other party's telephone, and the sound collected by the other party's telephone is output from the speaker 18.
- the processor 40 can adjust the volume of the sound output from the speaker 18 by controlling the amplification factor of the amplifier connected to the D/A converter 48, for example in response to a volume adjustment operation by the user.
- the display driver 52 controls the display on the display 14 connected to the display driver 52 under the instruction of the processor 40.
- the display driver 52 includes a video memory that temporarily stores image data to be displayed.
- the display 14 is provided with a backlight using, for example, an LED as a light source, and the display driver 52 controls the brightness of the backlight and lighting / extinguishing in accordance with instructions from the processor 40.
- the touch panel 16 shown in FIG. 1 is connected to the touch panel control circuit 58.
- the touch panel control circuit 58 applies the necessary voltage and the like to the touch panel 16, and inputs to the processor 40 a touch start signal indicating the start of a touch by the user, an end signal indicating the end of the touch, and coordinate data indicating the touch position. The processor 40 can thus determine which icon or key the user has touched based on this coordinate data.
- the touch panel 16 is a capacitance type touch panel that detects a change in capacitance generated between the surface and an object such as a finger approaching the surface.
- the touch panel 16 detects that one or more fingers touched the touch panel 16, for example.
- the touch panel control circuit 58 functions as a detection unit, detects a touch operation within the effective touch range of the touch panel 16, and outputs coordinate data (touch coordinate data) indicating the position of the touch operation to the processor 40.
- the processor 40 can determine which icon or key the user has touched based on the touch coordinate data input from the touch panel control circuit 58.
- hereinafter, an operation using the touch panel 16 is referred to as a “touch operation”.
- the touch operation of this embodiment includes a tap operation, a long tap operation, a flick operation, a slide operation, and the like.
- the touch panel 16 may employ a surface capacitive method, a resistive film method, an ultrasonic method, an infrared method, an electromagnetic induction method, or the like.
- the touch operation is not limited to the user's finger, and may be performed with a stylus pen or the like.
- the proximity sensor 34 includes a light emitting element (for example, an infrared LED) and a light receiving element (for example, a photodiode).
- the processor 40 calculates the distance of an object (for example, a user's face) close to the proximity sensor 34 (mobile phone 10) from the change in the output of the photodiode.
- the light emitting element emits infrared rays, and the light receiving element receives the infrared rays reflected by the user's face or the like. For example, when the light receiving element is far from the user's face, the infrared light emitted from the light emitting element is hardly received by the light receiving element, and as the face approaches, more of it is received. Thus, the processor 40 can calculate the distance from the proximity sensor 34 to the object based on the amount of received light.
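- As an illustration, a minimal sketch of this distance estimation is shown below; the inverse-square falloff model, the calibration constants, and the function name are illustrative assumptions, not taken from this publication.

```python
# Hypothetical sketch: distance from a proximity sensor's photodiode reading.
# Assumes received intensity falls off roughly with the square of distance,
# calibrated at one known point (calib_value observed at calib_distance_cm).
def estimate_distance_cm(adc_value: int, calib_value: int = 2000,
                         calib_distance_cm: float = 5.0) -> float:
    if adc_value <= 0:
        return float("inf")  # nothing reflected back: object is far away
    return calib_distance_cm * (calib_value / adc_value) ** 0.5
```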
- the infrared LED 30 shown in FIG. 1 is connected to the LED driver 60.
- the LED driver 60 switches on / off (lights on / off) the infrared LED 30 based on a control signal from the processor 40.
- An infrared camera 32 (see FIG. 1) that functions as a photographing unit is connected to the photographed image processing circuit 62.
- the captured image processing circuit 62 performs image processing on the captured image data from the infrared camera 32 and inputs monochrome image data to the processor 40.
- the infrared camera 32 executes photographing processing under the instruction of the processor 40 and inputs photographed image data to the photographed image processing circuit 62.
- the infrared camera 32 is composed of, for example, a color camera using an imaging element such as a CCD or CMOS, and an infrared transmission filter that attenuates (cuts) light of the R, G, and B wavelengths and transmits light of infrared wavelengths. Therefore, if the infrared transmission filter is made detachable, a color image can be obtained by removing it.
- the wireless communication circuit 42, the A/D converter 46, and the D/A converter 48 described above may be included in the processor 40.
- an input operation using a line of sight (hereinafter sometimes referred to as a “line of sight operation”) is possible instead of a key operation or a touch operation.
- in a line-of-sight operation, a predetermined process set in association with a predetermined area (hereinafter referred to as a determination area) indicated by the point at which the line of sight and the display surface of the display 14 intersect (the gaze point) is executed.
- the user sets his or her dominant eye (here, the left eye) among the left and right eyes.
- the infrared camera 32 captures the face of the user (subject) irradiated with the infrared light emitted by the infrared LED 30.
- An eyeball peripheral image is acquired using a feature point extraction technique for the photographed image.
- a pupil is detected by a labeling process on the acquired image around the eyeball, and the reflected light of the infrared rays (the Purkinje image) is detected by a differential filter process.
- although the methods for detecting the pupil and the Purkinje image from the photographed image have been outlined here, these methods are already well known and are not essential to this embodiment, so a detailed description is omitted.
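- For readers who want a concrete picture, the following is a rough sketch of such pupil/Purkinje detection using standard OpenCV operations; the threshold value and the overall pipeline are illustrative assumptions that would need per-sensor tuning, not the publication's own implementation.

```python
import cv2
import numpy as np

def detect_pupil_and_purkinje(eye_gray: np.ndarray):
    """Sketch: pupil via thresholding + labeling, glint via a differential filter."""
    # Pupil: the darkest large blob -> inverse threshold, then connected-
    # component labeling, then pick the largest component by area.
    _, dark = cv2.threshold(eye_gray, 50, 255, cv2.THRESH_BINARY_INV)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(dark)
    pupil = max(range(1, n), key=lambda i: stats[i, cv2.CC_STAT_AREA], default=None)
    pupil_center = tuple(centroids[pupil]) if pupil is not None else None

    # Purkinje image: a small sharp highlight -> the Laplacian (a differential
    # filter) responds strongly (negatively) at its center; take the peak.
    resp = cv2.Laplacian(eye_gray.astype(np.float32), cv2.CV_32F)
    _, _, _, glint_center = cv2.minMaxLoc(-resp)
    return pupil_center, glint_center
```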
- a Purkinje image can be detected both in the state where the eyelids are relatively wide open and in the state where the eyelids are slightly closed.
- the distance between the infrared LED 30 and the infrared camera 32 is determined based on the distance between the user's face and the mobile phone 10 (the surface of the housing or the display surface of the display 14) when the user uses the mobile phone 10.
- when the processor 40 detects the pupil and the Purkinje image from the captured image, it detects the direction of the dominant eye's line of sight (the line-of-sight vector V). Specifically, a vector directed from the position of the Purkinje image toward the position of the pupil in the two-dimensional image captured by the infrared camera 32 is detected. That is, as shown in FIGS. 5(A) and 5(B), the vector from the first center position A toward the second center position B is the line-of-sight vector V.
- the coordinate system in the infrared camera 32 is determined in advance, and the line-of-sight vector V is calculated using the coordinate system.
- using the line-of-sight vector V calculated in this way, calibration is performed as an initial setting for the line-of-sight operation.
- the line-of-sight vectors V when the four corners of the display 14 are respectively watched are acquired, and each line-of-sight vector V is stored as calibration data.
- a gaze point is detected by obtaining a line-of-sight vector V each time an image is captured by the infrared camera 32 and comparing it with calibration data.
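- As a hedged illustration of this comparison step, the sketch below maps a gaze vector to a display position by bilinearly blending the four stored corner vectors. The publication only says the current vector is "compared with calibration data", so the interpolation scheme, the grid search, and all names here are assumptions.

```python
import numpy as np

def gaze_point(v, v_tl, v_tr, v_bl, v_br, screen_w, screen_h):
    """v and the four corner vectors are 2-D arrays; returns (x, y) in pixels."""
    def blend(s, t):  # bilinear mix of the calibrated corner vectors
        return ((1 - s) * (1 - t) * v_tl + s * (1 - t) * v_tr
                + (1 - s) * t * v_bl + s * t * v_br)
    grid = np.linspace(0.0, 1.0, 101)   # a coarse grid search suffices here
    s, t = min(((a, b) for a in grid for b in grid),
               key=lambda st: np.linalg.norm(blend(*st) - v))
    return s * screen_w, t * screen_h
```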
- when the gazing point is detected at the same position a number of times equal to the determination count, the processor 40 detects that a line-of-sight input has been made at that gazing point.
- the distance L between the eyes of the user is calculated from the center position of the Purkinje image of the left and right eyes.
- the distance L between the eyes of the user is stored together with the calibration data.
- when the process of detecting the gazing point is executed and the line-of-sight vector V is calculated, the distance L between both eyes recorded at calibration is compared with the current distance L between both eyes, and it is determined whether or not the distance between the display 14 and the user's face has changed. If it is determined that the distance has changed, the amount of change is calculated from the recorded distance L and the current distance L, and the magnitude of the line-of-sight vector V is corrected accordingly.
- for example, when the user's face has moved farther from the display 14, the line-of-sight vector V is corrected to be larger, and when the face has come closer, the line-of-sight vector V is corrected to be smaller.
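- A minimal sketch of this correction, assuming the vector scales linearly with the ratio of the recorded inter-eye distance L to the current one (the function name and the linearity are assumptions):

```python
def correct_gaze_vector(v, recorded_l: float, current_l: float):
    # Face moved away -> current_l < recorded_l -> scale > 1 (enlarge);
    # face moved closer -> scale < 1 (shrink), matching the text above.
    scale = recorded_l / current_l
    return (v[0] * scale, v[1] * scale)
```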
- FIG. 6 is an illustrative view showing a display example of the display 14 when the electronic book application is executed.
- the display 14 includes a status display area 70 and a function display area 72.
- an icon (pict) indicating a radio wave reception status by the antenna 44, an icon indicating the remaining battery capacity of the secondary battery, and a time are displayed.
- the function display area 72 includes a standard key display area 80 in which a HOME key 90 and a BACK key 92 are displayed as standard keys, a first application key display area 82 in which a return key 94 for returning to the previous page is displayed, a second application key display area 84 in which a forward key 96 for proceeding to the next page is displayed, and a text display area 86 in which the text of the electronic book is displayed.
- the HOME key 90 is a key for terminating a running application and displaying a standby screen.
- the BACK key 92 is a key for terminating a running application and displaying a screen before the application is executed.
- the HOME key 90 and the BACK key 92 are displayed whenever an application is being executed, regardless of the type of the application.
- a notification icon may be displayed in the status display area 70. For example, when a new mail is received, a new mail icon is displayed in the status display area 70 as a notification icon, and when there is nothing to notify, no notification icon is displayed.
- keys, GUIs, widgets (gadgets), and the like displayed on the display 14 are collectively referred to as objects.
- the standard key display area 80, the first application key display area 82, and the second application key display area 84 may be collectively referred to as an object display area.
- the user can arbitrarily operate the application being executed by performing line-of-sight input on these objects. For example, when a line of sight is input to the return key 94 or the forward key 96, the displayed page of the electronic book is changed.
- the detection accuracy of the gaze input is changed based on the gaze point of the user.
- in this embodiment, the first accuracy mode is also referred to as the low accuracy mode, and the second accuracy mode as the high accuracy mode.
- in the low accuracy mode, the processing for detecting the user's line of sight is simplified, so that the power consumption when detecting the user's line of sight can be suppressed.
- when the user's gazing point is included in the specific area, the detection accuracy of the gaze input is set to the low accuracy mode, and when the user's gazing point is included in the object display area, the detection accuracy of the line-of-sight input is set to the high accuracy mode.
- the text display area 86 is set as the specific area. This is because detection of the user's line-of-sight input is not so important in the area where the text of the electronic book is displayed. For this reason, when the user's line of sight is directed to the text of the electronic book, the detection accuracy of the line-of-sight input is set to the low-accuracy mode. On the other hand, when the user's line of sight is directed toward the object display area, the line of sight may be input to the object. Therefore, the detection accuracy of the line-of-sight input is set to the high accuracy mode.
- since the high accuracy mode is set in the normal state and the low accuracy mode is set to detect line-of-sight input with low power consumption, the high accuracy mode may also be called the “normal mode” and the low accuracy mode the “power-saving mode”.
- the high-accuracy mode or the low-accuracy mode may be set in the same manner as the above-described electronic book application during execution of a mail application or a browser application that displays text.
- the specific area used when another application is executed may be set for each application, or an area may be determined to be the specific area when the number of characters displayed in it exceeds a predetermined value.
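- The region test itself can be as simple as the following sketch; the rectangle coordinates and names are illustrative placeholders, since the publication defines the areas per application (for the e-book, the text display area 86 is the specific area).

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

    def contains(self, px: int, py: int) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

TEXT_AREA = Rect(0, 120, 480, 560)        # specific area (e-book text 86), assumed
OBJECT_AREAS = [Rect(0, 680, 480, 120),   # standard keys (HOME 90 / BACK 92), assumed
                Rect(0, 40, 480, 80)]     # application keys (94, 96), assumed

def select_accuracy_mode(gx: int, gy: int) -> str:
    if TEXT_AREA.contains(gx, gy):
        return "low"            # first accuracy mode: suppress power consumption
    if any(r.contains(gx, gy) for r in OBJECT_AREAS):
        return "high"           # second accuracy mode: precise object selection
    return "high"               # outside the specific area -> high (per S19/S23)
```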
- program storage area 502 and data storage area 504 are formed in RAM 56 shown in FIG. 2.
- the program storage area 502 is an area for reading and storing (developing) part or all of the program data set in advance in the flash memory 54 (FIG. 2).
- the program storage area 502 stores a gaze input program 510 for detecting a gaze input and executing an operation based on the gaze input.
- the program storage area 502 includes a program for executing a telephone function, a mail function, an alarm function, and the like.
- in the data storage area 504, a proximity buffer 530, a gaze point buffer 532, a line-of-sight buffer 534, and the like are provided.
- the data storage area 504 stores an area coordinate table 536, object data 538, an object table 540, and the like.
- the proximity buffer 530 temporarily stores the distance information to the object obtained from the proximity sensor 34.
- the gaze point buffer 532 temporarily stores the detected gaze point.
- the line-of-sight buffer 534 temporarily stores the position when a line-of-sight input is detected.
- the area coordinate table 536 is a table including, for example, information on the coordinate ranges of the status display area 70, the function display area 72, the standard key display area 80, the first application key display area 82, and the second application key display area 84.
- the object data 538 is data including an image of an object displayed on the display 14 and character string data.
- the object table 540 is a table including information such as the display position (coordinate range) of the object displayed on the display 14.
- the processor 40 processes a plurality of tasks, including the line-of-sight input processing shown in FIG. 8 and FIG. 9, in parallel under the control of a Linux (registered trademark)-based OS such as Android (registered trademark), or another OS such as REX (registered trademark).
- when the line-of-sight operation is enabled, the gaze input processing shown in FIG. 8 and FIG. 9 is executed.
- the processor 40 turns on the proximity sensor 34 in step S1. That is, the distance from the mobile phone 10 to the user is measured by the proximity sensor 34. Subsequently, in step S3, the processor 40 determines whether or not the output of the proximity sensor 34 is less than a threshold value. That is, it is determined whether the user's face exists in a range where the infrared rays output from the infrared LED 30 affect the user's eyes. If “NO” in the step S3, that is, if the output of the proximity sensor 34 is equal to or larger than the threshold value, the processor 40 turns off the proximity sensor 34 in a step S5 and ends the line-of-sight input process.
- in step S5, a notification (for example, a pop-up or voice) prompting the user to move his or her face away from the mobile phone 10 may be issued.
- if “YES” in the step S3, that is, if the output of the proximity sensor 34 is less than the threshold value, the processor 40 turns on the infrared LED 30 in a step S7 and turns on the infrared camera 32 in a step S9. That is, the infrared LED 30 and the infrared camera 32 are turned on to detect the user's line-of-sight input.
- in step S11, the processor 40 executes face recognition processing. That is, image data captured by the infrared camera 32 is read from the RAM 56, and processing for detecting the user's face is performed on the read image data.
- in step S13, the processor 40 determines whether or not the user's face has been recognized by the face recognition processing. If “NO” in the step S13, that is, if the user's face is not recognized, the processor 40 returns to the process of the step S11.
- if “YES” in the step S13, that is, if the user's face is recognized, the processor 40 detects the gazing point in a step S15. That is, the position on the display 14 at which the user is gazing is detected, and the coordinates of the detected gazing point are recorded in the gazing point buffer 532. The processor 40 that executes the process of step S15 functions as a detection unit.
- in step S17, the processor 40 determines whether or not a gazing point has been detected, that is, whether the user's gazing point could be detected from the recognized face image. If “NO” in the step S17, that is, if no gazing point is detected, the processor 40 returns to the process of the step S11.
- the processor 40 determines whether or not the gazing point is included in the specific area in a step S19. For example, it is determined whether the gazing point recorded in the gazing point buffer 532 is included in the coordinate range of the text display area 86 included in the area coordinate table 536. If “YES” in the step S19, for example, if the user's gaze point is included in the text display area 86 shown in FIG. 6, the processor 40 sets the low accuracy mode in a step S21.
- if “NO” in the step S19, that is, if the gazing point is not included in the specific area, the processor 40 sets the high accuracy mode in a step S23. When the process of step S21 or step S23 ends, the processor 40 proceeds to the process of step S25.
- the processor 40 that executes the process of step S21 functions as a first setting unit
- the processor 40 that executes the process of step S23 functions as a second setting unit.
- in step S25, the processor 40 executes the line-of-sight detection process; that is, the user's line-of-sight input is detected based on the set detection accuracy.
- in step S27, the processor 40 performs an operation based on the position where the line-of-sight input was made. For example, when a line-of-sight input is made on an object, the application or processing associated with that object is executed. However, when a line-of-sight input is detected on a character in the text display area 86, the processor 40 performs no operation unless some processing or operation is associated with that character.
- in step S29, the processor 40 determines whether or not the line-of-sight input has ended; for example, it determines whether an operation for disabling the line-of-sight input has been performed. If “NO” in the step S29, that is, if the line-of-sight input has not ended, the processor 40 returns to the process of the step S11. On the other hand, if “YES” in the step S29, that is, if the line-of-sight input has ended, the processor 40 turns off the infrared LED 30, the infrared camera 32, and the proximity sensor 34 in a step S31, so that the power consumption of the mobile phone 10 can be suppressed. When the process of step S31 is completed, the processor 40 ends the line-of-sight input process.
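- Condensed into code, the flow of steps S1-S31 looks roughly like the sketch below; every hardware object and every `hooks` routine is a hypothetical stand-in for the circuits and processes described above, not an API defined by this publication.

```python
def gaze_input_process(proximity_sensor, ir_led, ir_camera, hooks, threshold):
    proximity_sensor.on()                        # S1
    if proximity_sensor.read() >= threshold:     # S3: face too close
        proximity_sensor.off()                   # S5
        return
    ir_led.on()                                  # S7
    ir_camera.on()                               # S9
    while True:
        face = hooks.recognize_face(ir_camera.capture())   # S11
        if face is None:                         # S13 -> retry from S11
            continue
        gaze = hooks.detect_gaze_point(face)     # S15
        if gaze is None:                         # S17 -> retry from S11
            continue
        if hooks.in_specific_area(gaze):         # S19
            hooks.set_low_accuracy_mode()        # S21
        else:
            hooks.set_high_accuracy_mode()       # S23
        pos = hooks.detect_gaze_input()          # S25
        hooks.perform_operation(pos)             # S27
        if hooks.input_finished():               # S29
            break
    ir_led.off()                                 # S31
    ir_camera.off()
    proximity_sensor.off()
```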
- in the first low-accuracy mode of the specific example 1, the frame rate of the infrared camera 32 is set lower than in the high accuracy mode. The operation states of the high accuracy mode and the first low-accuracy mode will now be compared and described.
- FIG. 10 is an illustrative view showing operation states of the infrared camera 32, the processor 40, and the infrared LED 30 when the high-accuracy mode is set.
- the infrared LED 30 is always on, and the infrared camera 32 outputs a captured image at a frame rate of 20 fps (Frames Per Second).
- the processor 40 stores the captured image in the buffer of the RAM 56. Once stored, processing necessary for eye-gaze detection is performed on the captured image.
- as the processing necessary for line-of-sight detection, the processor 40 performs image reading processing for reading, from the buffer of the RAM 56, a photographed image input from the photographed image processing circuit 62, face recognition processing for recognizing a face from the read image, and line-of-sight detection processing for detecting the user's line of sight.
- the processor 40 performs these processes on each captured image output from the infrared camera 32 while the line-of-sight detection is enabled.
- the processor 40 executes the processing necessary for line-of-sight detection after the captured image is input from the captured image processing circuit 62; therefore, there is a lag between the timing at which the captured image is output from the infrared camera 32 and the timing at which the processor 40 executes the processing necessary for line-of-sight detection.
- FIG. 11 is an illustrative view showing operation states of the infrared camera 32, the processor 40, and the infrared LED 30 when the first low-accuracy mode is set.
- the infrared camera 32 in the first low-accuracy mode outputs captured images at a half frame rate relative to the frame rate of the high-accuracy mode, that is, a frame rate of 10 fps.
- the infrared LED 30 is turned on immediately before shooting and then turned off until the next frame is output; that is, the infrared LED 30 blinks in accordance with the output of captured images.
- after executing the processing necessary for line-of-sight detection, the processor 40 shifts to a sleep state until the next captured image is input. Therefore, in the first low-accuracy mode, the time until a line-of-sight input is detected, that is, the responsiveness when the user performs a line-of-sight input, is lower than in the high-accuracy mode, and the user perceives the detection accuracy as lower than in the high-accuracy mode.
- by lowering the frame rate of the infrared camera 32, the power consumption of the infrared camera 32 during line-of-sight detection can be suppressed, and as a result the power consumption when detecting the user's line of sight can be suppressed. Lowering the frame rate also reduces the power consumption of the infrared LED 30 and the processor 40.
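- A sketch of this duty-cycled capture loop is shown below; the 10 fps figure comes from the text (half of the 20 fps high-accuracy rate), while the driver objects and the processing hook are hypothetical stand-ins.

```python
import time

def run_first_low_accuracy_mode(ir_led, ir_camera, process_frame, fps=10):
    period = 1.0 / fps                 # half of the 20 fps high-accuracy rate
    while True:
        ir_led.on()                    # light up immediately before shooting
        frame = ir_camera.capture()
        ir_led.off()                   # stay dark until the next frame
        process_frame(frame)           # detection pipeline (hypothetical hook)
        time.sleep(period)             # idle (sleep) instead of busy-waiting
```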
- FIG. 12 is an illustrative view showing operation states of the infrared camera 32, the processor 40, and the infrared LED 30 when the second low-accuracy mode is set.
- in the second low-accuracy mode of the specific example 2, the frequency at which the processor 40 executes the processing necessary for line-of-sight detection is lowered relative to the high accuracy mode.
- the processor 40 shifts to a sleep state after completing the processing necessary for eye-gaze detection, and does not execute that processing even when the next frame is output. When the frame after that is output, the processor 40 returns from the sleep state and executes the processing necessary for eye-gaze detection. At this time, the frame rate of the infrared camera 32 is not changed, and the infrared LED 30 blinks at the same frequency as in the first specific example.
- therefore, in the second low-accuracy mode as well, the time until a line-of-sight input is detected, that is, the responsiveness when the user performs a line-of-sight input, is lower than in the high-precision mode.
- since the power consumption of the processor 40 during line-of-sight detection can be suppressed without changing the frame rate of the infrared camera 32, the power consumption when detecting the user's line of sight can be suppressed.
- in another embodiment, the processor 40 may simply skip the processing necessary for eye-gaze detection without shifting to the sleep state.
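- In code, the second low-accuracy mode reduces to handling only every other frame, as in this sketch (the camera stream and the processing hook are hypothetical stand-ins):

```python
def run_second_low_accuracy_mode(ir_camera, process_frame, skip: int = 2):
    # The camera keeps its 20 fps rate; the processor handles only every
    # `skip`-th frame and stays idle (sleeps or simply skips) otherwise.
    for i, frame in enumerate(ir_camera.frames()):
        if i % skip != 0:
            continue
        process_frame(frame)
```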
- FIG. 13 is an illustrative view showing operation states of the infrared camera 32, the processor 40, and the infrared LED 30 when the third low-accuracy mode is set.
- in the third low-accuracy mode of the specific example 3, a first line-of-sight detection process, which is the same as the detection process executed in the high accuracy mode, and a second line-of-sight detection process, whose algorithm is simplified relative to the first, are executed alternately. For example, when the first line-of-sight detection process is performed on the captured image of the first frame, the second line-of-sight detection process is performed on the captured image of the next (second) frame, and the first line-of-sight detection process is performed again on the captured image of the third frame.
- the operations of the infrared LED 30 and the infrared camera 32 are substantially the same as in the high accuracy mode. However, the power consumption of the processor 40 that executes the second line-of-sight detection process is lower than when the first line-of-sight detection process is executed.
- in the first line-of-sight detection process, the position of the line-of-sight input is detected with an accuracy of “one pixel”, whereas in the second line-of-sight detection process the position is detected with an accuracy of “one area” consisting of a plurality of pixels.
- in the second line-of-sight detection process, the accuracy of the detected gaze position is therefore lower than in the first line-of-sight detection process, and the user perceives the detection accuracy of the third low-accuracy mode as lower than that of the high accuracy mode. In the third low-accuracy mode, however, the power consumption when detecting the user's line of sight can be suppressed without changing the operation of the hardware (the infrared LED 30, the infrared camera 32, and the like).
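- The alternation could be sketched as follows, with the simplified pass running on a downscaled frame and snapping its result to a coarse cell; the downscale factor, the cell size, and all names are illustrative assumptions rather than the publication's algorithm.

```python
def run_third_low_accuracy_mode(ir_camera, detect_gaze, handle, cell=40):
    for i, frame in enumerate(ir_camera.frames()):   # numpy frames assumed
        if i % 2 == 0:
            x, y = detect_gaze(frame)                # first process: "one pixel"
        else:
            x, y = detect_gaze(frame[::4, ::4])      # simplified, cheaper pass
            x, y = (x * 4 // cell) * cell, (y * 4 // cell) * cell  # "one area"
        handle(x, y)
```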
- the proximity sensor 34 may be provided adjacent to the infrared LED 30 and the infrared camera 32 in order to improve the accuracy of detecting the distance from the mobile phone 10 to the user.
- the infrared LED 30 and the infrared camera 32 may be provided adjacent to the proximity sensor 34.
- the proximity of the user's face to the mobile phone 10 may be detected using the infrared LED 30 and the infrared camera 32 instead of the proximity sensor 34.
- the infrared LED 30 emits light weakly, and the light reception level of the infrared camera 32 is measured.
- if the received light level exceeds the threshold value, it is determined that the user's face is within the range in which the infrared rays output from the infrared LED 30 affect the user's eyes, and the processor 40 ends the line-of-sight input process. On the other hand, if the received light level is below the threshold value, the infrared LED 30 is set to the normal light emission state, and the user's line-of-sight input is detected as described above.
- the light reception level of the infrared camera 32 is calculated based on the shutter speed and the amplifier gain value: when the illuminance is high, the shutter speed becomes fast and the amplifier gain value becomes low, while when the illuminance is low, the shutter speed becomes slow and the amplifier gain value becomes high.
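- A minimal sketch of such a level estimate follows, assuming the level simply grows as the exposure time and gain shrink; the scoring formula is an assumption, not taken from the publication.

```python
def light_reception_level(shutter_s: float, amp_gain: float) -> float:
    # Bright scene -> auto-exposure picks a short shutter and a low gain,
    # so the estimated received-light level rises as both decrease.
    return 1.0 / (shutter_s * amp_gain)
```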
- in step S55, the processor 40 measures the light reception level of the infrared camera 32; that is, the light reception level is calculated based on the shutter speed of the infrared camera 32 and the amplifier gain value.
- in step S57, the processor 40 determines whether or not the light reception level is less than a threshold value; that is, as in step S3, it is determined whether the user's face is within the range in which the infrared rays output from the infrared LED 30 affect the user's eyes. If “NO” in the step S57, that is, if the light reception level is equal to or greater than the threshold value, the processor 40 proceeds to step S61, turns off the infrared LED 30 and the infrared camera 32, and ends the line-of-sight input process.
- if “YES” in the step S57, that is, if the light reception level is less than the threshold value, the processor 40 sets the infrared LED 30 to the normal light emission state in a step S59. Subsequently, after the processes of steps S11 to S29 are executed and the user's line-of-sight input is detected, the processor 40 proceeds to the process of step S61.
- in step S61, the infrared LED 30 and the infrared camera 32 are turned off as described above, since the detection of the line-of-sight input has finished. When the process of step S61 ends, the processor 40 ends the line-of-sight input process.
- in the above description, the line-of-sight operation is treated as always available; in practice, however, there are cases where the line-of-sight operation (line-of-sight input) is possible and cases where it is not.
- the case where the line-of-sight operation is possible is, for example, when an application set in advance so that the line-of-sight operation can be performed is being executed.
- examples of such applications include an electronic book application, a mail application, and the like.
- the case where the line-of-sight operation is not possible is, for example, when an application set in advance that the line-of-sight operation cannot be performed is being executed.
- an example of such an application is the call function.
- when the line-of-sight operation is possible, a message or an image (icon) to that effect may be displayed.
- a message or an image indicating that a line-of-sight input is being received may be displayed. In this way, the user can recognize that the line-of-sight operation is possible and that the line-of-sight input is accepted.
- the validity / invalidity of the line-of-sight operation may be switched according to the orientation of the mobile phone 10.
- the infrared camera 32 of another embodiment may have a higher sensitivity to infrared rays than a normal color camera.
- the color camera constituting the infrared camera 32 of the other embodiment may be provided with an infrared cut filter (a low-pass filter) that attenuates (cuts) light of infrared wavelengths so that light of the R, G, and B wavelengths is received better. Even in an infrared camera 32 provided with such an infrared cut filter, the sensitivity to light of infrared wavelengths may be increased.
- the infrared cut filter may be detachable from the infrared camera 32.
- a mode icon indicating the currently set mode may be shown to the user.
- for example, when the low accuracy mode is set, a first mode icon 100 including the character string “Lo” is displayed in the status display area 70, and when the high accuracy mode is set, a second mode icon 102 including the character string “Hi” is displayed in the status display area 70.
- the first mode icon 100 or the second mode icon 102 may be displayed only when one of the modes is set.
- the first mode icon 100 may not be displayed when the low accuracy mode is set, and the second mode icon 102 may be displayed only when the high accuracy mode is set.
- the program used in this embodiment may be stored in the HDD of the data distribution server and distributed to the mobile phone 10 via the network.
- a plurality of programs may also be stored in a storage medium, such as an optical disc (a CD, DVD, or BD), a USB memory, or a memory card, and the storage medium may be sold or distributed in that state.
- the present embodiment is an electronic device that detects a line-of-sight input, which is an input based on a user's gaze point, and executes an operation based on the line-of-sight input. The electronic device includes a processor that performs a detection process for detecting the line-of-sight input, a display unit that displays a screen including a specific region and an object display area for displaying operation objects, a detection unit that detects the gaze point, a first setting unit that sets the detection accuracy of the line-of-sight input to a first accuracy mode when the user's gaze point is included in the specific region, and a second setting unit that sets the detection accuracy to a second accuracy mode, higher in detection accuracy than the first accuracy mode, when the user's gaze point is included in the object display area.
- the processor (40) of the electronic device detects an input by a line of sight (hereinafter referred to as a line-of-sight input).
- the line-of-sight input is detected when the number of times that the gazing point is detected at the same position reaches the number of determinations.
- an operation is executed based on the input position.
- a display part (14) displays a screen including the specific area and an object display area for displaying operation objects.
- a detection part (40, S15) detects a user's gaze point.
- the first setting unit (40, S21) sets the detection accuracy of the line-of-sight input to the first accuracy mode (low accuracy mode) when, for example, the user's gazing point is included in a specific area in which the text of the electronic book application is displayed.
- the second setting unit (40, S23) sets the detection accuracy of the line-of-sight input to a second accuracy mode (high accuracy mode), whose detection accuracy is higher than that of the first accuracy mode, when the user's gazing point is included in the object display area.
- the power consumption when detecting the user's line of sight can be suppressed by changing the detection accuracy of the line-of-sight input based on the position of the user's line of sight.
- Another embodiment further includes a camera for detecting line-of-sight input, and lowers the frame rate of the camera when the first accuracy mode is set.
- the camera (32) is provided in an electronic device to detect a line-of-sight input.
- when the first accuracy mode is set, the frame rate of the camera is set low.
- the power consumption of the camera during the detection of the line of sight can be suppressed by lowering the frame rate of the camera. As a result, power consumption when detecting the user's line of sight can be suppressed.
- in another embodiment, when the first accuracy mode is set, the processor reduces the frequency of executing the process for detecting the line-of-sight input.
- since the power consumption of the processor during line-of-sight detection can be suppressed, the power consumption when detecting the user's line of sight can be suppressed.
- in still another embodiment, when the first accuracy mode is set, the detection processing algorithm executed by the processor is simplified. In this case, the accuracy of the detected line-of-sight input position is lower than in the state where the second accuracy mode is set.
- another embodiment is a line-of-sight input method in an electronic device that detects a line-of-sight input, which is an input based on a user's gaze point, executes an operation based on the line-of-sight input, and performs a detection process for detecting the line-of-sight input. The method includes a first setting step of setting the detection accuracy of the line-of-sight input to the first accuracy mode (low accuracy mode) when the user's gaze point is included in the specific region, and a second setting step of setting the detection accuracy of the line-of-sight input to the second accuracy mode (high accuracy mode), higher in detection accuracy than the first accuracy mode, when the user's gaze point is included in the object display area.
- the power consumption when detecting the user's line of sight can be suppressed by changing the detection accuracy of the line-of-sight input based on the position of the user's line of sight.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Telephone Function (AREA)
- Position Input By Displaying (AREA)
- User Interface Of Digital Computer (AREA)
Description
In the first low-accuracy mode of the specific example 1, the frame rate of the infrared camera 32 is lowered relative to the high-accuracy mode. The operation states of the high-accuracy mode and the first low-accuracy mode are compared below.
In the second low-accuracy mode of the specific example 2, the frequency with which the processor 40 detects the line of sight is lowered relative to the high-accuracy mode. The operation states of the high-accuracy mode and the second low-accuracy mode are compared below; since the high-accuracy mode has already been described, its detailed description is omitted here.
In the third low-accuracy mode of the specific example 3, a first line-of-sight detection process, which is the same as the line-of-sight detection process executed in the high-accuracy mode, and a second line-of-sight detection process, whose algorithm is simplified relative to the first, are executed. The operation states of the high-accuracy mode and the third low-accuracy mode are compared below; since the high-accuracy mode has already been described, its detailed description is omitted here.
14 … display
16 … touch panel
30 … infrared LED
32 … infrared camera
34 … proximity sensor
40 … processor
50 … input device
54 … flash memory
56 … RAM
60 … LED driver
62 … captured image processing circuit
Claims (5)
- 1. An electronic device that detects a line-of-sight input, which is an input based on a user's gaze point, and executes an operation based on the line-of-sight input, the electronic device comprising:
a processor that executes a detection process for detecting the line-of-sight input;
a display unit that displays a screen including a specific region and an object display area for displaying operation objects;
a detection unit that detects the gaze point;
a first setting unit that sets the detection accuracy of the line-of-sight input to a first accuracy mode when the user's gaze point is included in the specific region; and
a second setting unit that sets the detection accuracy of the line-of-sight input to a second accuracy mode having higher detection accuracy than the first accuracy mode when the user's gaze point is included in the object display area.
- 2. The electronic device according to claim 1, further comprising a camera for detecting the line-of-sight input, wherein the frame rate of the camera is lowered when the first accuracy mode is set.
- 3. The electronic device according to claim 1, wherein the processing frequency of the processor is lowered when the first accuracy mode is set.
- 4. The electronic device according to claim 1, wherein an algorithm of the detection process executed by the processor is simplified when the first accuracy mode is set.
- 5. A line-of-sight input method in an electronic device that detects a line-of-sight input, which is an input based on a user's gaze point, executes an operation based on the line-of-sight input, and includes a processor that executes a detection process for detecting the line-of-sight input and a display unit that displays a screen including a specific region and an object display area for displaying operation objects, wherein the processor of the electronic device executes the following steps:
a first setting step of setting the detection accuracy of the line-of-sight input to a first accuracy mode when the user's gaze point is included in the specific region; and
a second setting step of setting the detection accuracy of the line-of-sight input to a second accuracy mode having higher detection accuracy than the first accuracy mode when the user's gaze point is included in the object display area.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/647,798 US20150309568A1 (en) | 2012-11-27 | 2013-11-27 | Electronic apparatus and eye-gaze input method |
JP2014550205A JPWO2014084224A1 (ja) | 2012-11-27 | 2013-11-27 | Electronic apparatus and line-of-sight input method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-258413 | 2012-11-27 | ||
JP2012258413 | 2012-11-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014084224A1 true WO2014084224A1 (ja) | 2014-06-05 |
Family
ID=50827860
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/081837 WO2014084224A1 (ja) | 2012-11-27 | 2013-11-27 | 電子機器および視線入力方法 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20150309568A1 (ja) |
JP (1) | JPWO2014084224A1 (ja) |
WO (1) | WO2014084224A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020517012A (ja) * | 2017-04-14 | 2020-06-11 | Magic Leap, Inc. | Multimodal eye tracking |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9633186B2 (en) * | 2012-04-23 | 2017-04-25 | Apple Inc. | Systems and methods for controlling output of content based on human recognition data detection |
JP2015090569A (ja) * | 2013-11-06 | 2015-05-11 | Sony Corporation | Information processing apparatus and information processing method |
US11567566B2 (en) * | 2015-04-08 | 2023-01-31 | Controlrad Systems, Inc. | Devices and methods for monitoring gaze |
CN106843709B (zh) | 2015-12-04 | 2020-04-14 | Alibaba Group Holding Limited | Method and apparatus for displaying presentation objects according to real-time information |
TW201740250A (zh) * | 2016-05-04 | 2017-11-16 | PixArt Imaging Inc. | Touch detection method and touch detection system |
EP3555850B1 (en) * | 2016-12-15 | 2021-10-27 | General Electric Company | System and method for image segmentation using a joint deep learning model |
US10757328B2 (en) * | 2016-12-23 | 2020-08-25 | Microsoft Technology Licensing, Llc | Eye tracking using video information and electrooculography information |
US10810773B2 (en) * | 2017-06-14 | 2020-10-20 | Dell Products, L.P. | Headset display control based upon a user's pupil state |
KR102495359B1 (ko) * | 2017-10-27 | 2023-02-02 | Samsung Electronics Co., Ltd. | Object tracking method and apparatus |
US10855979B2 (en) | 2018-10-23 | 2020-12-01 | Microsoft Technology Licensing, Llc | Interpreting eye gaze direction as user input to near-eye-display (NED) devices for enabling hands free positioning of virtual items |
US10718942B2 (en) | 2018-10-23 | 2020-07-21 | Microsoft Technology Licensing, Llc | Eye tracking systems and methods for near-eye-display (NED) devices |
US10838490B2 (en) * | 2018-10-23 | 2020-11-17 | Microsoft Technology Licensing, Llc | Translating combinations of user gaze direction and predetermined facial gestures into user input instructions for near-eye-display (NED) devices |
US10996746B2 (en) | 2018-10-23 | 2021-05-04 | Microsoft Technology Licensing, Llc | Real-time computational solutions to a three-dimensional eye tracking framework |
US10852823B2 (en) | 2018-10-23 | 2020-12-01 | Microsoft Technology Licensing, Llc | User-specific eye tracking calibration for near-eye-display (NED) devices |
US11740071B2 (en) | 2018-12-21 | 2023-08-29 | Apple Inc. | Optical interferometry proximity sensor with temperature variation compensation |
US11156456B2 (en) | 2019-05-21 | 2021-10-26 | Apple Inc. | Optical proximity sensor integrated into a camera module for an electronic device |
US11473898B2 (en) | 2019-05-24 | 2022-10-18 | Apple Inc. | Wearable voice-induced vibration or silent gesture sensor |
US11695758B2 (en) * | 2020-02-24 | 2023-07-04 | International Business Machines Corporation | Second factor authentication of electronic devices |
US11460293B2 (en) | 2020-09-25 | 2022-10-04 | Apple Inc. | Surface quality sensing using self-mixing interferometry |
US11874110B2 (en) | 2020-09-25 | 2024-01-16 | Apple Inc. | Self-mixing interferometry device configured for non-reciprocal sensing |
US11629948B2 (en) | 2021-02-04 | 2023-04-18 | Apple Inc. | Optical interferometry proximity sensor with optical path extender |
WO2023215112A1 (en) * | 2022-05-04 | 2023-11-09 | Apple Inc. | Retinal reflection tracking for gaze alignment |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1091325A (ja) * | 1996-09-13 | 1998-04-10 | Toshiba Corp | Gaze detection system |
JP2009301166A (ja) * | 2008-06-11 | 2009-12-24 | Panasonic Corp | Electronic device control apparatus |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8593375B2 (en) * | 2010-07-23 | 2013-11-26 | Gregory A Maltz | Eye gaze user interface and method |
JP2012048358A (ja) * | 2010-08-25 | 2012-03-08 | Sony Corp | Browsing device, information processing method and program |
US20120229391A1 (en) * | 2011-01-10 | 2012-09-13 | Andrew Skinner | System and methods for generating interactive digital books |
US9519423B2 (en) * | 2011-04-22 | 2016-12-13 | Sony Corporation | Information processing apparatus |
US8976110B2 (en) * | 2011-10-27 | 2015-03-10 | Tobii Technology Ab | Power management in an eye-tracking system |
2013
- 2013-11-27 US US14/647,798 patent/US20150309568A1/en not_active Abandoned
- 2013-11-27 JP JP2014550205A patent/JPWO2014084224A1/ja active Pending
- 2013-11-27 WO PCT/JP2013/081837 patent/WO2014084224A1/ja active Application Filing
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1091325A (ja) * | 1996-09-13 | 1998-04-10 | Toshiba Corp | Gaze detection system |
JP2009301166A (ja) * | 2008-06-11 | 2009-12-24 | Panasonic Corp | Electronic device control apparatus |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020517012A (ja) * | 2017-04-14 | 2020-06-11 | Magic Leap, Inc. | Multimodal eye tracking |
JP2022132349A (ja) * | 2017-04-14 | 2022-09-08 | Magic Leap, Inc. | Multimodal eye tracking |
US11449140B2 (en) | 2017-04-14 | 2022-09-20 | Magic Leap, Inc. | Multimodal eye tracking |
US11561615B2 (en) | 2017-04-14 | 2023-01-24 | Magic Leap, Inc. | Multimodal eye tracking |
JP7211966B2 (ja) | 2017-04-14 | 2023-01-24 | Magic Leap, Inc. | Multimodal eye tracking |
JP2023014151A (ja) * | 2017-04-14 | 2023-01-26 | Magic Leap, Inc. | Multimodal eye tracking |
JP7291841B2 (ja) | 2017-04-14 | 2023-06-15 | Magic Leap, Inc. | Multimodal eye tracking |
JP7455905B2 (ja) | 2017-04-14 | 2024-03-26 | Magic Leap, Inc. | Multimodal eye tracking |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014084224A1 (ja) | 2017-01-05 |
US20150309568A1 (en) | 2015-10-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014084224A1 (ja) | Electronic apparatus and line-of-sight input method | |
JP6043586B2 (ja) | Electronic device, line-of-sight input program, and line-of-sight input method | |
US20220206741A1 (en) | Volume adjustment method and electronic device | |
US20220066725A1 (en) | Message processing method, related apparatus, and system | |
JP6105953B2 (ja) | Electronic device, line-of-sight input program, and line-of-sight input method | |
EP2990852B1 (en) | Head-mounted display hosting a smartphone for providing virtual reality environment | |
US11243657B2 (en) | Icon display method, and apparatus | |
KR102534354B1 (ko) | System navigation bar display control method, graphical user interface, and electronic device | |
US11258893B2 (en) | Method for prompting notification message and mobile terminal | |
JP6062175B2 (ja) | Portable terminal, power-saving control program, and power-saving control method | |
US11843715B2 (en) | Photographing method and terminal | |
US11625164B2 (en) | Display method and terminal device | |
KR20190013339A (ко) | Electronic device and control method thereof | |
EP3232301B1 (en) | Mobile terminal and virtual key processing method | |
WO2022134632A1 (zh) | Work processing method and apparatus | |
CN112764654B (zh) | Component snapping operation method, apparatus, terminal, and storage medium | |
KR102553558B1 (ko) | Electronic device and touch event processing method thereof | |
CN111897465A (zh) | Pop-up window display method, apparatus, device, and storage medium | |
CN111708479A (zh) | Touch operation response method, apparatus, terminal, and storage medium | |
CN109104573B (zh) | Focusing point determination method and terminal device | |
CN111158575B (zh) | Method, apparatus, and device for terminal-executed processing, and storage medium | |
US20220021763A1 (en) | Touch Operation Locking Method and Electronic Device | |
CN109725820B (zh) | Method and apparatus for acquiring list entries | |
CN112486371B (zh) | Application icon dragging method, apparatus, and storage medium | |
CN109246345B (zh) | Beauty-pupil shooting method, apparatus, storage medium, and mobile terminal | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13858859 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2014550205 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 14647798 Country of ref document: US |
122 | Ep: pct application non-entry in european phase |
Ref document number: 13858859 Country of ref document: EP Kind code of ref document: A1 |