CN111061372A - Equipment control method and related equipment - Google Patents

Equipment control method and related equipment

Info

Publication number
CN111061372A
Authority
CN
China
Prior art keywords
touch signal
user interface
floating window
glasses
key
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911310496.1A
Other languages
Chinese (zh)
Other versions
CN111061372B (en)
Inventor
吴恒刚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201911310496.1A
Publication of CN111061372A
Application granted
Publication of CN111061372B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B27/0172Head mounted characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The application discloses a device control method and related devices, applied to AR glasses, wherein the AR glasses comprise a Bluetooth module, a display and a human eye tracking lens, the human eye tracking lens being arranged on one side of the display. The method comprises the following steps: displaying a user interface on the display; determining, through the human eye tracking lens, a first gaze position at which a human eye gazes at the user interface, and receiving, through the Bluetooth module, a first touch signal sent by a touch ring paired with the AR glasses; and performing a control operation based on the first gaze position and the first touch signal. By adopting the embodiments of the application, the AR glasses can be controlled conveniently.

Description

Equipment control method and related equipment
Technical Field
The present application relates to the field of augmented reality technologies, and in particular, to an apparatus control method and related apparatus.
Background
A wearable display device is the general term for a display device designed with wearable technology so that it can be worn in daily life. Augmented Reality (AR) glasses are one such wearable device: after putting on AR glasses, the user can see virtual information superimposed on the picture of the real environment. As AR glasses are used more and more frequently, how to control them conveniently has become a technical problem to be solved urgently.
Disclosure of Invention
The embodiment of the application provides a device control method and related devices, which are used for conveniently controlling AR glasses.
In a first aspect, an embodiment of the present application provides an apparatus control method, which is applied to augmented reality AR glasses, where the AR glasses include a bluetooth module, a display, and a human eye tracking lens, the human eye tracking lens is disposed on one side of the display, and the method includes:
displaying a user interface on the display;
determining a first fixation position where a human eye gazes at the user interface through the human eye tracking lens, and receiving a first touch signal sent by a touch ring matched with the AR glasses through the Bluetooth module;
and performing control operation based on the first gaze position and the first touch signal.
In a second aspect, an embodiment of the present application provides an apparatus control device, which is applied to augmented reality AR glasses, the AR glasses include a bluetooth module, a display and a human eye tracking lens, the human eye tracking lens is disposed on one side of the display, and the apparatus includes:
a display unit for displaying a user interface on the display;
the gaze position determining unit is used for determining a first gaze position where the human eyes gaze the user interface through the human eye tracking lens;
the communication unit is used for receiving a first touch signal sent by a touch ring matched with the AR glasses through the Bluetooth module;
and the control unit is used for carrying out control operation based on the first fixation position and the first touch signal.
In a third aspect, embodiments of the present application provide AR glasses, including a processor, a memory, a bluetooth module, a display, an eye tracking lens, a wireless communication module, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for performing the steps of the method according to the first aspect of the embodiments of the present application.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium for storing a computer program, where the computer program is executed by a processor to implement some or all of the steps described in the method according to the first aspect of the embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in a method as described in the first aspect of embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiments of the application, the AR glasses first display the user interface on the display, then determine the gaze position at which the human eye gazes at the user interface and receive, through the Bluetooth module, the touch signal sent by the touch ring paired with the AR glasses, and finally perform a control operation based on the gaze position and the touch signal. The eye tracking makes the interaction fine-grained while the touch ring makes it convenient, so the AR glasses can be controlled both conveniently and accurately.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and that those skilled in the art can obtain other drawings from these drawings without creative effort.
Fig. 1A is a schematic structural diagram of AR glasses provided in an embodiment of the present application;
fig. 1B is a schematic structural diagram of another AR glasses provided in the embodiments of the present application;
fig. 1C is a schematic structural diagram of another AR glasses provided in the embodiments of the present application;
FIG. 1D is a schematic diagram of a user interface provided by an embodiment of the present application;
fig. 1E is a schematic structural diagram of a touch ring provided in the embodiment of the present application;
fig. 2A is a schematic flowchart of an apparatus control method according to an embodiment of the present application;
figs. 2B-2W are schematic diagrams of other user interfaces provided by embodiments of the application;
fig. 3 is a schematic structural diagram of another AR glasses provided by the present application;
fig. 4 is a schematic structural diagram of an apparatus control device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The following are detailed below.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and in the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1A to 1C, fig. 1A to 1C are schematic structural views of AR glasses according to an embodiment of the present application. The AR glasses include an AR glasses main body 101, a main camera 102, a plurality of first infrared tracking light emitting diodes (LEDs) 103, a first eye tracking lens 104, a plurality of second infrared tracking LEDs 105, a second eye tracking lens 106, a left-eye display 107, and a right-eye display 108.
Optionally, the main camera 102, the plurality of first infrared tracking LEDs 103, the first eye tracking lens 104, the plurality of second infrared tracking LEDs 105, the second eye tracking lens 106, the left-eye display 107, and the right-eye display 108 are all fixed on the AR glasses main body 101. The plurality of first infrared tracking LEDs 103 are disposed along the peripheral side of the left-eye display 107, and the first eye tracking lens 104 is disposed on one side of the left-eye display 107. The plurality of second infrared tracking LEDs 105 are disposed along the peripheral side of the right-eye display 108, and the second eye tracking lens 106 is disposed on one side of the right-eye display 108.
Optionally, the AR glasses are further provided with a processor, an image information processing module, a memory, a bluetooth module, an eye movement tracking processing module, a wireless communication module, and the like. The image information processing module and the eye tracking processing module may be independent of the processor or may be integrated in the processor. The AR glasses can be in communication connection with a wireless communication device (such as a smart phone, a tablet computer and the like) through the wireless communication module.
The AR glasses can be worn on the user's head so that the user can clearly see the virtual information superimposed on the real world. The main camera 102 is used for taking photos and videos; after the main camera 102 is started, it collects the external light 109 and converts it into digital information, and the digital information is converted into visible digital image information by the image information processing module of the AR glasses. Taking the right eye as an example, the image information is projected onto the right-eye display 108 by the micro-projector 110 and, after the right-eye display 108 changes the light direction, finally projected into the field of view of the user's right eye. The micro-projectors 110 are fixed to the AR glasses main body 101; there are two of them, one disposed on one side of the left-eye display 107 and the other on one side of the right-eye display 108.
The first eye-tracking lens 104 is used to track, in real time, the gaze direction and gaze position of the user's left eye, and the second eye-tracking lens 106 is used to track those of the right eye. After the user correctly wears the AR glasses, the first infrared tracking LEDs 103 and the second infrared tracking LEDs 105 project infrared light onto the user's eyeballs. Taking the right eye as an example, the infrared light forms light spots on the cornea of the right eye; the second eye-tracking lens 106 captures the infrared light spots reflected by the cornea in real time and converts them into digital information, from which the eye movement tracking processing module analyzes the user's gaze direction and the gaze position in the user interface. The eye movement tracking processing module can also judge the user's blinking behavior from the sequence in which the infrared light spots disappear and reappear.
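As a hedged illustration of the blink judgment just described, the sketch below infers a blink from the disappearance and reappearance of the corneal infrared spots. The frame period, timing thresholds, and function name are assumptions for illustration only, not the patent's implementation; the code examples in this description use Python purely as illustrative pseudocode.

```python
def detect_blink(spot_visible, frame_ms=10, min_blink_ms=50, max_blink_ms=400):
    """spot_visible: booleans per frame, oldest first; True when the eye-tracking
    lens captured the reflected corneal infrared spots in that frame."""
    gap_frames = 0
    for visible in spot_visible:
        if not visible:
            gap_frames += 1              # spots disappeared (eyelid likely closed)
        elif gap_frames:
            gap_ms = gap_frames * frame_ms
            return min_blink_ms <= gap_ms <= max_blink_ms   # reappeared: blink if gap is plausible
    return False                          # no disappear-then-reappear sequence observed
```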
Referring to fig. 1D, fig. 1D is a schematic diagram of a user interface provided in the embodiment of the present application. The image information processing module displays the processed image information in a viewfinder of the user interface. A cursor displayed in the viewfinder indicates the gaze position, calculated by the eye movement tracking module, at which the human eye gazes at the user interface. When the cursor is located in the hot area of an object in the user interface (such as a video recording key, a photographing key, a virtual information key, and the like), the display effect of the hot area changes (for example, its boundary size or the color and thickness of its frame change), and at this point the user can complete the corresponding interactive operation through the touch ring paired with the AR glasses over Bluetooth.
Referring to fig. 1E, fig. 1E is a schematic structural diagram of a touch ring according to an embodiment of the present disclosure. The touch ring can be worn on the second joint of the index finger of either hand (the left hand or the right hand) of the user, and interaction is then realized by touching the touch area of the touch ring with the thumb. A Bluetooth module is arranged inside the touch ring, and the touch area can be unfolded into a rectangular plane. A capacitive touch sensor in the touch area records touch operations in real time and transmits the touch signals in real time to the Bluetooth module of the AR glasses through the Bluetooth module of the touch ring. The touch signal includes, for example, the following types (a classification sketch is given after this list):
(1) light touch (tap) signal: a single touch point lasting less than 500 milliseconds;
(2) long press signal: a single touch point lasting 500 milliseconds or more;
(3) right-slide signal: a single touch point producing a horizontal displacement to the right;
(4) left-slide signal: a single touch point producing a horizontal displacement to the left;
(5) up-slide signal: a single touch point producing a vertical displacement upward;
(6) down-slide signal: a single touch point producing a vertical displacement downward;
(7) two-finger pinch signal: two touch points produce displacement and the distance between them gradually decreases;
(8) two-finger spread signal: two touch points produce displacement and the distance between them gradually increases.
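A minimal classification sketch for these eight signal types follows. The sampling format, displacement threshold, and helper names are assumptions for illustration, not the touch ring's actual firmware interface.

```python
import math

TAP_MAX_MS = 500     # a single touch shorter than 500 ms counts as a light touch (tap)
MOVE_EPS = 2.0       # net displacement below this (arbitrary units) counts as "no slide"

def classify(tracks):
    """tracks: one dict per finger with 'duration_ms', 'start' (x, y) and 'end' (x, y),
    measured on the touch area unfolded into a rectangular plane (y grows downward)."""
    if len(tracks) == 1:
        t = tracks[0]
        dx = t["end"][0] - t["start"][0]
        dy = t["end"][1] - t["start"][1]
        if max(abs(dx), abs(dy)) < MOVE_EPS:
            return "light_touch" if t["duration_ms"] < TAP_MAX_MS else "long_press"
        if abs(dx) >= abs(dy):
            return "right_slide" if dx > 0 else "left_slide"
        return "down_slide" if dy > 0 else "up_slide"
    if len(tracks) == 2:
        d_start = math.dist(tracks[0]["start"], tracks[1]["start"])
        d_end = math.dist(tracks[0]["end"], tracks[1]["end"])
        return "two_finger_pinch" if d_end < d_start else "two_finger_spread"
    return "unknown"
```

For example, classify([{"duration_ms": 120, "start": (0, 0), "end": (0, 0)}]) would yield "light_touch" under these assumptions.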
Referring to fig. 2A, fig. 2A is a schematic flowchart of a device control method provided in an embodiment of the present application. The method is applied to the AR glasses described above and includes:
step 201: displaying a user interface on the display.
The user interface is a user interface of the AR glasses in a camera mode, for example, a photographing mode, a video recording mode, and the like.
Step 202: Determining, through the human eye tracking lens, a first gaze position at which the human eye gazes at the user interface, and receiving, through the Bluetooth module, a first touch signal sent by the touch ring paired with the AR glasses.
In an implementation manner of the present application, the determining, by the eye-tracking lens, that the human eye gazes at the first gazing position of the user interface includes:
capturing infrared light spots reflected by human eyes through the human eye tracking lens, wherein the infrared light spots are formed after the infrared tracking LEDs project infrared light to the human eyes;
and determining a gaze position where the human eye gazes at the user interface based on the infrared light spots.
It should be noted that, a specific implementation manner of determining the gaze location of the human eye on the user interface based on the infrared light spot is an existing human eye tracking technology, and is not described herein.
Step 203: Performing a control operation based on the first gaze position and the first touch signal.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, and performing a control operation based on the first gaze position and the first touch signal includes: if the first gaze position is located in the viewfinder and the first touch signal is a light touch signal, focusing on the first gaze position, where after the first gaze position is focused on, the user interface displays a focusing key and an exposure scale, as shown in fig. 2B.
That is, when the AR glasses are in the camera mode and the user interface includes the viewfinder, if the first gaze position is located in the viewfinder and both eyes blink, the first gaze position is likewise focused on, and after focusing, the user interface displays the focusing key and the exposure scale.
In addition, after the focusing key and the exposure scale are displayed on the user interface, if no operation is performed within a first duration, the focusing key and the exposure scale are removed. The first duration may be, for example, 2000 ms, 3000 ms, or another value.
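The tap-to-focus behaviour and the first-duration timeout above can be sketched as follows. The UI and camera helpers (viewfinder.contains, focus_at, the show/hide calls) are hypothetical names standing in for whatever the AR glasses' software actually exposes, and resetting the deadline on any operation is an assumption.

```python
import time

FIRST_DURATION_S = 2.0   # the "first duration", e.g. 2000 ms; 3000 ms is equally possible

class FocusController:
    def __init__(self, ui, camera):
        self.ui, self.camera = ui, camera
        self.deadline = None

    def on_input(self, gaze, signal):
        # light touch while gazing inside the viewfinder -> focus on the gaze position
        if signal == "light_touch" and self.ui.viewfinder.contains(gaze):
            self.camera.focus_at(gaze)
            self.ui.show_focus_key_and_exposure_scale()
            self.deadline = time.monotonic() + FIRST_DURATION_S
        elif self.deadline is not None:
            # assumption: any other operation keeps the controls visible a little longer
            self.deadline = time.monotonic() + FIRST_DURATION_S

    def on_tick(self):
        # remove the focusing key and exposure scale if nothing happened within the first duration
        if self.deadline is not None and time.monotonic() > self.deadline:
            self.ui.hide_focus_key_and_exposure_scale()
            self.deadline = None
```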
In an implementation manner of the present application, after focusing the first gaze location, the method further includes:
if a fourth touch signal sent by the touch ring is received through the Bluetooth module within the first duration and the fourth touch signal is a right-slide signal, reducing the exposure based on the right-slide signal, as shown in fig. 2C. How much the exposure is reduced is determined by the sliding distance of the right slide: the larger the sliding distance, the larger the reduction; the smaller the sliding distance, the smaller the reduction.
In an implementation manner of the present application, after focusing the first gaze location, the method further includes:
if a fourth touch signal sent by the touch ring is received through the Bluetooth module within the first duration and the fourth touch signal is a left-slide signal, increasing the exposure based on the left-slide signal, as shown in fig. 2D. How much the exposure is increased is determined by the sliding distance of the left slide: the larger the sliding distance, the larger the increase; the smaller the sliding distance, the smaller the increase.
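A possible mapping from the slide distance to an exposure change is sketched below; the scale factor and the EV clamping range are assumptions, and only the direction convention (left raises the exposure, right lowers it) follows the two paragraphs above.

```python
def adjust_exposure(current_ev, slide_distance, direction, ev_per_unit=0.1):
    """Illustrative only: a longer slide changes the exposure by a larger amount.
    direction: 'left' increases the exposure, 'right' decreases it."""
    delta = slide_distance * ev_per_unit
    if direction == "right":
        delta = -delta
    return max(-3.0, min(3.0, current_ev + delta))   # clamp to an assumed EV range
```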
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a view finder, and the performing a control operation based on the first gaze location and the first touch signal includes:
if the first gaze position is located in the viewfinder and the first touch signal is a two-finger pinch signal, zooming the view out based on the two-finger pinch signal and displaying a zoom scale on the user interface during the zoom-out adjustment, as shown in fig. 2E. How much the view is zoomed out is determined by the distance between the two fingers: the smaller the final distance between the two fingers, the more the view is zoomed out; the larger that distance, the less it is zoomed out.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a view finder, and the performing a control operation based on the first gaze location and the first touch signal includes:
if the first gaze position is located in the viewfinder and the first touch signal is a two-finger spread signal, zooming the view in based on the two-finger spread signal and displaying a zoom scale on the user interface during the zoom-in adjustment, as shown in fig. 2F. How much the view is zoomed in is determined by the distance between the two fingers: the smaller the final distance between the two fingers, the less the view is zoomed in; the larger that distance, the more it is zoomed in.
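A possible mapping from the pinch/spread gesture to the zoom factor shown on the zoom scale is sketched below; the ratio-based update and the zoom limits are assumptions.

```python
def update_zoom(zoom_factor, start_distance, end_distance, lo=1.0, hi=10.0):
    """Pinch (fingers end up closer) zooms the view out, spread (fingers end up
    farther apart) zooms it in; the finger distance decides how strong the change is."""
    if start_distance <= 0:
        return zoom_factor
    ratio = end_distance / start_distance   # < 1 for a pinch, > 1 for a spread
    return max(lo, min(hi, zoom_factor * ratio))
```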
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, a zoom factor icon is displayed in the viewfinder, and performing a control operation based on the first gaze position and the first touch signal includes:
if the first gaze position is located within the zoom factor icon boundary and the first touch signal is an up-slide signal, zooming the view in based on the up-slide signal and displaying a zoom scale on the user interface during the zoom-in adjustment, as shown in fig. 2G. How much the view is zoomed in is determined by the sliding distance of the up slide: the smaller the sliding distance, the less the view is zoomed in; the larger the sliding distance, the more it is zoomed in.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, a zoom factor icon is displayed in the viewfinder, and performing a control operation based on the first gaze position and the first touch signal includes:
if the first gaze position is located within the zoom factor icon boundary and the first touch signal is a down-slide signal, zooming the view out based on the down-slide signal and displaying a zoom scale on the user interface during the zoom-out adjustment, as shown in fig. 2H. How much the view is zoomed out is determined by the sliding distance of the down slide: the smaller the sliding distance, the less the view is zoomed out; the larger the sliding distance, the more it is zoomed out.
In an implementation manner of the present application, the AR glasses support multiple camera modes arranged in sequence, the AR glasses are currently in a first camera mode, the user interface includes a viewfinder, and performing a control operation based on the first gaze position and the first touch signal includes: if the first gaze position is located in the viewfinder and the first touch signal is a down-slide signal, switching to a second camera mode, the second camera mode being adjacent to the first camera mode and arranged above it;
if the first gaze position is located in the viewfinder and the first touch signal is an up-slide signal, switching to a third camera mode, the third camera mode being adjacent to the first camera mode and arranged below it.
For example, if the second camera mode is a video recording mode and the third camera mode is camera mode A, this is shown in fig. 2I.
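A sketch of switching among the sequentially arranged camera modes with up/down slides while gazing at the viewfinder is given below; the concrete mode list is only an example.

```python
CAMERA_MODES = ["video", "photo", "camera_mode_a"]   # arranged from top to bottom

def switch_camera_mode(current_mode, signal):
    i = CAMERA_MODES.index(current_mode)
    if signal == "down_slide" and i > 0:
        return CAMERA_MODES[i - 1]               # the adjacent mode arranged above
    if signal == "up_slide" and i < len(CAMERA_MODES) - 1:
        return CAMERA_MODES[i + 1]               # the adjacent mode arranged below
    return current_mode                          # no adjacent mode in that direction
```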
In an implementation manner of the present application, the AR glasses are in a camera mode, the AR glasses support multiple camera modes and are currently in a first camera mode, the user interface includes a photographing key, and performing a control operation based on the first gaze position and the first touch signal includes:
if the first gaze position is located within the photographing key boundary and the first touch signal is a light touch signal or a down-slide signal, switching to a fourth camera mode, the fourth camera mode being adjacent to the first camera mode and arranged above it;
if the first gaze position is located within the photographing key boundary and the first touch signal is a light touch signal or an up-slide signal, switching to a fifth camera mode, the fifth camera mode being adjacent to the first camera mode and arranged below it;
if the first gaze position is located within the photographing key boundary and the first touch signal is a light touch signal or a continuous down-slide signal, switching to a "more functions" mode, in which icons of various camera modes are displayed on the user interface for the user to select.
For example, if the fourth camera mode is a video recording mode and the fifth camera mode is camera mode A, this is shown in fig. 2J.
In an implementation manner of the present application, the AR glasses are in a photographing mode, the user interface includes a view finder, and the control operation is performed based on the first gaze location and the first touch signal, including:
if the user interface comprises a photographing key, the first gaze location is located within the photographing key boundary, and the first touch signal is a light touch signal, performing real photographing, as shown in fig. 2K specifically;
if the user interface includes a focusing key, the first gaze location is located in the focusing key, and the first touch signal is a light touch signal, then performing real photographing, as shown in fig. 2L specifically;
if the user interface comprises a virtual information key, the first gaze location is located within the boundary of the virtual information key, and the first touch signal is a light touch signal, then starting a mixed reality photographing function, displaying AR virtual information superimposed in a real scene in the view-finding frame under the mixed reality photographing function, and obtaining image information that is real scene image information superimposed with the AR virtual information under the mixed reality photographing function, as shown in FIG. 2M specifically;
if the user interface comprises a floating window mode key, the first gaze location is located in the boundary of the floating window mode key, and the first touch signal is a light touch signal, the floating window mode photographing function is started, and a first floating window is displayed in the user interface under the floating window mode photographing function, wherein the first floating window comprises a reduced view frame and a photographing shutter key, which are specifically shown in fig. 2N.
It should be noted that real photographing refers to photographing the external environment of the AR glasses through the main camera; the obtained image is a real-scene image.
In an implementation manner of the present application, after the floating window mode photographing function is started, the method further includes:
determining a second fixation position where the human eye gazes at the user interface through the human eye tracking lens, and receiving a second touch signal sent by the touch ring through the Bluetooth module;
if the second gaze location is located in the view frame and the second touch signal is a light touch signal, turning off the floating window mode photographing function, as shown in fig. 2O specifically;
if the second gaze location is located within the photographing key boundary and the second touch signal is a light touch signal, performing real photographing, as shown in fig. 2P;
if the second gaze location is within the first floating window and the second touch signal is a long press signal, the first floating window is moved based on the gaze point of the human eye, as shown in fig. 2Q.
The second gaze position being located in the first floating window means that the second gaze position is located within the boundary of the photographing shutter key or within the boundary of the viewfinder.
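The three floating-window interactions just described can be sketched as a single dispatch routine; the window and camera helper names (contains, move_center_to, take_photo) are hypothetical and not taken from the patent.

```python
def on_floating_window_input(gaze, signal, ui, camera):
    """Floating-window photographing mode: the first floating window holds a reduced
    viewfinder and a photographing shutter key (see figs. 2N-2Q)."""
    if signal == "light_touch" and ui.floating_viewfinder.contains(gaze):
        ui.close_floating_window()                 # turn the floating-window mode off
    elif signal == "light_touch" and ui.shutter_key.contains(gaze):
        camera.take_photo()                        # real photographing via the main camera
    elif signal == "long_press" and ui.floating_window.contains(gaze):
        ui.floating_window.move_center_to(gaze)    # the window follows the gaze point
```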
In an implementation manner of the present application, the AR glasses are in a video recording mode, and the performing a control operation based on the first gaze location and the first touch signal includes:
if the user interface includes a video recording key, the first gaze location is located within the video recording key boundary, and the first touch signal is a light touch signal, then performing real video recording, as shown in fig. 2R specifically;
if the user interface includes a virtual information key, the first gaze location is located within the boundary of the virtual information key, and the first touch signal is a light touch signal, then starting a mixed reality video recording function, displaying AR virtual information superimposed in a real scene in the view frame under the mixed reality video recording function, and recording obtained video information as real scene video information superimposed with the AR virtual information under the mixed reality video recording function, as shown in fig. 2S specifically;
if the AR glasses are recording, the user interface includes a floating window mode key, the first gaze position is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, the floating window mode recording function is turned on, and under the floating window mode recording function a second floating window is displayed within the user interface; the second floating window includes a reduced recording-duration indicator, a viewfinder, and a record-pause key, as shown in fig. 2T.
In an implementation manner of the present application, after the video recording function in the floating window mode is started, the method further includes:
determining a third fixation position where the human eye gazes at the user interface through the human eye tracking lens, and receiving a third touch signal sent by the touch ring through the Bluetooth module;
if the third gaze location is within the view frame and the third touch signal is a light touch signal, turning off the floating window mode recording function, as shown in fig. 2U specifically;
if the third gaze location is located within the record pause key boundary and the third touch signal is a light touch signal, pausing video recording, as shown in fig. 2V specifically;
if the third gaze location is within the second floating window and the third touch signal is a long press signal, moving the second floating window based on the gaze point of the human eye, as shown in fig. 2W.
The third gaze position being located in the second floating window means that the third gaze position is located within the boundary of the recording-duration indicator, within the boundary of the viewfinder, or within the boundary of the record-pause key.
It can be seen that, in the embodiments of the application, the AR glasses first display the user interface on the display, then determine the gaze position at which the human eye gazes at the user interface and receive, through the Bluetooth module, the touch signal sent by the touch ring paired with the AR glasses, and finally perform a control operation based on the gaze position and the touch signal. The eye tracking makes the interaction fine-grained while the touch ring makes it convenient, so the AR glasses can be controlled both conveniently and accurately.
Referring to fig. 3, fig. 3 is a schematic structural diagram of AR glasses according to an embodiment of the present disclosure. As shown in the figure, the AR glasses include a processor, a memory, a Bluetooth module, a display, an eye tracking lens, a wireless communication module, and one or more programs, where the eye tracking lens is disposed on one side of the display; the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for performing the following steps:
displaying a user interface on the display;
determining a first fixation position where a human eye gazes at the user interface through the human eye tracking lens, and receiving a first touch signal sent by a touch ring matched with the AR glasses through the Bluetooth module;
and performing control operation based on the first gaze position and the first touch signal.
It can be seen that, in the embodiments of the application, the AR glasses first display the user interface on the display, then determine the gaze position at which the human eye gazes at the user interface and receive, through the Bluetooth module, the touch signal sent by the touch ring paired with the AR glasses, and finally perform a control operation based on the gaze position and the touch signal. The eye tracking makes the interaction fine-grained while the touch ring makes it convenient, so the AR glasses can be controlled both conveniently and accurately.
In an implementation of the present application, the AR glasses further include a plurality of infrared tracking LEDs disposed along the peripheral side of the display, and in terms of determining, through the human eye tracking lens, the first gaze position at which the human eye gazes at the user interface, the above program includes instructions specifically for performing the following steps:
capturing infrared light spots reflected by human eyes through the human eye tracking lens, wherein the infrared light spots are formed after the infrared tracking LEDs project infrared light to the human eyes;
determining a first gaze location of a human eye gazing at the user interface based on the infrared spot.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a view finder, and in terms of performing a control operation based on the first gaze location and the first touch signal, the program includes instructions specifically configured to perform the following steps:
and if the first watching position is positioned in the viewing frame and the first touch signal is a light touch signal, focusing the first watching position, wherein after the first watching position is focused, the user interface displays a focusing key and an exposure scale.
In an implementation manner of the present application, the AR glasses are in a photographing mode, the user interface includes a view finder, and in terms of performing control operation based on the first gaze location and the first touch signal, the program includes instructions specifically configured to perform the following steps:
if the user interface comprises a photographing key, the first watching position is located in the boundary of the photographing key, and the first touch signal is a light touch signal, real photographing is carried out;
if the user interface comprises a focusing key, the first watching position is located in the focusing key, and the first touch signal is a light touch signal, real photographing is carried out;
if the user interface comprises a virtual information key, the first fixation position is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality photographing function, displaying AR virtual information superposed in a real scene in the view-finding frame under the mixed reality photographing function, and obtaining image information which is real scene image information superposed with the AR virtual information under the mixed reality photographing function;
if the user interface comprises a floating window mode key, the first fixation position is located in the boundary of the floating window mode key, and the first touch signal is a light touch signal, starting a floating window mode photographing function, displaying a first floating window in the user interface under the floating window mode photographing function, wherein the first floating window comprises a reduced view-finding frame and a photographing shutter key.
In an implementation manner of the present application, after the floating window mode photographing function is turned on, the program includes instructions further configured to perform the following steps:
determining a second fixation position where the human eye gazes at the user interface through the human eye tracking lens, and receiving a second touch signal sent by the touch ring through the Bluetooth module;
if the second fixation position is located in the view-finding frame and the second touch signal is a light touch signal, closing the floating window mode photographing function;
if the second fixation position is located in the photographing shutter button boundary and the second touch signal is a light touch signal, performing real photographing;
and if the second fixation position is located in the first floating window and the second touch signal is a long press signal, moving the first floating window based on the fixation point of the human eye.
In an implementation manner of the present application, the AR glasses are in a video recording mode, and in terms of performing a control operation based on the first gaze location and the first touch signal, the program includes instructions specifically configured to perform the following steps:
if the user interface comprises a video recording key, the first watching position is located in the boundary of the video recording key, and the first touch signal is a light touch signal, then real video recording is carried out;
if the user interface comprises a virtual information key, the first watching position is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality video recording function, displaying AR virtual information superposed in a real scene in the view-finding frame under the mixed reality video recording function, and recording obtained video information as real scene video information superposed with the AR virtual information under the mixed reality video recording function;
if the AR glasses are recording, the user interface includes a floating window mode key, the first gaze position is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, turning on the floating window mode recording function, and displaying a second floating window in the user interface under the floating window mode recording function, where the second floating window includes a reduced recording-duration indicator, a viewfinder, and a record-pause key.
In an implementation manner of the present application, after the floating window mode video recording function is turned on, the program includes instructions further configured to perform the following steps:
determining a third fixation position where the human eye gazes at the user interface through the human eye tracking lens, and receiving a third touch signal sent by the touch ring through the Bluetooth module;
if the third fixation position is located in the view-finding frame and the third touch signal is a light touch signal, closing the floating window mode video recording function;
if the third gaze position is located within the record-pause key boundary and the third touch signal is a light touch signal, pausing the video recording;
and if the third gaze location is located in the second floating window and the third touch signal is a long press signal, moving the second floating window based on the gaze point of the human eye.
It should be noted that, for the specific implementation process of the present embodiment, reference may be made to the specific implementation process described in the above method embodiment, and a description thereof is omitted here.
Referring to fig. 4, fig. 4 is a device control apparatus provided in an embodiment of the present application, which is applied to AR glasses, where the AR glasses include a bluetooth module, a display, and a human eye tracking lens, the human eye tracking lens is disposed on one side of the display, and the apparatus includes:
a display unit 401 for displaying a user interface on the display;
a gaze location determination unit 402 configured to determine a first gaze location where a human eye gazes at the user interface through the human eye tracking lens;
a communication unit 403, configured to receive, through the bluetooth module, a first touch signal sent by a touch ring paired with the AR glasses;
a control unit 404, configured to perform a control operation based on the first gaze location and the first touch signal.
It can be seen that, in the embodiments of the application, the AR glasses first display the user interface on the display, then determine the gaze position at which the human eye gazes at the user interface and receive, through the Bluetooth module, the touch signal sent by the touch ring paired with the AR glasses, and finally perform a control operation based on the gaze position and the touch signal. The eye tracking makes the interaction fine-grained while the touch ring makes it convenient, so the AR glasses can be controlled both conveniently and accurately.
In an implementation manner of the present application, the AR glasses further include a plurality of infrared tracking LEDs disposed along the peripheral side of the display, and in terms of determining, through the human eye tracking lens, the first gaze position at which the human eye gazes at the user interface, the gaze position determining unit 402 is specifically configured to:
capturing infrared light spots reflected by human eyes through the human eye tracking lens, wherein the infrared light spots are formed after the infrared tracking LEDs project infrared light to the human eyes;
determining a first gaze location of a human eye gazing at the user interface based on the infrared spot.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a view finder, and in terms of performing a control operation based on the first gaze location and the first touch signal, the control unit 404 is specifically configured to:
and if the first watching position is positioned in the viewing frame and the first touch signal is a light touch signal, focusing the first watching position, wherein after the first watching position is focused, the user interface displays a focusing key and an exposure scale.
In an implementation manner of the present application, the AR glasses are in a photographing mode, the user interface includes a view finder, and in terms of performing control operation based on the first gaze location and the first touch signal, the control unit 404 is specifically configured to:
if the user interface comprises a photographing key, the first watching position is located in the boundary of the photographing key, and the first touch signal is a light touch signal, real photographing is carried out;
if the user interface comprises a focusing key, the first watching position is located in the focusing key, and the first touch signal is a light touch signal, real photographing is carried out;
if the user interface comprises a virtual information key, the first fixation position is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality photographing function, displaying AR virtual information superposed in a real scene in the view-finding frame under the mixed reality photographing function, and obtaining image information which is real scene image information superposed with the AR virtual information under the mixed reality photographing function;
if the user interface comprises a floating window mode key, the first fixation position is located in the boundary of the floating window mode key, and the first touch signal is a light touch signal, starting a floating window mode photographing function, displaying a first floating window in the user interface under the floating window mode photographing function, wherein the first floating window comprises a reduced view-finding frame and a photographing shutter key.
In an implementation manner of the present application, the gaze location determining unit 402 is further configured to determine, through the eye tracking lens, a second gaze location where the human eye gazes at the user interface after the floating window mode photographing function is started;
the communication unit 403 is further configured to receive, through the bluetooth module, a second touch signal sent by the touch ring;
the control unit 404 is further configured to close the floating window mode photographing function if the second gaze location is located in the view frame and the second touch signal is a light touch signal; if the second fixation position is located in the photographing shutter button boundary and the second touch signal is a light touch signal, performing real photographing; and if the second fixation position is located in the first floating window and the second touch signal is a long press signal, moving the first floating window based on the fixation point of the human eye.
In an implementation manner of the present application, the AR glasses are in a video recording mode, and in terms of performing a control operation based on the first gaze location and the first touch signal, the control unit 404 is specifically configured to:
if the user interface comprises a video recording key, the first watching position is located in the boundary of the video recording key, and the first touch signal is a light touch signal, then real video recording is carried out;
if the user interface comprises a virtual information key, the first watching position is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality video recording function, displaying AR virtual information superposed in a real scene in the view-finding frame under the mixed reality video recording function, and recording obtained video information as real scene video information superposed with the AR virtual information under the mixed reality video recording function;
if the AR glasses are recording, the user interface includes a floating window mode key, the first gaze position is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, starting the floating window mode recording function and displaying a second floating window in the user interface under the floating window mode recording function, where the second floating window includes a reduced recording-duration indicator, a viewfinder, and a record-pause key.
In an implementation manner of the present application, the gaze location determining unit 402 is further configured to determine, through the eye-tracking lens, a third gaze location where the human eye gazes at the user interface after the floating-window mode video recording function is turned on;
the communication unit 403 is further configured to receive, through the bluetooth module, a third touch signal sent by the touch ring;
the control unit 404 is further configured to: close the floating window mode video recording function if the third gaze position is located in the viewfinder and the third touch signal is a light touch signal; pause the video recording if the third gaze position is located within the record-pause key boundary and the third touch signal is a light touch signal; and move the second floating window based on the gaze point of the human eye if the third gaze position is located in the second floating window and the third touch signal is a long press signal.
Embodiments of the present application also provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform some or all of the steps described in the above method embodiments for AR glasses.
Embodiments of the present application also provide a computer program product, wherein the computer program product comprises a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described for AR glasses in the above method. The computer program product may be a software installation package.
The steps of a method or algorithm described in the embodiments of the present application may be implemented in hardware, or may be implemented by a processor executing software instructions. The software instructions may consist of corresponding software modules that may be stored in random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. Additionally, the ASIC may reside in an access network device, a target network device, or a core network device. Of course, the processor and the storage medium may also reside as discrete components in an access network device, a target network device, or a core network device.
Those skilled in the art will appreciate that, in one or more of the examples described above, the functionality described in the embodiments of the present application may be implemented, in whole or in part, by software, hardware, firmware, or any combination thereof. When implemented in software, it may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the application occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable device. The computer instructions may be stored on a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) connection. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that incorporates one or more available media. The usable medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital video disc (DVD)), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the embodiments of the present application in further detail, and it should be understood that the above-mentioned embodiments are only specific embodiments of the present application, and are not intended to limit the scope of the embodiments of the present application, and any modifications, equivalent substitutions, improvements and the like made on the basis of the technical solutions of the embodiments of the present application should be included in the scope of the embodiments of the present application.

Claims (10)

1. A device control method, applied to augmented reality (AR) glasses, wherein the AR glasses comprise a Bluetooth module, a display, and a human eye tracking lens, the human eye tracking lens being disposed on one side of the display, and the method comprises the following steps:
displaying a user interface on the display;
determining, through the human eye tracking lens, a first gaze position where a human eye gazes at the user interface, and receiving, through the Bluetooth module, a first touch signal sent by a touch ring matched with the AR glasses;
and performing a control operation based on the first gaze position and the first touch signal.
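Purely as an illustration of the control loop described in claim 1, and not as part of the claimed subject matter, the following Kotlin sketch pairs a gaze position read from the eye tracking lens with a touch signal received from the Bluetooth touch ring and hands both to a control callback. All type and function names are hypothetical.

```kotlin
// Hypothetical sketch of the claim-1 control loop; all names are illustrative only.
data class GazePosition(val x: Float, val y: Float)

enum class TouchSignal { LIGHT_TOUCH, LONG_PRESS }

interface EyeTracker { fun currentGazeOnUi(): GazePosition }   // the human eye tracking lens
interface TouchRing { fun pollSignal(): TouchSignal? }         // the Bluetooth touch ring

class DeviceController(
    private val eyeTracker: EyeTracker,
    private val touchRing: TouchRing,
    private val performControl: (GazePosition, TouchSignal) -> Unit
) {
    // One iteration: wait for the first touch signal, sample the first gaze
    // position, then perform a control operation based on both (claim 1).
    fun step() {
        val signal = touchRing.pollSignal() ?: return   // no touch signal yet
        val gaze = eyeTracker.currentGazeOnUi()
        performControl(gaze, signal)
    }
}
```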
2. The method of claim 1, wherein the AR glasses further comprise a plurality of infrared tracking Light Emitting Diodes (LEDs) disposed along a peripheral side of the display, and wherein determining, through the human eye tracking lens, the first gaze position where the human eye gazes at the user interface comprises:
capturing, through the human eye tracking lens, infrared light spots reflected by the human eyes, wherein the infrared light spots are formed after the infrared tracking LEDs project infrared light onto the human eyes;
and determining, based on the infrared light spots, the first gaze position where the human eye gazes at the user interface.
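Claim 2 only requires that the gaze position be derived from the reflected infrared light spots; one common way to realise this, assumed here and not prescribed by the claims, is a calibrated mapping from the detected spot centre in the tracking-camera image to user-interface coordinates. The Kotlin sketch below uses an affine mapping with made-up calibration values.

```kotlin
// Hypothetical light-spot-to-gaze mapping; the affine coefficients would come
// from a per-user calibration and are invented here for illustration.
data class ImagePoint(val x: Float, val y: Float)   // spot centre in camera pixels
data class UiPoint(val x: Float, val y: Float)      // gaze position on the display

class AffineGazeMapper(
    private val a: Float, private val b: Float, private val tx: Float,
    private val c: Float, private val d: Float, private val ty: Float
) {
    fun map(spot: ImagePoint) = UiPoint(
        a * spot.x + b * spot.y + tx,
        c * spot.x + d * spot.y + ty
    )
}

fun main() {
    val mapper = AffineGazeMapper(2.1f, 0.0f, -40f, 0.0f, 2.3f, -25f)
    // Spot detected at (120, 80) in the eye tracking lens image.
    println(mapper.map(ImagePoint(120f, 80f)))   // UiPoint(x=212.0, y=159.0)
}
```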
3. The method of claim 1 or 2, wherein the AR glasses are in a camera mode, the user interface comprises a view finder, and the performing a control operation based on the first gaze position and the first touch signal comprises:
if the first gaze position is located within the view finder and the first touch signal is a light touch signal, focusing on the first gaze position, wherein after the first gaze position is focused on, the user interface displays a focusing key and an exposure scale.
4. The method according to claim 1 or 2, wherein the AR glasses are in a photographing mode, the user interface comprises a view finder, and the performing a control operation based on the first gaze position and the first touch signal comprises:
if the user interface comprises a photographing key, the first gaze position is located within the boundary of the photographing key, and the first touch signal is a light touch signal, performing real photographing;
if the user interface comprises a focusing key, the first gaze position is located within the focusing key, and the first touch signal is a light touch signal, performing real photographing;
if the user interface comprises a virtual information key, the first gaze position is located within the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality photographing function, wherein under the mixed reality photographing function, AR virtual information superimposed on the real scene is displayed in the view finder, and the image information obtained by photographing is real-scene image information superimposed with the AR virtual information;
and if the user interface comprises a floating window mode key, the first gaze position is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, starting a floating window mode photographing function, wherein under the floating window mode photographing function, a first floating window is displayed in the user interface, the first floating window comprising a reduced view finder and a photographing shutter key.
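Again for illustration only, the branching in claim 4 amounts to hit-testing the gaze position against the boundaries of whichever keys the user interface currently contains and, on a light touch, triggering the matching action. The key names, rectangles, and action set in this Kotlin sketch are hypothetical.

```kotlin
// Hypothetical hit-test dispatch for the photographing mode of claim 4.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}
data class Gaze(val x: Float, val y: Float)

sealed interface PhotoAction
object RealPhotograph : PhotoAction              // photographing key or focusing key
object MixedRealityPhotograph : PhotoAction      // virtual information key
object StartFloatingWindowMode : PhotoAction     // floating window mode key
object NoOp : PhotoAction

fun onLightTouch(
    gaze: Gaze,
    photographKey: Rect?,          // null when the key is not in the user interface
    focusingKey: Rect?,
    virtualInfoKey: Rect?,
    floatingWindowModeKey: Rect?
): PhotoAction = when {
    photographKey?.contains(gaze.x, gaze.y) == true -> RealPhotograph
    focusingKey?.contains(gaze.x, gaze.y) == true -> RealPhotograph
    virtualInfoKey?.contains(gaze.x, gaze.y) == true -> MixedRealityPhotograph
    floatingWindowModeKey?.contains(gaze.x, gaze.y) == true -> StartFloatingWindowMode
    else -> NoOp
}
```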
5. The method of claim 4, wherein after the floating window mode photographing function is turned on, the method further comprises:
determining, through the human eye tracking lens, a second gaze position where the human eye gazes at the user interface, and receiving, through the Bluetooth module, a second touch signal sent by the touch ring;
if the second gaze position is located within the view finder and the second touch signal is a light touch signal, closing the floating window mode photographing function;
if the second gaze position is located within the boundary of the photographing shutter key and the second touch signal is a light touch signal, performing real photographing;
and if the second gaze position is located within the first floating window and the second touch signal is a long press signal, moving the first floating window based on the gaze point of the human eye.
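A sketch of the light-touch handling inside the first floating window of claim 5, with the same hypothetical Rect and Gaze shapes redeclared so the snippet stands alone: gazing at the view finder and tapping closes the floating window mode, while gazing at the shutter key and tapping photographs.

```kotlin
// Hypothetical tap handling inside the first floating window (claim 5).
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}
data class Gaze(val x: Float, val y: Float)

class FloatingWindowPhotoMode(
    private val viewFinder: Rect,
    private val shutterKey: Rect,
    private val closeFloatingWindowMode: () -> Unit,
    private val takePhotograph: () -> Unit
) {
    fun onLightTouch(gaze: Gaze) = when {
        viewFinder.contains(gaze.x, gaze.y) -> closeFloatingWindowMode()
        shutterKey.contains(gaze.x, gaze.y) -> takePhotograph()
        else -> Unit   // gaze outside the window: no operation
    }
}
```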
6. The method of claim 1 or 2, wherein the AR glasses are in a video recording mode, and wherein the performing a control operation based on the first gaze position and the first touch signal comprises:
if the user interface comprises a video recording key, the first gaze position is located within the boundary of the video recording key, and the first touch signal is a light touch signal, performing real video recording;
if the user interface comprises a virtual information key, the first gaze position is located within the boundary of the virtual information key, and the first touch signal is a light touch signal, starting a mixed reality video recording function, wherein under the mixed reality video recording function, AR virtual information superimposed on the real scene is displayed in the view finder, and the video information obtained by recording is real-scene video information superimposed with the AR virtual information;
and if the AR glasses are recording, the user interface comprises a floating window mode key, the first gaze position is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, starting a floating window mode video recording function, wherein under the floating window mode video recording function, a second floating window is displayed in the user interface, the second floating window comprising a reduced view finder, a recording duration, and a recording pause key.
7. The method of claim 6, wherein after the floating window mode video recording function is started, the method further comprises:
determining, through the human eye tracking lens, a third gaze position where the human eye gazes at the user interface, and receiving, through the Bluetooth module, a third touch signal sent by the touch ring;
if the third gaze position is located within the view finder and the third touch signal is a light touch signal, closing the floating window mode video recording function;
if the third gaze position is located within the boundary of the recording pause key and the third touch signal is a light touch signal, pausing the video recording;
and if the third gaze position is located within the second floating window and the third touch signal is a long press signal, moving the second floating window based on the gaze point of the human eye.
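Claims 5 and 7 both move the floating window with the gaze point while a long press is held. One simple way this could behave, shown as a hypothetical sketch rather than the claimed implementation, is to recentre the window rectangle on each new gaze sample for as long as the long-press signal lasts.

```kotlin
// Hypothetical long-press drag: the floating window follows the gaze point.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    val width get() = right - left
    val height get() = bottom - top
}
data class Gaze(val x: Float, val y: Float)

class FloatingWindowDragger(var window: Rect) {
    // Called for every gaze sample while the long-press signal is active
    // (claims 5 and 7): recentre the window on the current gaze point.
    fun onGazeWhileLongPressed(gaze: Gaze) {
        val halfW = window.width / 2
        val halfH = window.height / 2
        window = Rect(gaze.x - halfW, gaze.y - halfH, gaze.x + halfW, gaze.y + halfH)
    }
}
```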
8. A device control apparatus, applied to augmented reality (AR) glasses, wherein the AR glasses comprise a Bluetooth module, a display, and a human eye tracking lens, the human eye tracking lens being disposed on one side of the display, and the apparatus comprises:
a display unit, configured to display a user interface on the display;
a gaze position determining unit, configured to determine, through the human eye tracking lens, a first gaze position where the human eye gazes at the user interface;
a communication unit, configured to receive, through the Bluetooth module, a first touch signal sent by a touch ring matched with the AR glasses;
and a control unit, configured to perform a control operation based on the first gaze position and the first touch signal.
9. Augmented reality (AR) glasses, comprising a processor, a memory, a Bluetooth module, a display, a human eye tracking lens, a wireless communication module, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the processor, and the programs comprise instructions for performing the steps in the method of any one of claims 1 to 7.
10. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and the computer program is executed by a processor to implement the method according to any one of claims 1 to 7.
CN201911310496.1A 2019-12-18 2019-12-18 Equipment control method and related equipment Active CN111061372B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911310496.1A CN111061372B (en) 2019-12-18 2019-12-18 Equipment control method and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911310496.1A CN111061372B (en) 2019-12-18 2019-12-18 Equipment control method and related equipment

Publications (2)

Publication Number Publication Date
CN111061372A true CN111061372A (en) 2020-04-24
CN111061372B CN111061372B (en) 2023-05-02

Family

ID=70302279

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911310496.1A Active CN111061372B (en) 2019-12-18 2019-12-18 Equipment control method and related equipment

Country Status (1)

Country Link
CN (1) CN111061372B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103747181A (en) * 2014-01-10 2014-04-23 上海斐讯数据通信技术有限公司 System for combining videos and camera-acquired pictures
CN103795926A (en) * 2014-02-11 2014-05-14 惠州Tcl移动通信有限公司 Method, system and photographing device for controlling photographing focusing by means of eyeball tracking technology
WO2015183438A1 (en) * 2014-05-30 2015-12-03 Apple Inc. Realtime capture exposure adjust gestures
CN105824522A (en) * 2015-09-24 2016-08-03 维沃移动通信有限公司 Photographing method and mobile terminal
CN106131394A (en) * 2016-06-15 2016-11-16 青岛海信移动通信技术股份有限公司 A kind of method and device taken pictures
CN106354253A (en) * 2016-08-19 2017-01-25 上海理湃光晶技术有限公司 Cursor control method and AR glasses and intelligent ring based on same
CN107003730A (en) * 2015-03-13 2017-08-01 华为技术有限公司 A kind of electronic equipment, photographic method and camera arrangement
CN107729871A (en) * 2017-11-02 2018-02-23 北方工业大学 Infrared light-based human eye movement track tracking method and device
CN109600555A (en) * 2019-02-02 2019-04-09 北京七鑫易维信息技术有限公司 A kind of focusing control method, system and photographing device
CN109597489A (en) * 2018-12-27 2019-04-09 武汉市天蝎科技有限公司 A kind of method and system of the eye movement tracking interaction of near-eye display device
CN109725717A (en) * 2018-11-30 2019-05-07 成都理想境界科技有限公司 Image processing method and AR equipment applied to AR equipment
CN110069101A (en) * 2019-04-24 2019-07-30 洪浛檩 A kind of wearable calculating equipment and a kind of man-machine interaction method
US10466780B1 (en) * 2015-10-26 2019-11-05 Pillantas Systems and methods for eye tracking calibration, eye vergence gestures for interface control, and visual aids therefor

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103747181A (en) * 2014-01-10 2014-04-23 上海斐讯数据通信技术有限公司 System for combining videos and camera-acquired pictures
CN103795926A (en) * 2014-02-11 2014-05-14 惠州Tcl移动通信有限公司 Method, system and photographing device for controlling photographing focusing by means of eyeball tracking technology
WO2015120673A1 (en) * 2014-02-11 2015-08-20 惠州Tcl移动通信有限公司 Method, system and photographing equipment for controlling focusing in photographing by means of eyeball tracking technology
WO2015183438A1 (en) * 2014-05-30 2015-12-03 Apple Inc. Realtime capture exposure adjust gestures
CN107003730A (en) * 2015-03-13 2017-08-01 华为技术有限公司 A kind of electronic equipment, photographic method and camera arrangement
CN105824522A (en) * 2015-09-24 2016-08-03 维沃移动通信有限公司 Photographing method and mobile terminal
US10466780B1 (en) * 2015-10-26 2019-11-05 Pillantas Systems and methods for eye tracking calibration, eye vergence gestures for interface control, and visual aids therefor
CN106131394A (en) * 2016-06-15 2016-11-16 青岛海信移动通信技术股份有限公司 A kind of method and device taken pictures
CN106354253A (en) * 2016-08-19 2017-01-25 上海理湃光晶技术有限公司 Cursor control method and AR glasses and intelligent ring based on same
CN107729871A (en) * 2017-11-02 2018-02-23 北方工业大学 Infrared light-based human eye movement track tracking method and device
CN109725717A (en) * 2018-11-30 2019-05-07 成都理想境界科技有限公司 Image processing method and AR equipment applied to AR equipment
CN109597489A (en) * 2018-12-27 2019-04-09 武汉市天蝎科技有限公司 A kind of method and system of the eye movement tracking interaction of near-eye display device
CN109600555A (en) * 2019-02-02 2019-04-09 北京七鑫易维信息技术有限公司 A kind of focusing control method, system and photographing device
CN110069101A (en) * 2019-04-24 2019-07-30 洪浛檩 A kind of wearable calculating equipment and a kind of man-machine interaction method

Also Published As

Publication number Publication date
CN111061372B (en) 2023-05-02

Similar Documents

Publication Publication Date Title
US11017257B2 (en) Information processing device, information processing method, and program
EP2929424B1 (en) Multi-touch interactions on eyewear
KR102184272B1 (en) Glass type terminal and control method thereof
US9442571B2 (en) Control method for generating control instruction based on motion parameter of hand and electronic device using the control method
US10477090B2 (en) Wearable device, control method and non-transitory storage medium
CN110546601B (en) Information processing device, information processing method, and program
JP2017199379A (en) Tracking display system, tracking display program, tracking display method, wearable device using the same, tracking display program for wearable device, and manipulation method for wearable device
TW201122909A (en) Method for switching to display three-dimensional images and digital display system
CN111970456B (en) Shooting control method, device, equipment and storage medium
KR102361025B1 (en) Wearable glasses and method for displaying image therethrough
US20180176459A1 (en) Method and device for changing focal point of camera
KR102110208B1 (en) Glasses type terminal and control method therefor
US10896545B1 (en) Near eye display interface for artificial reality applications
GB2494907A (en) A Head-mountable display with gesture recognition
CN112585566A (en) Hand-covering face input sensing for interacting with device having built-in camera
JP2016177658A (en) Virtual input device, input method, and program
Weng et al. Facesight: Enabling hand-to-face gesture interaction on ar glasses with a downward-facing camera vision
CN109600555A (en) A kind of focusing control method, system and photographing device
US20230336865A1 (en) Device, methods, and graphical user interfaces for capturing and displaying media
JP7459798B2 (en) Information processing device, information processing method, and program
KR20180094875A (en) Information processing apparatus, information processing method, and program
Zhang et al. ReflecTouch: Detecting grasp posture of smartphone using corneal reflection images
CN104754202B (en) A kind of method and electronic equipment of Image Acquisition
CN111061372B (en) Equipment control method and related equipment
US20200250498A1 (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant