CN111061372B - Equipment control method and related equipment - Google Patents


Info

Publication number: CN111061372B
Application number: CN201911310496.1A
Authority: CN (China)
Prior art keywords: touch signal, user interface, key, glasses, floating window
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN111061372A
Inventor: 吴恒刚
Current Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by: Guangdong Oppo Mobile Telecommunications Corp Ltd
Publication of application: CN111061372A
Publication of grant: CN111061372B

Classifications

    • G06F3/013 Eye tracking input arrangements
    • G02B27/0093 Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G02B27/0172 Head-up displays, head mounted, characterised by optical features
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G02B2027/0178 Head-mounted displays of eyeglass type


Abstract

The application discloses a device control method and related devices applied to AR glasses. The AR glasses comprise a Bluetooth module, a display and a human eye tracking lens, the human eye tracking lens being arranged on one side of the display. The method comprises the following steps: displaying a user interface on the display; determining, through the human eye tracking lens, a first gazing position at which the human eyes gaze at the user interface, and receiving, through the Bluetooth module, a first touch signal sent by a touch finger ring paired with the AR glasses; and performing a control operation based on the first gazing position and the first touch signal. By adopting the embodiments of the application, the AR glasses can be conveniently controlled.

Description

Equipment control method and related equipment
Technical Field
The application relates to the technical field of augmented reality, in particular to a device control method and related devices.
Background
Wearable display devices are the general name for everyday wearable items that are intelligently designed and developed by applying wearable technology. Augmented reality (Augmented Reality, AR) glasses are one such wearable device: after a user puts on the AR glasses, the user can see a picture in which virtual information is superimposed on the real environment. As AR glasses are used more and more frequently, how to control them conveniently has become a technical problem to be solved urgently.
Disclosure of Invention
The embodiment of the application provides a device control method and related devices, which are used for conveniently controlling AR glasses.
In a first aspect, an embodiment of the present application provides a device control method, which is applied to augmented reality AR glasses, where the AR glasses include a bluetooth module, a display, and a human eye tracking lens, and the human eye tracking lens is disposed on one side of the display, and the method includes:
displaying a user interface on the display;
determining a first gazing position where the human eyes gaze at the user interface through the human eye tracking lens, and receiving a first touch signal sent by a touch finger ring paired with the AR glasses through the Bluetooth module;
and performing control operation based on the first gazing position and the first touch signal.
In a second aspect, an embodiment of the present application provides a device control apparatus, which is applied to augmented reality AR glasses, the AR glasses include a bluetooth module, a display, and a human eye tracking lens, the human eye tracking lens is disposed on one side of the display, and the apparatus includes:
a display unit for displaying a user interface on the display;
a gaze location determining unit configured to determine, through the eye-tracking lens, a first gaze location at which the human eye gazes at the user interface;
The communication unit is used for receiving, through the Bluetooth module, a first touch signal sent by a touch finger ring paired with the AR glasses;
and the control unit is used for performing control operation based on the first gazing position and the first touch control signal.
In a third aspect, an embodiment of the present application provides AR glasses, including a processor, a memory, a bluetooth module, a display, a human eye tracking lens, a wireless communication module, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, the programs including instructions for performing steps in the method according to the first aspect of the embodiment of the present application.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program for execution by a processor to implement some or all of the steps described in the method according to the first aspect of embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described in the method of the first aspect of embodiments of the present application. The computer program product may be a software installation package.
It can be seen that in this embodiment of the present application the AR glasses first display a user interface on the display, then determine the gazing position at which the human eyes gaze at the user interface and receive, through the Bluetooth module, a touch signal sent by a touch finger ring paired with the AR glasses, and finally perform a control operation based on the gazing position and the touch signal. Eye movement tracking provides fine-grained interaction and the touch finger ring provides convenient interaction, so the AR glasses are controlled both conveniently and accurately.
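For illustration only, the overall flow just summarised can be sketched as a simple loop. The objects and method names below (display, eye_tracker, bluetooth, controller and their calls) are hypothetical placeholders, not components defined by this application.

```python
def control_loop(display, eye_tracker, bluetooth, controller):
    """Display the UI, then repeatedly turn (gaze position, touch signal) into a control operation."""
    display.show_user_interface()                 # step 1: display a user interface on the display
    while True:
        gaze = eye_tracker.gaze_position()        # step 2a: gazing position at which the eye looks at the UI
        touch = bluetooth.receive_touch_signal()  # step 2b: touch signal from the paired touch finger ring
        if touch is not None:
            controller.handle(gaze, touch)        # step 3: control operation based on (gaze, touch)
```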
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1A is a schematic structural diagram of AR glasses according to an embodiment of the present application;
fig. 1B is a schematic structural diagram of another AR glasses provided in an embodiment of the present application;
fig. 1C is a schematic structural diagram of another AR glasses provided in an embodiment of the present application;
FIG. 1D is a schematic illustration of a user interface provided by an embodiment of the present application;
fig. 1E is a schematic structural diagram of a touch finger ring according to an embodiment of the present application;
fig. 2A is a schematic flow chart of a device control method according to an embodiment of the present application;
FIGS. 2B-2W are schematic illustrations of another user interface provided by embodiments of the present application;
FIG. 3 is a schematic view of another AR glasses according to the embodiment of the present application;
fig. 4 is a schematic structural diagram of a device control apparatus according to an embodiment of the present application.
Detailed Description
In order to make the present application better understood by those skilled in the art, the technical solutions in the embodiments of the present application are described below in detail with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without inventive effort shall fall within the scope of the present application.
The following will describe in detail.
The terms "first," "second," "third," and "fourth" and the like in the description and in the claims of this application and in the drawings, are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
Referring to fig. 1A to 1C, fig. 1A to 1C are schematic structural views of AR glasses according to an embodiment of the present application. The AR glasses include an AR glasses body 101, a main camera 102, a plurality of first infrared tracking light emitting diodes (Light Emitting Diode, LEDs) 103, a first eye tracking lens 104, a plurality of second infrared tracking LEDs 105, a second eye tracking lens 106, a left eye display 107, and a right eye display 108.
Optionally, the main camera 102, the plurality of first infrared tracking LEDs 103, the first eye tracking lens 104, the plurality of second infrared tracking LEDs 105, the second eye tracking lens 106, the left-eye display 107 and the right-eye display 108 are all fixed on the AR glasses body 101. The plurality of first infrared tracking LEDs 103 are disposed along the peripheral side of the left-eye display 107, and the first eye tracking lens 104 is disposed on one side of the left-eye display 107. The plurality of second infrared tracking LEDs 105 are disposed along the peripheral side of the right-eye display 108, and the second eye tracking lens 106 is disposed on one side of the right-eye display 108.
Optionally, a processor, an image information processing module, a memory, a bluetooth module, an eye movement tracking processing module, a wireless communication module and the like are further arranged in the AR glasses. The image information processing module and the eye tracking processing module may be independent of the processor or integrated into the processor. The AR glasses may be communicatively connected to a wireless communication device (e.g., a smart phone, a tablet computer, etc.) via a wireless communication module.
The AR glasses may be worn on the head of a user so that the user can clearly see virtual information superimposed on the real world. The main camera 102 is used for photographing and video recording; after it is started, the main camera 102 collects external light 109 and converts it into digital information, which the image information processing module of the AR glasses then converts into visible digital image information. Taking the right eye as an example, the image information is projected by a micro projector 110 onto the right-eye display 108 and, after the right-eye display 108 changes the direction of the light, is finally projected into the field of view of the user's right eye. The micro projectors 110 are fixed to the AR glasses body 101; two micro projectors 110 are provided, one on one side of the left-eye display 107 and the other on one side of the right-eye display 108.
The first eye tracking lens 104 is used for tracking the sight direction and gazing position of the user's left eye in real time, and the second eye tracking lens 106 is used for tracking the sight direction and gazing position of the user's right eye in real time. After the user wears the AR glasses correctly, the plurality of first infrared tracking LEDs 103 and the plurality of second infrared tracking LEDs 105 project infrared light onto the user's eyeballs; for example, the infrared light forms light spots on the cornea of the right eye, and the second eye tracking lens 106 captures the infrared light spots reflected by the cornea in real time and converts them into digital information, from which the eye movement tracking processing module analyzes the user's sight direction and gazing position in the user interface. The eye movement tracking processing module can also judge the user's blink behavior from the sequence in which the infrared light spots disappear and reappear, as sketched below.
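For illustration, the blink judgment just described, inferring a blink from the infrared light spots disappearing and then reappearing across camera frames, could look like the sketch below. The per-frame spot counts and the frame-gap thresholds are assumptions for the example, not values given in this application.

```python
def detect_blinks(spot_counts, min_gap_frames=2, max_gap_frames=30):
    """Report blinks from per-frame counts of visible corneal infrared light spots.

    A blink is assumed when the spots disappear for a short run of frames and then reappear.
    """
    blinks = []
    gap_start = None
    for frame, count in enumerate(spot_counts):
        if count == 0 and gap_start is None:
            gap_start = frame                      # light spots just disappeared
        elif count > 0 and gap_start is not None:
            gap = frame - gap_start                # light spots reappeared
            if min_gap_frames <= gap <= max_gap_frames:
                blinks.append((gap_start, frame))  # (disappearance frame, reappearance frame)
            gap_start = None
    return blinks

# Example: the spots vanish for three frames starting at frame 5 -> one blink detected.
print(detect_blinks([4, 4, 4, 4, 4, 0, 0, 0, 4, 4]))  # [(5, 8)]
```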
Referring to fig. 1D, fig. 1D is a schematic diagram of a user interface according to an embodiment of the present application. The image information processing module displays the processed image information in a viewfinder of the user interface. A cursor displayed in the viewfinder indicates the gazing position, calculated by the eye movement tracking module, at which the human eyes gaze at the user interface. When the cursor is located in the hot zone of an object in the user interface (such as a video key, a photographing key or a virtual information key), the display effect of the hot zone changes (for example, the size of its boundary or the thickness of its frame), and the user can then complete the corresponding interactive operation through a touch finger ring paired with the AR glasses via Bluetooth.
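A minimal sketch of the hot-zone behaviour described above, assuming rectangular hot zones: when the gaze cursor falls inside a hot zone, that zone is highlighted (here by a thicker frame). The zone layout and frame widths are illustrative only.

```python
def hot_zone_hit(cursor, hot_zones):
    """Return the name of the hot zone containing the gaze cursor, or None."""
    x, y = cursor
    for name, (left, top, right, bottom) in hot_zones.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

# Illustrative layout; the real hot zones are those of the keys in the user interface.
ZONES = {"video_key": (10, 10, 60, 40), "photographing_key": (70, 10, 120, 40)}
hit = hot_zone_hit((85, 25), ZONES)
frame_width = 3 if hit else 1       # the gazed hot zone gets a thicker frame
print(hit, frame_width)             # photographing_key 3
```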
Referring to fig. 1E, fig. 1E is a schematic structural diagram of a touch finger ring according to an embodiment of the present application. The touch finger ring can be worn at the second joint of the index finger of one hand of the user (such as the left hand or the right hand), and interaction is then achieved by touching the touch area of the touch finger ring with the thumb. A Bluetooth module is arranged inside the touch finger ring, and the touch area can be unfolded into a rectangular plane. The capacitive touch sensor of the touch area records touch operations in real time and transmits touch signals to the Bluetooth module of the AR glasses in real time through the Bluetooth module of the touch finger ring. The touch signal includes the following types (a minimal classification sketch is given after this list):
(1) Light touch (tap) signal: a single touch point with a contact time of less than 500 milliseconds;
(2) Long press signal: a single touch point with a contact time of 500 milliseconds or more;
(3) Right slide signal: a single touch point that produces a horizontal rightward displacement;
(4) Left slide signal: a single touch point that produces a horizontal leftward displacement;
(5) Up slide signal: a single touch point that produces a vertical upward displacement;
(6) Down slide signal: a single touch point that produces a vertical downward displacement;
(7) Double-finger kneading (pinch) signal: two touch points are detected to move, and the distance between them gradually decreases;
(8) Double-finger separation (spread) signal: two touch points are detected to move, and the distance between them gradually increases.
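As a rough illustration of how such signals might be distinguished, the sketch below classifies a touch trace using the 500 ms boundary given above. The displacement threshold separating a press from a slide is an assumption, as are the signal names used for the return values.

```python
TAP_MAX_MS = 500      # from the description: light touch < 500 ms, long press >= 500 ms
MOVE_EPS = 5.0        # assumed displacement threshold separating a press from a slide

def classify_single_touch(duration_ms, dx, dy):
    """Classify one touch point by contact time and net displacement (x right, y up)."""
    if abs(dx) < MOVE_EPS and abs(dy) < MOVE_EPS:
        return "light_touch" if duration_ms < TAP_MAX_MS else "long_press"
    if abs(dx) >= abs(dy):
        return "right_slide" if dx > 0 else "left_slide"
    return "up_slide" if dy > 0 else "down_slide"

def classify_two_touch(start_distance, end_distance):
    """Two moving touch points: kneading if they approach, separation if they move apart."""
    return "double_finger_kneading" if end_distance < start_distance else "double_finger_separation"

print(classify_single_touch(120, 0, 0))   # light_touch
print(classify_single_touch(800, 0, 0))   # long_press
print(classify_single_touch(200, 40, 3))  # right_slide
print(classify_two_touch(80.0, 30.0))     # double_finger_kneading
```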
Referring to fig. 2A, fig. 2A is a schematic flow chart of a device control method according to an embodiment of the present application, which is applied to the above AR glasses, and the method includes:
step 201: a user interface is displayed on the display.
The user interface is a user interface of the AR glasses in a camera mode, such as a photographing mode, a video recording mode, and the like.
Step 202: and determining a first gazing position where the human eyes gaze at the user interface through the human eye tracking lens, and receiving a first touch signal sent by a touch finger ring paired with the AR glasses through the Bluetooth module.
In an implementation manner of the present application, the determining, by the eye tracking lens, a first gaze location of the user interface at which the human eye gazes includes:
capturing infrared light spots reflected by human eyes through the human eye tracking lens, wherein the infrared light spots are formed after the plurality of infrared tracking LEDs project infrared light to the human eyes;
a first gazing position at which the human eyes gaze at the user interface is determined based on the infrared light spots.
It should be noted that, a specific implementation manner of determining the gaze position of the eye gazing at the user interface based on the infrared light spot is an existing eye tracking technology, which will not be described herein.
Step 203: and performing control operation based on the first gazing position and the first touch signal.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, and the controlling operation based on the first gaze location and the first touch signal includes: and if the first gazing position is positioned in the view-finding frame and the first touch signal is a light touch signal, focusing the first gazing position, wherein after focusing the first gazing position, the user interface displays focusing keys and exposure scales. As particularly shown in fig. 2B.
It should be noted that when the AR glasses are in the camera mode the user interface includes a viewfinder; if the first gazing position is located in the viewfinder and both eyes blink, focusing is performed on the first gazing position, and after focusing on the first gazing position the user interface displays the focusing key and the exposure scale.
In addition, after the user interface displays the focusing key and the exposure scale, if no operation is performed within a first time period, the focusing key and the exposure scale are removed. The first time period is, for example, 2000 ms, 3000 ms or another value.
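The focusing behaviour above can be tied together in a small sketch: a light touch while gazing inside the viewfinder focuses on the gazing position and shows the focusing key and exposure scale, which are removed after the first time period of inactivity. The 2000 ms value is one of the example values mentioned above; the class name and signal names are placeholders.

```python
FIRST_DURATION_MS = 2000   # example value from the description (3000 ms or another value is also possible)

class ViewfinderFocusUI:
    """Light touch while gazing in the viewfinder -> focus there and show the focusing key and exposure scale."""

    def __init__(self):
        self.focus_point = None        # last focused gazing position
        self.overlay_shown_at = None   # time at which the focusing key and exposure scale appeared

    def on_input(self, gaze_point, gaze_in_viewfinder, touch_signal, now_ms):
        if gaze_in_viewfinder and touch_signal == "light_touch":
            self.focus_point = gaze_point   # focus on the first gazing position
            self.overlay_shown_at = now_ms  # display the focusing key and exposure scale
        # Remove the focusing key and exposure scale if no operation happened within the first time period.
        if self.overlay_shown_at is not None and now_ms - self.overlay_shown_at >= FIRST_DURATION_MS:
            self.overlay_shown_at = None

ui = ViewfinderFocusUI()
ui.on_input((120, 80), True, "light_touch", now_ms=0)  # overlay appears
ui.on_input((125, 82), True, None, now_ms=2500)        # no operation for 2000 ms -> overlay removed
print(ui.focus_point, ui.overlay_shown_at)             # (120, 80) None
```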
In an implementation of the present application, after focusing the first gaze location, the method further includes:
and if a fourth touch signal sent by the touch finger ring is received through the Bluetooth module within the first time period and the fourth touch signal is a right slide signal, the exposure is reduced based on the right slide signal, as shown in fig. 2C. In addition, how much the exposure is reduced is determined by the sliding distance of the right slide: the greater the sliding distance, the greater the reduction in exposure; the smaller the sliding distance, the smaller the reduction.
In an implementation of the present application, after focusing the first gaze location, the method further includes:
if a fourth touch signal sent by the touch finger ring is received through the Bluetooth module within the first time period and the fourth touch signal is a left slide signal, the exposure is increased based on the left slide signal, as shown in fig. 2D. In addition, how much the exposure is increased is determined by the sliding distance of the left slide: the greater the sliding distance, the greater the increase in exposure; the smaller the sliding distance, the smaller the increase.
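A sketch of the exposure adjustment just described: a right slide lowers the exposure and a left slide raises it, with the change growing with the sliding distance. The scale factor and clamping range below are assumptions for illustration.

```python
EXPOSURE_STEP_PER_PIXEL = 0.01    # assumed exposure change per pixel of sliding distance
EXPOSURE_LIMITS = (-2.0, 2.0)     # assumed exposure compensation limits

def adjust_exposure(current_exposure, signal, slide_distance):
    """Right slide reduces the exposure, left slide increases it; a longer slide changes it more."""
    delta = slide_distance * EXPOSURE_STEP_PER_PIXEL
    if signal == "right_slide":
        current_exposure -= delta
    elif signal == "left_slide":
        current_exposure += delta
    low, high = EXPOSURE_LIMITS
    return max(low, min(high, current_exposure))

print(adjust_exposure(0.0, "right_slide", 50))   # -0.5 (greater distance -> greater reduction)
print(adjust_exposure(0.0, "left_slide", 120))   # 1.2  (greater distance -> greater increase)
```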
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, and the controlling operation based on the first gaze location and the first touch signal includes:
And if the first gazing position is located in the viewfinder and the first touch signal is a double-finger kneading signal, view reduction adjustment is performed based on the double-finger kneading signal, and a zoom scale is displayed on the user interface while the view reduction adjustment is performed, as shown in fig. 2E. In addition, how much the view is reduced is determined by the distance between the two fingers: the smaller the distance between the two fingers, the greater the reduction of the view; the larger the distance, the smaller the reduction.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, and the controlling operation based on the first gaze location and the first touch signal includes:
and if the first gazing position is located in the viewfinder and the first touch signal is a double-finger separation signal, view enlargement adjustment is performed based on the double-finger separation signal, and a zoom scale is displayed on the user interface while the view enlargement adjustment is performed, as shown in fig. 2F. In addition, how much the view is enlarged is determined by the distance between the two fingers: the larger the distance between the two fingers, the greater the enlargement of the view; the smaller the distance, the smaller the enlargement.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, a zoom magnification icon is displayed in the viewfinder, and the control operation is performed based on the first gaze location and the first touch signal, including:
and if the first gazing position is located within the zoom magnification icon boundary and the first touch signal is an up slide signal, view enlargement adjustment is performed based on the up slide signal, and a zoom scale is displayed on the user interface while the view enlargement adjustment is performed, as shown in fig. 2G. In addition, how much the view is enlarged is determined by the sliding distance of the up slide: the smaller the sliding distance, the smaller the enlargement; the larger the sliding distance, the greater the enlargement.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, a zoom magnification icon is displayed in the viewfinder, and the control operation is performed based on the first gaze location and the first touch signal, including:
and if the first gazing position is located within the zoom magnification icon boundary and the first touch signal is a down slide signal, view reduction adjustment is performed based on the down slide signal, and a zoom scale is displayed on the user interface while the view reduction adjustment is performed, as shown in fig. 2H. The degree of view reduction is determined by the sliding distance of the down slide: the smaller the sliding distance, the smaller the reduction; the larger the sliding distance, the greater the reduction.
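The four zoom cases above follow one pattern: a double-finger separation or an up slide on the zoom magnification icon enlarges the view, while a double-finger kneading or a down slide reduces it, by an amount that grows with the gesture distance. The sketch below uses that pattern; the zoom limits, the scale factor, and the use of gesture travel as the distance measure are assumptions for illustration.

```python
ZOOM_LIMITS = (1.0, 10.0)    # assumed minimum and maximum view magnification
ZOOM_STEP_PER_PIXEL = 0.02   # assumed magnification change per pixel of gesture distance

def adjust_view(current_zoom, signal, gesture_distance):
    """Enlarge the view on separation/up slide, reduce it on kneading/down slide."""
    delta = gesture_distance * ZOOM_STEP_PER_PIXEL
    if signal in ("double_finger_separation", "up_slide"):
        current_zoom += delta
    elif signal in ("double_finger_kneading", "down_slide"):
        current_zoom -= delta
    low, high = ZOOM_LIMITS
    return max(low, min(high, current_zoom))

print(adjust_view(2.0, "double_finger_separation", 100))  # 4.0 (view enlargement)
print(adjust_view(2.0, "down_slide", 30))                 # 1.4 (view reduction)
```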
In an implementation manner of the present application, the AR glasses support a plurality of camera modes arranged in sequence, the AR glasses are currently in a first camera mode, the user interface includes a viewfinder, and performing a control operation based on the first gazing position and the first touch signal includes: if the first gazing position is located in the viewfinder and the first touch signal is a down slide signal, switching to a second camera mode, wherein the second camera mode is adjacent to the first camera mode and arranged above it;
and if the first gazing position is located in the view-finding frame and the first touch signal is an upward sliding signal, switching to a third camera mode, wherein the third camera mode is adjacent to the first camera mode and is arranged below the first camera mode.
The specific schematic diagram is shown in fig. 2I, assuming that the second camera mode is a video mode and the third camera mode is a camera mode a.
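Assuming the camera modes are kept in an ordered list from top to bottom (the arrangement assumed for fig. 2I is used as the example), the mode switching above amounts to stepping to an adjacent entry. The mode names are examples only.

```python
# Example arrangement from top to bottom, following the assumption used for fig. 2I.
CAMERA_MODES = ["video recording", "photographing", "camera mode a"]

def switch_camera_mode(current_mode, signal, modes=CAMERA_MODES):
    """Down slide -> adjacent mode arranged above; up slide -> adjacent mode arranged below."""
    i = modes.index(current_mode)
    if signal == "down_slide" and i > 0:
        return modes[i - 1]     # the mode arranged above the current one
    if signal == "up_slide" and i + 1 < len(modes):
        return modes[i + 1]     # the mode arranged below the current one
    return current_mode         # no adjacent mode in that direction

print(switch_camera_mode("photographing", "down_slide"))  # video recording
print(switch_camera_mode("photographing", "up_slide"))    # camera mode a
```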
In an implementation manner of the present application, the AR glasses are in a camera mode, the AR glasses support a plurality of camera modes, the AR glasses are currently in a first camera mode, the user interface includes a photographing key, and performing a control operation based on the first gazing position and the first touch signal includes:
If the first gazing position is located within the photographing key boundary and the first touch signal is a light touch signal or a down slide signal, switching to a fourth camera mode, wherein the fourth camera mode is adjacent to the first camera mode and arranged above it;
if the first gazing position is located within the photographing key boundary and the first touch signal is a light touch signal or an up slide signal, switching to a fifth camera mode, wherein the fifth camera mode is adjacent to the first camera mode and arranged below it;
and if the first gazing position is located within the photographing key boundary and the first touch signal is a light touch signal or a continuous slide signal, switching to a 'more functions' mode, in which icons of a plurality of camera modes are displayed on the user interface for the user to select.
In this embodiment, assuming that the fourth camera mode is a video recording mode and the fifth camera mode is a camera mode a, a specific schematic diagram is shown in fig. 2J.
In an implementation manner of the present application, the AR glasses are in a photographing mode, the user interface includes a viewfinder, and the controlling operation based on the first gaze location and the first touch signal includes:
If the user interface includes a photographing key, the first gaze location is located within the boundary of the photographing key, and the first touch signal is a light touch signal, performing real photographing, as shown in fig. 2K;
if the user interface includes a focusing key, the first gaze location is located in the focusing key, and the first touch signal is a light touch signal, performing real photographing, as shown in fig. 2L;
if the user interface includes a virtual information key, the first gaze location is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, a mixed reality photographing function is started, AR virtual information superimposed in a real scene is displayed in the viewfinder under the mixed reality photographing function, and image information obtained by photographing under the mixed reality photographing function is real scene image information superimposed with the AR virtual information, as shown in fig. 2M;
if the user interface includes a floating window mode key, the first gaze location is located in a boundary of the floating window mode key, and the first touch signal is a light touch signal, a floating window mode photographing function is started, and under the floating window mode photographing function, a first floating window is displayed in the user interface, where the first floating window includes the reduced viewfinder and photographing shutter key, as shown in fig. 2N.
It should be noted that real photographing means that the external environment of the AR glasses is photographed by the main camera, so that the obtained image is a real scene image.
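The four light-touch cases above can be summarised as a dispatch from the key under the first gazing position to an action. The key names and action strings below are placeholders standing in for the operations the description associates with each key.

```python
# Placeholder action labels; the real operations are those described above for each key.
PHOTO_MODE_LIGHT_TOUCH_ACTIONS = {
    "photographing_key": "perform real photographing",
    "focusing_key": "perform real photographing",
    "virtual_information_key": "turn on the mixed reality photographing function",
    "floating_window_mode_key": "turn on the floating window mode photographing function",
}

def on_light_touch_in_photo_mode(gazed_key):
    """Map the key under the first gazing position to the corresponding photographing action."""
    return PHOTO_MODE_LIGHT_TOUCH_ACTIONS.get(gazed_key, "no action")

print(on_light_touch_in_photo_mode("virtual_information_key"))
# turn on the mixed reality photographing function
```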
In an implementation manner of the present application, after the floating window mode photographing function is turned on, the method further includes:
determining a second gazing position where the human eyes gaze at the user interface through the human eye tracking lens, and receiving a second touch signal sent by the touch finger ring through the Bluetooth module;
if the second gaze location is located in the viewfinder and the second touch signal is a light touch signal, the floating window mode photographing function is turned off, as shown in fig. 2O;
if the second gazing position is located within the photographing shutter key boundary and the second touch signal is a light touch signal, real photographing is performed, as shown in fig. 2P;
if the second gaze location is located in the first floating window and the second touch signal is a long press signal, the first floating window is moved based on the gaze point of human eyes, as shown in fig. 2Q.
That the second gazing position is located in the first floating window means that the second gazing position is located within the photographing shutter key boundary or within the viewfinder boundary.
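A sketch of the floating-window movement described above: while the long press is held, the first floating window is repositioned so that it follows the eye gazing point. The window size, coordinates and centring behaviour are illustrative assumptions.

```python
WINDOW_WIDTH, WINDOW_HEIGHT = 200, 120   # illustrative floating-window size in interface pixels

def move_floating_window(window_position, gaze_point, long_press_held):
    """While a long press is held, centre the floating window on the current gazing point."""
    if not long_press_held:
        return window_position           # only a long press moves the window
    gx, gy = gaze_point
    return (gx - WINDOW_WIDTH // 2, gy - WINDOW_HEIGHT // 2)

position = (0, 0)
for gaze in [(300, 200), (320, 210), (360, 240)]:    # gaze samples while the press is held
    position = move_floating_window(position, gaze, long_press_held=True)
print(position)   # (260, 180): the window follows the last gazing point
```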
In an implementation manner of the present application, the AR glasses are in a video recording mode, and the performing a control operation based on the first gaze location and the first touch signal includes:
if the user interface includes a video button, the first gaze location is located within the video button boundary, and the first touch signal is a light touch signal, performing real video recording, as shown in fig. 2R;
if the user interface includes a virtual information key, the first gaze location is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, a mixed reality video recording function is started, AR virtual information superimposed in a real scene is displayed in the viewfinder under the mixed reality video recording function, and video information recorded under the mixed reality video recording function is real scene video information superimposed with the AR virtual information, as shown in fig. 2S;
if the AR glasses are recording, the user interface includes a floating window mode key, the first gazing position is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, a floating window mode video recording function is started; under the floating window mode video recording function, a second floating window is displayed in the user interface, where the second floating window includes the reduced recorded duration, the viewfinder and a pause recording key, as shown in fig. 2T.
In an implementation manner of the present application, after the floating window mode video recording function is turned on, the method further includes:
determining a third gazing position where the human eyes gaze at the user interface through the human eye tracking lens, and receiving a third touch signal sent by the touch finger ring through the Bluetooth module;
if the third gazing position is located in the view-finding frame and the third touch signal is a light touch signal, the floating window mode video recording function is closed, as shown in fig. 2U;
if the third gazing position is located within the pause recording key boundary and the third touch signal is a light touch signal, recording is paused, as shown in fig. 2V;
if the third gaze location is located in the second floating window and the third touch signal is a long press signal, the second floating window is moved based on the gaze point of human eyes, as shown in fig. 2W.
That the third gazing position is located in the second floating window means that the third gazing position is located within the recorded-duration boundary, within the viewfinder boundary, or within the pause recording key boundary.
It can be seen that in this embodiment of the present application the AR glasses first display a user interface on the display, then determine the gazing position at which the human eyes gaze at the user interface and receive, through the Bluetooth module, a touch signal sent by a touch finger ring paired with the AR glasses, and finally perform a control operation based on the gazing position and the touch signal. Eye movement tracking provides fine-grained interaction and the touch finger ring provides convenient interaction, so the AR glasses are controlled both conveniently and accurately.
Referring to fig. 3, fig. 3 is a schematic structural diagram of AR glasses provided in an embodiment of the present application. As shown in the figure, the AR glasses include a processor, a memory, a Bluetooth module, a display, a human eye tracking lens, a wireless communication module and one or more programs, wherein the human eye tracking lens is disposed on one side of the display; the one or more programs are stored in the memory and configured to be executed by the processor, the programs including instructions for:
displaying a user interface on the display;
determining a first gazing position where the human eyes gaze at the user interface through the human eye tracking lens, and receiving a first touch signal sent by a touch finger ring paired with the AR glasses through the Bluetooth module;
And performing control operation based on the first gazing position and the first touch signal.
It can be seen that in this embodiment of the present application the AR glasses first display a user interface on the display, then determine the gazing position at which the human eyes gaze at the user interface and receive, through the Bluetooth module, a touch signal sent by a touch finger ring paired with the AR glasses, and finally perform a control operation based on the gazing position and the touch signal. Eye movement tracking provides fine-grained interaction and the touch finger ring provides convenient interaction, so the AR glasses are controlled both conveniently and accurately.
In an implementation of the present application, the AR glasses further include a plurality of infrared tracking LEDs disposed along a peripheral side of the display, and the program includes instructions for performing the following steps in determining, by the eye tracking lens, a first gaze location at which the eye gazes at the user interface:
capturing infrared light spots reflected by human eyes through the human eye tracking lens, wherein the infrared light spots are formed after the plurality of infrared tracking LEDs project infrared light to the human eyes;
a first gaze location of the human eye at the user interface is determined based on the infrared light spot.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, and the program includes instructions specifically for executing the following steps in terms of performing control operations based on the first gaze location and the first touch signal:
and if the first gazing position is positioned in the view-finding frame and the first touch signal is a light touch signal, focusing the first gazing position, wherein after focusing the first gazing position, the user interface displays focusing keys and exposure scales.
In an implementation manner of the present application, the AR glasses are in a photographing mode, the user interface includes a viewfinder, and the program includes instructions specifically for executing the following steps in terms of performing control operations based on the first gaze location and the first touch signal:
if the user interface comprises a photographing key, the first gazing position is located in the boundary of the photographing key, and the first touch signal is a light touch signal, performing real photographing;
if the user interface comprises a focusing key, the first gazing position is located in the focusing key, and the first touch signal is a light touch signal, performing real photographing;
If the user interface comprises a virtual information key, the first gaze location is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, a mixed reality photographing function is started, AR virtual information overlapped in a real scene is displayed in the view-finding frame under the mixed reality photographing function, and image information obtained by photographing under the mixed reality photographing function is real scene image information overlapped with the AR virtual information;
if the user interface comprises a floating window mode key, the first gazing position is located in the boundary of the floating window mode key, and the first touch signal is a light touch signal, a floating window mode photographing function is started, a first floating window is displayed in the user interface under the floating window mode photographing function, and the first floating window comprises a reduced view-finding frame and a photographing shutter key.
In an implementation manner of the present application, after the floating window mode photographing function is turned on, the program includes instructions for further performing the following steps:
determining a second gazing position where the human eyes gaze at the user interface through the human eye tracking lens, and receiving a second touch signal sent by the touch finger ring through the Bluetooth module;
If the second gazing position is positioned in the view-finding frame and the second touch signal is a light touch signal, closing the floating window mode photographing function;
if the second gazing position is located within the boundary of the photographing shutter key and the second touch signal is a light touch signal, performing real photographing;
and if the second gazing position is positioned in the first floating window and the second touch signal is a long-press signal, moving the first floating window based on the eye gazing point.
In an implementation manner of the present application, the AR glasses are in a video mode, and in terms of performing control operations based on the first gaze location and the first touch signal, the program includes instructions specifically for performing the following steps:
if the user interface comprises a video button, the first gazing position is positioned in the boundary of the video button, and the first touch signal is a light touch signal, performing real video recording;
if the user interface comprises a virtual information key, the first gaze location is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, a mixed reality video recording function is started, AR virtual information overlapped in a real scene is displayed in the view-finder frame under the mixed reality video recording function, and video information recorded under the mixed reality video recording function is real scene video information overlapped with the AR virtual information;
If the AR glasses are recording, the user interface comprises a floating window mode key, the first gazing position is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, a floating window mode video recording function is started; under the floating window mode video recording function, a second floating window is displayed in the user interface, and the second floating window comprises the reduced recorded duration, the viewfinder and a pause recording key.
In an implementation manner of the present application, after the floating window mode video recording function is turned on, the program includes instructions for further performing the following steps:
determining a third gazing position where the human eyes gaze at the user interface through the human eye tracking lens, and receiving a third touch signal sent by the touch finger ring through the Bluetooth module;
if the third gazing position is positioned in the view-finding frame and the third touch signal is a light touch signal, closing the floating window mode video recording function;
if the third gazing position is positioned in the pause recording key boundary and the third touch signal is a light touch signal, pausing recording;
and if the third gazing position is positioned in the second floating window and the third touch signal is a long-press signal, moving the second floating window based on the eye gazing point.
It should be noted that, the specific implementation process of this embodiment may refer to the specific implementation process described in the foregoing method embodiment, which is not described herein.
Referring to fig. 4, fig. 4 is a device control apparatus provided in an embodiment of the present application, which is applied to AR glasses, the AR glasses include a bluetooth module, a display, and a eye tracking lens, and the eye tracking lens is disposed on one side of the display, and the apparatus includes:
a display unit 401 for displaying a user interface on the display;
a gaze location determining unit 402 for determining a first gaze location of the user interface at which the human eye gazes through the eye tracking lens;
a communication unit 403, configured to receive, through the bluetooth module, a first touch signal sent by a touch finger ring paired with the AR glasses;
the control unit 404 is configured to perform a control operation based on the first gaze location and the first touch signal.
It can be seen that in this embodiment of the present application the AR glasses first display a user interface on the display, then determine the gazing position at which the human eyes gaze at the user interface and receive, through the Bluetooth module, a touch signal sent by a touch finger ring paired with the AR glasses, and finally perform a control operation based on the gazing position and the touch signal. Eye movement tracking provides fine-grained interaction and the touch finger ring provides convenient interaction, so the AR glasses are controlled both conveniently and accurately.
In an implementation manner of the present application, the AR glasses further include a plurality of infrared tracking LEDs, where the plurality of infrared tracking LEDs are disposed along a peripheral side of the display, and the gaze location determining unit 402 is specifically configured to:
capturing infrared light spots reflected by human eyes through the human eye tracking lens, wherein the infrared light spots are formed after the plurality of infrared tracking LEDs project infrared light to the human eyes;
a first gaze location of the human eye at the user interface is determined based on the infrared light spot.
In an implementation manner of the present application, the AR glasses are in a camera mode, the user interface includes a viewfinder, and the control unit 404 is specifically configured to:
and if the first gazing position is positioned in the view-finding frame and the first touch signal is a light touch signal, focusing the first gazing position, wherein after focusing the first gazing position, the user interface displays focusing keys and exposure scales.
In an implementation manner of the present application, the AR glasses are in a photographing mode, the user interface includes a viewfinder, and the control unit 404 is specifically configured to:
If the user interface comprises a photographing key, the first gazing position is located in the boundary of the photographing key, and the first touch signal is a light touch signal, performing real photographing;
if the user interface comprises a focusing key, the first gazing position is located in the focusing key, and the first touch signal is a light touch signal, performing real photographing;
if the user interface comprises a virtual information key, the first gaze location is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, a mixed reality photographing function is started, AR virtual information overlapped in a real scene is displayed in the view-finding frame under the mixed reality photographing function, and image information obtained by photographing under the mixed reality photographing function is real scene image information overlapped with the AR virtual information;
if the user interface comprises a floating window mode key, the first gazing position is located in the boundary of the floating window mode key, and the first touch signal is a light touch signal, a floating window mode photographing function is started, a first floating window is displayed in the user interface under the floating window mode photographing function, and the first floating window comprises a reduced view-finding frame and a photographing shutter key.
In an implementation manner of the present application, the gaze location determining unit 402 is further configured to determine, after the floating window mode photographing function is turned on, that the human eye gazes at the second gaze location of the user interface through the human eye tracking lens;
the communication unit 403 is further configured to receive, through the bluetooth module, a second touch signal sent by the touch finger ring;
the control unit 404 is further configured to close the floating window mode photographing function if the second gazing position is located in the viewfinder and the second touch signal is a light touch signal; perform real photographing if the second gazing position is located within the boundary of the photographing shutter key and the second touch signal is a light touch signal; and move the first floating window based on the eye gazing point if the second gazing position is located in the first floating window and the second touch signal is a long press signal.
In an implementation manner of the present application, the AR glasses are in a video mode, and the control unit 404 is specifically configured to:
if the user interface comprises a video button, the first gazing position is positioned in the boundary of the video button, and the first touch signal is a light touch signal, performing real video recording;
If the user interface comprises a virtual information key, the first gaze location is located in the boundary of the virtual information key, and the first touch signal is a light touch signal, a mixed reality video recording function is started, AR virtual information overlapped in a real scene is displayed in the view-finder frame under the mixed reality video recording function, and video information recorded under the mixed reality video recording function is real scene video information overlapped with the AR virtual information;
if the AR glasses are recording, the user interface comprises a floating window mode key, the first gazing position is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, a floating window mode video recording function is started; under the floating window mode video recording function, a second floating window is displayed in the user interface, and the second floating window comprises the reduced recorded duration, the viewfinder and a pause recording key.
In an implementation manner of the present application, the gaze location determining unit 402 is further configured to determine, after the floating window mode video recording function is turned on, a third gaze location where the human eye gazes at the user interface through the human eye tracking lens;
The communication unit 403 is further configured to receive, through the bluetooth module, a third touch signal sent by the touch finger ring;
the control unit 404 is further configured to close the floating window mode video recording function if the third gazing position is located in the viewfinder and the third touch signal is a light touch signal; pause recording if the third gazing position is located within the pause recording key boundary and the third touch signal is a light touch signal; and move the second floating window based on the eye gazing point if the third gazing position is located in the second floating window and the third touch signal is a long press signal.
The present application also provides a computer readable storage medium, wherein the computer readable storage medium stores a computer program for electronic data exchange, and wherein the computer program causes a computer to perform some or all of the steps described in the AR glasses in the above method embodiment.
Embodiments of the present application also provide a computer program product, wherein the computer program product comprises a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps described for AR glasses in the above method. The computer program product may be a software installation package.
The steps of a method or algorithm described in the embodiments of the present application may be implemented in hardware, or may be implemented by a processor executing software instructions. The software instructions may consist of corresponding software modules that may be stored in random access memory (Random Access Memory, RAM), flash memory, read-only memory (Read Only Memory, ROM), erasable programmable read-only memory (Erasable Programmable ROM, EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. In addition, the ASIC may reside in an access network device, a target network device, or a core network device. It is of course also possible that the processor and the storage medium reside as discrete components in an access network device, a target network device, or a core network device.
Those of skill in the art will appreciate that, in one or more of the above examples, the functions described in the embodiments of the present application may be implemented, in whole or in part, in software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (Digital Subscriber Line, DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital video disc (Digital Video Disc, DVD)), or a semiconductor medium (e.g., a solid state disk (Solid State Disk, SSD)), etc.
The foregoing embodiments have been provided for the purpose of illustrating the embodiments of the present application in further detail, and it should be understood that the foregoing embodiments are merely illustrative of the embodiments of the present application and are not intended to limit the scope of the embodiments of the present application, and any modifications, equivalents, improvements, etc. made on the basis of the technical solutions of the embodiments of the present application are included in the scope of the embodiments of the present application.

Claims (9)

1. A device control method, applied to augmented reality AR glasses, the AR glasses including a Bluetooth module, a display, and a human eye tracking lens, the human eye tracking lens being provided on one side of the display, the method comprising:
displaying a user interface on the display;
determining, through the human eye tracking lens, a first gaze location at which the human eyes gaze at the user interface, and receiving, through the Bluetooth module, a first touch signal sent by a touch finger ring paired with the AR glasses;
performing a control operation based on the first gaze location and the first touch signal;
wherein, when the AR glasses are in a camera mode, the user interface includes a viewfinder, and the performing a control operation based on the first gaze location and the first touch signal comprises:
if the first gaze location is located in the viewfinder and the first touch signal is a light touch signal, focusing on the first gaze location, wherein after focusing on the first gaze location, the user interface displays a focusing key and an exposure scale, the focusing key being used for triggering real-scene photographing; and after the user interface displays the focusing key and the exposure scale, if the first gaze location is located in the focusing key and the first touch signal is a light touch signal, real-scene photographing is performed, and if no operation is performed within a first time period, the focusing key and the exposure scale are removed.
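The interaction in claim 1 amounts to a small state machine driven by (gaze location, touch signal) pairs: a light touch while gazing into the viewfinder focuses and reveals a focusing key and an exposure scale, a light touch while gazing at the focusing key captures a photo, and the keys are removed after a period of inactivity. The Kotlin sketch below is a minimal, hypothetical illustration of that flow; the types `GazePoint`, `Rect`, `TouchSignal`, the `CameraModeController` class, and the 3-second timeout are assumptions introduced for this example, not the patented implementation.

```kotlin
// Hypothetical sketch of the claim-1 interaction: focus on gaze, then photograph.
data class GazePoint(val x: Float, val y: Float)

data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: GazePoint) = p.x in left..right && p.y in top..bottom
}

enum class TouchSignal { LIGHT_TOUCH, LONG_PRESS }

class CameraModeController(
    private val viewfinder: Rect,
    private val firstTimePeriodMs: Long = 3_000L   // assumed "first time period" before keys are removed
) {
    private var focusKey: Rect? = null             // focusing key (and exposure scale) shown after focusing
    private var lastActionAt = 0L

    fun onInput(gaze: GazePoint, touch: TouchSignal, nowMs: Long) {
        // Remove the focusing key and exposure scale when no operation happened within the first time period.
        if (focusKey != null && nowMs - lastActionAt > firstTimePeriodMs) focusKey = null

        val key = focusKey
        when {
            // Gaze on the focusing key + light touch -> real-scene photographing.
            touch == TouchSignal.LIGHT_TOUCH && key != null && key.contains(gaze) ->
                println("real-scene photo captured")
            // Gaze inside the viewfinder + light touch -> focus and show focusing key + exposure scale.
            touch == TouchSignal.LIGHT_TOUCH && viewfinder.contains(gaze) -> {
                println("focusing on (${gaze.x}, ${gaze.y})")
                focusKey = Rect(gaze.x - 40f, gaze.y + 60f, gaze.x + 40f, gaze.y + 100f)
                lastActionAt = nowMs
            }
        }
    }
}
```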
2. The method of claim 1, wherein the AR glasses further comprise a plurality of infrared tracking light emitting diodes (LEDs) disposed along a perimeter of the display, and the determining, through the human eye tracking lens, a first gaze location at which the human eyes gaze at the user interface comprises:
capturing, through the human eye tracking lens, infrared light spots reflected by the human eyes, wherein the infrared light spots are formed after the plurality of infrared tracking LEDs project infrared light onto the human eyes;
determining, based on the infrared light spots, the first gaze location at which the human eyes gaze at the user interface.
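Claim 2 derives the gaze location from infrared light spots (glints) that the LEDs around the display produce on the eye. A common way to turn such measurements into a display coordinate is to map the pupil-centre-to-glint vector through a calibration transform; the Kotlin sketch below assumes a simple affine calibration, and `AffineGazeMapper`, its coefficients, and the sample pixel values are all hypothetical, not taken from the patent.

```kotlin
// Hypothetical gaze estimation: pupil-to-glint vector mapped to display coordinates
// through an affine calibration. The data flow matches claim 2:
// IR LEDs -> reflected light spots -> gaze location on the user interface.
data class Point(val x: Double, val y: Double)

class AffineGazeMapper(
    private val a: DoubleArray, // ax, bx, cx from a (hypothetical) calibration routine
    private val b: DoubleArray  // ay, by, cy
) {
    fun toDisplay(pupilCenter: Point, glintCentroid: Point): Point {
        val vx = pupilCenter.x - glintCentroid.x
        val vy = pupilCenter.y - glintCentroid.y
        return Point(
            a[0] * vx + a[1] * vy + a[2],
            b[0] * vx + b[1] * vy + b[2]
        )
    }
}

fun main() {
    val mapper = AffineGazeMapper(
        a = doubleArrayOf(52.0, 3.1, 640.0),
        b = doubleArrayOf(2.4, 48.0, 360.0)
    )
    // Pupil and glint centres as detected in the eye-tracking camera image (pixels).
    val gaze = mapper.toDisplay(Point(312.0, 204.0), Point(305.0, 198.0))
    println("estimated gaze location on the display: (${gaze.x}, ${gaze.y})")
}
```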
3. The method of claim 1 or 2, wherein, when the AR glasses are in a photographing mode, the user interface includes a viewfinder, and the performing a control operation based on the first gaze location and the first touch signal comprises:
if the user interface comprises a photographing key, the first gaze location is located within the boundary of the photographing key, and the first touch signal is a light touch signal, performing real-scene photographing;
if the user interface comprises a focusing key, the first gaze location is located within the focusing key, and the first touch signal is a light touch signal, performing real-scene photographing;
if the user interface comprises a virtual information key, the first gaze location is located within the boundary of the virtual information key, and the first touch signal is a light touch signal, turning on a mixed reality photographing function, wherein under the mixed reality photographing function AR virtual information superimposed on the real scene is displayed in the viewfinder, and image information obtained by photographing under the mixed reality photographing function is real-scene image information superimposed with the AR virtual information;
if the user interface comprises a floating window mode key, the first gaze location is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, turning on a floating window mode photographing function, wherein under the floating window mode photographing function a first floating window is displayed in the user interface, and the first floating window comprises a reduced viewfinder and a photographing shutter key.
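Claim 3 is, in effect, a hit-test dispatch: each light touch is routed according to which user-interface key the gaze location falls within. The following Kotlin sketch illustrates that dispatch under stated assumptions; `Bounds`, `PhotoKey`, `PhotoModeDispatcher`, and the printed actions are hypothetical placeholders rather than the patent's implementation.

```kotlin
// Hypothetical dispatch of a light touch in photographing mode, keyed by gaze hit-testing.
data class Bounds(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

enum class PhotoKey { PHOTOGRAPH, FOCUS, VIRTUAL_INFO, FLOATING_WINDOW_MODE }

class PhotoModeDispatcher(private val keys: Map<PhotoKey, Bounds>) {
    fun onLightTouch(gazeX: Float, gazeY: Float) {
        // Find the first key whose bounds contain the gaze location; ignore the touch otherwise.
        val hit = keys.entries.firstOrNull { it.value.contains(gazeX, gazeY) }?.key ?: return
        when (hit) {
            PhotoKey.PHOTOGRAPH, PhotoKey.FOCUS ->
                println("real-scene photo captured")
            PhotoKey.VIRTUAL_INFO ->
                println("mixed reality photographing on: AR info overlaid in the viewfinder")
            PhotoKey.FLOATING_WINDOW_MODE ->
                println("floating window mode on: reduced viewfinder and shutter key shown")
        }
    }
}
```

A caller would build the map of key bounds from the current user-interface layout and forward every (gaze location, light touch) pair to `onLightTouch`.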
4. The method of claim 3, wherein, after the turning on of the floating window mode photographing function, the method further comprises:
determining, through the human eye tracking lens, a second gaze location at which the human eyes gaze at the user interface, and receiving, through the Bluetooth module, a second touch signal sent by the touch finger ring;
if the second gaze location is located in the viewfinder and the second touch signal is a light touch signal, turning off the floating window mode photographing function;
if the second gaze location is located within the boundary of the photographing shutter key and the second touch signal is a light touch signal, performing real-scene photographing;
and if the second gaze location is located in the first floating window and the second touch signal is a long-press signal, moving the first floating window based on the eye gaze point.
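While the first floating window is shown, claim 4 distinguishes three cases by where the second gaze location lands and whether the second touch signal is a light touch or a long press. Below is a minimal, hypothetical Kotlin sketch of that handling; the `Box`, `Touch`, and `FloatingWindowPhotoController` names and geometry are assumptions for illustration only.

```kotlin
// Hypothetical handling of the second gaze/touch pair while the first floating window is shown.
data class Box(var left: Float, var top: Float, var right: Float, var bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
    fun moveCenterTo(x: Float, y: Float) {
        val w = right - left
        val h = bottom - top
        left = x - w / 2; right = x + w / 2
        top = y - h / 2; bottom = y + h / 2
    }
}

enum class Touch { LIGHT_TOUCH, LONG_PRESS }

class FloatingWindowPhotoController(
    private val floatingWindow: Box,     // the first floating window
    private val shrunkViewfinder: Box,   // reduced viewfinder inside it
    private val shutterKey: Box          // photographing shutter key inside it
) {
    var floatingModeOn = true
        private set

    fun onSecondInput(gazeX: Float, gazeY: Float, touch: Touch) {
        when {
            touch == Touch.LIGHT_TOUCH && shrunkViewfinder.contains(gazeX, gazeY) -> {
                floatingModeOn = false                          // close the floating window mode
            }
            touch == Touch.LIGHT_TOUCH && shutterKey.contains(gazeX, gazeY) -> {
                println("real-scene photo captured")            // shutter key hit
            }
            touch == Touch.LONG_PRESS && floatingWindow.contains(gazeX, gazeY) -> {
                floatingWindow.moveCenterTo(gazeX, gazeY)       // window follows the eye gaze point
            }
        }
    }
}
```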
5. The method of claim 1 or 2, wherein, when the AR glasses are in a video recording mode, the performing a control operation based on the first gaze location and the first touch signal comprises:
if the user interface comprises a video recording key, the first gaze location is located within the boundary of the video recording key, and the first touch signal is a light touch signal, performing real-scene video recording;
if the user interface comprises a virtual information key, the first gaze location is located within the boundary of the virtual information key, and the first touch signal is a light touch signal, turning on a mixed reality video recording function, wherein under the mixed reality video recording function AR virtual information superimposed on the real scene is displayed in the viewfinder, and video information recorded under the mixed reality video recording function is real-scene video information superimposed with the AR virtual information;
if the AR glasses are recording, the user interface comprises a floating window mode key, the first gaze location is located within the boundary of the floating window mode key, and the first touch signal is a light touch signal, turning on a floating window mode video recording function, wherein under the floating window mode video recording function a second floating window is displayed in the user interface, and the second floating window comprises a reduced viewfinder, a recorded duration, and a pause recording key.
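Claim 5 mirrors the photographing-mode dispatch of claim 3 for video recording, with recording actions in place of photo actions. A short hypothetical Kotlin sketch in the same style; the enum and function names are illustrative assumptions only.

```kotlin
// Hypothetical dispatch of a light touch in video recording mode.
enum class VideoKey { RECORD, VIRTUAL_INFO, FLOATING_WINDOW_MODE }

fun dispatchVideoLightTouch(hit: VideoKey?, isRecording: Boolean) {
    when (hit) {
        VideoKey.RECORD ->
            println("real-scene video recording started")
        VideoKey.VIRTUAL_INFO ->
            println("mixed reality recording on: AR info overlaid in the viewfinder")
        VideoKey.FLOATING_WINDOW_MODE -> {
            // Meaningful only while recording: show reduced viewfinder, duration, and pause key in a window.
            if (isRecording) println("floating window mode video recording on")
        }
        null -> Unit  // the gaze location is not on any key
    }
}
```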
6. The method of claim 5, wherein after the floating window mode video recording function is turned on, the method further comprises:
determining, through the human eye tracking lens, a third gaze location at which the human eyes gaze at the user interface, and receiving, through the Bluetooth module, a third touch signal sent by the touch finger ring;
if the third gaze location is located in the viewfinder and the third touch signal is a light touch signal, turning off the floating window mode video recording function;
if the third gaze location is located within the boundary of the pause recording key and the third touch signal is a light touch signal, pausing the video recording;
and if the third gaze location is located in the second floating window and the third touch signal is a long-press signal, moving the second floating window based on the eye gaze point.
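Claim 6 handles the third gaze/touch pair against the second floating window in the same way claim 4 handles the first: close on the viewfinder, pause on the key, drag on a long press. A compact hypothetical Kotlin sketch; the parameter names are assumptions for illustration.

```kotlin
// Hypothetical handling of the third gaze/touch pair for the recording floating window.
fun onThirdInput(inViewfinder: Boolean, inPauseKey: Boolean, inWindow: Boolean, longPress: Boolean) {
    when {
        !longPress && inViewfinder -> println("floating window mode video recording closed")
        !longPress && inPauseKey -> println("video recording paused")
        longPress && inWindow -> println("second floating window follows the eye gaze point")
    }
}
```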
7. A device control apparatus, applied to augmented reality AR glasses, the AR glasses including a Bluetooth module, a display, and a human eye tracking lens, the human eye tracking lens being provided on one side of the display, the apparatus comprising:
a display unit, configured to display a user interface on the display;
a gaze location determining unit, configured to determine, through the human eye tracking lens, a first gaze location at which the human eyes gaze at the user interface;
a communication unit, configured to receive, through the Bluetooth module, a first touch signal sent by a touch finger ring paired with the AR glasses;
a control unit, configured to perform a control operation based on the first gaze location and the first touch signal;
wherein, when the AR glasses are in a camera mode, the user interface includes a viewfinder, and the performing, by the control unit, a control operation based on the first gaze location and the first touch signal comprises:
if the first gaze location is located in the viewfinder and the first touch signal is a light touch signal, focusing on the first gaze location, wherein after focusing on the first gaze location, the user interface displays a focusing key and an exposure scale, the focusing key being used for triggering real-scene photographing; and after the user interface displays the focusing key and the exposure scale, if the first gaze location is located in the focusing key and the first touch signal is a light touch signal, real-scene photographing is performed, and if no operation is performed within a first time period, the focusing key and the exposure scale are removed.
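Claim 7 restates the method of claim 1 as an apparatus composed of cooperating units. The hypothetical Kotlin interfaces below only illustrate that decomposition; the method signatures and types are assumptions made for this sketch, not the claimed apparatus.

```kotlin
// Hypothetical decomposition of the claim-7 apparatus into its four cooperating units.
interface DisplayUnit { fun showUserInterface() }
interface GazeLocationDeterminingUnit { fun firstGazeLocation(): Pair<Float, Float> }  // via the eye tracking lens
interface CommunicationUnit { fun receiveFirstTouchSignal(): String }                  // via the Bluetooth module
interface ControlUnit { fun control(gaze: Pair<Float, Float>, touchSignal: String) }

class DeviceControlApparatus(
    private val displayUnit: DisplayUnit,
    private val gazeUnit: GazeLocationDeterminingUnit,
    private val communicationUnit: CommunicationUnit,
    private val controlUnit: ControlUnit
) {
    fun handleOneInteraction() {
        displayUnit.showUserInterface()
        controlUnit.control(gazeUnit.firstGazeLocation(), communicationUnit.receiveFirstTouchSignal())
    }
}
```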
8. Augmented reality AR glasses, comprising a processor, a memory, a Bluetooth module, a display, a human eye tracking lens, a wireless communication module, and one or more programs stored in the memory and configured to be executed by the processor, the one or more programs comprising instructions for performing the steps in the method of any one of claims 1-6.
9. A computer-readable storage medium storing a computer program, wherein the computer program is executed by a processor to implement the method of any one of claims 1-6.
CN201911310496.1A 2019-12-18 2019-12-18 Equipment control method and related equipment Active CN111061372B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911310496.1A CN111061372B (en) 2019-12-18 2019-12-18 Equipment control method and related equipment

Publications (2)

Publication Number Publication Date
CN111061372A CN111061372A (en) 2020-04-24
CN111061372B (en) 2023-05-02

Family

ID=70302279

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911310496.1A Active CN111061372B (en) 2019-12-18 2019-12-18 Equipment control method and related equipment

Country Status (1)

Country Link
CN (1) CN111061372B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015120673A1 (en) * 2014-02-11 2015-08-20 惠州Tcl移动通信有限公司 Method, system and photographing equipment for controlling focusing in photographing by means of eyeball tracking technology
WO2015183438A1 (en) * 2014-05-30 2015-12-03 Apple Inc. Realtime capture exposure adjust gestures
CN107003730A (en) * 2015-03-13 2017-08-01 华为技术有限公司 A kind of electronic equipment, photographic method and camera arrangement
CN109600555A (en) * 2019-02-02 2019-04-09 北京七鑫易维信息技术有限公司 A kind of focusing control method, system and photographing device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103747181A (en) * 2014-01-10 2014-04-23 上海斐讯数据通信技术有限公司 System for combining videos and camera-acquired pictures
CN105824522A (en) * 2015-09-24 2016-08-03 维沃移动通信有限公司 Photographing method and mobile terminal
US10466780B1 (en) * 2015-10-26 2019-11-05 Pillantas Systems and methods for eye tracking calibration, eye vergence gestures for interface control, and visual aids therefor
CN106131394A (en) * 2016-06-15 2016-11-16 青岛海信移动通信技术股份有限公司 A kind of method and device taken pictures
CN106354253A (en) * 2016-08-19 2017-01-25 上海理湃光晶技术有限公司 Cursor control method and AR glasses and intelligent ring based on same
CN107729871A (en) * 2017-11-02 2018-02-23 北方工业大学 Infrared light-based human eye movement track tracking method and device
CN109725717A (en) * 2018-11-30 2019-05-07 成都理想境界科技有限公司 Image processing method and AR equipment applied to AR equipment
CN109597489A (en) * 2018-12-27 2019-04-09 武汉市天蝎科技有限公司 A kind of method and system of the eye movement tracking interaction of near-eye display device
CN110069101B (en) * 2019-04-24 2024-04-02 洪浛檩 Wearable computing device and man-machine interaction method

Also Published As

Publication number Publication date
CN111061372A (en) 2020-04-24

Similar Documents

Publication Publication Date Title
CN110647237B (en) Gesture-based content sharing in an artificial reality environment
JP6393367B2 (en) Tracking display system, tracking display program, tracking display method, wearable device using them, tracking display program for wearable device, and operation method of wearable device
JP6400197B2 (en) Wearable device
US9838597B2 (en) Imaging device, imaging method, and program
CN112118380B (en) Camera control method, device, equipment and storage medium
US9442571B2 (en) Control method for generating control instruction based on motion parameter of hand and electronic device using the control method
US11017257B2 (en) Information processing device, information processing method, and program
CN111970456B (en) Shooting control method, device, equipment and storage medium
CN110546601B (en) Information processing device, information processing method, and program
JP6341755B2 (en) Information processing apparatus, method, program, and recording medium
JPWO2015105044A1 (en) Interface device, portable device, control device, module, control method, and computer program
JP2016177658A (en) Virtual input device, input method, and program
US20180176459A1 (en) Method and device for changing focal point of camera
KR102361025B1 (en) Wearable glasses and method for displaying image therethrough
US11650661B2 (en) Electronic device and control method for electronic device
US11675198B2 (en) Eyewear including virtual scene with 3D frames
JP7459798B2 (en) Information processing device, information processing method, and program
KR20180094875A (en) Information processing apparatus, information processing method, and program
CN111061372B (en) Equipment control method and related equipment
CN111782053B (en) Model editing method, device, equipment and storage medium
WO2019239162A1 (en) Hand held device for controlling digital magnification on a portable display
CN106662911A (en) Gaze detector using reference frames in media
US20210375002A1 (en) Electronic apparatus, control method of electronic apparatus, and non-transitory computer readable medium
JP2023087378A (en) Electronic device, method for controlling electronic device, and program
JP2023087412A (en) Electronic device, electronic device control method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant