CN115877942A - Electronic device and method of controlling the same by detecting state of user's eyes

Publication number: CN115877942A
Application number: CN202111141084.7A
Authority: CN (China)
Original language: Chinese (zh)
Prior art keywords: user, eyes, state information, eye, head
Legal status: Pending
Inventors: 李江亮, 苏爱民, 方俊
Assignee: Shanghai Guangshi Fusion Intelligent Technology Co., Ltd.


Abstract

An electronic device and a method of controlling the same by detecting the state of a user's eyes. The method comprises: detecting current state information of the user's left eye; detecting current state information of the user's right eye; comparing the current state information of the left eye with that of the right eye; determining from the comparison result whether the user's eyes are in an unnatural state; and activating a corresponding function of the electronic device upon determining that the user's eyes are in the unnatural state.

Description

Electronic device and method of controlling the same by detecting state of user's eyes
Technical Field
The present invention relates to the field of human-computer interaction, and more particularly, to an electronic device and a method for controlling the same by detecting the state of a user's eyes.
Background
With the development of technology, head-mounted interactive devices (e.g., smart helmets, smart glasses) are becoming increasingly popular. For such a device, enabling the user to interact with it conveniently is an important design consideration.
One interaction method is gesture input, which requires a camera deployed on the head-mounted interactive device to continuously capture and analyze images of the user's gestures. This places high demands on the performance and computing power of the device and also consumes considerable energy.
Another interaction method uses an input device held by the user: by sensing changes in the position and posture of the input device, the position of a cursor on the display medium of the head-mounted interactive device is changed, thereby determining the interaction the user wishes to perform, such as which icon the user wishes to click. However, this approach requires the user to hold an additional device, is cumbersome, and does not free the user's hands.
An electronic device and a method of controlling the same by detecting a state of a user's eyes are provided.
Disclosure of Invention
One aspect of the invention relates to a method of controlling an electronic device by detecting the state of a user's eyes, comprising: detecting current state information of the user's left eye; detecting current state information of the user's right eye; comparing the current state information of the left eye with that of the right eye; determining from the comparison result whether the user's eyes are in an unnatural state; and activating a corresponding function of the electronic device upon determining that the user's eyes are in the unnatural state.
Another aspect of the invention relates to a method of performing an operation by detecting the state of a user's eye, comprising: detecting current state information of either eye of a user; determining from the current state information whether the eye is closed; if the eye is determined to be closed, presenting a predetermined interactive interface on the display medium of a head-mounted interactive device worn by the user that corresponds to the other eye; and if the eye is no longer closed, performing one or more of the following operations:
deactivating or closing the predetermined interactive interface on the display medium of the head-mounted interactive device;
presenting another interactive interface; or
performing another function.
Another aspect of the present invention relates to a method of controlling an electronic device by detecting the state of a user's eye, comprising: detecting current state information of either eye of a user; determining from the current state information whether the eye is closed; and if the eye is determined to be closed, activating a corresponding function of the electronic device, wherein the function comprises: a screen sliding or moving function; a virtual object moving function; or a direction selection function.
Another aspect of the invention relates to an electronic device comprising: one or more detection devices for detecting status information of the user's eyes; one or more display media for presenting an interactive interface; and a processor for implementing the methods described herein.
Another aspect of the invention relates to a storage medium storing a computer program which, when executed by a processor, can be used to implement the methods described in the present application.
The solutions of the invention provide a convenient, efficient, and robust way of controlling functions through the user's eyes. In some embodiments, a predetermined function may be activated when the user's eyes are in an unnatural state or when one eye is closed, and the function may be exited or stopped, or another function activated, when the eyes are no longer in the unnatural state or the closed eye reopens, improving the convenience and efficiency of interaction. In some embodiments, the act of closing one eye is used not only to launch the interactive interface but also to aim the cursor on the display medium accurately at a virtual or real object through only one eye (i.e., the open eye) afterwards, avoiding binocular parallax so that the user can select the virtual or real object more accurately and efficiently, significantly improving the efficiency and convenience of human-computer interaction.
Drawings
Embodiments of the invention are further described below with reference to the accompanying drawings, in which:
FIG. 1 shows an apparatus or electronic device for performing an operation by detecting the state of a user's eyes according to one embodiment;
FIG. 2 illustrates a method for controlling an electronic device by detecting a state of a user's eyes, according to one embodiment;
FIG. 3 illustrates a method for controlling an electronic device by detecting a state of a user's eyes, according to one embodiment;
FIG. 4 illustrates an interactive interface presented on a display medium of a head-mounted interactive device, in accordance with one embodiment;
FIG. 5 illustrates an interaction process according to one embodiment;
FIG. 6 illustrates a method for controlling an electronic device by detecting a state of a user's eyes, according to one embodiment;
FIG. 7 illustrates an interaction process according to one embodiment;
FIG. 8 illustrates a method for controlling an electronic device by detecting a state of a user's eyes, according to one embodiment;
FIG. 9 shows a method for controlling an electronic device by detecting the state of a user's eyes according to one embodiment.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the invention is described in further detail below through embodiments with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative and are not intended to limit the invention.
In the present application, the natural state of both eyes refers to the states the two eyes may assume when a person is not consciously controlling them, such as both eyes naturally open, both eyes naturally closed, and natural intermediate states between open and closed.
Generally, the unnatural state of both eyes is a state that a person consciously creates with one or both eyes and that is distinguishable from the natural state. It may manifest, for example, as a difference in the degree of closure of the two eyes, such as closing one eye, squinting one eye, or deliberately opening one eye wide. In one embodiment, the unnatural state of both eyes means that the states of the two eyes differ.
FIG. 1 shows an apparatus or electronic device for performing an operation by detecting the state of a user's eyes according to one embodiment; the apparatus may be, for example, a head-mounted interactive device (e.g., smart glasses) 100. The apparatus comprises a first detection device 101 for detecting the state of the user's right eye and a second detection device 102 for detecting the state of the user's left eye.
The first detection device 101 and the second detection device 102 may be various devices that can be used to detect the state of the eyes of the user, such as an infrared detection device, a laser detection device, a camera, and the like.
In one embodiment, the detection device may include a transmitter and a receiver, which may be integrated or combined together, or may be two physically separated devices. The detection device may transmit a signal (e.g., an infrared signal) to the user's eye (e.g., the eyeball region) via a transmitter therein and detect the state of the user's eye via a reflected signal received by a receiver.
In one embodiment, the detection device may continuously transmit signals through a transmitter therein and may continuously collect reflected signals through a receiver. In one embodiment, the detection device may also transmit a signal at predetermined time intervals via a transmitter therein.
To avoid interference from various signals in the environment, the signal emitted by the emitter may have unique characteristics, for example, the emitted signal may be an infrared signal or a far infrared signal, or the emitted signal may have a particular frequency to distinguish from other signals in the environment.
The intensity of the signal emitted by the emitter can be configured to avoid harm to the human eye. In one embodiment, the apparatus also includes a light-sensing device so that the intensity or amplitude of the emitted signal can be dynamically adjusted according to ambient light.
In one embodiment, the apparatus may be a head-mounted interactive device, such as smart glasses or a smart helmet. The smart glasses may be, for example, AR glasses or VR glasses. In one embodiment, the detection device may be mounted at the nose pads of the glasses, near the junction of the frame and a temple, on the lens rim, or at another location.
FIG. 2 shows a method for controlling an electronic device by detecting the state of a user's eyes according to one embodiment, the method comprising the following steps:
step 201: current state information of the left eye of the user is detected.
Step 202: current state information of the right eye of the user is detected.
In one embodiment, the current state information of the user's left eye may be detected by a first detection device and that of the right eye by a second detection device. In one embodiment, the current state information of both eyes may be detected simultaneously by a single detection device.
The current state information of the user's eye may be any information related to the current state of the eye detected by the detection device, such as the intensity of the reflected signal, the time of flight of the signal, or an image of the eye, or other information calculated or derived from it, such as the eye's current open/closed state or degree of closure. In this application, the degree of closure indicates how far the eye is closed: the wider the eye is open, the lower the degree of closure.
The state information of the user's eyes may take various forms. In one embodiment, it may simply comprise two states: open and closed. In one embodiment, it may comprise more states, such as open, closed, and one or more intermediate states between open and closed. In one embodiment, the state information may be a numerical value within a continuous range. For example, if 1 indicates that the eye is fully open and 0 indicates that it is fully closed, the state information may be a value between 0 and 1 whose magnitude indicates the degree of closure, larger values corresponding to a smaller degree of closure. In one embodiment, the state information may be represented by the raw information collected by the detection device, such as the intensity of the reflected signal or the time of flight of the signal.
Step 203: comparing the current state information of the left eye with the current state information of the right eye.
After the current state information of the user's left and right eyes has been obtained, the two pieces of state information may be compared. In one embodiment, comparing the current state information of the left eye with that of the right eye includes determining the difference between them.
Step 204: determining from the comparison result whether the user's eyes are in an unnatural state.
In one embodiment, if the state information of the user's eyes is divided into a limited number of states, the states of the left and right eyes may be compared for equality; if they differ, the user's eyes may be considered to be in an unnatural state. In one embodiment, if the state information is represented as a value within a continuous range, the difference between the left-eye and right-eye state information may be determined; if the difference exceeds a predetermined threshold, the user's eyes may be determined to be in an unnatural state, and otherwise in a natural state.
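As a concrete illustration of steps 203 and 204 for the continuous-valued case, the comparison can be sketched as follows. This is a minimal sketch assuming the eye state has already been expressed as a value in [0, 1] (1 = fully open, 0 = fully closed); the threshold value is an assumption for illustration, not a value given in this description:

```python
UNNATURAL_THRESHOLD = 0.5  # assumed value; tuned per device and user in practice

def eyes_unnatural(left_state: float, right_state: float,
                   threshold: float = UNNATURAL_THRESHOLD) -> bool:
    # Step 203: compare the two eyes' current state information.
    difference = abs(left_state - right_state)
    # Step 204: the eyes are judged unnatural if the difference between
    # their degrees of openness exceeds the threshold.
    return difference > threshold
```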
In the natural state, the left and right eyes of the same person are generally in the same or a similar state, whether open, closed, or in some intermediate state between open and closed, whereas the open or closed appearance of the eyes may differ between people owing to different physiology (e.g., different eye sizes, positions, or shapes). Therefore, comparing the difference between the state information of the same user's left and right eyes makes it possible to judge accurately whether the user's eyes are in an unnatural state, while avoiding or mitigating misjudgments that physiological differences between people might otherwise cause.
Step 205: if the user's eyes are determined to be in the unnatural state, activating the corresponding function of the electronic device.
The electronic device may be a head-mounted interactive device. In one embodiment, when the user's eyes are in the unnatural state, the duration of that state can be determined, and the corresponding function can then be activated according to the duration. For example, a cursor (also referred to as a sight) may be presented on the display medium of the head-mounted interactive device as soon as the user's eyes enter the unnatural state. If the eyes leave the unnatural state at that moment, the cursor disappears; if they remain in the unnatural state for, e.g., 2 seconds or more, a four-way, eight-way, or multi-way menu centered on the cursor can be presented on the display medium.
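The duration-dependent behavior just described (cursor at once, menu after the eyes remain in the unnatural state for about 2 seconds) can be sketched as a small timer. The 2-second delay follows the example above; the returned UI states are hypothetical placeholders for the device's own rendering calls:

```python
import time

MENU_DELAY_S = 2.0  # duration from the example above

class UnnaturalStateTimer:
    """Track how long the eyes have been in an unnatural state and
    decide which UI element to show: 'none', 'cursor', or 'menu'."""

    def __init__(self) -> None:
        self.started_at: float | None = None

    def update(self, unnatural: bool) -> str:
        if not unnatural:
            self.started_at = None  # eyes back to natural: cursor disappears
            return "none"
        if self.started_at is None:
            self.started_at = time.monotonic()  # unnatural state begins: show cursor
        held = time.monotonic() - self.started_at
        return "menu" if held >= MENU_DELAY_S else "cursor"
```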
In one embodiment, if it is determined that the user's eyes are in an unnatural state, a predetermined interactive interface may be presented on the display medium of a head-mounted interactive device (e.g., glasses) worn by the user. The predetermined interactive interface may be, for example, a function selection interface, an aiming interface, a sign recognition interface, an object selection interface, or a scene facility operation interface. The function selection interface may be used to select different functions (e.g., keys, buttons, icons, menus); the aiming interface may be used to aim a cursor on the display medium at a virtual or real object; the sign recognition interface may be used to recognize signs present in the real scene; the object selection interface may be used to select objects present in the real scene; and the scene facility operation interface may be used to operate facilities in a scene.
In one embodiment, the predetermined interactive interface has a cursor (also referred to as a sight) for selecting a virtual object or a real object. The cursor may always be presented at a predetermined location on the display medium of the head-mounted interactive device, such as its center. The virtual objects may be, for example, keys, function keys, icons, text, images, or videos presented on the display medium, and the real objects may be, for example, signs or electronic devices in the real scene.
To select a virtual object on the interactive interface with the cursor, the spatial position of the virtual object may be defined, while the cursor remains at a predetermined location on the display medium, such as its center. In this way, when the user rotates the head, the relative positional relationship between the cursor and the virtual object changes, so that the corresponding virtual object on the interactive interface can be selected with the cursor.
In one embodiment, absolute spatial position information of a virtual object in the real scene may be defined, that is, the physical position of the virtual object in the real environment (for example, its spatial position in a global coordinate system or within a certain building), and whether the cursor is aligned with a virtual object is determined from the real-time position and posture information (pose information for short) of the head-mounted interactive device worn by the user. For example, from the real-time pose information of the device and the absolute spatial position of the virtual object, it is first determined whether the virtual object is currently presented on the display medium; if so, it is further judged whether the presentation position of the cursor coincides with that of the virtual object, and hence whether the virtual object is selected by the cursor. The real-time pose information of the head-mounted interactive device can be determined using various existing techniques.
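One possible realization of this alignment test is to project the virtual object's absolute spatial position into display coordinates using the device's real-time pose and check whether the projection lands on the cursor. The sketch below assumes a pinhole projection model; the pose convention, the intrinsics matrix K, and the pixel tolerance are illustrative assumptions rather than details given in this description:

```python
import numpy as np

def cursor_aligned(obj_world: np.ndarray,  # (3,) object position in the scene
                   R: np.ndarray,          # (3,3) device orientation (device-to-world)
                   t: np.ndarray,          # (3,) device position in the world
                   K: np.ndarray,          # (3,3) assumed display intrinsics
                   cursor_px: np.ndarray,  # (2,) cursor position, e.g. display center
                   tol_px: float = 20.0) -> bool:
    """Return True if the virtual object projects onto the cursor."""
    p_dev = R.T @ (obj_world - t)    # world -> device coordinates
    if p_dev[2] <= 0:
        return False                 # object is behind the display: not presented
    uv = (K @ p_dev)[:2] / p_dev[2]  # perspective projection to pixels
    return float(np.linalg.norm(uv - cursor_px)) < tol_px
```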
In one embodiment, relative position information between the virtual object and the head-mounted interactive device may be defined, for example a spatial position relative to the device, the human body, or a part of the body, as distinct from an absolute spatial position. When the virtual object is first presented on the display medium, it may then be rendered according to this relative position information. For example, when a virtual object is first rendered, its position may be defined as 3 meters directly in front of the head-mounted interaction device; thereafter the virtual object may be selected with the cursor according to the pose change information, or only the posture (orientation) change information, of the device. The head-mounted interaction device may include means for sensing its position and posture changes, such as inertial sensors (an IMU).
Virtual objects with relative spatial positions may be used to present an interface of personal interaction functions that control functions of the head-mounted interaction device itself, such as: starting/stopping the device's clock, starting/stopping a timer, initiating telephone dialing, or starting/stopping video recording. Virtual objects with absolute spatial positions may be used to present an interface of scene interaction functions for on-site scene control, for example: recognizing signs deployed in the scene, switching lights in the scene on or off, selecting air-conditioning functions in the scene, or operating electrical devices in the scene.
In one embodiment, the unnatural state of the user's eyes includes the user closing one eye; it is then determined, while that eye remains closed, whether a cursor on the display medium corresponding to the other eye is aligned with a virtual or real object. Only one eye is therefore used during aiming, which avoids binocular parallax and improves the efficiency and accuracy of the aiming process. In this way, the act of closing one eye (i.e., putting the eyes in an unnatural state) not only launches the interactive interface but also lets the cursor on the display medium be aimed accurately at the virtual or real object through a single eye, so that the user can select the object more accurately and efficiently, significantly improving the efficiency and convenience of human-computer interaction.
When the cursor on the predetermined interactive interface is aligned with a virtual or real object, an interactive interface associated with that object may be presented. In one embodiment, when the cursor is aligned with a virtual or real object, it may further be judged whether a predetermined condition is met, and the associated interactive interface is presented only if it is. The predetermined condition may be, for example, that the user's previously closed eye opens, that the user blinks (e.g., the previously open eye blinks while the previously closed eye remains closed), or that the object stays aligned for a predetermined length of time (e.g., 1 second). In one embodiment, where alignment must be held for a predetermined length of time, prompt information may be provided to the user, for example a progress bar or other visual prompt on the interactive interface, a voice prompt, and so on.
In one embodiment, if it is determined that the user's eyes are in an unnatural state, a predetermined interactive interface is presented on the display medium of the head-mounted interactive device (e.g., glasses) corresponding to the user's less-closed eye. For example, if one of the user's eyes is open and the other is closed or substantially closed, the predetermined interactive interface is presented on the display medium corresponding to the open eye.
In one embodiment, if the degree of closure of the user's left eye is determined to be higher, a first predetermined interactive interface may be presented on the display medium of the head-mounted interactive device corresponding to the user's right eye; and/or, if the degree of closure of the right eye is determined to be higher, a second predetermined interactive interface may be presented on the display medium corresponding to the user's left eye.
The first and second predetermined interactive interfaces may be different, but may also be the same. The interactive interface may carry various virtual objects, such as cursors, keys, function keys, icons, operation buttons, text, and images. In one embodiment, the first predetermined interactive interface is an interface for personal interaction functions, which may include, for example, timing, recording, telephony, and clock functions, and the second predetermined interactive interface is an interface for scene interaction functions, such as interacting with objects in the real scene; or vice versa. An object in the real scene may be any recognizable marker deployed in the scene, such as a bar code, a QR code, a light-emitting marker, or an optical communication device.
The display medium may be any device that can be used to provide an information display function, such as a display screen, a prism, a lens, a mirror, a transparent object (e.g., glass), etc.
FIG. 3 shows a method for controlling an electronic device by detecting the state of a user's eyes according to one embodiment, comprising the following steps (some are similar to those in FIG. 2 and are not described again here):
step 301: current state information of the left eye of the user is detected.
Step 302: current state information of the right eye of the user is detected.
Step 303: comparing the current state information of the left eye with the current state information of the right eye.
Step 304: determining from the comparison result whether the user's eyes are in an unnatural state.
Step 305: if the user's eyes are determined to be in the unnatural state, presenting a predetermined interactive interface on the display medium of a head-mounted interactive device worn by the user.
Step 306: the current state information of the left eye of the user and the current state information of the right eye of the user are continuously detected.
Step 307: comparing the current state information of the left eye and the current state information of the right eye.
Step 308: determining from the comparison result whether the user's eyes are in an unnatural state.
Step 309: if it is determined that the user's eyes are no longer in an unnatural state, deactivating or closing the predetermined interactive interface on the display medium of the head-mounted interactive device, presenting another interactive interface, or performing another function.
In one embodiment, the other interactive interface presented, or the other function performed, is related to the operation performed by the user, e.g., to a virtual or real object selected by the user, or to the location at which the user is looking.
In this way, the predetermined interactive interface is launched on the display medium of the head-mounted interactive device when the user's eyes are in the unnatural state; when the eyes are no longer in the unnatural state, the interface is deactivated or closed, another interactive interface is presented, or another function is executed, improving the convenience and efficiency of the interaction.
FIG. 4 illustrates an interactive interface presented on the display medium of a head-mounted interactive device according to one embodiment, listing 8 function keys: take photo, record video, share, note, schedule, phone, clock, and time. A crosshair-shaped cursor (also referred to as a sight) is also presented on the display medium for selecting any of these function keys. It will be appreciated that the interactive interface and the cursor may take different forms and that the functions presented on the interface may differ.
In one embodiment, when a user wearing the head-mounted interactive device rotates the head, the relative position between the cursor and the function keys changes, so that a function key in the interactive interface can be selected with the cursor by rotating the head. In one embodiment, when the function keys are presented on the display medium, their positional relationship with respect to the user or the device (i.e., their relative spatial positions) may be defined, for example, 3 meters directly in front of the user, while the cursor is always presented at a predetermined location on the display medium, such as its center. When the user rotates the head, the relative positional relationship between the cursor and a function key changes, so that the corresponding function key on the interactive interface can be selected with the cursor. It will be appreciated that virtual objects other than function keys may be selected in a similar manner.
FIG. 5 shows an interaction process according to one embodiment, which includes four interaction stages (A), (B), (C), (D).
In interaction stage (A), the user puts both eyes in an unnatural state, e.g., closes one eye, so that an interactive interface is presented on the display medium of the head-mounted interaction device corresponding to the user's other eye. The display medium is rectangular; the interactive interface carries 8 function keys and a crosshair-shaped cursor for selecting any of them.
In the interaction stage (B), the user changes the relative positional relationship between the cursor and the function key by rotating the head.
In interaction stage (C), the user continues to turn the head to select the function key "time" with the cursor; the key is highlighted once selected.
In interaction stage (D), the user returns both eyes to the natural state, e.g., by reopening the previously closed eye; at this point a timing operation is initiated and timing information is presented on the display medium of the head-mounted interaction device. It will be appreciated that the timing operation need not start immediately when both eyes return to the natural state; it may instead start after a delay (e.g., 3 seconds).
FIG. 6 shows a method for controlling an electronic device by detecting the state of a user's eyes according to one embodiment, comprising the following steps (some are similar to those in FIG. 2 and are not described again here):
step 601: current state information of the left eye of the user is detected.
Step 602: current state information of the right eye of the user is detected.
Step 603: comparing the current state information of the left eye with the current state information of the right eye.
Step 604: determining from the comparison result whether the user's eyes are in an unnatural state.
Step 605: if the user's eyes are determined to be in the unnatural state, presenting a predetermined interactive interface on the display medium of the device worn by the user, the predetermined interactive interface having a cursor for selecting an object in the real scene.
The object in the real scene may be, for example, a marker in the real scene, an electronic device, or the like.
Step 606: it is determined whether the cursor is directed at an object in a real scene.
In one embodiment, image recognition techniques may be used to determine whether the cursor is aimed at an object in the real scene, such as a sign in the real scene.
In one embodiment, the position information of the relevant objects in the scene may be calibrated or stored in advance, and whether the cursor is aligned with an object in the real scene is determined from the real-time position and posture information (pose information for short) of the head-mounted interactive device worn by the user. For example, a scene may contain multiple electronic devices that can be controlled or operated. In this case their positions may be stored in advance; a ray emitted from the head-mounted interactive device can then be determined in space from the device's real-time pose information and the presentation position of the cursor on the display medium, and whether the cursor is currently aimed at a given electronic device is determined by whether the ray passes through it.
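The ray test described above can be sketched as follows, assuming the controllable devices' positions have been stored in advance and that "passes through" is approximated by a small hit radius around each stored position (the radius value is illustrative):

```python
import numpy as np

def aimed_device(ray_origin: np.ndarray,  # (3,) device position from its pose
                 ray_dir: np.ndarray,     # (3,) unit ray through the cursor
                 device_positions: dict[str, np.ndarray],  # pre-stored positions
                 hit_radius: float = 0.2) -> str | None:
    """Return the id of the nearest stored object the ray passes within
    hit_radius of, or None if the cursor is aimed at nothing."""
    best_id, best_t = None, float("inf")
    for dev_id, pos in device_positions.items():
        t = float(np.dot(pos - ray_origin, ray_dir))  # closest-approach parameter
        if t <= 0:
            continue                                  # object is behind the user
        miss = np.linalg.norm(ray_origin + t * ray_dir - pos)
        if miss < hit_radius and t < best_t:
            best_id, best_t = dev_id, t
    return best_id
```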
In one embodiment, where the user closes one eye, it is determined whether the cursor on the display medium corresponding to the other eye is aimed at an object in the real scene. Only one eye is therefore used during aiming, avoiding binocular parallax so that the user can select objects in the real scene more accurately and efficiently.
Step 607: if so, an interactive interface associated with the object is presented.
In one embodiment, when the cursor is determined to be directed at an object in the real scene, an interactive interface associated with the object may be presented on the display medium corresponding to the user's non-closed eyes.
In one embodiment, when the cursor is determined to be aimed at an object in the real scene, it may further be determined whether the user's eyes have returned to the natural state, and the interactive interface associated with the object is presented only if they have. Thus, step 607 may comprise: if the cursor is aimed at the object and the user's eyes are determined to be in the natural state, displaying an interactive interface associated with the object.
FIG. 7 shows an interaction process including four interaction phases (A), (B), (C), (D), according to one embodiment.
In interaction stage (A), the user puts both eyes in an unnatural state, e.g., closes one eye, so that an interactive interface is presented on the display medium of the head-mounted interactive device corresponding to the user's other eye. The interface carries a crosshair-shaped cursor (also called a sight) for aiming, which can be used to select an object in the real scene. The objects shown in FIG. 7 are two square signs located in the real scene, e.g., optical communication devices, through which the user can exchange information via the head-mounted interaction device.
In interaction stage (B), the user changes the relative positional relationship between the cursor and the signs by rotating the head.
In interaction stage (C), the user continues to turn the head to select one of the signs with the cursor.
In interaction stage (D), an interactive interface associated with the sign is displayed. The sign may, for example, be installed in a restaurant, and its interactive interface has a number of function keys, including: queueing, ordering, reservations, reviews, points, recommendations, discounts, and dishes. It is to be understood that this interactive interface is merely an example and may take different forms or offer different functions.
Thereafter, the user may continue to use the cursor to select a function key in the interactive interface, in a manner similar to FIG. 5, which is not described again here.
FIG. 8 illustrates a method for controlling an electronic device by detecting the state of a user's eyes according to one embodiment, in which reference state information of the user's eyes is further introduced to avoid or mitigate misjudgments that may be caused by differences between the user's own eyes (e.g., a slight difference in the sizes of the left and right eyes in the natural state) and by errors between different detection devices. The method may include the following steps (some are similar to those in FIG. 2 and are not described again here):
step 801: current state information of the left eye of the user is detected.
Step 802: current state information of the right eye of the user is detected.
Step 803: based on the reference state information of the left eye, the current state information of the left eye is processed.
Step 804: the current state information of the right eye is processed based on the reference state information of the right eye.
Reference state information of the user's eyes may be obtained and stored in advance. In one embodiment, the reference state information may include one or both of the following: reference state information when the user's eye is naturally open, and reference state information when it is naturally closed.
Based on the reference state information of the left or right eye, the current state information of that eye can be processed, e.g., adjusted, corrected, or normalized, so as to avoid or mitigate the negative effects that differences between the two eyes and errors between different detection devices might cause.
Step 805: comparing the current state information of the left eye with the current state information of the right eye.
After processing the current state information of the left and right eyes, the current state information after the processing may be compared.
Step 806: determining from the comparison result whether the user's eyes are in an unnatural state.
Step 807: if the user's eyes are determined to be in the unnatural state, activating the corresponding function of the electronic device.
In one embodiment, the method further comprises: determining reference state information of the user's left eye and reference state information of the user's right eye. In one embodiment, the reference state information of the user's eyes may be determined through a calibration process. For example, a calibration process may be performed each time the user puts on the head-mounted interactive device (e.g., smart glasses) to obtain the reference state information of the user's eyes.
The user may or may not be prompted when the calibration process is performed. In one embodiment, the user may be prompted to open the eyes naturally, and the reference state information with the eyes naturally open is detected by the detection device; the user may then be prompted to close the eyes naturally, and the reference state information with the eyes naturally closed is detected. In one embodiment, a detection device may instead collect state information of the user's eyes over a period of time and generate the reference state information from it.
In one embodiment, the reference state information of the user's left and right eyes may be determined at intervals, or continuously, for example using a sliding time window.
In one embodiment, the reference state information of the user's left eye may be determined by a first detection device and/or the reference state information of the right eye by a second detection device.
In one embodiment, the reference state information includes reference state information when the eyes are naturally open and reference state information when the eyes are naturally closed, and the processing the current state information of the left eye based on the reference state information of the left eye includes: processing current state information of the left eye based on the reference state information when the left eye is naturally opened and the reference state information when the left eye is naturally closed; the processing the current state information of the right eye based on the reference state information of the right eye includes: the current state information of the right eye is processed based on the reference state information when the right eye is naturally open and the reference state information when the right eye is naturally closed.
For example, assume the reference state information when the user's left eye is naturally open is 0.9 and when naturally closed is 0.1, while for the right eye the values are 0.8 and 0 respectively. If the current state information of the left eye is 0.6 and of the right eye 0.5, the left eye's value may be processed as (0.6-0.1)/(0.9-0.1) = 5/8 and the right eye's as (0.5-0)/(0.8-0) = 5/8. Although the raw left-eye and right-eye readings differed, the processed values no longer do, so the user's two eyes can be considered to be currently in a natural state.
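The normalization worked through in this example amounts to rescaling each eye's raw reading onto [0, 1] using that eye's own calibrated references; the sketch below simply reproduces the arithmetic above:

```python
def normalize(raw: float, ref_closed: float, ref_open: float) -> float:
    """Rescale a raw eye-state reading so that 0 corresponds to this eye's
    naturally closed reference and 1 to its naturally open reference."""
    return (raw - ref_closed) / (ref_open - ref_closed)

# Values from the example above:
left = normalize(0.6, ref_closed=0.1, ref_open=0.9)   # = 5/8
right = normalize(0.5, ref_closed=0.0, ref_open=0.8)  # = 5/8
assert abs(left - right) < 1e-9  # no difference after processing -> natural state
```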
In some of the embodiments above, presenting the predetermined interactive interface on the display medium of the head-mounted interactive device is taken as an example, but the invention is not limited thereto; any desired function may be initiated when the user's eyes are determined to be in the unnatural state, including, for example: a screen sliding or moving function; a virtual object moving function; a direction selection function; and so on.
For the screen sliding or moving function, the direction of sliding or movement may be determined from the rotation direction of the head-mounted interactive device worn by the user while the user's eyes remain in the unnatural state. For example, when the user wants to scroll a screen, the user may put both eyes in an unnatural state to start the function and then rotate the head (i.e., rotate the device) while holding that state to perform the sliding or movement. Further, the magnitude or speed of the sliding or movement may be determined from the rotation magnitude or speed of the device while the eyes are in the unnatural state. The screen to be slid or moved may be chosen in various ways, for example with the cursor.
For the virtual object moving function, the direction in which the virtual object moves may be determined from the rotation direction (i.e., the posture change information) of the head-mounted interactive device while the user's eyes remain in the unnatural state. In one embodiment, the spatial position of the virtual object may also be changed according to the pose change information of the device while the eyes are in the unnatural state. In one embodiment, when the virtual object is moved, a virtual "rigid connection" between the object and the device is established, so that when the position or posture of the device changes, the spatial position of the object changes accordingly. The user may choose the virtual object to be moved in various ways: for example, selecting it with the cursor, then putting both eyes in an unnatural state to activate the moving function, and then turning the head (i.e., turning the device) or otherwise changing the device's pose while holding that state to move the object in space. While the object is being moved, its magnitude or speed of movement may be determined from the rotation magnitude or speed of the device.
For the direction selection function, the selected direction may be determined from the turning direction of the head-mounted interactive device while the user's eyes remain in the unnatural state; this may be used, for example, to control the movement of a virtual object (e.g., a virtual character) or a real object. Likewise, while the eyes are in the unnatural state, the magnitude or speed of that movement may be determined from the rotation magnitude or speed of the device. In one embodiment, the head-mounted interactive device may serve as a control stick whose zero point can be determined in various ways: from the initial pose of the device when the joystick function is activated, by taking a certain position on the screen as the zero point, by letting the user select it, or by dynamically re-determining it during operation (e.g., taking the real-time position of a virtual character moving on the screen as the zero point, or taking the position where the cursor stops, or substantially stops, after each movement as a new zero point). During control, the real-time difference between the zero point and the virtual object yields a vector with a magnitude and a direction; the vector's direction may determine the object's direction of movement, and its magnitude may determine the object's speed or acceleration.
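As an illustration of this joystick behavior, the offset between the zero point and the current position can be turned into a velocity whose direction steers the controlled object and whose magnitude sets its speed. The gain factor and the use of generic vectors are assumptions for illustration:

```python
import numpy as np

SPEED_GAIN = 1.5  # assumed scale factor from offset magnitude to speed

def joystick_velocity(zero_point: np.ndarray, current: np.ndarray) -> np.ndarray:
    """Map the real-time offset from the zero point to a velocity for the
    controlled virtual or real object."""
    offset = current - zero_point
    magnitude = float(np.linalg.norm(offset))
    if magnitude == 0.0:
        return np.zeros_like(offset)  # at the zero point: no movement
    direction = offset / magnitude    # vector direction -> movement direction
    return direction * (magnitude * SPEED_GAIN)  # vector magnitude -> speed
```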
In one embodiment, the function may be exited or stopped, or other functions initiated, when the user's eyes are no longer in an unnatural state.
In one embodiment, when the user's eyes are no longer in the unnatural state (for example, when the previously closed eye opens), the direction and speed of the screen sliding or movement, or of any moving virtual object, at that moment may be determined and taken as the initial direction and speed, so that the screen or virtual object continues to slide or move under a preset condition. The preset condition may be, for example, a preset damping coefficient that gradually slows the sliding or movement of the screen or the movement of the virtual object, or a preset gravity parameter that lets the virtual object continue, for example, along a parabolic path.
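A minimal sketch of this release-and-coast behavior, assuming exponential decay of the release velocity; the damping coefficients, including the stronger braking value applied when the eyes re-enter the unnatural state (described in the next paragraph), are illustrative:

```python
import numpy as np

DAMPING = 3.0         # assumed per-second damping coefficient
BRAKE_DAMPING = 12.0  # larger coefficient applied when the user closes one eye again

def coast(velocity: np.ndarray, dt: float, braking: bool = False) -> np.ndarray:
    """Decay the velocity each frame so the screen or virtual object
    gradually slows after the eyes return to the natural state."""
    k = BRAKE_DAMPING if braking else DAMPING
    return velocity * np.exp(-k * dt)
```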
In one embodiment, while the screen continues to slide or move, or the virtual object continues to move, it may be detected whether the user's eyes are again in an unnatural state (e.g., one eye closed); if so, the sliding or movement may be stopped, or a larger damping coefficient applied so that it stops more quickly. The screen or virtual object to be stopped may be selected with the cursor, or in other ways, for example by defaulting to the one the user last operated.
In this way, a predetermined function can be started when the user's eyes are in the unnatural state, and the function can be exited or stopped, or another function started, when the eyes are no longer in the unnatural state, improving the convenience and efficiency of the interaction.
In one embodiment, it is also possible to detect only a single eye, or to detect both eyes without comparing their detection results. FIG. 9 shows a method for controlling an electronic device by detecting the state of a user's eye according to one embodiment, which may comprise the following steps:
step 901: current state information of any one eye of the user is detected.
Step 902: determining from the current state information whether the eye is closed.
Step 903: if the eye is determined to be closed, presenting a predetermined interactive interface on the display medium of the head-mounted interactive device worn by the user that corresponds to the other eye.
Step 904: if the eye is no longer closed, performing one or more of the following operations:
deactivating or closing the predetermined interactive interface on a display medium of the head-mounted interactive device;
presenting another interactive interface; or
performing another function.
In this way, a predetermined function may be activated when one of the user's eyes is closed, and the function may be exited or stopped, or another function activated, when the eye is no longer closed, thereby improving the convenience and efficiency of the interaction.
In one embodiment, the predetermined interactive interface has a cursor for selecting a virtual object or a real object. In one embodiment, the cursor is always presented at a predetermined location on the display medium of the head-mounted interactive device. The virtual object may have an absolute position in space or a position relative to the head-mounted interaction device.
In one embodiment, the method further comprises: determining whether the cursor is directed at a virtual object or a real object with the eye remaining closed.
In one embodiment, the method further comprises: if the cursor is aimed at a virtual or real object, presenting an interactive interface associated with it; and/or presenting an interactive interface associated with a virtual or real object if the cursor is aimed at it and a predetermined condition is met, wherein the predetermined condition comprises one or more of the following: the user's previously closed eye opens; the user blinks; the virtual or real object stays aimed at for a predetermined length of time.
In one embodiment, if the user's left eye is closed, a first predetermined interactive interface is presented on the display medium of the head-mounted interactive device corresponding to the user's right eye; and/or, if the user's right eye is closed, a second predetermined interactive interface is presented on the display medium corresponding to the user's left eye. In one embodiment, the first predetermined interactive interface is an interface for personal interaction functions and the second predetermined interactive interface is an interface for scene interaction functions; or vice versa.
In one embodiment, the method further comprises processing the detected current state information of the user's eyes based on reference state information of the user's eyes. The reference state information of the user's eyes may be obtained and used in a similar manner as above.
Similarly, the various methods implemented above through binocular comparison are equally applicable where only one eye is detected, or where both eyes are detected but their results are not compared. For example, in one embodiment, a method of controlling an electronic device by detecting the state of a user's eye comprises: detecting current state information of either eye of a user; determining from the current state information whether the eye is closed; and if the eye is determined to be closed, activating a corresponding function of the electronic device, the function comprising: a screen sliding or moving function; a virtual object moving function; or a direction selection function. In one embodiment, for the screen sliding or moving function, the direction of sliding or movement is determined from the rotation direction of the head-mounted interactive device while the eye is closed; for the virtual object moving function, the direction of movement of the virtual object is determined from the rotation direction of the device while the eye is closed; and for the direction selection function, the selected direction is determined from the rotation direction of the device while the eye is closed.
In one embodiment, there is provided an apparatus for performing an operation by detecting a state of a user's eye, comprising: one or more detection devices for detecting status information of the user's eyes; one or more display media for presenting an interactive interface; and a processor for implementing the methods described herein. The apparatus may be a head-mounted interactive device, and the processor may be integrated into a body of the head-mounted interactive device (e.g., a frame of smart glasses) or may be coupled to the body of the head-mounted interactive device as a discrete device by wire or wirelessly. The one or more detection devices may include: a first detection device for detecting status information of a left eye of a user; and a second detection device for detecting status information of the right eye of the user.
In one embodiment of the invention, the invention may be implemented in the form of a computer program. The computer program may be stored in various storage media (e.g., a hard disk, an optical disc, or flash memory) and, when executed by a processor, can be used to implement the methods of the present invention.
In another embodiment of the present invention, the present invention may be implemented in the form of an electronic device. The electronic device comprises a processor and a memory in which a computer program is stored which, when being executed by the processor, can be used for carrying out the method of the invention.
References herein to "various embodiments," "some embodiments," "one embodiment," or "an embodiment," etc., indicate that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Thus, appearances of the phrases "in various embodiments," "in some embodiments," "in one embodiment," or "in an embodiment" in various places throughout this document are not necessarily referring to the same embodiment. Furthermore, particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. Thus, a particular feature, structure, or characteristic illustrated or described in connection with one embodiment may be combined, in whole or in part, with the features, structures, or characteristics of one or more other embodiments without limitation, so long as the combination is not logically inconsistent or unworkable. Expressions herein similar to "according to A," "based on A," "by A," or "using A" are non-exclusive; that is, "according to A" may cover "according to A only" as well as "according to A and B," unless it is specifically stated to mean "according to A only." In the present application, some illustrative operational steps are described in a certain order for clarity, but one skilled in the art will appreciate that not every one of these steps is essential; some may be omitted or replaced by others. Nor is it necessary that the operations be performed sequentially as shown; some may instead be performed in a different order, or in parallel, as desired, provided the new implementation remains logically and operationally feasible.
Having thus described several aspects of at least one embodiment of this invention, it is to be appreciated that various alterations, modifications, and improvements will readily occur to those skilled in the art. Such alterations, modifications, and improvements are intended to be within the spirit and scope of the invention. Although the present invention has been described in connection with some embodiments, it is not intended that the invention be limited to the embodiments described herein, and various changes and modifications may be made without departing from the scope of the present invention.

Claims (31)

1. A method of controlling an electronic device by detecting a state of a user's eyes, comprising:
detecting current state information of the left eye of a user;
detecting current state information of the right eye of the user;
comparing the current state information of the left eye with the current state information of the right eye;
determining whether the eyes of the user are in an unnatural state according to the comparison result; and
starting the corresponding function of the electronic device when it is determined that the eyes of the user are in the unnatural state.
2. The method of claim 1, further comprising: determining a duration for which the eyes of the user are in the unnatural state, and wherein the starting the corresponding function of the electronic device when it is determined that the eyes of the user are in the unnatural state comprises: when it is determined that the eyes of the user are in the unnatural state, starting the corresponding function of the electronic device according to the duration for which the eyes of the user have been in the unnatural state.
3. The method of claim 1 or 2, wherein the electronic device is a head-mounted interactive device worn by the user, and wherein starting the corresponding function of the electronic device comprises: presenting a predetermined interactive interface on a display medium of the head-mounted interactive device worn by the user.
4. The method of claim 3, wherein the predetermined interactive interface is presented on the display medium of the head-mounted interactive device corresponding to the less-closed eye of the user.
5. The method of claim 3, wherein,
if the degree of closure of the left eye of the user is higher, presenting a first predetermined interactive interface on the display medium of the head-mounted interactive device corresponding to the right eye of the user; and/or
if the degree of closure of the right eye of the user is higher, presenting a second predetermined interactive interface on the display medium of the head-mounted interactive device corresponding to the left eye of the user.
6. The method of claim 5, wherein the first predetermined interactive interface is an interface regarding a personal interactive function, and the second predetermined interactive interface is an interface regarding a scene interactive function; or vice versa.
7. The method of claim 6, wherein the scene interaction function is used to interact with objects in a real scene.
8. The method of claim 3, further comprising:
continuously detecting the current state information of the left eye of the user and the current state information of the right eye of the user;
comparing the current state information of the left eye with the current state information of the right eye;
determining whether the eyes of the user are in an unnatural state according to the comparison result;
upon determining that the eyes of the user are no longer in the unnatural state, performing one or more of the following operations:
deactivating or closing the predetermined interactive interface on the display medium of the head-mounted interactive device;
presenting another interactive interface; or
performing other functions.
9. The method of claim 8, wherein the other interactive interface presented or other function performed is related to an operation performed by a user.
10. The method of claim 3, wherein the predetermined interactive interface has a cursor thereon for selecting a virtual object or a real object.
11. The method of claim 10, wherein the cursor is always presented at a predetermined location on a display medium of the head-mounted interactive device.
12. The method of claim 10, wherein the virtual object has an absolute position in space or a position relative to the head-mounted interactive device.
13. The method of claim 10, wherein the user's eyes being in an unnatural state comprises one of the user's eyes being closed, the method further comprising:
determining, while one eye of the user is closed, whether a cursor on the display medium corresponding to the other eye of the user is aligned with a virtual object or a real object.
14. The method of claim 13, further comprising:
if the cursor is aligned with a virtual object or a real object, presenting an interactive interface associated with that virtual object or real object; and/or
if the cursor is aligned with a virtual object or a real object and a predetermined condition is satisfied, presenting an interactive interface associated with that virtual object or real object.
15. The method of claim 14, wherein the predetermined condition comprises one or more of:
the previously closed eye of the user is opened;
the user blinks; or
the cursor remains aligned with the virtual object or real object for a predetermined length of time.
16. The method of claim 1 or 2, wherein the initiated function comprises:
a screen picture sliding or moving function;
a virtual object moving function; or
a direction selection function.
17. The method of claim 16, wherein,
for the screen picture sliding or moving function, the direction of screen picture sliding or moving is determined according to the rotation direction of the head-mounted interactive device while the eyes of the user are in the unnatural state;
for the virtual object moving function, the direction in which the virtual object moves is determined according to the rotation direction of the head-mounted interactive device while the eyes of the user are in the unnatural state; or
for the direction selection function, the selected direction is determined according to the rotation direction of the head-mounted interactive device while the eyes of the user are in the unnatural state.
18. The method of claim 17, further comprising: determining, while the eyes of the user are in the unnatural state, the speed of the screen picture sliding or moving, or the speed of the virtual object moving, according to the rotation amplitude or rotation speed of the head-mounted interactive device.
19. The method of claim 18, further comprising:
when the eyes of the user are no longer in the unnatural state, determining the direction and speed of the screen picture sliding or moving at that moment, or the direction and speed of the virtual object moving at that moment; and
taking that direction and speed as the initial direction and speed, so that the screen picture or the virtual object continues to slide or move under a predetermined condition.
20. The method of claim 19, wherein, while the screen picture or the virtual object continues to slide or move, if the eyes of the user are detected to be in the unnatural state again, the sliding or moving is stopped, or a larger damping coefficient is applied to the sliding or moving so that it can be stopped more quickly.
21. The method of claim 16, further comprising:
exiting or stopping the function, or starting another function, when the eyes of the user are no longer in the unnatural state.
22. The method of claim 1 or 2, further comprising:
processing the current state information of the left eye based on reference state information of the left eye of the user; and
processing the current state information of the right eye based on reference state information of the right eye of the user.
23. The method of claim 22, wherein the reference state information comprises one or more of: reference state information when the eyes are naturally open, and reference state information when the eyes are naturally closed.
24. The method of claim 22, further comprising:
determining the reference state information of the left eye of the user; and
determining the reference state information of the right eye of the user.
25. The method of claim 22, wherein the reference state information includes reference state information when the eyes are naturally open and reference state information when the eyes are naturally closed, and wherein:
the processing the current state information of the left eye based on the reference state information of the left eye of the user comprises: processing the current state information of the left eye based on the reference state information when the left eye is naturally open and the reference state information when the left eye is naturally closed; and
the processing the current state information of the right eye based on the reference state information of the right eye of the user comprises: processing the current state information of the right eye based on the reference state information when the right eye is naturally open and the reference state information when the right eye is naturally closed.
26. A method for performing an operation by detecting a state of a user's eye, comprising:
detecting current state information of either eye of a user;
judging, according to the current state information, whether the eye is closed;
if the eye is determined to be closed, presenting a predetermined interactive interface on the display medium, corresponding to the other eye, of a head-mounted interactive device worn by the user;
if the eye is no longer closed, performing one or more of the following operations:
deactivating or closing the predetermined interactive interface on the display medium of the head-mounted interactive device;
presenting another interactive interface; or
performing other functions.
27. The method of claim 26, wherein the predetermined interactive interface has a cursor thereon for selecting a virtual object or a real object, the method further comprising: determining, while the eye remains closed, whether the cursor is aligned with a virtual object or a real object.
28. A method of controlling an electronic device by detecting a state of a user's eyes, comprising:
detecting current state information of either eye of a user;
judging, according to the current state information, whether the eye is closed;
if the eye is determined to be closed, starting a corresponding function of the electronic device, wherein the function comprises:
a screen picture sliding or moving function;
a virtual object moving function; or
a direction selection function.
29. The method of claim 28, wherein,
for the screen picture sliding or moving function, the direction of screen picture sliding or moving is determined according to the rotation direction of the head-mounted interactive device while the eye is closed;
for the virtual object moving function, the direction in which the virtual object moves is determined according to the rotation direction of the head-mounted interactive device while the eye is closed; or
for the direction selection function, the selected direction is determined according to the rotation direction of the head-mounted interactive device while the eye is closed.
30. An electronic device, comprising:
one or more detection devices for detecting status information of the user's eyes;
one or more display media for presenting an interactive interface; and
a processor for implementing the method of any one of claims 1-29.
31. A storage medium having stored therein a computer program which, when executed by a processor, is operative to carry out the method of any one of claims 1-29.
CN202111141084.7A 2021-09-28 2021-09-28 Electronic device and method of controlling the same by detecting state of user's eyes Pending CN115877942A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111141084.7A CN115877942A (en) 2021-09-28 2021-09-28 Electronic device and method of controlling the same by detecting state of user's eyes


Publications (1)

Publication Number Publication Date
CN115877942A true CN115877942A (en) 2023-03-31

Family

ID=85763290



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination