CN117289789A - Controlling device settings using head gestures - Google Patents

Controlling device settings using head gestures

Info

Publication number
CN117289789A
Authority
CN
China
Prior art keywords
scalar
head pose
user interface
indicator
slider
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310742816.0A
Other languages
Chinese (zh)
Inventor
G·卢特尔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 18/295,772 (published as US 2023/0418371 A1)
Application filed by Apple Inc filed Critical Apple Inc
Publication of CN117289789A publication Critical patent/CN117289789A/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485Scrolling or panning
    • G06F3/04855Interaction with scrollbars

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to controlling device settings using head gestures. A head-mounted device may use head pose changes for user input. In particular, a display in the head mounted device may display a slider bar with an indicator. The slider bar may be a visual representation of a scalar of a device setting such as volume or brightness. The scalar of the device setting and the position of the indicator on the slider bar may be updated based on the head pose change. The direction of movement of the head may correspond to the direction of movement of the indicator in the slider. The scalar of the device settings may be updated only when gaze input from the user targets the slider. The slider may be displayed in response to a gaze input targeting an icon associated with the slider.

Description

Controlling device settings using head gestures
The present application claims priority from U.S. patent application Ser. No. 18/295,772, filed in April 2023, and U.S. provisional patent application Ser. No. 63/355,500, filed June 24, 2022, which are incorporated herein by reference in their entireties.
Background
The present disclosure relates generally to head mounted devices, and more particularly to head mounted devices having a display.
Some electronic devices, such as head-mounted devices, include a display (sometimes referred to as a near-eye display) that is positioned near the user's eyes during operation. The positioning of near-eye displays may make it difficult to provide touch input to these displays. Thus, controlling device settings on a head mounted device may be more difficult than desired.
Disclosure of Invention
An electronic device may include: one or more sensors; one or more displays; one or more processors; and a memory storing instructions configured to be executed by the one or more processors, the instructions for: displaying, using the one or more displays, a user interface element comprising a visual representation of a scalar of the device settings; obtaining head pose information via a first subset of the one or more sensors; updating the scalar for the device setting based on the head pose information; and updating the visual representation of the scalar based on the updated scalar.
Drawings
Fig. 1 is a schematic diagram of an exemplary head mounted device according to some embodiments.
Fig. 2A-2C are diagrams of an exemplary user of a head mounted device showing how the head pose of the user may be defined by yaw angle (yaw), roll angle (roll), and pitch angle (pitch), respectively, according to some embodiments.
Fig. 3A is a view of an exemplary horizontal slider bar representing a scalar for a device setting, according to some embodiments.
Fig. 3B is a view of an exemplary vertical slider bar representing a scalar for a device setting, according to some embodiments.
Fig. 3C is a view of an exemplary radial slider bar representing a scalar of a device setting, according to some embodiments.
FIG. 4 is a diagram of an exemplary user with varying head gestures to adjust a scalar of a device setting, according to some embodiments.
Fig. 5A-5D are views of an exemplary display with a horizontal slider bar that switches between a fixed mode and an adjustable mode based on gaze input, according to some embodiments.
Fig. 6A-6D are views of an exemplary display with icons and radial slider bars selectively displayed based on gaze input, according to some embodiments.
Fig. 7 is a flowchart illustrating an exemplary method performed by a head-mounted device, according to some embodiments.
Detailed Description
In some head-mounted devices, the change in head pose may be used to provide user input to the head-mounted device. In particular, the change in head pose may be used to adjust a slider bar, which is a visual representation of a scalar of a device setting such as speaker volume or display brightness. The scalar of the device setting may be updated based on the head pose of the user while the user is looking at the visual representation of the scalar of the device setting. This provides a way for the user to adjust the device settings without touching the display.
A schematic diagram of an exemplary head mounted device is shown in fig. 1. As shown in fig. 1, a head mounted device 10 (sometimes referred to as an electronic device 10, system 10, head mounted display 10, etc.) may have control circuitry 14. Control circuitry 14 may be configured to perform operations in head-mounted device 10 using hardware (e.g., dedicated hardware or circuitry), firmware, and/or software. Software code and other data for performing operations in the headset 10 are stored on a non-transitory computer-readable storage medium (e.g., a tangible computer-readable storage medium) in the control circuitry 14. Software code may sometimes be referred to as software, data, program instructions, or code. The non-transitory computer-readable storage medium (sometimes referred to simply as memory) may include non-volatile memory such as non-volatile random-access memory (NVRAM), one or more hard disk drives (e.g., magnetic drives or solid-state drives), one or more removable flash drives or other removable media, and the like. Software stored on the non-transitory computer-readable storage medium may be executed on the processing circuitry of the control circuitry 14. The processing circuitry may include application-specific integrated circuits with processing circuitry, one or more microprocessors, digital signal processors, graphics processing units, central processing units (CPUs), or other processing circuitry.
The head mounted device 10 may include input-output circuitry 20. The input-output circuitry 20 may be used to allow data to be received by the headset 10 from external equipment (e.g., tethered computers, portable devices (such as handheld or laptop computers), or other electrical equipment) and to allow a user to provide user input to the headset 10. The input-output circuitry 20 may also be used to gather information about the environment in which the headset 10 is operating. Output components in circuit 20 may allow headset 10 to provide output to a user and may be used to communicate with external electrical equipment.
As shown in fig. 1, the input-output circuit 20 may include a display such as the display 16. The display 16 may be used to display images for a user of the head mounted device 10. The display 16 may be a transparent display so that a user may view physical objects through the display while computer-generated content is overlaid on top of the physical objects by presenting computer-generated images on the display. The transparent display may be formed from an array of transparent pixels (e.g., a transparent organic light-emitting diode display panel) or may be formed from a display device that provides an image to a user through a beam splitter, holographic coupler, or other optical coupler (e.g., a display device such as a liquid crystal on silicon display). Alternatively, the display 16 may be an opaque display that blocks light from physical objects when the user operates the headset 10. In this type of arrangement, a pass-through camera may be used to display physical objects to the user. The pass-through camera may capture images of the physical environment, and the physical environment images may be displayed on the display for viewing by the user. Additional computer-generated content (e.g., text, game content, other visual content, etc.) may optionally be overlaid on the physical environment images, thereby providing an augmented reality environment for the user. When display 16 is opaque, the display may also optionally display entirely computer-generated content (e.g., without displaying images of the physical environment).
The display 16 may include one or more optical systems (e.g., lenses) that allow a viewer to view images on the display 16. A single display 16 may produce images for both eyes or a pair of displays 16 may be used to display images. In a configuration with multiple displays (e.g., left-eye and right-eye displays), the focal length and position of the lens may be selected such that any gap that exists between the displays will not be visible to the user (e.g., such that the images of the left and right displays overlap or merge seamlessly). The display module that generates different images for the left and right eyes of a user may be referred to as a stereoscopic display. The stereoscopic display may be capable of presenting two-dimensional content (e.g., user notifications with text) and three-dimensional content (e.g., simulations of physical objects such as cubes).
The input-output circuitry 20 may include various other input-output devices for gathering data and user inputs and for providing output to a user. For example, the input-output circuit 20 may include a gaze tracker 18 (sometimes referred to as a gaze tracking system or gaze tracking camera).
The gaze tracker 18 may include a camera and/or other gaze tracking system components (e.g., light sources that emit beams of light so that reflections of the beams from the user's eyes may be detected) to monitor the user's eyes. The gaze tracker 18 may face the user's eyes and may track the user's gaze. The camera in the gaze tracking system may determine the location of the user's eyes (e.g., the centers of the user's pupils), may determine the direction in which the user's eyes are oriented (the user's gaze direction), may determine the user's pupil size (e.g., so that light modulation and/or other optical parameters, and/or the amount by which one or more of these parameters is adjusted sequentially and/or spatially, and/or the areas in which one or more of these optical parameters are adjusted, may be based on pupil size), may monitor the current focal length of the lenses in the user's eyes (e.g., whether the user is focusing in the near field or the far field, which may be used to assess whether the user is daydreaming or thinking strategically), and/or may gather other gaze information. Cameras in gaze tracking systems may sometimes be referred to as inward-facing cameras, gaze-detection cameras, eye-tracking cameras, gaze-tracking cameras, or eye-monitoring cameras. If desired, other types of image sensors (e.g., infrared and/or visible light-emitting diodes and photodetectors, etc.) may also be used to monitor the user's gaze. The use of a gaze-detection camera in the gaze tracker 18 is merely illustrative.
As shown in fig. 1, the input-output circuitry 20 may include a position and motion sensor 22 (e.g., a compass, gyroscope, accelerometer, and/or other device for monitoring the position, posture, and movement of the headset 10, satellite navigation system circuitry such as global positioning system circuitry for monitoring the user's position, etc.). For example, the control circuit 14 may monitor a current direction of the user's head orientation relative to the surrounding environment (e.g., the user's head pose) using the sensor 22.
The input-output circuitry 20 may also include other sensors and input-output components (e.g., ambient light sensors, force sensors, temperature sensors, touch sensors, image sensors for detecting gestures or body gestures, buttons, capacitive proximity sensors, light-based proximity sensors, other proximity sensors, strain gauges, gas sensors, pressure sensors, humidity sensors, magnetic sensors, microphones, speakers, audio components, tactile output devices, light emitting diodes, other light sources, wired and/or wireless communication circuitry, etc.), if desired.
The user may use the position and motion sensor 22 to provide user input to the headset 10. In particular, the position and motion sensor 22 may detect changes in head pose (sometimes referred to as head movement) during operation of the headset 10. The head movement may be used to adjust a scalar of the device settings, such as the volume of a speaker or the brightness of a display. Simultaneously with adjusting the scalar based on the head movement, a visual representation of the scalar may be displayed on the display (and updated based on the head movement).
If desired, changes in yaw angle, roll angle, and/or pitch angle of the user's head (and, accordingly, the head mounted device) may be interpreted as user input. Fig. 2A to 2C show how yaw, roll and pitch angles may be defined for a user's head. Fig. 2A-2C illustrate a user 24. In each of fig. 2A-2C, the user is facing in the Z-direction and the Y-axis is aligned with the user's height. The X-axis may be considered the left-right axis of the user's head, the Z-axis may be considered the front-back axis of the user's head, and the Y-axis may be considered the vertical axis of the user's head. The X-axis may be referred to as extending from the left ear of the user to the right ear of the user, from the left side of the user's head to the right side of the user's head, etc. The Z-axis may be referred to as extending from the back of the user's head to the front of the user's head (e.g., to the user's face). The Y-axis may be referred to as extending from the bottom of the user's head to the top of the user's head.
As shown in fig. 2A, the yaw angle may be defined as rotating about a vertical axis (e.g., the Y-axis in fig. 2A-2C). As the user's head rotates in direction 26, the yaw angle of the user's head changes. Yaw angle is sometimes alternatively referred to as heading. The user's head may change the yaw angle by rotating right or left about a vertical axis. Rotation to the right about a vertical axis (e.g., an increase in yaw angle) may be referred to as a right head movement. Rotation to the left about a vertical axis (e.g., yaw angle decrease) may be referred to as a left head movement.
As shown in fig. 2B, the roll angle may be defined as rotating about a front-to-back axis (e.g., the Z-axis in fig. 2A-2C). As the user's head rotates in direction 28, the roll angle of the user's head changes. The user's head may change roll angle by rotating to the right or left about the front-to-back axis. Rotation to the right about the anterior-posterior axis (e.g., increased roll angle) may be referred to as right head movement. Rotation to the left about the fore-aft axis (e.g., roll angle reduction) may be referred to as left head movement.
As shown in fig. 2C, the pitch angle may be defined as rotation about a left-right axis (e.g., the X-axis in fig. 2A-2C). As the user's head rotates in direction 30, the pitch angle of the user's head changes. The user's head may change the pitch angle by rotating up or down about the left-right axis. Downward rotation about the left-right axis (e.g., decreasing pitch angle, following the right arrow in direction 30 in fig. 2C) may be referred to as downward head movement. Upward rotation about the left-right axis (e.g., increasing pitch angle, following the left arrow in direction 30 in fig. 2C) may be referred to as upward head movement.
It should be appreciated that the position and motion sensor 22 may directly determine the pose, movement, yaw angle, pitch angle, roll angle, etc. of the headset 10, and that the headset is assumed to be mounted on the user's head. Thus, references herein to head pose, head movement, yaw angle of the user's head, pitch angle of the user's head, roll angle of the user's head, etc. may be considered interchangeable with references to device pose, device movement, yaw angle of the device, pitch angle of the device, roll angle of the device, etc.
At any given time, the position and motion sensor 22 (and/or the control circuit 14) may determine the yaw, roll, and pitch angles of the user's head. The yaw angle, roll angle, and pitch angle of the user's head may collectively define the head pose of the user. The detected change in head pose may be used as a user input to the head mounted device 10.
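For concreteness, the head pose described above may be modeled as three angles. The following sketch is purely illustrative and is not taken from the patent; the Swift type and names (HeadPose, poseDelta) are assumptions.

```swift
// Illustrative sketch only: a head pose sampled via position and motion
// sensor 22 can be modeled as three rotation angles, in degrees.
struct HeadPose {
    var yaw: Double    // rotation about the vertical (Y) axis
    var roll: Double   // rotation about the front-back (Z) axis
    var pitch: Double  // rotation about the left-right (X) axis
}

// Change in pose relative to a reference pose (e.g., the pose captured when
// user input to a slider begins).
func poseDelta(current: HeadPose, reference: HeadPose) -> HeadPose {
    HeadPose(yaw: current.yaw - reference.yaw,
             roll: current.roll - reference.roll,
             pitch: current.pitch - reference.pitch)
}
```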
In particular, the change in head pose may be used to adjust a scalar of a device setting. Additionally, the change in head pose may be used to adjust a visual representation of the scalar of the device setting. Herein, it may be assumed that an adjustment of the scalar of a device setting is accompanied by an adjustment of the visual representation of that scalar, and vice versa (even when not explicitly mentioned). One example of a visual representation of a scalar of a device setting is a slider bar. Fig. 3A-3C are views of exemplary slider bars 32 that may be used to represent a scalar of a device setting. Fig. 3A shows a horizontal slider bar 32-H with an indicator that moves horizontally. Fig. 3B shows a vertical slider bar 32-V with an indicator that moves vertically. Fig. 3C shows a radial slider bar 32-R with an indicator that moves radially. Each of the horizontal slider 32-H, vertical slider 32-V, and radial slider 32-R may sometimes be referred to simply as a slider, user interface element, or the like.
Each slider includes an indicator 34. The indicator 34 is a visual representation of a scalar corresponding to the device setting of the slider bar 32. There are many possible aesthetic appearances for the indicator 34. In FIG. 3A, the indicator 34 is an interface between a first portion 36-1 of the slider having a first appearance and a second portion 36-2 having a second appearance that is different from the first appearance. The first portion 36-1 and the second portion 36-2 may have different colors, patterns, transparency, or any other desired aesthetic characteristics. The dimensions of portions 36-1 and 36-2 may vary as indicator 34 moves. For example, as indicator 34 slides to the right, portion 36-1 will become larger and portion 36-2 will become smaller. As indicator 34 slides to the left, portion 36-2 will become larger and portion 36-1 will become smaller.
The slider bar 32-H in FIG. 3A may optionally include a numeric area 38 that shows, in text form, the scalar of the device setting controlled by the slider bar. In the example of fig. 3A, the device setting has a scalar of 35 (e.g., 35 in a range of 0 to 100). Accordingly, the number 35 is displayed in the numeric area 38 of the slider 32-H.
In fig. 3A, the indicator 34 is an interface (boundary) between different portions of a slider bar having different appearances. This example of an indicator is merely illustrative. In another example shown in FIG. 3B, indicator 34 may have a different appearance than portions 36-1 and 36-2 on the slider.
In FIG. 3B, as indicator 34 slides downward, portion 36-2 will become larger and portion 36-1 will become smaller. As indicator 34 slides upward, portion 36-1 will become larger and portion 36-2 will become smaller. The slider 32-V includes a number area 38 on the right side of the slider. This example is merely illustrative, and the slider 32-V may optionally include a numerical area at any desired portion of the slider.
The slider 32-V in fig. 3B has an indicator 34 having a different appearance than the portions 36-1 and 36-2. In FIG. 3B, indicator 34 has a greater width than portions 36-1 and 36-2. Indicator 34 may alternatively or additionally have a distinct boundary, for example, distinguishing indicator 34 from portions 36-1 and 36-2. In an example such as that of FIG. 3B, where the location of indicator 34 is easily distinguished from both portions 36-1 and 36-2, portions 36-1 and 36-2 may optionally have the same appearance. For example, portions 36-1 and 36-2 may have the same color and pattern, and the user may still easily determine the relative position of indicator 34 along the slider.
Fig. 3C shows a radial slider 32-R having an indicator 34 that moves radially around the slider. As indicator 34 slides clockwise, portion 36-1 will become larger and portion 36-2 will become smaller. As indicator 34 slides counterclockwise, portion 36-2 will become larger and portion 36-1 will become smaller. The slider 32-R includes a number area 38 in the center of the slider. This example is merely illustrative, and slider 32-R may optionally include a numerical area at any desired portion of the slider.
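For illustration only, the position of indicator 34 along any of these sliders can be derived by normalizing the scalar against its range. The names and the 0-to-100 range in this sketch are assumptions consistent with the numeric-area examples above.

```swift
// Illustrative sketch: map a scalar (e.g., 35 on a 0-100 scale) to a
// fractional indicator position along the slider track.
struct SliderBar {
    var minimum: Double = 0
    var maximum: Double = 100
    var scalar: Double = 35

    // 0.0 = far left / bottom / start of arc; 1.0 = far right / top / end of arc.
    var indicatorFraction: Double {
        (scalar - minimum) / (maximum - minimum)
    }
}
```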
In fig. 3A, sliding the indicator to the right may increase the scalar of the device setting, while sliding the indicator to the left may decrease the scalar of the device setting. This example is merely illustrative and the reverse arrangement may be used if desired.
In fig. 3B, sliding the indicator up may increase the scalar of the device setting, while sliding the indicator down may decrease the scalar of the device setting. This example is merely illustrative and the reverse arrangement may be used if desired.
In fig. 3C, sliding the indicator clockwise may increase the scalar of the device setting, while sliding the indicator counterclockwise may decrease the scalar of the device setting. This example is merely illustrative and the reverse arrangement may be used if desired.
In fig. 3A, the indicator of slider 32-H (and the corresponding scalar of the device setting) may be controlled by head movement. Movement of the head in a given direction may cause the indicator of the slider to slide in that direction. For example, in fig. 3A, a rightward head movement (e.g., a yaw angle change to the right and/or a roll angle change to the right) may cause the indicator of the slider to slide to the right, while a leftward head movement (e.g., a yaw angle change to the left and/or a roll angle change to the left) may cause the indicator of the slider to slide to the left. Alternatively or additionally, an upward head movement (e.g., an upward pitch angle change) may cause the indicator of the slider to slide to the right, while a downward head movement (e.g., a downward pitch angle change) may cause the indicator of the slider to slide to the left. In some cases, only one type of head rotation may control the indicator of slider 32-H (e.g., only yaw angle, only roll angle, or only pitch angle is used as user input to the slider). Alternatively, multiple types of head rotation may control the indicator of slider 32-H (e.g., two or more of yaw, roll, and pitch angle are used as user inputs to the slider).
The indicator of the slider bar 32-V (and the corresponding scalar of the device setting) may likewise be controlled by head movement. Movement of the head in a given direction may cause the indicator of the slider to slide in that direction. For example, in fig. 3B, an upward head movement (e.g., an upward pitch angle change) may cause the indicator of the slider to slide upward, while a downward head movement (e.g., a downward pitch angle change) may cause the indicator of the slider to slide downward. Alternatively or additionally, a rightward head movement (e.g., a yaw angle change to the right and/or a roll angle change to the right) may cause the indicator of the slider to slide upward, while a leftward head movement (e.g., a yaw angle change to the left and/or a roll angle change to the left) may cause the indicator of the slider to slide downward. In some cases, only one type of head rotation may control the indicator of slider 32-V. Alternatively, multiple types of head rotation may control the indicator of slider 32-V.
The indicator of slider bar 32-R (and the corresponding scalar of the device setting) may also be controlled by head movement. For example, in fig. 3C, an upward head movement (e.g., an upward pitch angle change) may cause the indicator of the slider to slide clockwise, while a downward head movement (e.g., a downward pitch angle change) may cause the indicator of the slider to slide counterclockwise. Alternatively or additionally, a rightward head movement (e.g., a yaw angle change to the right and/or a roll angle change to the right) may cause the indicator of the slider to slide clockwise, while a leftward head movement (e.g., a yaw angle change to the left and/or a roll angle change to the left) may cause the indicator of the slider to slide counterclockwise. In some cases, only one type of head rotation may control the indicator of slider 32-R. Alternatively, multiple types of head rotation may control the indicator of slider 32-R.
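The direction mappings in the three preceding paragraphs may be summarized in code. This is a hypothetical sketch that picks one of the several permitted mappings for each slider type; all names are invented.

```swift
// Illustrative sketch of one possible head-movement-to-indicator mapping
// for the three slider types (the patent also permits the reverse mappings).
enum SliderKind { case horizontal, vertical, radial }
enum HeadMove { case left, right, up, down }
enum IndicatorMove { case left, right, up, down, clockwise, counterclockwise }

func indicatorMove(for head: HeadMove, on slider: SliderKind) -> IndicatorMove? {
    switch (slider, head) {
    case (.horizontal, .right): return .right
    case (.horizontal, .left):  return .left
    case (.vertical, .up):      return .up
    case (.vertical, .down):    return .down
    case (.radial, .up), (.radial, .right):   return .clockwise
    case (.radial, .down), (.radial, .left):  return .counterclockwise
    default: return nil  // this head movement is not used for this slider type
    }
}
```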
Fig. 4 is a diagram showing how a user may change their head pose to provide user input to the headset 10. In fig. 4, user 24 may look in direction 40-1 at a first time, in direction 40-2 at a second time, and in direction 40-3 at a third time. As shown in fig. 4, the user's head has a 0 degree yaw angle when looking in direction 40-1, a 15 degree yaw angle when looking in direction 40-2, and a 30 degree yaw angle when looking in direction 40-3.
The headset 10 may interpret changes in head pose as user input to the slider in a variety of ways. Three such exemplary ways are: interpreting the head pose change as a constant-rate change of the slider's indicator, interpreting the head pose change as a variable-rate change of the slider's indicator, and correlating the head pose directly with the position of the slider's indicator.
First, consider an example in which a head pose change is interpreted as a constant-rate change of the slider's indicator. The indicator of the slider may remain fixed while the user looks in direction 40-1 (with a 0 degree yaw angle). After the user makes a rightward head movement (e.g., rotates their head 15 degrees to the right to face direction 40-2), the indicator of the slider may move at a constant rate in a positive direction (e.g., to the right in fig. 3A). In other words, the indicator moves in the positive direction for as long as the user faces direction 40-2. The constant rate may be, for example, +1/second, +2/second, +3/second, etc. Consider the example in fig. 3A, where the horizontal slider starts at a value of 35. With a constant rate of +3/second, the scalar indicated by the slider bar will increase from 35 to 53 after 6 seconds of facing direction 40-2.
In the example where the slider movement rate is constant, the rate remains the same regardless of the degree of change in the user's head pose. For example, if user 24 makes a further rightward head movement (e.g., rotates their head an additional 15 degrees to the right to face direction 40-3 instead of direction 40-2), the indicator of the slider may still move at the same constant rate in the positive direction. If the user 24 then returns from direction 40-3 to direction 40-2, the indicator of the slider still moves in the positive direction at the same constant rate. Thus, any positive yaw angle results in the same constant rate of change of the slider's indicator. If at any time the user returns to facing direction 40-1, the indicator of the slider stops moving and remains fixed in its current position. If at any time the user makes a leftward head movement (and has a negative yaw angle), the indicator may move in a negative direction at a constant rate (e.g., -1/second, -2/second, -3/second, etc.). The rate of movement of the indicator may have the same magnitude but opposite sign for positive and negative yaw angles.
The positive and negative yaw angles may be defined in absolute terms (e.g., relative to 0 degrees) or relative terms (e.g., relative to the starting yaw angle at which user input to the slider begins). For example, the user may initiate a slider adjustment when the user's head has a 15 degree yaw angle. If absolute yaw angle is used, the slider value may increase at a constant rate when the yaw angle is greater than 0 degrees and decrease at a constant rate when the yaw angle is less than 0 degrees. If relative yaw angle is used, the slider value may increase at a constant rate when the yaw angle is greater than 15 degrees and decrease at a constant rate when the yaw angle is less than 15 degrees.
Additionally, there may be a yaw angle range in which the position of the slider remains fixed. For example, the slider bar may be fixed while the head pose has a yaw angle between -10 degrees and 10 degrees. When the yaw angle is greater than 10 degrees, the slider value may increase at a constant rate. When the yaw angle is less than -10 degrees, the slider value may decrease at a constant rate.
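A minimal sketch of the constant-rate scheme, assuming the -10 to 10 degree dead zone and the +3/second rate from the examples above (all names and specific values are illustrative):

```swift
// Illustrative sketch only: constant-rate adjustment with a dead zone.
let deadZone = -10.0 ... 10.0   // yaw range in which the slider stays fixed
let constantRate = 3.0          // scalar units per second (e.g., +3/second)

func scalarAfter(yaw: Double, scalar: Double, elapsedSeconds: Double) -> Double {
    guard !deadZone.contains(yaw) else { return scalar }  // head near center
    let rate = yaw > 0 ? constantRate : -constantRate     // sign follows direction
    return min(100, max(0, scalar + rate * elapsedSeconds))
}

// From the text: facing direction 40-2 (15 degrees) for 6 seconds at
// +3/second raises the scalar from 35 to 53:
// scalarAfter(yaw: 15, scalar: 35, elapsedSeconds: 6) == 53
```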
The example in which the rate of change of the slider's indicator is fixed is merely illustrative. Alternatively, the rate of change of the indicator may vary based on the head pose. The indicator of the slider may remain fixed while the user looks in direction 40-1 (with a 0 degree yaw angle). After the user makes a rightward head movement (e.g., rotates their head 15 degrees to the right to face direction 40-2), the indicator of the slider may move at a first rate in a positive direction (e.g., to the right in fig. 3A). For example, the first rate may be +2/second. Consider the example in fig. 3A, where the horizontal slider starts at a value of 35. At a rate of +2/second, the slider value will increase by 12, from 35 to 47, after 6 seconds of facing direction 40-2.
If user 24 then makes a further rightward head movement (e.g., rotates their head an additional 15 degrees to the right to face direction 40-3 instead of direction 40-2), the indicator of the slider bar may move in the positive direction at a second rate that is greater than the first rate. For example, the second rate may be +5/second. At a rate of +5/second, the slider value will increase by 30, from 47 to 77, after 6 seconds of facing direction 40-3. If user 24 returns from direction 40-3 to direction 40-2, the indicator of the slider may again move in the positive direction at the first rate (e.g., the rate associated with the yaw angle of direction 40-2).
Thus, a positive yaw angle results in a positive rate of change of the slider. However, the positive rate of change may increase with increasing positive yaw angle. The function relating the rate of change to the yaw angle may be linear (e.g., the rate increases gradually with increasing yaw angle), non-linear (e.g., the rate increases exponentially with increasing yaw angle), or a step function (e.g., a first rate for a first yaw angle range, a second rate for a second yaw angle range, etc.).
In the variable rate input method, as with the constant rate input method, the positive and negative yaw angles may be defined in absolute terms (e.g., relative to 0 degrees) or relative terms (e.g., relative to the starting yaw angle at the beginning of user input to the slider bar), as discussed above.
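The variable-rate scheme may be sketched as a rate function of yaw angle. The step-function version below is hypothetical; its thresholds are assumptions chosen so that direction 40-2 maps to +2/second and direction 40-3 maps to +5/second, as in the worked example.

```swift
// Illustrative step function: larger yaw angles produce faster changes.
func rate(forYaw yaw: Double) -> Double {
    let magnitude: Double
    switch abs(yaw) {
    case ..<10.0: magnitude = 0   // dead zone: indicator stays fixed
    case ..<22.5: magnitude = 2   // e.g., direction 40-2 (15 degrees): +2/second
    default:      magnitude = 5   // e.g., direction 40-3 (30 degrees): +5/second
    }
    return yaw >= 0 ? magnitude : -magnitude  // sign follows head direction
}
```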
As yet another example, the scalar represented by the slider bar may be correlated with the head pose. Each degree of yaw angle may be associated with a corresponding change in the slider value (e.g., +1, +2, +3, etc.). Consider an example in which each degree of yaw angle is associated with a change of +2 in the scalar represented by the slider bar. As shown in fig. 3A, the user may initiate user input to the slider when the slider has an initial scalar of 35. A yaw angle of 0 degrees may serve as the reference point corresponding to zero change in the scalar. If the user looks in direction 40-2 (with a 15 degree yaw angle), the scalar will increase by 30 to a value of 65. If the user looks in direction 40-3 (with a 30 degree yaw angle), the scalar will increase by 60 (relative to the starting point) to a value of 95. If the user changes their head pose back to a 15 degree yaw angle, the scalar will again be increased by 30 (relative to the starting point), for a value of 65. If the user changes their head pose so as to have a yaw angle of -10 degrees, the scalar will change by -20 (relative to the starting point) to a value of 15.
When the scalar is correlated with the head pose as described above, the amount by which the scalar changes per degree of head pose change (e.g., yaw angle, roll angle, and/or pitch angle) may be selected based at least in part on the minimum and maximum values of the scalar. Additionally, there may be a head pose range within which the scalar is correlated with the head pose. Outside this range, the scalar may remain fixed (e.g., at the minimum or maximum value of the scalar). The minimum and maximum head poses over which the scalar is correlated with head pose (e.g., minimum and maximum yaw angles, minimum and maximum roll angles, minimum and maximum pitch angles) may be fixed (e.g., a default range for all users) or adjustable (e.g., tailored to each user to account for more or less flexibility than an average person).
In the correlated input method, the positive and negative yaw angles may be defined in absolute terms (e.g., relative to 0 degrees) or relative terms (e.g., relative to the starting yaw angle at which user input to the slider begins), as discussed above.
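A sketch of the correlated scheme, assuming the +2-per-degree gain and the 0-to-100 clamping range from the worked example (names are illustrative):

```swift
// Illustrative sketch only: the scalar tracks yaw angle directly.
let gain = 2.0            // scalar units per degree of yaw (assumed)
let initialScalar = 35.0  // scalar when user input to the slider begins

func correlatedScalar(yaw: Double, referenceYaw: Double = 0) -> Double {
    let delta = (yaw - referenceYaw) * gain
    return min(100, max(0, initialScalar + delta))  // clamp to the scalar range
}

// Matches the worked example: yaw 15 -> 65, yaw 30 -> 95, yaw -10 -> 15.
```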
Adjustment of the slider based on head pose has been described in relation to yaw angle in fig. 4. It should be understood, however, that the same principles also apply to changes in pitch angle and changes in roll angle.
It may be desirable for the slider bar (and, accordingly, the scalar of the device setting it represents) to respond to head pose changes only some of the time during operation of the head-mounted device. In other words, the user should be able to operate the head-mounted device normally without head pose changes causing unintended changes to the slider bar (and the scalar of the device setting it controls). The user may selectively place the slider in an adjustable mode in which the slider changes in response to head pose. Once the slider has been adjusted to the user's satisfaction, the user can return the slider to a fixed mode in which head pose changes do not cause any changes to the slider (or the scalar of the device setting it controls).
Any desired user input may trigger the slider to switch between the fixed mode and the adjustable mode. For example, a user may provide a verbal command, gesture input (e.g., a hand gesture), or other selection input to switch the slider between the fixed mode and the adjustable mode. One possible input for switching the slider between the fixed mode and the adjustable mode, which will be discussed herein, is gaze input. As mentioned previously, the gaze tracker 18 may track the gaze point of the user. In one illustrative example shown in fig. 5A-5D, the slider may be in the adjustable mode when the user's gaze point targets the slider and in the fixed mode when the user's gaze point does not target the slider.
Fig. 5A-5D are views of the display 16 with the slider 32-H. Each of fig. 5A-5D also shows the user's head pose in region 52. As previously discussed, slider 32-H has an indicator 34 and a numeric area 38. In fig. 5A, the user's gaze point 42 does not target the slider bar 32-H. In other words, the gaze point 42 is elsewhere on the display 16 and does not overlap the slider bar 32-H. When the user's gaze point 42 does not target the slider bar 32-H, the slider bar 32-H may be in the fixed mode, and the scalar of the device setting controlled by the slider bar is fixed (even when head pose changes are detected).
In fig. 5B, the gaze point 42 targets the slider 32-H. In other words, the gaze point 42 overlaps the slider bar 32-H. When the gaze point 42 targets the slider bar 32-H, the slider bar 32-H may be in the adjustable mode, and the scalar of the device setting controlled by the slider bar is adjusted based on head pose changes. The scalar of the device setting may be adjusted in any of the manners previously described (e.g., using yaw, roll, and/or pitch angles to adjust the scalar at a constant rate, at a rate that varies with head pose, or in a manner correlated with the change between the starting head pose and the current head pose).
If desired, the slider may be switched from the fixed mode to the adjustable mode only if the gaze input targets the slider for at least a given dwell time (e.g., more than 50 milliseconds, more than 100 milliseconds, more than 200 milliseconds, more than 500 milliseconds, more than 1 second, etc.).
When the slider is switched from the fixed mode to the adjustable mode in fig. 5B (e.g., in response to gaze input targeting the slider), the user's head pose at that moment may be captured as the initial head pose against which subsequent updates to the scalar are measured.
In fig. 5C, the head pose change causes the scalar of the device setting to increase from 35 to 65 while the slider is in the adjustable mode. The slider bar may remain in the adjustable mode (as shown in fig. 5B and 5C) while the gaze point 42 targets the slider bar.
As shown in region 52 of fig. 5A and 5B, the user's head is facing directly forward in fig. 5A and 5B (e.g., having a 0 degree yaw angle). However, between fig. 5B and 5C, the yaw angle of the user's head is increased by an angle 54. Since the gaze input targets slider bar 32-H between fig. 5B and 5C (and the slider bar is in an adjustable mode), the yaw angle change between fig. 5B and 5C changes the scalar represented by slider bar 32-H.
In fig. 5D, the gaze point no longer targets the slider, so the slider returns to the fixed mode, and the scalar of the device setting controlled by the slider is fixed at the updated value of 65 (even when head pose changes are detected). Between fig. 5C and fig. 5D, the yaw angle of the user's head returns to 0 degrees. However, since the gaze input does not target the slider bar 32-H between fig. 5C and fig. 5D (and the slider bar is therefore in the fixed mode), the yaw angle change between fig. 5C and fig. 5D does not change the scalar represented by the slider bar 32-H.
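The mode switching of figs. 5A-5D may be sketched as a small state update driven by gaze and yaw angle. This is a hypothetical sketch: the dwell handling is reduced to a boolean, and the correlated +2-per-degree scheme is assumed for the adjustable mode.

```swift
// Illustrative sketch only. The slider is adjustable while gaze targets it
// (figs. 5B-5C) and fixed otherwise (figs. 5A and 5D).
enum SliderMode { case fixed, adjustable }

struct SliderState {
    var mode: SliderMode = .fixed
    var scalar: Double = 35       // fig. 5A: starting value
    var initialScalar: Double = 35
    var referenceYaw: Double = 0  // yaw captured when adjustment began
}

func update(_ state: inout SliderState,
            gazeOnSlider: Bool, dwellMet: Bool, yaw: Double) {
    switch (state.mode, gazeOnSlider) {
    case (.fixed, true) where dwellMet:
        // Fig. 5B: gaze (with sufficient dwell) switches to adjustable mode;
        // the current pose becomes the initial head pose.
        state.mode = .adjustable
        state.referenceYaw = yaw
        state.initialScalar = state.scalar
    case (.adjustable, true):
        // Fig. 5C: correlated scheme, +2 scalar units per degree of yaw;
        // a 15 degree yaw change takes the scalar from 35 to 65.
        let delta = (yaw - state.referenceYaw) * 2.0
        state.scalar = min(100, max(0, state.initialScalar + delta))
    case (.adjustable, false):
        // Fig. 5D: gaze left the slider; scalar stays at its updated value.
        state.mode = .fixed
    default:
        break  // fixed mode without qualifying gaze: head pose is ignored
    }
}
```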
In the example of fig. 5A-5D, gaze input (e.g., from gaze tracker 18) is used to switch slider 32-H from a fixed mode to an adjustable mode. This example is merely illustrative. Other types of inputs may be used to switch the slider bar between the fixed mode and the adjustable mode, if desired. For example, input from a remote control operable with the headset 10 may be used to switch the slider bar between a fixed mode and an adjustable mode. One or more additional components (e.g., buttons, touch sensors, etc.) on the headset may be used to gather user input to switch the slider between a fixed mode and an adjustable mode.
In the example of fig. 5A-5D, the slider bar 32-H is continuously displayed on the display 16. Alternatively, an icon (sometimes referred to as a user interface element) may be continuously displayed on the display 16, and the slider bar may be displayed in response to the gaze point targeting the icon. When the slider is not displayed, the scalar of the device setting controlled by the slider is fixed. When the slider is displayed after the gaze point targets the icon, the scalar of the device setting controlled by the slider is adjustable.
Fig. 6A-6D are views of the display 16 with icons 44. Each of fig. 6A to 6D also shows the head pose of the user in the region 52. In this example, the icon 44 is a volume icon, and the slider bar associated with the icon 44 is used to control the volume of the headset 10. This example is merely illustrative, and any other desired icon (and corresponding device settings) may be used. As an alternative, the icon 44 may be a brightness icon (e.g., sun), and a slider bar associated with the icon 44 may be used to control the display brightness.
In fig. 6A, the gaze point 42 does not target the icon 44. Thus, the slider is not displayed, and the scalar of the device setting controlled by the slider is fixed.
In fig. 6B, the gaze point 42 targets (e.g., overlaps) the icon 44. This causes slider 32-R to be displayed. Slider bar 32-R may be displayed adjacent to icon 44 (e.g., below icon 44 in fig. 6B). As previously discussed, slider 32-R has an indicator 34 and a numeric area 38. When slider 32-R is displayed (as shown in fig. 6B), the slider may be in an adjustable mode in which the scalar of the device setting controlled by the slider is adjusted based on head pose changes. Alternatively, the slider bar 32-R may be placed in the fixed mode when it is initially displayed. Then, when gaze input from the user targets the slider bar 32-R instead of the icon 44, the slider bar 32-R enters the adjustable mode. This effectively serves as a second confirmation step to ensure that the user wants to change the setting. The scalar of the device setting may be adjusted in any of the manners previously described (e.g., using yaw, roll, and/or pitch angles to adjust the scalar at a constant rate, at a rate that varies with head pose, or in a manner correlated with the change between the starting head pose and the current head pose).
If desired, the slider bar may be displayed only when the gaze input targets the icon for at least a given dwell time (e.g., more than 50 milliseconds, more than 100 milliseconds, more than 200 milliseconds, more than 500 milliseconds, more than 1 second, etc.).
When the slider bar is displayed in fig. 6B (e.g., in response to gaze input targeting the icon), the user's head pose at that moment may be captured as the initial head pose against which subsequent updates to the scalar are measured.
In fig. 6C, the head pose change causes the scalar of the device setting to increase from 35 to 65 while the slider is in the adjustable mode. The slider bar may remain displayed (and in the adjustable mode) while the gaze point 42 targets the slider bar 32-R or icon 44. Alternatively, the slider bar may remain displayed (and in the adjustable mode) only when the gaze point 42 targets the slider bar 32-R (rather than the icon 44). Alternatively, the slider bar may remain displayed (and in the adjustable mode) only when the gaze point 42 targets the icon 44 (rather than the slider bar 32-R). In the example of fig. 6C, the slider bar remains displayed (and in the adjustable mode) while the gaze point 42 targets the slider bar 32-R or icon 44. Thus, in FIG. 6C, the slider bar remains displayed (and in the adjustable mode) when the gaze point targets the slider bar 32-R. However, in fig. 6D, when the gaze point 42 is moved away from both the slider bar and the icon 44, the slider bar is no longer displayed (and thus in the fixed mode).
If desired, the slider may enter the fixed mode (and no longer be displayed) only when the gaze input leaves a given area (e.g., the slider bar or the icon, only the slider bar, or only the icon) for at least a given dwell time (e.g., more than 50 milliseconds, more than 100 milliseconds, more than 200 milliseconds, more than 500 milliseconds, more than 1 second, etc.).
In another possible user experience, a slider bar may be displayed in response to a gaze point targeting icon 44. The slider bar may then be adjustable when the gaze point targets the slider bar or any other part of the display. The slider bar is no longer displayed and the fixed mode is entered only when the gaze point again targets the icon 44. In other words, once the user completes updating the scalar of the device setting, the user views icon 44 to fix or lock the scalar of the device setting.
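The icon-triggered flow of figs. 6A-6D may likewise be sketched as a two-state machine. The version below assumes the variant in which the slider remains displayed while gaze targets either the slider or the icon; all names are invented.

```swift
// Illustrative sketch only of the fig. 6A-6D flow.
enum VolumeUIState {
    case iconOnly          // fig. 6A: slider hidden, scalar fixed
    case sliderDisplayed   // figs. 6B/6C: slider shown and adjustable
}

func step(_ state: VolumeUIState,
          gazeOnIcon: Bool, gazeOnSlider: Bool, dwellMet: Bool) -> VolumeUIState {
    switch state {
    case .iconOnly:
        // Fig. 6B: gaze dwelling on the icon causes the slider to appear.
        return (gazeOnIcon && dwellMet) ? .sliderDisplayed : .iconOnly
    case .sliderDisplayed:
        // Fig. 6C: remain displayed while gaze targets the slider or icon;
        // fig. 6D: otherwise dismiss the slider and lock its scalar.
        return (gazeOnIcon || gazeOnSlider) ? .sliderDisplayed : .iconOnly
    }
}
```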
As shown in region 52 of fig. 6A and 6B, the user's head faces directly forward in fig. 6A and 6B (e.g., with a 0 degree yaw angle). Between fig. 6B and 6C, however, the yaw angle of the user's head increases by an angle 54. Since the slider bar is in the adjustable mode between fig. 6B and 6C, the yaw angle change between fig. 6B and 6C changes the scalar represented by slider bar 32-R. Between fig. 6C and 6D, the yaw angle of the user's head returns to 0 degrees. However, since the gaze input does not target the slider bar 32-R (or the icon 44) between fig. 6C and fig. 6D, the slider bar is not in the adjustable mode, and the yaw angle change between fig. 6C and fig. 6D does not change the scalar represented by the slider bar 32-R.
In the example of fig. 6A-6D, gaze input (e.g., from gaze tracker 18) is used to target icon 44 and cause slider bar 32-R to be displayed in an adjustable mode. This example is merely illustrative. Other types of inputs may be used to target icons and cause a slider bar to be displayed (e.g., input from a remote control operable with the headset 10, input from one or more additional components on the headset such as buttons or touch sensors, etc.).
Fig. 7 is a flowchart illustrating an exemplary method performed by a head-mounted device (e.g., control circuitry 14 in device 10). The blocks of fig. 7 may be stored as instructions in a memory of the electronic device 10, where the instructions are configured to be executed by one or more processors in the electronic device.
At optional block 102, control circuitry 14 may display user interface elements on a display. The user interface element may be, for example, an icon, such as icon 44 in fig. 6A-6D. At optional block 104, the control circuit 14 may obtain gaze input (e.g., from the gaze tracker 18). The gaze input may optionally be used to trigger the display of a slider bar (e.g., adjacent to the user interface element from block 102) that is a visual representation of the scalar of the device setting. The gaze input may optionally be used to trigger a slider bar (which is already on the display) to switch between a fixed mode and an adjustable mode. The example of obtaining gaze input at block 104 is merely illustrative. If desired, other user inputs may be obtained at block 104 (e.g., from a remote control operable with the headset 10, buttons on the headset 10, touch sensors on the headset 10, etc.) for triggering display of a slider bar that is a visual representation of a scalar of a device setting and/or triggering a slider bar (which is already on a display) to switch between a fixed mode and an adjustable mode.
At block 106, control circuitry 14 may display a visual representation of the scalar of a device setting on the display. The visual representation of the scalar of the device setting may sometimes be referred to as a user interface element. The visual representation may be, for example, a slider bar, such as one of the slider bars shown in fig. 3A-3C. The slider bar shows the current scalar of a device setting such as volume or brightness. In some cases, the slider may be continuously displayed on the display 16. In other cases, displaying the visual representation in block 106 may be triggered by gaze input (or other user input) targeting the user interface element displayed at block 102. In this case, the device setting for the slider bar may be associated with the user interface element displayed at block 102 (e.g., the slider bar controls volume when a volume icon is displayed at block 102, or the slider bar controls display brightness when a brightness icon is displayed at block 102).
At block 108, control circuit 14 may obtain head pose information (e.g., from position and motion sensor 22). In some cases, the same sensor may obtain both gaze input and head pose information. The head pose information may indicate yaw, pitch, and roll angles of the user's head over time.
At block 110, control circuit 14 may update the scalar of the device settings based on the head pose information. The user may change their head pose to increase the scalar or decrease the scalar. As previously described, the scalar may increase at a constant rate or a variable rate when the yaw angle is positive, the scalar may increase at a constant rate or a variable rate when the roll angle is positive, and/or the scalar may increase at a constant rate or a variable rate when the pitch angle is positive. Alternatively, the scalar may increase at a constant rate or a variable rate when the change in yaw angle relative to the initial yaw angle is positive, the scalar may increase at a constant rate or a variable rate when the change in roll angle relative to the initial roll angle is positive, and/or the scalar may increase at a constant rate or a variable rate when the change in pitch angle relative to the initial pitch angle is positive. Similarly, the scalar may decrease at a constant rate or a variable rate when the yaw angle is negative, decrease at a constant rate or a variable rate when the roll angle is negative, and/or decrease at a constant rate or a variable rate when the pitch angle is negative. Optionally, the scalar may decrease at a constant rate or a variable rate when the change in yaw angle relative to the initial yaw angle is negative, the scalar may decrease at a constant rate or a variable rate when the change in roll angle relative to the initial roll angle is negative, and/or the scalar may decrease at a constant rate or a variable rate when the change in pitch angle relative to the initial pitch angle is negative. As yet another option, the scalar may be increased or decreased in a manner related to yaw angle (relative to a 0 degree reference point), may be increased or decreased in a manner related to roll angle (relative to a 0 degree reference point), and/or may be increased or decreased in a manner related to pitch angle (relative to a 0 degree reference point). Optionally, the scalar may be increased or decreased in a manner related to the change in yaw angle relative to the initial yaw angle, may be increased or decreased in a manner related to the change in roll angle relative to the initial roll angle, and/or may be increased or decreased in a manner related to the change in pitch angle relative to the initial pitch angle.
Changes in head pose may be used to update the scalar of the device setting only when the visual representation of the scalar (and the scalar itself) is in the adjustable mode. When the visual representation of the scalar of the device setting (and the scalar itself) is in the fixed mode, changes in head pose do not change the scalar of the device setting.
At block 112, control circuitry 14 may update the visual representation of the scalar of the device setting based on the updated scalar. In other words, the visual representation displayed at block 106 may be updated to reflect the current scalar of the device setting. In examples where the visual representation in block 106 is a slider, the indicator of the slider may move in a direction related to the direction of movement of the user's head. For example, when the scalar is updated based on a rightward head movement in block 110, the indicator in a horizontal slider bar may be moved to the right in block 112. When the scalar is updated based on a leftward head movement in block 110, the indicator in the horizontal slider bar may be moved to the left in block 112. When the scalar is updated based on an upward head movement in block 110, the indicator in a vertical slider bar may be moved upward in block 112. When the scalar is updated based on a downward head movement in block 110, the indicator in the vertical slider bar may be moved downward in block 112. When the scalar is updated based on a rightward head movement in block 110, the indicator in a radial slider bar may be moved clockwise in block 112. When the scalar is updated based on a leftward head movement in block 110, the indicator in the radial slider bar may be moved counterclockwise in block 112.
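Tying the blocks of fig. 7 together, one per-frame update might resemble the following sketch, which reuses the hypothetical HeadPose, SliderState, and update(_:gazeOnSlider:dwellMet:yaw:) sketches above; renderSlider is an invented stand-in for drawing to display 16.

```swift
// Illustrative sketch only: one iteration of the fig. 7 method.
func frameUpdate(gazeTargetsSlider: Bool, dwellMet: Bool,
                 pose: HeadPose, state: inout SliderState) {
    // Blocks 104 and 108: the caller obtains gaze input (gaze tracker 18)
    // and head pose information (position and motion sensor 22).
    update(&state, gazeOnSlider: gazeTargetsSlider,
           dwellMet: dwellMet, yaw: pose.yaw)      // block 110: update scalar
    // Block 112: refresh the visual representation from the updated scalar.
    renderSlider(indicatorFraction: state.scalar / 100, label: Int(state.scalar))
}

// Hypothetical rendering hook; a real device would draw to display 16.
func renderSlider(indicatorFraction: Double, label: Int) {
    print("indicator at \(indicatorFraction), value \(label)")
}
```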
Out of an abundance of caution, it is noted that, to the extent any implementation of this technology involves the use of personally identifiable information, implementers should follow privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for maintaining user privacy. In particular, personally identifiable information should be managed and handled so as to minimize the risk of inadvertent or unauthorized access or use, and the nature of authorized use should be made clear to users.
According to an embodiment, there is provided an electronic device including: one or more sensors; one or more displays; one or more processors; and a memory storing instructions configured to be executed by the one or more processors, the instructions for: displaying, using the one or more displays, a user interface element comprising a visual representation of a scalar of the device settings; obtaining head pose information via a first subset of the one or more sensors; updating the scalar for the device setting based on the head pose information; and updating the visual representation of the scalar based on the updated scalar.
According to another embodiment, the instructions further comprise instructions for: prior to displaying the user interface element, displaying an additional user interface element using the one or more displays and obtaining gaze input via a second subset of the one or more sensors, the displaying the user interface element including displaying the user interface element in accordance with a determination that the gaze input targets the additional user interface element.
According to another embodiment, the instructions further comprise instructions for: obtaining gaze input via a second subset of the one or more sensors, the updating the scalar for the device setting based on the head pose information including updating the scalar for the device setting based on the head pose information in accordance with a determination that the gaze input targets the user interface element.
According to another embodiment, the instructions further comprise instructions for: ceasing to display the user interface element and ceasing to update the scalar for the device setting in accordance with a determination that the gaze input no longer targets the user interface element.
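A minimal sketch of the gaze gating in the two embodiments above, assuming per-frame updates and hypothetical names (HeadPose, GazeGatedSlider): the scalar only changes while gaze targets the element, and both display and updating cease when gaze leaves it.

struct HeadPose { var yaw: Double; var pitch: Double; var roll: Double }

final class GazeGatedSlider {
    private(set) var isVisible = false
    private(set) var scalar = 0.5
    private var previousPose: HeadPose?

    // Called once per frame with the current gaze state and head pose.
    func update(gazeTargetsElement: Bool, pose: HeadPose) {
        guard gazeTargetsElement else {
            isVisible = false       // cease displaying the element
            previousPose = nil      // cease updating the scalar
            return
        }
        isVisible = true
        if let previous = previousPose {
            // Incremental update from the frame-to-frame yaw change.
            scalar = min(max(scalar + (pose.yaw - previous.yaw), 0), 1)
        }
        previousPose = pose
    }
}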
According to another embodiment, the user interface element comprises: a horizontal slider bar having an indicator that moves left and right, wherein updating the visual representation of the scalar includes moving the indicator of the horizontal slider bar in accordance with a determination that the head pose information indicates a change in yaw angle; a vertical slider having an indicator that moves up and down, wherein updating the visual representation of the scalar includes moving the indicator of the vertical slider in accordance with a determination that the head pose information indicates a change in pitch angle; or a radial slider having an indicator that moves radially, wherein updating the visual representation of the scalar includes moving the indicator of the radial slider in accordance with a determination that the head pose information indicates a change in roll angle.
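Reusing the SliderStyle and HeadPose sketches above, the yaw/pitch/roll selection in this embodiment might look like the following hypothetical helper:

// Selects the head-pose angle change that drives each slider style:
// yaw for a horizontal slider, pitch for a vertical slider, and roll
// for a radial slider.
func drivingAngleChange(for style: SliderStyle, from old: HeadPose, to new: HeadPose) -> Double {
    switch style {
    case .horizontal: return new.yaw - old.yaw
    case .vertical:   return new.pitch - old.pitch
    case .radial:     return new.roll - old.roll
    }
}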
According to another embodiment, the user interface element comprises a slider bar with an indicator.
According to another embodiment, updating the visual representation of the scalar includes moving the indicator of the slider bar in a given direction in accordance with determining that the head pose information indicates head movement in the given direction.
According to another embodiment, obtaining the head pose information includes obtaining a first head pose at a first time when gaze input targets the user interface element and obtaining a second head pose at a second time after the first time, and updating the scalar for the device setting based on the head pose information includes updating the scalar for the device setting based on a change between the first head pose and the second head pose.
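The two-sample scheme in this embodiment (a first pose captured when gaze first targets the element, a second pose captured later) might be sketched as follows; the gain parameter is an invented scaling factor, not something the text specifies:

// Updates the scalar from the change between two head poses.
func updatedScalar(current: Double,
                   first: HeadPose,    // captured when gaze first targets the element
                   second: HeadPose,   // captured at a later, second time
                   gain: Double = 0.5) -> Double {
    let change = second.yaw - first.yaw        // radians of yaw change
    return min(max(current + gain * change, 0), 1)
}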
According to an embodiment, there is provided a method of operating an electronic device comprising one or more sensors and one or more displays, the method comprising: displaying, using the one or more displays, a user interface element comprising a visual representation of a scalar of a device setting; obtaining head pose information via a first subset of the one or more sensors; updating the scalar for the device setting based on the head pose information; and updating the visual representation of the scalar based on the updated scalar.
According to another embodiment, the method includes displaying, prior to displaying the user interface element, an additional user interface element using the one or more displays, and obtaining gaze input via a second subset of the one or more sensors, wherein displaying the user interface element includes displaying the user interface element in accordance with a determination that the gaze input targets the additional user interface element.
According to another embodiment, the method includes obtaining gaze input via a second subset of the one or more sensors, wherein updating the scalar for the device setting based on the head pose information includes updating the scalar for the device setting based on the head pose information in accordance with a determination that the gaze input targets the user interface element.
According to another embodiment, the method includes ceasing to display the user interface element and ceasing to update the scalar for the device setting in accordance with a determination that the gaze input no longer targets the user interface element.
According to another embodiment, the user interface element comprises: a horizontal slider bar having an indicator that moves left and right, wherein updating the visual representation of the scalar includes moving the indicator of the horizontal slider bar in accordance with a determination that the head pose information indicates a change in yaw angle; a vertical slider having an indicator that moves up and down, wherein updating the visual representation of the scalar includes moving the indicator of the vertical slider in accordance with a determination that the head pose information indicates a change in pitch angle; or a radial slider having an indicator that moves radially, wherein updating the visual representation of the scalar includes moving the indicator of the radial slider in accordance with a determination that the head pose information indicates a change in roll angle.
According to another embodiment, the user interface element comprises a slider bar with an indicator.
According to another embodiment, updating the visual representation of the scalar includes moving the indicator of the slider bar in a given direction in accordance with determining that the head pose information indicates head movement in the given direction.
According to another embodiment, obtaining the head pose information includes obtaining a first head pose at a first time when gaze input targets the user interface element and obtaining a second head pose at a second time after the first time, and updating the scalar for the device setting based on the head pose information includes updating the scalar for the device setting based on a change between the first head pose and the second head pose.
According to an embodiment, there is provided a non-transitory computer readable storage medium storing one or more programs configured for execution by one or more processors of an electronic device comprising one or more sensors and one or more displays, the one or more programs comprising instructions for: displaying, using the one or more displays, a user interface element comprising a visual representation of a scalar of a device setting; obtaining head pose information via a first subset of the one or more sensors; updating the scalar for the device setting based on the head pose information; and updating the visual representation of the scalar based on the updated scalar.
According to another embodiment, the instructions further comprise instructions for: displaying an additional user interface element using the one or more displays prior to displaying the user interface element, and obtaining gaze input via a second subset of the one or more sensors, wherein displaying the user interface element includes displaying the user interface element in accordance with a determination that the gaze input targets the additional user interface element.
According to another embodiment, the instructions further comprise instructions for: obtaining gaze input via a second subset of the one or more sensors, wherein updating the scalar for the device setting based on the head pose information includes updating the scalar for the device setting based on the head pose information in accordance with a determination that the gaze input targets the user interface element.
According to another embodiment, the instructions further comprise instructions for: ceasing to display the user interface element and ceasing to update the scalar for the device setting in accordance with a determination that the gaze input no longer targets the user interface element.
According to another embodiment, the user interface element comprises: a horizontal slider bar having an indicator that moves left and right, wherein updating the visual representation of the scalar includes moving the indicator of the horizontal slider bar in accordance with a determination that the head pose information indicates a change in yaw angle; a vertical slider having an indicator that moves up and down, wherein updating the visual representation of the scalar includes moving the indicator of the vertical slider in accordance with a determination that the head pose information indicates a change in pitch angle; or a radial slider having an indicator that moves radially, wherein updating the visual representation of the scalar includes moving the indicator of the radial slider in accordance with a determination that the head pose information indicates a change in roll angle.
According to another embodiment, the user interface element comprises a slider bar with an indicator.
According to another embodiment, updating the visual representation of the scalar includes moving the indicator of the slider bar in a given direction in accordance with determining that the head pose information indicates head movement in the given direction.
According to another embodiment, obtaining the head pose information includes obtaining a first head pose at a first time when gaze input targets the user interface element and obtaining a second head pose at a second time after the first time, and updating the scalar for the device setting based on the head pose information includes updating the scalar for the device setting based on a change between the first head pose and the second head pose.
The foregoing is merely illustrative, and various modifications may be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.

Claims (20)

1. An electronic device, comprising:
one or more sensors;
one or more displays;
one or more processors; and
a memory storing instructions configured to be executed by the one or more processors, the instructions for:
displaying, using the one or more displays, a user interface element comprising a visual representation of a scalar of a device setting;
obtaining head pose information via a first subset of the one or more sensors;
updating the scalar for the device setting based on the head pose information; and
updating the visual representation of the scalar based on the updated scalar.
2. The electronic device of claim 1, wherein the instructions further comprise instructions for:
displaying an additional user interface element using the one or more displays prior to displaying the user interface element; and
obtaining gaze input via a second subset of the one or more sensors, wherein displaying the user interface element includes displaying the user interface element in accordance with a determination that the gaze input targets the additional user interface element.
3. The electronic device of claim 1, wherein the instructions further comprise instructions for:
obtaining gaze input via a second subset of the one or more sensors, wherein updating the scalar for the device setting based on the head pose information comprises updating the scalar for the device setting based on the head pose information in accordance with determining that the gaze input targets the user interface element.
4. The electronic device of claim 3, wherein the instructions further comprise instructions for:
in accordance with a determination that the gaze input is no longer targeting the user interface element, ceasing to display the user interface element and ceasing to update the scalar for the device setting.
5. The electronic device of claim 1, wherein the user interface element comprises:
a horizontal slider bar having an indicator that moves left and right, wherein updating the visual representation of the scalar includes moving the indicator of the horizontal slider bar in accordance with determining that the head pose information indicates a change in yaw angle;
a vertical slider having an indicator that moves up and down, wherein updating the visual representation of the scalar includes moving the indicator of the vertical slider in accordance with determining that the head pose information indicates a change in pitch angle; or
a radial slider having an indicator that moves radially, wherein updating the visual representation of the scalar includes moving the indicator of the radial slider in accordance with determining that the head pose information indicates a change in roll angle.
6. The electronic device of claim 1, wherein the user interface element comprises a slider bar with an indicator.
7. The electronic device of claim 6, wherein updating the visual representation of the scalar comprises, in accordance with a determination that the head pose information indicates head movement in a given direction, moving the indicator of the slider bar in the given direction.
8. The electronic device of claim 1, wherein obtaining the head pose information comprises obtaining a first head pose at a first time when gaze input targets the user interface element and obtaining a second head pose at a second time after the first time, and wherein updating the scalar for the device setting based on the head pose information comprises updating the scalar for the device setting based on a change between the first head pose and the second head pose.
9. A method of operating an electronic device comprising one or more sensors and one or more displays, the method comprising:
displaying, using the one or more displays, a user interface element comprising a visual representation of a scalar of a device setting;
obtaining head pose information via a first subset of the one or more sensors;
updating the scalar for the device setting based on the head pose information; and
updating the visual representation of the scalar based on the updated scalar.
10. The method of claim 9, the method further comprising:
displaying an additional user interface element using the one or more displays prior to displaying the user interface element; and
obtaining gaze input via a second subset of the one or more sensors, wherein displaying the user interface element includes displaying the user interface element in accordance with a determination that the gaze input targets the additional user interface element.
11. The method of claim 9, the method further comprising:
obtaining gaze input via a second subset of the one or more sensors, wherein updating the scalar for the device setting based on the head pose information comprises updating the scalar for the device setting based on the head pose information in accordance with determining that the gaze input targets the user interface element.
12. The method of claim 11, the method further comprising:
in accordance with a determination that the gaze input is no longer targeting the user interface element, ceasing to display the user interface element and ceasing to update the scalar for the device setting.
13. The method of claim 9, wherein the user interface element comprises:
a horizontal slider bar having an indicator that moves left and right, wherein updating the visual representation of the scalar includes moving the indicator of the horizontal slider bar in accordance with determining that the head pose information indicates a change in yaw angle;
a vertical slider having an indicator that moves up and down, wherein updating the visual representation of the scalar includes moving the indicator of the vertical slider in accordance with determining that the head pose information indicates a change in pitch angle; or
a radial slider having an indicator that moves radially, wherein updating the visual representation of the scalar includes moving the indicator of the radial slider in accordance with determining that the head pose information indicates a change in roll angle.
14. The method of claim 9, wherein the user interface element comprises a slider bar having an indicator.
15. The method of claim 14, wherein updating the visual representation of the scalar comprises, in accordance with a determination that the head pose information indicates head movement in a given direction, moving the indicator of the slider bar in the given direction.
16. The method of claim 9, wherein obtaining the head pose information comprises obtaining a first head pose at a first time when gaze input targets the user interface element and obtaining a second head pose at a second time after the first time, and wherein updating the scalar for the device setting based on the head pose information comprises updating the scalar for the device setting based on a change between the first head pose and the second head pose.
17. A non-transitory computer readable storage medium storing one or more programs configured for execution by one or more processors of an electronic device comprising one or more sensors and one or more displays, the one or more programs comprising instructions for:
displaying, using the one or more displays, a user interface element comprising a visual representation of a scalar of a device setting;
obtaining head pose information via a first subset of the one or more sensors;
updating the scalar for the device setting based on the head pose information; and
updating the visual representation of the scalar based on the updated scalar.
18. The non-transitory computer-readable storage medium of claim 17, wherein the instructions further comprise instructions for:
displaying an additional user interface element using the one or more displays prior to displaying the user interface element; and
obtaining gaze input via a second subset of the one or more sensors, wherein displaying the user interface element includes displaying the user interface element in accordance with a determination that the gaze input targets the additional user interface element.
19. The non-transitory computer-readable storage medium of claim 17, wherein the instructions further comprise instructions for:
obtaining gaze input via a second subset of the one or more sensors, wherein updating the scalar for the device setting based on the head pose information comprises updating the scalar for the device setting based on the head pose information in accordance with determining that the gaze input targets the user interface element; and
in accordance with a determination that the gaze input is no longer targeting the user interface element, ceasing to display the user interface element and ceasing to update the scalar for the device setting.
20. The non-transitory computer-readable storage medium of claim 17, wherein the user interface element comprises:
a horizontal slider bar having an indicator that moves left and right, wherein updating the visual representation of the scalar includes moving the indicator of the horizontal slider bar in accordance with determining that the head pose information indicates a change in yaw angle;
a vertical slider having an indicator that moves up and down, wherein updating the visual representation of the scalar includes moving the indicator of the vertical slider in accordance with determining that the head pose information indicates a change in pitch angle; or
a radial slider having an indicator that moves radially, wherein updating the visual representation of the scalar includes moving the indicator of the radial slider in accordance with determining that the head pose information indicates a change in roll angle.
CN202310742816.0A 2022-06-24 2023-06-21 Controlling device settings using head gestures Pending CN117289789A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US63/355,500 2022-06-24
US18/295,772 US20230418371A1 (en) 2022-06-24 2023-04-04 Controlling a Device Setting Using Head Pose
US18/295,772 2023-04-04

Publications (1)

Publication Number Publication Date
CN117289789A true CN117289789A (en) 2023-12-26

Family

ID=89239729

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310742816.0A Pending CN117289789A (en) 2022-06-24 2023-06-21 Controlling device settings using head gestures

Country Status (1)

Country Link
CN (1) CN117289789A (en)

Similar Documents

Publication Publication Date Title
CN107209386B (en) Augmented reality view object follower
US11520456B2 (en) Methods for adjusting and/or controlling immersion associated with user interfaces
KR20230025914A (en) Augmented reality experiences using audio and text captions
KR20230026505A (en) Augmented reality experiences using object manipulation
US9886086B2 (en) Gesture-based reorientation and navigation of a virtual reality (VR) interface
US9035878B1 (en) Input system
US20210405772A1 (en) Augmented reality eyewear 3d painting
US20210303107A1 (en) Devices, methods, and graphical user interfaces for gaze-based navigation
US11720171B2 (en) Methods for navigating user interfaces
JP2019105678A (en) Display device and method to display images
JP2021524971A (en) Displaying physical input devices as virtual objects
US20230336865A1 (en) Device, methods, and graphical user interfaces for capturing and displaying media
CN117916777A (en) Hand-made augmented reality endeavor evidence
WO2022005733A1 (en) Augmented reality eyewear with mood sharing
JPWO2016121883A1 (en) Electronics
CN117289789A (en) Controlling device settings using head gestures
US20230418371A1 (en) Controlling a Device Setting Using Head Pose
CN115812189A (en) Dynamic sensor selection for visual inertial odometer system
US20240036711A1 (en) Controlling an Interface Using Gaze Input
US20180239487A1 (en) Information processing device, information processing method, and program
US20240103712A1 (en) Devices, Methods, and Graphical User Interfaces For Interacting with Three-Dimensional Environments
US20240104849A1 (en) User interfaces that include representations of the environment
US20240103685A1 (en) Methods for controlling and interacting with a three-dimensional environment
CN116648683A (en) Method and system for selecting objects
CN117940964A (en) Hand-made augmented reality experience

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination