CN117193589A - Interface interaction control method, computer device, and computer-readable storage medium


Info

Publication number
CN117193589A
CN117193589A
Authority
CN
China
Prior art keywords
augmented reality
cursor
display
reality content
control method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311170835.7A
Other languages
Chinese (zh)
Inventor
张英健 (Zhang Yingjian)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Meta Bounds Inc
Original Assignee
Meta Bounds Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Meta Bounds Inc
Priority to CN202311170835.7A
Publication of CN117193589A


Abstract

The application discloses an interface interaction control method, a computer device, and a computer-readable storage medium. The method comprises the following steps: when a first trigger instruction is received, moving a cursor in the display area of an augmented reality device to the position of target augmented reality content, wherein a first display interface is displayed in the display area of the augmented reality device and preset augmented reality content is displayed on the first display interface, and wherein the first trigger instruction is a control instruction generated by adjusting the spatial pose of the augmented reality device; and when a second trigger instruction is received, changing the state of the cursor so that the augmented reality device selects the target augmented reality content for a first interactive operation. The method selects the augmented reality content to be interacted with by controlling the pose of the augmented reality device and then completes the subsequent interactive operation; the whole interaction process is convenient and accurate, which improves the user's interaction experience.

Description

Interface interaction control method, computer device, and computer-readable storage medium
Technical Field
The application belongs to the technical field of augmented reality, and in particular relates to an interface interaction control method, a computer device, and a computer-readable storage medium.
Background
Augmented reality (AR) technology is a computer simulation technology for creating and experiencing a virtual world. It uses a computer to generate a simulated environment, a system simulation of multi-source information fusion and of interactive three-dimensional dynamic views and physical behaviors, which immerses users in that environment.
Common interaction methods for augmented reality devices include gesture recognition, gaze tracking, touch control, and voice recognition. Gesture recognition and gaze tracking require accurate capture of the user's gestures and gaze point to ensure correct operation and feedback; if recognition is inaccurate or unstable, misoperations or a degraded user experience may result. Touch-based interaction requires an additional external device, such as a handheld controller or touch panel; because the external device is physically separate from the display interface and the corresponding operation instructions are generated by touching the external device, the resulting experience is mediocre. Voice recognition can cause social embarrassment in public places: speaking voice commands aloud may be noticeable and annoying to others, especially in quiet environments or densely crowded places.
Therefore, the interaction modes of augmented reality devices need to be improved in order to enhance the user experience.
Disclosure of Invention
The technical problem solved by the application is how to improve the interaction mode of an augmented reality device so as to improve the user's interaction experience.
The application discloses an interface interaction control method, which comprises the following steps:
when a first trigger instruction is received, moving a cursor in the display area of the augmented reality device to the position of target augmented reality content, wherein a first display interface is displayed in the display area of the augmented reality device and preset augmented reality content is displayed on the first display interface, and wherein the first trigger instruction is a control instruction generated by adjusting the spatial pose of the augmented reality device; and when a second trigger instruction is received, changing the state of the cursor so that the augmented reality device selects the target augmented reality content for a first interactive operation. Because the augmented reality content is selected simply by controlling the spatial pose of the augmented reality device, the difficulty of interaction is reduced.
Optionally, the second trigger instruction includes: a control instruction generated after a ring receives a touch operation, or a control instruction generated after the ring receives a key operation. Combining the spatial pose change of the augmented reality device with ring operations provides a new interaction mode.
Optionally, the first interactive operation includes: a selection operation and/or an opening operation.
Optionally, moving the cursor in the display area of the augmented reality device to the target augmented reality content position when the first trigger instruction is received further comprises:
when the first trigger instruction is received, moving the cursor in the display area of the augmented reality device to a position at a preset distance from the target augmented reality content; and tracking and capturing the eye gaze position, and moving the cursor when the eye gaze position is at the target augmented reality content position, so that the cursor is adsorbed onto the target augmented reality content. In this way the cursor can be moved to the target augmented reality content more quickly, and the difficulty of adjusting the spatial pose of the augmented reality device is reduced.
Optionally, the tracking and capturing of the eye gaze position, and moving the cursor when the eye gaze position is at the target augmented reality content position, further comprises:
acquiring the duration for which the eye gaze position stays at the target augmented reality content position, judging whether the duration reaches a preset threshold, and if so, moving the cursor according to the eye gaze position, so as to reduce misoperation.
Optionally, the shape of the cursor includes a circle, an arrow, a triangle, a ring, or a pie shape, so as to provide more personalized options.
Optionally, at least one of the display color, display transparency, and display contrast of the cursor differs from the first display interface, so that the user can easily recognize the cursor.
Optionally, changing the state of the cursor when the second trigger instruction is received further includes:
changing at least one of the display shape, display color, display transparency, and display contrast of the cursor to indicate that the augmented reality device has selected the target augmented reality content, thereby reminding the user.
Optionally, the method further comprises: at least one of the display shape, display color, display transparency, and display contrast of the cursor is provided with a parameter range, within which the user can adjust the parameter to set the cursor, so as to meet the needs of different users.
Optionally, the method further includes: when a third trigger instruction is received, displaying a second interaction operation on the target augmented reality content, where the third trigger instruction includes a control instruction generated by a touch operation on a ring or a control instruction generated by a key operation on the ring, and the second interaction operation includes: a drag operation, a rotation operation, and/or a zoom operation. Richer interactive operations are thus realized through the ring.
The application also discloses a computer-readable storage medium storing an interface interaction control program which, when executed by a processor, implements the above interface interaction control method.
The application also discloses a computer device comprising a computer-readable storage medium, a processor, and an interface interaction control program stored in the computer-readable storage medium; when executed by the processor, the interface interaction control program implements the above interface interaction control method.
The interface interaction control method, computer device, and computer-readable storage medium disclosed by the application have the following technical effect:
the method selects the augmented reality content to be interacted with by controlling the pose of the augmented reality device and then completes the subsequent interactive operation; the whole interaction process is convenient and accurate, which improves the user's interaction experience.
Drawings
FIG. 1 is a flow chart of an interface interaction control method according to a first embodiment of the application;
FIG. 2 is a schematic view of the gaze point overlapping display content according to the first embodiment of the application;
FIG. 3 is a schematic view of the gaze point approaching display content according to the first embodiment of the application;
FIG. 4 is a schematic view of the movement process of the gaze point according to the first embodiment of the application;
FIG. 5 is a schematic block diagram of an interface interaction control device according to a second embodiment of the application;
FIG. 6 is a schematic diagram of a computer device according to a fourth embodiment of the application.
Detailed Description
The present application will be described in further detail below with reference to the drawings and embodiments, in order to make its objects, technical solutions, and advantages clearer. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
Before describing the embodiments of the present application in detail, the technical idea of the application is briefly outlined. Owing to their inherent characteristics, the conventional interaction modes of augmented reality devices can suffer from low interaction accuracy, complicated interaction procedures, and interaction processes that disturb other people. Therefore, the interface interaction control method provided by the application adjusts the position of a cursor by adjusting the spatial pose of the augmented reality device so as to determine the target augmented reality content, and then performs further interactive operations on the target augmented reality content in combination with the corresponding interaction instructions. The position of the cursor here may represent the position of the user's gaze point, and the cursor also acts as the carrier of the user's interaction with the augmented reality device. The method determines the augmented reality content to be interacted with by controlling the pose of the augmented reality device and then completes the subsequent interactive operation; the whole interaction process is convenient and accurate, which improves the user's interaction experience. The interface interaction control method of the present application is described below through further embodiments.
Specifically, as shown in fig. 1, the interface interaction control method in the first embodiment includes:
and step S10, when a first trigger instruction is received, moving a cursor in a display area of the augmented reality equipment to a target augmented reality content position, wherein a first display interface is displayed in the display area of the augmented reality equipment, and the first display interface is displayed with preset augmented reality content, wherein the first trigger instruction is a control instruction for adjusting the spatial pose of the augmented reality equipment.
Step S20: when a second trigger instruction is received, change the state of the cursor so that the augmented reality device selects the target augmented reality content for a first interactive operation.
Specifically, in step S10, the spatial pose of the augmented reality device may be detected by an inertial sensor of the augmented reality device. The specific method includes: acquiring a reference pose of the augmented reality device; and detecting the current rotational pose and/or the current translational pose of the augmented reality device relative to the reference pose, and taking the current rotational pose and/or current translational pose as the spatial pose.
In one implementation, the reference pose of the first embodiment is the spatial pose of the augmented reality device when it starts operating; for example, when the user begins wearing the augmented reality device with the head facing straight ahead, the spatial pose of the device at that moment is recorded as the reference pose. In another implementation, the reference pose is the spatial pose of the augmented reality device at any time before the current moment during use; for example, the spatial pose at a preceding moment, such as one second, one minute, or one hour before the current moment, may be used as the reference pose. The spatial pose may be the position coordinates of the augmented reality device, or its rotation vector and/or translation vector; for the reference pose, the corresponding position coordinates may be set to 0 or to predetermined coordinate values, or the corresponding rotation vector and/or translation vector may be set to 0 or to predetermined values. The position coordinates, rotation vector, and translation vector of the augmented reality device can be detected by inertial sensors (a gyroscope, a displacement sensor, and the like) carried by the device.
Further, at the current moment, the current rotational pose and/or the current translational pose of the augmented reality device detected relative to the reference pose is taken as the spatial pose; that is, the spatial pose may be determined by both rotation and translation, or by either one of them. The current rotational pose comprises the rotation angle and/or the rotation direction of the augmented reality device relative to the reference pose, and the current translational pose comprises the translation distance and/or the translation direction of the augmented reality device relative to the reference pose. The current rotational pose may be determined by both the rotation angle and the rotation direction, or by either of the two; likewise, the current translational pose may be determined by the translation distance, the translation direction, or both. Taking the current rotational pose as an example, the determining factor depends on the layout of the display interface. When the display contents of the display interface are distributed sequentially along the same direction, different contents can be displayed or selected by controlling the rotation angle of the augmented reality device; when the display contents are distributed in different directions, different contents can be displayed or selected by controlling the rotation direction. For a more complex layout, for example with display contents distributed in all directions and several contents arranged sequentially in some of those directions, the rotation angle and the rotation direction must be controlled simultaneously to display or select different contents. Similarly, the determining factor for the current translational pose also depends on the layout of the display interface. For a display interface with a more complex layout, the current rotational pose and the current translational pose can be considered together so as to select or display the corresponding augmented reality content more accurately.
In the first embodiment, the position of the cursor can be conveniently adjusted by binding the movement of the cursor in the display area of the augmented reality device to the adjustment of the spatial pose. Illustratively, the center of the display area of the augmented reality device in the current spatial pose is taken as the position of the cursor, which ensures that the cursor always stays at the center of the user's field of view.
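Below is a minimal illustrative sketch of this pose-to-cursor binding. None of it comes from the patent itself: the function name, the field-of-view figures, and the resolution are assumptions chosen only to make the mapping concrete.

```python
# Sketch: map the device's rotation relative to the reference pose (read from
# the IMU) to a cursor position in the display area. Zero rotation keeps the
# cursor at the center of the user's field of view, as described above.

FOV_H_DEG, FOV_V_DEG = 40.0, 30.0    # assumed horizontal/vertical field of view
RES_W, RES_H = 1920, 1080            # assumed display-area resolution

def cursor_from_pose(yaw_deg: float, pitch_deg: float) -> tuple[int, int]:
    """Map a rotation relative to the reference pose to cursor pixel coordinates."""
    # Clamp the rotation to the visible field of view.
    yaw = max(-FOV_H_DEG / 2, min(FOV_H_DEG / 2, yaw_deg))
    pitch = max(-FOV_V_DEG / 2, min(FOV_V_DEG / 2, pitch_deg))
    x = RES_W / 2 + (yaw / FOV_H_DEG) * RES_W
    y = RES_H / 2 - (pitch / FOV_V_DEG) * RES_H
    return round(x), round(y)

reference_yaw, reference_pitch = 0.0, 0.0   # recorded from the IMU at startup
current_yaw, current_pitch = 5.0, -2.0      # example IMU reading
print(cursor_from_pose(current_yaw - reference_yaw,
                       current_pitch - reference_pitch))   # (1200, 612)
```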
Further, the preset augmented reality content displayed on the first display interface may be divided according to function type. As shown in FIG. 2, the first display interface is divided into four display areas D1, D2, D3, and D4, each containing several different items of augmented reality content A. Each display area may be a different functional area, for example an application area D4, a widget area D3, a notification bar area D1, and a dock bar area D2. Here, the cursor F is located in the application area D4 and overlaps one of the applications, and that application is the target augmented reality content.
In one embodiment, in order to make the cursor easy to identify, the shape and display mode of the cursor may be designed to distinguish it from the first display interface. Illustratively, the shape of the cursor includes a circle, an arrow, a triangle, a ring, a pie shape, and the like. As for the display mode, at least one of the display color, display transparency, and display contrast of the cursor may be made different from the first display interface; for example, the cursor may be represented by a red circle. At least one of the display shape, display color, display transparency, and display contrast of the cursor is provided with a parameter range within which the user can adjust the parameter to set the cursor. For example, the parameter range of the display transparency may be set to 50% to 80%, and the display transparency may then be set within this range as needed.
Further, in step S20, when the second trigger instruction is received, the way of changing the state of the cursor includes changing at least one of the display shape, display color, display transparency, and display contrast of the cursor to indicate that the augmented reality device has selected the target augmented reality content, thereby reminding the user. For example, after the second trigger instruction is received, the display transparency of the cursor may be adjusted from 80% to 50% to indicate that the target augmented reality content at the cursor position has been selected.
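As a concrete illustration of the cursor parameters and the state change, here is a small sketch. The class and its defaults are assumptions; only the shape list, the red-circle example, the 50%-80% transparency range, and the 80%-to-50% selection change are taken from the text above.

```python
from dataclasses import dataclass

@dataclass
class CursorStyle:
    shape: str = "circle"          # circle, arrow, triangle, ring, or pie
    color: str = "#FF0000"         # the red-circle example from the text
    transparency: float = 0.80     # user-adjustable within [0.50, 0.80]

    def set_transparency(self, value: float) -> None:
        # Clamp user adjustments to the configured parameter range.
        self.transparency = min(max(value, 0.50), 0.80)

    def on_target_selected(self) -> None:
        # Second trigger instruction received: change at least one display
        # attribute to signal that the target content is now selected.
        self.set_transparency(0.50)

cursor = CursorStyle()
cursor.on_target_selected()
print(cursor.transparency)   # 0.5 -> selected state
```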
In another embodiment, the different display modes may also include different display distances. Illustratively, the display distance refers to the distance of the augmented reality content from the user, which may also be understood as the distance at which the augmented reality content is imaged. In one embodiment, after the target augmented reality content is selected, it moves elastically a predetermined distance toward the user so as to be distinguished from the other display contents.
In other embodiments, when an item of display content becomes the target augmented reality content, a corresponding feedback instruction may be generated, and a corresponding feedback operation may be performed according to that instruction to prompt the user; for example, the augmented reality device may vibrate or emit speech.
Illustratively, in step S10, whether the cursor has moved to the target augmented reality content position may be determined in either of two ways.
The first way is precise movement: when the cursor is moved to overlap the target augmented reality content, it may be determined that the cursor has moved to the target augmented reality content position. Specifically, several items of augmented reality content are displayed on the first display interface; after the spatial pose of the augmented reality device is adjusted, the current position of the cursor changes, and when the cursor intersects an item of augmented reality content, that item can be taken as the target augmented reality content and the cursor is determined to have moved to the target augmented reality content position.
The second way is approach-and-adsorb: when the distance between the cursor and the target augmented reality content is smaller than a preset value, the cursor is adsorbed onto the target augmented reality content. Specifically, several items of augmented reality content are displayed on the first display interface; after the spatial pose of the augmented reality device is adjusted, the current position of the cursor changes. When the cursor approaches an item of augmented reality content, the user can be considered to intend to select that item, so when the distance between the cursor and the content is smaller than a preset value, the content can be taken as the target augmented reality content and the cursor can be adsorbed onto it. The preset value is set according to actual requirements. As shown in FIG. 3, the cursor F is located in the application area D4 and is close to the application at the center of that area; when the distance between the cursor F and the application is smaller than the preset value, the application can be taken as the target augmented reality content and the cursor F is adsorbed onto it. On the one hand, the adsorption process prompts the user to confirm the corresponding target augmented reality content; on the other hand, the cursor being adsorbed onto the target augmented reality content facilitates further operations on it. Illustratively, when the target augmented reality content is a clickable button and the cursor approaches it, the cursor automatically adsorbs to the button so that the user can easily select and click it. When the target augmented reality content is text and the cursor approaches it, the cursor automatically adsorbs to the characters or words of the text, which makes it easier to edit the document, select text, or position the cursor. When the target augmented reality content is a functional area and the cursor approaches it, the cursor is adsorbed to the edge of the functional area so that the cursor is not lost. When the cursor is located in the gap between functional areas, a judgment is made according to the pixel midpoint of the gap as to which functional area the current cursor position is closer to, and the cursor is adsorbed to the edge of that functional area, again preventing the cursor from being lost.
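The approach-and-adsorb rule can be sketched as follows. The content records, coordinates, and the 40-pixel threshold are invented for illustration; the patent only specifies that the preset value is set according to actual requirements.

```python
import math

SNAP_DISTANCE = 40.0   # the "preset value", in pixels; an assumed figure

contents = [
    {"id": "app_gallery", "center": (400.0, 300.0)},
    {"id": "app_browser", "center": (800.0, 300.0)},
]

def snap_cursor(cursor_xy: tuple[float, float]):
    """Return (cursor_position, target_content): snapped if close enough."""
    best, best_dist = None, float("inf")
    for content in contents:
        dist = math.dist(cursor_xy, content["center"])
        if dist < best_dist:
            best, best_dist = content, dist
    if best is not None and best_dist < SNAP_DISTANCE:
        # The user is presumed to intend selecting this content: adsorb.
        return best["center"], best
    return cursor_xy, None

print(snap_cursor((430.0, 310.0)))   # within range: snaps to app_gallery
print(snap_cursor((600.0, 300.0)))   # too far from both: no snap
```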
Both of the above ways realize cursor movement solely by adjusting the spatial pose of the augmented reality device, so the spatial pose must be adjusted precisely to move the cursor onto the target augmented reality content. In another embodiment, the cursor may be moved by combining the spatial pose with the eye gaze position; that is, moving the cursor in the display area of the augmented reality device to the target augmented reality content position when the first trigger instruction is received in step S10 includes the following steps:
step S11, when a first trigger instruction is received, moving a cursor in a display area of the augmented reality equipment to a position at a preset distance from target augmented reality content;
step S12, tracking and capturing the eye gazing position, and moving the cursor when the eye gazing position is the target augmented reality content position, so that the cursor is adsorbed to the target augmented reality content position.
Eye movement may be tracked by an eye-tracking device on the augmented reality device, the gaze direction determined, and the corresponding eye gaze position in the display area determined from the gaze direction. When the eye gaze position overlaps the position of the target augmented reality content, the cursor only needs to be at the preset distance from the target augmented reality content to be automatically adsorbed onto it. On the one hand, the cursor can be moved onto the target augmented reality content more quickly; on the other hand, the difficulty of adjusting the spatial pose of the augmented reality device is reduced: the cursor need not be moved precisely onto the target augmented reality content, because at that point it can be moved using the eye gaze position without further adjusting the spatial pose.
Further, the duration for which the eye gaze position stays at the target augmented reality content position is acquired, and whether the duration reaches a preset threshold is judged; if so, the cursor is moved according to the eye gaze position. That is, in order to reduce misoperation, the time the gaze dwells on the target augmented reality content must be measured, and only when it reaches the preset threshold can it be determined that the user really wants to interact with the target augmented reality content. The preset threshold may be set as needed, for example to 500 milliseconds.
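A sketch of this dwell check is given below. The polling tracker interface is an assumption (the patent does not specify an eye-tracking API); only the 500-millisecond threshold comes from the text.

```python
import time

DWELL_THRESHOLD_S = 0.5   # the 500 ms preset threshold from the example above

def wait_for_dwell(read_gaze_target, target_id: str, timeout_s: float = 5.0) -> bool:
    """Return True once the gaze has stayed on `target_id` for the threshold."""
    dwell_start = None
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        gazed = read_gaze_target()           # id of content under the gaze, or None
        if gazed == target_id:
            if dwell_start is None:
                dwell_start = time.monotonic()   # gaze arrived: start timing
            elif time.monotonic() - dwell_start >= DWELL_THRESHOLD_S:
                return True                  # dwell reached: move the cursor now
        else:
            dwell_start = None               # gaze left the target: reset the timer
        time.sleep(0.02)                     # poll the tracker at ~50 Hz
    return False                             # gaze never settled on the target

# Example with a stand-in tracker that always reports the target.
print(wait_for_dwell(lambda: "app_gallery", "app_gallery"))   # True after ~0.5 s
```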
In step S20, the second trigger instruction includes: a control instruction generated after the ring receives a touch operation, or a control instruction generated after the ring receives a key operation. A ring is a device worn on a finger, typically equipped with an IMU gyroscope sensor and physical buttons, which captures the movements and gestures of the finger according to gyroscope motion algorithms. The buttons on the ring may be used for key operations, for example to confirm a selection or to perform other specific functions, and the touch area on the ring can generate touch operations. The first interactive operation produced by the second trigger instruction includes: a selection operation and/or an opening operation.
Further, the interface interaction control method comprises:
Step S30: when a third trigger instruction is received, display a second interaction operation on the target augmented reality content, where the third trigger instruction includes a control instruction generated by a touch operation on the ring or a control instruction generated by a key operation on the ring, and the second interaction operation includes: a drag operation, a rotation operation, and/or a zoom operation.
For example, when the user's finger wears the ring, the motion and gesture of the finger can be captured according to the gyroscope motion algorithm, thereby generating a third trigger instruction; with the generated third trigger instruction, the target augmented reality content can be dragged by moving the ring.
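The drag operation might look like the following sketch: while the third trigger instruction is active (here assumed to be a held ring button), finger motion reported by the ring's IMU is applied to the position of the selected content. The gain factor and event shapes are assumptions.

```python
DRAG_GAIN = 3.0   # assumed scaling: display pixels per unit of ring motion

def apply_drag(content_pos: tuple[float, float],
               ring_delta: tuple[float, float],
               button_held: bool) -> tuple[float, float]:
    """Translate the selected content by the ring's motion delta."""
    if not button_held:
        return content_pos            # no third trigger: content stays put
    dx, dy = ring_delta               # finger displacement from the ring IMU
    return (content_pos[0] + DRAG_GAIN * dx,
            content_pos[1] + DRAG_GAIN * dy)

pos = (500.0, 400.0)
pos = apply_drag(pos, (2.0, -1.0), button_held=True)
print(pos)   # (506.0, 397.0)
```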
Further, the interface interaction control method of the first embodiment also includes: tracking the process of spatial pose change of the augmented reality device; and moving the cursor according to this process and displaying a guide identifier, which prompts the user about the cursor's movement. For example, when the user wants to select other display content far from the current target augmented reality content, the current spatial pose of the augmented reality device must be adjusted considerably. In this case the spatial poses at several consecutive moments during the adjustment can be detected in real time, and according to them the cursor is moved and a guide identifier is displayed in the display interface, so that the user can watch the cursor's movement in real time and ensure that it reaches the intended display content. The guide identifier may be a trajectory line or a guide arrow. As shown in FIG. 4, the cursor F is located in the application area D4, and the application at the upper-left corner of D4 is the target augmented reality content. When the user wants to interact with content in the dock bar area D2 after finishing the interaction with that application, the current spatial pose has to change considerably; the movement track of the cursor F can be shown by a dashed arrow, which indicates the current position and movement direction of the cursor in real time, improving the accuracy of cursor movement.
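One way to realize the guide identifier, sketched under assumed data structures: consecutive cursor positions sampled during the pose adjustment are kept as a short trail, from which a renderer could draw the dashed trajectory line or guide arrow.

```python
from collections import deque

TRAIL_LENGTH = 20   # number of recent samples kept; an assumed figure

trail: deque = deque(maxlen=TRAIL_LENGTH)

def on_pose_sample(cursor_xy: tuple[int, int]) -> None:
    """Record the cursor position for the current spatial-pose sample."""
    trail.append(cursor_xy)

def guide_direction() -> tuple[int, int] | None:
    """Vector from oldest to newest sample: the arrow direction to render."""
    if len(trail) < 2:
        return None
    (x0, y0), (x1, y1) = trail[0], trail[-1]
    return (x1 - x0, y1 - y0)

for x in range(0, 200, 20):            # simulate a sweep across the display
    on_pose_sample((x, x // 2))
print(guide_direction())               # (180, 90): movement down and right
```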
The interface interaction control method of the first embodiment further includes: acquiring a switching instruction and determining new target augmented reality content according to the switching instruction. Specifically, after the interactive operation on the current target augmented reality content is completed, the spatial pose of the augmented reality device can be adjusted further to move the cursor and select new target augmented reality content; alternatively, the spatial pose can be kept unchanged and a switching instruction generated in another way to determine the new target augmented reality content. For example, the switching instruction may be generated by an interactive ring communicatively connected to the augmented reality device and/or by a touch area of the augmented reality device; different target augmented reality content may be selected, for instance, by a sliding motion of the finger wearing the ring.
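A sketch of such a switching instruction, with assumed event names: a swipe on the ring's touch area steps the selection through the displayed contents while the headset's spatial pose stays unchanged.

```python
contents = ["notification_bar", "widget_area", "app_gallery", "dock_bar"]

def switch_target(current_index: int, swipe_direction: str) -> int:
    """Return the index of the new target augmented reality content."""
    step = 1 if swipe_direction == "swipe_right" else -1
    return (current_index + step) % len(contents)

idx = 2                                  # currently on app_gallery
idx = switch_target(idx, "swipe_right")
print(contents[idx])                     # dock_bar
```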
As shown in FIG. 5, the second embodiment discloses an interface interaction control device comprising a movement module 100 and an interaction module 200. The movement module 100 is configured to, when a first trigger instruction is received, move a cursor in the display area of the augmented reality device to the position of target augmented reality content, where a first display interface is displayed in the display area of the augmented reality device and preset augmented reality content is displayed on the first display interface, and where the first trigger instruction is a control instruction generated by adjusting the spatial pose of the augmented reality device. The interaction module 200 is configured to change the state of the cursor when a second trigger instruction is received, so that the augmented reality device selects the target augmented reality content for a first interactive operation.
The second trigger instruction includes: a control instruction generated after the ring receives a touch operation, or a control instruction generated after the ring receives a key operation; the first interactive operation includes: a selection operation and/or an opening operation.
Further, the movement module 100 is also configured to, when the first trigger instruction is received, move the cursor in the display area of the augmented reality device to a position at a preset distance from the target augmented reality content, and further to track and capture the eye gaze position and move the cursor when the eye gaze position is at the target augmented reality content position, so that the cursor is adsorbed onto the target augmented reality content position.
Further, the movement module 100 is also configured to acquire the duration for which the eye gaze position stays at the target augmented reality content position, judge whether the duration reaches a preset threshold, and if so, move the cursor according to the eye gaze position.
The shape of the cursor includes a circle, an arrow, a triangle, a ring, or a pie shape, and at least one of the display color, display transparency, and display contrast of the cursor differs from the first display interface. At least one of the display shape, display color, display transparency, and display contrast of the cursor is provided with a parameter range within which the user can adjust the parameter to set the cursor.
Further, for the interaction module 200, changing the state of the cursor when the second trigger instruction is received further includes: changing at least one of the display shape, display color, display transparency, and display contrast of the cursor to indicate that the augmented reality device has selected the target augmented reality content.
Further, the interaction module 200 is also configured to display a second interaction operation on the target augmented reality content when a third trigger instruction is received, where the third trigger instruction includes a control instruction generated by a touch operation on the ring or a control instruction generated by a key operation on the ring, and the second interaction operation includes: a drag operation, a rotation operation, and/or a zoom operation.
The third embodiment also discloses a computer-readable storage medium storing an interface interaction control program; when executed by a processor, the program implements the interface interaction control method of the first embodiment.
The fourth embodiment also discloses a computer device which, at the hardware level, as shown in FIG. 6, includes a processor 12, an internal bus 13, a network interface 14, and a computer-readable storage medium 11. The processor 12 reads the corresponding computer program from the computer-readable storage medium and runs it, forming the request processing apparatus at the logical level. Of course, besides software implementations, one or more embodiments of the present disclosure do not exclude other implementations, such as logic devices or combinations of software and hardware; that is, the executing subject of the processing flow is not limited to logical units but may also be hardware or logic devices. The computer-readable storage medium 11 stores an interface interaction control program which, when executed by a processor, implements the interface interaction control method described above.
Computer-readable storage media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer-readable storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage, quantum memory, graphene-based storage, or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The computer device described above may be an AR device, for example augmented reality glasses.
In the example of augmented reality glasses, the computer device may be configured to communicate data to and receive data from an external processing device through a signal connection, which may be a wired connection, a wireless connection, or a combination thereof. In other cases, however, the computer device may be used as a stand-alone device, i.e., the data processing is performed on the computer device itself. The signal connection may be configured to carry any kind of data, such as image data (e.g., still images and/or full-motion video, including 2D and 3D images), audio, multimedia, voice, and/or any other type of data. The external processing device may be, for example, a game console, personal computer, tablet computer, smartphone, or another type of processing device. The signal connection may be, for example, a Universal Serial Bus (USB) connection, a Wi-Fi connection, a Bluetooth or Bluetooth Low Energy (BLE) connection, an Ethernet connection, a cable connection, a DSL connection, a cellular connection (e.g., 3G, LTE/4G, or 5G), or the like, or a combination thereof. Additionally, the external processing device may communicate with one or more other external processing devices via a network, which may be or include, for example, a local area network (LAN), a wide area network (WAN), an intranet, a metropolitan area network (MAN), the global Internet, or a combination thereof.
While certain embodiments have been shown and described, those skilled in the art will appreciate that changes and modifications may be made without departing from the principles and spirit of the application, the scope of which is defined by the claims and their equivalents.

Claims (12)

1. An interface interaction control method, characterized by comprising the following steps:
when a first trigger instruction is received, moving a cursor in the display area of the augmented reality device to the position of target augmented reality content, wherein a first display interface is displayed in the display area of the augmented reality device and preset augmented reality content is displayed on the first display interface, and wherein the first trigger instruction is a control instruction generated by adjusting the spatial pose of the augmented reality device;
and when a second trigger instruction is received, changing the state of the cursor so that the augmented reality device selects the target augmented reality content for a first interactive operation.
2. The interface interaction control method according to claim 1, wherein the second trigger instruction includes: a control instruction generated after a ring receives a touch operation, or a control instruction generated after the ring receives a key operation.
3. The interface interaction control method according to claim 1, wherein the first interactive operation includes: a selection operation and/or an opening operation.
4. The interface interaction control method according to any one of claims 1 to 3, wherein moving the cursor within the display area of the augmented reality device to the target augmented reality content position when the first trigger instruction is received further comprises:
when the first trigger instruction is received, moving the cursor in the display area of the augmented reality device to a position at a preset distance from the target augmented reality content;
and tracking and capturing the eye gaze position, and moving the cursor when the eye gaze position is at the target augmented reality content position, so that the cursor is adsorbed onto the target augmented reality content position.
5. The interface interaction control method according to claim 4, wherein the tracking and capturing of the eye gaze position, and moving the cursor when the eye gaze position is at the target augmented reality content position, further comprises:
acquiring the duration for which the eye gaze position stays at the target augmented reality content position, judging whether the duration reaches a preset threshold, and if so, moving the cursor according to the eye gaze position.
6. The interface interaction control method according to claim 1, wherein the shape of the cursor comprises a circle, an arrow, a triangle, a ring, or a pie shape.
7. The interface interaction control method according to claim 6, wherein at least one of a display color, a display transparency, and a display contrast of the cursor is different from the first display interface.
8. The interface interaction control method according to claim 7, wherein changing the state of the cursor when the second trigger instruction is received further comprises:
changing at least one of the display shape, display color, display transparency, and display contrast of the cursor to indicate that the augmented reality device has selected the target augmented reality content.
9. The interface interaction control method according to claim 7, further comprising: providing at least one of the display shape, display color, display transparency, and display contrast of the cursor with a parameter range, within which the user can adjust the parameter to set the cursor during use.
10. The interface interaction control method according to claim 1, further comprising: when a third trigger instruction is received, displaying a second interaction operation on the target augmented reality content, wherein the third trigger instruction includes a control instruction generated by a touch operation on a ring or a control instruction generated by a key operation on the ring, and the second interaction operation includes: a drag operation, a rotation operation, and/or a zoom operation.
11. A computer-readable storage medium, characterized in that the computer-readable storage medium stores an interface interaction control program which, when executed by a processor, implements the interface interaction control method according to any one of claims 1 to 10.
12. A computer device comprising a computer readable storage medium, a processor and an interface interaction control program stored in the computer readable storage medium, which when executed by the processor implements the interface interaction control method of any of claims 1 to 10.
CN202311170835.7A 2023-09-11 2023-09-11 Interface interaction control method, computer device, and computer-readable storage medium Pending CN117193589A

Priority Applications (1)

Application number: CN202311170835.7A
Priority date: 2023-09-11
Filing date: 2023-09-11
Title: Interface interaction control method, computer device, and computer-readable storage medium


Publications (1)

Publication number: CN117193589A
Publication date: 2023-12-08

Family

ID=88986534



Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination