CN117193588A - Interface interaction control method, computer device and storage medium


Info

Publication number
CN117193588A
Authority
CN
China
Prior art keywords
display
interface
augmented reality device
target
Prior art date
2023-09-11
Legal status
Pending
Application number
CN202311163392.9A
Other languages
Chinese (zh)
Inventor
张英健 (Zhang Yingjian)
Current Assignee
Meta Bounds Inc
Original Assignee
Meta Bounds Inc
Priority date
2023-09-11
Filing date
2023-09-11
Publication date
2023-12-08
Application filed by Meta Bounds Inc filed Critical Meta Bounds Inc
Priority to CN202311163392.9A (2023-09-11)
Publication of CN117193588A (2023-12-08)
Legal status: Pending


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an interface interaction control method, a computer device, and a storage medium. The method comprises the following steps: detecting the spatial pose of an augmented reality device, determining the spatial orientation of the augmented reality device relative to an initial spatial position according to the spatial pose, and moving the display area of the augmented reality device to a target display interface according to the spatial orientation. The augmented reality device comprises a main display interface associated with the initial spatial position; different secondary display interfaces are correspondingly distributed in different spatial orientations of the main display interface, and the target display interface is one of the secondary display interfaces. By controlling the display area to move with the spatial pose, so that different display interfaces are selected by the display area, the method breaks through the limitation of the display area's size and expands the display range; at the same time, the display area shows the content the user is attending to or wants to interact with, improving the user's viewing and interaction experience.

Description

Interface interaction control method, computer device and storage medium
Technical Field
The application belongs to the technical field of augmented reality, and particularly relates to an interface interaction control method, a computer device, and a computer-readable storage medium.
Background
Augmented reality (AR) technology is a computer simulation technology for creating and experiencing a virtual world: a computer generates a simulated environment, a system simulation of multi-source information fusion and of interactive three-dimensional dynamic views and physical behaviors, which immerses users in that environment.
Limited by the field of view (FOV), the display interface of current AR devices is small, and a small display interface can show little content; for example, an application the user wants to interact with may not be displayed, so the display content must be switched on the same display interface to reach the desired content. Unlike a mobile phone or similar terminal, however, an AR device has no display screen that can be touched and must rely on other interaction logic, such as complex gesture or touch interaction operations. The resulting interaction and display logic is complex, heavily taxes the user's memory, and degrades the interaction experience.
Disclosure of Invention
The application addresses the following technical problem: how to overcome the poor interaction experience caused by the limited display interface area of an augmented reality device.
The application discloses an interface interaction control method, which comprises the following steps:
detecting the spatial pose of an augmented reality device, determining the spatial orientation of the augmented reality device relative to an initial spatial position according to the spatial pose, and moving the display area of the augmented reality device to a target display interface according to the spatial orientation; the augmented reality device comprises a main display interface associated with the initial spatial position, different secondary display interfaces are correspondingly distributed in different spatial orientations of the main display interface, and the target display interface is one of the secondary display interfaces. Different display interfaces are thus selected by controlling the spatial pose of the augmented reality device, which improves the interaction experience.
Optionally, the main display interface carries first display information, and the importance level of the first display information is greater than the importance level of the display information carried by the other display interfaces, so that important information sits on the main display interface, where the user can conveniently find and select it.
Optionally, different secondary display interfaces correspondingly carry different display information, and the display information is assigned different importance levels according to the spatial orientation of each secondary display interface relative to the main display interface, so as to provide more personalized choices.
Optionally, a display cursor is provided in the display area of the augmented reality device, and moving the display area of the augmented reality device to the target display interface according to the spatial orientation further comprises:
synchronously moving the display cursor to the target display interface according to the spatial orientation, so that the display cursor, which serves as the interaction carrier, is not lost.
Optionally, after the spatial orientation of the augmented reality device relative to the initial spatial position is determined according to the spatial pose, and before the display area of the augmented reality device is moved to the target display interface according to the spatial orientation, the method further comprises activating the target display interface distributed in that spatial orientation according to the spatial orientation, thereby guiding the user to adjust the spatial pose of the augmented reality device so that the display area moves to the target display interface.
Optionally, after the spatial orientation of the augmented reality device relative to the initial spatial position is determined according to the spatial pose, and before the display cursor is synchronously moved to the target display interface according to the spatial orientation, the method further comprises activating the display cursor so that the user can identify it.
Optionally, activating the display cursor comprises adjusting at least one of the display shape, display color, display transparency, and display contrast of the display cursor, so that different ways of adjusting the display cursor are available.
Optionally, activating the display cursor comprises adjusting the display transparency and display contrast of the display cursor, wherein the display transparency is graded from 80% to 100% and the display contrast is graded from 80% to 100%.
Optionally, synchronously moving the display cursor to the target display interface according to the spatial orientation further comprises:
adsorbing the display cursor onto the target display interface when the display cursor, moving synchronously according to the spatial orientation, comes within a preset distance of the target display interface. On the one hand, the adsorption process prompts the user to confirm the corresponding target display interface; on the other hand, once the display cursor is adsorbed onto the target display interface, further interactive operations on the target display interface become easier.
Optionally, different display interfaces are correspondingly provided with preset display cursors, and each display cursor is the interaction carrier of its display interface.
Optionally, moving the display area of the augmented reality device to the target display interface according to the spatial orientation further comprises:
synchronously switching to and displaying the display cursor corresponding to the target display interface, so that the cursor is switched promptly after the target display interface is selected and the user can recognize it in time.
Optionally, the method further comprises:
the initial display position of the display cursor corresponding to the target display interface is related to the spatial orientation of the target display interface relative to the main display interface, so that a display cursor is configured for each display interface.
Optionally, after moving the display area of the augmented reality device to the target display interface according to the spatial orientation, the method further comprises:
when a first trigger instruction is received, changing the state of the display cursor so that the augmented reality device selects target augmented reality content in the target display interface for a first interactive operation, thereby prompting the user about the progress of the interaction.
Optionally, the first trigger instruction comprises: a control instruction generated after a finger ring receives a touch operation, a control instruction generated after the finger ring receives a key operation, a control instruction generated after the augmented reality device receives a touch operation, and/or a control instruction generated after the augmented reality device receives a key operation.
The application also discloses a computer-readable storage medium storing an interface interaction control program; when executed by a processor, the interface interaction control program implements the interface interaction control method described above.
The application also discloses a computer device comprising a computer-readable storage medium, a processor, and an interface interaction control program stored in the computer-readable storage medium; when executed by the processor, the program implements the interface interaction control method described above.
The interface interaction control method, computer device, and storage medium disclosed in the application have the following technical effects:
by controlling the display area to move with the spatial pose, so that different display interfaces are selected by the display area, the method breaks through the limitation of the display area's size and expands the display range; at the same time, the display area shows the content the user is attending to or wants to interact with, improving the user's viewing and interaction experience.
Drawings
FIG. 1 is a flow chart of an interface interaction control method according to a first embodiment of the application;
FIG. 2 is a schematic diagram of a main display interface and secondary display interfaces according to the first embodiment of the present application;
FIG. 3 is a schematic diagram of a display area selecting the main display interface according to the first embodiment of the present application;
FIG. 4 is a schematic diagram of a display area selecting a secondary display interface according to the first embodiment of the present application;
FIG. 5 is a schematic diagram of a display area selecting another secondary display interface according to the first embodiment of the present application;
FIG. 6 is a schematic block diagram of an interface interaction control device according to a second embodiment of the present application;
FIG. 7 is a schematic diagram of a computer device according to a fourth embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
Before the embodiments of the present application are described in detail, the technical idea of the application is briefly outlined. Current augmented reality devices such as AR glasses have a small display area because of the limited field of view, so they can show little content, and the viewing and interaction experience is poor. The interface interaction control method provided by the application adjusts the interaction area by adjusting the spatial pose of the augmented reality device: the interaction area changes with the spatial pose, and each interaction area displays its configured augmented reality content, so content is displayed according to the user's needs. This breaks through the limitation of the display area's size and expands the display range; at the same time, the interaction area shows the content the user is attending to or wants to interact with, which improves the viewing and interaction experience. The interface interaction control method of the present application is described below through further embodiments.
Specifically, as shown in fig. 1, the interface interaction control method in the first embodiment includes:
Step S10: detecting the spatial pose of the augmented reality device, determining the spatial orientation of the augmented reality device relative to the initial spatial position according to the spatial pose, and moving the display area of the augmented reality device to a target display interface according to the spatial orientation, wherein the augmented reality device comprises a main display interface associated with the initial spatial position, different secondary display interfaces are correspondingly distributed in different spatial orientations of the main display interface, and the target display interface is one of the secondary display interfaces.
Illustratively, an augmented reality device typically carries inertial sensors, such as a gyroscope for detecting the angular velocity and rotation angle of the device and a displacement sensor, and the spatial pose is detected on the basis of these inertial sensors.
Specifically, in step S10, detecting the spatial pose of the augmented reality device comprises: acquiring a reference pose of the augmented reality device; and detecting a rotation pose and/or a translation pose of the augmented reality device relative to the reference pose, and taking the rotation pose and/or the translation pose as the spatial pose.
In one implementation, the reference pose of the first embodiment is the spatial pose of the augmented reality device when it starts operating; for example, when the user first wears the augmented reality device with the head facing straight ahead, the spatial pose of the device at that moment is recorded as the reference pose. In another implementation, the reference pose is the spatial pose of the augmented reality device at any time before the current time during use; for example, the spatial pose one second, one minute, or one hour before the current time may serve as the reference pose. The spatial pose may be the position coordinates of the augmented reality device, or its rotation vector and/or translation vector; for the reference pose, the corresponding position coordinates may be set to 0 or to predetermined coordinate values, or the corresponding rotation vector and/or translation vector may be set to 0 or to predetermined values. The position coordinates, rotation vector, and translation vector of the augmented reality device can be detected by its on-board inertial sensors (gyroscope, displacement sensor, and the like).
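The bookkeeping just described can be illustrated with a short sketch. The following Python code is a minimal illustration under stated assumptions, not the patent's implementation: the readings passed in stand for whatever the device's gyroscope and displacement sensor actually report, and all names are hypothetical.

```python
import numpy as np

class PoseTracker:
    """Tracks the spatial pose of an AR device relative to a reference pose."""

    def __init__(self):
        # For the reference pose, rotation and translation vectors are set to 0
        # (or, equally, to predetermined values), as the text describes.
        self.ref_rotation = np.zeros(3)     # reference rotation vector (degrees)
        self.ref_translation = np.zeros(3)  # reference translation vector (meters)

    def set_reference(self, rotation_deg, translation_m):
        """Record the pose at startup (or any earlier moment) as the reference."""
        self.ref_rotation = np.asarray(rotation_deg, dtype=float)
        self.ref_translation = np.asarray(translation_m, dtype=float)

    def spatial_pose(self, rotation_deg, translation_m):
        """Return the rotation and translation poses relative to the reference."""
        d_rot = np.asarray(rotation_deg, dtype=float) - self.ref_rotation
        d_trans = np.asarray(translation_m, dtype=float) - self.ref_translation
        return d_rot, d_trans
```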
Further, the spatial position corresponding to the reference pose is taken as the initial spatial position of the augmented reality device. After the spatial pose is obtained, the spatial orientation in which the augmented reality device has moved relative to the initial spatial position can be determined from the spatial pose and the reference pose, where the movement includes rotation and translation; that is, the adjustment of the spatial pose of the augmented reality device can be decomposed into a rotation process and a translation process.
For example, at the current time, the rotation pose and/or translation pose detected relative to the reference pose is taken as the current spatial pose; that is, the spatial pose may be determined by both rotation and translation, or by either one alone. Further, the rotation pose comprises a rotation angle and/or a rotation direction of the augmented reality device relative to the reference pose, and the translation pose comprises a translation distance and/or a translation direction relative to the reference pose. In other words, the rotation pose may be determined by the rotation angle and the rotation direction together, or by either one; likewise, the translation pose may be determined by the translation distance and the translation direction together, or by either one. Taking the rotation pose as an example, the determining factors depend on the layout of the display screen. When the display contents of the screen are distributed sequentially along one direction, different contents can be displayed or selected by controlling the rotation angle of the augmented reality device; when the display contents are distributed in different directions, different contents can be displayed or selected by controlling the rotation direction. For a more complex layout, for example with display contents distributed in all directions and several contents arranged sequentially in some directions, the rotation angle and rotation direction must be controlled together to display or select different contents. Similarly, the determining factors of the translation pose also depend on the layout of the display screen. For screens with more complicated layouts, the rotation pose and translation pose can be considered together so that the corresponding content is selected or displayed more accurately.
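The mapping from rotation pose to spatial orientation can be sketched in the same spirit. The threshold angles below are invented example values; a real device would choose them to match its display layout.

```python
def spatial_orientation(d_rot, yaw_threshold_deg=15.0, pitch_threshold_deg=10.0):
    """Map a rotation pose (relative to the reference) to a spatial orientation.

    d_rot is assumed to be (pitch, yaw, roll) in degrees relative to the
    reference pose; the thresholds are illustrative, not from the patent.
    """
    pitch, yaw, _ = d_rot
    if yaw <= -yaw_threshold_deg:
        return "left"
    if yaw >= yaw_threshold_deg:
        return "right"
    if pitch >= pitch_threshold_deg:
        return "up"
    if pitch <= -pitch_threshold_deg:
        return "down"
    return "center"  # still facing the main display interface
```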
Further, after the spatial orientation is determined, the position of the display area can be determined; that is, the movement of the display area is bound to the adjustment of the spatial pose of the augmented reality device, so that the display area moves to the target display interface.
The main display interface carries first display information, and the importance level of the first display information is greater than that of the display information carried by the other display interfaces. When the augmented reality device is in the initial spatial position, its display area is located on the main display interface by default, and the first display information may be the applications the user uses most frequently and system applications. Different secondary display interfaces are distributed in different spatial orientations, each carrying different display information, and the display information is assigned different importance levels according to the spatial orientation of each secondary display interface relative to the main display interface.
Further, before the display area of the augmented reality device is moved to the target display interface according to the spatial orientation, the target display interface distributed in that spatial orientation is activated according to the spatial orientation. Specifically, while the display area is on the main display interface, the other secondary display interfaces are not activated or displayed; once the spatial orientation is determined, the secondary display interface serving as the target display interface is activated and displayed. As shown in FIG. 2, in the initial spatial position the display area of the augmented reality device is on the main display interface D, and the secondary display interfaces A, B, C, E in the other spatial orientations are not activated or displayed; after the spatial orientation of the augmented reality device is determined, the corresponding secondary display interface is activated.
In another embodiment, the main display interface and every secondary display interface are in an activated state, and the display area can be moved directly to different target display interfaces according to the spatial orientation of the augmented reality device. Further, to better distinguish the target display interface from the other display interfaces, the target display interface and the other displayed interfaces can be given different display modes, realized by controlling transparency, saturation, and display distance. For example, the transparency of the target display interface can be reduced while the transparency of the other displayed interfaces is increased, making the target display interface sharper and the others blurrier; this creates a visual contrast and focuses the user's attention on the target display interface.
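A minimal sketch of this emphasis scheme, assuming each interface is described by a plain dictionary; the alpha values are illustrative, not taken from the patent:

```python
def emphasize_target(interfaces, target_id, target_alpha=1.0, other_alpha=0.4):
    """Make the target interface opaque and sharp while the others fade.

    interfaces is assumed to be a list of dicts with "id" and "alpha" keys;
    saturation and display distance could be adjusted the same way.
    """
    for iface in interfaces:
        iface["alpha"] = target_alpha if iface["id"] == target_id else other_alpha
```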
Illustratively, with the main display interface and the secondary display interfaces all activated, the target display interface can be distinguished from the other display interfaces through different display modes; more augmented reality content can then be shown outside the target display interface while the user's focus remains on it, letting the user know the positions of other augmented reality content in advance so that the desired content can be selected quickly and conveniently later. In one embodiment, each display interface may be a different functional area. As shown in FIG. 3, for example, there are an application area D4 as the main display interface and, as secondary display interfaces, a widget area D3, a notification bar area D1, and a dock bar area D2; the display area is adjusted by adjusting the spatial pose of the augmented reality device so that it lands on a different target display interface. In FIG. 3 the display area is on the application area D4 at the center of the screen.
Further, when the main display interface and every secondary display interface are activated and the display area overlaps one of the display interfaces, the overlapped display interface becomes the new target display interface. The term "overlapping" here covers both partial and full overlap. In one embodiment, if the overlapping area of the display area and another display interface exceeds a certain threshold, that display interface is taken as the target display interface. For example, with the display area at the center of the display screen, when the user turns the head to the right after wearing the augmented reality device, the display area moves to the right, and when its overlap with the secondary display interface on the right reaches the threshold, that interface is set as the target display interface. As shown in FIG. 4, the widget area D3 on the right side of the field of view becomes the new target display interface IR; as shown in FIG. 5, the dock bar area D2 below the field of view becomes the new target display interface IR.
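The overlap test reduces to rectangle intersection; in the sketch below the rectangles are axis-aligned (x, y, width, height) tuples and the 50% threshold is an invented example value, since the patent leaves the threshold open.

```python
def overlap_ratio(display_area, interface_rect):
    """Fraction of the display area that overlaps a candidate interface."""
    ax, ay, aw, ah = display_area
    bx, by, bw, bh = interface_rect
    w = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    h = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    return (w * h) / (aw * ah)

def pick_target(display_area, interfaces, threshold=0.5):
    """Return the interface with the largest overlap ratio at or above the
    threshold, or None if no interface qualifies; interfaces maps names to
    rectangles."""
    best, best_ratio = None, threshold
    for name, rect in interfaces.items():
        ratio = overlap_ratio(display_area, rect)
        if ratio >= best_ratio:
            best, best_ratio = name, ratio
    return best
```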
To distinguish the preset augmented reality content of the target display interface from the augmented reality content of the other display interfaces, the augmented reality content of the other display interfaces may differ from that of the target display interface in at least one of saturation, transparency, and display distance. Illustratively, the "display distance" refers to the distance of the augmented reality content from the user, which can also be understood as the imaging distance of the content. In one embodiment, after the target display interface displays its preset augmented reality content, that content is elastically moved a predetermined distance toward the user to distinguish it from the augmented reality content of the other display interfaces.
Illustratively, the target display interface includes at least two interactable controls, and at least one of them is selected when the display area moves to the target display interface. For example, among the interactable controls of the target display interface, the one at the center may be selected, or the one the user uses most frequently.
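A sketch of one way to choose the pre-selected control; the field names and the fallback order are assumptions, since the text allows either strategy:

```python
def select_default_control(controls):
    """Pick the interactable control to pre-select when the display area
    reaches the target interface: the most frequently used one if usage
    statistics exist, otherwise the one closest to the interface center.
    controls is assumed to be a list of dicts with "use_count" and
    "distance_to_center" keys."""
    if any(c.get("use_count", 0) > 0 for c in controls):
        return max(controls, key=lambda c: c.get("use_count", 0))
    return min(controls, key=lambda c: c["distance_to_center"])
```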
Further, the interface interaction control method further comprises the following steps:
Step S20: synchronously moving the display cursor to the target display interface according to the spatial orientation.
As shown in FIGS. 3, 4, and 5, the display area of the augmented reality device is provided with a display cursor F, which is the interaction carrier of the display interface. That is, the movement of the display cursor F is bound to the adjustment of the spatial pose of the augmented reality device: while the display area of the augmented reality device moves to the target display interface, the display cursor F also moves to the target display interface, which secures the subsequent interaction.
Further, before the display cursor is synchronously moved to the target display interface according to the spatial orientation, the display cursor F is activated. The display cursor may be activated by adjusting at least one of its display shape, display color, display transparency, and display contrast; for example, the display transparency and display contrast can be adjusted, each graded from 80% to 100%.
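A sketch of this graded activation; the dictionary-based cursor, the step count, and the frame-by-frame loop are illustrative assumptions:

```python
def activate_cursor(cursor, steps=10):
    """Grade the cursor's transparency and contrast from 80% to 100%,
    as the text describes, over a fixed number of steps."""
    for i in range(steps + 1):
        level = 0.8 + 0.2 * i / steps   # 0.8 -> 1.0
        cursor["transparency"] = level
        cursor["contrast"] = level
        # on a real device a frame would be rendered here
```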
For the cursor movement in step S20, in one embodiment the display cursor may follow the spatial orientation of the augmented reality device precisely; that is, the display cursor does not enter the target display interface until the spatial orientation has been adjusted to the predetermined orientation. In another embodiment, the display cursor may move with proximity-based adsorption: when the display cursor, moving synchronously according to the spatial orientation, comes within a preset distance of the target display interface, it is adsorbed onto the target display interface. On the one hand, the adsorption process prompts the user to confirm the corresponding target display interface; on the other hand, once the display cursor is adsorbed onto the target display interface, further interactive operations on it are easier.
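The adsorption behavior reduces to a distance test; in the following sketch positions are 2D screen coordinates and the snap distance is an invented value:

```python
import math

def move_cursor(cursor_pos, target_center, snap_distance=30.0):
    """Let the cursor follow the spatial pose, but snap (adsorb) it onto the
    target interface once it comes within a preset distance (pixels)."""
    dx = target_center[0] - cursor_pos[0]
    dy = target_center[1] - cursor_pos[1]
    if math.hypot(dx, dy) <= snap_distance:
        return target_center  # adsorbed onto the target display interface
    return cursor_pos         # keep following the spatial pose
```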
Further, different display interfaces are provided with their own preset display cursors, and the display cursor is the interaction carrier of its display interface. That is, a display cursor is preset on each display interface, and after a display interface is selected as the target display interface, its display cursor is activated.
Specifically, after the display area of the augmented reality device is moved to the target display interface according to the spatial orientation, the display switches synchronously to the display cursor corresponding to the target display interface. The initial display position of that cursor is related to the spatial orientation of the target display interface relative to the main display interface. For example, for secondary display interfaces in different spatial orientations of the main display interface, the position of the display cursor differs from interface to interface; the display cursor may be located at the center of a secondary display interface, at its left edge, and so on.
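One plausible way to derive the initial cursor position from the interface's orientation is sketched below; the particular edge-versus-center mapping is an assumption, since the patent states only that the position is related to the orientation.

```python
def initial_cursor_position(interface_rect, orientation):
    """Pick the cursor's initial position on a secondary interface from its
    spatial orientation relative to the main interface."""
    x, y, w, h = interface_rect
    if orientation == "right":      # entered from the left: start at the left edge
        return (x, y + h / 2)
    if orientation == "left":       # entered from the right: start at the right edge
        return (x + w, y + h / 2)
    return (x + w / 2, y + h / 2)   # default: center of the interface
```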
Illustratively, when the target display interface has several interactable controls, the interactable control at the display cursor's position is selected and placed in an active state for further interactive operation.
Further, the interface interaction control method further comprises the following steps:
Step S30: when a first trigger instruction is received, changing the state of the display cursor so that the augmented reality device selects target augmented reality content in the target display interface for a first interactive operation.
The first trigger instruction comprises: a control instruction generated after a finger ring receives a touch operation, a control instruction generated after the finger ring receives a key operation, a control instruction generated after the augmented reality device receives a touch operation, and/or a control instruction generated after the augmented reality device receives a key operation. A finger ring is a device worn on a finger, typically equipped with an IMU (gyroscope) sensor and physical buttons, that captures finger movements and gestures through algorithms on the gyroscope data. Buttons on the ring can be used for key operations, for example to confirm a selection or perform other specific functions, and a touch area on the ring can generate touch operations. The interactive operation comprises at least one of a selection operation, a drag operation, a zoom operation, and a confirmation operation.
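A sketch of dispatching first trigger instructions from the ring or the device itself; the source names, action names, and cursor-state change are illustrative assumptions, not the patent's protocol.

```python
RING_SOURCES = {"ring_touch", "ring_button"}
DEVICE_SOURCES = {"device_touch", "device_button"}

def handle_trigger(source, action, cursor, target_content):
    """Change the cursor state on a first trigger instruction and return the
    interactive operation to perform on the target augmented reality content."""
    if source not in RING_SOURCES | DEVICE_SOURCES:
        return None               # not a recognized first trigger instruction
    cursor["state"] = "active"    # visual feedback that interaction has begun
    if action in {"select", "drag", "zoom", "confirm"}:
        return (action, target_content)
    return None
```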
In one embodiment, when one of the interactable controls in the preset augmented reality content of the target display interface is selected, it can be opened for subsequent operation by pressing a button on the finger ring; for example, the interactable control can be dragged by holding the button on the ring and sliding the finger. In another embodiment, when one of the interactable controls is selected in the preset augmented reality content of the target display interface, a sliding action of the finger wearing the interactive ring can slide across the interactable controls, thereby selecting another one.
For example, when the augmented reality device is a pair of augmented reality glasses, a touch area may be provided on a temple of the glasses, and the corresponding first trigger instruction can be generated by tapping the touch area or sliding on it.
As shown in FIG. 6, a second embodiment further discloses an interface interaction control device comprising a detection module 100 and a moving module 200, where the detection module 100 is configured to detect the spatial pose of the augmented reality device and determine the spatial orientation of the augmented reality device relative to the initial spatial position according to the spatial pose, and the moving module 200 is configured to move the display area of the augmented reality device to the target display interface according to the spatial orientation. The augmented reality device comprises a main display interface associated with the initial spatial position; different secondary display interfaces are correspondingly distributed in different spatial orientations of the main display interface, and the target display interface is one of the secondary display interfaces.
The main display interface carries first display information whose importance level is greater than that of the display information carried by the other display interfaces; different secondary display interfaces correspondingly carry different display information, which is assigned different importance levels according to the spatial orientation of each secondary display interface relative to the main display interface.
Further, the moving module 200 is further configured to synchronously move the display cursor to the target display interface according to the spatial orientation.
Further, the interface interaction control device further comprises a display module 300, and the display module 300 is configured to activate and display the target display interface distributed in the spatial orientation, according to the spatial orientation, before the display area of the augmented reality device is moved to the target display interface.
Further, the display module 300 is also configured to activate the display cursor before it is synchronously moved to the target display interface according to the spatial orientation, by adjusting at least one of the display shape, display color, display transparency, and display contrast of the display cursor; activating the display cursor may comprise adjusting its display transparency and display contrast, each graded from 80% to 100%.
Further, the moving module 200 is further configured to adsorb the display cursor onto the target display interface when the display cursor, moving synchronously according to the spatial orientation, comes within a preset distance of the target display interface.
Further, different display interfaces are correspondingly provided with preset display cursors, which are the interaction carriers of the display interfaces, and the display module 300 is further configured to switch synchronously to the display cursor corresponding to the target display interface, the initial display position of which on the target display interface is associated with the spatial orientation of the target display interface relative to the main display interface.
Further, the interface interaction control device further comprises an interaction module 400, and the interaction module 400 is configured to change the state of the display cursor when a first trigger instruction is received, so that the augmented reality device selects target augmented reality content in the target display interface for a first interactive operation. The first trigger instruction comprises: a control instruction generated after the finger ring receives a touch operation, a control instruction generated after the finger ring receives a key operation, a control instruction generated after the augmented reality device receives a touch operation, and/or a control instruction generated after the augmented reality device receives a key operation.
A third embodiment further discloses a computer-readable storage medium in which an interface interaction control program is stored; when the interface interaction control program is executed by a processor, the interface interaction control method of the first embodiment is implemented.
A fourth embodiment further discloses a computer device which, at the hardware level, comprises, as shown in FIG. 7, a processor 12, an internal bus 13, a network interface 14, and a computer-readable storage medium 11. The processor 12 reads the corresponding computer program from the computer-readable storage medium and runs it, forming the request processing means at the logic level. Of course, besides software implementations, one or more embodiments of the present disclosure do not exclude other implementations, such as logic devices or combinations of software and hardware; that is, the execution subject of the processing flow below is not limited to logic units and may also be hardware or logic devices. The computer-readable storage medium 11 stores an interface interaction control program which, when executed by a processor, implements the interface interaction control method described above.
Computer-readable storage media, including permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer-readable storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage, quantum memory, graphene-based storage, or any other magnetic storage device or non-transmission medium that can be used to store information accessible by a computing device.
The computer device described above may be an AR device, for example a pair of augmented reality glasses.
In the example of augmented reality glasses, the computer device may be configured to exchange data with an external processing device through a signal connection, which may be wired, wireless, or a combination of the two. In other cases, however, the computer device may be used as a stand-alone device, i.e., the data processing is performed on the computer device itself. The signal connection may be configured to carry any kind of data, such as image data (e.g., still images and/or full-motion video, including 2D and 3D images), audio, multimedia, voice, and/or any other type of data. The external processing device may be, for example, a game console, personal computer, tablet computer, smartphone, or another type of processing device. The signal connection may be, for example, a Universal Serial Bus (USB) connection, a Wi-Fi connection, a Bluetooth or Bluetooth Low Energy (BLE) connection, an Ethernet connection, a cable connection, a DSL connection, a cellular connection (e.g., 3G, LTE/4G, or 5G), or the like, or a combination thereof. Additionally, the external processing device may communicate with one or more other external processing devices via a network, which may be or include, for example, a local area network (LAN), a wide area network (WAN), an intranet, a metropolitan area network (MAN), the global Internet, or a combination thereof.
While certain embodiments have been shown and described, it will be appreciated by those skilled in the art that changes and modifications may be made without departing from the principles and spirit of the application, the scope of which is defined in the claims and their equivalents.

Claims (16)

1. An interface interaction control method, characterized by comprising:
detecting the spatial pose of an augmented reality device, determining the spatial orientation of the augmented reality device relative to an initial spatial position according to the spatial pose, and moving the display area of the augmented reality device to a target display interface according to the spatial orientation;
wherein the augmented reality device comprises a main display interface associated with the initial spatial position, different secondary display interfaces are correspondingly distributed in different spatial orientations of the main display interface, and the target display interface is one of the secondary display interfaces.
2. The interface interaction control method according to claim 1, wherein the main display interface carries first display information, and the importance level of the first display information is greater than the importance level of the display information carried by the other display interfaces.
3. The interface interaction control method according to claim 1, wherein different secondary display interfaces correspondingly carry different display information, and the display information is correspondingly assigned different importance levels according to the spatial orientation of each secondary display interface relative to the main display interface.
4. The interface interaction control method according to any one of claims 1 to 3, wherein a display cursor is provided in the display area of the augmented reality device, and moving the display area of the augmented reality device to a target display interface according to the spatial orientation further comprises:
synchronously moving the display cursor to the target display interface according to the spatial orientation.
5. The interface interaction control method according to claim 1, wherein, after the spatial orientation of the augmented reality device relative to the initial spatial position is determined according to the spatial pose and before the display area of the augmented reality device is moved to the target display interface according to the spatial orientation, the method further comprises activating the target display interface distributed in that spatial orientation according to the spatial orientation.
6. The interface interaction control method according to claim 4, wherein, after the spatial orientation of the augmented reality device relative to the initial spatial position is determined according to the spatial pose and before the display cursor is synchronously moved to the target display interface according to the spatial orientation, the method further comprises activating the display cursor.
7. The interface interaction control method according to claim 6, wherein activating the display cursor comprises adjusting at least one of a display shape, a display color, a display transparency, and a display contrast of the display cursor.
8. The interface interaction control method according to claim 7, wherein activating the display cursor comprises adjusting the display transparency and the display contrast of the display cursor, wherein the display transparency is graded from 80% to 100% and the display contrast is graded from 80% to 100%.
9. The interface interaction control method according to claim 4, wherein synchronously moving the display cursor to the target display interface according to the spatial orientation further comprises:
adsorbing the display cursor onto the target display interface when the display cursor, moving synchronously according to the spatial orientation, comes within a preset distance of the target display interface.
10. The interface interaction control method according to claim 1, wherein different display interfaces are correspondingly provided with preset display cursors, and the display cursors are the interaction carriers of the display interfaces.
11. The interface interaction control method according to claim 10, wherein moving the display area of the augmented reality device to the target display interface according to the spatial orientation further comprises:
synchronously switching to and displaying the display cursor corresponding to the target display interface.
12. The interface interaction control method according to claim 11, wherein the method further comprises:
the initial display position, on the target display interface, of the display cursor corresponding to the target display interface is associated with the spatial orientation of the target display interface relative to the main display interface.
13. The interface interaction control method according to claim 4, further comprising, after moving the display area of the augmented reality device to the target display interface according to the spatial orientation:
when a first trigger instruction is received, changing the state of the display cursor so that the augmented reality device selects target augmented reality content in the target display interface for a first interactive operation.
14. The interface interaction control method according to claim 13, wherein the first trigger instruction comprises: a control instruction generated after a finger ring receives a touch operation, a control instruction generated after the finger ring receives a key operation, a control instruction generated after the augmented reality device receives a touch operation, and/or a control instruction generated after the augmented reality device receives a key operation.
15. A computer-readable storage medium storing an interface interaction control program which, when executed by a processor, implements the interface interaction control method of any one of claims 1 to 14.
16. A computer device comprising a computer-readable storage medium, a processor, and an interface interaction control program stored in the computer-readable storage medium which, when executed by the processor, implements the interface interaction control method of any one of claims 1 to 14.
Application CN202311163392.9A, filed 2023-09-11 (priority date 2023-09-11): Interface interaction control method, computer device and storage medium. Status: Pending. Published as CN117193588A.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202311163392.9A | 2023-09-11 | 2023-09-11 | Interface interaction control method, computer device and storage medium

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN202311163392.9A | 2023-09-11 | 2023-09-11 | Interface interaction control method, computer device and storage medium

Publications (1)

Publication Number | Publication Date
CN117193588A | 2023-12-08

Family

ID=89001123

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202311163392.9A (Pending; published as CN117193588A) | Interface interaction control method, computer device and storage medium | 2023-09-11 | 2023-09-11

Country Status (1)

Country | Link
CN (1) | CN117193588A (en)


Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination