CN116820229A - XR space display method, XR equipment, electronic equipment and storage medium - Google Patents

XR space display method, XR equipment, electronic equipment and storage medium

Info

Publication number
CN116820229A
CN116820229A (application number CN202310559559.7A)
Authority
CN
China
Prior art keywords
control
interface
space
image size
size
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310559559.7A
Other languages
Chinese (zh)
Inventor
顾平平
贺壮杰
包啸君
曹庆峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd
Priority claimed from CN202310559559.7A
Publication of CN116820229A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides an XR space display method, an XR device, an electronic device, and a storage medium, and relates to the field of XR technology. The electronic device receives a first operation that triggers it to reduce the image size of a first XR object, and obtains the interaction angle corresponding to a first control associated with that object. If, while the electronic device is responding to the first operation, the interaction angle corresponding to the first control falls to a preset interaction angle, the electronic device controls the XR device to display a third XR interface of the XR space. The third XR interface includes a dynamic image in which the image size of the first XR object decreases in response to the first operation, and the first control at a third image size, where the third image size corresponds to the preset interaction angle. In this way, the first control is presented to the user in the XR interface at a size that remains convenient to read or operate, improving the user experience.

Description

XR space display method, XR equipment, electronic equipment and storage medium
Technical Field
The present application relates to the field of extended reality (XR) technology, and in particular to an XR space display method, an XR device, an electronic device, and a storage medium.
Background
Extended reality (XR) is an emerging graphics and imaging technology. XR is a collective term for technologies such as augmented reality (AR), virtual reality (VR), and mixed reality (MR). XR technology creates virtual scenes that support human-machine interaction and provides the user with sensory experiences, such as sight, hearing, and smell, that simulate a real scene, so that the user feels immersed in the scene.
In an XR scene, a user may adjust XR objects in the scene. For example, in an XR scene containing a three-dimensional desktop, the user may adjust the distance or size of an application window (i.e., an XR object) in the desktop. When the application window is moved farther away or scaled down, the window the user sees becomes correspondingly smaller. However, if an XR object becomes too small from the user's perspective, it may be difficult for the user to operate or see the controls associated with it (e.g., buttons or text), which degrades the user experience.
Disclosure of Invention
In view of the foregoing, the present application provides an XR space display method, an XR device, an electronic device, and a storage medium, which are used to reduce the likelihood that a control associated with an XR object in the XR space becomes too small from the user's perspective. The control associated with the XR object is kept at a suitable image size in the user's view, which makes it easier for the user to operate or read and improves the user experience.
To achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
In a first aspect, the present application provides a method of displaying an XR space, applied to an extended reality (XR) device for displaying the XR space to a user. The XR space includes a first XR object and a first control associated with the first XR object. The method includes the following steps:
The XR device displays a first XR interface of the XR space. The first XR interface includes the first XR object at a first image size and the first control at a second image size. The XR device receives a first operation. The first operation triggers the XR device to reduce the image size of the first XR object; for example, it may be an operation that moves the first XR object farther away or an operation that reduces the spatial size of the first XR object. In response to the first operation, the XR device displays a second XR interface of the XR space. The second XR interface includes a dynamic image in which the image size of the first XR object decreases in response to the first operation, and a dynamic image in which the image size of the first control decreases in response to the first operation. If, while the XR device is responding to the first operation, the interaction angle corresponding to the first control falls to a preset interaction angle, the XR device displays a third XR interface of the XR space. The third XR interface includes a dynamic image in which the image size of the first XR object continues to decrease in response to the first operation, and the first control at a third image size. The third image size corresponds to the preset interaction angle; that is, the image size of the first control is held at the third image size.
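As an illustration of the angle threshold above, the following is a minimal sketch under assumed geometry, not taken from the patent: it models the interaction angle as the angle subtended at the preset observation point by the line connecting the two ends of the control.

```python
import math

def interaction_angle(control_width: float, distance: float) -> float:
    """Angle (radians) subtended at the preset observation point by a
    control of the given spatial width at the given spatial distance."""
    return 2.0 * math.atan(control_width / (2.0 * distance))

def spatial_size_for_angle(preset_angle: float, distance: float) -> float:
    """Inverse relation: the spatial width a control must have so that
    it still subtends preset_angle at the given distance."""
    return 2.0 * distance * math.tan(preset_angle / 2.0)
```

Under this model, the third image size is the on-screen size at which the control subtends exactly the preset interaction angle; keeping that angle fixed as the distance grows means the control's spatial width grows linearly with distance.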
In this way, while the user adjusts the first XR object in the XR space through the XR device or the electronic device, the interaction angle corresponding to the first control associated with the first XR object is always greater than or equal to the preset interaction angle, so the image size of the first control remains at a size that is convenient for the user to read or operate. The user can always operate or clearly see the control associated with the XR object, misoperations are reduced, and the user experience is improved.
In a possible implementation of the first aspect, before displaying the third XR interface of the XR space, the XR device may further periodically obtain, in response to the first operation, the interaction angle corresponding to the first control. If, while the XR device is responding to the first operation, the interaction angle corresponding to the first control falls to the preset interaction angle, the XR device adjusts the spatial size of the first control so that the interaction angle is held at the preset interaction angle.
In this implementation, the XR device both controls the XR space and displays it. While responding to the first operation, the XR device can hold the interaction angle corresponding to the first control at the preset interaction angle by adjusting the spatial size of the first control, so that in the XR interface displayed by the XR device the image size of the first control remains at a size convenient for the user to read or operate.
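The periodic check-and-adjust described above can be sketched as a per-frame update. This is illustrative only; the names and the subtended-angle model are assumptions, not from the patent.

```python
import math

def frame_update(distance: float, control_width: float,
                 preset_angle: float, dist_step: float) -> tuple[float, float]:
    """One frame of the first operation (the object moving away).
    Returns (new_distance, new_control_width): once the control's
    interaction angle would drop below the preset value, its spatial
    width is enlarged so that the angle, and hence the control's
    on-screen size, holds steady."""
    new_distance = distance + dist_step
    angle = 2.0 * math.atan(control_width / (2.0 * new_distance))
    if angle < preset_angle:
        # Clamp: scale the control's spatial size with the new distance.
        control_width = 2.0 * new_distance * math.tan(preset_angle / 2.0)
    return new_distance, control_width
```

Calling this every frame reproduces the behaviour in the text: the object's image keeps shrinking, while the control's image size freezes once its interaction angle reaches the preset value.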
In another possible implementation of the first aspect, the XR device is connected to an electronic device, and the electronic device provides the XR space for the XR device. After receiving the first operation, the XR device sends a first instruction to the electronic device. The first instruction triggers the electronic device to reduce the image size of the first XR object, instructs the electronic device to obtain the interaction angle corresponding to the first control, and instructs it to hold that interaction angle unchanged once the angle falls to the preset interaction angle.
In this implementation, the electronic device controls the XR space and provides it to the XR device, which displays it. After receiving the first operation, the XR device sends the electronic device a first instruction indicating the received operation. In response, the electronic device can hold the interaction angle corresponding to the first control at the preset interaction angle by adjusting the spatial size of the first control, so that in the XR interface displayed by the XR device the image size of the first control remains at a size convenient for the user to read or operate.
In another possible implementation of the first aspect, the XR device displays a fourth XR interface of the XR space. The fourth XR interface includes the first XR object at a fourth image size and the first control at the third image size, the fourth image size being smaller than the first image size. While the fourth XR interface is displayed, the XR device receives a second operation. The second operation triggers the XR device to increase the image size of the first XR object. In response to the second operation, the XR device displays a fifth XR interface of the XR space. The fifth XR interface includes a dynamic image in which the image size of the first XR object increases in response to the second operation, and the first control at the third image size. If, while the XR device is responding to the second operation, the size ratio of the first control to the first XR object returns to a preset size ratio, the XR device displays a sixth XR interface of the XR space. The sixth XR interface includes a dynamic image in which the image size of the first XR object increases in response to the second operation, and a dynamic image in which the image size of the first control increases in response to the second operation.
In this way, while the XR device or the electronic device is responding to the second operation, once the size ratio of the first control to the first XR object is restored to the original preset size ratio, the XR device or the electronic device adjusts the image sizes of the first XR object and the first control together. The interaction angle corresponding to the first control associated with the first XR object thus remains greater than or equal to the preset interaction angle, which makes it easier for the user to read or operate and improves the user experience.
In another possible implementation of the first aspect, before displaying the fifth XR interface of the XR space, the XR device periodically obtains, in response to the second operation, the size ratio of the first control to the first XR object. If, while the XR device is responding to the second operation, the size ratio of the first control to the first XR object returns to the preset size ratio, the XR device holds the size ratio at the preset size ratio.
In this implementation, once the size ratio of the first control to the first XR object is restored to the original preset size ratio, the XR device can increase the image sizes of the first XR object and the first control together, so that the size ratio stays at the original preset size ratio.
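As an illustration only (the function name and values are assumptions, not from the patent), one step of this enlarge behaviour can be sketched as follows: the control stays frozen while it is still proportionally larger than the preset ratio, and once the ratio is restored, both sizes grow together.

```python
def resize_on_enlarge(object_size: float, control_size: float,
                      preset_ratio: float, scale: float) -> tuple[float, float]:
    """One enlarge step (scale > 1). Returns (new_object, new_control).
    The control keeps its frozen size until control/object has fallen
    back to the preset ratio; after that, both grow in lockstep."""
    new_object = object_size * scale
    if control_size / new_object <= preset_ratio:
        # Ratio restored: grow the control along with the object.
        return new_object, new_object * preset_ratio
    # Control still proportionally oversized: leave it frozen.
    return new_object, control_size
```

For example, with a preset ratio of 0.1, an object of size 10 and a frozen control of size 2 (current ratio 0.2), enlarging the object leaves the control untouched until the object reaches size 20, after which the control scales with it.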
In another possible implementation of the first aspect, the XR space includes a three-dimensional desktop, the first XR object is an application window in the three-dimensional desktop, and the first control includes one or more of text information and control buttons.
In a second aspect, the present application provides a method of displaying an XR space, applied to an electronic device connected to an XR device. The electronic device provides the XR space for the XR device, and the XR device presents it to the user. The method includes the following steps:
The electronic device controls the XR device to display a first XR interface of the XR space. The first XR interface includes the first XR object at a first image size and the first control at a second image size. The electronic device receives a first operation from the user that triggers it to reduce the image size of the first XR object. In response to the first operation, the electronic device obtains the interaction angle corresponding to the first control and controls the XR device to display a second XR interface of the XR space. The second XR interface includes a dynamic image in which the image size of the first XR object decreases in response to the first operation, and a dynamic image in which the image size of the first control decreases in response to the first operation. If, while the electronic device is responding to the first operation, the interaction angle corresponding to the first control falls to the preset interaction angle, the electronic device controls the XR device to display a third XR interface of the XR space. The third XR interface includes a dynamic image in which the image size of the first XR object continues to decrease in response to the first operation, and the first control at a third image size, the third image size corresponding to the preset interaction angle.
In this way, while the user adjusts the first XR object in the XR space through the electronic device, the interaction angle corresponding to the first control associated with the first XR object remains greater than or equal to the preset interaction angle, so the image size of the first control stays at a size that is convenient for the user to read or operate. The user can always operate or clearly see the control associated with the XR object, misoperations are reduced, and the user experience is improved.
In a possible implementation of the second aspect, if, while the electronic device is responding to the first operation, the interaction angle corresponding to the first control falls to the preset interaction angle, the electronic device adjusts the spatial size of the first control so that the interaction angle is held at the preset interaction angle, and generates and transmits the third XR interface to the XR device.
In this implementation, while responding to the first operation, the electronic device can hold the interaction angle corresponding to the first control at the preset interaction angle by adjusting the spatial size of the first control, so that in the XR interface transmitted to the XR device the image size of the first control remains at a size convenient for the user to read or operate.
In another possible implementation of the second aspect, the electronic device controls the XR device to display a fourth XR interface of the XR space. The fourth XR interface includes the first XR object at a fourth image size and the first control at the third image size, the fourth image size being smaller than the first image size. The electronic device receives a second operation from the user, which triggers it to increase the image size of the first XR object. In response to the second operation, the electronic device obtains the size ratio of the first control to the first XR object and controls the XR device to display a fifth XR interface of the XR space. The fifth XR interface includes a dynamic image in which the image size of the first XR object increases in response to the second operation, and the first control at the third image size. If, while the electronic device is responding to the second operation, the size ratio of the first control to the first XR object returns to the preset size ratio, the electronic device controls the XR device to display a sixth XR interface of the XR space. The sixth XR interface includes a dynamic image in which the image size of the first XR object increases in response to the second operation, and a dynamic image in which the image size of the first control increases in response to the second operation.
In this implementation, while the electronic device increases the image size of the first XR object in response to the second operation, once the size ratio of the first control to the first XR object is restored to the original preset size ratio, the electronic device increases the image sizes of the first XR object and the first control together. The interaction angle corresponding to the first control associated with the first XR object thus remains greater than or equal to the preset interaction angle, which makes it easier for the user to read or operate and improves the user experience.
In another possible implementation of the second aspect, if, while the electronic device is responding to the second operation, the size ratio of the first control to the first XR object returns to the preset size ratio, the electronic device holds the size ratio at the preset size ratio, and generates and transmits the sixth XR interface to the XR device.
In this implementation, once the size ratio of the first control to the first XR object is restored to the original preset size ratio, the electronic device can increase the image sizes of the first XR object and the first control together, so that the size ratio stays at the original preset size ratio.
In another possible implementation of the second aspect, the XR space includes a three-dimensional desktop, the first XR object is an application window in the three-dimensional desktop, and the first control includes one or more of text information and control buttons.
In a third aspect, the present application provides an XR device, including a memory, a display screen, and one or more processors, where the display screen is used to present an XR space to a user. The display screen and the memory are coupled to the processor. The memory stores computer program code, which includes computer instructions; when the computer instructions are executed by the processor, the XR device performs the method of the first aspect and any one of its possible implementations.
In another possible implementation of the third aspect, the XR device further includes a communication module. The communication module is used to interact with an electronic device and to receive the XR space that the electronic device provides for the XR device.
In a fourth aspect, the present application provides an electronic device comprising: a communication module, a memory, and one or more processors; the communication module and the memory are coupled with the processor; the communication module is used for transmitting data or signaling with the XR equipment; the memory has stored therein computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of the second aspect described above and any one of its possible implementations.
In a fifth aspect, the present application provides a computer-readable storage medium including computer instructions which, when run on an XR device, cause the XR device to perform the method of the first aspect and any one of its possible implementations; or which, when run on an electronic device, cause the electronic device to perform the method of the second aspect and any one of its possible implementations.
In a sixth aspect, the application provides a computer program product comprising program instructions which, when run on a computer, enable the computer to perform the method of the first aspect and any one of its possible implementations, or enable the computer to perform the method of the second aspect and any one of its possible implementations.
In a seventh aspect, the present application provides a chip system, applied to an XR device or an electronic device. The chip system includes an interface circuit and a processor, interconnected by a line. The interface circuit is used to receive signals from a memory and send them to the processor, the signals including computer instructions stored in the memory. When the processor executes the computer instructions, the XR device performs the method of the first aspect and any one of its possible implementations; or the electronic device performs the method of the second aspect and any one of its possible implementations.
Drawings
FIG. 1 is a schematic diagram of an example XR interface in which an XR object in the XR space is moved farther away, according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an example XR interface in which an XR object in the XR space is scaled down, according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an example of an XR object in the XR space being moved farther away, according to an embodiment of the present application;
FIG. 4 is a schematic diagram of an example preset interaction angle corresponding to a control, according to an embodiment of the present application;
FIG. 5 is a schematic diagram of an example XR system according to an embodiment of the present application;
FIG. 6 is a schematic diagram of the hardware structure of an example mobile phone 100 according to an embodiment of the present application;
FIG. 7 is a schematic diagram of the hardware structure of an example pair of XR glasses 200 according to an embodiment of the present application;
FIG. 8 is a flowchart of an example XR space display method according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an example interaction angle calculation according to an embodiment of the present application;
FIG. 10 is a flowchart of another example XR space display method according to an embodiment of the present application;
FIG. 11 is a schematic diagram of an example XR interface according to an embodiment of the present application;
FIG. 12 is a schematic diagram of an example XR space display method according to an embodiment of the present application;
FIG. 13 is a schematic diagram of another example XR space display method according to an embodiment of the present application.
Detailed Description
The embodiments of the present application provide an XR space display method. The XR space may include three-dimensional images, three-dimensional video, stereo audio, and the like. The XR space may be a three-dimensional space combining a real scene and a virtual scene, or a purely virtual three-dimensional space.
An XR device is a device used to present an XR space to a user. The XR device may display an XR interface of the XR space, allowing the user to experience an XR scene beyond reality. Through the XR device, the user can view the three-dimensional XR space as if immersed in it.
In some implementations, the method provided by the embodiments of the present application may be applied to an XR device. The XR device acts as both the master device and the display device for the XR space: the user operates the XR device to control the XR space and views the XR space through it.
In other implementations, the method provided by the embodiments of the present application may be applied to an XR system including an XR device and an electronic device. The electronic device acts as the master device for the XR space and provides the XR space to the XR device; the XR device acts as the display device and presents the XR space to the user.
One or more XR objects, such as virtual characters or objects, may be included in the XR space. The user may operate the XR device or the electronic device to adjust the size or distance of an XR object in the XR space. The ways in which the XR device or the electronic device triggers adjustment of the distance or size of an XR object are described in detail in the following embodiments and are not repeated here. When the XR device or the electronic device adjusts the distance or size of an XR object, the image size of the XR object in the XR interface of the XR space changes. The image size includes an image length and an image width, and a change in image size may include a change in one or both of them.
It will be appreciated that, in a scene where the distance of an XR object is adjusted, the same XR object appears visually larger when near than when far, because of the perspective effect in the XR space. In a scene where the size of an XR object is adjusted, the XR object also appears smaller as its spatial size decreases.
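The perspective effect can be illustrated with a simple pinhole-style projection. This is an illustrative model, not part of the patent: the on-screen image size of an object falls off inversely with its distance from the viewpoint.

```python
def projected_size(spatial_size: float, distance: float,
                   focal_length: float = 1.0) -> float:
    """Pinhole-style projection: image size is proportional to the
    object's spatial size and inversely proportional to its distance."""
    return spatial_size * focal_length / distance
```

Under this model, doubling the distance halves the image size, which is why a window moved farther away looks smaller even though its spatial size is unchanged.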
The following describes, by example, scenes in which the XR device or the electronic device adjusts the distance and size of an XR object, taking the XR space as a three-dimensional desktop and the XR object as a gallery application window in that desktop.
First, take moving the gallery application window in the three-dimensional desktop farther away as an example. As shown in (a) of FIG. 1, the three-dimensional desktop includes a contacts application window, a gallery application window, and a video application window, with the gallery application window at the front. The user operates the XR device or the electronic device to move the gallery application window farther away (i.e., to increase the spatial distance of the gallery application window). The XR interface of the three-dimensional desktop seen from the user's perspective after the adjustment is shown in (b) of FIG. 1: the image size of the gallery application window becomes smaller, its spatial position changes, and it is now at the rear of the three-dimensional desktop.
Next, take scaling down the gallery application window in the three-dimensional desktop as an example. As shown in (1) of FIG. 2, the gallery application window is at the front of the three-dimensional desktop. The user operates the XR device or the electronic device to scale down the gallery application window (i.e., to reduce its spatial size). The XR interface of the three-dimensional desktop seen from the user's perspective after the adjustment is shown in (2) of FIG. 2: the image size of the gallery application window becomes smaller, but its spatial position is unchanged, and it remains at the front of the three-dimensional desktop.
One or more controls may be associated with the XR objects described above. A control provides data for interaction with the user. For example, controls may include text information, control buttons, menus, progress bars, and the like. The text information may be a window title, an application name, etc.; the control buttons may be a minimize button, a maximize button, a cancel button, etc.
While the XR device or the electronic device adjusts the distance or size of an XR object, the controls associated with the XR object change together with it. For example, when an XR object is moved farther away, its controls follow it. In the XR interface displayed by the XR device, the XR object and its associated controls become smaller at the same time. When the spatial distance of the XR object is too large and the user wants to operate or read a control associated with it (such as text information or a control button), the image corresponding to the control may be too small: the user may have difficulty making out the text or precisely hitting the image area where a button is located (also called its operation hot zone), misoperations may occur, and the user experience suffers.
In view of this, in the method provided by the embodiments of the present application, for any control associated with an XR object in the XR space (such as a first control, e.g., text information or a control button), the XR device or the electronic device may judge whether the user can operate or clearly see the first control by means of the angle (also called the interaction angle) between the lines connecting a preset observation point to the two ends of the first control. The preset observation point corresponds to the user's viewpoint position in the XR space. While a parameter such as the spatial distance or spatial size of an XR object is being adjusted in a way that reduces the object's image size, if the interaction angle corresponding to the first control associated with the XR object falls to the preset interaction angle, the image size of the first control has shrunk to the optimal, or minimum, interaction size at which the user can still operate or see it. In that case, in the XR interface of the XR space that the XR device presents to the user, the image size of the XR object continues to decrease, but the image size of the first control associated with the XR object is held at the image size corresponding to the preset interaction angle.
In this way, while the user adjusts an XR object in the XR space through the XR device or electronic device, the interaction angle of each control associated with the XR object remains greater than or equal to the preset interaction angle, and the image size of each control is always kept at a size convenient to read or operate. The user can therefore always see or operate the controls associated with the XR object, misoperations are reduced, and the user experience is improved.
For example, taking the scenario illustrated in fig. 1 above, a user operating an XR device or electronic device moves a gallery application window (i.e., an XR object) farther away in a three-dimensional desktop. As shown in A of fig. 3, before the gallery application window is moved away, the initial spatial distance between the preset observation point corresponding to the user's viewing position in the XR space and the gallery application window is a1. At this time, the interaction angle of the text information associated with the gallery application window (i.e., a control, such as the window title) is greater than the preset interaction angle. The gallery application window sits at the front of the user's view with a large image size, and the user can easily read its associated text information.
While the gallery application window is moved farther away, the spatial distance of its text information changes in step with the window. As shown in B of fig. 3, the spatial distance between the preset observation point and the gallery application window changes from a1 to a2, where a2 > a1. At this time, in the user's view (i.e., in the XR interface displayed by the XR device), both the gallery application window and its text information become smaller, and the interaction angle of the text information keeps decreasing; in B of fig. 3 it has just decreased to the preset interaction angle. As shown in C of fig. 3, the spatial distance of the gallery application window continues to increase, from a2 to a3, where a3 > a2. At this point the interaction angle of the text information no longer decreases and is held at the preset interaction angle. In the user's view (i.e., in the XR interface displayed by the XR device), the image size of the gallery application window continues to decrease, while the image size of the associated text information remains at the image size corresponding to the preset interaction angle. The text information associated with the gallery application window is thereby kept at an image size convenient for the user to read, improving the user experience.
It can be appreciated that in the embodiment of the present application, the interaction angle is used to measure whether a user can see or operate a control. For example, taking a control that is text information, the interaction angle may be the included angle formed by the lines from the preset observation point to the two ends (such as the top and bottom) of the text information. If the interaction angle is greater than or equal to the preset interaction angle, the user may be considered able to see the text information clearly; if it is smaller, the user may be considered to have difficulty seeing the text information. Operations of the XR device or electronic device that adjust the spatial distance or spatial size cause the interaction angle of a control to change. A small interaction angle may result from a large spatial distance between the control and the preset observation point, or from a small spatial size of the control.
The interaction angle is the included angle formed by the lines from the preset observation point to the two ends of a control. In some implementations, the two ends of a control may be two preset ends of the control. For example, where the control is text information, its two ends may be the top and bottom of one character in the text information. In other implementations, the two ends of a control may be the ends of its shortest dimension as observed from the preset observation point. For example, where the control is a control button and the control button is rectangular, its two ends may be the two ends corresponding to the width of the rectangle.
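From this definition, when the line of sight is roughly perpendicular to the control, the interaction angle can be computed from the spatial extent between the control's two ends and the control's spatial distance from the preset observation point. The following is a minimal Python sketch of that geometry (the function name is illustrative, not from the application):

```python
import math

def interaction_angle(extent, distance):
    """Interaction angle (in degrees) subtended at the preset observation
    point by a control whose two ends span `extent` (spatial units) and
    whose centre lies `distance` away, viewed perpendicularly."""
    return math.degrees(2 * math.atan(extent / (2 * distance)))

# Moving a control farther away shrinks its interaction angle:
# at twice the distance the angle is roughly halved.
```

For small angles this is approximately `extent / distance` in radians, which is why either a larger spatial distance or a smaller spatial size can push a control below its preset interaction angle.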
Here, the preset interaction angles of different controls may be the same or different. Below, taking text information and a control button as examples, the case where different controls have different preset interaction angles is illustrated. The interaction angle of a control indicates how easily the user can distinguish the control: when the interaction angle is greater than or equal to a certain angle (i.e., the preset interaction angle), the user can see the control clearly; when it is smaller than that angle, the user may find the control difficult to see.
As shown in fig. 4, taking text information as the control, the interaction angle of the text information is the included angle formed by the lines from the preset observation point to the top and bottom of the text information. The minimum interaction angle at which the user can distinguish the text information may be 0.6°, and the optimal interaction angle for reading it may be 0.75°. When the interaction angle of the text information equals 0.6°, the text information is at its minimum interaction size. If the interaction angle decreases further below 0.6°, the image size of the text information falls below the minimum interaction size, and the user can hardly distinguish the contents of the text information. When the interaction angle equals 0.75°, the text information is at its optimal interaction size. That is, when the interaction angle is greater than or equal to 0.75°, the image size of the text information is greater than or equal to the optimal interaction size, and the user can easily recognize its contents. When the interaction angle is smaller than 0.75° and larger than 0.6°, the image size of the text information lies between its minimum and optimal interaction sizes: the user can still distinguish the text information, but with more effort.
Taking a control button as the control: the minimum interaction angle at which the user can operate the control button may be 1.6°, and the optimal interaction angle for operating it may be 3°. When the interaction angle equals 1.6°, the control button is at its minimum interaction size. If the interaction angle decreases further below 1.6°, the image size of the control button falls below the minimum interaction size, and the user can hardly operate the control button. When the interaction angle equals 3°, the control button is at its optimal interaction size. That is, when the interaction angle is greater than or equal to 3°, the image size of the control button is greater than or equal to the optimal interaction size, and the user can easily operate it. When the interaction angle is smaller than 3° and larger than 1.6°, the image size of the control button lies between its minimum and optimal interaction sizes: the user can still operate the control button, but with more effort.
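The two thresholds for each control type split the interaction angle into three ranges: easy, effortful, and impractical. A minimal sketch of that classification, assuming the example threshold values above (function name and labels are illustrative):

```python
def interaction_state(angle_deg, min_angle, optimal_angle):
    """Classify an interaction angle against a control's preset thresholds."""
    if angle_deg >= optimal_angle:
        return "easy"         # image size >= optimal interaction size
    if angle_deg >= min_angle:
        return "effortful"    # between minimum and optimal interaction size
    return "impractical"      # image size < minimum interaction size

TEXT_ANGLES = (0.6, 0.75)   # minimum / optimal angles for text information
BUTTON_ANGLES = (1.6, 3.0)  # minimum / optimal angles for a control button
```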
In the embodiment of the present application, the preset interaction angle of a control may be set to either the minimum interaction angle or the optimal interaction angle of that control.
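Inverting the angle relation gives the spatial size at which a control exactly subtends a preset interaction angle at a given distance, i.e. h = 2·d·tan(θ/2); holding a control at this size as the distance grows is what keeps its image size constant. A hedged sketch under the perpendicular-viewing assumption (the function name is illustrative):

```python
import math

def size_for_preset_angle(distance, preset_angle_deg):
    """Spatial extent at which a control subtends exactly `preset_angle_deg`
    at the preset observation point: h = 2 * d * tan(theta / 2)."""
    return 2 * distance * math.tan(math.radians(preset_angle_deg) / 2)
```

For example, with the text thresholds above, a control 5 spatial units away must span about 0.052 units to stay at the 0.6° minimum interaction angle, and about 0.065 units to stay at the 0.75° optimal interaction angle.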
An XR object has a plurality of parameters in the XR space. For example, the plurality of parameters includes: spatial size (i.e., size), spatial distance from the preset observation point (i.e., distance, simply referred to as spatial distance), spatial position, color, shape, and the like. The target parameter may be any parameter among the plurality that causes the image size of the XR object to change. For example, the target parameter is the spatial size or the spatial distance.
It is understood that to describe an XR object in the XR space, the target parameters may include multiple dimensions. For example, the target parameter is a spatial dimension including a spatial length, a spatial width, a spatial height. Accordingly, the parameter values of the target parameters also include numerical values respectively corresponding to the plurality of dimensions. For example, the first value of the spatial dimension includes a first spatial length, a first spatial width, and a first spatial height. The second value of the spatial dimension includes a second spatial length, a second spatial width, and a second spatial height. Of course, the target parameter may also include only one dimension. For example, the target parameter is a spatial distance, which corresponds to one dimension. The parameter value of the spatial distance also includes a value of one dimension.
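One possible way to represent a multi-dimensional target parameter next to a single-dimension one is sketched below in Python (the type and field names are illustrative, not part of the application):

```python
from dataclasses import dataclass

@dataclass
class SpatialSize:
    """Spatial dimension: a target parameter with three dimension values."""
    length: float
    width: float
    height: float

@dataclass
class XRObjectParams:
    """Parameters of one XR object in the XR space."""
    size: SpatialSize   # multi-dimensional target parameter
    distance: float     # spatial distance: single-dimension target parameter
```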
As described above, the method provided by the embodiment of the application can be applied to an XR device or an XR system. Several implementation scenarios are provided below in which an XR device or electronic device provides XR space.
In one implementation scenario, the XR device serves not only as the display device of the XR space but also as the master device of the XR space. The XR device may be configured with physical keys, sensing elements, and the like; for example, a key for switching the XR device on and off, keys for adjusting the screen brightness, keys for adjusting the viewing angle, and the like. The XR device may receive user operations and, in response, control and display the XR space.
For example, the XR device receives a first operation by which the user zooms out a first XR object. The first operation triggers the XR device to reduce the image size of the first XR object. In response to the first operation, the XR device increases the spatial distance of the first XR object and obtains the interaction angle of a first control associated with the first XR object. When the interaction angle of the first control has decreased to the preset interaction angle, the XR device holds the interaction angle of the first control unchanged.
In another implementation scenario, an XR system includes an electronic device and an XR device. The electronic device acts as the master device of the XR space, and the XR device acts as the display device of the XR space. The XR device is connected to the electronic device; the electronic device provides the XR space, and the XR device displays it.
Here, in the XR system, the electronic device and the XR device may be connected by a wired or wireless means. The wired connection may include a wired connection that communicates through a universal serial bus (Universal Serial Bus, USB) interface, a high-definition multimedia interface (High Definition Multimedia Interface, HDMI), or the like. The wireless connection may include one or more of a wireless connection that communicates via Bluetooth, wi-Fi, radio frequency, or the like. The embodiment of the application does not limit the connection mode between the two.
Illustratively, the electronic device in the XR system is a cell phone and the XR device is a pair of XR glasses. As shown in fig. 5, a wireless connection is established between the cell phone and the XR glasses. The XR space is a three-dimensional desktop. The cell phone provides the XR interface of the three-dimensional desktop to the XR glasses, and the XR glasses display it through their lenses (i.e., display screens). Wearing the XR glasses, the user can view the three-dimensional desktop.
In one implementation of this scenario, the electronic device may act as the input device in the XR system. The electronic device receives user operations such as clicking and sliding, and adjusts the target parameter of an XR object in the XR space according to the user operation. For example, the electronic device receives a first operation by which the user zooms out a first XR object. The first operation triggers the electronic device to reduce the image size of the first XR object. In response to the first operation, the electronic device adjusts the spatial distance of the first XR object and obtains the interaction angle of a first control associated with the first XR object. When the interaction angle of the first control has decreased to the preset interaction angle, the electronic device holds the interaction angle of the first control unchanged.
In another implementation of this scenario, the XR device may act as the input device in the XR system. The XR device receives a user operation and then sends an operation instruction to the electronic device for the electronic device to respond to. For example, the XR device receives a user operation to zoom out a first XR object and sends a first instruction to the electronic device. The first instruction triggers the electronic device to reduce the image size of the first XR object. In response to the first instruction, the electronic device adjusts the spatial distance of the first XR object and obtains the interaction angle of a first control associated with the first XR object. When the interaction angle of the first control has decreased to the preset interaction angle, the electronic device holds the interaction angle of the first control unchanged.
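The flow shared by these implementations — increase the spatial distance, recompute the interaction angle of the first control, and hold the angle once it reaches the preset interaction angle — might be sketched as follows (a sketch under the perpendicular-viewing assumption; function and variable names are illustrative):

```python
import math

def apply_zoom_out(distance, control_extent, preset_angle_deg):
    """Handle one zoom-out step: the object's spatial distance has already
    been increased to `distance`; return the control's (possibly enlarged)
    spatial extent and its resulting interaction angle in degrees."""
    angle = math.degrees(2 * math.atan(control_extent / (2 * distance)))
    if angle < preset_angle_deg:
        # Hold the control at the preset interaction angle by enlarging
        # its spatial extent, so its image size stops shrinking.
        control_extent = 2 * distance * math.tan(math.radians(preset_angle_deg) / 2)
        angle = preset_angle_deg
    return control_extent, angle
```

With a preset angle of 0.6°, a control spanning 0.03 units at distance 5 is clamped (its natural angle is only about 0.34°), while the same control at distance 1 is left unchanged.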
It will be appreciated that the image data of the XR interface presented by the XR device may be two-dimensional image data. In some implementations, the image data of the XR interface transmitted by the electronic device to the XR device may also be three-dimensional image data, which includes depth information in addition to the two-dimensional image data.
In some implementations, in order for the user to perceive a three-dimensional XR space, the XR interface displayed by the XR device at any moment comprises two images with parallax. Through the parallax, the user gains the sensation of being inside a three-dimensional space.
The principle by which an XR device presents a three-dimensional XR interface of the XR space is illustrated here taking XR glasses as the XR device. The XR glasses include a first display screen and a second display screen, and the XR interface presented at any moment comprises a first image and a second image, which are two images with parallax. Parallax is the difference in the apparent position of an object when the same object is observed from two separated positions (e.g., the positions of the two human eyes). The XR glasses display the first image on the first display screen and the second image on the second display screen. When the user wears the XR glasses, one eye views the first image displayed on the first display screen, and the other eye views the second image displayed on the second display screen; the two images are fused in the user's brain into one complete, stereoscopic image. In this way, the user wearing the XR glasses experiences the sensation of being immersed in the three-dimensional XR space.
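As an illustration of how the two parallax images could be generated, the sketch below projects the horizontal position of a point onto a left-eye and a right-eye image with a simple pin-hole model (the interpupillary distance and focal length are assumed values, not from the application):

```python
def stereo_projection(x, depth, ipd=0.063, focal_px=800.0):
    """Project a point's horizontal position `x` at `depth` (same units as
    ipd, e.g. metres) into left- and right-eye image coordinates; each eye
    is offset horizontally by +/- ipd / 2 from the midpoint between them."""
    left = focal_px * (x + ipd / 2) / depth
    right = focal_px * (x - ipd / 2) / depth
    return left, right

# The disparity (left - right) equals focal_px * ipd / depth, so nearer
# points have larger disparity and appear closer when the two images fuse.
```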
In some implementations, to facilitate the user's manipulation of any XR object within the XR space, the XR device or electronic device may also provide a cursor in the XR interface of the XR space. The cursor follows the movement track of the user's operation and indicates which XR object the user is currently operating. The display style of the cursor may be set according to the actual application scenario or user preference; for example, the cursor may be displayed as an arrow, a circle, a ray, or the like.
By way of example, the electronic device may be a cell phone, a tablet, a desktop computer, a laptop, a handheld computer, a notebook, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, a personal digital assistant (personal digital assistant, PDA), or the like. XR devices include AR devices, VR devices, MR devices, and the like. By way of example, the XR device may be an XR helmet, XR glasses, an XR player, or the like. The XR device has a display screen, through which it displays the XR interface of the XR space and brings the user an experience beyond reality. The embodiment of the present application does not limit the specific forms of the electronic device and the XR device. The electronic device may be configured to generate the XR interface of the XR space and transfer the XR interface to the XR device. The XR device may provide the XR space to the user through the displayed XR interface.
In the embodiment of the present application, taking the mobile phone 100 as an example, the hardware structure of the electronic device is introduced through the mobile phone 100. As shown in fig. 6, the mobile phone 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc.
The processor 110 may include one or more processing units. For example: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, a neural network processor (neural-network processing unit, NPU), and/or a driver processor, and the like. The different processing units may be separate devices or may be integrated in one or more processors. The processor 110 may be the nerve center and command center of the cell phone 100. The processor 110 may generate operation control signals according to the instruction operation code and timing signals, completing the control of instruction fetching and instruction execution.
In an embodiment of the present application, the cell phone 100 may generate an XR interface of the XR space through the GPU. The GPU is used for performing mathematical and geometric calculations and performing graphics rendering according to an XR space to obtain an XR interface of the XR space.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use those instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving the efficiency of the system.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capabilities of the handset 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the cellular phone 100 and data processing by executing instructions stored in the internal memory 121. For example, in an embodiment of the present application, the processor 110 may include a storage program area and a storage data area by executing instructions stored in the internal memory 121.
The storage program area may store, among other things, an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, a configuration file of the motor 191, etc. The storage data area may store data (e.g., audio data, phonebook, etc.) created during use of the handset 100, etc. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. The charging management module 140 may also supply power to the mobile phone 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. In some embodiments, the power management module 141 and the charge management module 140 may also be provided in the same device.
The wireless communication function of the mobile phone 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. In some embodiments, the antenna 1 and the mobile communication module 150 of the handset 100 are coupled, and the antenna 2 and the wireless communication module 160 are coupled, so that the handset 100 can communicate with a network and other devices through wireless communication technology.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied to the handset 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation.
The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc. applied to the handset 100.
The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In the embodiment of the present application, the mobile phone 100 may send an XR interface or a communication command of an XR space to an XR device through the mobile communication module 150, the wireless communication module 160, or the wired interface.
The handset 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The sensor module 180 may include sensors such as a pressure sensor, a gyroscope sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a hall sensor, a touch sensor, an ambient light sensor, and a bone conduction sensor. The cell phone 100 may collect various data through the sensor module 180.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light-emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like.
In XR mode, handset 100 may display the scene interface in XR space in synchronization with the XR device via display 194. Alternatively, cell phone 100 displays an XR-mode control interface via display 194, and the XR device displays an XR interface in the XR space. The user may operate on the control interface to control the XR objects in the XR space as viewed through the XR device.
The mobile phone 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display 194, an application processor, and the like. The ISP is used to process data fed back by the camera 193. The camera 193 is used to capture still images or video. In some embodiments, the cell phone 100 may include 1 or more cameras 193.
The keys 190 include a power-on key, a volume key, etc. The keys 190 may be mechanical keys. Or may be a touch key. The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. The indicator 192 may be an indicator light, may be used to indicate a state of charge, a change in charge, a message indicating a missed call, a notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 195 or removed from the SIM card interface 195 to enable contact and separation with the handset 100. The handset 100 may support 1 or more SIM card interfaces. The SIM card interface 195 may support Nano SIM cards, micro SIM cards, and the like.
It should be understood that the connection relationship between the modules illustrated in this embodiment is only illustrative and does not limit the structure of the electronic device. In other embodiments, the electronic device may include more or fewer modules than provided in the foregoing embodiment, and the modules may be connected through different interfaces or through a combination of multiple interfaces. The hardware structure of the electronic device provided in the embodiment of the present application may also refer to the hardware structure of the mobile phone 100 shown in fig. 6. The methods in the following embodiments may be implemented in an electronic device having the above hardware structure.
In the embodiment of the present application, taking the XR device as the XR glasses 200 as an example, the hardware structure of the XR device is described by the XR glasses 200. As shown in fig. 7, XR glasses 200 may include: processor 201, memory 202, communication module 203, sensor 204, camera 205, display module 206, audio module 207. The above components may be coupled and communicate with each other.
The processor 201 may include one or more processing units, such as: the processor may include an AP, a modem processor, a GPU, an ISP, a controller, a video codec, a DSP, a baseband processor, and/or an NPU, etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
Memory 202 may store some executable instructions. The memory 202 may include a stored program area and a stored data area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required for at least one function, and the like. The storage data area may store data (e.g., audio data, image data, etc.) used by the XR glasses 200, and so forth. In addition, memory 202 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash memory (universal flash storage, UFS), and the like. Processor 201 performs various functional applications and data processing of XR glasses 200 by executing instructions stored in memory 202 and/or instructions stored in a memory provided in the processor.
The communication module 203 may include a mobile communication module and a wireless communication module. The mobile communication module may provide a solution including 2G/3G/4G/5G wireless communication applied to the XR glasses 200. The wireless communication module may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the XR glasses 200. The wireless communication module may be one or more devices that integrate at least one communication processing module.
The sensor 204 may include an accelerometer, a compass, a gyroscope, a magnetometer, or another sensor for detecting motion, or the like. The sensor 204 is used to collect corresponding data; for example, the acceleration sensor collects the acceleration of the XR glasses 200, and the gyroscope sensor collects the angular velocity of the XR glasses 200. The data collected by the sensor 204 may reflect the movement of the head of the user wearing the XR glasses 200. In some embodiments, the sensor 204 may be an inertial measurement unit (inertial measurement unit, IMU) disposed within the XR glasses 200. In some embodiments, the XR glasses 200 may send data acquired by the sensor system to the processor 201 for analysis. The processor 201 may determine the motion of the user's head from the data collected by the various sensors and perform corresponding functions based on that motion, e.g., changing the viewing angle of the XR space based on the motion of the user's head. The movement of the user's head may include: whether it is rotated, the direction of rotation, etc.
The sensor 204 may also include an optical sensor for tracking the user's eye position and capturing eye movement data in conjunction with the camera 205. The eye movement data may be used, for example, to determine the distance between the eyes of the user, the position of each eye relative to the XR glasses 200, the magnitude and gaze direction of the twist and rotation (i.e., turning, pitching, and panning) of each eye, and so forth. In one example, infrared light is emitted within XR glasses 200 and reflected from each eye, the reflected light is detected by camera 205 or an optical sensor, and the detected data is transmitted to processor 201, such that processor 201 analyzes the position, pupil diameter, movement status, etc. of the user's eyes from changes in the infrared light reflected from each eye.
The camera 205 may be used to capture a still image or video. The still image or video may be an outward-facing image or video of the user's surroundings, or an inward-facing image or video of the user's face. The camera 205 may track the movement of one or both of the user's eyes. The cameras 205 include, but are not limited to, conventional color cameras (RGB cameras), depth cameras (RGB depth cameras), dynamic vision sensor (dynamic vision sensor, DVS) cameras, and the like. The depth camera can acquire depth information of a photographed object. In some embodiments, the camera 205 may be used to capture images of the user's eyes and send the images to the processor 201 for analysis. The processor 201 may determine a movement state of the user's eyes according to the image acquired by the camera 205, and perform a corresponding function according to that movement state. The movement state of the user's eyes may include: whether the eyes are rotated, the direction of rotation, whether they have not rotated for a long period of time, the line-of-sight angle, etc.
The display module 206 may include: one or more display screens, and one or more optical components. The display screen may include a display panel that may be used to display an XR interface of the XR space. The display panel may employ an LCD, OLED, AMOLED, FLED, miniLED, microLED, micro-OLED, QLED, or the like. The optical assembly may be used to direct light from the display screen to the exit pupil for perception by a user. In some embodiments, one or more optical elements (e.g., lenses) in the optical assembly may have one or more coatings, such as an anti-reflective coating. The magnification of the image light by the optical assembly allows the display screen to be physically smaller, lighter, and lower in power consumption. In addition, the magnification of the image light can increase the field of view of the content displayed by the display screen. For example, the optical assembly may cause the field of view of the content displayed by the display screen to be the full field of view of the user. The one or more display screens include a first display screen and a second display screen. The first display screen and the second display screen may be different display areas of the same display screen, or may be two different display screens. The first display screen is used for displaying a first image, the second display screen is used for displaying a second image, and the first image and the second image have parallax. In this way, the user can feel an immersive experience in the XR space.
In an embodiment of the present application, XR glasses 200 may display XR interface data of the XR space transmitted by cell phone 100 via display module 206. XR glasses 200 may provide an XR experience for a user.
The audio module 207 is used for realizing collection and output of audio. The audio module 207 may include, but is not limited to, a microphone, speaker, headphones, and the like. The audio module 207 may provide stereo audio for the user to experience stereo sound effects.
It will be appreciated that the structure shown in fig. 7 does not constitute a particular limitation of the XR device. In other embodiments of the application, the XR device may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. For example, the XR device may also include physical keys such as on-off keys, volume keys, screen brightness adjustment keys, and various types of interfaces (e.g., USB interfaces), among others. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The hardware structure of the XR device provided in the embodiments of the present application may also be referred to as the hardware structure of the XR glasses 200 as shown in the figure. The methods of the following embodiments may be implemented in an XR device having the hardware configuration described above.
In the following embodiments, an implementation scenario in which the embodiments of the present application are applied to the XR system is taken as an example to describe the method provided by the embodiments of the present application. In this implementation scenario, the electronic device in the XR system may enter the XR mode immediately upon establishing a connection with the XR device. Alternatively, the electronic device may enter the XR mode when a connection is established with the XR device and a particular application is running. The particular application may be an XR application, a desktop application, a screen-casting application, etc. Upon entering the XR mode, the electronic device may provide the XR device with an XR interface of the XR space, which is presented to the user by the XR device.
Illustratively, in the scenario shown in FIG. 5, the electronic device in the XR system is a cell phone and the XR device in the XR system is a pair of XR glasses. The cell phone establishes a wireless connection (such as a Bluetooth connection or a Wi-Fi connection) with the XR glasses. The XR space is a three-dimensional desktop, and the first XR object (i.e., any one of the XR objects in the XR space) is a gallery application window in the three-dimensional desktop. The first control associated with the first XR object (i.e., any control associated with the first XR object) is the window title of the gallery application window. Taking as an example a case in which the interaction angle corresponding to the window title continuously decreases from greater than a preset interaction angle down to the preset interaction angle, as shown in fig. 8, the method provided by the embodiment of the application includes the following steps:
S801, the mobile phone controls the XR glasses to display a first XR interface of the three-dimensional desktop, wherein the first XR interface comprises a gallery application window with a first image size and a window title with a second image size.
In the XR mode, the handset transmits the XR interface of the three-dimensional desktop (i.e., an example of the XR space) to the XR glasses in real time. The user wears the XR glasses, and can experience the three-dimensional desktop through the XR glasses. Here, the cell phone transmits a first XR interface of the three-dimensional desktop to the XR glasses, which presents the first XR interface of the three-dimensional desktop to the user.
S80a, XR glasses display a first XR interface of the three-dimensional desktop.
The XR glasses receive a first XR interface of the three-dimensional desktop transmitted by the mobile phone, and display the first XR interface to a user.
For example, as shown at A in fig. 3, the first XR interface displayed by the XR glasses includes a gallery application window of a first image size and a window title of a second image size. At this time, the interaction angle corresponding to the window title associated with the gallery application window is larger than the preset interaction angle. The user can easily distinguish the text in the window title.
S802, the mobile phone receives a first operation, wherein the first operation is used for triggering the mobile phone to reduce the image size of the gallery application window.
The mobile phone may receive a first operation by a user, for example, a first operation performed on the screen of the mobile phone or on a physical key of the mobile phone. The first operation is used for instructing the mobile phone to adjust a target parameter of the gallery application window, and can trigger the mobile phone to reduce the image size of the gallery application window.
For example, taking the case where the target parameter is the spatial distance, a user operation of sliding up and down on the mobile phone screen may trigger the mobile phone to adjust the spatial distance of the gallery application window. The first operation may be a user operation in which the user slides up on the cell phone screen, for indicating to increase the spatial distance of the gallery application window. The operation of increasing the spatial distance of the gallery application window causes the image size of the gallery application window to be reduced; i.e., the first operation can trigger the mobile phone to reduce the image size of the gallery application window.
For another example, taking the case where the target parameter is the spatial size, a user operation of sliding left and right on the mobile phone screen may trigger the mobile phone to adjust the spatial size (i.e., the target parameter) of the gallery application window. The first operation may be a user operation in which the user slides left on the cell phone screen, for indicating to reduce the spatial size of the gallery application window. The operation of reducing the spatial size of the gallery application window causes the image size of the gallery application window to be reduced; i.e., the first operation can also trigger the mobile phone to reduce the image size of the gallery application window.
S803, responding to the first operation, and obtaining an interaction angle corresponding to a window title associated with the gallery application window by the mobile phone.
In response to the first operation, the mobile phone can acquire the interaction angle of the window title associated with the gallery application window. For example, the mobile phone may obtain the interaction angle corresponding to the window title according to the spatial distance and the spatial size of the window title.
Here, the interaction angle corresponding to the window title may be the included angle formed by the lines connecting a preset observation point to the two ends of the window title along its shorter dimension. For example, in the case where the window title of the gallery application window is arranged laterally relative to the human eye, the interaction angle corresponding to the window title may be the included angle formed by the lines connecting the preset observation point to the topmost and bottommost ends of the window title. For another example, in the case where the window title of the gallery application window is arranged vertically relative to the human eye, the interaction angle corresponding to the window title may be the included angle formed by the lines connecting the preset observation point to the leftmost and rightmost ends of the window title.
In some implementations, the interaction angle of the window title is related to the spatial size of the window title, the spatial distance of the window title. The mobile phone can pre-store the preset corresponding relation among the interactive angle, the space size and the space distance of the window title. The mobile phone can calculate the interaction angle corresponding to the window title according to the preset corresponding relation.
For example, take the case where the interaction angle corresponding to the window title is the included angle formed by the lines connecting the preset observation point to the top end and the bottom end of the window title, and the spatial dimension of the window title includes the spatial height. As shown in fig. 9, the interaction angle corresponding to the window title is denoted as α. α is the difference between the angle that the line from the top end of the window title to the preset observation point forms with the horizontal plane passing through the preset observation point, and the angle that the line from the bottom end of the window title to the preset observation point forms with that horizontal plane. The included angle between the line from the bottom end of the window title to the preset observation point and the horizontal plane is denoted as β. The spatial height of the window title is denoted as h1, and the height difference between the bottom end of the window title and the preset observation point in the vertical direction is denoted as h2. The height difference between the top end of the window title and the preset observation point in the vertical direction is denoted as H. The spatial distance between the preset observation point and the plane in which the window title is located is denoted as d. From geometry, it can be seen that:
tan(α + β) = H/d   formula (1);
tan β = h2/d   formula (2);
Formula (3) can be obtained from formula (1) and formula (2) above:
h1 = H − h2 = [tan(α + β) − tan β] · d   formula (3).
H, h1, h2, and d in the above formulas can be obtained from the spatial position of the preset observation point and the spatial positions of the top end and the bottom end of the window title.
It can be seen that the interaction angle corresponding to the window title has the preset correspondence shown in formula (3) with the spatial height of the window title and with the spatial distance from the window title to the preset observation point. The mobile phone can calculate the interaction angle corresponding to the window title according to this preset correspondence.
It is understood that the included angle formed between a connecting line and the horizontal plane passing through the preset observation point can be positive or negative. The included angle formed by a connecting line above the horizontal plane with the horizontal plane is positive, and the included angle formed by a connecting line below the horizontal plane with the horizontal plane is negative. For example, β may be negative when the bottom end of the window title is below the horizontal plane passing through the preset observation point.
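The geometric relationship above can be checked numerically. The following Python sketch (illustrative only; the function name and degree-based interface are assumptions, not part of the application) computes the interaction angle α subtended at the preset observation point, following formulas (1) and (2) and the sign convention just described:

```python
import math

def interaction_angle(top_y: float, bottom_y: float, viewpoint_y: float, d: float) -> float:
    """Interaction angle (in degrees) subtended at the preset observation point
    by a control whose top and bottom ends sit at heights top_y and bottom_y,
    in a plane at spatial distance d from the observation point.

    Lines below the horizontal plane through the observation point yield
    negative angles, matching the sign convention in the text (beta < 0 when
    the bottom end is below that plane).
    """
    H = top_y - viewpoint_y      # height of the top end above the observation point
    h2 = bottom_y - viewpoint_y  # signed height of the bottom end
    # alpha + beta = atan(H / d) and beta = atan(h2 / d), per formulas (1) and (2)
    return math.degrees(math.atan2(H, d) - math.atan2(h2, d))
```

For instance, a title whose bottom end is level with the observation point and whose top end is one distance-unit above it at d = 1 subtends 45 degrees; doubling d shrinks the angle, which is why a receding control eventually needs its spatial size adjusted.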
S804, the cell phone controls the XR glasses to display a second XR interface of the three-dimensional desktop, the second XR interface comprising: a dynamic image in which the image size of the gallery application window changes from large to small in response to the first operation, and in which the image size of the window title changes from large to small in response to the first operation.
After the mobile phone acquires the interaction angle corresponding to the window title, it can compare that interaction angle with the preset interaction angle corresponding to the window title, and judge whether the interaction angle of the window title is larger than the preset interaction angle. If the interaction angle corresponding to the window title is larger than the preset interaction angle, it indicates that the image size of the window title is still large and the user can currently see the window title clearly. In this case, the mobile phone adjusts the target parameter of the gallery application window and the target parameter of the window title while keeping the ratio of the spatial size of the window title to the spatial size of the gallery application window unchanged. At the same time, the cell phone controls the XR glasses to display the second XR interface of the three-dimensional desktop.
For example, in the case where the target parameter is the spatial distance, the mobile phone may increase the spatial distance of the gallery application window and of the window title while controlling the size ratio of the window title to the gallery application window to remain at the preset size ratio. The interaction angle corresponding to the window title becomes smaller as the spatial distance becomes larger. As the spatial distance of the gallery application window and the window title increases, in the second XR interface provided by the XR glasses, the image sizes of the gallery application window and the window title decrease with the increasing spatial distance.
For another example, in the case where the target parameter is the spatial size, the mobile phone may reduce the spatial sizes of the gallery application window and the window title while controlling the size ratio of the window title to the gallery application window to remain at the preset size ratio. The interaction angle corresponding to the window title decreases as the spatial size becomes smaller. As the spatial sizes of the gallery application window and the window title change from large to small, in the second XR interface provided by the XR glasses, the image sizes of the gallery application window and the window title become smaller with the decreasing spatial sizes.
Here, the size ratio of the window title to the gallery application window may be the ratio between the spatial size of the window title and the spatial size of the gallery application window. The size ratio may include: the ratio of the spatial length of the window title to the spatial length of the gallery application window, the ratio of the spatial width of the window title to the spatial width of the gallery application window, and the ratio of the spatial height of the window title to the spatial height of the gallery application window. That the size ratio of the window title to the gallery application window remains unchanged may be understood as all three of these ratios remaining unchanged.
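As a sketch of the three-component size ratio just described (the class and function names are illustrative assumptions, not from the application), scaling the window title and the gallery application window by the same factor leaves all three ratios unchanged:

```python
from dataclasses import dataclass

@dataclass
class SpatialSize:
    """Spatial length, width, and height of an XR object or control."""
    length: float
    width: float
    height: float

    def scaled(self, k: float) -> "SpatialSize":
        """Uniformly scale all three dimensions by factor k."""
        return SpatialSize(self.length * k, self.width * k, self.height * k)

def size_ratio(title: SpatialSize, window: SpatialSize) -> tuple:
    """Per-dimension ratio of the title's spatial size to the window's."""
    return (title.length / window.length,
            title.width / window.width,
            title.height / window.height)
```

Because scaled() multiplies numerator and denominator by the same k, size_ratio(title.scaled(k), window.scaled(k)) equals size_ratio(title, window) up to floating-point rounding; this is the invariant that step S804 maintains while the interaction angle is still above the preset interaction angle.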
It can be appreciated that, in the case where the interaction angle corresponding to the window title is greater than the preset interaction angle, the window title and the gallery application window may be regarded as a whole: the spatial position of the window title relative to the gallery application window is unchanged, and the spatial size of the window title relative to the gallery application window is unchanged.
S80b, the XR glasses display a second XR interface of the three-dimensional desktop.
The XR glasses receive a second XR interface of the three-dimensional desktop transmitted by the mobile phone, and display the second XR interface to a user.
For example, the first operation is for instructing the mobile phone to increase the spatial distance of the gallery application window in the three-dimensional desktop, and the second XR interface displayed to the user by the XR glasses may be a dynamic image changing from A in fig. 3 to B in fig. 3. In the second XR interface, the image size of the gallery application window changes from large to small, and the image size of the window title associated with the gallery application window also changes from large to small. The interaction angle corresponding to the window title in B in fig. 3 has just been reduced to the preset interaction angle.
S805, if the interaction angle corresponding to the window title is reduced to the preset interaction angle while the mobile phone is responding to the first operation, the mobile phone controls the XR glasses to display a third XR interface.
The third XR interface comprises: a dynamic image in which the image size of the gallery application window changes from large to small in response to the first operation, and a window title of a third image size. The third image size is smaller than the second image size. The third image size corresponds to the preset interaction angle; that is, the interaction angle corresponding to a window title of the third image size is the preset interaction angle.
While adjusting the target parameter of the gallery application window in the three-dimensional desktop in response to the first operation, the mobile phone may periodically acquire the interaction angle corresponding to the window title. For example, the mobile phone may acquire the interaction angle of the window title at a certain time interval. Alternatively, when the difference between the interaction angle of the window title and the preset interaction angle is large, e.g., larger than a first angle, the mobile phone acquires the interaction angle of the window title at a larger time interval (such as a first time interval). When the difference between the interaction angle of the window title and the preset interaction angle is small, e.g., smaller than or equal to the first angle, the mobile phone acquires the interaction angle of the window title at a smaller time interval (such as a second time interval).
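The two-tier polling schedule described above can be sketched as follows (a minimal illustration; the function name and parameters are assumptions, not from the application):

```python
def next_poll_interval(angle: float, preset_angle: float, first_angle: float,
                       first_interval: float, second_interval: float) -> float:
    """Choose how long to wait before re-reading the control's interaction angle.

    While the angle is still far from the preset angle (difference above
    first_angle), poll coarsely at first_interval; once it is close, poll
    finely at the shorter second_interval so the preset angle is not overshot.
    """
    if abs(angle - preset_angle) > first_angle:
        return first_interval
    return second_interval
```

For example, with a preset angle of 5 degrees and a first angle of 10 degrees, an angle of 30 degrees would be polled at the coarse interval and an angle of 7 degrees at the fine one.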
If the interaction angle corresponding to the window title is reduced to the preset interaction angle while the mobile phone is responding to the first operation, the window title can be considered to have reached the minimum, or optimal, interaction size that the user can still distinguish. If the interaction angle corresponding to the window title were reduced further, the user would find it difficult to see or distinguish the window title. In this case, the mobile phone continues to adjust the target parameter of the gallery application window while keeping the interaction angle corresponding to the window title unchanged (equal to the preset interaction angle). At the same time, the mobile phone generates a third XR interface of the three-dimensional desktop and transmits it to the XR glasses, and the XR glasses display the third XR interface to the user.
For example, the first operation is for indicating to increase the spatial distance of the gallery application window. While responding to the first operation, if the interaction angle corresponding to the window title is reduced to the preset interaction angle, the mobile phone continues to increase the spatial distance of the gallery application window and the window title while keeping the interaction angle corresponding to the window title at the preset interaction angle. At the same time, the mobile phone transmits the third XR interface to the XR glasses.
In some implementations, if the interaction angle corresponding to the window title is reduced to the preset interaction angle while the mobile phone is responding to the first operation, the mobile phone adjusts the spatial size of the window title so that the interaction angle corresponding to the window title remains at the preset interaction angle.
For example, from the preset correspondence among the interaction angle of the window title, the spatial size of the window title, and the spatial distance of the window title (such as the correspondence described in formula (3) above), the mobile phone may calculate the spatial size that corresponds to the current spatial distance of the window title when the interaction angle corresponding to the window title equals the preset interaction angle.
In this way, while responding to the first operation, if the interaction angle corresponding to the window title is reduced to the preset interaction angle, the mobile phone can keep the interaction angle corresponding to the window title at the preset interaction angle by adjusting the spatial size of the window title. For example, if the first operation is used for indicating to increase the spatial distance of the gallery application window, the mobile phone can scale up the spatial size of the window title while increasing the spatial distance of the gallery application window and the window title, so that the interaction angle corresponding to the window title is kept at the preset interaction angle. For another example, if the first operation is used for indicating to reduce the spatial size of the gallery application window, the mobile phone adjusts the spatial size of the window title while reducing the spatial size of the gallery application window, so that the interaction angle corresponding to the window title is kept at the preset interaction angle.
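Inverting formula (3) gives the spatial height the window title needs at a given distance so that its interaction angle stays pinned at the preset angle. A minimal sketch (illustrative names; the degree-based input is an assumption):

```python
import math

def height_for_preset_angle(preset_alpha_deg: float, h2: float, d: float) -> float:
    """Spatial height h1 the window title must have at spatial distance d so
    that its interaction angle equals the preset angle.

    h2 is the signed height of the title's bottom end above the preset
    observation point (negative below the horizontal plane). Rearranged from
    formula (3): h1 = [tan(alpha + beta) - tan(beta)] * d, with beta = atan(h2/d).
    """
    beta = math.atan2(h2, d)
    alpha = math.radians(preset_alpha_deg)
    return d * (math.tan(alpha + beta) - math.tan(beta))
```

As the first operation pushes the window farther away (d grows), the returned h1 grows too, which corresponds to the scaling-up of the title's spatial size described above.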
S80c, XR glasses display a third XR interface of the three-dimensional desktop.
The XR glasses receive a third XR interface of the three-dimensional desktop transmitted by the mobile phone, and display the third XR interface to a user.
For example, the first operation is for instructing the mobile phone to increase the spatial distance of the gallery application window in the three-dimensional desktop. As shown at C in fig. 3, after the interaction angle corresponding to the window title associated with the gallery application window has been reduced to the preset interaction angle, the image size (here, the image height) of the window title is maintained at the third image size corresponding to the preset interaction angle, and the image height of the window title is no longer reduced as the gallery application window becomes smaller. The third XR interface presented to the user by the XR glasses may be a dynamic image changing from B in fig. 3 to C in fig. 3. In the third XR interface, the image size of the gallery application window changes from large to small, and the image size of the window title associated with the gallery application window is kept at the third image size corresponding to the preset interaction angle.
In some cases, it may be difficult for the handset to present the full title content during adjustment of the gallery application window in response to the first operation. In this case, the handset may set the window title to scroll mode so that the window title in the XR interface may scroll through the displayed title content. For example, as shown in fig. 3C, as the spatial distance of the gallery application window increases, the window title associated with the gallery application window in the XR interface has difficulty in rendering the complete title content "gallery-picture 1". In this case, the mobile phone can set the window title to a scrolling mode, so that the window title in the XR interface scrolls and displays "gallery-picture 1", and the user can see the complete window title.
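The scrolling (marquee) behavior can be sketched as a function that returns the visible slice of the title for each animation tick (a minimal character-based illustration; a real implementation would scroll at the pixel level):

```python
def scroll_text(full_text: str, window_chars: int, tick: int, sep: str = "   ") -> str:
    """Visible window_chars-wide slice of full_text at animation step tick.

    The text is padded with sep and wrapped around, so a title that is too
    long to fit the control scrolls through continuously.
    """
    looped = full_text + sep   # gap between the end of the text and its restart
    doubled = looped + looped  # makes the wrap-around slice trivial to take
    start = tick % len(looped)
    return doubled[start:start + window_chars]
```

Advancing tick once per frame makes "gallery-picture 1" drift leftward through a fixed-width title control and wrap back to the beginning.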
In the process of executing the steps S801-S805, in the XR interface displayed by the XR glasses to the user, the gallery application window is continuously reduced (i.e. the image size of the gallery application window is continuously reduced), and the window title associated with the gallery application window is firstly reduced and then kept unchanged (i.e. the image size of the window title associated with the gallery application window is firstly reduced and then kept unchanged). Therefore, the user can always distinguish the title content in the window title associated with the gallery application window, and the user experience is improved.
Fig. 8 above provides the procedure in which the interaction angle corresponding to the window title decreases from greater than the preset interaction angle down to the preset interaction angle. In some implementations, the user may also operate the mobile phone to enlarge the gallery application window or move it closer. Next, taking as an example a case in which the interaction angle corresponding to the window title associated with the gallery application window continuously increases from the preset interaction angle, as shown in fig. 10, the method provided by the embodiment of the application includes the following steps:
s1001, the handset controls the XR glasses to display a fourth XR interface of the three-dimensional desktop, the fourth XR interface including a gallery application window of a fourth image size and a window header of a third image size.
The interaction angle corresponding to the current window title is equal to the preset interaction angle, and the image size of the window title is equal to the third image size corresponding to the preset interaction angle. The cell phone generates a fourth XR interface of the three-dimensional desktop and transmits the fourth XR interface to the XR glasses.
S100a, XR glasses display a fourth XR interface of the three-dimensional desktop.
The XR glasses receive a fourth XR interface of the three-dimensional desktop transmitted by the mobile phone, and display the fourth XR interface to a user.
For example, as shown at D in fig. 11, the fourth XR interface displayed by the XR glasses includes a gallery application window of a fourth image size and a window title of a third image size. At this time, the interaction angle corresponding to the window title associated with the gallery application window is equal to the preset interaction angle. The user can distinguish the text in the window title.
S1002, the mobile phone receives a second operation, wherein the second operation is used for triggering the mobile phone to enlarge the image size of the gallery application window.
The handset may receive a second operation by the user. The second operation is used for indicating the mobile phone to adjust the target parameters of the gallery application window, and can trigger the mobile phone to enlarge the image size of the gallery application window.
For example, take the case where the target parameter is the spatial distance. The second operation may be a user operation in which the user slides down on the cell phone screen, for indicating to reduce the spatial distance of the gallery application window. The operation of reducing the spatial distance of the gallery application window causes the image size of the gallery application window to become larger; i.e., the second operation can trigger the mobile phone to enlarge the image size of the gallery application window.
As another example, the target parameter is a spatial dimension. The second operation may be a user operation by which the user slides right in the cell phone screen for indicating to enlarge the spatial size of the gallery application window. The operation of enlarging the spatial size of the gallery application window may cause the image size of the gallery application window to become larger. I.e. the second operation will also trigger the handset to enlarge the image size of the gallery application window.
S1003, responding to the second operation, and acquiring the size proportion of the window title and the gallery application window by the mobile phone.
In response to the second operation, the mobile phone can periodically acquire the size ratio of the window title to the gallery application window. For example, the mobile phone may obtain the spatial size of the window title and the spatial size of the gallery application window, and calculate the size ratio of the spatial size of the window title to the spatial size of the gallery application window.
It can be understood that, when the image size of the gallery application window is the fourth image size and the image size of the window title associated with it is the third image size, the interaction angle corresponding to the window title equals the preset interaction angle. At this point, the size ratio of the window title to the gallery application window may or may not equal the preset size ratio.
If the size ratio of the window title to the gallery application window is greater than the preset size ratio, this indicates that the mobile phone has adjusted the spatial size of the window title in order to keep its interaction angle at the preset interaction angle, so the size ratio no longer equals the preset size ratio. If the size ratio equals the preset size ratio, the interaction angle corresponding to the window title is exactly the preset interaction angle, and the mobile phone has not adjusted the spatial size of the window title to keep that interaction angle greater than or equal to the preset interaction angle.
S1004, the mobile phone controls the XR glasses to display a fifth XR interface of the three-dimensional desktop, the fifth XR interface including: a dynamic image in which the image size of the gallery application window changes from small to large in response to the second operation, and the window title at the third image size.
After the mobile phone obtains the size ratio of the window title to the gallery application window, it can compare that ratio with the preset size ratio and determine whether the ratio is greater than the preset size ratio. If it is, this is because the mobile phone has adjusted the spatial size of the window title so that the interaction angle corresponding to the window title is greater than or equal to the preset interaction angle, which is why the size ratio does not equal the preset size ratio. In this case, the mobile phone adjusts the target parameter of the gallery application window while keeping the interaction angle corresponding to the window title unchanged. At the same time, the mobile phone transmits the fifth XR interface of the three-dimensional desktop to the XR glasses, so that the XR glasses display it.
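The comparison in S1003/S1004 can be sketched minimally as follows (the helper names are hypothetical; the patent describes only the comparison itself): while the current ratio still exceeds the preset size ratio, the mobile phone pins the title's interaction angle and enlarges only the window, and once the ratio reaches the preset size ratio it scales title and window together.

```python
def choose_scaling_mode(title_size: float, window_size: float,
                        preset_ratio: float, eps: float = 1e-6) -> str:
    """Decide what to hold constant while the window is being enlarged."""
    ratio = title_size / window_size
    if ratio > preset_ratio + eps:
        # Ratio still above the preset: pin the title's interaction
        # angle and scale the window alone (S1004).
        return "hold_interaction_angle"
    # Ratio has reached the preset: scale title and window together (S1005).
    return "hold_size_ratio"

assert choose_scaling_mode(0.3, 1.0, preset_ratio=0.1) == "hold_interaction_angle"
assert choose_scaling_mode(0.1, 1.0, preset_ratio=0.1) == "hold_size_ratio"
```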
Here, while the spatial distance of the gallery application window is being reduced, the mobile phone reduces the spatial distance of the window title at the same time. From the user's point of view, the gallery application window grows as the spatial distance shrinks (i.e., the image size of the gallery application window increases as the spatial distance decreases), while the window title remains substantially unchanged (i.e., the image size of the window title remains substantially unchanged).
For example, when the target parameter is the spatial distance, the mobile phone may reduce the spatial distance of the gallery application window and the window title while keeping the interaction angle corresponding to the window title at the preset interaction angle. In the fifth XR interface provided by the XR glasses during this reduction, the image size of the gallery application window increases as the spatial distance decreases, and the image size of the window title remains substantially unchanged.
As another example, when the target parameter is the spatial size, the mobile phone may enlarge the spatial size of the gallery application window while keeping the interaction angle corresponding to the window title at the preset interaction angle. As the spatial size of the gallery application window is enlarged, in the fifth XR interface provided by the XR glasses, the image size of the gallery application window increases with the spatial size, and the image size of the window title remains substantially unchanged.
In some implementations, the mobile phone may adjust the spatial size of the window title so that the interaction angle corresponding to the window title is kept at the preset interaction angle. For example, according to the above-mentioned preset correspondence among the interaction angle, the spatial size, and the spatial distance, the mobile phone may calculate the spatial size of the window title at its current spatial distance such that the interaction angle corresponding to the window title equals the preset interaction angle. In this way, in response to the second operation, the mobile phone can keep the interaction angle corresponding to the window title at the preset interaction angle by adjusting the spatial size of the window title.
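One such correspondence can be sketched as follows (an assumption; the patent does not give a formula). Claim 1 defines the interaction angle as the angle subtended at a preset observation point by the line connecting the two ends of the control, so for a control of spatial size s facing the viewer at distance d the angle is 2·atan(s/2d), and holding the angle at a preset value θ while d changes means setting s = 2d·tan(θ/2):

```python
import math

def interaction_angle(spatial_size: float, spatial_distance: float) -> float:
    """Angle subtended by a control of the given spatial size at the given
    spatial distance (control assumed centred and facing the viewpoint)."""
    return 2.0 * math.atan(spatial_size / (2.0 * spatial_distance))

def size_for_preset_angle(preset_angle: float, spatial_distance: float) -> float:
    """Spatial size that keeps the interaction angle at `preset_angle`."""
    return 2.0 * spatial_distance * math.tan(preset_angle / 2.0)

theta = math.radians(5.0)
s = size_for_preset_angle(theta, 2.0)
assert math.isclose(interaction_angle(s, 2.0), theta)
# Halving the distance halves the spatial size needed to hold the angle,
# which is why the title's image size stays constant while the window grows.
assert math.isclose(size_for_preset_angle(theta, 1.0), s / 2.0)
```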
S100b, the XR glasses display a fifth XR interface of the three-dimensional desktop.
The XR glasses receive a fifth XR interface of the three-dimensional desktop transmitted by the mobile phone, and the fifth XR interface is displayed for a user.
For example, as shown in E of fig. 11, the interaction angle corresponding to the window title associated with the gallery application window is maintained at the preset interaction angle. The fifth XR interface displayed by the XR glasses may be a dynamic image changing from D in fig. 11 to E in fig. 11. In the fifth XR interface, the image size of the gallery application window changes from small to large, the image size (here, the image height) of the window title is kept at the third image size corresponding to the preset interaction angle, and the image height of the window title does not grow as the gallery application window grows.
S1005, if the size ratio of the window title to the gallery application window changes to the preset size ratio while the mobile phone is responding to the second operation, the mobile phone controls the XR glasses to display a sixth XR interface.
The sixth XR interface comprises: a dynamic image in which the image size of the gallery application window changes from small to large in response to the second operation, and a dynamic image in which the image size of the window title changes from small to large in response to the second operation.
While responding to the second operation, the mobile phone may periodically acquire the size ratio of the window title to the gallery application window, for example at fixed time intervals. Alternatively, when the difference between the size ratio and the preset size ratio is large, for example greater than a first ratio, the mobile phone acquires the size ratio at a larger time interval (for example, a third time interval); when the difference between the size ratio and the preset size ratio is small, for example less than or equal to the first ratio, the mobile phone acquires the size ratio at a smaller time interval (for example, a fourth time interval).
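This adaptive polling can be sketched as follows (the concrete interval values are illustrative assumptions; only the names "first ratio", "third time interval", and "fourth time interval" come from the description):

```python
def polling_interval(size_ratio: float, preset_ratio: float, first_ratio: float,
                     third_interval: float = 0.1, fourth_interval: float = 0.02) -> float:
    """Sampling period for the title/window size ratio (seconds)."""
    if abs(size_ratio - preset_ratio) > first_ratio:
        # Far from the preset ratio: sample coarsely (third time interval).
        return third_interval
    # Close to the preset ratio: sample finely (fourth time interval).
    return fourth_interval

assert polling_interval(0.5, 0.1, first_ratio=0.05) == 0.1    # far -> coarse
assert polling_interval(0.12, 0.1, first_ratio=0.05) == 0.02  # near -> fine
```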
If the size ratio of the window title to the gallery application window changes to the preset size ratio while the mobile phone is responding to the second operation, the size ratio can from then on be held at that value (i.e., the preset size ratio), and the interaction angle corresponding to the window title will be greater than or equal to the preset interaction angle. At this point, the mobile phone can adjust the target parameters of the gallery application window and the window title while keeping the size ratio of the window title to the gallery application window unchanged (equal to the preset size ratio). At the same time, the mobile phone generates the sixth XR interface of the three-dimensional desktop and transmits it to the XR glasses, which display it to the user.
S100c, XR glasses display a sixth XR interface of the three-dimensional desktop.
The XR glasses receive a sixth XR interface of the three-dimensional desktop transmitted by the mobile phone, and display the sixth XR interface to a user.
For example, as shown in F in fig. 11, after the size ratio of the window title to the gallery application window equals the preset size ratio, the image size of the window title becomes larger than the third image size corresponding to the preset interaction angle. The sixth XR interface displayed by the XR glasses is a dynamic image changing from E in fig. 11 to F in fig. 11. In the sixth XR interface, the image size of the gallery application window changes from small to large, and the image size of the window title associated with the gallery application window also changes from small to large.
While the mobile phone performs S1001-S1005 above, in the XR interface displayed by the XR glasses to the user, the gallery application window grows continuously (i.e., its image size continuously increases), and the window title associated with it first remains unchanged and then grows (i.e., the image size of the window title first remains unchanged and then increases). The user can therefore always make out the title content in the window title associated with the gallery application window, which improves the user experience.
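The resulting two-phase behaviour of the title's image size can be summarised in one hypothetical formula (the patent states the behaviour, not this expression): while the window is still small the title stays pinned at the third image size, and once the preset size ratio is reached the title grows in lockstep with the window.

```python
def title_image_size(window_image_size: float, third_image_size: float,
                     preset_ratio: float) -> float:
    """Image size of the window title while the window is enlarged.

    Phase 1 (S1004): title pinned at the third image size.
    Phase 2 (S1005): title grows with the window at the preset size ratio.
    """
    return max(third_image_size, preset_ratio * window_image_size)

# While the window is small, the title holds the third image size ...
assert title_image_size(100.0, 20.0, preset_ratio=0.1) == 20.0
# ... and once the preset ratio is reached it scales with the window.
assert title_image_size(400.0, 20.0, preset_ratio=0.1) == 40.0
```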
It will be appreciated that, in the implementations shown in fig. 8 and fig. 10, the method provided by the embodiments of the present application is illustrated with the mobile phone acting as the master device that controls the XR space and provides the XR glasses with the XR interface of the XR space, and with the XR glasses displaying that XR interface.
In other embodiments, the XR glasses may receive the first operation described above. In response to the first operation, the XR glasses send a first instruction to the mobile phone. The first instruction triggers the mobile phone to reduce the image size of the gallery application window, instructs the mobile phone to obtain the interaction angle corresponding to the window title, and, when that interaction angle equals the preset interaction angle, to keep the interaction angle corresponding to the first control unchanged. The mobile phone may then perform a flow similar to S801-S805. The difference is that the first operation is received by the XR glasses rather than the mobile phone; details are not repeated here.
Accordingly, the XR glasses may receive the second operation described above. In response to the second operation, the XR glasses send a second instruction to the mobile phone. The second instruction triggers the mobile phone to enlarge the image size of the gallery application window, instructs the mobile phone to acquire the size ratio of the window title to the gallery application window, and, when that ratio equals the preset size ratio, to keep the ratio unchanged. The mobile phone may then perform a flow similar to S1001-S1005. The difference is that the second operation is received by the XR glasses rather than the mobile phone; details are not repeated here.
In other embodiments, in an implementation scenario where the XR device acts as both the master device and the display device of the XR space, the XR device may perform flows similar to S801-S805 and S1001-S1005. The difference is that the XR device does not need to transmit the XR interface of the three-dimensional desktop to another device; after generating the XR interface, the XR device displays it directly.
The method provided by the embodiment of the application is illustratively described by taking an example that the XR device is an XR glasses, the XR space is a three-dimensional desktop, the first XR object in the XR space is a gallery application window in the three-dimensional desktop, and the first control associated with the gallery application window is a window title. In an example, in a three-dimensional desktop provided by the XR glasses, an interaction angle corresponding to a window title associated with a gallery application window is continuously reduced from a value larger than a preset interaction angle to the preset interaction angle. As shown in fig. 12, the method provided by the embodiment of the present application may include the following steps:
S1201, the XR glasses display a first XR interface of the XR space, the first XR interface comprising a gallery application window of a first image size and a window title of a second image size.
In the XR mode, the XR glasses may display the XR interface of a three-dimensional desktop, providing the user with a three-dimensional desktop. The user wears the XR glasses, and can experience the three-dimensional desktop through the XR glasses.
For example, as shown in A of fig. 12, the first XR interface displayed by the XR glasses includes a gallery application window and a window title. At this point, the interaction angle corresponding to the window title associated with the gallery application window is greater than the preset interaction angle. The image size of the window title associated with the gallery application window is large, and the user can easily make out the text in the window title.
S1202, the XR glasses receive a first operation for triggering the XR glasses to reduce the image size of the gallery application window.
The XR glasses may receive a first operation of the user. For example, XR glasses may sense a first operation corresponding to a user's head movement or a first operation of a user on a physical key. The first operation is used for indicating the XR glasses to adjust target parameters of the gallery application window, and the XR glasses can be triggered to adjust the image size of the gallery application window.
For the first operation received by the XR glasses, refer to the description of the first operation received by the mobile phone in S802; details are not repeated here.
S1203, in response to the first operation, the XR glasses acquire an interaction angle corresponding to a window title associated with the gallery application window.
For how the XR glasses acquire the interaction angle corresponding to the window title associated with the gallery application window, refer to the corresponding description for the mobile phone in S803; details are not repeated here.
S1204, XR glasses display a second XR interface of the three-dimensional desktop.
The second XR interface comprises: a dynamic image in which the image size of the gallery application window changes from large to small in response to the first operation, and a dynamic image in which the image size of the window title changes from large to small in response to the first operation.
If the interaction angle corresponding to the window title is greater than the preset interaction angle, the XR glasses adjust the target parameter of the gallery application window and the target parameter of the window title while keeping the ratio of the spatial size of the window title to the spatial size of the gallery application window unchanged. At the same time, the XR glasses generate and display the second XR interface of the three-dimensional desktop.
For example, take the case where the first operation instructs the XR glasses to increase the spatial distance of the gallery application window in the three-dimensional desktop. The second XR interface displayed by the XR glasses may be a dynamic image changing from A in fig. 12 to B in fig. 12. In the second XR interface, the image size of the gallery application window changes from large to small, and the image size of the window title associated with the gallery application window also changes from large to small.
Here, for how the XR glasses adjust the target parameter of the gallery application window and the target parameter of the window title in response to the first operation, refer to the description of the mobile phone adjusting the target parameters in S804; details are not repeated here.
S1205, if the interaction angle corresponding to the window title is equal to the preset interaction angle in the process of the XR glasses responding to the first operation, the XR glasses display a third XR interface of the XR space.
The third XR interface comprises: a dynamic image in which the image size of the gallery application window changes from large to small in response to the first operation, and the window title at a third image size. The third image size is smaller than the second image size and corresponds to the preset interaction angle.
For example, take the case where the first operation instructs the XR glasses to increase the spatial distance of the gallery application window in the three-dimensional desktop. As shown in C in fig. 12, after the interaction angle corresponding to the window title associated with the gallery application window has been reduced to the preset interaction angle, the image size (here, the image height) of the window title is maintained at the third image size corresponding to the preset interaction angle, and the image height of the window title no longer shrinks as the gallery application window shrinks. The third XR interface presented to the user by the XR glasses may be a dynamic image changing from B in fig. 12 to C in fig. 12. In the third XR interface, the image size of the gallery application window changes from large to small, and the image size of the window title associated with the gallery application window is kept at the third image size corresponding to the preset interaction angle.
In some implementations, in the process of adjusting the target parameters of the gallery application window in the three-dimensional desktop by the XR glasses in response to the first operation, the interaction angle corresponding to the window title may be periodically acquired. If the interaction angle corresponding to the window title is reduced to the preset interaction angle in the process of the XR glasses responding to the first operation, the XR glasses adjust the space size of the window title, so that the interaction angle corresponding to the window title is kept to be the preset interaction angle.
For this implementation, refer to the description in S805 of how the mobile phone keeps the interaction angle corresponding to the window title at the preset interaction angle; details are not repeated here.
While the XR glasses perform S1201-S1205, the gallery application window continuously shrinks (i.e., its image size continuously decreases), while the window title associated with it first shrinks and then remains unchanged (i.e., the image size of the window title first decreases and then remains unchanged). The user can therefore always make out the title content in the window title associated with the gallery application window, which improves the user experience.
In some implementations, the XR glasses may also receive a second operation from the user to adjust the image size of the gallery application window in the XR interface. The following takes as an example the interaction angle corresponding to the window title associated with the gallery application window growing from the preset interaction angle. As shown in fig. 13, the method provided by the embodiments of the present application may further include the following steps:
S1301, the XR glasses display a fourth XR interface of the three-dimensional desktop, the fourth XR interface comprising a gallery application window of a fourth image size and a window title of a third image size.
For example, as shown in D of fig. 13, the fourth XR interface displayed by the XR glasses includes a gallery application window of a fourth image size and a window title. At this point, the interaction angle corresponding to the window title associated with the gallery application window equals the preset interaction angle. The user can make out the text in the window title.
S1302, the XR glasses receive a second operation to trigger the XR glasses to enlarge an image size of the gallery application window.
The XR glasses may receive a second operation by the user. The second operation is for directing the XR glasses to adjust a target parameter of the gallery application window, which may trigger the XR glasses to enlarge an image size of the gallery application window.
For the second operation received by the XR glasses, refer to the description of the second operation received by the mobile phone in S1002; details are not repeated here.
S1303, in response to the second operation, the XR glasses acquire the size ratio of the window title to the gallery application window.
For how the XR glasses acquire the size ratio of the window title to the gallery application window, refer to the corresponding description for the mobile phone in S1003; details are not repeated here.
S1304, XR glasses display a fifth XR interface of the three-dimensional desktop.
The fifth XR interface comprises: a dynamic image in which the image size of the gallery application window changes from small to large in response to the second operation, and the window title at the third image size.
If the size ratio of the window title to the gallery application window is greater than the preset size ratio, the XR glasses adjust the target parameter of the gallery application window while keeping the interaction angle corresponding to the window title unchanged. At the same time, the XR glasses generate and display the fifth XR interface of the three-dimensional desktop.
For example, as shown in E of fig. 13, the interaction angle corresponding to the window title associated with the gallery application window is maintained at the preset interaction angle. The fifth XR interface displayed by the XR glasses may be a dynamic image changing from D in fig. 13 to E in fig. 13. In the fifth XR interface, the image size of the gallery application window changes from small to large, the image size (here, the image height) of the window title is kept at the third image size corresponding to the preset interaction angle, and the image height of the window title does not grow as the gallery application window grows.
Here, for how the XR glasses adjust the target parameter of the gallery application window in response to the second operation, refer to the description of the mobile phone adjusting the target parameter in S1004; details are not repeated here.
S1305, if the size ratio of the window title to the gallery application window changes to the preset size ratio while the XR glasses are responding to the second operation, the XR glasses display a sixth XR interface.
The sixth XR interface comprises: a dynamic image in which the image size of the gallery application window changes from small to large in response to the second operation, and a dynamic image in which the image size of the window title changes from small to large in response to the second operation.
For example, as shown in F in fig. 13, after the size ratio of the window title to the gallery application window equals the preset size ratio, the image size of the window title becomes larger than the third image size corresponding to the preset interaction angle. The sixth XR interface displayed by the XR glasses is a dynamic image changing from E in fig. 13 to F in fig. 13. In the sixth XR interface, the image size of the gallery application window changes from small to large, and the image size of the window title associated with the gallery application window also changes from small to large.
In some implementations, the XR glasses periodically acquire the size ratio of the window title to the gallery application window while responding to the second operation. If the size ratio changes to the preset size ratio during that process, the XR glasses keep the size ratio of the window title to the gallery application window at the preset size ratio.
For this implementation, refer to the description in S1005 of how the mobile phone keeps the size ratio of the window title to the gallery application window at the preset size ratio; details are not repeated here.
While the XR glasses perform S1301-S1305, in the XR interface displayed by the XR glasses to the user, the gallery application window grows continuously (i.e., its image size continuously increases), and the window title associated with it first remains unchanged and then grows (i.e., the image size of the window title first remains unchanged and then increases). The user can therefore always make out the title content in the window title associated with the gallery application window, which improves the user experience.
Still further embodiments of the present application provide an XR device comprising: a memory, a display screen, and one or more processors. The display screen is used to present the XR space to the user. The display screen, the memory and the processor are coupled. The memory has stored therein computer program code comprising computer instructions which, when executed by a processor, cause an XR device to perform the functions or steps of the method embodiments described above.
In some implementations of this embodiment, the XR device described above further comprises a communication module. The communication module is used to interact with the electronic device and to receive the XR space that the electronic device provides for the XR device.
Of course, the XR device may also include other hardware structures, for example sensors, cameras, and the like. For the structure of the XR device, refer to the structure of the XR glasses 200 shown in fig. 7.
Further embodiments of the present application provide an electronic device comprising: a communication module, a memory, and one or more processors. The communication module, the memory, and the processor are coupled. The communication module is used to transmit data or signaling with the XR device. The memory has stored therein computer program code comprising computer instructions. When the computer instructions are executed by the processor, the electronic device may perform the functions or steps of the method embodiments described above. Of course, the electronic device may also include other hardware structures, such as an antenna for receiving signals; for example, the electronic device may further include sensors, a display screen, and the like. For the structure of the electronic device, refer to the structure of the mobile phone 100 shown in fig. 6.
The embodiment of the application also provides a chip system which is applied to the electronic equipment. The system-on-chip includes at least one processor and at least one interface circuit. The processors and interface circuits may be interconnected by wires. For example, the interface circuit may be used to receive signals from other devices (e.g., memory). For another example, the interface circuit may be used to send signals to other devices (e.g., processors). The interface circuit may, for example, read instructions stored in the memory and send the instructions to the processor. The instructions, when executed by the processor, may cause the electronic device or XR device to perform the steps of the foregoing embodiments. Of course, the system-on-chip may also include other discrete devices, which are not particularly limited in accordance with embodiments of the present application.
Embodiments of the present application also provide a computer-readable storage medium comprising computer instructions which, when run on an electronic device as described above, cause the electronic device or XR device to perform the functions or steps of the method embodiments described above.
Embodiments of the present application also provide a computer program product which, when run on a computer, causes the computer to perform the functions or steps of the method embodiments described above.
It will be apparent to those skilled in the art from this description that, for convenience and brevity of description, only the division of the functional modules described above is illustrated. In practical applications, the functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into modules or units is only a logical functional division, and other divisions are possible in an actual implementation; for instance, multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in another form.
The units described as separate parts may or may not be physically separate, and a part shown as a unit may be one physical unit or several physical units; it may be located in one place or distributed over several different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in hardware or as a software functional unit.
If the integrated unit is implemented as a software functional unit and sold or used as a stand-alone product, it may be stored in a readable storage medium. Based on this understanding, the technical solution of the embodiments of this application, or the part of it that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of this application. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing describes merely specific embodiments of this application, but the protection scope of this application is not limited thereto; any change or substitution within the technical scope disclosed by this application shall be covered by the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (15)

1. A method of displaying an XR space, applied to an extended reality (XR) device, the XR device being configured to display the XR space to a user; the method comprising:
displaying a first XR interface of the XR space; wherein the first XR interface comprises: a first XR object of a first image size and a first control of a second image size, the first control associated with the first XR object;
receiving a first operation for triggering the XR device to reduce an image size of the first XR object;
displaying a second XR interface of the XR space; wherein the second XR interface comprises: a dynamic image of the image size of the first XR object changing from large to small in response to the first operation, and a dynamic image of the image size of the first control changing from large to small in response to the first operation; and
if, while the XR device is responding to the first operation, the interaction angle corresponding to the first control is equal to a preset interaction angle, displaying a third XR interface of the XR space; wherein the third XR interface comprises: a dynamic image of the image size of the first XR object changing from large to small in response to the first operation, and the first control at a third image size; the third image size is smaller than the second image size, and the third image size corresponds to the preset interaction angle; the interaction angle is the angle subtended at a preset observation point by the line connecting the two ends of the first control, and the preset observation point is determined by the position of the user in the XR space.
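The interaction angle in claim 1 is a purely geometric quantity: the angle subtended at the observation point by the segment joining the control's two ends. As a minimal illustrative sketch (the function name and the point representation are assumptions, not part of the claims), it can be computed with a dot product:

```python
import math

def interaction_angle(viewpoint, end_a, end_b):
    """Angle (radians) subtended at `viewpoint` by the segment from
    `end_a` to `end_b` -- the two ends of the control. Points are
    3-D coordinate tuples in the XR space."""
    va = [a - v for a, v in zip(end_a, viewpoint)]
    vb = [b - v for b, v in zip(end_b, viewpoint)]
    dot = sum(x * y for x, y in zip(va, vb))
    norm = math.dist(end_a, viewpoint) * math.dist(end_b, viewpoint)
    # Clamp against floating-point drift outside [-1, 1] before acos.
    return math.acos(max(-1.0, min(1.0, dot / norm)))
```

For example, a control whose ends sit at (1, 1, 0) and (1, -1, 0) relative to the viewpoint subtends 90 degrees; as the control shrinks or recedes, the angle shrinks with it.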
2. The method of claim 1, wherein prior to the displaying the third XR interface of the XR space, the method further comprises:
in response to the first operation, periodically acquiring the interaction angle corresponding to the first control; and
if, while the XR device is responding to the first operation, the interaction angle corresponding to the first control is equal to the preset interaction angle, adjusting the spatial size of the first control so that the interaction angle corresponding to the first control remains at the preset interaction angle.
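The adjustment step in claim 2 can be read as solving the subtended-angle relation for the control's spatial size: a control of width w centered squarely in front of the viewpoint at distance d subtends an angle of 2*atan(w/(2d)), so holding the angle at a preset value theta means setting w = 2d*tan(theta/2). A sketch under those simplifying assumptions (centered, fronto-parallel control; the function name is mine):

```python
import math

def width_for_preset_angle(distance, preset_angle):
    """Spatial width the control must be given so that, centered and
    facing the viewpoint at `distance`, it subtends exactly
    `preset_angle` radians regardless of how far it has receded."""
    return 2.0 * distance * math.tan(preset_angle / 2.0)
```

At a distance of 2 units and a preset angle of 90 degrees this yields a width of 4 units; halving the distance halves the width, which is the "keep the interaction angle constant" behavior the claim describes.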
3. The method of claim 1, wherein the XR device is coupled to an electronic device that provides the XR space for the XR device;
after receiving the first operation, the method further comprises:
sending a first instruction to the electronic device, wherein the first instruction is used to trigger the electronic device to reduce the image size of the first XR object, to instruct the electronic device to acquire the interaction angle corresponding to the first control, and to keep the interaction angle corresponding to the first control unchanged when the interaction angle corresponding to the first control is equal to a preset interaction angle.
4. A method according to any one of claims 1-3, characterized in that the method further comprises:
displaying a fourth XR interface of the XR space; wherein the fourth XR interface comprises: a first XR object of a fourth image size and a first control of the third image size, the fourth image size smaller than the first image size;
receiving a second operation for triggering the XR device to increase the image size of the first XR object;
displaying a fifth XR interface of the XR space; wherein the fifth XR interface comprises: a dynamic image of the image size of the first XR object changing from small to large in response to the second operation, and the first control at the third image size; and
if, while the XR device is responding to the second operation, the size ratio of the first control to the first XR object increases to a preset size ratio, displaying a sixth XR interface of the XR space; wherein the sixth XR interface comprises: a dynamic image of the image size of the first XR object changing from small to large in response to the second operation, and a dynamic image of the image size of the first control changing from small to large in response to the second operation.
5. The method of claim 4, wherein prior to the displaying the fifth XR interface of the XR space, the method further comprises:
in response to the second operation, periodically acquiring the size ratio of the first control to the first XR object; and
if, while the XR device is responding to the second operation, the size ratio of the first control to the first XR object increases to the preset size ratio, controlling the size ratio of the first control to the first XR object to remain at the preset size ratio.
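Claims 4 and 5 describe the scale-up path: the control holds the third image size while the object grows, until the control-to-object size ratio reaches the preset ratio, after which the two scale together at that ratio. One way to sketch that piecewise rule (function and variable names are assumptions; per the claims, the real behavior is driven by the periodically sampled ratio):

```python
def control_size_during_scale_up(object_size, floor_size, preset_ratio):
    """Control size while the first XR object is being enlarged:
    held at `floor_size` (the third image size) until the coupled size
    `preset_ratio * object_size` catches up, then locked to the preset
    ratio so control and object grow together."""
    return max(floor_size, preset_ratio * object_size)
```

With a floor of 2 units and a preset ratio of 0.5, the control stays at 2 units while the object grows from 2 to 4 units, then tracks half the object size from there on.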
6. The method of any one of claims 1-5, wherein the XR space comprises a three-dimensional desktop; the first XR object is an application window in the three-dimensional desktop; the first control comprises one or more of text information and control buttons.
7. An XR space display method, applied to an electronic device, the electronic device being connected to an extended reality (XR) device; the electronic device is configured to provide an XR space for the XR device, and the XR space is displayed to a user by the XR device; the method comprising:
controlling the XR device to display a first XR interface of the XR space; wherein the first XR interface comprises: a first XR object of a first image size and a first control of a second image size, the first control associated with the first XR object;
receiving a first operation of a user, and acquiring an interaction angle corresponding to the first control; wherein the first operation is used to trigger the electronic device to reduce the image size of the first XR object; the interaction angle is the angle subtended at a preset observation point by the line connecting the two ends of the first control, and the preset observation point is determined by the position of the user in the XR space;
controlling the XR device to display a second XR interface of the XR space; wherein the second XR interface comprises: a dynamic image of the image size of the first XR object changing from large to small in response to the first operation, and a dynamic image of the image size of the first control changing from large to small in response to the first operation; and
if, while the electronic device is responding to the first operation, the interaction angle corresponding to the first control is equal to a preset interaction angle, controlling the XR device to display a third XR interface of the XR space; wherein the third XR interface comprises: a dynamic image of the image size of the first XR object changing from large to small in response to the first operation, and the first control at a third image size; the third image size is smaller than the second image size, and the third image size corresponds to the preset interaction angle.
8. The method of claim 7, wherein controlling the XR device to display a third XR interface of the XR space if the interaction angle corresponding to the first control is equal to a preset interaction angle during the response of the electronic device to the first operation comprises:
if, while the electronic device is responding to the first operation, the interaction angle corresponding to the first control is equal to the preset interaction angle, adjusting the spatial size of the first control so that the interaction angle corresponding to the first control remains at the preset interaction angle; and
generating the third XR interface and sending the third XR interface to the XR device.
9. The method of claim 7 or 8, wherein the controlling the XR device to display a second XR interface of the XR space comprises:
controlling the XR device to display a fourth XR interface of the XR space; wherein the fourth XR interface comprises: a first XR object of a fourth image size and a first control of the third image size, the fourth image size smaller than the first image size;
receiving a second operation of a user, and acquiring a size ratio of the first control to the first XR object; wherein the second operation is used to trigger the electronic device to increase the image size of the first XR object;
controlling the XR device to display a fifth XR interface of the XR space; wherein the fifth XR interface comprises: a dynamic image of the image size of the first XR object changing from small to large in response to the second operation, and the first control at the third image size; and
if, while the electronic device is responding to the second operation, the size ratio of the first control to the first XR object increases to a preset size ratio, controlling the XR device to display a sixth XR interface of the XR space; wherein the sixth XR interface comprises: a dynamic image of the image size of the first XR object changing from small to large in response to the second operation, and a dynamic image of the image size of the first control changing from small to large in response to the second operation.
10. The method of claim 9, wherein if the size ratio of the first control to the first XR object increases to the preset size ratio during the response of the electronic device to the second operation, controlling the XR device to display a sixth XR interface of the XR space comprises:
if, while the electronic device is responding to the second operation, the size ratio of the first control to the first XR object increases to the preset size ratio, controlling the size ratio of the first control to the first XR object to remain at the preset size ratio; and
generating the sixth XR interface and sending the sixth XR interface to the XR device.
11. The method of any one of claims 7-10, wherein the XR space comprises a three-dimensional desktop; the first XR object is an application window in the three-dimensional desktop; the first control comprises one or more of text information and control buttons.
12. An XR device, comprising: a memory, a display screen, and one or more processors; the display screen is configured to display the XR space to a user; the display screen and the memory are coupled to the processor; wherein the memory stores computer program code comprising computer instructions which, when executed by the processor, cause the XR device to perform the method of any one of claims 1-6.
13. The XR device of claim 12, further comprising: a communication module; the communication module is used for interacting with electronic equipment and receiving the XR space provided by the electronic equipment for the XR equipment.
14. An electronic device, comprising: a communication module, a memory, and one or more processors; the communication module, the memory and the processor are coupled; wherein the communication module is used for transmitting data or signaling with the XR equipment; the memory has stored therein computer program code comprising computer instructions which, when executed by the processor, cause the electronic device to perform the method of any of claims 7-11.
15. A computer-readable storage medium comprising computer instructions that, when run on an XR device, cause the XR device to perform the method of any one of claims 1-6; alternatively, the computer instructions, when run on an electronic device, cause the electronic device to perform the method of any of claims 7-11.
CN202310559559.7A 2023-05-17 2023-05-17 XR space display method, XR equipment, electronic equipment and storage medium Pending CN116820229A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310559559.7A CN116820229A (en) 2023-05-17 2023-05-17 XR space display method, XR equipment, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116820229A true CN116820229A (en) 2023-09-29

Family

ID=88121269

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310559559.7A Pending CN116820229A (en) 2023-05-17 2023-05-17 XR space display method, XR equipment, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116820229A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140225920A1 (en) * 2013-02-13 2014-08-14 Seiko Epson Corporation Image display device and display control method for image display device
US8854433B1 (en) * 2012-02-03 2014-10-07 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
CN104461267A * 2013-09-16 2015-03-25 Lenovo (Beijing) Co., Ltd. Interface interactive object display method and device
CN108897881A * 2018-07-05 2018-11-27 Tencent Technology (Shenzhen) Co., Ltd. Interactive image display method, apparatus and device, and readable storage medium
CN113672133A * 2020-05-13 2021-11-19 Huawei Technologies Co., Ltd. Multi-finger interaction method and electronic device
CN114115619A * 2020-08-31 2022-03-01 Huawei Technologies Co., Ltd. Application program interface display method and electronic device
KR20220074393A * 2020-11-27 2022-06-03 (주)와이투콘 Apparatus and method for prividing virtual experience service based on extended reality
WO2022143118A1 * 2020-12-31 2022-07-07 Huawei Technologies Co., Ltd. Image processing method and electronic device
US20220229535A1 * 2021-01-19 2022-07-21 Samsung Electronics Company, Ltd. Systems and Methods for Manipulating Views and Shared Objects in XR Space
CN115185440A * 2021-04-07 2022-10-14 Huawei Technologies Co., Ltd. Control display method and related equipment
US20230031832A1 * 2021-07-28 2023-02-02 Apple Inc. Devices, methods and graphical user interfaces for three-dimensional preview of objects
US20230099824A1 * 2020-02-20 2023-03-30 Huawei Technologies Co., Ltd. Interface layout method, apparatus, and system
CN116048436A * 2022-06-17 2023-05-02 Honor Device Co., Ltd. Application interface display method, electronic device and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LIU Baixin; LIU Chengliang; GONG Liang: "Design of a human-machine interaction interface based on touch-screen devices", Mechatronics, no. 04, 15 April 2015 (2015-04-15) *
PENG Fan; TAO Yiyang; YU Chao; LI Haifeng; LIU Xu: "An interactive device for three-dimensional display based on infrared structured light", Acta Optica Sinica, no. 04, 10 April 2013 (2013-04-10) *

Similar Documents

Publication Publication Date Title
CN108499105B (en) Method, device and storage medium for adjusting visual angle in virtual environment
CN110502954B (en) Video analysis method and device
US11567333B2 (en) Head-mounted display, head-mounted display linking system, and method for same
CN110488977B (en) Virtual reality display method, device and system and storage medium
US10412379B2 (en) Image display apparatus having live view mode and virtual reality mode and operating method thereof
KR101678861B1 (en) Mobile terminal and method for controlling the same
CN108710525B (en) Map display method, device, equipment and storage medium in virtual scene
CN109191549B (en) Method and device for displaying animation
CN108965922B (en) Video cover generation method and device and storage medium
CN110333834B (en) Frame frequency adjusting method and device, display device and computer readable storage medium
CN110557626B (en) Image display method and electronic equipment
CN111541907A (en) Article display method, apparatus, device and storage medium
CN113223129B (en) Image rendering method, electronic equipment and system
CN109634688B (en) Session interface display method, device, terminal and storage medium
CN109821237B (en) Method, device and equipment for rotating visual angle and storage medium
KR20150142516A (en) Glass type terminal and control method thereof
US9270982B2 (en) Stereoscopic image display control device, imaging apparatus including the same, and stereoscopic image display control method
CN110998666A (en) Information processing apparatus, information processing method, and program
CN114579016A (en) Method for sharing input equipment, electronic equipment and system
CN109636715B (en) Image data transmission method, device and storage medium
US11062664B2 (en) Grayscale adjustment method and display device
KR101751348B1 (en) Mobile terminal and method for controlling the same
CN110349527B (en) Virtual reality display method, device and system and storage medium
CN111223311B (en) Traffic flow control method, device, system, control equipment and storage medium
CN111694535B (en) Alarm clock information display method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination