CN115248655A - Method and apparatus for displaying information - Google Patents


Info

Publication number
CN115248655A
CN115248655A
Authority
CN
China
Prior art keywords
image
mixed reality
key
head
mounted electronic
Prior art date
Legal status
Pending
Application number
CN202110469947.7A
Other languages
Chinese (zh)
Inventor
汤雨薇
刘昕笛
辛高高
Current Assignee
Shining Reality Wuxi Technology Co Ltd
Original Assignee
Shining Reality Wuxi Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shining Reality Wuxi Technology Co Ltd
Priority to CN202110469947.7A
Publication of CN115248655A
Legal status: Pending

Classifications

    • G06F 3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/0483: Interaction with page-structured environments, e.g. book metaphor
    • G06F 3/1407: Digital output to display device; general aspects irrespective of display type, e.g. determination of decimal point position

Abstract

Embodiments of the present application disclose a method and apparatus for displaying information. One embodiment of the method comprises: acquiring a display request of a head-mounted electronic device, and displaying a first key and a second key on a screen of a terminal device; in response to a trigger event of the first key, entering a screen projection display mode, acquiring data of an image to be displayed, rendering the data to generate a screen projection image, and displaying the screen projection image and the image to be displayed on the head-mounted electronic device and the terminal device, respectively; and in response to a trigger event of the second key, entering a mixed reality display mode, rendering a mixed reality operation image based on pose information of the head-mounted electronic device, and controlling the head-mounted electronic device to display the mixed reality operation image, where the mixed reality operation image is updated based on the user's operations on it. This embodiment enables a user to quickly and easily enter either the screen projection display mode or the mixed reality display mode of the head-mounted electronic device.

Description

Method and apparatus for displaying information
Technical Field
Embodiments of the present application relate to the field of computer technology, and in particular to a method and apparatus for displaying information.
Background
With the development of technology, more and more display devices have appeared in people's daily life and work and have become an indispensable part of both; head-mounted electronic devices are one example. In general, a head-mounted electronic device used for display can implement virtual reality, augmented reality, mixed reality, and other effects.
In the related art, a user may have different needs when using a head-mounted electronic device. For example, the user may use the head-mounted electronic device to present content displayed by another device, i.e., to mirror the other device's screen through the head-mounted electronic device. Alternatively, the user may need not only to view virtual content on the head-mounted electronic device but also to interact in real time, in a visual environment, with objects the device displays, i.e., to implement mixed reality display through the head-mounted electronic device.
Disclosure of Invention
The embodiment of the application provides a method and a device for displaying information.
In a first aspect, an embodiment of the present application provides a method for displaying information, the method including: acquiring a display request of a head-mounted electronic device, and displaying a first key and a second key on a screen of a terminal device, where the first key is used to select a screen projection display mode and the second key is used to select a mixed reality display mode; in response to detecting a trigger event of the first key, entering the screen projection display mode, acquiring data of an image to be displayed, rendering the data to generate a screen projection image, and displaying the screen projection image and the image to be displayed on the head-mounted electronic device and the terminal device, respectively; and in response to detecting a trigger event of the second key, entering the mixed reality display mode, rendering a mixed reality operation image based on pose information of the head-mounted electronic device, and controlling the head-mounted electronic device to display the mixed reality operation image, where the mixed reality operation image is updated based on the user's operations on it.
In a second aspect, an embodiment of the present application provides an apparatus for displaying information, the apparatus including: an acquisition unit configured to acquire a display request of a head-mounted electronic device and display a first key and a second key on a screen of a terminal device, where the first key is used to select a screen projection display mode and the second key is used to select a mixed reality display mode; a first display unit configured to, in response to detecting a trigger event of the first key, enter the screen projection display mode, acquire data of an image to be displayed, render the data to generate a screen projection image, and display the screen projection image and the image to be displayed on the head-mounted electronic device and the terminal device, respectively; and a second display unit configured to, in response to detecting a trigger event of the second key, enter the mixed reality display mode, render a mixed reality operation image based on pose information of the head-mounted electronic device, and control the head-mounted electronic device to display the mixed reality operation image, where the mixed reality operation image is updated based on the user's operations on it.
In a third aspect, an embodiment of the present application provides a terminal device, including: one or more processors; a storage device for storing one or more programs which, when executed by one or more processors, cause the one or more processors to implement the method described above.
In a fourth aspect, embodiments of the present application provide a non-transitory computer-readable storage medium storing computer instructions that cause a computer to perform the method described above.
In a fifth aspect, the present application provides a computer program product including a computer program which, when executed by a processor, implements the method described above.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram to which one embodiment of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for displaying information according to the present application;
FIG. 3 is a flow diagram of yet another embodiment of a method for displaying information according to the present application;
FIG. 4 is a schematic illustration of an application scenario of a method for displaying information according to the present application;
FIG. 5 is a schematic diagram of an embodiment of an apparatus for displaying information according to the present application;
FIG. 6 is a schematic structural diagram of a computer system suitable for implementing a terminal device according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 shows an exemplary system architecture 100 to which embodiments of the method for displaying information or the apparatus for displaying information of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include a terminal device 101, a head mounted electronic device 102. The terminal device 101 and the head mounted electronic device 102 may be connected via a network, which may include a variety of connections such as wired, wireless communication links, or fiber optic cables, among others. The terminal device 101 and the head mounted electronic device 102 may interact over a network to send or receive data information or the like.
A user may use terminal device 101 to interact with head mounted electronic device 102 over a network to receive or send messages or the like. Various communication client applications, such as a web browser application, a shopping application, a search application, an instant messaging tool, a mailbox client, social platform software, etc., may be installed on the terminal device 101. The head-mounted electronic device 102 may be an electronic device with an image display function, including but not limited to AR smart glasses and VR smart glasses.
The terminal device 101 may be any of various electronic devices having a display screen, including but not limited to a smartphone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop computer, a desktop computer, and the like. Alternatively, the terminal device 101 may be software; in that case it may be installed in any of the electronic devices listed above and implemented either as multiple pieces of software or software modules (e.g., to provide distributed services) or as a single piece of software or software module. No specific limitation is imposed here.
The terminal device 101 may also provide various services, such as support for a screen-projected image or a mixed reality operation image displayed on the head-mounted electronic device 102. The terminal device may analyze and process the acquired data such as the display request, and transmit a processing result (for example, a screen projection image or a mixed reality operation image) to the head-mounted electronic device for display.
It should be noted that the method for displaying information provided in the embodiment of the present application is generally performed by the terminal device 101, and accordingly, the apparatus for displaying information is generally disposed in the terminal device 101.
It should be understood that the numbers of terminal devices 101 and head-mounted electronic devices 102 in fig. 1 are merely illustrative; there may be any number of each, as required by the implementation. For example, fig. 1 may include two terminal devices 101, one of which analyzes and processes the acquired data, such as the display request, and sends the processing results (e.g., the rendered screen projection image corresponding to itself and the rendered screen projection image corresponding to the other terminal device 101) to the head-mounted electronic device 102 for display. As another example, fig. 1 may include two head-mounted electronic devices 102, and the terminal device 101 may analyze and process the acquired data, such as the display request, and transmit the processing results (for example, a screen projection image or a mixed reality operation image) to the two head-mounted electronic devices 102 for display.
It should also be noted that although the solution disclosed in the present application may be applied to the terminal device 101, this does not exclude applying it to the head-mounted electronic device 102 instead. In that case, the head-mounted electronic device 102 obtains the display request and controls the display of the first key and the second key on the screen of the terminal device 101. In response to detecting a trigger event of the first key via the terminal device 101, the head-mounted electronic device 102 may enter the screen projection display mode, acquire data of an image to be displayed, render the data to generate a screen projection image, and display the screen projection image and the image to be displayed on the head-mounted electronic device and the terminal device, respectively. Similarly, the head-mounted electronic device 102 may enter the mixed reality display mode and display a mixed reality operation image. In this case, the method for displaying information is performed by the head-mounted electronic device 102, and the apparatus for displaying information may likewise be provided in the head-mounted electronic device 102.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for displaying information in accordance with the present application is shown. The method for displaying information comprises the following steps:
step 202, acquiring a display request of the head-mounted electronic device, and displaying a first key and a second key in a screen of the terminal device.
In the present embodiment, the execution subject of the method for displaying information (e.g., the terminal device shown in fig. 1) may acquire a display request of the head-mounted electronic device in various ways. Here, the display request requests image display on the head-mounted electronic device. The execution subject may then display the first key and the second key on its screen: the user selects the screen projection display mode through the first key and the mixed reality display mode through the second key. The screen projection display mode can be understood as using the head-mounted electronic device as a screen projection device of the terminal device, with the screens of both devices displaying the same content. The mixed reality display mode can be understood as a mode in which the head-mounted electronic device and the terminal device together provide, in the real environment, a virtual image with which the user can interact with real-time feedback.
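The two-key mode selection described above can be sketched as a small dispatcher. This is an illustrative sketch only; the class and method names (`DisplayController`, `on_display_request`, and so on) are assumptions, not identifiers from the patent.

```python
class DisplayController:
    """Illustrative dispatcher for the first-key / second-key display modes."""

    MODE_NONE = "none"
    MODE_SCREEN_PROJECTION = "screen_projection"  # selected via the first key
    MODE_MIXED_REALITY = "mixed_reality"          # selected via the second key

    def __init__(self):
        self.mode = self.MODE_NONE
        self.keys_visible = False

    def on_display_request(self):
        # Show the first and second keys once a display request
        # from the head-mounted device has been acquired.
        self.keys_visible = True

    def on_key_triggered(self, key):
        # A touch on either key generates that key's trigger event.
        if not self.keys_visible:
            raise RuntimeError("no display request received yet")
        if key == "first":
            self.mode = self.MODE_SCREEN_PROJECTION
        elif key == "second":
            self.mode = self.MODE_MIXED_REALITY
        else:
            raise ValueError(f"unknown key: {key}")
        return self.mode
```

A real implementation would hook these handlers to UI events on the terminal device; the sketch only captures the branching the patent describes.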
In some optional implementations of this embodiment, before acquiring the display request of the head-mounted electronic device, the execution subject may determine whether it has established a communication connection with the head-mounted electronic device. If so, a display request can be generated directly, so that the execution subject obtains it without further user input. As an example, the data line of the head-mounted electronic device may be plugged directly into the terminal device, and the terminal device then directly obtains the generated display request. This implementation can quickly display the first key and the second key on the screen, simplifying display-mode selection and improving the efficiency with which the user selects the screen projection display mode or the mixed reality display mode. The execution subject may also obtain the display request in other ways; for example, after the head-mounted electronic device establishes a communication connection with the terminal device, the user may input a display request by tapping a preset key, and the terminal device then obtains the display request input by the user.
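The two ways of obtaining a display request (auto-generated on connection, or explicitly entered by the user) can be sketched as follows; the function name and the dictionary shape are illustrative assumptions.

```python
def obtain_display_request(connection_established, user_request=None):
    """Return a display request, or None if none is available yet.

    If the head-mounted device is already connected (e.g. via its data
    line), a request is generated automatically; otherwise an explicit
    user request, such as tapping a preset key, is required.
    """
    if connection_established:
        return {"source": "auto", "action": "show_mode_keys"}
    return user_request  # may be None if the user has not acted yet
```

The automatic branch is what lets the first and second keys appear without any user action, which is the efficiency gain the implementation describes.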
In some optional implementations of this embodiment, before generating the display request in response to determining that the terminal device and the head-mounted electronic device have established a communication connection, the execution subject may prompt the user to establish that connection. Specifically, the execution subject may receive a user's command to open a pre-installed target application, run the target application, and display on the screen first prompt information that prompts the user to connect the head-mounted electronic device to the terminal device; the user can then connect the two devices by following the prompt. In this implementation, because the target application issues the first prompt information, the user can enter display-mode selection quickly and without prior training, reducing the learning cost.
Generally, the head-mounted electronic device and the terminal device can be used in combination, for example as a split-type device. In this case, the user can use the head-mounted electronic device as a projection screen for the terminal device (the screen projection display mode) or use the two devices together for mixed reality interaction (the mixed reality display mode). When using the two devices, the user must therefore consider which display mode is needed and how to enter that mode for proper display. The scheme disclosed in the embodiments of the present application provides an effective way to make this choice and display content on the head-mounted electronic device.
Step 204, in response to detecting the trigger event of the first key, entering a screen projection display mode, acquiring data of an image to be displayed, rendering the data of the image to be displayed to generate a screen projection image, and displaying the screen projection image and the image to be displayed on the head-mounted electronic device and the terminal device respectively.
In this embodiment, based on the first key and the second key displayed in step 202, the execution subject may detect trigger events for the two keys. Depending on the actual situation, various user operations may generate the corresponding trigger event. For example, if the first key and the second key are touch keys, a user's touch on either key generates that key's trigger event.
If the execution subject detects a trigger event of the first key, it can determine that the user has selected the screen projection display mode. The execution subject then acquires the data of the image to be displayed on the screen, renders that data with a rendering engine such as Unity, and obtains a rendered image, namely the screen projection image. Finally, the execution subject sends the screen projection image to the head-mounted electronic device so that the head-mounted electronic device can display it, completing the screen projection display, and also displays the image to be displayed on the screen of the terminal device. Because the projected image and the image currently displayed on the screen have the same content, the display on the head-mounted electronic device is equivalent to mirroring the execution subject's screen.
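The render-then-mirror flow above can be sketched minimally. All names here (`Display`, `render_projection`, `project_screen`) are hypothetical, and the "rendering" step is a stand-in for a real engine such as the Unity engine the patent mentions.

```python
class Display:
    """Minimal stand-in for a device screen (headset or terminal)."""

    def __init__(self):
        self.current = None

    def show(self, image):
        self.current = image


def render_projection(frame_data):
    # Stand-in for a real rendering engine; here "rendering"
    # just wraps the raw frame data in a rendered-image record.
    return {"rendered": True, "pixels": frame_data}


def project_screen(frame_data, headset, terminal):
    """Render one frame and show it on both devices.

    The headset receives the rendered screen projection image, while the
    terminal keeps displaying the original image to be displayed, so the
    two screens show the same content.
    """
    image = render_projection(frame_data)
    headset.show(image)        # projection image on the head-mounted device
    terminal.show(frame_data)  # original image on the terminal screen
    return image
```

The key property, matching the paragraph above, is that both displays end up with the same content per frame.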
The image displayed on the screen may be any of various images. As an example, if the terminal device is a mobile phone, the image displayed on its screen may include only one interface, such as the desktop. If the terminal device is a computer, the image displayed on its screen may include several window interfaces opened on the desktop.
In some optional implementations of this embodiment, after entering the screen projection display mode, the execution subject may return to the desktop of the terminal device regardless of which interface was previously on the screen. The execution subject then acquires the data of the image corresponding to the desktop, treats that image as the image to be displayed, renders the corresponding screen projection image, and displays it on the head-mounted electronic device; the screen of the terminal device likewise displays the desktop. With this implementation the terminal device returns to the desktop automatically when the user enters the screen projection display mode, so the user need not do so manually, further improving display efficiency. Alternatively, after entering the screen projection display mode, the execution subject may directly acquire the data of whatever image the terminal device is to display and render the corresponding screen projection image for the head-mounted electronic device.
Alternatively, in the screen projection display mode, the execution subject may update the image displayed on the screen upon receiving a user operation on the screen. Specifically, the execution subject may treat the image to be displayed as the current display image of the screen; in response to receiving a user operation on that image, it updates the data of the image to be displayed, then updates the screen projection image from the updated data and displays the updated screen projection image and image to be displayed on the head-mounted electronic device and the terminal device, respectively. This implementation lets the screen projection image shown in the head-mounted electronic device change in real time with the image on the terminal device's screen, keeping the two devices' data synchronized during screen projection.
Step 206, in response to detecting a trigger event of the second key, entering the mixed reality display mode, rendering a mixed reality operation image based on the pose information of the head-mounted electronic device, and controlling the head-mounted electronic device to display the mixed reality operation image.
In this embodiment, if the execution subject detects a trigger event of the second key, it can determine that the user has selected the mixed reality display mode. The execution subject then analyzes the pose information of the head-mounted electronic device and other data, renders a mixed reality operation image, and transmits it to the head-mounted electronic device for display. The mixed reality operation image updates its display content based on the user's direct or indirect operations on it, thereby realizing mixed reality interaction. As an example, the head-mounted electronic device may support gesture recognition, and the user may operate the mixed reality operation image displayed in the real environment directly through gestures, causing the image displayed by the head-mounted electronic device to update.
In some optional implementations of this embodiment, after entering the mixed reality display mode, the execution subject may render a mixed reality interface and display it on the screen. The user can then use the execution subject as a control handle for the head-mounted electronic device and indirectly operate the mixed reality operation interface displayed on the head-mounted electronic device through the mixed reality interface displayed on the screen. When the terminal device is used as a handle, the mixed reality interface on its screen is usually different from the mixed reality operation interface on the head-mounted electronic device. By displaying a mixed reality interface on the terminal device's screen and letting the user control the head-mounted electronic device's display with the terminal device as a handle, this implementation makes full use of the terminal device's functions and, compared with controlling the display by gesture recognition and similar means, can give the user a brand-new experience.
Optionally, the mixed reality interface may further include a mixed reality display settings key, which the user can use to set parameters of the mixed reality display mode (e.g., a sleep mode or left-handed operation). When the execution subject receives a second preset operation, such as a tap on the settings key, it can configure the parameters of the mixed reality display mode, improving the user's experience in that mode.
By way of example, the terminal device may be a mobile phone that serves as the handle of the head-mounted electronic device after displaying the mixed reality interface. By operating the mixed reality interface, the user can move the operation point within the mixed reality operation interface displayed by the head-mounted electronic device. For instance, the user may move the operation point to the location of application A and then double-tap to open application A's window, whereupon the mixed reality operation interface displayed by the head-mounted electronic device is updated either to application A's window alone or to the original display content with application A's window added.
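The handle-style pointer control in the example above might be modeled as a mapping from touch movement on the terminal to operation-point movement in the headset's interface. The linear mapping, the function name, and the `sensitivity` parameter are all illustrative assumptions; the patent does not specify how the mapping is done.

```python
def move_pointer(pointer, touch_delta, sensitivity=1.0):
    """Map a touch-move on the terminal's mixed reality interface to a
    displacement of the operation point in the headset's mixed reality
    operation interface (a simple, hypothetical linear mapping).
    """
    x, y = pointer
    dx, dy = touch_delta
    return (x + sensitivity * dx, y + sensitivity * dy)
```

With a mapping like this, a swipe on the phone moves the operation point toward a target such as application A's icon, and a double-tap event would then open its window.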
In some optional implementations of the embodiment, the execution subject may render the mixed reality operation image as follows: acquiring posture information of the head-mounted electronic device and the data to be rendered in mixed reality, wherein the posture information is used for representing the position and posture of the head-mounted electronic device; adjusting the position and posture of a virtual camera according to the posture information; and rendering the data with the adjusted virtual camera to generate a mixed reality operation image. It is to be understood that the data to be rendered in mixed reality refers to the data for generating the mixed reality operation image, and the execution subject may render the mixed reality operation image to be displayed by the head-mounted electronic device through an engine such as Unity. Therefore, before rendering the mixed reality operation image, the posture information of the head-mounted electronic device may be determined, the position and posture of the virtual camera in Unity or the like may be adjusted according to the posture information, and the data may be rendered with the adjusted virtual camera, so that a mixed reality operation image with a suitable view angle is generated. The scheme disclosed in this implementation enables the head-mounted electronic device to quickly display a mixed reality operation interface suited to the user's view angle. It is understood that the execution subject may render the mixed reality operation image in other manners, which is not limited herein. For example, according to the posture information of the head-mounted electronic device, an optical module composed of an image source and an optical lens in the head-mounted electronic device may be adjusted so that the optical module can project a mixed reality operation image with a suitable view angle.
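The virtual-camera rendering steps above can be sketched in Python as below. A real implementation would run inside an engine such as Unity; the `Pose`, `VirtualCamera`, and `render_mr_image` names are assumptions made for this illustration, and the render call is a stand-in for the engine's actual projection pipeline:

```python
# Hedged sketch of "adjust the virtual camera from headset pose, then
# render". Not an engine implementation; all names are illustrative.

from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple     # (x, y, z) of the head-mounted device
    orientation: tuple  # e.g. Euler angles (pitch, yaw, roll)

class VirtualCamera:
    def __init__(self):
        self.pose = Pose((0.0, 0.0, 0.0), (0.0, 0.0, 0.0))

    def apply_pose(self, pose):
        # Match the virtual camera to the headset's position and attitude
        # so the rendered view angle follows the user's head.
        self.pose = pose

    def render(self, scene_data):
        # Stand-in for the engine's render call: tag the scene data with
        # the view pose that would be used to project it.
        return {"view_pose": self.pose, "content": scene_data}

def render_mr_image(headset_pose, mr_data):
    """Acquire pose, adjust the camera, render the MR operation image."""
    camera = VirtualCamera()
    camera.apply_pose(headset_pose)
    return camera.render(mr_data)
```

The key point the sketch captures is ordering: the camera pose is set from the headset's posture information before the data to be rendered is passed through it.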
Alternatively, when the execution subject renders the mixed reality operation image to be displayed by the head-mounted electronic device with Unity or the like, other images to be displayed by the head-mounted electronic device in the mixed reality display mode may also be generated with Unity or the like.
In some optional implementations of the embodiment, controlling the head-mounted electronic device to display the mixed reality operation image may include: for a target object in the mixed reality operation image displayed by the head-mounted electronic device, in response to receiving an operation of the user on the target object, invoking and executing a control instruction for the target object to update the mixed reality operation image; and sending the updated mixed reality operation image to the head-mounted electronic device, so that the head-mounted electronic device displays the updated mixed reality operation image. As an example, the mixed reality operation image displayed by the head-mounted electronic device may be a desktop formed from applications selected on the terminal device based on a white list or the like. For an application such as Google on the desktop, if a double-click operation by the user on it is received, an opening instruction for the application may be invoked and executed, and the desktop is updated to the application's interface; the execution subject can then send that interface to the head-mounted electronic device, so that the head-mounted electronic device displays the updated interface. In this implementation, the mixed reality operation image displayed by the head-mounted electronic device can be quickly updated according to the user's operation on it. It is understood that this implementation provides only one possible operation manner for the mixed reality display mode, and there may be others, for example, updating the mixed reality operation image displayed by the head-mounted electronic device through a handle, which is not limited herein.
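The operate-then-update loop described above might be sketched as follows (all identifiers are hypothetical): a double-click on a desktop application invokes an opening control instruction, and the updated operation image is then sent to the headset for display:

```python
# Simplified sketch of "invoke control instruction, update image, send to
# headset". Class and method names are illustrative assumptions.

class MixedRealityHost:
    def __init__(self):
        # Current MR operation image: a desktop of whitelisted apps.
        self.image = {"desktop": ["AppA", "AppB"]}
        self.sent_to_headset = None

    def _open_instruction(self, app):
        # Control instruction: replace the desktop with the app's window.
        self.image = {"window": app}

    def on_double_click(self, target):
        # If the target object is an app on the desktop, open it;
        # either way, push the (possibly updated) image to the headset.
        if target in self.image.get("desktop", []):
            self._open_instruction(target)
        self.sent_to_headset = self.image
        return self.sent_to_headset
```

Note that the headset always receives the current image after the operation is handled, which is what keeps its display in step with the user's actions.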
In the method for displaying information provided by the above embodiment of the application, when the terminal device and the head-mounted electronic device are used together, a display request may be acquired and a first key and a second key displayed on the screen. If a trigger event of the first key is detected, the screen-projection mode is entered and the content currently displayed on the screen of the terminal device is displayed on the head-mounted electronic device; if a trigger event of the second key is detected, the mixed reality display mode is entered, and the user may operate the mixed reality operation image displayed on the head-mounted electronic device through the terminal device to implement mixed reality interaction. According to the scheme disclosed by the embodiment of the application, when the user initiates a display request of the head-mounted electronic device, the user can accurately and quickly enter the selected mode for screen-projection display or mixed reality display without any learning burden, which improves the operation efficiency of the head-mounted electronic device and reduces its operation difficulty.
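The overall two-key flow summarized above can be illustrated with a small dispatcher. This is a sketch under assumed names, not the patented implementation; it only shows how the first and second keys select between the two display modes:

```python
# Illustrative dispatcher for the two-key mode selection described above.
# "DisplayController" and the key/mode strings are assumptions.

class DisplayController:
    """Maps key trigger events to the two display modes."""

    def __init__(self):
        self.mode = None  # no mode until a key is triggered

    def show_mode_keys(self):
        # Shown on the terminal screen once the headset's display
        # request has been acquired.
        return ["first_key", "second_key"]

    def on_key_triggered(self, key):
        if key == "first_key":
            self.mode = "projection"     # screen-projection display mode
        elif key == "second_key":
            self.mode = "mixed_reality"  # mixed reality display mode
        else:
            raise ValueError(f"unknown key: {key}")
        return self.mode
```

The point of the design is that mode selection is a single explicit trigger, so the user lands in the chosen mode directly rather than navigating menus.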
With further reference to fig. 3, a flow 300 of another embodiment of a method for displaying information is shown. The process 300 of the method for displaying information includes the steps of:
step 302, acquiring a display request of the head-mounted electronic device, and displaying a first key and a second key in a screen of the terminal device.
In this embodiment, step 302 is substantially the same as step 202 in the above embodiment, and is not described herein again.
And 304, responding to the trigger event of the detected second key, entering a mixed reality display mode, rendering a mixed reality operation image based on the posture information of the head-mounted electronic equipment, and controlling the head-mounted electronic equipment to display the mixed reality operation image.
In this embodiment, step 304 is substantially the same as step 206 in the above embodiment, and is not described herein again.
Step 306, in response to detecting the trigger event of the first key, entering a screen projection display mode, and displaying a third key and a fourth key in the screen.
In this embodiment, the execution subject may enter the screen-projection display mode upon detecting a trigger event of the first key, and may then display the third key and the fourth key on the screen in various manners. It should be noted that the third key is used by the user to select the first screen-projection display mode or the second screen-projection display mode within the screen-projection display mode. Here, the size of the projected image in the first screen-projection display mode may be larger than that in the second screen-projection display mode; that is, the two modes may be understood as a large and a small screen-projection display mode, respectively. Further, the fourth key is used by the user to switch from the current screen-projection display mode to the mixed reality display mode.
In some optional implementations of this embodiment, displaying the third key and the fourth key on the screen may include: receiving a first preset operation of the user on the screen, and displaying a window including the third key and the fourth key on the screen. Specifically, after entering the screen-projection display mode, the execution subject may receive a first preset operation on the screen, such as pulling down a window, so that a pull-down window or the like containing the third key and the fourth key is displayed. It can be understood that the user may perform the first preset operation directly after entering the screen-projection display mode. Alternatively, after entering the screen-projection display mode, the execution subject may display a prompt message on the screen prompting the user to perform the first preset operation to select a display mode, so that the user can open the window containing the third key and the fourth key according to the prompt. In this implementation, the execution subject displays the window containing the third key and the fourth key only after receiving the first preset operation, so that the user can decide autonomously whether to display it, which improves the user experience. Of course, the user may also enter, for example, the first screen-projection display mode directly via the first key to perform screen-projection display, which is not limited here.
In some optional implementations of this embodiment, displaying the third key and the fourth key on the screen may include: invoking and executing a popup display instruction to control the screen to display, for a preset duration, a popup including the third key and the fourth key. Specifically, after entering the screen-projection mode, the execution subject may directly invoke a popup display instruction; after the instruction is executed, a popup containing the third key and the fourth key is displayed on the screen. Further, the execution subject can keep the popup displayed for the preset duration and close it once that duration has elapsed. In this implementation, after entering the screen-projection display mode, the terminal device can directly display the third key and the fourth key in a popup without any additional user operation, which is convenient and fast.
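The timed popup described above can be sketched as below. The class name and the time-based visibility check are assumptions for illustration; an actual terminal would use its UI framework's timer facilities:

```python
# Sketch (assumed names) of a popup holding the third and fourth keys
# that closes itself after a preset display duration.

import time

class ModePopup:
    def __init__(self, duration_s):
        self.keys = ["third_key", "fourth_key"]
        self.duration_s = duration_s  # preset display duration
        self.opened_at = None

    def open(self, now=None):
        # `now` lets callers inject a clock value for testing.
        self.opened_at = time.monotonic() if now is None else now

    def is_visible(self, now=None):
        if self.opened_at is None:
            return False
        now = time.monotonic() if now is None else now
        # The popup closes once the preset duration has elapsed.
        return (now - self.opened_at) < self.duration_s
```

Injecting the clock keeps the sketch deterministic; a production version would instead schedule a dismiss callback after `duration_s`.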
Step 308, in response to detecting the first trigger event of the third key, entering a first screen projection display mode, acquiring data of an image to be displayed, rendering the data of the image to be displayed to generate a screen projection image, and displaying the screen projection image and the image to be displayed on the head-mounted electronic device and the terminal device, respectively.
In this embodiment, the execution subject may enter the first screen-projection display mode upon detecting a first trigger event of the third key. The execution subject may acquire the data of the image to be displayed on the screen (i.e., the image to be displayed), and render the acquired data to generate a screen-projection image of the size corresponding to the first screen-projection display mode. The execution subject may then transmit the generated screen-projection image to the head-mounted electronic device for display. That is, the head-mounted electronic device performs screen-projection display in the large screen-projection display mode. It is understood that the image to be displayed may also be displayed on the screen of the terminal device; that is, the screen of the terminal device and the head-mounted electronic device display the same content.
Step 310, in response to detecting a second trigger event of the third key, entering a second screen projection display mode, acquiring data of an image to be displayed, rendering the data of the image to be displayed to generate a screen projection image, and displaying the screen projection image and the image to be displayed on the head-mounted electronic device and the terminal device respectively.
In this embodiment, the execution subject may enter the second screen-projection display mode upon detecting a second trigger event of the third key. The execution subject may acquire the data of the image to be displayed on the screen (i.e., the image to be displayed) and render it to generate a screen-projection image of the size corresponding to the second screen-projection display mode. Finally, the execution subject may transmit the generated screen-projection image to the head-mounted electronic device for display. That is, the head-mounted electronic device performs screen-projection display in the small screen-projection display mode. It is understood that the image to be displayed may also be displayed on the screen of the terminal device; that is, the screen of the terminal device and the head-mounted electronic device display the same content.
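The two projection sizes can be illustrated as follows. The scale factors used here are invented for the example, since the disclosure only specifies that the first mode's projected image is larger than the second's:

```python
# Illustrative size selection for the two screen-projection modes.
# The 1.5x / 0.75x factors are assumptions, not from the disclosure.

PROJECTION_SCALE = {"first": 1.5, "second": 0.75}

def render_projection(image_size, mode):
    """Return the projected image size for the chosen projection mode."""
    w, h = image_size
    s = PROJECTION_SCALE[mode]  # large vs small projection display
    return (int(w * s), int(h * s))
```

Whatever the actual factors, the invariant the sketch preserves is that the first (large) mode always yields a bigger projected image than the second (small) mode for the same source image.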
Here, the execution subject may generate the first trigger event and the second trigger event in various ways. For example, the third key may be divided into a first area and a second area; the execution subject may generate the first trigger event upon receiving a click operation by the user on the first area of the third key, and the second trigger event upon receiving a click on the second area. Alternatively, the execution subject may generate the first trigger event upon receiving a single-click operation on the third key, and the second trigger event upon receiving a double-click operation on it.
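Both ways of telling the two trigger events apart can be sketched as small pure functions (function names and the left/right split are assumptions for the example):

```python
# Two illustrative ways to distinguish the third key's trigger events:
# by which area of the key was clicked, or by click count.

def trigger_from_region(click_x, key_left, key_width):
    # First area (left half) -> first trigger event;
    # second area (right half) -> second trigger event.
    mid = key_left + key_width / 2
    return "first" if click_x < mid else "second"

def trigger_from_clicks(click_count):
    # Single click -> first trigger event, double click -> second.
    return "first" if click_count == 1 else "second"
```

Either scheme feeds the same downstream dispatch: "first" enters the large projection mode and "second" the small one.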
Step 312, in response to detecting the trigger event of the fourth key, switching from the screen-projection display mode to the mixed reality display mode, rendering the mixed reality operation image based on the posture information of the head-mounted electronic device, and controlling the head-mounted electronic device to display the mixed reality operation image.
In this embodiment, the execution subject may switch from the screen-projection display mode to the mixed reality display mode upon detecting a trigger event of the fourth key. At this time, the execution subject may render a mixed reality operation image based on the posture information of the head-mounted electronic device and transmit it to the head-mounted electronic device for display. With this embodiment, the user can switch from the screen-projection display mode to the mixed reality display mode directly, without returning to the interface displaying the first key and the second key, making the operation simpler. It is to be understood that the execution subject may render the mixed reality operation image based on the posture information of the head-mounted electronic device in various ways, which are not uniquely limited here; see the above embodiments.
In some optional implementations of this embodiment, after entering the mixed reality display mode, a switching key may be provided on the desktop of the head-mounted electronic device, and this key is used to switch from the mixed reality display mode back to the screen-projection display mode.
As can be seen from fig. 3, compared with the embodiment corresponding to fig. 2, the flow 300 of the method for displaying information in this embodiment highlights that, when entering the screen-projection display mode, either the first or the second screen-projection display mode may be selected, and the screen-projection display mode may be switched directly to the mixed reality display mode. Therefore, according to the scheme described in this embodiment, the user can quickly enter the large or small screen-projection display mode, and can quickly switch from the screen-projection display mode to the mixed reality display mode. This further speeds up mode selection when the user operates the head-mounted electronic device and reduces the operation difficulty.
With continued reference to fig. 4, fig. 4 is a schematic diagram of an application scenario of the method for displaying information according to the present embodiment. In the application scenario of fig. 4, a user first initiates a display request of the head-mounted electronic device 401 by plugging the data cable of the head-mounted electronic device 401 into a terminal device, so that a first key and a second key can be displayed on the screen of the terminal device, as shown in fig. 4. Thereafter, if the terminal device detects a trigger event of the first key, it may enter the screen-projection display mode: the terminal device may acquire the image to be displayed (e.g., the desktop to be displayed on the screen) and render the acquired data to generate a screen-projection image 402 of, for example, the desktop; the head-mounted electronic device 401 may display the screen-projection image 402 so as to project it into space, while the terminal device displays the desktop, as shown in fig. 4. Optionally, in the screen-projection display mode, the screen of the terminal device may first display a third key and a fourth key, as shown in fig. 4. When the terminal device detects a first trigger event of the third key (for example, the user clicks the left area of the third key), the first screen-projection display mode may be entered, and the head-mounted electronic device 401 displays the screen-projection image 402 in space in the large screen-projection display manner; when the terminal device detects a second trigger event of the third key (for example, the user clicks the right area of the third key), the second screen-projection display mode is entered, and the head-mounted electronic device 401 displays the screen-projection image 402 in space in the small screen-projection display manner.
Further, if the terminal device detects a trigger event of the second key, the mixed reality display mode may be entered. At this time, the terminal device may render a mixed reality operation image 403 based on the posture information of the head-mounted electronic device, and control the head-mounted electronic device 401 to display the mixed reality operation image 403, as shown in fig. 4, so that the user may operate the mixed reality operation image 403 to update the displayed image. Alternatively, after entering the mixed reality display mode, the terminal device may serve as an operation handle of the head-mounted electronic device 401, as shown in fig. 4, and the user may manipulate the mixed reality operation image 403 displayed by the head-mounted electronic device 401 through the handle, thereby updating the image displayed by the head-mounted electronic device 401. Optionally, in the screen-projection display mode, if the user clicks the fourth key, the screen-projection display mode may be switched to the mixed reality display mode, as shown in fig. 4.
With further reference to fig. 5, as an implementation of the methods shown in the above-mentioned figures, the present application provides an embodiment of an apparatus for displaying information, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 5, the apparatus 500 for displaying information of the present embodiment includes: an acquisition unit 501, a first display unit 502, and a second display unit 503. The obtaining unit 501 is configured to obtain a display request of the head-mounted electronic device, and display a first key and a second key in a screen of the terminal device, where the first key is used to select a screen-projection display mode, and the second key is used to select a mixed reality display mode; the first display unit 502 is configured to enter a screen-projection display mode in response to detecting a trigger event of the first key, acquire data of an image to be displayed, render the data of the image to be displayed to generate a screen-projection image, and display the screen-projection image and the image to be displayed on the head-mounted electronic device and the terminal device, respectively; the second display unit 503 is configured to enter a mixed reality display mode in response to detecting a trigger event of the second key, render a mixed reality operation image based on the posture information of the head-mounted electronic device, and control the head-mounted electronic device to display the mixed reality operation image, wherein the mixed reality operation image is updated based on the user operation on the mixed reality operation image.
In some optional implementations of the present embodiment, the first display unit 502 is further configured to: render a mixed reality interface; and display the mixed reality interface on the screen, so that the user can operate the mixed reality interface to update the mixed reality operation image displayed by the head-mounted electronic device.
In some optional implementations of this embodiment, the apparatus 500 further includes a generating unit configured to generate the display request in response to determining that the terminal device has established a communication connection with the head-mounted electronic device.
In some optional implementations of this embodiment, the apparatus 500 further includes an establishing unit configured to: receive an opening instruction of the user for a pre-installed target application program, run the target application program, and display first prompt information on the screen, where the first prompt information is used for prompting the establishment of a communication connection between the terminal device and the head-mounted electronic device; and establish the communication connection between the terminal device and the head-mounted electronic device.
In some optional implementations of the present embodiment, the first display unit is further configured to: acquiring data of a desktop of terminal equipment; and determining the image formed by the desktop as the image to be displayed.
In some optional implementations of the present embodiment, the first display unit 502 includes: the display module is configured to display a third key and a fourth key in a screen, wherein the third key is used for selecting a first screen projection display mode or a second screen projection display mode, the size of a screen projection image in the first screen projection display mode is larger than that in the second screen projection display mode, and the fourth key is used for switching from the screen projection display mode to a mixed reality display mode; responding to a first trigger event of a third key, and entering a first screen projection display mode; entering a second screen projection display mode in response to detecting a second trigger event of the third key; and responding to the detection of the triggering event of the fourth key, and switching from the screen projection display mode to the mixed reality display mode.
In some optional implementations of this embodiment, the display module is further configured to: and receiving a first preset operation of a user on a screen, and displaying a window comprising a third key and a fourth key on the screen.
In some optional implementations of this embodiment, the display module is further configured to: and calling and executing a popup display instruction to control the screen to display a popup including a third key and a fourth key within a preset time length.
In some optional implementations of this embodiment, the first display unit 502 further includes a first updating module configured to: determine the image to be displayed by the terminal device as the current display image of the screen; update the data of the image to be displayed in response to receiving an operation of the user on the current display image shown on the screen; and update the screen-projection image based on the updated data of the image to be displayed, displaying the updated screen-projection image and the updated image to be displayed on the head-mounted electronic device and the terminal device, respectively.
In some optional implementations of this embodiment, the second display unit 503 is further configured to: acquiring attitude information and to-be-mixed reality data of the head-mounted electronic equipment, wherein the attitude information is used for representing the position and the attitude of the head-mounted electronic equipment; adjusting the position and the posture of a virtual camera in a preset rendering engine according to the posture information; rendering the data to be mixed reality based on the adjusted virtual camera to generate a mixed reality operation image.
In some optional implementations of the present embodiment, the second display unit 503 is further configured to: for a target object in a mixed reality operation image displayed by the head-mounted electronic equipment, responding to the received operation of a user on the target object, and calling and executing a control instruction for the target object to update the mixed reality operation image; and sending the updated mixed reality operation image to the head-mounted electronic equipment so that the head-mounted electronic equipment displays the updated mixed reality operation image.
In some optional implementations of this embodiment, the mixed reality interface includes a mixed reality display setup button; the first display unit 502 is further configured to: and responding to the received second preset operation of the user for the mixed reality display setting key, and configuring parameters of the mixed reality display mode.
The units recited in the apparatus 500 correspond to the various steps in the method described with reference to fig. 2 and 3. Thus, the operations and features described above for the method are equally applicable to the apparatus 500 and the units included therein, and are not described in detail here.
Referring now to FIG. 6, shown is a block diagram of a computer system 600 suitable for use in implementing a terminal device of an embodiment of the present application. The terminal device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU) 601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a cathode ray tube (CRT) or liquid crystal display (LCD), and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as necessary, so that a computer program read from it can be installed into the storage section 608 as needed.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601. It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. 
In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, which may be described as: a processor includes an acquisition unit, a first display unit, and a second display unit. The names of these units do not, in some cases, limit the units themselves; for example, the acquisition unit may also be described as "a unit that acquires a display request from the head-mounted electronic device and displays the first key and the second key on the screen of the terminal device".
As another aspect, the present application also provides a computer readable medium, which may be included in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquire a display request from the head-mounted electronic device, and display a first key and a second key on a screen of the terminal device, wherein the first key is used for selecting a screen projection display mode and the second key is used for selecting a mixed reality display mode; in response to a trigger event of the first key, enter the screen projection display mode, acquire data of an image to be displayed, render the data to generate a screen projection image, and display the screen projection image and the image to be displayed on the head-mounted electronic device and the terminal device, respectively; and in response to a trigger event of the second key, enter the mixed reality display mode, render a mixed reality operation image based on posture information of the head-mounted electronic device, and control the head-mounted electronic device to display the mixed reality operation image, wherein the mixed reality operation image is updated based on the user's operation on the mixed reality operation image.
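The two-mode control flow described above can be sketched as follows. This is a minimal illustration only; the class, method, and key names (`DisplayController`, `on_first_key`, and so on) and the placeholder "rendering" are assumptions, not part of the disclosure.

```python
from enum import Enum, auto


class DisplayMode(Enum):
    SCREEN_PROJECTION = auto()
    MIXED_REALITY = auto()


class DisplayController:
    """Hypothetical terminal-side controller for the two display modes."""

    def __init__(self):
        self.mode = None  # no mode selected until a key is triggered

    def on_display_request(self):
        # When the headset requests display, show the two selection keys
        # on the terminal screen.
        return ["first_key: screen projection", "second_key: mixed reality"]

    def on_first_key(self, image_data: bytes):
        # Screen-projection mode: render the image data into a projected
        # image for the headset while the terminal shows the original image.
        self.mode = DisplayMode.SCREEN_PROJECTION
        projected = b"rendered:" + image_data  # stand-in for real rendering
        return {"headset": projected, "terminal": image_data}

    def on_second_key(self, pose):
        # Mixed-reality mode: render an operation image from the headset
        # pose and send it to the headset only.
        self.mode = DisplayMode.MIXED_REALITY
        return {"headset": ("mr_image", pose)}
```

A caller would dispatch each key's trigger event to the matching handler; the returned dictionary models which device receives which image.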
The foregoing description covers only preferred embodiments of the application and illustrates the principles of the technology employed. Those skilled in the art will appreciate that the scope of the invention disclosed herein is not limited to the particular combination of features described above; it also covers other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention disclosed above, for example, arrangements in which the above features are replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (16)

1. A method for displaying information, applied to a terminal device, the method comprising:
acquiring a display request from a head-mounted electronic device, and displaying a first key and a second key on a screen of the terminal device, wherein the first key is used for selecting a screen projection display mode and the second key is used for selecting a mixed reality display mode;
in response to detecting a trigger event of the first key, entering the screen projection display mode, acquiring data of an image to be displayed, rendering the data to generate a screen projection image, and displaying the screen projection image and the image to be displayed on the head-mounted electronic device and the terminal device, respectively; and
in response to detecting a trigger event of the second key, entering the mixed reality display mode, rendering a mixed reality operation image based on posture information of the head-mounted electronic device, and controlling the head-mounted electronic device to display the mixed reality operation image, wherein the mixed reality operation image is updated based on the user's operation on the mixed reality operation image.
2. The method of claim 1, wherein after entering the mixed reality display mode, the method further comprises:
rendering a mixed reality interface; and
displaying the mixed reality interface on the screen, so that the user operates on the mixed reality operation image through the mixed reality interface to update the mixed reality operation image displayed by the head-mounted electronic device.
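The interaction in claim 2 — the terminal screen acting as the controller for the headset's operation image — can be sketched as below. `FakeHeadset`, `MixedRealityInterface`, and the string-based "image" are illustrative stand-ins, not part of the claimed method.

```python
class FakeHeadset:
    """Stand-in for the head-mounted device's display endpoint."""

    def __init__(self):
        self.current_image = None

    def update(self, image):
        # Receive and display the updated mixed reality operation image.
        self.current_image = image


class MixedRealityInterface:
    """Sketch of claim 2: the terminal's MR interface drives the headset."""

    def __init__(self, headset):
        self.headset = headset

    def on_operation(self, gesture):
        # An operation on the terminal's mixed reality interface updates
        # the operation image shown by the head-mounted device.
        self.headset.update("mr_image_after:" + gesture)
```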
3. The method of claim 1, wherein before acquiring the display request of the head-mounted electronic device, the method further comprises:
generating the display request in response to determining that the terminal device has established a communication connection with the head-mounted electronic device.
4. The method of claim 3, wherein before generating the display request in response to determining that the terminal device has established a communication connection with the head-mounted electronic device, the method further comprises:
receiving a user's instruction to open a pre-installed target application, running the target application, and displaying first prompt information on the screen, wherein the first prompt information prompts the user to establish a communication connection between the terminal device and the head-mounted electronic device; and
establishing the communication connection between the terminal device and the head-mounted electronic device.
5. The method of claim 1, wherein after entering the screen projection display mode, the method further comprises:
acquiring data of a desktop of the terminal device; and
determining an image formed by the desktop as the image to be displayed.
6. The method of claim 1, wherein after entering the screen projection display mode, the method further comprises:
displaying a third key and a fourth key on the screen, wherein the third key is used for selecting a first screen projection display mode or a second screen projection display mode, the size of the screen projection image in the first screen projection display mode being larger than that in the second screen projection display mode, and the fourth key is used for switching from the screen projection display mode to the mixed reality display mode;
entering the first screen projection display mode in response to detecting a first trigger event of the third key;
entering the second screen projection display mode in response to detecting a second trigger event of the third key;
switching from the screen projection display mode to the mixed reality display mode in response to detecting a triggering event of the fourth key.
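The sub-mode selection and the projection-to-MR switch of claim 6 can be modeled as a small state machine. All names (`ModeSwitcher`, `image_scale`) and the 1.0/0.5 scale factors are illustrative assumptions; the claim only requires that the first sub-mode's projected image be larger than the second's.

```python
from enum import Enum, auto


class ProjectionSubMode(Enum):
    LARGE = auto()   # first screen-projection display mode (larger image)
    SMALL = auto()   # second screen-projection display mode (smaller image)


class ModeSwitcher:
    """Illustrative handler for the third and fourth keys of claim 6."""

    def __init__(self):
        self.sub_mode = None
        self.in_mixed_reality = False

    def on_third_key(self, first_trigger: bool):
        # The third key selects between the two projection sub-modes,
        # depending on whether its first or second trigger event fired.
        self.sub_mode = (
            ProjectionSubMode.LARGE if first_trigger else ProjectionSubMode.SMALL
        )

    def on_fourth_key(self):
        # The fourth key switches from projection to mixed reality.
        self.in_mixed_reality = True

    def image_scale(self) -> float:
        # Larger projected image in the first sub-mode, per the claim.
        return 1.0 if self.sub_mode is ProjectionSubMode.LARGE else 0.5
```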
7. The method of claim 6, wherein the displaying a third key and a fourth key on the screen comprises:
receiving a first preset operation of the user on the screen, and displaying a window comprising the third key and the fourth key on the screen.
8. The method of claim 6, wherein the displaying a third key and a fourth key on the screen comprises:
calling and executing a popup display instruction to control the screen to display, for a preset duration, a popup window comprising the third key and the fourth key.
9. The method of claim 1, wherein after entering the screen projection display mode, the method further comprises:
determining the image currently displayed on the screen of the terminal device as the image to be displayed;
in response to receiving the user's operation on the image currently displayed on the screen, updating the data of the image to be displayed; and
updating the screen projection image based on the updated data of the image to be displayed, and displaying the updated screen projection image and the updated image to be displayed on the head-mounted electronic device and the terminal device, respectively.
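The update loop of claim 9 — user operation on the terminal screen, data update, re-render, refresh of both displays — might look like the sketch below; `ProjectionSession`, the byte-string "images", and the `rendered:` prefix are hypothetical stand-ins for real image data and rendering.

```python
class ProjectionSession:
    """Sketch of the screen-projection update loop in claim 9."""

    def __init__(self, current_screen_image: bytes):
        # The image currently shown on the terminal screen is taken as
        # the image to be displayed.
        self.to_display = current_screen_image

    def on_user_operation(self, op: bytes) -> dict:
        # A user operation on the current display image updates the data
        # of the image to be displayed ...
        self.to_display = self.to_display + b"|" + op
        # ... then the projected copy is re-rendered and both the headset
        # and the terminal are refreshed.
        projected = b"rendered:" + self.to_display
        return {"headset": projected, "terminal": self.to_display}
```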
10. The method of claim 1, wherein the rendering a mixed reality operation image based on posture information of the head-mounted electronic device comprises:
acquiring the posture information of the head-mounted electronic device and data to be rendered in mixed reality, wherein the posture information represents the position and posture of the head-mounted electronic device;
adjusting the position and posture of a virtual camera in a preset rendering engine according to the posture information; and
rendering the data to be rendered in mixed reality based on the adjusted virtual camera, to generate the mixed reality operation image.
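Adjusting a virtual camera from the headset pose, as in claim 10, amounts to building a view transform from the reported position and orientation. The sketch below assumes a simplified pose of a position vector plus a yaw angle; a real rendering engine would consume the full six-degree-of-freedom pose, and the function name and matrix layout (row-major) are choices of this illustration.

```python
import math


def view_matrix(position, yaw_rad):
    """Build a 4x4 row-major view matrix from a simplified headset pose.

    `position` is (x, y, z); `yaw_rad` is the rotation about the vertical
    axis. The view matrix is the inverse of the camera's world transform:
    transpose of the rotation, combined with translation by -position.
    """
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    x, y, z = position
    return [
        [c, 0.0, -s, -(c * x - s * z)],
        [0.0, 1.0, 0.0, -y],
        [s, 0.0, c, -(s * x + c * z)],
        [0.0, 0.0, 0.0, 1.0],
    ]
```

Each frame, the engine would rebuild this matrix from the latest posture information before rendering the mixed reality data, so the virtual camera tracks the head-mounted device.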
11. The method of claim 1, wherein the controlling the head-mounted electronic device to display the mixed reality operation image comprises:
in response to receiving a user's operation on a target object in the mixed reality operation image displayed by the head-mounted electronic device, calling and executing a control instruction for the target object to update the mixed reality operation image; and
sending the updated mixed reality operation image to the head-mounted electronic device, so that the head-mounted electronic device displays the updated mixed reality operation image.
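Claim 11's flow — operate on a target object, execute a control instruction, send the updated image — can be sketched as follows. The object names, the `selected` attribute, and the string-valued "frame" are all illustrative assumptions.

```python
class MixedRealityScene:
    """Sketch of claim 11: target-object operations update the MR image."""

    def __init__(self):
        self.objects = {"cube": {"selected": False}}
        self.sent_frames = []  # frames sent to the head-mounted device

    def on_object_operation(self, object_id: str) -> str:
        # Call and execute a control instruction for the target object ...
        self.objects[object_id]["selected"] = True
        # ... then send the updated operation image to the headset so
        # that it displays the updated image.
        frame = "frame:" + object_id + ":selected"
        self.sent_frames.append(frame)
        return frame
```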
12. The method of claim 2, wherein the mixed reality interface comprises a mixed reality display setting key; and
the method further comprises:
configuring parameters of the mixed reality display mode in response to receiving a second preset operation of the user on the mixed reality display setting key.
13. An apparatus for displaying information, applied to a terminal device, the apparatus comprising:
an acquisition unit configured to acquire a display request from the head-mounted electronic device, and display a first key and a second key on a screen of the terminal device, wherein the first key is used for selecting a screen projection display mode and the second key is used for selecting a mixed reality display mode;
a first display unit configured to, in response to detecting a trigger event of the first key, enter the screen projection display mode, acquire data of an image to be displayed, render the data to generate a screen projection image, and display the screen projection image and the image to be displayed on the head-mounted electronic device and the terminal device, respectively; and
a second display unit configured to, in response to detecting a trigger event of the second key, enter the mixed reality display mode, render a mixed reality operation image based on posture information of the head-mounted electronic device, and control the head-mounted electronic device to display the mixed reality operation image, wherein the mixed reality operation image is updated based on the user's operation on the mixed reality operation image.
14. A terminal device, comprising:
one or more processors;
a storage device storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-12.
15. A non-transitory computer readable storage medium having computer instructions stored thereon, wherein the computer instructions are used for causing a computer to perform the method of any one of claims 1-12.
16. A computer program product comprising a computer program, wherein the computer program when executed by a processor implements the method of any one of claims 1-12.
CN202110469947.7A 2021-04-28 2021-04-28 Method and apparatus for displaying information Pending CN115248655A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110469947.7A CN115248655A (en) 2021-04-28 2021-04-28 Method and apparatus for displaying information


Publications (1)

Publication Number Publication Date
CN115248655A true CN115248655A (en) 2022-10-28

Family

ID=83696733

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110469947.7A Pending CN115248655A (en) 2021-04-28 2021-04-28 Method and apparatus for displaying information

Country Status (1)

Country Link
CN (1) CN115248655A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015192117A1 (en) * 2014-06-14 2015-12-17 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
WO2017166231A1 (en) * 2016-03-31 2017-10-05 深圳市柔宇科技有限公司 Head-mounted display device
US10466780B1 (en) * 2015-10-26 2019-11-05 Pillantas Systems and methods for eye tracking calibration, eye vergence gestures for interface control, and visual aids therefor
CN110456905A (en) * 2019-07-23 2019-11-15 广东虚拟现实科技有限公司 Positioning and tracing method, device, system and electronic equipment
CN111766945A (en) * 2020-06-05 2020-10-13 维沃移动通信有限公司 Interface display method and device
WO2020230418A1 (en) * 2019-05-14 2020-11-19 パナソニックIpマネジメント株式会社 Display device
CN112306443A (en) * 2020-11-23 2021-02-02 Oppo广东移动通信有限公司 Information display method and storage medium
CN112463016A (en) * 2020-12-09 2021-03-09 Oppo广东移动通信有限公司 Display control method and device, electronic equipment and wearable display equipment
CN112558825A (en) * 2019-09-26 2021-03-26 华为技术有限公司 Information processing method and electronic equipment


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
87870网: "[Review] DingTalk Smart Connect Nreal Light AR glasses: a half-finished technology demo", pages 1 - 22, Retrieved from the Internet <URL:https://www.163.com/dy/article/FQES8TB80511AEA7.html> *

Similar Documents

Publication Publication Date Title
US20220405986A1 (en) Virtual image generation method, device, terminal and storage medium
CN110046021B (en) Page display method, device, system, equipment and storage medium
US20140026057A1 (en) Providing access to a remote application via a web client
CN108833787B (en) Method and apparatus for generating short video
CN113741765B (en) Page jump method, device, equipment, storage medium and program product
CN113377366B (en) Control editing method, device, equipment, readable storage medium and product
CN113806054A (en) Task processing method and device, electronic equipment and storage medium
US20220050562A1 (en) Methods and apparatuses for generating a hosted application
CN114363686B (en) Method, device, equipment and medium for publishing multimedia content
CN110673886B (en) Method and device for generating thermodynamic diagrams
CN115576458A (en) Application window display method, device, equipment and medium
CN110619615A (en) Method and apparatus for processing image
CN113377365B (en) Code display method, apparatus, device, computer readable storage medium and product
CN115248655A (en) Method and apparatus for displaying information
CN111324244B (en) Method and device for switching picture display types
CN109190097B (en) Method and apparatus for outputting information
CN113360064A (en) Method and device for searching local area of picture, medium and electronic equipment
CN113391737A (en) Interface display control method and device, storage medium and electronic equipment
CN112947918A (en) Data display method and device
CN113126863A (en) Object selection implementation method and device, storage medium and electronic equipment
CN111797350A (en) Data processing method and device and electronic equipment
WO2020011067A1 (en) Display method and device for terminal, terminal, and storage medium
CN112346615A (en) Information processing method and device
CN110941389A (en) Method and device for triggering AR information points by focus
CN112083840A (en) Method, device, terminal and storage medium for controlling electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination