CN111913674A - Virtual content display method, device, system, terminal equipment and storage medium

Virtual content display method, device, system, terminal equipment and storage medium

Info

Publication number
CN111913674A
Authority
CN
China
Prior art keywords
content, interactive, area, interaction, displayed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910377282.XA
Other languages
Chinese (zh)
Inventor
卢智雄
戴景文
贺杰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Virtual Reality Technology Co Ltd
Original Assignee
Guangdong Virtual Reality Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Virtual Reality Technology Co Ltd filed Critical Guangdong Virtual Reality Technology Co Ltd
Priority to CN201910377282.XA
Publication of CN111913674A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 - Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01 - Head-up displays
    • G02B 27/017 - Head mounted

Abstract

The embodiments of the present application disclose a virtual content display method, apparatus, system, terminal device and storage medium, relating to the field of display technology. The display method is applied to a terminal device that is communicatively connected to an interaction device, the interaction device including an interaction area. The method includes: acquiring content to be displayed, the content to be displayed including non-interactive content and interactive content; acquiring first content data corresponding to the non-interactive content; acquiring relative spatial position information between the interaction device and the terminal device; generating virtual extended content according to the relative spatial position information and the first content data; and, when an interactive picture corresponding to the interactive content is displayed, displaying the virtual extended content, where a first display area of the interactive picture corresponds to the interaction area and a second display area of the virtual extended content corresponds to a set area outside the interaction area. The method can improve the display effect of the displayed content.

Description

Virtual content display method, device, system, terminal equipment and storage medium
Technical Field
The present application relates to the field of display technologies, and in particular, to a method, an apparatus, a system, a terminal device, and a storage medium for displaying virtual content.
Background
With the rapid development of multimedia technology, more and more intelligent mobile terminals (such as handheld computers, smart phones and smart watches) have entered people's daily lives and are popular because they are small and easy to carry. However, during use, the content displayed by a conventional smart mobile device is generally limited by the size of its screen, so the displayed content is neither rich nor complete enough.
Disclosure of Invention
The embodiments of the present application provide a virtual content display method, apparatus and system, a terminal device and a storage medium, which can improve the display effect of displayed content.
In a first aspect, an embodiment of the present application provides a method for displaying virtual content, which is applied to a terminal device, where the terminal device is in communication connection with an interaction apparatus, the interaction apparatus includes an interaction area, and the method includes: acquiring content to be displayed, wherein the content to be displayed comprises non-interactive content and interactive content; acquiring first content data corresponding to non-interactive content; acquiring relative spatial position information between an interaction device and terminal equipment; generating virtual extended content according to the relative spatial position information and the first content data; when an interactive picture corresponding to the interactive content is displayed, the virtual extended content is displayed, a first display area of the interactive picture corresponds to the interactive area, and a second display area of the virtual extended content corresponds to a set area outside the interactive area.
In a second aspect, an embodiment of the present application provides a display apparatus for virtual content, which is applied to a terminal device, where the terminal device is in communication connection with an interaction apparatus, the interaction apparatus includes an interaction area, and the apparatus includes: the content display device comprises a content acquisition module, a data acquisition module, a position acquisition module, a content generation module and a display control module, wherein the content acquisition module is used for acquiring content to be displayed, and the content to be displayed comprises non-interactive content and interactive content; the data acquisition module is used for acquiring first content data corresponding to the non-interactive content; the position acquisition module is used for acquiring relative spatial position information between the interaction device and the terminal equipment; the content generation module is used for generating virtual extended content according to the relative spatial position information and the first content data; the display control module is used for displaying the virtual extension content when an interactive picture corresponding to the interactive content is displayed, wherein a first display area of the interactive picture corresponds to the interactive area, and a second display area of the virtual extension content corresponds to a set area outside the interactive area.
In a third aspect, an embodiment of the present application provides a virtual content display system, where the system includes a terminal device and an interaction device, the terminal device is in communication connection with the interaction device, and the interaction device includes an interaction area, where the interaction device is configured to obtain, according to a content to be displayed in the interaction area, first content data corresponding to a non-interaction content in the content to be displayed, where the content to be displayed includes the non-interaction content and an interaction content, send the first content data to the terminal device, and control the interaction area to display an interaction picture corresponding to the interaction content; the terminal equipment is used for acquiring relative space position information between the interaction device and the terminal equipment, receiving first content data, generating virtual extended content according to the relative space position information and the first content data, and displaying the virtual extended content, wherein a display area of the virtual extended content corresponds to a set area outside the interaction area.
In a fourth aspect, an embodiment of the present application provides a terminal device, including: one or more processors; a memory; one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the method for displaying virtual content as provided by the first aspect above.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium, where a program code is stored in the computer-readable storage medium, and the program code may be called by a processor to execute the method for displaying virtual content provided in the first aspect.
According to the scheme provided by the embodiments of the present application, the terminal device acquires the content to be displayed, where the content to be displayed includes non-interactive content and interactive content, acquires first content data corresponding to the non-interactive content, acquires relative spatial position information between the interaction device and the terminal device, and generates virtual extended content according to the relative spatial position information and the first content data. When an interactive picture corresponding to the interactive content is displayed, the virtual extended content is displayed, where a first display area of the interactive picture corresponds to the interaction area and a second display area of the virtual extended content corresponds to a set area outside the interaction area of the interaction device. In this way, the virtual extended content corresponding to the non-interactive content is displayed in the virtual space according to the spatial position of the interaction device, so that the user sees the virtual extended content superimposed outside the interaction area while the displayed interactive content corresponds to the interaction area. This enlarges the display space of the displayed content, improves the display effect, and facilitates interaction between the user and the displayed content.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 shows a schematic diagram of an application environment suitable for the embodiment of the present application.
Fig. 2 shows a schematic diagram of another application scenario applicable to the embodiments of the present application.
Fig. 3 shows a flow chart of a method of displaying virtual content according to an embodiment of the application.
Fig. 4A-4B are schematic diagrams illustrating a display effect according to an embodiment of the application.
Fig. 5 shows a flowchart of a method of displaying virtual content according to another embodiment of the present application.
Fig. 6 shows a schematic diagram of a display effect according to an embodiment of the application.
Fig. 7 shows another display effect diagram according to an embodiment of the application.
Fig. 8 shows a schematic diagram of another display effect according to an embodiment of the application.
FIG. 9 shows a block diagram of a display device of virtual content according to one embodiment of the present application.
Fig. 10 is a block diagram of a terminal device for executing a display method of virtual content according to an embodiment of the present application.
Fig. 11 is a block diagram of an interaction apparatus for performing a display method of virtual content according to an embodiment of the present application.
Fig. 12 is a storage unit for storing or carrying program codes for implementing a display method of virtual content according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
With the rapid improvement of technology and living standards, mobile terminals (for example, smart phones and tablet computers) have become widespread, and are well liked because they are small and easy to carry. When a user uses a mobile terminal, the mobile terminal generally displays content on a touch screen, such as multimedia pictures, application interfaces and file content. Since the size of the screen of the mobile terminal is limited, the content displayed on it is also limited by the size of the screen.
Through long-term research, the inventor proposes the virtual content display method, apparatus, system, terminal device and storage medium of the embodiments of the present application, which convert the non-interactive content in the displayed content into virtual extended content and display it in a virtual space in an augmented reality manner according to the spatial position of the interaction device, so that the user sees the virtual extended content displayed outside the interaction area of the interaction device, thereby improving the display effect of the content corresponding to the interaction area.
An application scenario of the display method of virtual content provided in the embodiment of the present application is described below.
Referring to fig. 1, a display system 10 for virtual content provided in an embodiment of the present application is shown, and includes a terminal device 100 and an interaction apparatus 200, where the terminal device 100 is communicatively connected to the interaction apparatus 200.
In the embodiment of the present application, the terminal device 100 may be a head-mounted display device, or may be a mobile device such as a mobile phone or a tablet. When the terminal device 100 is a head-mounted display device, it may be an integrated (standalone) head-mounted display device. The terminal device 100 may also be an intelligent terminal such as a mobile phone connected to an external head-mounted display device; that is, the terminal device 100 may serve as the processing and storage device of the head-mounted display device, be plugged into or connected to the external head-mounted display device, and display the virtual content through the head-mounted display device.
In the embodiment of the present application, the interaction device 200 may be an electronic device provided with one or more markers 201; the number of markers 201 provided on the interaction device 200 is not limited. The specific configuration, structure and size of the interaction device 200 are also not limited; it may take various shapes, such as square or circular, and various forms, such as a flat plate.
In some embodiments, the marker 201 may be attached to or integrated with the interactive device 200, may be disposed on a protective cover of the interactive device 200, or may be an external marker, and may be inserted into the interactive device 200 through a USB (Universal Serial Bus) or an earphone hole when in use. If the interactive device 200 is provided with a display screen, the marker 201 can be displayed on the display screen of the interactive device 200.
In some embodiments, the interaction device 200 may perform information and instruction interaction with the terminal device 100, and the terminal device 100 and the interaction device 200 may be connected through Wireless communication modes such as bluetooth, WiFi (Wireless-Fidelity), ZigBee (ZigBee technology), and the like, or may be connected through a USB interface through wired communication. Referring to fig. 2, when the terminal device 100 is a head-mounted display device and the interaction device 200 is a mobile phone terminal or a tablet computer, the head-mounted display device is in wired communication with the tablet computer and the mobile phone terminal through a USB interface. Of course, the connection mode between the terminal device 100 and the interaction apparatus 200 is not limited in the embodiment of the present application.
When the terminal device 100 and the interaction device 200 are used together, the marker 201 can be located in the visual range of the image sensor on the terminal device 100 to acquire an image containing the marker 201, so that the marker 201 is identified and tracked, and the positioning and tracking of the interaction device 200 are realized.
In some embodiments, the terminal device 100 may further implement position tracking of the interaction means 200 according to a light spot disposed on the interaction means 200, wherein the light spot may be an array of light spots.
In some embodiments, at least one interactive area 202 is provided on the interactive device 200, and the user can perform related control and interaction through the interactive area 202. The interactive area 202 may include a key, a touch pad, or a touch screen, among others. The interactive apparatus 200 can generate a control command corresponding to a control operation detected in the interactive region 202, and perform related control. The interaction device 200 may also transmit the control instruction to the terminal apparatus 100, or the interaction device 200 generates operation data according to the operation detected by the interaction area and transmits the operation data to the terminal apparatus 100. When the terminal device 100 receives the control instruction transmitted by the interaction apparatus 200, the display of the virtual content (e.g., the virtual content is controlled to rotate, displace, etc.) may be controlled according to the control instruction.
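As a non-limiting sketch of the data flow just described, the following Python snippet shows one way the interaction device 200 could package an operation detected in the interactive area 202 as operation data and how the terminal device 100 could translate it into a display control. The message fields, transport and the methods on virtual_content are illustrative assumptions, not part of this disclosure.

```python
import json
import socket

def send_operation(sock: socket.socket, op_type: str, position) -> None:
    """Interaction device side: package a control operation detected in the
    interactive area as operation data and transmit it to the terminal device."""
    message = {"type": op_type, "pos": position}          # e.g. {"type": "slide", "pos": [0.4, 0.7]}
    sock.sendall(json.dumps(message).encode() + b"\n")

def handle_operation(message: dict, virtual_content) -> None:
    """Terminal device side: control the display of the virtual content
    according to the received operation data / control instruction."""
    if message["type"] == "slide":
        virtual_content.rotate(angle=90 * message["pos"][0])   # rotate the virtual content (assumed API)
    elif message["type"] == "drag":
        virtual_content.translate(*message["pos"])             # displace the virtual content (assumed API)
```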
In a specific embodiment, the terminal device 100 is a head-mounted display device. By wearing the head-mounted display device, the user can observe the virtual chat interface 301 superimposed on the interactive area 202 of the interaction device 200 in real space, and can also observe non-interactive content of the virtual chat interface 301 (such as the historical chat records 302) displayed at a position outside the interactive area 202. In this way the user can view the historical chat records 302 while chatting with a chat object through the virtual chat interface 301, so that viewing content through the interaction device 200 is not limited by the size of the interactive area, and interactive display of the displayed content is achieved.
A specific display method of the virtual content will be described below.
Referring to fig. 3, an embodiment of the present application provides a method for displaying virtual content, which is applicable to the terminal device, where the terminal device is in communication connection with an interaction apparatus, and the interaction apparatus includes an interaction area, where the method may include:
step S110: and acquiring content to be displayed, wherein the content to be displayed comprises non-interactive content and interactive content.
In the embodiment of the present application, the display area preset for the content to be displayed corresponds to the interaction area. In one embodiment, the interaction area includes a touch screen, and the content to be displayed may be content that the interaction device needs to display on the touch screen. In another embodiment, the content to be displayed may also be content that the terminal device needs to display; when displaying it, the terminal device may convert the content to be displayed into corresponding virtual content and display the virtual content superimposed on the interaction area, so that the user sees, through the head-mounted display device, the corresponding virtual content superimposed on the interaction device in the real world. In this way the display area of the content to be displayed corresponds to the interaction area.
Because the size of the interaction area is limited, the content to be displayed is limited by the size of that area and its display effect is poor. Therefore, the non-interactive content in the content to be displayed can be displayed superimposed outside the interaction area through augmented reality technology, while only the display area of the interactive content needs to correspond to the interaction area; this ensures complete display of the content to be displayed and at the same time improves the display effect of the interactive content corresponding to the interaction area. In the embodiment of the present application, the terminal device may obtain the content to be displayed and determine the non-interactive content from it. The content to be displayed includes non-interactive content and interactive content.
In some embodiments, the content to be displayed may be content that needs to be displayed by the interaction device, the terminal device acquires the content to be displayed, or the interaction device transmits the content to be displayed to the terminal device after acquiring the content to be displayed by the touch screen, so that the terminal device may acquire the content to be displayed by the touch screen. The content to be displayed may be stored in the interactive apparatus, or may be acquired by the interactive apparatus from a server or other electronic devices, which is not limited herein.
In other embodiments, the content to be displayed may be content that needs to be displayed by the terminal device, and the content to be displayed may be stored in the terminal device, or may be acquired by the terminal device from a server or other electronic devices, or may be acquired by identifying a marker on the interaction apparatus. Of course, the above manner of acquiring the content to be displayed is only an example, and the specific content to be displayed may not be limited in the embodiment of the present application. For example, the content to be displayed may also be acquired according to a captured scene image of the environment in which the terminal device is located.
In the embodiment of the present application, the content to be displayed may be any content, and is not limited herein. For example, it may be an application desktop, an interface of an application (e.g., a chat interface, a game interface), a multi-level menu page (e.g., a setup page, a file management page), etc. The non-interactive content is content that only needs to be viewed without interaction in the current content to be displayed, such as signal strength, electric quantity, time, document, picture, video, personal information, preview information, historical chat content, historical mail content, historical interactive page (such as upper menu page), lower interactive page, and the like. The interactive content may be a current interactive interface, an operation control, and the like. Of course, the specific content to be displayed, the non-interactive content and the interactive content may not be limited in the embodiments of the present application.
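Purely for illustration, the sketch below models one possible split of the content to be displayed into interactive and non-interactive parts, using the example items mentioned above; the class and field names are assumptions of this sketch rather than part of the disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class DisplayItem:
    name: str
    requires_interaction: bool   # True for controls / the current interface, False for view-only items

@dataclass
class ContentToDisplay:
    items: list[DisplayItem] = field(default_factory=list)

    def interactive(self) -> list[DisplayItem]:
        return [i for i in self.items if i.requires_interaction]

    def non_interactive(self) -> list[DisplayItem]:
        return [i for i in self.items if not i.requires_interaction]

# Example: a chat interface as content to be displayed
chat_page = ContentToDisplay([
    DisplayItem("message_input_box", True),      # interactive content
    DisplayItem("history_chat_records", False),  # non-interactive content
    DisplayItem("battery_and_signal", False),    # non-interactive content
])
```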
Step S120: first content data corresponding to the non-interactive content is obtained.
In this embodiment of the application, the terminal device may obtain, according to the content to be displayed, first content data corresponding to non-interactive content in the content to be displayed, so as to determine content in which a display area in the content to be displayed does not correspond to an interactive area. When the content to be displayed is displayed by the interactive device, the content to be displayed, which is acquired by the terminal device from the interactive device, is the image data of the content to be displayed, and therefore, the first content data corresponding to the non-interactive content may be the image data of the non-interactive content. When the content to be displayed is displayed by the terminal device, the first content data corresponding to the non-interactive content acquired by the terminal device may be three-dimensional model data corresponding to the non-interactive content, and the three-dimensional model data may include a color, a model vertex coordinate, model contour data, and the like for constructing a model corresponding to the three-dimensional model.
In some embodiments, when the content to be displayed is content that the interactive device needs to display, the terminal device may directly obtain the first content data corresponding to the non-interactive content from the interactive device. As one mode, the terminal device may send a data request to the interactive apparatus, and the interactive apparatus may transmit the first content data of the non-interactive content to the terminal device according to the data request, so that the terminal device may obtain the first content data corresponding to the non-interactive content.
Step S130: and acquiring relative spatial position information between the interaction device and the terminal equipment.
In this embodiment of the application, the terminal device may obtain relative spatial position information between the interaction apparatus and the terminal device, so that the terminal device displays corresponding virtual content according to the relative spatial position information.
As an implementation, the terminal device can capture an image containing the marker on the interaction device through the image sensor, identify and track the marker in the image, and thereby acquire the relative spatial position information between the interaction device and the terminal device. The relative spatial position information between the interaction device and the terminal device includes the relative position information and attitude information between them; the attitude information may be the orientation, rotation angle and the like of the interaction device relative to the terminal device.
In some embodiments, the marker is a pattern having a topological structure, where the topological structure refers to the connection relationships between the sub-markers, feature points and the like within the marker.
In some embodiments, the interactive device may further include an optical spot and an Inertial Measurement Unit (IMU), the terminal device may acquire an optical spot image on the interactive device through the image sensor, acquire measurement data through the Inertial measurement unit, and determine relative spatial position information between the interactive device and the terminal device according to the optical spot image and the measurement data, thereby positioning and tracking the interactive device. The light spots arranged on the interaction device can be visible light spots or infrared light spots, and the number of the light spots can be one or a light spot sequence consisting of a plurality of light spots.
Of course, the specific manner of acquiring the relative spatial location information between the interaction apparatus and the terminal device may not be limited in this embodiment of the application.
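As an illustrative sketch of marker-based positioning only (an OpenCV-style solvePnP pipeline is assumed here; the embodiments do not prescribe a particular algorithm), the relative position and attitude of the interaction device with respect to the terminal device's image sensor could be estimated as follows:

```python
import cv2
import numpy as np

def relative_pose(marker_corners_3d: np.ndarray,   # known corner coordinates of the marker on the device (Nx3)
                  marker_corners_2d: np.ndarray,   # detected corner pixels in the camera image (Nx2)
                  camera_matrix: np.ndarray,
                  dist_coeffs: np.ndarray):
    """Return rotation and translation of the interaction device w.r.t. the camera."""
    ok, rvec, tvec = cv2.solvePnP(marker_corners_3d, marker_corners_2d,
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("marker pose could not be estimated")
    rotation, _ = cv2.Rodrigues(rvec)   # 3x3 rotation matrix (attitude information)
    return rotation, tvec               # tvec: relative position information
```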
Step S140: and generating virtual extended content according to the relative spatial position information and the first content data.
In this embodiment of the application, after the terminal device acquires the relative spatial position information between the interactive apparatus and the terminal device and the first content data corresponding to the non-interactive content, the terminal device may generate the virtual extended content corresponding to the non-interactive content according to the relative spatial position information and the first content data, so as to display the virtual extended content subsequently.
In some embodiments, the terminal device may acquire the set area in which the virtual extended content needs to be displayed, obtain a rendering position of the virtual extended content according to the relative spatial position information between the interaction device and the terminal device and the relative positional relationship between the set area and the interaction device, and then render the virtual content at the rendering position. The set area refers to the area in real space on which the virtual extended content is superimposed when it is displayed.
Specifically, the terminal device may obtain the position information of the set region relative to the terminal device according to the relative spatial position information between the interaction device and the terminal device and the relative positional relationship between the set region and the interaction device, so as to obtain the spatial position coordinates of the set region in the real space, and convert the spatial position coordinates into the spatial coordinates in the virtual space, where the virtual space refers to a three-dimensional space used for rendering and displaying the virtual extension content. The space coordinates of the set area in the virtual space can be used as rendering coordinates of the virtual extended content in the virtual space, that is, the rendering position of the virtual extended content is obtained, so that the virtual extended content is rendered at the rendering position. The rendering coordinates refer to three-dimensional space coordinates of the virtual extended content in a virtual space. In some embodiments, the setting area may be any area outside the interaction area or a preset area, for example, the setting area may be an area that is outside the interaction area but is adjacent to the interaction area, or the setting area may be an area that is outside the interaction area and is a set distance from the interaction area, and of course, the specific setting area is not limited in this embodiment of the application.
It can be understood that, after the terminal device obtains rendering coordinates for rendering the virtual extended content in the virtual space, the terminal device may construct the virtual extended content according to the obtained first content data corresponding to the non-interactive content and according to the rendering coordinates.
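The following minimal sketch (data structure and names assumed for illustration, not taken from the disclosure) shows the idea of constructing the virtual extended content from the first content data at the rendering coordinates obtained above:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class VirtualPanel:
    texture: np.ndarray   # first content data, e.g. image data of the non-interactive content
    position: np.ndarray  # rendering coordinates in the virtual space (3D)
    size: tuple           # panel width/height in virtual-space units

def build_virtual_extended_content(first_content_data: np.ndarray,
                                   rendering_coords: np.ndarray,
                                   size=(0.3, 0.2)) -> VirtualPanel:
    """Place the non-interactive content on a panel at the rendering position."""
    return VirtualPanel(texture=first_content_data, position=rendering_coords, size=size)
```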
Step S150: when an interactive picture corresponding to the interactive content is displayed, the virtual extended content is displayed, a first display area of the interactive picture corresponds to the interactive area, and a second display area of the virtual extended content corresponds to a set area outside the interactive area.
In the embodiment of the application, when an interactive picture corresponding to the interactive content is displayed, the terminal device may display the virtual extended content, where a first display area of the interactive picture corresponds to the interactive area, and a second display area of the virtual extended content corresponds to a set area outside the interactive area. The non-interactive content in the content to be displayed is displayed outside the interactive area in an overlapped mode through the augmented reality display technology, the display area of the interactive content of the content to be displayed corresponds to the interactive area, complete display of the content to be displayed is guaranteed, and meanwhile the display effect of the interactive content corresponding to the interactive area is improved.
Specifically, after the terminal device constructs and renders the virtual extended content, the virtual extended content may be converted into a virtual picture, and display data of the virtual picture may be obtained; the display data may include the RGB value and the coordinates of each pixel in the display picture. The terminal device may generate the display picture according to the display data and project it to display the virtual extended content. Because the terminal device generates the virtual extended content according to the relative positional relationship between the set area and the interaction device, the relative spatial position information, and the first content data, the virtual extended content is displayed superimposed on the set area; that is, the overlay area of the virtual extended content is the set area outside the interaction area. Through the display lens of the worn head-mounted display device, the user sees the virtual extended content superimposed on the set area outside the interaction area of the interaction device in the real world, thereby achieving the effect of augmented reality.
Because the display area of the virtual extension content is the set area outside the interaction area, the display position of the virtual extension content does not conflict with other contents corresponding to the interaction area, so that a user can control the interaction content corresponding to the interaction area while checking the virtual extension content outside the interaction area, complete display of the content to be displayed is realized, and meanwhile, the display effect and the operation experience of the interaction content corresponding to the interaction area are improved.
In some embodiments, the interactive screen corresponding to the interactive content is displayed, and the terminal device may convert the interactive content into the interactive screen and display the interactive screen in an overlapped manner in the interactive area. Specifically, the terminal device may obtain second content data corresponding to the interactive content, so as to construct an interactive picture corresponding to the interactive content according to the second content data; the terminal equipment can obtain rendering coordinates of the interactive picture in the virtual space according to the relative spatial position information between the interactive device and the terminal equipment; the terminal equipment can construct an interactive picture according to the rendering coordinates and superpose and display the interactive picture on the interactive area; the user can see the interaction picture displayed in the interaction area in an overlapping mode through the head-mounted display device. As an embodiment, the terminal device may render and display the virtual extension content and the interactive screen at the same time, so that the user sees the interactive screen through the head-mounted display device to be displayed in an overlapped manner on the interactive area, and the virtual extension content is displayed in an overlapped manner on the set area outside the interactive area.
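As a rough sketch of this simultaneous rendering (the renderer interface and helper names are assumptions of this sketch, not specified by the embodiments):

```python
def render_frame(renderer, interactive_picture, virtual_extended_content,
                 interaction_area_coords, set_area_coords):
    """Render the interactive picture over the interaction area and the virtual
    extended content over the set area in the same frame."""
    renderer.draw(interactive_picture, at=interaction_area_coords)   # first display area
    renderer.draw(virtual_extended_content, at=set_area_coords)      # second display area
    renderer.present()   # project the composed picture on the head-mounted display
```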
For example, in fig. 1, the interaction device 200 is a touch pad and does not have a display function, and a user observes the interaction region 202 of the interaction device 200 in which the virtual chat interface 301 is superimposed and displayed in a real space through a head-mounted display device worn by the user, and can observe that non-interaction content (such as the historical chat records 302) in the virtual chat interface 301 is displayed at a position outside the interaction region 202 in the form of virtual content, so that the user can chat with a chat object through the interaction region 202 in the virtual chat interface 301, and can also view the historical chat records 302 at the same time, so that the user can view the content through the interaction device 200 without being limited by the size of the interaction region, thereby implementing interactive display of the content to be displayed, improving the display effect, and facilitating interaction between the user and the displayed content.
In other embodiments, the interactive screen corresponding to the interactive content may instead be displayed by the interaction device through the touch screen of the interaction area. The terminal device may display the virtual extended content when it detects that the touch screen is displaying the interactive screen. Referring to fig. 4A and 4B, the interaction device 200 is a smart phone; fig. 4A shows the screen display of a conventional smart phone. The terminal device may acquire the view-only, non-interactive content 203 (such as signal strength, battery level and time) that the smart phone needs to display, and display it as virtual extended content in an area outside the touch screen of the smart phone. As shown in fig. 4B, when the touch screen of the smart phone displays the interactive content, the user can see, through the head-mounted display device, the virtual extended content 303 displayed in the area outside the touch screen of the smart phone in the real world.
According to the virtual content display method provided in the embodiment of the present application, the content to be displayed is obtained, where the content to be displayed includes non-interactive content and interactive content; first content data corresponding to the non-interactive content is obtained; relative spatial position information between the interaction device and the terminal device is obtained; virtual extended content is generated according to the relative spatial position information and the first content data; and when an interactive picture corresponding to the interactive content is displayed, the virtual extended content is displayed, where a first display area of the interactive picture corresponds to the interaction area and a second display area of the virtual extended content corresponds to a set area outside the interaction area. In this way, the virtual extended content corresponding to the non-interactive content in the content to be displayed is displayed in the virtual space according to the spatial position of the interaction device, so that the user sees the virtual extended content displayed outside the interaction area while the displayed interactive content corresponds to the interaction area, which enlarges the display space of the displayed content, improves the display effect, and facilitates interaction between the user and the displayed content.
Referring to fig. 5, another embodiment of the present application provides a method for displaying virtual content, which is applicable to a terminal device, where the terminal device is in communication connection with an interaction apparatus, and the interaction apparatus includes an interaction area, where the method may include:
step S210: and acquiring content to be displayed, wherein the content to be displayed comprises non-interactive content and interactive content.
In some embodiments, the interaction area of the interaction device may include a touch screen, that is, the interaction area is capable of displaying content. When the interaction device displays content, acquiring the content to be displayed may include: sending a data request to the interaction device, where the data request is used to instruct the interaction device to obtain the content to be displayed on the touch screen; and receiving the content to be displayed sent by the interaction device.
When the terminal equipment needs to acquire the content to be displayed, the terminal equipment can send a data request to the interaction device so as to acquire the content to be displayed on the touch screen of the interaction device. When receiving the data request, the interaction device may obtain the content to be displayed on the touch screen from the server, or may obtain the content to be displayed on the touch screen from the local storage. And then transmitting the data of the content to be displayed to the terminal equipment, so that the terminal equipment can receive the data of the content to be displayed.
In some embodiments, before sending the data request to the interaction device, the terminal device may establish a communication connection with the interaction device in a wired or wireless manner, so that the data request can be successfully sent to the interaction device, and at the same time, data of the content to be displayed sent by the interaction device can be successfully received.
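A minimal sketch of such a data request, assuming a simple JSON-over-socket exchange over the established communication connection (the transport, host/port parameters and message format are assumptions of this sketch):

```python
import json
import socket

def request_content_to_display(host: str, port: int) -> dict:
    """Terminal device side: ask the interaction device for the content
    currently shown (or to be shown) on its touch screen."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(b'{"request": "content_to_display"}\n')
        raw = sock.makefile().readline()   # the interaction device replies with one JSON line
    return json.loads(raw)                  # e.g. {"interactive": [...], "non_interactive": [...]}
```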
In some implementations, the non-interactive content can correspond to a portion of the interactive content. For example, when the part of content is an operation control, the non-interactive content may be interface content associated with the operation control, and for example, when the part of content is a pinyin nine key or an english twenty-six key in an input method interface, the non-interactive content may be character candidate content corresponding to a character input box, a character display interface, a key, and the like.
Step S220: first content data corresponding to the non-interactive content is obtained.
In some application scenarios, because the size of the display area of the interaction area is limited, the content to be displayed often needs to be paged. When a menu page of the current level is viewed, the upper-level menu page is usually hidden and can only be viewed by returning, and a lower-level menu page cannot be viewed directly either; a menu item must be selected to enter it. Therefore, in the embodiment of the present application, the terminal device may treat the upper-level or lower-level menu page of the current menu page as non-interactive content and convert it into virtual extended content superimposed and displayed in the set area outside the interaction area, so that the user can simultaneously view the upper-level menu page and the current menu page corresponding to the interaction area, or simultaneously view the current menu page corresponding to the interaction area and the lower-level menu page.
Specifically, in some embodiments, the content to be displayed may include a plurality of menu pages, and the acquiring first content data corresponding to the non-interactive content may include: according to the plurality of menu pages, first data of a second menu page corresponding to the first menu page is obtained, and the first data is used as first content data corresponding to non-interactive content in the content to be displayed. The second menu page is an upper-level menu page or a lower-level menu page of the first menu page, and the first data of the second menu page may be image data of the second menu page. Therefore, the terminal device can take the first data of the upper-level menu page or the lower-level menu page of the current-level menu page as the first content data corresponding to the non-interactive content in the content to be displayed, so as to generate the virtual extended content according to the first content data in the following process.
In some embodiments, the first menu page may be a menu page that needs to be currently operated, among a plurality of menu pages to be displayed. It is understood that, when the first menu page is a first-level menu page of a plurality of menu pages, the second menu page may only be a next-level menu page of the first menu page, for example, when the first menu page is a setting page of a mobile phone, the second menu page may be a sub-page of a bluetooth setting, a sub-page of a connectable wireless account, or the like. When the first menu page is a menu page of another level of the plurality of menu pages except the first menu page, the second menu page may be a next-level menu page of the first menu page, or may be a previous-level menu page of the first menu page, for example, when the first menu page is a sub-page for setting the electric quantity, the second menu page may be a corresponding previous-level page, such as a setting page of a mobile phone, or a next-level page for the setting page, such as a power saving setting page, a sleep setting page, or the like.
When the first menu page includes a plurality of menu items, the first menu page may correspond to a plurality of second menu pages. Therefore, as an implementation manner, the terminal device may obtain, by default, first data corresponding to a second menu page corresponding to a first menu item in the menu item list, and use the first data as first content data corresponding to the non-interactive content. For example, when the first menu page that needs to be operated at present is a file management page, the terminal device may default to acquire data corresponding to the file content in the first folder.
As another implementation, the terminal device may also obtain first data corresponding to a second menu page corresponding to a plurality of menu items in the menu item list, that is, the terminal device may use the plurality of second menu pages as non-interactive content. For example, when the first menu page is a game interface, the terminal device may obtain data of the second menu page corresponding to a backpack menu item, a map menu item, and the like.
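For illustration, the sketch below models the menu pages as a simple tree from which the first data of a second menu page (a child page by default, otherwise the parent page) can be taken as the first content data; the class and field names are assumptions of this sketch.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MenuPage:
    name: str
    image_data: bytes = b""                      # first data of this page (e.g. its rendered image)
    parent: Optional["MenuPage"] = None
    children: list["MenuPage"] = field(default_factory=list)

    def add_child(self, child: "MenuPage") -> "MenuPage":
        child.parent = self
        self.children.append(child)
        return child

def second_menu_page_data(first_page: MenuPage, prefer_child: bool = True) -> bytes:
    """Return first data of a child page by default, otherwise of the upper-level page."""
    if prefer_child and first_page.children:
        return first_page.children[0].image_data   # e.g. the first folder of a file-management page
    if first_page.parent is not None:
        return first_page.parent.image_data
    return b""
```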
Step S230: and acquiring relative spatial position information between the interaction device and the terminal equipment.
In the embodiment of the present application, step S230 may refer to the contents of the above embodiments, and is not described herein again.
Step S240: and acquiring a first relative position relation between a set area and the interactive device, wherein the set area is a corresponding overlapping area in the real environment when the virtual extended content is displayed.
In this embodiment, the terminal device may obtain a display area where the virtual extension content needs to be displayed, so as to generate the virtual extension content according to the display area in the following. Specifically, the terminal device may obtain a first relative positional relationship between a set area and the interaction apparatus, where the set area refers to a corresponding overlay area where the terminal device displays the virtual extension content in an overlay manner in the real environment, and may also be understood as an area where the virtual extension content is seen by the user through the terminal device in the real environment.
In some embodiments, the obtaining of the first relative position relationship between the setting area and the interaction device may include: according to the non-holding area of the interaction device, a set area outside the interaction area and corresponding to the non-holding area is determined, a first relative position relation between the set area and the interaction device is obtained, and the non-holding area is an area which is not held in the edge area of the interaction device. Therefore, the terminal device can set the display area of the virtual extension content in the area outside the side where the interactive device is not held according to the holding area where the user holds the interactive device.
In some embodiments, the setting area outside the interactive area may correspond to a non-holding area of the interactive apparatus, that is, different non-holding areas are different when the user holds different edge sides of the interactive apparatus, and the terminal device may determine the setting areas corresponding to different non-holding areas according to a corresponding relationship between the setting area and the non-holding area, so that the display areas of the generated virtual extension content may also be different. For example, when the user holds the left side and the right side of the interactive device relative to the terminal device, the display area corresponding to the virtual extension content may be an upper area or a lower area of the interactive area relative to the terminal device, so as to avoid the virtual extension content from being blocked by the hand of the user. The corresponding relation between the set area and the non-holding area can be stored in the terminal equipment or downloaded from the server.
As a specific embodiment, the determining, according to the non-holding area of the interaction device, a setting area outside the interaction area and corresponding to the non-holding area, and acquiring a first relative position relationship between the setting area and the interaction device may include: when the interaction device is detected to be in a held state, acquiring a gesture image; determining a non-holding area of the interactive device according to the gesture image; and acquiring a set area corresponding to the non-holding area outside the interaction area, and acquiring a first relative position relation between the set area and the interaction device. The terminal equipment can acquire a gesture image of a user through the image sensor to determine an area which is not held in an edge area of the interaction device according to the gesture image, so that a set area which is outside the interaction area and corresponds to the non-held area can be acquired according to the corresponding relation between the set area and the non-held area. Therefore, the terminal equipment can determine the non-holding area of the interactive device by detecting the gesture of the user holding the interactive device.
In some embodiments, the terminal device may determine whether the interaction region of the interaction device is within a field of view of an image sensor of the terminal device according to the position information, the rotation direction, and the rotation angle of the interaction device relative to the terminal device, so as to determine whether the interaction region is in a held state. The terminal equipment can also directly judge whether the interaction device is held or not according to the sensing data of the sensor of the interaction device. The sensor may be an acceleration sensor, a gravity sensor, or the like.
As another specific embodiment, the interactive device may be provided with a temperature sensor or a pressure sensor, and the interactive device may determine an edge side of the interactive device held by the user according to the corresponding sensing data, so that a non-holding area of the interactive device may be obtained. And the interactive device transmits the data of the non-holding area to the terminal equipment.
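As a hedged sketch of the correspondence described above (the table entries and area names are illustrative assumptions), the set area could be chosen from the detected held sides as follows:

```python
HELD_TO_SET_AREA = {                          # assumed correspondence between held sides and set area
    frozenset({"left", "right"}): "above_interaction_area",
    frozenset({"bottom"}):        "above_interaction_area",
    frozenset({"top"}):           "below_interaction_area",
}

def choose_set_area(held_sides: set[str]) -> str:
    """Pick a set area outside the interaction area that a holding hand will not block."""
    return HELD_TO_SET_AREA.get(frozenset(held_sides), "right_of_interaction_area")
```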
In another embodiment, the obtaining the first relative positional relationship between the setting area and the interaction device may further include: according to the first control action parameter detected by the interaction area, a set area corresponding to the first control action parameter outside the interaction area is determined, and a first relative position relation between the set area and the interaction device is obtained. The terminal equipment can determine the set area selected by the user according to the first control action parameter detected by the interactive area sent by the interactive device. Therefore, the display area required to be displayed by the virtual extension content can be selected by the user to meet the requirements of different users.
In some embodiments, the first manipulation parameters may include parameters such as the touch position corresponding to the touch operation, the type of the touch operation, the number of fingers used, the pressing pressure of the fingers, and the duration of the touch operation. The touch position corresponding to the touch operation may refer to the position of the manipulated area on the interaction area, for example touch coordinates in a plane coordinate system of the plane where the interaction area lies. The type of touch operation may include a click operation, a slide operation, a long-press operation, and the like. The number of fingers of the touch operation refers to the number of fingers performing the operation, that is, the number of areas pressed when the sensor of the interaction area detects the touch operation, for example 1 or 2. The finger pressing pressure refers to the pressure with which the touch operation is performed, that is, the magnitude of the pressure detected by the sensor of the interaction area, for example 0.5 N (newtons). The duration of the touch operation is the time during which the finger detected in the interaction area remains in contact with it, for example 1 s (second). Of course, the specific first manipulation parameters are not limited in this embodiment, and may also include other touch parameters, for example a sliding track or the click frequency of a click operation.
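For clarity, the first manipulation parameters listed above could be represented by a structure such as the following; the field names and units are assumptions of this sketch.

```python
from dataclasses import dataclass

@dataclass
class ManipulationParams:
    touch_position: tuple   # coordinates in the plane of the interaction area, e.g. (x, y)
    operation_type: str     # "click", "slide", "long_press", ...
    finger_count: int       # number of pressed regions detected, e.g. 1 or 2
    pressure: float         # pressing pressure in newtons, e.g. 0.5
    duration: float         # contact time in seconds, e.g. 1.0
```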
In some embodiments, the terminal device determines, according to the first manipulation parameter, a set area outside the interaction area that corresponds to that parameter. A selection list containing a plurality of set-area options may be displayed in the interaction area, together with a slider bar for setting the distance between the set area and the interaction area. After the interaction area detects a touch operation on the selection list or the slider bar, the interaction device can send the corresponding manipulation parameters to the terminal device, so that the terminal device can determine the set-area option selected by the user according to those parameters, for example determining from the first manipulation parameter the touch position at which the user selected a set-area option, or the length by which the user slid the slider bar.
In some embodiments, the obtaining of the first relative position relationship between the setting area and the interaction device may further include: and reading a first relative position relation between a preset area outside the pre-stored interaction area and the interaction device. It is to be understood that the position relationship between the display area required to be displayed by the virtual extension content and the interactive device may be fixed, for example, the setting area is in the left area and the right area outside the interactive area.
Of course, the specific manner of acquiring the first relative positional relationship between the setting region and the interaction device may not be limited in the embodiment of the present application.
Step S250: and acquiring a second relative position relation between the set area and the terminal equipment according to the relative spatial position information and the first relative position relation.
In this embodiment of the application, the terminal device may obtain, by using the terminal device as a reference object, a second relative positional relationship between the set region and the terminal device according to the relative spatial positional information between the interaction device and the terminal device and the first relative positional relationship between the set region and the interaction device, so that the spatial positional information of the set region in the real space may be obtained.
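Illustratively (a 4x4 homogeneous-transform representation is assumed here; the embodiments do not mandate one), the second relative positional relationship can be obtained by composing the two known transforms:

```python
import numpy as np

def second_relative_position(T_device_from_apparatus: np.ndarray,
                             T_apparatus_from_setarea: np.ndarray) -> np.ndarray:
    """Pose of the set area in the terminal-device frame, obtained from the
    relative spatial position information (device <- interaction device) and the
    first relative positional relationship (interaction device <- set area)."""
    return T_device_from_apparatus @ T_apparatus_from_setarea
```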
Step S260: and generating virtual extended content according to the second relative position relation and the first content data.
In this embodiment of the application, after determining the setting area where the virtual extension content needs to be displayed, the terminal device may acquire a rendering position of the virtual extension content in the virtual space according to a second relative position relationship between the setting area and the terminal device, and render the virtual content according to the rendering position. The specific steps for generating the virtual extension content may refer to the contents of the foregoing embodiments, and are not described herein again.
Step S270: when an interactive picture corresponding to the interactive content is displayed, the virtual extended content is displayed, a first display area of the interactive picture corresponds to the interactive area, and a second display area of the virtual extended content corresponds to a set area outside the interactive area.
In some embodiments, when the content to be displayed includes a plurality of menu pages, and the non-interactive content is an upper-level menu page or a lower-level menu page of the first menu page, displaying the virtual extension content when the interactive picture corresponding to the interactive content is displayed may include: displaying the virtual extension content when the interactive picture corresponding to the first menu page is displayed, the virtual extension content including a second menu page. Therefore, while the display area of the interactive picture corresponding to the first menu page corresponds to the interaction area, the user can see, through the head-mounted display device, the second menu page superimposed on the set area outside the interaction area, so that the user can simultaneously view the upper-level menu page and the current menu page corresponding to the interaction area, or the current menu page corresponding to the interaction area and the lower-level menu page. The interactive picture corresponding to the first menu page may be superimposed on the interaction area by the terminal device, or may be displayed by the interaction device through a touch screen in the interaction area.
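A hedged sketch of how the page kept in the interaction area and the page shown as extension content could be chosen from a multi-level menu follows; the page structure and helper name are assumptions rather than the patent's API:

```python
# Illustrative sketch: selecting which menu page stays in the interaction area
# and which becomes virtual extension content. Data layout is an assumption.
def split_menu_content(menu_pages: dict, current: str, direction: str = "upper"):
    """menu_pages maps a page id to {"parent": id or None, "children": [ids]}.
    direction chooses whether the upper-level or a lower-level page is extended.
    Returns (interactive_page_id, extension_page_id)."""
    node = menu_pages[current]
    if direction == "upper":
        extension = node.get("parent")
    else:
        children = node.get("children", [])
        extension = children[0] if children else None
    return current, extension  # the current page remains in the interaction area
```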
For example, referring to fig. 6, the interaction device 200 is a smart phone; when the smart phone enters the WLAN menu page 204, the terminal device may convert the corresponding upper-level menu page into virtual extension content 303 and display it in an area outside the touch screen of the smart phone, so that the user can view the upper-level menu page while viewing the WLAN menu page. For another example, referring to fig. 7, when the content to be displayed is a game interface, the terminal device may treat the game screen and the lower-level menu pages corresponding to the backpack option, the map option, and the like as non-interactive content, convert them into virtual extension content, and display the virtual extension content in an area outside the touch screen of the smart phone, so that the user can see, through the head-mounted display device, the virtual game screen 304, the virtual backpack content 305, and the virtual map content 306 in the area outside the touch screen of the smart phone in the real world.
In some embodiments, when the terminal device displays the virtual extension content in the set area outside the interaction area, the terminal device may display an icon corresponding to the virtual extension content on the interaction area to identify and distinguish the virtual extension content. Specifically, before displaying the virtual extended content when the interactive screen corresponding to the interactive content is displayed, the method for displaying the virtual content may further include:
acquiring second content data corresponding to the interactive content; acquiring icon data corresponding to the non-interactive content; and generating an interactive picture according to the second content data, the icon data and the relative spatial position information.
In some embodiments, when the interaction device does not have a display function, both the interactive content and the non-interactive content may be displayed by the terminal device. The terminal device can acquire, according to the non-interactive content in the content to be displayed, icon data of the icons corresponding to the non-interactive content, and can thus generate an interactive picture containing the icons and the interactive content according to the second content data corresponding to the interactive content, the icon data, and the relative spatial position information. The icon data may include model data for rendering the icons, and the second content data may include model data for rendering the interactive content. The terminal device can also acquire position information of the designated positions, in the interaction area of the interaction device, at which the icons and the interactive content are to be displayed, obtain the display positions of the icons and the interactive content according to the relative spatial position information and that position information, and generate and display an interactive picture containing the icons and the interactive content. The manner of generating the interactive picture may refer to the manner of generating the virtual extension content in the above embodiment, and details are not described herein again.
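The following sketch illustrates, under assumed data layouts, how an interactive picture containing both icons and interactive content might be assembled; it is not the claimed rendering pipeline:

```python
# Sketch under stated assumptions: assembling an interactive picture that places
# icons and interactive content at their designated positions in the interaction area.
def build_interactive_picture(interactive_items, icon_items, device_position):
    """interactive_items / icon_items: dicts with 'model' data and an 'offset'
    (designated position inside the interaction area, in metres);
    device_position: (x, y, z) of the interaction area relative to the terminal device."""
    px, py, pz = device_position
    picture = []
    for item in interactive_items + icon_items:
        ox, oy = item["offset"]
        # Display position = position of the interaction area plus the element's
        # designated position inside that area.
        picture.append({"model": item["model"], "position": (px + ox, py + oy, pz)})
    return picture
```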
As one mode, when the non-interactive content is an upper-level menu page, each icon in the corresponding icon list corresponds one-to-one to a menu item in the upper-level menu page, and the virtual extension content generated from the upper-level menu page may be information such as an explanation, a description, or a name for each icon. It should be understood that the icon data corresponding to the non-interactive content may be stored in the terminal device or downloaded from a server, which is not limited herein.
In some embodiments, the icons corresponding to the non-interactive content may be associated with controls, and an icon can be displayed directly over its control in an overlapping manner. A control may be a button control, an input box control, a list control, or the like; it may be displayed directly beneath the icon, and may be used to show or hide the virtual extension content, to move the virtual extension content, or to adjust the scale of the virtual extension content, which is not limited herein. Therefore, the user can trigger display control of the virtual extension content by operating the icon.
For example, referring to fig. 8, a virtual chat page 301 of a selected chat object is displayed on the interaction area 202 of the interaction device. The terminal device may generate a virtual icon option list 307 of user avatars and superimpose it on one side of the interaction area, and the user can select different icons in the virtual icon option list through touch operations on the interaction area to switch between the virtual chat pages of different chat objects. Meanwhile, the terminal device may obtain information corresponding to each user avatar, such as personal information (account, name, profile, etc.) or historical chat content, generate the virtual extension content 303, and display it outside the interaction area, where the personal information of each chat object may be displayed in correspondence with its icon. When the user drags the virtual icon option list, for example slides it upward, the user avatars in the list change accordingly, and the virtual extension content 303 corresponding to the avatars changes with them.
Similarly, when the non-interactive content is a lower-level menu page, each icon in the corresponding icon list can correspond one-to-one to a menu item in the lower-level menu page. In some embodiments, some menu items have no corresponding lower-level menu page; for example, the GPS setting has only two states, on and off, and no lower-level menu. The terminal device can display the icon and the button control corresponding to such an option on the interaction area, with the icon superimposed over the button control, so that the user can turn the GPS setting on and off through touch actions on the interaction area (such as clicking the icon).
Further, in some embodiments, when the interactive screen is generated according to the second content data, the icon data, and the relative spatial position information, the user may control the display of the virtual extension content by operating the icon. Specifically, the displaying the virtual extension content when the interactive screen corresponding to the interactive content is displayed may include:
displaying the interactive picture; and when the icon in the interactive picture is determined to be operated according to the third operation and control action parameter detected by the interactive area, displaying the virtual expanded content corresponding to the operated icon, wherein the second display area of the virtual expanded content corresponds to the set area, matched with the operated icon, outside the interactive area.
After the terminal device generates the interactive picture containing the icons and the interactive content, the interactive picture can be displayed, so that the user sees the interactive picture containing the icons and the interactive content superimposed on the interaction area. When it is determined, according to the third control action parameter detected in the interaction area, that the user has operated an icon in the interactive picture, the terminal device may display the virtual extension content corresponding to the operated icon, with the second display area of the virtual extension content corresponding to the set area outside the interaction area that matches the operated icon. Therefore, the user can control the display of virtual extension content in the set area outside the interaction area by performing touch operations on the icons in the interaction area. In some embodiments, the operation on an icon in the interactive picture may include, but is not limited to, clicking or moving the icon. The set area is matched with the icon: it can be arranged at the side of the icon according to the position of the icon in the interaction area, so that the user can quickly find the corresponding virtual extension content from the icon. For example, referring to fig. 6 again, when the icon list 205 is on the left side of the interaction area, the set area outside the interaction area corresponding to the virtual extension content 303 is also to the left of the icon list 205.
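A small sketch of the icon-to-set-area matching described above follows; the coordinate convention is an assumption:

```python
# Illustrative sketch: placing the set area on the side of the interaction area
# closest to the operated icon, so the extension content appears next to it.
def set_area_for_icon(icon_x: float, interaction_area_width: float) -> str:
    return "left" if icon_x < interaction_area_width / 2 else "right"
```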
In some embodiments, when the interaction device includes a touch screen, the interaction device may display content through the touch screen; therefore, while the terminal device displays the virtual extension content in the set area outside the interaction area, the interaction device may display the interactive picture including the icons and the interactive content through the touch screen. When the interaction area detects an operation on an icon, the interaction device may send to the terminal device a control instruction instructing it to control the virtual extension content, and the terminal device may perform the above display control on the virtual extension content according to the control instruction. For example, referring to fig. 6 again, when the wireless local area network menu page is displayed on the screen of the mobile phone, the terminal device displays the virtual extension content 303 (the upper-level menu page) outside the touch screen of the smart phone; at the same time, an icon list 205 corresponding one-to-one to the menu items of the upper-level menu page may be generated and displayed on the phone screen, and when the user clicks to select a different icon option, for example the WiFi icon in the icon list 205, the display jumps directly to the wireless local area network menu page 204 corresponding to that icon option.
In some embodiments, when the interactive picture corresponding to the interactive content is displayed, the terminal device may directly display the virtual extension content, so that the virtual extension content is displayed automatically without manual control by the user. After displaying the virtual extension content, the terminal device may also perform display control, such as moving and zooming, on the virtual extension content according to the user's control action parameters on an icon in the interaction area. For example, when the icon is moved from the left side of the interaction area to the right side, the second display area of the virtual extension content may move from the left area outside the interaction area to the right area outside the interaction area along with the icon.
Further, since the non-interactive content in the content to be displayed is displayed by the terminal device outside the interaction area, the display area of the interactive content may correspond to the entire interaction area, so that the position and proportion of the interactive content can be suitably optimized. For example, in a game application, controls are sometimes designed to be small so as not to block too much of the screen; when the non-interactive content of the game screen is displayed outside the interaction area, the interaction area may display only the controls, which can then be made larger, making them easier for the user to tap and improving the game experience. In some embodiments, the generating an interactive picture according to the second content data, the icon data, and the relative spatial position information may include:
acquiring control data from the second content data; determining a first arrangement position and a first proportion size of each control in the interactive picture according to the control data; determining a second arrangement position and a second proportion size of each icon in the interactive picture according to the icon data; and generating an interactive picture containing the controls and the icons according to the relative spatial position relationship, the first arrangement position and the first proportion size, and the second arrangement position and the second proportion size.
The interactive content may include a control for controlling the display of the non-interactive content, where the control may be a video "play", "pause", "progress bar" or other controls, and may also be a skill control, a displacement control, an equipment setting control, and the like in the game interface, which is not limited herein.
In some embodiments, the terminal device may obtain control data corresponding to controls in the interactive content, re-determine a first arrangement position and a first proportion size of each control in the interactive picture according to the control data, determine a second arrangement position and a second proportion size of each icon in the interactive picture according to the icon data, and generate the interactive picture including the controls and the icons according to the relative spatial position relationship, the first arrangement position and the first proportion size, and the second arrangement position and the second proportion size. The manner of generating the interactive screen including the control and the icon may refer to the manner of generating the virtual extended content in the above embodiment, and details are not described here. For example, referring to fig. 8, the terminal device may display a control for triggering display of a chat page of a corresponding chat object under the virtual icon option list 307 with a user avatar, so that the user may select different icons of the virtual icon option list through touch operation on the interaction area to trigger the corresponding control, thereby implementing switching of virtual chat pages of different chat objects.
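As an illustration only, a layout pass of this kind could assign arrangement positions and proportional sizes roughly as follows; the specific fractions are arbitrary assumptions:

```python
# Minimal layout sketch (assumed values, not the patented algorithm): once the
# non-interactive content is moved off the interaction area, controls can be
# arranged larger and icons placed along one edge.
def layout_interactive_picture(controls, icons, area_w, area_h):
    commands = []
    for i, ctrl in enumerate(controls):
        commands.append({"kind": "control", "id": ctrl["id"],
                         "pos": (0.10 * area_w, (0.20 + 0.15 * i) * area_h),
                         "scale": 0.25})   # first arrangement position / first proportion size
    for j, icon in enumerate(icons):
        commands.append({"kind": "icon", "id": icon["id"],
                         "pos": (0.85 * area_w, (0.10 + 0.10 * j) * area_h),
                         "scale": 0.08})   # second arrangement position / second proportion size
    return commands
```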
In other embodiments, when the interaction area includes a touch screen, the interactive picture containing the controls and the icons may also be generated by the interaction device. For example, referring to fig. 6 again, the mobile phone may display, under the generated icon list 205, the control corresponding to each menu item in the upper-level menu page, with the controls corresponding one-to-one to the icons, so that when the user clicks to select a different icon option, the control corresponding to that icon is triggered directly to perform the corresponding function; for example, clicking the WiFi icon in the icon list 205 directly triggers the corresponding control and jumps into the corresponding WLAN menu page 204. For another example, referring to fig. 7, in a game application, when the terminal device displays the virtual game screen 304 in 3D above the smartphone screen and displays content such as the virtual backpack content 305 and the virtual map content 306 on the left and right sides of the screen, the corresponding operation controls (including a backpack control for triggering display of the virtual backpack content 305 and a map control for triggering display of the virtual map content 306) may be displayed through the touch screen, which is convenient for the user to tap and improves the user's game experience.
Further, when generating the interactive screen containing the controls and the icons, after displaying the virtual extension content, the method further includes:
and when the control in the interactive picture is determined to be operated according to the fourth control action parameter detected in the interactive area, generating a control instruction according to the operated control, and controlling the virtual object in the virtual expanded content according to the control instruction.
In some embodiments, the controls in the interactive content may correspond to the non-interactive content. For example, when the non-interactive content in a game interface is the game picture and the control is a displacement control, the user can switch the game picture by operating the displacement control. Therefore, when the virtual extension content is generated from the non-interactive content, the terminal device can control the virtual extension content according to the control action parameters detected in the interaction area. Specifically, when it is determined, according to the fourth control action parameter detected in the interaction area, that a control in the interactive picture has been operated, the terminal device may generate a control instruction according to the operated control and control the virtual object in the virtual extension content according to that instruction, thereby implementing interaction between the interaction device and the terminal device. For example, referring to fig. 7 again, the terminal device may switch the content of the virtual game screen 304 in real time according to the user's operation of the displacement control on the smartphone screen.
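A hedged sketch of how an operated control could be turned into a control instruction for a virtual object follows; the instruction names are assumptions:

```python
# Sketch only: mapping an operated control to a control instruction that acts on
# a virtual object in the virtual extension content. Names are assumptions.
def handle_control_operation(control_id: str, pending_instructions: list) -> dict:
    instruction_table = {
        "move_left":  {"target": "virtual_game_screen", "action": "pan", "dx": -1},
        "move_right": {"target": "virtual_game_screen", "action": "pan", "dx": +1},
        "backpack":   {"target": "virtual_backpack",    "action": "toggle"},
        "map":        {"target": "virtual_map",         "action": "toggle"},
    }
    instruction = instruction_table.get(control_id, {"action": "none"})
    pending_instructions.append(instruction)   # consumed by the rendering loop
    return instruction
```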
Further, when the terminal device displays multiple pieces of virtual extension content, the display areas may be prioritized, for example according to the display trigger time of each piece of virtual extension content, with content that was triggered earlier receiving a lower priority. In one embodiment, the terminal device may display high-priority virtual extension content above the interaction area, and low-priority virtual extension content on the left and right sides of the interaction area.
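The priority ordering described here could, purely as a sketch, look like the following, where later-triggered content wins the preferred area:

```python
# Illustrative sketch: assigning display areas by trigger time; the slot names
# ("top", "left", "right") and the ordering rule are assumptions.
def assign_display_areas(extensions):
    """extensions: list of dicts with 'id' and 'triggered_at' (timestamp).
    The most recently triggered content gets the area above the interaction
    area; earlier content falls back to the left and right areas."""
    ordered = sorted(extensions, key=lambda e: e["triggered_at"], reverse=True)
    slots = ["top", "left", "right"]
    return {e["id"]: slots[min(i, len(slots) - 1)] for i, e in enumerate(ordered)}
```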
In some embodiments, the terminal device may store the display order of the virtual extension contents, so that when the display of the current virtual extension content is exited, the virtual extension content displayed before it can be shown again automatically. In one embodiment, a progress bar for the multiple virtual extension contents may be displayed in the interaction area, and the user can switch among the previously displayed virtual extension contents by sliding the progress bar.
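A minimal sketch of this history behaviour, assuming a simple stack, is shown below:

```python
# Sketch of the assumed history behaviour: exiting the current virtual extension
# content automatically restores the one displayed before it; a progress-bar
# style index can jump to any previously displayed content.
class ExtensionDisplayHistory:
    def __init__(self):
        self._stack = []

    def show(self, content_id: str) -> str:
        self._stack.append(content_id)
        return content_id

    def exit_current(self):
        if self._stack:
            self._stack.pop()
        return self._stack[-1] if self._stack else None   # content to redisplay

    def seek(self, index: int):
        # Switch among previously displayed contents, e.g. driven by a slider.
        return self._stack[index] if 0 <= index < len(self._stack) else None
```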
According to the method for displaying virtual content provided in the embodiment of the application, first content data corresponding to the non-interactive content in the content to be displayed is acquired; relative spatial position information between the interaction device and the terminal device is then acquired; a first relative position relationship between a set area and the interaction device is acquired, the set area being the overlay area in the real environment corresponding to the virtual extension content when it is displayed; and a second relative position relationship between the set area and the terminal device is acquired according to the relative spatial position information and the first relative position relationship, so that the virtual extension content can be generated according to the second relative position relationship and the first content data and displayed when the interactive picture corresponding to the interactive content is displayed. In this way, the virtual extension content corresponding to the non-interactive content in the content to be displayed is displayed in the virtual space according to the spatial position of the interaction device, the user sees the virtual extension content displayed outside the interaction area while the displayed interactive content corresponds to the interaction area, the display space of the displayed content is enlarged, the display effect is improved, and interaction between the user and the displayed content is facilitated.
Referring to fig. 9, a block diagram of a display apparatus 500 for virtual content according to an embodiment of the present application is shown, and is applied to a terminal device, where the terminal device is communicatively connected to an interaction apparatus, and the interaction apparatus includes an interaction area. The apparatus may include: a content acquisition module 510, a data acquisition module 520, a location acquisition module 530, a content generation module 540, and a display control module 550. The content obtaining module 510 is configured to obtain content to be displayed, where the content to be displayed includes non-interactive content and interactive content; the data obtaining module 520 is configured to obtain first content data corresponding to the non-interactive content; the position obtaining module 530 is configured to obtain relative spatial position information between the interaction apparatus and the terminal device; the content generating module 540 is configured to generate a virtual extension content according to the relative spatial position information and the first content data; the display control module 550 is configured to display the virtual extension content when the interactive screen corresponding to the interactive content is displayed, where a first display area of the interactive screen corresponds to the interactive area, and a second display area of the virtual extension content corresponds to the set area outside the interactive area.
In some embodiments, the content generation module 540 may be specifically configured to: acquiring a first relative position relation between a set area and an interaction device, wherein the set area is a corresponding overlapping area in a real environment when virtual extended content is displayed; acquiring a second relative position relation between the set area and the terminal equipment according to the relative spatial position information and the first relative position relation; and generating virtual extended content according to the second relative position relation and the first content data.
Further, the obtaining, by the content generation module 540, of the first relative position relationship between the set area and the interaction device may include: determining, according to a non-holding area of the interaction device, a set area outside the interaction area corresponding to the non-holding area, and acquiring a first relative position relationship between the set area and the interaction device, where the non-holding area is an area of the edge area of the interaction device that is not being held; or determining, according to the first control action parameter detected in the interaction area, a set area outside the interaction area corresponding to the first control action parameter, and acquiring a first relative position relationship between the set area and the interaction device; or reading a pre-stored first relative position relationship between a preset area outside the interaction area and the interaction device.
Further, the determining, by the content generation module 540, of a set area outside the interaction area corresponding to the non-holding area according to the non-holding area of the interaction device, and the obtaining of the first relative position relationship between the set area and the interaction device, where the non-holding area is an area of the edge area of the interaction device that is not being held, may include: when the interaction device is detected to be in a held state, acquiring a gesture image; determining the non-holding area of the interaction device according to the gesture image; and acquiring a set area outside the interaction area corresponding to the non-holding area, and acquiring a first relative position relationship between the set area and the interaction device.
In some embodiments, the content to be displayed includes a plurality of menu pages, and the data obtaining module 520 may be specifically configured to: according to the multiple menu pages, first data of a second menu page corresponding to the first menu page is obtained, and the first data is used as first content data corresponding to non-interactive content, wherein the second menu page is an upper-level menu page or a lower-level menu page of the first menu page. The display control module 550 may be specifically configured to: and when the interactive picture corresponding to the first menu page is displayed, displaying virtual extension content, wherein the virtual extension content comprises a second menu page.
Further, the second menu page is a next-level menu page of the first menu page, and the display apparatus 500 for virtual content may further include: and a content switching module. The content switching module is used for determining a menu item in a selected state in the first menu page according to a second control action parameter detected by the interaction area, and acquiring second data of a new next-level menu page corresponding to the menu item in the selected state; generating new virtual extension content according to the relative spatial position information and the second data; and switching the currently displayed virtual extended content into new virtual extended content, wherein the new virtual extended content comprises a new next-level menu page.
In some embodiments, the interaction area includes a touch screen, and the content obtaining module 510 may be specifically configured to: sending a data request to an interactive device, wherein the data request is used for indicating the interactive device to acquire contents to be displayed on a touch screen; and receiving the content to be displayed sent by the interactive device.
In some embodiments, the display device 500 of the virtual content may further include: and a picture generation module. The picture generation module is used for acquiring second content data corresponding to the interactive content; acquiring icon data corresponding to the non-interactive content; and generating an interactive picture according to the second content data, the icon data and the relative spatial position information. The display control module 550 may be specifically configured to: displaying the interactive picture; and when the icon in the interactive picture is determined to be operated according to the third operation and control action parameter detected by the interactive area, displaying the virtual expanded content corresponding to the operated icon, wherein the second display area of the virtual expanded content corresponds to the set area, matched with the operated icon, outside the interactive area.
In some embodiments, the generating the interactive screen by the screen generating module according to the second content data, the icon data and the relative spatial position information may include: acquiring control data from the second content data; determining a first arrangement position and a first proportion size of each control in the interactive picture according to the control data; determining a second arrangement position and a second proportion size of each icon in the interactive picture according to the icon data; and generating an interactive picture containing the controls and the icons according to the relative spatial position relationship, the first arrangement position and the first proportion size, and the second arrangement position and the second proportion size. The display apparatus 500 of the virtual content may further include: and a content control module. The content control module is used for generating a control instruction according to the operated control when the control in the interactive picture is determined to be operated according to the fourth control action parameter detected in the interactive area, and controlling the virtual object in the virtual expanded content according to the control instruction.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the coupling or direct coupling or communication connection between the modules shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or modules may be in an electrical, mechanical or other form.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
To sum up, with the display apparatus of virtual content provided in the embodiment of the application, the terminal device acquires content to be displayed, where the content to be displayed includes non-interactive content and interactive content; acquires first content data corresponding to the non-interactive content; acquires relative spatial position information between the interaction device and the terminal device; and generates virtual extension content according to the relative spatial position information and the first content data. When the interactive picture corresponding to the interactive content is displayed, the virtual extension content is displayed, where the first display area of the interactive picture corresponds to the interaction area of the interaction device and the second display area of the virtual extension content corresponds to a set area outside the interaction area. In this way, the virtual extension content corresponding to the non-interactive content in the content to be displayed is displayed in the virtual space according to the spatial position of the interaction device, so that the user sees the virtual extension content displayed outside the interaction area while the displayed interactive content corresponds to the interaction area, which enlarges the display space of the displayed content, improves the display effect, and facilitates interaction between the user and the displayed content.
Referring to fig. 1 again, an embodiment of the present application provides a display system 10 for virtual content, the system includes a terminal device 100 and an interaction apparatus 200, the terminal device 100 is connected to the interaction apparatus 200, the interaction apparatus 200 includes an interaction area 202, where:
the interaction device 200 is configured to obtain first content data corresponding to non-interactive content in the content to be displayed according to the content to be displayed in the interaction area 202, where the content to be displayed includes the non-interactive content and the interactive content, send the first content data to the terminal device 100, and control the interaction area 202 to display an interaction picture corresponding to the interactive content;
the terminal device 100 is configured to obtain relative spatial position information between the interaction apparatus 200 and the terminal device 100, receive the first content data, generate a virtual extended content according to the relative spatial position information and the first content data, and display the virtual extended content, where a display area of the virtual extended content corresponds to a set area outside the interaction area 202.
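Purely as an illustrative sketch of this division of work (the message format and helper names are assumptions, not a defined protocol), the two sides of the system could behave as follows:

```python
# Hedged sketch of the split of work in the display system; not a defined protocol.
def interaction_apparatus_step(content_to_display, send):
    """Runs on the interaction apparatus 200: forwards the non-interactive part
    and keeps the interactive part for its own interaction area."""
    send({"type": "first_content_data",
          "data": content_to_display["non_interactive"]})
    return content_to_display["interactive"]

def terminal_device_step(message, relative_position):
    """Runs on the terminal device 100: builds virtual extension content anchored
    to a set area outside the interaction area."""
    if message.get("type") != "first_content_data":
        return None
    return {"content": message["data"],
            "display_area": {"anchor": "outside_interaction_area",
                             "relative_position": relative_position}}
```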
In some embodiments, the terminal device 100 in the above embodiments may be an externally connected (plug-in) head-mounted display device that is connected to the interaction device. In this case, the head-mounted display device may only perform the display of virtual content such as the virtual extension content and the acquisition of the marker image, while all processing operations related to displaying and controlling the virtual extension content are completed by the interaction device 200; after the interaction device 200 generates the virtual extension content, the display picture corresponding to the virtual extension content is transmitted to the head-mounted display device, which then completes the display of the virtual extension content.
Referring to fig. 10, a block diagram of a terminal device according to an embodiment of the present application is shown. The terminal device 100 may be a terminal device capable of running an application, such as a smart phone, a tablet computer, a head-mounted display device, and the like. The terminal device 100 in the present application may include one or more of the following components: a processor 110, a memory 120, an image sensor 130, and one or more applications, wherein the one or more applications may be stored in the memory 120 and configured to be executed by the one or more processors 110, the one or more programs configured to perform the methods as described in the aforementioned method embodiments.
Processor 110 may include one or more processing cores. The processor 110 connects various parts of the entire terminal device 100 using various interfaces and lines, and performs the various functions of the terminal device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and by calling data stored in the memory 120. Alternatively, the processor 110 may be implemented in hardware using at least one of a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), and a Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is responsible for rendering and drawing display content; and the modem handles wireless communication. It is understood that the modem may also not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The Memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the method embodiments described above, and the like. The data storage area may also store data created by the terminal device 100 in use, and the like.
In the embodiment of the present application, the image sensor 130 is used for capturing images of real objects and capturing scene images of a target scene. The image sensor 130 may be an infrared camera or a visible light camera, and the specific type is not limited in the embodiment of the present application.
In one embodiment, the terminal device is a head-mounted display device, and may further include one or more of the following components in addition to the processor, the memory, and the image sensor described above: a display module, an optical module, a communication module, and a power supply.
The display module may include a display device and a display control unit. The display control unit is used to receive the display image of the virtual content rendered by the processor and then display and project the display image onto the optical module, so that the user can view the virtual content through the optical module. The display device may be a display screen or a projection apparatus for displaying images.
The optical module may adopt an off-axis optical system or a waveguide optical system; the display image shown by the display device can be projected to the user's eyes after passing through the optical module, so the user sees the display image projected by the display device through the optical module. In some embodiments, the user can also observe the real environment through the optical module and experience the augmented reality effect in which the virtual content is superimposed on the real environment.
The communication module can be a module such as Bluetooth, WiFi (Wireless Fidelity), or ZigBee, and the head-mounted display device can be communicatively connected with the terminal device through the communication module. The head-mounted display device that is communicatively connected with the terminal device can exchange information and instructions with the terminal device. For example, the head-mounted display device may receive image data transmitted from the terminal device via the communication module, and generate and display virtual content of a virtual world from the received image data.
The power supply can supply power to the entire head-mounted display device, ensuring the normal operation of each of its components.
Referring to fig. 11, a block diagram of an interaction apparatus according to an embodiment of the present disclosure is shown. The interaction device 200 may be an electronic device such as a smart phone or a tablet computer having an interaction area, and the interaction area may include a touch pad or a touch screen. The interaction device 200 may include one or more of the following components: a processor 210, a memory 220, and one or more applications, wherein the one or more applications may be stored in the memory 220 and configured to be executed by the one or more processors 210, the one or more programs configured to perform a method as described in the aforementioned method embodiments.
Referring to fig. 12, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable storage medium 800 has stored therein program code that can be invoked by a processor to perform the methods described in the method embodiments above.
The computer-readable storage medium 800 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. Alternatively, the computer-readable storage medium 800 includes a non-volatile computer-readable storage medium. The computer readable storage medium 800 has storage space for program code 810 to perform any of the method steps of the method described above. The program code can be read from or written to one or more computer program products. The program code 810 may be compressed, for example, in a suitable form.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not necessarily depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (13)

1. A method for displaying virtual content is applied to a terminal device, the terminal device is in communication connection with an interaction device, the interaction device comprises an interaction area, and the method comprises the following steps:
acquiring content to be displayed, wherein the content to be displayed comprises non-interactive content and interactive content;
acquiring first content data corresponding to the non-interactive content;
acquiring relative spatial position information between the interaction device and the terminal equipment;
generating virtual extended content according to the relative spatial position information and the first content data;
and when an interactive picture corresponding to the interactive content is displayed, displaying the virtual extended content, wherein a first display area of the interactive picture corresponds to the interactive area, and a second display area of the virtual extended content corresponds to a set area outside the interactive area.
2. The method according to claim 1, wherein the generating virtual extension content according to the relative spatial position information and the first content data comprises:
acquiring a first relative position relation between a set area and the interaction device, wherein the set area is a corresponding overlapping area in a real environment when virtual extended content is displayed;
acquiring a second relative position relation between the set area and the terminal equipment according to the relative spatial position information and the first relative position relation;
and generating the virtual extension content according to the second relative position relation and the first content data.
3. The method of claim 2, wherein obtaining the first relative position relationship between the setting area and the interactive device comprises:
determining a set area corresponding to the non-holding area outside the interaction area according to the non-holding area of the interaction device, and acquiring a first relative position relationship between the set area and the interaction device, wherein the non-holding area is an area which is not held in the edge area of the interaction device; or
Determining a set area corresponding to the first control action parameter outside the interaction area according to the first control action parameter detected in the interaction area, and acquiring a first relative position relation between the set area and the interaction device; or
And reading a first relative position relation between a preset area outside the interaction area and the interaction device, which is stored in advance.
4. The method according to claim 3, wherein the determining, according to a non-holding area of the interaction device, a setting area outside the interaction area and corresponding to the non-holding area, and obtaining a first relative positional relationship between the setting area and the interaction device comprises:
when the interaction device is detected to be in a held state, acquiring a gesture image;
determining a non-holding area of the interaction device according to the gesture image;
and acquiring a set area corresponding to the non-holding area outside the interaction area, and acquiring a first relative position relation between the set area and the interaction device.
5. The method according to claim 1, wherein the content to be displayed includes a plurality of menu pages, and the obtaining first content data corresponding to the non-interactive content includes:
according to the plurality of menu pages, acquiring first data of a second menu page corresponding to a first menu page, and taking the first data as first content data corresponding to the non-interactive content, wherein the second menu page is an upper-level menu page or a lower-level menu page of the first menu page;
when the interactive picture corresponding to the interactive content is displayed, displaying the virtual extended content, including:
and when the interactive picture corresponding to the first menu page is displayed, displaying the virtual extension content, wherein the virtual extension content comprises the second menu page.
6. The method according to claim 5, wherein the second menu page is a menu page at a next level of the first menu page, and the virtual extension content is displayed when the interactive screen corresponding to the first menu page is displayed, and after the virtual extension content includes the second menu page, the method further includes:
determining a menu item in a selected state in the first menu page according to a second control action parameter detected in the interaction area, and acquiring second data of a new next-level menu page corresponding to the menu item in the selected state;
generating new virtual extension content according to the relative spatial position information and the second data;
and switching the currently displayed virtual expanded content into the new virtual expanded content, wherein the new virtual expanded content comprises the new next-level menu page.
7. The method according to any one of claims 1-6, wherein the interaction area comprises a touch screen, and the obtaining the content to be displayed comprises:
sending a data request to the interactive device, wherein the data request is used for indicating the interactive device to acquire the content to be displayed on the touch screen;
and receiving the content to be displayed sent by the interaction device.
8. The method according to any one of claims 1 to 6, wherein before displaying the virtual extension content when the interactive screen corresponding to the interactive content is displayed, the method further comprises:
acquiring second content data corresponding to the interactive content;
acquiring icon data corresponding to the non-interactive content;
generating an interactive picture according to the second content data, the icon data and the relative spatial position information;
when the interactive picture corresponding to the interactive content is displayed, displaying the virtual extended content, including:
displaying the interactive picture;
and when the icon in the interactive picture is determined to be operated according to the third operation and control action parameter detected in the interactive area, displaying virtual expanded content corresponding to the operated icon, wherein a second display area of the virtual expanded content corresponds to a set area matched with the operated icon outside the interactive area.
9. The method according to claim 8, wherein the generating an interactive screen based on the second content data, the icon data, and the relative spatial position information comprises:
acquiring control data from the second content data;
determining a first arrangement position and a first proportion size of each control in the interactive picture according to the control data;
determining a second arrangement position and a second proportion size of each icon in the interactive picture according to the icon data;
generating an interactive picture containing the control and the icon according to the relative spatial position relationship, the first arrangement position and the first proportion size, and the second arrangement position and the second proportion size;
after the displaying the virtual extension content, further comprising:
and when determining to operate the control in the interactive picture according to the fourth control action parameter detected in the interactive area, generating a control instruction according to the operated control, and controlling the virtual object in the virtual extended content according to the control instruction.
10. A virtual content display device is applied to a terminal device, the terminal device is in communication connection with an interaction device, the interaction device comprises an interaction area, and the device comprises:
the content acquisition module is used for acquiring content to be displayed, and the content to be displayed comprises non-interactive content and interactive content;
the data acquisition module is used for acquiring first content data corresponding to the non-interactive content;
the position acquisition module is used for acquiring relative spatial position information between the interaction device and the terminal equipment;
the content generating module is used for generating virtual extended content according to the relative spatial position information and the first content data;
and the display control module is used for displaying the virtual extension content when an interactive picture corresponding to the interactive content is displayed, wherein a first display area of the interactive picture corresponds to the interactive area, and a second display area of the virtual extension content corresponds to a set area outside the interactive area.
11. A display system of virtual content, the system comprising a terminal device and an interaction apparatus, the terminal device being in communication connection with the interaction apparatus, the interaction apparatus comprising an interaction area, wherein:
the interaction device is used for acquiring first content data corresponding to non-interactive content in the content to be displayed according to the content to be displayed in the interaction area, sending the first content data to the terminal equipment, and controlling the interaction area to display an interaction picture corresponding to the interactive content, wherein the content to be displayed comprises the non-interactive content and the interactive content;
the terminal device is configured to acquire relative spatial position information between the interaction device and the terminal device, receive the first content data, generate virtual extended content according to the relative spatial position information and the first content data, and display the virtual extended content, where a display area of the virtual extended content corresponds to a set area outside the interaction area.
12. A terminal device, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the method of any of claims 1-9.
13. A computer-readable storage medium, having stored thereon program code that can be invoked by a processor to perform the method according to any one of claims 1 to 9.
CN201910377282.XA 2019-05-07 2019-05-07 Virtual content display method, device, system, terminal equipment and storage medium Pending CN111913674A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910377282.XA CN111913674A (en) 2019-05-07 2019-05-07 Virtual content display method, device, system, terminal equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111913674A (en) 2020-11-10

Family

ID=73241868

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910377282.XA Pending CN111913674A (en) 2019-05-07 2019-05-07 Virtual content display method, device, system, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111913674A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150277713A1 (en) * 2014-04-01 2015-10-01 International Business Machines Corporation Expanding touch zones of graphical user interface widgets displayed on a screen of a device without programming changes
CN105808147A (en) * 2016-05-10 2016-07-27 安徽大学 Wireless replication and expanded display interaction method and system
CN108310768A (en) * 2018-01-16 2018-07-24 腾讯科技(深圳)有限公司 The display methods and device of virtual scene, storage medium, electronic device
CN109496293A (en) * 2018-10-12 2019-03-19 北京小米移动软件有限公司 Extend content display method, device, system and storage medium
CN109460170A (en) * 2018-10-23 2019-03-12 努比亚技术有限公司 Screen extension and exchange method, terminal and computer readable storage medium

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023024871A1 (en) * 2021-08-24 2023-03-02 亮风台(上海)信息科技有限公司 Interface interaction method and device
CN115942022A (en) * 2021-08-27 2023-04-07 中移(苏州)软件技术有限公司 Information preview method, related equipment and storage medium
CN114415907A (en) * 2022-01-21 2022-04-29 腾讯科技(深圳)有限公司 Media resource display method, device, equipment and storage medium
CN114415907B (en) * 2022-01-21 2023-08-18 腾讯科技(深圳)有限公司 Media resource display method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination