CN113923435B - Information display method, equipment and storage medium - Google Patents

Information display method, equipment and storage medium

Info

Publication number
CN113923435B
CN113923435B (application CN202110988976.4A)
Authority
CN
China
Prior art keywords
target
video material
virtual
viewing angle
dimensional space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110988976.4A
Other languages
Chinese (zh)
Other versions
CN113923435A (en)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Chengshi Wanglin Information Technology Co Ltd
Original Assignee
Beijing Chengshi Wanglin Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Chengshi Wanglin Information Technology Co Ltd filed Critical Beijing Chengshi Wanglin Information Technology Co Ltd
Priority to CN202110988976.4A priority Critical patent/CN113923435B/en
Publication of CN113923435A publication Critical patent/CN113923435A/en
Application granted granted Critical
Publication of CN113923435B publication Critical patent/CN113923435B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/275Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/293Generating mixed stereoscopic images; Generating mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the application provide an information display method, an information display device, and a storage medium. In the embodiments of the application, a target video material can be associated with a target viewing angle in a virtual three-dimensional space of a target space, and a preset viewing-angle range can be configured for the target viewing angle; the target video material takes the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame. On this basis, a trigger control for the target video material can be displayed on the graphical user interface; in response to a trigger instruction detected on the trigger control of the target video material, the current viewing angle is switched to the target viewing angle, and the target video material is acquired and played, so that the virtual three-dimensional space picture corresponding to the target viewing angle is displayed in linkage with the target video material. In this way, a dynamic picture experience can be provided in the virtual three-dimensional space on the basis of video materials, which enriches the content of the virtual three-dimensional space and enhances the user's sense of surprise, and adding the video materials does not compromise the sense of realism.

Description

Information display method, equipment and storage medium
Technical Field
The present application relates to the field of VR technologies, and in particular, to an information display method, device, and storage medium.
Background
In a panoramic roaming scene, a plurality of roaming points can be arranged in advance in a virtual space. By switching between roaming points, a user can adjust his or her position and viewing angle in the virtual space, simulating the experience of walking through the space and thereby achieving panoramic roaming.
At present, when a user roams in a virtual space by way of such roaming points, what the user sees is limited to static view-angle pictures, which makes the panoramic roaming process tedious and the experience poor.
Disclosure of Invention
Aspects of the present application provide an information presentation method, a device, and a storage medium, so as to improve content richness in a panoramic roaming scene without compromising the sense of realism.
An embodiment of the application provides an information display method in which a graphical user interface is provided through an electronic terminal. The content displayed by the graphical user interface includes a virtual three-dimensional space of a target space; a target video material is associated with a target viewing angle of the virtual three-dimensional space, and a preset viewing-angle range is configured for the target viewing angle, the target video material taking the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame. The method includes the following steps:
in response to detecting that the current viewing angle is within the preset viewing-angle range of the target viewing angle, displaying a trigger control for the target video material on the graphical user interface; and
in response to detecting a trigger instruction generated on the trigger control of the target video material, switching the current viewing angle to the target viewing angle, and acquiring and playing the target video material, so that the virtual three-dimensional space picture corresponding to the target viewing angle is displayed in linkage with the target video material.
An embodiment of the application also provides an electronic terminal, which includes a memory, a processor, and a display component;
the memory is configured to store one or more computer instructions;
the display component is configured to provide a graphical user interface, the content displayed by the graphical user interface including a virtual three-dimensional space of a target space, where a target video material is associated with a target viewing angle of the virtual three-dimensional space and a preset viewing-angle range is configured for the target viewing angle, the target video material taking the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame;
the processor, coupled with the memory and the display component, is configured to execute the one or more computer instructions to:
in response to detecting that the current viewing angle is within the preset viewing-angle range of the target viewing angle, display a trigger control for the target video material on the graphical user interface; and
in response to detecting a trigger instruction generated on the trigger control of the target video material, switch the current viewing angle to the target viewing angle, and acquire and play the target video material, so that the virtual three-dimensional space picture corresponding to the target viewing angle is displayed in linkage with the target video material.
Embodiments of the present application also provide a computer-readable storage medium storing computer instructions which, when executed by one or more processors, cause the one or more processors to perform the information presentation method described above.
In the embodiments of the application, a target video material can be associated with a target viewing angle in a virtual three-dimensional space of a target space, the target video material taking the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame; a preset viewing-angle range can also be configured for the target viewing angle. On this basis, a trigger control for the target video material can be displayed on the graphical user interface in response to detecting that the current viewing angle is within the preset viewing-angle range of the target viewing angle; in response to a trigger instruction generated on the trigger control of the target video material, the current viewing angle is switched to the target viewing angle, and the target video material is acquired and played, so that the virtual three-dimensional space picture corresponding to the target viewing angle is displayed in linkage with the target video material. In this way, a dynamic picture experience can be provided in the virtual three-dimensional space on the basis of video materials, which enriches the content of the virtual three-dimensional space, enhances the user's sense of surprise, and brings the experience closer to the user's perception of the real space; moreover, the video material joins smoothly with the view-angle picture in the virtual three-dimensional space, sparing the user an abrupt, disjointed browsing experience, so that adding the video material does not compromise the sense of realism.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart of an information presentation method according to an exemplary embodiment of the present application;
FIG. 2 is a schematic view of a scenario provided by an exemplary embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic terminal according to another exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
At present, the panoramic roaming process is relatively monotonous and the experience is lacking. To this end, in some embodiments of the present application, a dynamic picture experience can be provided in the virtual three-dimensional space on the basis of video materials, which enriches the content of the virtual three-dimensional space, enhances the user's sense of surprise, and brings the experience closer to the user's perception of the real space; moreover, the video material joins smoothly with the view-angle picture in the virtual three-dimensional space, sparing the user an abrupt, disjointed browsing experience, so that adding the video material does not compromise the sense of realism.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of an information presentation method according to an exemplary embodiment of the present application. The method may be performed by an information presentation apparatus, which may be implemented as software, hardware, or a combination thereof, and which may be integrated in an electronic terminal.
In this embodiment, a graphical user interface may be provided through the electronic terminal, and the content displayed by the graphical user interface may include a virtual three-dimensional space of a target space. A target viewing angle of the virtual three-dimensional space is associated with a target video material, and the target viewing angle is configured with a preset viewing-angle range, the target video material taking the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame. The target viewing angle may be any viewing angle in the virtual three-dimensional space.
Based on this, referring to fig. 1, the information displaying method provided in this embodiment may include:
step 100, in response to detecting that the current viewing angle is within the preset viewing-angle range of the target viewing angle, displaying a trigger control for the target video material on the graphical user interface;
step 101, in response to detecting a trigger instruction generated on the trigger control of the target video material, switching the current viewing angle to the target viewing angle, and acquiring and playing the target video material, so that the virtual three-dimensional space picture corresponding to the target viewing angle is displayed in linkage with the target video material.
The information presentation method provided by this embodiment can be applied to panoramic roaming scenes, adding video materials to the panoramic roaming scene so as to improve the richness of content during panoramic roaming without compromising the sense of realism, and to bring the user more moments of pleasant surprise.
In this embodiment, the electronic terminal may be a mobile phone, a computer, a tablet computer, or another terminal device that displays the virtual three-dimensional space of the target space to the user. The virtual three-dimensional space may be a space model generated by spatially modeling the target space. Within it, the user can continuously switch the roaming viewing angle through operations such as clicking a roaming point or adjusting the roaming viewing angle, and can view the corresponding virtual three-dimensional space picture at different roaming viewing angles, thereby obtaining a panoramic roaming experience.
In addition, the target space may vary across application fields. For example, in the VR house-source (property listing) field, the target space may be a house space. In the VR automotive field, the target space may be the exterior or interior space of a car. In the VR store field, the target space may be a store environment. Of course, these are only examples, and this embodiment does not limit the specification, position, type, or other attributes of the target space.
In this embodiment, the video material may be prepared in advance and associated with the virtual three-dimensional space. The following describes the process of preparing video material.
Taking the target viewing angle in the virtual three-dimensional space as an example: in this embodiment, the target video material may be produced with the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame, and the target video material is then associated with the target viewing angle. In this way, the target video material opens on the virtual three-dimensional space picture corresponding to the target viewing angle, and video content can be added to subsequent frames as needed.
For example, if the target space belongs to the VR house-source field, house-source introduction content may be added to the video material; for instance, the video content may show an instructor introducing details of the house while walking through the target space. In this example, the start frame of the target video material may be the virtual three-dimensional space picture corresponding to the target viewing angle, and in the second frame the instructor may walk out from a door in that picture.
After the target video material is produced, it can be associated with the target viewing angle. In practice, a video-material information base may be maintained that records the associations between video materials and viewing angles; adding the association between the target video material and the target viewing angle to this information base achieves the purpose of associating the material with the angle. The information base then serves as the reference for retrieving video materials when the information presentation scheme is carried out. Of course, other ways of associating the target video material with the target viewing angle may also be adopted; this embodiment is not limited to the information-base approach.
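The patent gives the information base no concrete form. Purely as an illustration, under the assumption that viewing angles and video materials are identified by string ids (all names below are hypothetical), such an association table might be sketched as:

```python
class MaterialInfoBase:
    """Hypothetical information base mapping viewing angles to video materials."""

    def __init__(self):
        self._by_angle = {}  # viewing-angle id -> associated video-material id

    def associate(self, viewing_angle_id, material_id):
        """Record the association between a viewing angle and a video material."""
        self._by_angle[viewing_angle_id] = material_id

    def material_for(self, viewing_angle_id):
        """Return the associated material id, or None if none is configured."""
        return self._by_angle.get(viewing_angle_id)


base = MaterialInfoBase()
base.associate("living_room_north", "material_A")
print(base.material_for("living_room_north"))  # material_A
print(base.material_for("kitchen_east"))       # None
```

The scheme consults this table while the user roams, which is why the patent calls it "a reference for calling video materials".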
In this way, compared with a static virtual three-dimensional space picture, the video material can provide richer and more flexible picture content and simulate a dynamic picture experience for the user. As in the above example, the user can watch the instructor walk out of the virtual three-dimensional space picture corresponding to the target viewing angle and introduce the details of the house source.
The above describes the preparation of the target video material associated with the target viewing angle. It should be understood that the same process may be used for video materials associated with other viewing angles in the virtual three-dimensional space, and the details are not repeated here.
In this way, a virtual three-dimensional space with at least one associated video material can be generated.
In this embodiment, a preset viewing-angle range may also be configured for each viewing angle associated with a video material; the target viewing angle is again taken as an example. In an optional implementation, two rotation-boundary viewing angles corresponding to the target viewing angle can be calculated in the virtual three-dimensional space from a specified viewing-angle rotation span and the target viewing angle, and the range spanned by the two rotation-boundary viewing angles is configured as the preset viewing-angle range of the target viewing angle.
For example, if the specified rotation span is 30°, then in the virtual three-dimensional space, taking the target viewing angle as the reference, the angle reached by rotating 15° to the left may serve as the first rotation-boundary viewing angle and the angle reached by rotating 15° to the right as the second rotation-boundary viewing angle, and the range spanned by the two boundary angles serves as the preset viewing-angle range of the target viewing angle.
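The patent specifies no formulas, but purely as an illustration the 30° example can be expressed as simple yaw arithmetic. All function and parameter names are assumptions; angles are treated as horizontal rotations in degrees, with wraparound at 360°:

```python
def preset_range(target_yaw_deg, rotation_span_deg):
    """Return the two rotation-boundary angles (left, right) spanning
    rotation_span_deg degrees centred on the target viewing angle."""
    half = rotation_span_deg / 2.0
    left = (target_yaw_deg - half) % 360.0
    right = (target_yaw_deg + half) % 360.0
    return left, right


def within_range(current_yaw_deg, left, right):
    """True if the current yaw lies in the arc from left to right
    (handles ranges that wrap past 0 degrees)."""
    current = current_yaw_deg % 360.0
    if left <= right:
        return left <= current <= right
    return current >= left or current <= right


left, right = preset_range(90.0, 30.0)   # 30-degree span around a 90-degree target
print(left, right)                       # 75.0 105.0
print(within_range(80.0, left, right))   # True
print(within_range(120.0, left, right))  # False
```

A target near 0° still works: `preset_range(10.0, 30.0)` yields the wrapped arc 355°–25°, and `within_range` handles it via the second branch.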
It should be noted that, in this optional implementation, the specified rotation span may be customized as needed, and the logic for how the viewing angle is rotated about the target viewing angle according to the specified span may likewise be customized; neither is specifically limited here.
By configuring a preset viewing-angle range for the target viewing angle, the association scope of the target video material is expanded: the target video material becomes associated with every viewing angle within the preset range corresponding to the target viewing angle. This effectively improves the flexibility with which the target video material can be triggered.
The above describes the configuration of the preset viewing-angle range for the target viewing angle. It should be understood that the same process may be used to configure preset ranges for other viewing angles in the virtual three-dimensional space, and the details are not repeated here.
On this basis, referring to fig. 1, in step 100, a trigger control for the target video material may be displayed on the graphical user interface in response to detecting that the current viewing angle is within the preset viewing-angle range of the target viewing angle. The user can adjust the viewing angle as needed in the virtual three-dimensional space of the target space displayed on the graphical user interface, thereby achieving panoramic roaming. While the user adjusts the viewing angle, the information presentation scheme of this embodiment can monitor whether the current viewing angle falls within the preset range corresponding to some video material. Taking the target video material as an example: as mentioned above, a preset range is configured for the target viewing angle and the target video material is associated with it, which amounts to associating the target video material with that preset range. Accordingly, in step 100, if the current viewing angle is detected to be within the preset range of the target viewing angle, the trigger control for the target video material can be displayed on the graphical user interface. Thus, once the user's current viewing angle enters the preset range associated with the target video material, the user is offered the option of playing it.
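As a hedged sketch of the monitoring in step 100 (the class and function names are assumptions, not the patent's API), showing or hiding the trigger control as the viewing angle moves in and out of the preset range could look like:

```python
class TriggerControl:
    """Hypothetical stand-in for the on-screen trigger control."""

    def __init__(self):
        self.visible = False

    def show(self):
        self.visible = True

    def hide(self):
        self.visible = False


def on_viewing_angle_changed(current_yaw, target_yaw, span_deg, control):
    """Called whenever the user adjusts the viewing angle: show the control
    when the current yaw is within span_deg degrees centred on the target."""
    half = span_deg / 2.0
    # Shortest angular difference between current and target, in [-180, 180).
    diff = ((current_yaw - target_yaw + 180.0) % 360.0) - 180.0
    inside = abs(diff) <= half
    control.show() if inside else control.hide()
    return control.visible


ctrl = TriggerControl()
print(on_viewing_angle_changed(100.0, 90.0, 30.0, ctrl))  # True  (10 deg away)
print(on_viewing_angle_changed(150.0, 90.0, 30.0, ctrl))  # False (60 deg away)
```

In a real viewer this callback would be wired to the camera-rotation event loop; here it is only meant to show the range check driving the control's visibility.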
In this embodiment, the form, the display position, and the like of the trigger control are not limited.
If the user wishes to play the target video material, he or she can perform a trigger operation through the trigger control. In this embodiment, in step 101, in response to detecting a trigger instruction generated on the trigger control of the target video material, the current viewing angle can be switched to the target viewing angle, and the target video material can be acquired and played, so that the virtual three-dimensional space picture corresponding to the target viewing angle is displayed in linkage with the target video material.
In step 101, the target video material is not played immediately; it is played after the viewing angle is switched. It should be understood that if the current viewing angle already is the target viewing angle, "switching the current viewing angle to the target viewing angle" is naturally interpreted as requiring no switching operation.
Switching the current viewing angle to the target viewing angle means rotating from the current viewing angle to the target viewing angle. The switching process can follow the usual view-switching manner of a panoramic roaming scene, which is not detailed here.
After switching to the target viewing angle, the target video material can be acquired and played. As mentioned above, the target video material takes the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame; playing it only after the switch therefore ensures that the material joins the virtual three-dimensional space picture seamlessly and transitions naturally, so that the content of the video material continues naturally from the virtual three-dimensional space picture.
For example, if the video content of the target video material shows an instructor introducing details of the house source while walking through the target space, the start frame may be the virtual three-dimensional space picture corresponding to the target viewing angle, and in the second frame the instructor may walk out from a door in that picture. From the user's perspective, in step 101, after the trigger operation is initiated the current viewing angle rotates to the target viewing angle, the target video material then begins to play, and the user sees the instructor walk out of the virtual three-dimensional space picture corresponding to the target viewing angle and begin to introduce the details of the house source. Throughout this process, the user both roams the target space panoramically and is pleasantly surprised, during the roam, by the dynamic picture of the instructor's introduction.
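The ordering that step 101 describes — switch the viewing angle first, then fetch and play the material so its start frame matches the on-screen picture — can be sketched as follows (all names hypothetical; this is not the patent's implementation):

```python
def handle_trigger(viewer, target_angle, material_id, fetch):
    """On a trigger instruction: rotate to the target viewing angle if needed,
    then acquire and play the associated video material. Returns an event log."""
    events = []
    if viewer["angle"] != target_angle:
        viewer["angle"] = target_angle      # rotate to the target viewing angle
        events.append("switched")
    material = fetch(material_id)           # acquire the target video material
    events.append(f"playing:{material}")    # start frame now matches the screen
    return events


viewer = {"angle": 40.0}
log = handle_trigger(viewer, 90.0, "material_A", lambda mid: mid)
print(viewer["angle"])  # 90.0
print(log)              # ['switched', 'playing:material_A']
```

Note the branch: when the current viewing angle already equals the target, no "switched" event occurs, matching the remark above that no switching operation is then required.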
Optionally, in this embodiment, a video interaction control may be displayed in the graphical user interface while the target video material is playing, and the playing state of the target video material is adjusted in response to a video interaction instruction initiated by the user through the control. Video interaction instructions may include, but are not limited to, a pause instruction, a playback-progress adjustment instruction, a playback-speed adjustment instruction, and the like. In this way, the user can adjust the playing state of the target video material as desired.
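A minimal sketch of how such interaction instructions might adjust playing state (the state machine and instruction names below are illustrative assumptions, not defined by the patent):

```python
class PlaybackState:
    """Hypothetical playing state of the target video material."""

    def __init__(self):
        self.paused = False
        self.position = 0.0  # playback progress, in seconds
        self.speed = 1.0     # playback speed multiplier

    def apply(self, instruction, value=None):
        """Apply one video interaction instruction to the playing state."""
        if instruction == "pause":
            self.paused = True
        elif instruction == "resume":
            self.paused = False
        elif instruction == "seek":       # playback-progress adjustment
            self.position = max(0.0, value)
        elif instruction == "speed":      # playback-speed adjustment
            self.speed = value


state = PlaybackState()
state.apply("seek", 12.5)
state.apply("speed", 1.5)
state.apply("pause")
print(state.position, state.speed, state.paused)  # 12.5 1.5 True
```

In an actual player these instructions would map onto the platform's media API; the point here is only that each control gesture reduces to a small state update.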
In addition, the target video material may be played full-screen to maintain visual consistency between the pictures of the target space appearing in the video material and the virtual three-dimensional space picture. The target video material may also be a panoramic video, further improving its consistency with the virtual three-dimensional space in both content and perception.
In this embodiment, a control for returning to the virtual three-dimensional space may also be displayed in the graphical user interface while the target video material is playing. In response to a return instruction initiated by the user through this control, the playing picture of the target video material can be switched back to the virtual three-dimensional space picture corresponding to the current viewing angle, after which the user may continue panoramic roaming in the virtual three-dimensional space.
Accordingly, in this embodiment, a target video material can be associated with a target viewing angle in a virtual three-dimensional space of a target space, the target video material taking the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame; a preset viewing-angle range can also be configured for the target viewing angle. On this basis, a trigger control for the target video material can be displayed on the graphical user interface in response to detecting that the current viewing angle is within the preset viewing-angle range of the target viewing angle; in response to a trigger instruction generated on the trigger control of the target video material, the current viewing angle is switched to the target viewing angle, and the target video material is acquired and played, so that the virtual three-dimensional space picture corresponding to the target viewing angle is displayed in linkage with the target video material. In this way, a dynamic picture experience can be provided in the virtual three-dimensional space on the basis of video materials, which enriches the content of the virtual three-dimensional space, enhances the user's sense of surprise, and brings the experience closer to the user's perception of the real space; moreover, the video material joins smoothly with the view-angle picture in the virtual three-dimensional space, sparing the user an abrupt, disjointed browsing experience, so that adding the video material does not compromise the sense of realism.
Fig. 2 is a schematic view of a scenario provided in an exemplary embodiment of the present application. Referring to fig. 2, an exemplary illustration of an information presentation scheme with a room source space as a target space will be described below.
Referring to fig. 2, the first (leftmost) image shows the virtual three-dimensional space of a house-source space displayed in the graphical user interface; the picture shown is the virtual three-dimensional space picture corresponding to the user's current viewing angle. Because the current viewing angle is within the preset viewing-angle range corresponding to video material A, a "video guide" trigger control is also displayed in the graphical user interface.
The user may click the "video guide" trigger control to initiate a trigger instruction, upon which the current viewing angle is switched in the virtual three-dimensional space to the target viewing angle associated with video material A. In the second image of fig. 2, the target viewing angle has been reached, and the virtual three-dimensional space picture corresponding to it is shown in the graphical user interface; comparing the first and second images, the viewing angle has rotated to the right by a certain angle.
After switching to the target viewing angle, the video material a can be acquired and played. The third image shows that the video material A takes the virtual three-dimensional space image corresponding to the target observation visual angle as a starting frame, the third image shows that the second frame of the video material A can be the video material A, a speaker can be seen to walk out of the wall, and the position, the shape and the like of the wall are consistent with those of the wall in the second image, so that the user can be naturally immersed into the dynamic world presented by the video material A.
According to the change of the display pictures in the graphical user interface shown in the three pictures shown in fig. 2, the user can feel that the information display scheme provided by the embodiment can naturally transit from the virtual three-dimensional space picture to the video material, so that the user can simulate the experience of really entering the dynamic picture, and the surprise and the sense of reality are greatly improved.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subjects of steps 100 to 101 may be device a; for another example, the execution subject of step 100 may be device a, and the execution subject of step 101 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 100, 101, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel.
Fig. 3 is a schematic structural diagram of an electronic terminal according to another exemplary embodiment of the present application. As shown in fig. 3, the electronic terminal includes: memory 30, processor 31, and display component 32.
a memory 30 for storing one or more computer instructions;
a display component 32 configured to provide a graphical user interface, where the content displayed by the graphical user interface includes a virtual three-dimensional space of a target space, a target video material is associated with a target viewing angle of the virtual three-dimensional space, and a preset viewing-angle range is configured for the target viewing angle, where the target video material takes the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame;
a processor 31, coupled to the memory 30 and the display component 32, for executing the computer instructions in the memory 30 to:
in response to detecting that the current viewing angle is within the preset viewing-angle range of the target viewing angle, display a trigger control of the target video material on the graphical user interface;
and in response to a trigger instruction generated on the trigger control of the target video material, switch the current viewing angle to the target viewing angle, and acquire and play the target video material, so that the virtual three-dimensional space picture corresponding to the target viewing angle and the target video material are displayed in seamless succession.
In an alternative embodiment, the processor 31 is configured to, in the process of preparing the target video material:
prepare the target video material with the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame;
and associate the target video material with the target viewing angle.
In an alternative embodiment, the processor 31 is configured to, in the process of configuring the preset viewing-angle range:
in the virtual three-dimensional space, calculate two rotation boundary viewing angles corresponding to the target viewing angle according to a specified viewing-angle rotation angle and the target viewing angle;
and configure the viewing-angle range spanned by the two rotation boundary viewing angles as the preset viewing-angle range of the target viewing angle.
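The boundary-angle computation above can be sketched as follows. This is a minimal illustration under assumed conventions (angles in degrees, a symmetric rotation in both directions around the target angle); the function names are not from the patent.

```python
def boundary_angles(target_angle, rotation_angle):
    """Two rotation boundary viewing angles around the target, normalized to [0, 360)."""
    left = (target_angle - rotation_angle) % 360.0
    right = (target_angle + rotation_angle) % 360.0
    return left, right

def in_preset_range(current_angle, target_angle, rotation_angle):
    """True if the current viewing angle lies within the range spanned by the
    two boundary viewing angles; handles wrap-around at 0/360 degrees."""
    d = abs(current_angle - target_angle) % 360.0
    return min(d, 360.0 - d) <= rotation_angle

print(boundary_angles(90.0, 30.0))         # (60.0, 120.0)
print(in_preset_range(350.0, 10.0, 30.0))  # range wraps across 0 degrees
```

Testing against the target angle rather than the raw boundary values avoids special-casing ranges that straddle the 0/360 seam.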
In an optional embodiment, the display component 32 is further configured to display a video interaction control in the graphical user interface during the playing of the target video material;
and the processor 31 is further configured to adjust the playing state of the target video material in response to a video interaction instruction initiated by the user through the video interaction control.
In an alternative embodiment, the display component 32 is further configured to:
display a return-to-virtual-three-dimensional-space control in the graphical user interface during the playing of the target video material;
and the processor 31 is further configured to, in response to a return instruction initiated by the user through the return-to-virtual-three-dimensional-space control, switch the graphical user interface back from the playing picture of the target video material to the virtual three-dimensional space picture corresponding to the current viewing angle.
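The toggling between video playback and the virtual three-dimensional space picture can be modeled as a small state machine. This is only an illustrative sketch; the state names and methods are assumptions, not part of the patent.

```python
class SceneState:
    """Tracks whether the GUI shows the 3D-space picture or the video playback."""

    PANORAMA = "panorama"  # virtual three-dimensional space picture
    VIDEO = "video"        # playing picture of the target video material

    def __init__(self, start_angle=0.0):
        self.mode = self.PANORAMA
        self.current_angle = start_angle

    def play_material(self, target_angle):
        # Switching to the target viewing angle before playback keeps the
        # video's start frame aligned with the 3D picture on screen.
        self.current_angle = target_angle
        self.mode = self.VIDEO

    def return_to_space(self):
        # "Return to virtual 3D space" control: switch back to the 3D
        # picture at the current viewing angle (now the target angle).
        self.mode = self.PANORAMA

state = SceneState()
state.play_material(90.0)
state.return_to_space()
print(state.mode, state.current_angle)  # panorama 90.0
```

Because the viewing angle is preserved across both transitions, returning from the video lands the user on the same picture the video started from, which keeps the round trip seamless.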
Further, as shown in fig. 3, the electronic terminal further includes: a communication component 33, a power component 34, an audio component 35, and the like. Fig. 3 schematically shows only some of the components, which does not mean that the electronic terminal includes only the components shown in fig. 3.
It should be noted that, for the technical details of the embodiments of the electronic terminal, reference may be made to the related description in the foregoing method embodiments, and for the sake of brevity, detailed description is not provided herein, but this should not cause a loss of scope of the present application.
Accordingly, the present application further provides a computer-readable storage medium storing a computer program, where the computer program is capable of implementing the steps that can be executed by the electronic device in the foregoing method embodiments when executed.
The memory of FIG. 3, described above, is used to store a computer program and may be configured to store other various data to support operations on a computing platform. Examples of such data include instructions for any application or method operating on the computing platform, contact data, phonebook data, messages, pictures, videos, and so forth. The memory may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The communication component in fig. 3 is configured to facilitate wired or wireless communication between the device where the communication component is located and other devices. The device where the communication component is located can access a wireless network based on a communication standard, such as WiFi, a mobile communication network such as 2G, 3G, 4G/LTE, or 5G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The display assembly of fig. 3 may include a screen, which may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power supply assembly of fig. 3 described above provides power to the various components of the device in which the power supply assembly is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
The audio component of fig. 3 described above may be configured to output and/or input an audio signal. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio assembly further comprises a speaker for outputting audio signals.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, computer readable media do not include transitory computer readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (9)

1. An information display method, characterized in that a graphical user interface is provided through an electronic terminal, the content displayed by the graphical user interface comprises a virtual three-dimensional space of a target space, a target video material is associated with a target viewing angle of the virtual three-dimensional space, a preset viewing-angle range is configured for the target viewing angle, and the target video material takes the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame, the method comprising:
in response to detecting that a current viewing angle is within the preset viewing-angle range of the target viewing angle, displaying a trigger control of the target video material on the graphical user interface;
in response to a trigger instruction generated on the trigger control of the target video material, switching the current viewing angle to the target viewing angle, and acquiring and playing the target video material, so that the virtual three-dimensional space picture corresponding to the target viewing angle and the target video material are displayed in seamless succession;
wherein the process of preparing the target video material comprises:
preparing the target video material with the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame;
and associating the target video material with the target viewing angle.
2. The method according to claim 1, wherein configuring the preset viewing-angle range comprises:
in the virtual three-dimensional space, calculating two rotation boundary viewing angles corresponding to the target viewing angle according to a specified viewing-angle rotation angle and the target viewing angle;
and configuring the viewing-angle range spanned by the two rotation boundary viewing angles as the preset viewing-angle range of the target viewing angle.
3. The method of claim 1, further comprising:
displaying a video interaction control in the graphical user interface in the process of playing the target video material;
and adjusting a playing state of the target video material in response to a video interaction instruction initiated by a user through the video interaction control.
4. The method of claim 1, further comprising:
displaying a return-to-virtual-three-dimensional-space control in the graphical user interface in the process of playing the target video material;
and in response to a return instruction initiated by the user through the return-to-virtual-three-dimensional-space control, switching the graphical user interface back from the playing picture of the target video material to the virtual three-dimensional space picture corresponding to the current viewing angle.
5. An electronic terminal comprising a memory, a processor, and a display assembly;
the memory is to store one or more computer instructions;
the display component is configured to provide a graphical user interface, the content displayed by the graphical user interface comprises a virtual three-dimensional space of a target space, a target video material is associated with a target viewing angle of the virtual three-dimensional space, and a preset viewing-angle range is configured for the target viewing angle, wherein the target video material takes the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame;
the processor, coupled with the memory and the display component, is configured to execute the one or more computer instructions to:
in response to detecting that a current viewing angle is within the preset viewing-angle range of the target viewing angle, display a trigger control of the target video material on the graphical user interface;
and in response to a trigger instruction generated on the trigger control of the target video material, switch the current viewing angle to the target viewing angle, and acquire and play the target video material, so that the virtual three-dimensional space picture corresponding to the target viewing angle and the target video material are displayed in seamless succession; wherein, in the process of preparing the target video material, the processor is configured to:
prepare the target video material with the virtual three-dimensional space picture corresponding to the target viewing angle as its start frame;
and associate the target video material with the target viewing angle.
6. The electronic terminal according to claim 5, wherein the display component is further configured to, in the virtual three-dimensional space, calculate two rotation boundary viewing angles corresponding to the target viewing angle according to a specified viewing-angle rotation angle and the target viewing angle;
the processor is specifically configured to configure the viewing-angle range spanned by the two rotation boundary viewing angles as the preset viewing-angle range of the target viewing angle.
7. The electronic terminal of claim 5, wherein the display component is further configured to display a video interaction control in the graphical user interface during the playing of the target video material;
the processor is further configured to adjust a playing state of the target video material in response to a video interaction instruction initiated by a user through the video interaction control.
8. The electronic terminal of claim 5, wherein the processor is further configured to:
displaying a return-to-virtual-three-dimensional-space control in the graphical user interface in the process of playing the target video material;
and in response to a return instruction initiated by the user through the return-to-virtual-three-dimensional-space control, switching the graphical user interface back from the playing picture of the target video material to the virtual three-dimensional space picture corresponding to the current viewing angle.
9. A computer-readable storage medium storing computer instructions, which when executed by one or more processors, cause the one or more processors to perform the information presentation method of any one of claims 1-4.
CN202110988976.4A 2021-08-26 2021-08-26 Information display method, equipment and storage medium Active CN113923435B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110988976.4A CN113923435B (en) 2021-08-26 2021-08-26 Information display method, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113923435A CN113923435A (en) 2022-01-11
CN113923435B (en) 2022-08-05

Family

ID=79233219

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110988976.4A Active CN113923435B (en) 2021-08-26 2021-08-26 Information display method, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113923435B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017177842A1 (en) * 2016-04-11 2017-10-19 腾讯科技(深圳)有限公司 Additional media presentation method and device for use during media playing
CN109829977A (en) * 2018-12-30 2019-05-31 贝壳技术有限公司 Method, apparatus, electronic equipment and the medium in room are seen in virtual three-dimensional space
CN111158469A (en) * 2019-12-12 2020-05-15 广东虚拟现实科技有限公司 Visual angle switching method and device, terminal equipment and storage medium
CN111629225A (en) * 2020-07-14 2020-09-04 腾讯科技(深圳)有限公司 Visual angle switching method, device and equipment for live broadcast of virtual scene and storage medium
CN111667589A (en) * 2020-06-12 2020-09-15 上海商汤智能科技有限公司 Animation effect triggering display method and device, electronic equipment and storage medium
CN111739169A (en) * 2019-10-31 2020-10-02 北京京东尚科信息技术有限公司 Product display method, system, medium and electronic device based on augmented reality
CN112569596A (en) * 2020-12-11 2021-03-30 腾讯科技(深圳)有限公司 Video picture display method and device, computer equipment and storage medium
CN113178015A (en) * 2021-03-26 2021-07-27 瑞庭网络技术(上海)有限公司 House source interaction method and device, electronic equipment and storage medium



Similar Documents

Publication Publication Date Title
CN110662083B (en) Data processing method and device, electronic equipment and storage medium
US11315336B2 (en) Method and device for editing virtual scene, and non-transitory computer-readable storage medium
EP3163549A1 (en) Interface display method and device
US11880999B2 (en) Personalized scene image processing method, apparatus and storage medium
CN110751707B (en) Animation display method, animation display device, electronic equipment and storage medium
EP3796317A1 (en) Video processing method, video playing method, devices and storage medium
CN111479158B (en) Video display method and device, electronic equipment and storage medium
WO2021073293A1 (en) Animation file generating method and device, and storage medium
CN112653920B (en) Video processing method, device, equipment and storage medium
CN112044064A (en) Game skill display method, device, equipment and storage medium
US20180035170A1 (en) Method and device for controlling playing state
CN104461283A (en) Network view screen shooting method and device and electronic device
CN104035674A (en) Picture displaying method and device
CN113900606B (en) Information display method, equipment and storage medium
CN109947506A (en) Interface switching method, device and electronic equipment
CN111352560B (en) Screen splitting method and device, electronic equipment and computer readable storage medium
CN110321042B (en) Interface information display method and device and electronic equipment
CN113938620A (en) Image processing method, mobile terminal and storage medium
CN113923435B (en) Information display method, equipment and storage medium
CN112764636A (en) Video processing method, video processing device, electronic equipment and computer-readable storage medium
CN105426496A (en) Page display method and apparatus and electronic device
CN113473224B (en) Video processing method, video processing device, electronic equipment and computer readable storage medium
CN110662103B (en) Multimedia object reconstruction method and device, electronic equipment and readable storage medium
CN109407942B (en) Model processing method and device, control client and storage medium
CN114268802A (en) Virtual space display method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant