CN111198610B - Method, device and equipment for controlling field of view of panoramic video and storage medium


Info

Publication number
CN111198610B
Authority
CN
China
Prior art keywords
direction rotation
panoramic video
texture
playing window
view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811369535.0A
Other languages
Chinese (zh)
Other versions
CN111198610A (en)
Inventor
付宇豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Douyin Vision Co Ltd
Douyin Vision Beijing Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd
Priority to CN201811369535.0A
Publication of CN111198610A
Application granted
Publication of CN111198610B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Abstract

The embodiments of the present application provide a method, apparatus, device, and storage medium for controlling the field of view of a panoramic video. The method includes the following steps: monitoring a direction rotation event according to the type of the device displaying the panoramic video playing window; adjusting the playing window field of view corresponding to the playing window according to the direction rotation event; projecting the adjusted playing window field of view onto the surface texture of a three-dimensional model to obtain a texture region, where the surface texture is obtained by mapping texture coordinates corresponding to pixel points of a video frame of the panoramic video to vertex coordinates of the surface of the three-dimensional model; and rendering the texture region in the playing window.

Description

Method, device and equipment for controlling field of view of panoramic video and storage medium
Technical Field
The embodiments of the present application relate to multimedia technology, and in particular to a method, apparatus, device, and storage medium for controlling the field of view of a panoramic video.
Background
The development of communication infrastructure has steadily increased terminal network speeds, enabling various media forms, especially video, to spread across networks on an unprecedented scale. Panoramic video expands the way users perceive video, allowing the content in a video to be presented in multiple ways.
For example, in a panoramic video the user is no longer a mere viewer but can interact with the video: the field of view can be rotated as the user requires, and objects in the field of view (such as scenery or people) can be zoomed at will, so that the user can focus on the content of interest in the panoramic video and enjoy a perception effect as if present at the scene.
However, the instructions for controlling the direction rotation of a panoramic video differ across terminals with different UI interaction mechanisms, and existing video playing control technology does not resolve the incompatibility between the field-of-view direction rotation instructions of panoramic video and a terminal's UI interaction mechanism, so direction rotation of the panoramic video cannot be completed on such terminals.
Disclosure of Invention
In view of this, embodiments of the present application provide a method, an apparatus, a device, and a storage medium for controlling a field of view of a panoramic video.
The embodiment of the application provides a method for controlling a view field of a panoramic video, which comprises the following steps:
monitoring a direction rotation event according to the type of equipment for displaying the panoramic video playing window;
adjusting the playing window view field corresponding to the playing window according to the direction rotation event;
projecting the adjusted playing window view field in the surface texture of the three-dimensional model to obtain a texture area; the surface texture is obtained by mapping texture coordinates corresponding to pixel points of a video frame of the panoramic video to vertex coordinates of the surface of the three-dimensional model;
rendering the texture region in the playback window.
In the foregoing solution, before the monitoring of a direction rotation event according to the type of the device displaying the panoramic video playing window, the method further includes:
mapping pixel points of video frames of the panoramic video to texture space to obtain texture coordinates of the pixel points;
and mapping the texture coordinates of the pixel points to the vertex coordinates of the surface of the three-dimensional model to form the surface texture.
In the foregoing solution, the adjusting the play window view field corresponding to the play window according to the direction rotation event includes:
determining a direction rotation parameter corresponding to the panoramic video according to the direction rotation event;
and rotating the angle of the view field of the playing window according to the direction rotation parameter.
In the above solution, the monitoring a direction rotation event according to the type of the device displaying the panoramic video playing window includes:
when the device type is a touch terminal type, monitoring a direction rotation event triggered by touch operation;
and when the device type is an entity-controlled terminal type, monitoring a direction rotation event triggered by an entity operation.
In the foregoing solution, the determining, according to the direction rotation event, a direction rotation parameter corresponding to the panoramic video includes:
and when the direction rotation event comprises at least two direction rotation events, respectively responding to the at least two direction rotation events according to a preset direction rotation event priority, and determining the direction rotation parameter corresponding to the panoramic video.
In the foregoing solution, the adjusting the play window view field corresponding to the play window according to the direction rotation event further includes:
responding to the direction rotation event, and adjusting the playing window view field corresponding to the playing window according to a preset direction rotation angle; or,
and responding to the direction rotation event, and adjusting the playing window view field corresponding to the playing window according to a preset direction rotation speed.
In the foregoing solution, the adjusting the play window view field corresponding to the play window according to the direction rotation event further includes:
according to the direction rotation event, adjusting the position, in the video frame of the panoramic video, of the content currently displayed in the playing window.
An embodiment of the present application provides an apparatus for controlling the field of view of a panoramic video, including:
the first monitoring module is used for monitoring a direction rotation event according to the type of equipment for displaying the panoramic video playing window;
the first adjusting module is used for adjusting the playing window view field corresponding to the playing window according to the direction rotating event;
the first projection module is used for projecting the adjusted playing window view field in the surface texture of the three-dimensional model to obtain a texture area; the surface texture is obtained by mapping texture coordinates corresponding to pixel points of a video frame of the panoramic video to vertex coordinates of the surface of the three-dimensional model;
a first rendering module to render the texture region in the playback window.
In the above scheme, the apparatus further includes a module configured to:
map pixel points of video frames of the panoramic video to texture space to obtain texture coordinates of the pixel points;
and map the texture coordinates of the pixel points to the vertex coordinates of the surface of the three-dimensional model to form the surface texture.
In the foregoing solution, the first adjusting module includes:
the first determining unit is used for determining a direction rotation parameter corresponding to the panoramic video according to the direction rotation event;
and the first rotating unit is used for rotating the angle of the view field of the playing window according to the direction rotating parameter.
In the foregoing solution, the first monitoring module includes:
the first monitoring unit is used for monitoring a direction rotation event triggered by touch operation when the equipment type is a touch terminal type;
and the second monitoring unit is used for monitoring a direction rotation event triggered by an entity operation when the device type is an entity-controlled terminal type.
In the foregoing solution, the first determining unit includes:
and the first response subunit is configured to, when the direction rotation event includes at least two direction rotation events, respectively respond to the at least two direction rotation events according to a preset direction rotation event priority, and determine a direction rotation parameter corresponding to the panoramic video.
In the foregoing solution, the first adjusting module further includes:
the first adjusting unit is used for responding to the direction rotation event and adjusting the playing window view field corresponding to the playing window according to a preset direction rotation angle; or,
and the second adjusting unit is used for responding to the direction rotating event and adjusting the playing window view field corresponding to the playing window according to a preset direction rotating speed.
In the foregoing solution, the first adjusting module further includes:
a third adjusting unit, configured to adjust, according to the direction rotation event, the position, in the video frame of the panoramic video, of the content currently displayed in the playing window.
An embodiment of the present application provides an electronic device, including:
a memory for storing executable instructions;
and the processor is used for realizing the field-of-view control method of the panoramic video provided by the embodiment of the application when the executable instruction is executed.
The embodiment of the application provides a storage medium, which stores executable instructions, and the executable instructions are used for realizing the field-of-view control method of panoramic video provided by the embodiment of the application when being executed.
The embodiments of the present application provide a method, apparatus, device, and storage medium for controlling the field of view of a panoramic video. By monitoring the direction rotation event corresponding to the device type, the incompatibility between the field-of-view direction rotation instructions of the panoramic video and the terminal's UI interaction mechanism is resolved, so that direction rotation of the panoramic video can be completed on different terminals.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device 100 implementing an embodiment of the present application;
FIG. 2 is an alternative schematic diagram of an electronic device implementing an embodiment of the present application;
FIG. 3 is a schematic view of an alternative flow chart for displaying a panoramic video by an electronic device implementing an embodiment of the present application;
FIG. 4A is a schematic diagram of an alternative texture mapping for displaying a panoramic video by an electronic device implementing an embodiment of the present application;
FIG. 4B is a schematic view of an alternative field of view of an electronic device displaying panoramic video implementing embodiments of the present application;
FIG. 4C is a schematic diagram of an alternative projection of an electronic device displaying a panoramic video, implementing an embodiment of the present application;
fig. 5 is a schematic flowchart of a method for implementing field control of a panoramic video according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a method for implementing field control of a panoramic video according to another embodiment of the present application;
fig. 7 is a schematic structural diagram of a view field control device of a panoramic video according to an embodiment of the present application.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the attached drawings. The described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present application. In the following description, reference is made to "some embodiments", which describe a subset of all possible embodiments; "some embodiments" may be the same subset or different subsets of all possible embodiments, and they may be combined with each other where no conflict arises.
Before the embodiments of the present application are described in further detail, the terms used in them are explained. Unless otherwise defined, technical and scientific terms used in the embodiments have the same meaning as commonly understood by one of ordinary skill in the art to which the embodiments pertain. The terminology is used only for describing the embodiments and is not intended to limit them; the terms referred to in the embodiments are explained as follows.
1) Panoramic video: a video played on various electronic devices and viewable with the user's naked eyes, in which the direction and magnification of the playing window's field of view can be adjusted.
2) Three-dimensional model: a model simulating the space expressed by the panoramic video; video frames of the panoramic video are mapped to its surface to form a surface texture, and the surface generally adopts a spherical or cylindrical shape.
3) Field of view: the set of lines of sight through which a virtual lens at the center of the three-dimensional model perceives the model's surface texture; more generally, the area of a video frame of the panoramic video that can be viewed through a window.
4) Playing window: the default full-screen or non-full-screen window used to play the panoramic video in a panoramic video client; it contains at least the video playing area of the panoramic video and may also contain an operation area providing entries for related operations.
5) Playing window field of view: the field of view corresponding to the playing window; it determines which part of the content of a video frame is perceivable in the playing window.
6) Sub-window: a non-full-screen window, smaller than the playing window, used for auxiliary playback in the panoramic video client; it contains at least the video playing area of the panoramic video and may also contain an operation area providing entries for related operations.
7) Sub-window field of view: the field of view corresponding to the sub-window; it determines which part of the content of a video frame is perceivable in the sub-window.
8) Texture: the regular color pattern of objects in a video frame, represented by the texture coordinates and corresponding color values of each texel of the video frame.
9) Texture region: the region covered by the projection of a field of view onto the surface texture of the three-dimensional model; it is a subset of that surface texture. For example, the region displayable by a complete panoramic video frame may cover viewing angles of 0 to 180 degrees both vertically and horizontally, while the texture region corresponding to one field of view may cover only 0 to 30 degrees in each direction.
Referring to fig. 1, fig. 1 is a schematic structural diagram of an electronic device 100 implementing an embodiment of the present application. The electronic device may be various terminals including, but not limited to, mobile terminals such as a mobile phone, a notebook computer, a Digital broadcast receiver, a Personal Digital Assistant (PDA), a tablet computer (PAD), a Portable Multimedia Player (PMP), a vehicle mounted terminal (e.g., a car navigation terminal), etc., and fixed terminals such as a Digital Television (TV), a desktop computer, etc. The electronic device shown in fig. 1 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 1, the electronic device 100 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 110, which may perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 120 or a program loaded from a storage means 180 into a Random Access Memory (RAM) 130. In the RAM 130, various programs and data necessary for the operation of the electronic apparatus 100 are also stored. The processing device 110, the ROM 120, and the RAM 130 are connected to each other through a bus 140. An Input/Output (I/O) interface 150 is also connected to bus 140.
Generally, the following devices may be connected to the I/O interface 150: input devices 160 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 170 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, or the like; a storage device 180 including, for example, a magnetic tape, a hard disk, or the like; and a communication device 190. The communication device 190 may allow the electronic device 100 to communicate wirelessly or by wire with other devices to exchange data. While fig. 1 illustrates an electronic device 100 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, the processes described by the provided flow charts may be implemented as computer software programs according to embodiments of the present application. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such embodiments, the computer program may be downloaded and installed from a network through the communication device 190, or installed from the storage device 180, or installed from the ROM 120. The computer program, when executed by the processing device 110, performs the functions of the methods of the embodiments of the present application.
It should be noted that the computer readable medium in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. Examples of computer readable storage media may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In embodiments of the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In embodiments of the present application, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
The computer readable medium may be included in the electronic device 100; or may be separate and not incorporated into the electronic device 100.
The computer readable medium carries one or more programs, which when executed by the electronic device, cause the electronic device 100 to perform the methods provided by the embodiments of the present application.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C + + or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of Network, including a Local Area Network (LAN) and a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The units and/or modules described in the embodiments of the present application may be implemented by software or hardware.
As a hardware manner, the units and/or modules of the electronic Device implementing the embodiments of the present Application may be implemented by one or more Application Specific Integrated Circuits (ASICs), DSPs, Programmable Logic Devices (PLDs), Complex Programmable Logic Devices (CPLDs), Field Programmable Gate Arrays (FPGAs), or other electronic components, and are used to execute the method provided by the embodiments of the present Application.
As for the software manner, the units and/or modules implementing the field-of-view control apparatus for panoramic video of the embodiments of the present application may be implemented as software modules.
The following exemplifies, by way of software, the units and/or modules of a field-of-view control apparatus for panoramic video according to an embodiment of the present application.
Referring to fig. 2, fig. 2 is a schematic diagram of an alternative structure of a field-of-view control apparatus for panoramic video according to an embodiment of the present application, showing the following software modules stored in the storage apparatus 180: the mapping unit 210, the field of view unit 220, the texture unit 230, and the rendering unit 240 are explained separately.
A mapping unit 210, configured to map video frames in the panoramic video to surface textures of the three-dimensional model.
A view field unit 220, configured to determine a play window view field corresponding to a play window and a sub-window view field corresponding to a sub-window of the play window.
The texture unit 230 is configured to respectively project the playing window view field and the sub-window view field in the surface texture of the three-dimensional model, and correspondingly obtain a texture region in the playing window view field and a texture region in the sub-window view field.
A rendering unit 240, configured to render, in the playing window, the texture region in the field of view of the playing window, and render, in the sub-window, the texture region in the field of view of the sub-window.
In some embodiments, the mapping unit 210 is configured to: mapping pixel points of the video frame to be rendered to a texture space to obtain texture coordinates of the pixel points; and mapping the texture coordinates of the pixel points to the vertex coordinates of the surface of the three-dimensional model to form the surface texture of the three-dimensional model.
In some embodiments, the field of view unit 220 is configured to: determining a playing window view field corresponding to the playing window by adopting the following modes: analyzing the decoding metadata of the panoramic video to obtain the direction and the magnification of the view field of the playing window when the video frame is rendered in the playing window; or analyzing the received operation for setting the direction and the magnification factor, and determining the direction and the magnification factor corresponding to the playing window view field.
In some embodiments, the field of view unit 220 is configured to: determining the direction of the sub-window view field which satisfies the relative position relation with the direction of the play window view field according to the received operation for expressing the relative position relation; or according to a target object in the panoramic video, determining the direction of the field of view including the target object as the direction of the field of view of the sub-window.
In some embodiments, the rendering unit is to: for each video frame to be rendered in chronological order, performing the following operations: and projecting the view field of the playing window in the surface texture of the three-dimensional model corresponding to the video frame to be rendered to obtain a texture area in the view field of the playing window, and projecting the view field of the sub-window in the surface texture of the three-dimensional model corresponding to the video frame to be rendered to obtain the texture area in the view field of the sub-window.
In some embodiments, the rendering unit is to: for each video frame to be rendered in chronological order, performing the following operations: projecting the playing window view field in the surface texture of the three-dimensional model corresponding to the video frame to be rendered to obtain a texture area in the playing window view field; and projecting the sub-window view field in the surface texture of the three-dimensional model corresponding to the rendered video frame to obtain a texture area in the sub-window view field.
It should be noted that the above-mentioned classification of units does not constitute a limitation of the electronic device itself, for example, some units may be split into two or more sub-units, or some units may be combined into a new unit.
It should also be noted that the names of the above-mentioned cells do not in some cases constitute a limitation on the cells themselves, and for example, the above-mentioned mapping unit 210 may also be described as a cell of "mapping a video frame in a panoramic video to a surface texture of a three-dimensional model".
For the same reason, units and/or modules of the electronic device that are not described in detail here are not thereby absent from it; all operations performed by the electronic device may be implemented by the corresponding units and/or modules in the electronic device.
Fig. 3 is a schematic flow chart of an alternative process for displaying a panoramic video by an electronic device according to an embodiment of the present application. As shown in fig. 3, take as an example a panoramic video client whose program is loaded by the processing device 110 from the read-only memory (ROM) 120, or from the storage device 180 into the random access memory (RAM) 130. During playback of the panoramic video, the field of view the playing window uses to display video frames (i.e., the field of view for displaying video frames in the playing window, referred to in the embodiments of the present application as the playing window field of view) may change constantly; following these changes, the client displays in the playing window, by texture mapping, the content of the sequentially decoded video frames that falls within the playing window field of view, and plays audio frames synchronously.
The process of playing the panoramic video in the playing window by the client in the electronic device is described with reference to the steps shown in fig. 3. It should be noted that, for convenience, in the following description of fig. 3, "field of view" refers to the "playing window field of view".
In step S301, the client requests a file of the panoramic video from the server of the panoramic video, and loads the file into a memory (RAM 130) of the client.
For example, in an online playing scene of a panoramic video, the client requests from the server, through the communication device 190, a segment file covering a preload duration (e.g., 1 minute) after the current playing point (e.g., the starting playing point or a point jumped to by user operation), and loads it into the RAM 130. The preload duration can be set automatically by the client according to network parameters such as the network access type and bandwidth of the electronic device, or set by the user.
The segmented file of the panoramic video comprises necessary decoding metadata and media data (including audio frames and video frames), the client can decode the segmented file in time, and the duration of the segmented file ensures continuous playing effect in the client and does not excessively consume the bandwidth of the electronic equipment.
In some embodiments, as an alternative to step S301, the client may request the complete file of the panoramic video from the server at once and load it into memory; or a local panoramic video file may be read from the storage device 180 and loaded into memory.
Step S302, the client decodes the file of the panoramic video loaded into the memory to extract the decoded metadata and the media data.
According to an agreed packaging format such as FLV (Flash Video) or Moving Picture Experts Group-4 (MPEG-4), the panoramic video file is packaged with decoding metadata, and the client reads the decoding metadata from the file at a position agreed by the packaging format (for example, the first few bytes of the file's binary data). The decoding metadata indicates the storage position, time (decoding time/rendering time), length, width, height, resolution, and other information of each audio frame/video frame in the file. The client can thus extract each video frame and audio frame from the file.
Step S303, the client maps the video frame to be rendered in the media data to the surface texture of the three-dimensional model.
For the video frame decoded in step S302, it is necessary to perform subsequent rendering to display the video frame in the playing window, so the video frame decoded and not rendered in the playing window is also referred to as a video frame to be rendered, for example, a video frame decoded from a segment file of the panoramic video, or a video frame decoded from a complete file of the panoramic video.
The process of mapping a video frame to be rendered onto the surface of the three-dimensional model is exemplarily illustrated below.
First, the client maps the video frame to texture space.
The texture of the video frame in texture space is represented by the texture coordinates and corresponding color values of each pixel of the video frame. The texture coordinates (u, v) of a pixel form a two-dimensional index storing the pixel's position along the x/y axes of texture space, so that the color value of each pixel point of the video frame can be addressed discretely in texture space.
Secondly, mapping the texture coordinates of the video frame in the texture space to the vertex coordinates of the surface of the three-dimensional model to form the surface texture of the three-dimensional model.
For example, such a manner may be adopted: texture coordinates of pixels of the video frame in texture space are mapped to coordinates (x, y, z) of vertices of the three-dimensional model, which are vertices that segment the surface of the three-dimensional model into a series of figures (e.g., triangles), such that pixel points between the vertices are stable regardless of changes in the three-dimensional model.
For example, referring to fig. 4A, fig. 4A is an optional texture mapping schematic diagram for displaying a panoramic video by an electronic device according to an embodiment of the present disclosure. The client decodes a video frame in the media data; taking the decoded video frame 41 as an example, it maps the texture coordinates of each pixel point in the video frame 41 to the vertices of the triangles of the spherical model 42 (the triangles stabilize the texture on the surface of the spherical model 42 so that objects displayed in the video frame are not easily deformed; the mapping is of course not limited to triangles, and a three-dimensional model such as a cylindrical model may be used instead of the sphere), so as to form a spherical model 43 with the video frame 41 as its surface texture.
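To make the mapping concrete, the following is a minimal sketch (not taken from the patent) of generating sphere vertices together with (u, v) texture coordinates that address an equirectangular video frame; the TypeScript form, all names, and the tessellation parameters are illustrative assumptions.

```typescript
// Hypothetical sketch: build sphere vertices whose (u, v) coordinates sample
// an equirectangular panoramic frame. Triangles drawn between neighboring
// vertices keep the pixels between vertices stable, as the text describes.
interface Vertex {
  position: [number, number, number]; // (x, y, z) on the sphere surface
  uv: [number, number];               // texture coordinates in [0, 1]
}

function buildSphereWithUVs(rings: number, segments: number, radius = 1): Vertex[] {
  const vertices: Vertex[] = [];
  for (let r = 0; r <= rings; r++) {
    const phi = (r / rings) * Math.PI;            // polar angle, 0..pi
    for (let s = 0; s <= segments; s++) {
      const theta = (s / segments) * 2 * Math.PI; // azimuth, 0..2*pi
      vertices.push({
        position: [
          radius * Math.sin(phi) * Math.cos(theta),
          radius * Math.cos(phi),
          radius * Math.sin(phi) * Math.sin(theta),
        ],
        // For an equirectangular frame, u follows the azimuth and v the polar
        // angle, so the frame wraps exactly once around the sphere.
        uv: [s / segments, r / rings],
      });
    }
  }
  return vertices;
}
```

Binding each decoded frame as the current texture and drawing the triangles between neighboring vertices then yields the spherical model 43 with the frame as its surface texture.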
Step S304, the client determines a field of view used when each video frame of the panoramic video is displayed in the play window, that is, the play window field of view.
In step S305A, the client renders, from the surface texture of the three-dimensional model onto which the video frame to be rendered is mapped, the texture corresponding to the field of view, into the playing window displayed on the display in the output device 170.
In some embodiments, the client determines the texture region within the field of view on the surface texture of the three-dimensional model according to the direction of the field of view toward the model, and extracts from texture space the texture corresponding to that region according to the vertices in it, including the texels corresponding to the vertices and the texels corresponding to the areas between the vertices; the texture corresponding to the texture region is rendered to the playing window displayed on the display in the output device 170 by perspective projection (i.e., near objects appear larger and far objects smaller).
It can be understood that, as the direction of the field of view rotates, the texels extracted from the texture region of the three-dimensional model within the field of view rotate correspondingly as a whole.
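As a hedged illustration of that whole-region rotation (the function below is ours, not the patent's): the field-of-view direction reduces to a single unit vector computed from yaw and pitch, and every texel selected by the projection follows that vector.

```typescript
// Illustrative sketch: the field-of-view direction as a unit vector. Rotating
// yaw (around the vertical axis) or pitch (elevation) moves this vector, and
// the texture region extracted from the sphere rotates with it as a whole.
function viewDirection(yawRad: number, pitchRad: number): [number, number, number] {
  return [
    Math.cos(pitchRad) * Math.sin(yawRad), // x
    Math.sin(pitchRad),                    // y
    Math.cos(pitchRad) * Math.cos(yawRad), // z
  ];
}
```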
In step S305B, in synchronization with step S305A, the client plays the decoded audio frames that are synchronized in time with the video frames to be rendered in step S305A.
In step S306, the client determines whether all the video frames to be rendered in the memory have been rendered.
Here, if yes, playback ends; otherwise, the process returns to step S304 and continues with the next video frame until all video frames to be rendered in memory have been rendered.
So far, the process by which the client displays the panoramic video in the playing window has been described. During playback the client can display the objects within the field of view matching the user's intention, so the user can flexibly rotate the direction of the field of view to focus on content of interest in the panoramic video and enjoy an immersive, as-if-present perception effect.
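Steps S301 to S306 can be summarized in a short sketch; the hook interface and all names below are assumptions introduced for illustration, not the client's actual API.

```typescript
// Hypothetical playback loop condensing Fig. 3: map each decoded frame to the
// sphere's surface texture (S303), read the current playing window field of
// view (S304), render the corresponding texture region (S305A), and play the
// time-aligned audio (S305B). The loop ends when all frames are rendered (S306).
interface PlayerHooks {
  uploadFrame(frame: ImageBitmap): void;                    // S303: texture mapping
  currentFov(): { yawDeg: number; pitchDeg: number };       // S304: field of view
  render(fov: { yawDeg: number; pitchDeg: number }): void;  // S305A: projection
  playAudio(pts: number): void;                             // S305B: audio sync
}

async function playbackLoop(
  frames: AsyncIterable<{ video: ImageBitmap; pts: number }>,
  hooks: PlayerHooks,
): Promise<void> {
  for await (const frame of frames) {
    hooks.uploadFrame(frame.video);
    hooks.render(hooks.currentFov());
    hooks.playAudio(frame.pts);
  } // S306: all video frames to be rendered have been consumed
}
```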
An embodiment of the present application provides a method for controlling a field of view of a panoramic video, and fig. 5 is a schematic flowchart of a method for controlling a field of view of a panoramic video according to an embodiment of the present application, where as shown in fig. 5, the method includes the following steps:
step S501, monitoring a direction rotation event according to the type of equipment for displaying the panoramic video playing window.
Here, the device types include a touch terminal type and an entity-controlled terminal type, e.g., a mobile phone and a computer respectively. Step S501 can thus be understood as follows: when the device is a mobile phone, direction rotation events triggered by touching or sliding on the phone, or by its gyroscope, are monitored; when the device is a computer, direction rotation events triggered by mouse clicks and keyboard strokes are monitored.
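A sketch of how such per-device-type monitoring could look in a browser-style client follows; the DOM event names are real browser APIs, while the DeviceType split and handler wiring are illustrative assumptions rather than the patent's literal implementation.

```typescript
// Hypothetical step S501: choose the listener set by device type. Touch
// terminals listen for touch drags (a gyroscope source could additionally use
// 'deviceorientation'); entity-controlled terminals listen for mouse drags
// and keyboard strokes.
type DeviceType = 'touch' | 'entity-controlled';

function listenForRotation(
  deviceType: DeviceType,
  onRotate: (dxPixels: number, dyPixels: number) => void,
): void {
  if (deviceType === 'touch') {
    let last: { x: number; y: number } | null = null;
    window.addEventListener('touchmove', (e: TouchEvent) => {
      const t = e.touches[0];
      if (last) onRotate(t.clientX - last.x, t.clientY - last.y);
      last = { x: t.clientX, y: t.clientY };
    });
    window.addEventListener('touchend', () => { last = null; });
  } else {
    let dragging = false;
    window.addEventListener('mousedown', () => { dragging = true; });
    window.addEventListener('mouseup', () => { dragging = false; });
    window.addEventListener('mousemove', (e: MouseEvent) => {
      if (dragging) onRotate(e.movementX, e.movementY);
    });
    window.addEventListener('keydown', (e: KeyboardEvent) => {
      if (e.key === 'ArrowLeft') onRotate(-10, 0);  // keyboard as entity operation
      if (e.key === 'ArrowRight') onRotate(10, 0);
    });
  }
}
```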
Step S502, adjusting the playing window view field corresponding to the playing window according to the direction rotation event.
Here, in response to the direction rotation event, a direction rotation parameter is determined, and the direction of the playing window field of view corresponding to the playing window is adjusted according to that parameter.
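For illustration only, a minimal sketch of turning such an event into an adjusted field-of-view direction; the degrees-per-pixel sensitivity and the pitch clamp are assumptions, not values from the patent.

```typescript
// Hypothetical step S502: convert an event's pixel deltas into a direction
// rotation parameter and apply it to the playing window field of view.
interface Fov { yawDeg: number; pitchDeg: number; }

function applyRotation(fov: Fov, dxPixels: number, dyPixels: number, degPerPixel = 0.2): Fov {
  const yawDeg = (fov.yawDeg + dxPixels * degPerPixel) % 360;
  // Clamp pitch so the view cannot flip over the poles of the sphere model.
  const pitchDeg = Math.max(-90, Math.min(90, fov.pitchDeg + dyPixels * degPerPixel));
  return { yawDeg, pitchDeg };
}
```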
And S503, projecting the adjusted playing window field of view in the surface texture of the three-dimensional model to obtain a texture area.
Here, the surface texture is obtained by mapping the texture coordinates corresponding to the pixel points of a video frame of the panoramic video to the vertex coordinates of the surface of the three-dimensional model; that is, the pixel points of the video frame are first mapped to texture space to obtain their texture coordinates, and those texture coordinates are then mapped to the vertex coordinates of the surface of the three-dimensional model to form the surface texture.
Step S504, rendering the texture region in the play window.
After the direction of the playing window field of view corresponding to the playing window is adjusted, video frames in the changed field-of-view direction are displayed to the user, satisfying the user's viewing needs.
In this embodiment, by monitoring the direction rotation event corresponding to the device type, the panoramic video can complete its direction rotation on different terminals.
In other embodiments, the step S502, that is, "adjusting the playing window view field corresponding to the playing window according to the direction rotation event" includes:
adjusting, according to the direction rotation event, the position, in the video frame of the panoramic video, of the content currently displayed in the playing window.
Here, as shown in fig. 4C, a direction rotation represents a change of the field-of-view direction, and the direction of the playing window field of view 46 (i.e., the direction of the field of view toward the surface texture of the spherical model 43) determines the position, within the video frame, of the partial content the client displays in the playing window. For example, if the playing window field of view 46 is rotated relative to the direction toward the spherical model 43 shown in fig. 4C, the texture of a texture region different from texture region 49 will be rendered into the playing window, enabling the user to view the content of another region of video frame 41.
The present embodiment provides a method for controlling a field of view of a panoramic video, fig. 6 is a schematic flowchart of a method for controlling a field of view of a panoramic video according to another embodiment of the present application, and as shown in fig. 6, the method includes the following steps:
step S601, when the device type is a touch terminal type, monitoring a direction rotation event triggered by a touch operation.
Here, the touch terminal may be a mobile phone, a tablet computer, or the like, and the touch operation may include touch-drag interaction on the mobile terminal and gyroscope events. If the device type is a touch terminal type, after a direction rotation event triggered by a touch operation on the touch terminal is monitored, the process proceeds to step S603.
Step S602, when the device type is the controlled terminal type of the controlled entity, monitoring the direction rotation event triggered by the operation of the entity.
Here, the entity-controlled terminal may be a computer, and the entity operation may be mouse-drag interaction, control rotation, keyboard events, and the like on the computer. The types of direction rotation operations are not limited to the above; they may also include, for example, facial motions and eye motions performed toward the camera of the input device 160. The client can thus display the content of the panoramic video in different fields of view according to the user's intention, meeting the user's personalized requirements for panoramic video. If the device type is an entity-controlled terminal type, after a direction rotation event triggered by an entity operation on the entity-controlled terminal is monitored, the process proceeds to step S603.
And step S603, determining a direction rotation parameter corresponding to the panoramic video according to the direction rotation event.
Here, when the direction rotation event comprises at least two direction rotation events, the at least two events are responded to respectively according to a preset direction rotation event priority, and the direction rotation parameter corresponding to the panoramic video is determined. That is, a priority is set among direction rotation events; for example, a leftward rotation may be handled first.
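A hedged sketch of such priority-based resolution follows; the particular ordering is an assumption chosen for illustration, since the text only requires that some preset priority exist.

```typescript
// Hypothetical resolution of concurrent direction rotation events (step S603):
// respond in a preset priority order, so e.g. an explicit touch drag is applied
// before a background gyroscope reading arriving in the same frame.
interface RotationEvent {
  source: 'touch' | 'mouse' | 'keyboard' | 'gyroscope';
  dxPixels: number;
  dyPixels: number;
}

const PRIORITY: RotationEvent['source'][] = ['touch', 'mouse', 'keyboard', 'gyroscope'];

function orderByPriority(events: RotationEvent[]): RotationEvent[] {
  return [...events].sort(
    (a, b) => PRIORITY.indexOf(a.source) - PRIORITY.indexOf(b.source),
  );
}
```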
Step S604, rotating the angle of the view field of the playing window according to the direction rotation parameter.
Here, steps S603 and S604 can be implemented in various ways; two are listed here:
first, in response to the direction rotation event, the play window view field corresponding to the play window is adjusted according to a preset direction rotation angle, for example, the play window view field corresponding to the play window is rotated according to a fixed angle.
Second, in response to the direction rotation event, the playing window field of view corresponding to the playing window is adjusted according to a preset direction rotation speed. That is, a direction rotation speed is set so that the playing window field of view changes with inertia; for example, the rotation speed may be set to follow a Bezier curve, and the direction of the playing window field of view is adjusted at a speed conforming to the curve's trend, thereby changing the position in the video frame of the partial content displayed in the playing window. In this embodiment, when no user operation indicating the parameter is detected, a default value optimized in the client for playback effect is used, ensuring the optimal playback effect of the panoramic video. The direction rotation parameter of the field of view may be an empirical value, set in the client according to the characteristics of the panoramic video, that guarantees the viewing experience; alternatively, it may be a parameter read from the file of the panoramic video (set by the user who shot or published the panoramic video, possibly for the video frames corresponding to part or all of the timeline), or a parameter sent by the server to the client along with the panoramic video.
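As an illustrative sketch of this second mode (assuming a browser-style animation loop; the control points and names are ours), a cubic Bezier can shape the rotation speed so that the field of view changes with the inertial feel described above.

```typescript
// Hypothetical inertial rotation: progress along the rotation follows a cubic
// Bezier, so the angular speed starts fast and eases out rather than staying
// constant. Control points (0, 0.6, 0.9, 1) are an arbitrary ease-out choice.
function cubicBezier(t: number, p0: number, p1: number, p2: number, p3: number): number {
  const u = 1 - t;
  return u * u * u * p0 + 3 * u * u * t * p1 + 3 * u * t * t * p2 + t * t * t * p3;
}

// Rotate the field of view by totalDeg over durationMs with ease-out inertia.
function animateRotation(
  applyYawDelta: (deg: number) => void,
  totalDeg: number,
  durationMs: number,
): void {
  const start = performance.now();
  let lastProgress = 0;
  const tick = (now: number) => {
    const t = Math.min(1, (now - start) / durationMs);
    const progress = cubicBezier(t, 0, 0.6, 0.9, 1);
    applyYawDelta((progress - lastProgress) * totalDeg); // incremental rotation
    lastProgress = progress;
    if (t < 1) requestAnimationFrame(tick);
  };
  requestAnimationFrame(tick);
}
```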
And step S605, projecting the adjusted playing window view field in the surface texture of the three-dimensional model to obtain a texture area.
Step S606, rendering the texture region in the play window.
In this embodiment, the client determines the rendering order of the decoded video frames according to the times of the frames given in the decoding metadata of the panoramic video, and determines in turn the playing window field of view used when each video frame is rendered. The client characterizes a field of view by its view angle (which determines the size of the field of view), its direction (which determines where in the video frame the content within the field of view lies), and so on, as described below.
Referring to fig. 4B, fig. 4B is a schematic view of an alternative field of view of the electronic device displaying the panoramic video. The playing window field of view 46 simulates the area a human eye 47 can view through the playing window displayed on the display of the output device 170; the size of the playing window field of view 46 depends on its horizontal view angle 44 and vertical view angle 45, and the larger the angles, the more video frame content can be viewed within the playing window field of view 46 (for a given magnification of the playing window field of view 46).
For example, referring to fig. 4C, fig. 4C is an alternative projection diagram of the electronic device displaying a panoramic video according to an embodiment of the present application. A virtual lens 48 simulating the human eye 47 of fig. 4B is placed at the center of the spherical model 43; the lens 48 has the same field of view as the playing window, and the projection area of the playing window field of view 46 on the surface texture of the spherical model 43, i.e., texture region 49, is the content that can be viewed in a video frame through the playing window. The horizontal and vertical view angles of the field of view depend on the display in the output device 170 (e.g., the width and height of the display).
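To relate the view angles 44 and 45 to the projection of fig. 4C, the following sketch builds a standard OpenGL-style perspective matrix from a vertical view angle and an aspect ratio; this is common graphics practice rather than code from the patent, and the near/far defaults are assumptions.

```typescript
// Illustrative perspective projection: the vertical view angle (45 in Fig. 4B)
// and the aspect ratio (which fixes the horizontal view angle 44) define the
// frustum through which the virtual lens 48 sees texture region 49.
function perspectiveFromAngles(vFovDeg: number, aspect: number, near = 0.1, far = 100): Float32Array {
  const f = 1 / Math.tan((vFovDeg * Math.PI) / 360); // cot(vFov / 2)
  // A larger view angle gives a smaller f, admitting more of the sphere's
  // surface texture into the playing window, as noted above.
  return new Float32Array([
    f / aspect, 0, 0, 0,
    0, f, 0, 0,
    0, 0, (far + near) / (near - far), -1,
    0, 0, (2 * far * near) / (near - far), 0,
  ]);
}
```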
In this embodiment, the field-of-view direction of the playing window can be changed in multiple ways, so that the user can view video frames at different positions, and the direction rotation event corresponding to the device type is monitored, thereby ensuring compatibility between playback control of the panoramic video and different terminal types.
The embodiments of the present application further provide an apparatus for controlling the field of view of a panoramic video. The modules included in the apparatus, and the sub-modules and units included in the modules, can be implemented by a processor in an electronic device, or of course by specific logic circuits. In implementation, the processor may be a Central Processing Unit (CPU), a Microprocessor (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
Fig. 7 is a schematic structural diagram of a field-of-view control apparatus for panoramic video according to an embodiment of the present application. As shown in fig. 7, the apparatus 700 includes: a first listening module 701, a first adjusting module 702, a first projection module 703, and a first rendering module 704, wherein:
the first monitoring module 701 is configured to monitor a direction rotation event according to the type of the device displaying the panoramic video playing window;
the first adjusting module 702 is configured to adjust a playing window view field corresponding to the playing window according to the direction rotation event;
the first projection module 703 is configured to project the adjusted play window field of view in a surface texture of the three-dimensional model to obtain a texture region; the surface texture is obtained by mapping texture coordinates corresponding to pixel points of a video frame of the panoramic video to vertex coordinates of the surface of the three-dimensional model;
a first rendering module 704, configured to render the texture region in the play window.
It should be noted that the above description of the embodiment of the apparatus, similar to the above description of the embodiment of the method, has similar beneficial effects as the embodiment of the method. For technical details not disclosed in the embodiments of the apparatus of the present application, reference is made to the description of the embodiments of the method of the present application for understanding.
It should be noted that, in the embodiment of the present application, if the above-mentioned field-of-view control method for panoramic video is implemented in the form of a software functional module and is sold or used as a standalone product, it may also be stored in a computer readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions to enable a field-of-view control device (which may be a terminal, a server, etc.) of a panoramic video to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, embodiments of the present application are not limited to any specific combination of hardware and software.
Accordingly, an embodiment of the present application further provides a readable storage medium, where a view field control program of a panoramic video is stored on the readable storage medium, and when executed by a processor, the view field control program of the panoramic video implements the steps of the view field control method of the panoramic video.
An embodiment of the present application further provides an electronic device, which includes a memory and a processor, where the memory stores a computer program that can be executed on the processor, and the processor implements the steps in the method for controlling the field of view of a panoramic video when executing the program.
The above description of the electronic device and storage medium embodiments, similar to the description of the method embodiments above, has similar advantageous effects as the method embodiments. For technical details not disclosed in the embodiments of the field control device and storage medium for panoramic video, please refer to the description of the embodiments of the method of the present application for understanding.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, each unit may serve separately as one unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps for implementing the method embodiments may be performed by hardware related to program instructions. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage medium includes various media that can store program code, such as a removable storage device, a read-only memory (ROM), a magnetic disk, or an optical disk.
Alternatively, if the integrated units described above are implemented in the form of software functional modules and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, or the portions thereof that contribute to the prior art, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or the like) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media that can store program code, such as a removable storage device, a ROM, a magnetic disk, or an optical disk.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any changes or substitutions that a person skilled in the art could readily conceive within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A method for controlling a field of view of a panoramic video, comprising:
monitoring a direction rotation event corresponding to a device type according to the device type of the device displaying the panoramic video playing window, wherein the device type comprises a touch terminal type and an entity-controlled terminal type;
determining a direction rotation parameter corresponding to the playing window according to the direction rotation event, wherein the direction rotation event comprises a facial action and an eye action performed toward a camera;
adjusting the direction of the view field of the playing window according to the direction rotation parameter;
projecting the adjusted playing window view field in the surface texture of the three-dimensional model to obtain a texture region, wherein the surface texture is obtained by mapping texture coordinates corresponding to pixel points of a video frame of the panoramic video to vertex coordinates of the surface of the three-dimensional model; and
rendering the texture region in the playing window.
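For illustration only, the flow recited in claim 1 can be sketched in Python. The class names (`FieldOfViewController`, `RotationEvent`, `SurfaceTexture`), the equirectangular-panorama-on-a-sphere assumption, and the 90-degree view field are assumptions made for this sketch and are not recited in the patent.

```python
class RotationEvent:
    """A direction rotation event expressed as view-angle deltas."""
    def __init__(self, d_yaw: float, d_pitch: float):
        self.d_yaw = d_yaw      # horizontal rotation delta, degrees
        self.d_pitch = d_pitch  # vertical rotation delta, degrees

class SurfaceTexture:
    """Stand-in for the three-dimensional model's surface texture (claim 2)."""
    def region_for(self, yaw: float, pitch: float, fov: float = 90.0):
        # Return the texture region covered by the projected view field,
        # as (yaw_min, pitch_min, yaw_max, pitch_max) in degrees.
        return (yaw - fov / 2, pitch - fov / 2, yaw + fov / 2, pitch + fov / 2)

class FieldOfViewController:
    def __init__(self, device_type: str, texture: SurfaceTexture):
        self.device_type = device_type  # "touch" or "entity_controlled"
        self.texture = texture
        self.yaw, self.pitch = 0.0, 0.0

    def on_rotation_event(self, event: RotationEvent) -> None:
        # Determine the direction rotation parameters from the event and
        # adjust the direction of the playing window's view field.
        self.yaw = (self.yaw + event.d_yaw) % 360.0
        self.pitch = max(-90.0, min(90.0, self.pitch + event.d_pitch))

    def visible_texture_region(self):
        # Project the adjusted view field onto the surface texture; the
        # resulting region is what gets rendered in the playing window.
        return self.texture.region_for(self.yaw, self.pitch)

ctrl = FieldOfViewController("touch", SurfaceTexture())
ctrl.on_rotation_event(RotationEvent(30.0, -10.0))
print(ctrl.visible_texture_region())  # (-15.0, -55.0, 75.0, 35.0)
```

Wrapping yaw modulo 360 degrees while clamping pitch to plus or minus 90 degrees is a common design choice for spherical panoramas, since it prevents the view from flipping over the poles.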
2. The method of claim 1, wherein, before the monitoring a direction rotation event corresponding to a device type according to the device type of the device displaying the panoramic video playing window, the method further comprises:
mapping pixel points of video frames of the panoramic video to texture space to obtain texture coordinates of the pixel points; and
mapping the texture coordinates of the pixel points to the vertex coordinates of the surface of the three-dimensional model to form the surface texture.
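Claim 2's two-stage mapping can be illustrated for an equirectangular frame projected onto a unit sphere; the equirectangular layout and the function names below are assumptions, as the claim only recites pixel-to-texture-space and texture-to-vertex mappings in general terms.

```python
import math

def pixel_to_texcoord(x: int, y: int, width: int, height: int):
    """Normalize pixel coordinates of a video frame into [0, 1] texture space."""
    u = (x + 0.5) / width
    v = (y + 0.5) / height
    return u, v

def texcoord_to_sphere_vertex(u: float, v: float):
    """Map texture coordinates onto a unit sphere via longitude/latitude."""
    lon = (u - 0.5) * 2.0 * math.pi   # -pi .. pi
    lat = (0.5 - v) * math.pi         # -pi/2 .. pi/2
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return x, y, z

# A pixel near the center of a 3840x1080 frame lands near the sphere's "front".
u, v = pixel_to_texcoord(1920, 540, 3840, 1080)
print(texcoord_to_sphere_vertex(u, v))
```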
3. The method of claim 1, wherein the monitoring a direction rotation event corresponding to a device type according to the device type of the device displaying the panoramic video playing window comprises:
when the device type is the touch terminal type, monitoring a direction rotation event triggered by a touch operation; and
when the device type is the entity-controlled terminal type, monitoring a direction rotation event triggered by a physical control operation.
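A minimal sketch of this per-device-type monitoring follows; the registration hooks are stubs standing in for platform gesture and key APIs, and the pixel-to-degree gain and per-key step angles are assumed values.

```python
_handlers = {}

def register_touch_handler(fn):  # stub for a platform gesture API
    _handlers["touch"] = fn

def register_key_handler(fn):    # stub for a platform key-event API
    _handlers["key"] = fn

def attach_rotation_listener(device_type: str, on_rotate) -> None:
    if device_type == "touch":
        # Touch terminal: a drag gesture becomes a direction rotation event.
        def on_drag(dx_px: float, dy_px: float):
            on_rotate(dx_px * 0.1, dy_px * 0.1)  # pixels -> degrees (assumed gain)
        register_touch_handler(on_drag)
    elif device_type == "entity_controlled":
        # Entity-controlled terminal (e.g., a TV with a remote): a physical
        # key press becomes a direction rotation event.
        steps = {"LEFT": (-5, 0), "RIGHT": (5, 0), "UP": (0, 5), "DOWN": (0, -5)}
        def on_key(key: str):
            if key in steps:
                on_rotate(*steps[key])
        register_key_handler(on_key)

attach_rotation_listener("entity_controlled", lambda dy, dp: print("rotate", dy, dp))
_handlers["key"]("RIGHT")  # simulated remote key press -> "rotate 5 0"
```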
4. The method according to claim 1, wherein determining the direction rotation parameter corresponding to the playing window according to the direction rotation event comprises:
when the direction rotation event comprises at least two direction rotation events, responding to the at least two direction rotation events respectively according to preset priorities of the direction rotation events, and determining the direction rotation parameters corresponding to the playing window.
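The priority-ordered response of claim 4 might look like the following sketch; the event sources and the priority table are illustrative assumptions.

```python
# Lower number = higher priority; values here are assumptions for illustration.
PRIORITY = {"touch_drag": 0, "remote_key": 1, "face_action": 2, "eye_action": 3}

def resolve_rotation(events):
    """Accumulate (yaw, pitch) deltas, responding to higher-priority events first."""
    total_yaw = total_pitch = 0.0
    for event in sorted(events, key=lambda e: PRIORITY.get(e["source"], 99)):
        total_yaw += event["d_yaw"]
        total_pitch += event["d_pitch"]
    return total_yaw, total_pitch

print(resolve_rotation([
    {"source": "face_action", "d_yaw": 2.0, "d_pitch": 0.0},
    {"source": "touch_drag", "d_yaw": -5.0, "d_pitch": 1.0},
]))  # (-3.0, 1.0): the touch drag is applied first, then the face action
```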
5. The method of claim 1, wherein the adjusting the direction of the field of view of the playing window according to the direction rotation parameter comprises:
adjusting the playing window view field corresponding to the playing window according to a preset direction rotation angle; or,
adjusting the playing window view field corresponding to the playing window according to a preset direction rotation speed.
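The two adjustment modes of claim 5 can be sketched as follows: a fixed preset angle applied per discrete event, or a preset angular speed integrated over the time the input is held. The numeric presets are placeholders.

```python
PRESET_ANGLE_DEG = 5.0     # per discrete event (e.g., one remote key press)
PRESET_SPEED_DEG_S = 30.0  # for continuous input (e.g., a held key or drag)

def adjust_by_angle(yaw: float, direction: int) -> float:
    """Rotate by a preset angle; direction is +1 (right) or -1 (left)."""
    return (yaw + direction * PRESET_ANGLE_DEG) % 360.0

def adjust_by_speed(yaw: float, direction: int, dt_seconds: float) -> float:
    """Rotate at a preset angular speed for the time the input was held."""
    return (yaw + direction * PRESET_SPEED_DEG_S * dt_seconds) % 360.0

print(adjust_by_angle(0.0, +1))        # 5.0
print(adjust_by_speed(0.0, +1, 0.25))  # 7.5
```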
6. The method of claim 1, further comprising:
adjusting, according to the direction rotation event, the position of the video frame region currently displayed in the playing window within the video frame of the panoramic video.
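Claim 6's repositioning can be viewed as panning a viewport rectangle over the full panoramic frame; the frame and window dimensions below are assumed for illustration.

```python
FRAME_W, FRAME_H = 3840, 1080  # full equirectangular panoramic frame (assumed)
WIN_W, WIN_H = 960, 540        # visible viewport in the playing window (assumed)

def pan_viewport(x: int, y: int, d_yaw_deg: float, d_pitch_deg: float):
    """Translate the viewport origin according to a direction rotation event."""
    x = int(x + d_yaw_deg / 360.0 * FRAME_W) % FRAME_W  # yaw wraps around
    y = y - int(d_pitch_deg / 180.0 * FRAME_H)          # pitch is clamped
    y = max(0, min(FRAME_H - WIN_H, y))
    return x, y

print(pan_viewport(0, 270, 45.0, 0.0))  # (480, 270): a 45-degree yaw pans right
```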
7. An apparatus for controlling a field of view of a panoramic video, the apparatus comprising: a first monitoring module, a first adjusting module, a first projection module, and a first rendering module, wherein:
the first monitoring module is configured to monitor a direction rotation event corresponding to a device type according to the device type of the device displaying the panoramic video playing window, wherein the device type comprises a touch terminal type and an entity-controlled terminal type;
the first adjusting module is configured to determine a direction rotation parameter corresponding to the playing window according to the direction rotation event, wherein the direction rotation event comprises a facial action and an eye action performed toward a camera, and to adjust the direction of the view field of the playing window according to the direction rotation parameter;
the first projection module is configured to project the adjusted playing window view field in the surface texture of the three-dimensional model to obtain a texture region, wherein the surface texture is obtained by mapping texture coordinates corresponding to pixel points of a video frame of the panoramic video to vertex coordinates of the surface of the three-dimensional model; and
the first rendering module is configured to render the texture region in the playing window.
8. An electronic device, comprising:
a memory for storing executable instructions;
a processor for implementing the method of controlling the field of view of a panoramic video according to any one of claims 1 to 6 when executing said executable instructions.
9. A computer-readable storage medium having executable instructions stored thereon, wherein the executable instructions, when executed, implement the method of controlling a field of view of a panoramic video according to any one of claims 1 to 6.
CN201811369535.0A 2018-11-16 2018-11-16 Method, device and equipment for controlling field of view of panoramic video and storage medium Active CN111198610B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811369535.0A CN111198610B (en) 2018-11-16 2018-11-16 Method, device and equipment for controlling field of view of panoramic video and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811369535.0A CN111198610B (en) 2018-11-16 2018-11-16 Method, device and equipment for controlling field of view of panoramic video and storage medium

Publications (2)

Publication Number Publication Date
CN111198610A CN111198610A (en) 2020-05-26
CN111198610B (en) 2021-08-10

Family

ID=70746071

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811369535.0A Active CN111198610B (en) 2018-11-16 2018-11-16 Method, device and equipment for controlling field of view of panoramic video and storage medium

Country Status (1)

Country Link
CN (1) CN111198610B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113298599B (en) * 2020-09-18 2022-08-02 阿里巴巴集团控股有限公司 Object display method, device and equipment
CN112308757B (en) * 2020-10-19 2024-03-22 武汉中科通达高新技术股份有限公司 Data display method and mobile terminal
CN112465939B (en) * 2020-11-25 2023-01-24 上海哔哩哔哩科技有限公司 Panoramic video rendering method and system
CN114154082B (en) * 2021-11-29 2023-07-18 上海烜翊科技有限公司 Offline data-driven visual demonstration method based on lens scheme design
CN115396740B (en) * 2022-07-29 2023-11-21 北京势也网络技术有限公司 Panoramic video playing method and device, electronic equipment and readable storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101312526A (en) * 2008-06-26 2008-11-26 天津市亚安科技电子有限公司 Full-view cooperative video monitoring apparatus and full-view image splicing method
WO2009003169A2 (en) * 2007-06-27 2008-12-31 University Of Florida Research Foundation, Inc. Display-based interactive simulation with dynamic panorama
CN104503570A (en) * 2014-12-10 2015-04-08 北京诺亚星云科技有限责任公司 Panorama roaming-based user behavior data processing system and equipment
CN105245838A (en) * 2015-09-29 2016-01-13 成都虚拟世界科技有限公司 Panoramic video playing method and player
CN105939482A (en) * 2015-03-05 2016-09-14 诺基亚技术有限公司 Video streaming transmission method
CN106385533A (en) * 2016-09-08 2017-02-08 三星电子(中国)研发中心 Panorama video control method and system
CN107659851A (en) * 2017-03-28 2018-02-02 腾讯科技(北京)有限公司 The displaying control method and device of panoramic picture

Also Published As

Publication number Publication date
CN111198610A (en) 2020-05-26

Similar Documents

Publication Publication Date Title
CN111198610B (en) Method, device and equipment for controlling field of view of panoramic video and storage medium
EP3606082B1 (en) Panoramic video playback method and client terminal
WO2017193576A1 (en) Video resolution adaptation method and apparatus, and virtual reality terminal
CN110419224B (en) Method for consuming video content, electronic device and server
EP3691280B1 (en) Video transmission method, server, vr playback terminal and computer-readable storage medium
EP4092616A1 (en) Interaction method and apparatus, and electronic device and computer-readable storage medium
TW201702807A (en) Method and device for processing a part of an immersive video content according to the position of reference parts
CN106598514B (en) Method and system for switching virtual reality mode in terminal equipment
US20200401362A1 (en) Screen sharing for display in vr
WO2022170958A1 (en) Augmented reality-based display method and device, storage medium, and program product
CN107295393B (en) method and device for displaying additional media in media playing, computing equipment and computer-readable storage medium
CN112672185B (en) Augmented reality-based display method, device, equipment and storage medium
CN114419213A (en) Image processing method, device, equipment and storage medium
CN114581566A (en) Animation special effect generation method, device, equipment and medium
KR20180059210A (en) Image processing apparatus and method for image processing thereof
CN111200750B (en) Multi-window playing method and device of panoramic video, electronic equipment and storage medium
CN111352560A (en) Screen splitting method and device, electronic equipment and computer readable storage medium
CN111200758B (en) Multi-view-field control method and device for panoramic video, electronic equipment and storage medium
CN111667313A (en) Advertisement display method and device, client device and storage medium
EP4071725A1 (en) Augmented reality-based display method and device, storage medium, and program product
JP7447266B2 (en) View encoding and decoding for volumetric image data
CN111726666A (en) Video display control method and device
KR20200028069A (en) Image processing method and apparatus of tile images
CN111277886B (en) Panoramic video view field control method and device, electronic equipment and storage medium
CN111200754B (en) Panoramic video playing method and device, terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder
Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.
Patentee after: Tiktok vision (Beijing) Co.,Ltd.
Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.
Patentee before: BEIJING BYTEDANCE NETWORK TECHNOLOGY Co.,Ltd.
CP01 Change in the name or title of a patent holder
Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.
Patentee after: Douyin Vision Co.,Ltd.
Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.
Patentee before: Tiktok vision (Beijing) Co.,Ltd.