CN114222185A - Video playing method, terminal equipment and storage medium - Google Patents
Video playing method, terminal equipment and storage medium
- Publication number
- CN114222185A (Application No. CN202111510114.7A)
- Authority
- CN
- China
- Prior art keywords
- video
- rendering
- texture map
- operating system
- frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/443—OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Software Systems (AREA)
- Image Generation (AREA)
Abstract
Embodiments of the invention provide a video playing method, a terminal device and a storage medium. The method includes: acquiring video data to be played; determining the type of operating system adopted by the terminal device, where the video player supports different types of operating systems; and generating, by a rendering method corresponding to the operating system type, a texture map corresponding to the video data in the video memory space of the terminal device, so that a rendering control in the terminal device performs video rendering based on the texture map. Because texture maps in the video memory space can be applied directly to rendering controls in the terminal device, the method enables terminal devices running different operating systems or application engines to implement the video playing function, reduces the difficulty of developing and maintaining that function, expands its application range, and ensures its compatibility.
Description
Technical Field
The present invention relates to the field of image processing, and in particular, to a video playing method, a terminal device, and a storage medium.
Background
As a vivid and direct way to present content, video is becoming one of the main channels of information dissemination. Therefore, the video playing requirement needs to be considered in the development of many application programs.
In the related art, the video playing requirements arising during application development must be supported across various types of operating systems and various application engines. Taking a video player as an example, if the player needs to run on multiple types of operating systems or to call multiple application engines, a corresponding video player must be developed and maintained separately for each operating system or each application engine. Clearly, video players in the related art suffer compatibility problems across different operating systems and different application engines.
Therefore, a video playing scheme is needed that solves the cross-platform compatibility problem of video players.
Disclosure of Invention
Embodiments of the invention provide a video playing method, a terminal device and a storage medium, which implement the video playing function across platforms and operating systems, expand the application range of the video playing scheme and ensure its compatibility.
In a first aspect, an embodiment of the present invention provides a video playing method, where the method is applied to a video player installed in a terminal device, and the method includes:
acquiring video data to be played;
determining the type of an operating system adopted by the terminal equipment, wherein the video player supports different types of operating systems;
and generating a texture map corresponding to the video data in a video memory space of the terminal equipment by a rendering method corresponding to the type of the operating system, so that a rendering control in the terminal equipment performs video rendering based on the texture map.
In a second aspect, an embodiment of the present invention provides a video playing apparatus, where the apparatus is applied to a video player installed in a terminal device, and the apparatus includes:
the acquisition module is used for acquiring video data to be played;
the determining module is used for determining the type of the operating system adopted by the terminal equipment, and the video player supports different types of operating systems;
and the rendering module is used for generating a texture map corresponding to the video data in a video memory space of the terminal equipment through a rendering method corresponding to the type of the operating system so as to enable a rendering control in the terminal equipment to perform video rendering based on the texture map.
In a third aspect, an embodiment of the present invention provides an electronic device, including: a memory, a processor, a communication interface; wherein the memory has stored thereon executable code which, when executed by the processor, causes the processor to implement at least the video playback method as described in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a non-transitory machine-readable storage medium, on which executable code is stored, and when the executable code is executed by a processor of an electronic device, the processor is enabled to implement at least the video playing method according to the first aspect.
In the scheme provided by the embodiments of the invention, video data to be played is acquired; the type of operating system adopted by the terminal device is determined, where the video player supports different types of operating systems; and a texture map corresponding to the video data is generated in the video memory space of the terminal device by a rendering method corresponding to the operating system type, so that a rendering control in the terminal device performs video rendering based on the texture map. Because the texture map in the video memory space can be applied directly to the rendering control in the terminal device, the scheme enables terminal devices running different operating systems or application engines to implement the video playing function, reduces the difficulty of developing and maintaining that function, expands its application range, and ensures its compatibility.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description show some embodiments of the present invention, and that other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
Fig. 1 is a flowchart of a video playing method according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of a video data processing flow in an android system according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of a video data processing flow in an iOS system according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of a texture map rendering process in an android system according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a texture map rendering process in an iOS system according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of a texture map rendering process in a Cocos engine according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of a texture map rendering process in a Unity engine according to an embodiment of the present invention;
Fig. 8 is a schematic structural diagram of a video playing apparatus according to an embodiment of the present invention;
Fig. 9 is a schematic structural diagram of an electronic device corresponding to the video playing apparatus provided in the embodiment shown in Fig. 8.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. It is obvious that the described embodiments are some, but not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without creative effort shall fall within the protection scope of the present invention.
In addition, the sequence of steps in each method embodiment described below is only an example and is not strictly limited.
The video playing method provided by the embodiments of the invention can be executed by an electronic device, which may be a terminal device with data processing capability, such as a PC, a notebook computer or a smartphone. Alternatively, the method may be implemented by a server connected to the terminal device, with the result transmitted to the terminal device for display. The server may be a physical server comprising an independent host, a virtual server, a cloud server or a server cluster.
With the development of intelligent devices and related software, the video playing function is used in many interactive scenarios, for example: communicating through video conference software during a video conference; explaining subject knowledge by video sharing in intelligent teaching; displaying commodities through video in shopping software; and, in intelligent security software, displaying the real-time state of a monitored object (such as a room or an elderly person living alone) through video. Therefore, the video playing requirement needs to be considered in the development of many application programs.
However, in the related art, such video playing requirements must be supported across various types of operating systems and various application engines. Taking a video player as an example, if the player needs to run on multiple types of operating systems or to call multiple application engines, a corresponding video player must be developed and maintained separately for each operating system or each application engine. Clearly, video players in the related art suffer compatibility problems across different operating systems and different application engines.
In addition, the video players (or controls) used by the various operating systems and application engines in the related art have the following problems. The video playing control (VideoView) in the Android operating system does not support nested display with native View components, and if the application is switched to the background during playing, the switch causes resource release problems and a black screen appears. The video playing control of iOS (e.g., AVPlayer) may show a black screen or green screen when playing certain videos. The Cocos engine usually uses the system's own video player, so the video frame cannot be nested with engine-generated content and can only be placed above or below the engine content layer. The video player used by the Unity engine loads slowly, which harms user experience. In addition, these players and controls support few video formats and cannot provide functions such as the buffering rate.
Therefore, given multiple operating systems and multiple engines, how to implement the video playing function across operating systems and across engines on different types of devices has become a technical problem to be solved urgently.
An embodiment of the present invention provides a video playing method, as shown in Fig. 1, which is a flowchart of a video playing method according to an embodiment of the present invention. The method may include the following steps:
101. Acquire video data to be played.
102. Determine the type of operating system adopted by the terminal device, where the video player supports different types of operating systems.
103. Generate, by a rendering method corresponding to the operating system type, a texture map corresponding to the video data in the video memory space of the terminal device, so that a rendering control in the terminal device performs video rendering based on the texture map.
In this embodiment, the method is applied to a video player installed in a terminal device. The video player may be implemented as independent application software, or as a control with a video playing function nested in other application software. In either form, the essence is to provide developers with a cross-platform video playing function, thereby lowering the threshold for developing and maintaining that function.
In 101, video data to be played is acquired.
Specifically, the video data carries the image content to be presented. The video data may be pre-produced, such as a pre-recorded instructional video of a teacher in an intelligent education scenario, or captured in real time, such as a video stream captured live by a user in a short-video production scenario. In practical applications, the format and acquisition mode of the video data can be customized to the requirements of the application software in which it is used.
In addition, to further improve the compatibility of the video playing function, this embodiment does not limit the format of the video data. For example, the video data format may be Moving Picture Experts Group (MPEG), Audio Video Interleaved (AVI), QuickTime Movie (MOV), Windows Media Video (WMV), RealMedia (RM), or the like.
In 102, the type of operating system adopted by the terminal device is determined. In an alternative embodiment, during installation of the video player, the configuration attributes of the terminal device may be read to determine the operating system type. In another alternative embodiment, an application download tool reads the configuration attributes of the terminal device, determines the operating system type from the configuration data, and downloads and installs the corresponding version of the video player. Of course, this step may also be implemented in other ways, which this embodiment does not limit.
In this embodiment, the video player supports different types of operating systems. Here, the operating system refers to the native platform used by the terminal device, i.e., an operating system installed on the terminal device, such as the iOS operating system or the Android operating system.
In 103, a texture map corresponding to the video data is generated in the video memory space of the terminal device by a rendering method corresponding to the operating system type, so that a rendering control in the terminal device performs video rendering based on the texture map.
In this embodiment, the rendering control includes a native rendering control corresponding to the operating system type, or an application engine connected to the terminal device. In practical applications, native rendering controls corresponding to the operating system type include, for example, the SurfaceView control native to Android and the UIView control native to iOS; application engines used by the terminal device include, for example, the Unity engine and the Cocos engine. In the terminal device, applications and underlying resources of the native platform or application engine may be invoked. The interfaces, calling methods and rendering methods available in each operating system or application engine differ, as described in the following embodiments.
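As a structural illustration only (not the patent's own code), the following minimal Java sketch shows how a single player facade might dispatch to a per-operating-system texture producer; all names here (TextureProducer, CrossPlatformPlayer, the stub methods) are hypothetical:

```java
// Hypothetical sketch of the dispatch described above: one player facade,
// one texture-producing path per operating system type.
interface TextureProducer {
    // Decode the next frame and return the id of a texture map in video memory.
    int produceTexture(byte[] frameData);
}

final class CrossPlatformPlayer {
    private final TextureProducer producer;

    CrossPlatformPlayer(String osType) {
        // Select the rendering method corresponding to the operating system type.
        if ("android".equals(osType)) {
            producer = this::renderViaSurfaceTexture;   // android path (Fig. 2)
        } else if ("ios".equals(osType)) {
            producer = this::renderViaPixelBuffer;      // iOS path (Fig. 3)
        } else {
            throw new IllegalArgumentException("unsupported OS: " + osType);
        }
    }

    int playFrame(byte[] frameData) {
        // The returned texture id is handed to the rendering control (native
        // view or application engine), which reads it straight from video memory.
        return producer.produceTexture(frameData);
    }

    private int renderViaSurfaceTexture(byte[] data) { return 0; } // stub
    private int renderViaPixelBuffer(byte[] data)    { return 0; } // stub
}
```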
In 103, optionally, the step in which a rendering control in the terminal device performs video rendering based on the texture map may be implemented as: transmitting the identifier corresponding to the texture map to the rendering control, so that the rendering control reads the texture map from the video memory space according to the identifier and performs video rendering. The identifier may be an identifier within the Graphics Processing Unit (GPU); in this way, texture map processing and transfer are completed inside the GPU, avoiding the extra rendering complexity of data conversion between GPU and CPU and further improving playing performance. The hand-over of the texture map can thus be completed directly in the video memory space through the identifier, improving the transfer efficiency from the current processing module to the rendering control.
Optionally, the identifier corresponding to the texture map includes, but is not limited to, a handle identifying the texture map in the video memory space. A handle is a special kind of smart pointer used when an application needs to reference a memory region or object managed by another system (e.g., the operating system). In short, the handle in this embodiment may be understood as an identifier of the texture map that describes its storage location in the video memory space.
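On the android side, for instance, such an identifier can simply be the integer texture name that OpenGL ES assigns when the texture map is generated. A minimal sketch follows, assuming an active GL context on the calling thread; the class name is hypothetical, while the GLES calls are standard Android APIs:

```java
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

final class TextureHandle {
    // Generates a texture in video memory and returns its GL texture name.
    // That integer is the "identifier" handed to the rendering control, which
    // can bind and sample the texture without any GPU-to-CPU copy.
    static int createOesTexture() {
        int[] ids = new int[1];
        GLES20.glGenTextures(1, ids, 0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, ids[0]);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        return ids[0];
    }
}
```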
In the embodiment, the texture map corresponding to the video data is generated in the video memory space of the terminal device by the rendering method corresponding to the type of the operating system, and the texture map in the video memory space can be directly applied to the rendering control in the terminal device, so that the terminal devices of different operating systems or application engines can realize the video playing function, the development and maintenance difficulty of the video playing function is reduced, the application range of the video playing function is expanded, and the compatibility of the video playing function is ensured.
In the foregoing or following embodiments, in 103, generating the texture map corresponding to the video data in the video memory space of the terminal device by a rendering method corresponding to the operating system type may optionally include: parsing each frame of video image from the video data; and generating the texture map corresponding to each frame of video image in the video memory space by a video parsing method corresponding to the operating system type.
That is, the video data is first parsed frame by frame to obtain each frame of video image. The texture map generation procedure is then implemented differently for terminal devices using different operating systems; it is described below by operating system type.
For an android device, whether implemented as a mobile phone, wearable device, tablet computer or computer, the operating system type is the android system. Based on this, the above step of generating the texture map corresponding to each frame of video image in the video memory space by a video parsing method corresponding to the operating system type may be implemented as follows:
If the operating system type is android, an initial texture map is created in the video memory space, and a surface texture (SurfaceTexture) and a memory area (Surface) corresponding to the surface texture are created based on the initial texture map; the Surface is set into the android native player of the android operating system; in the android native player, each frame of video image is called back through the frame-available (onFrameAvailable) method of the SurfaceTexture; and each frame of video image is associated with the corresponding initial texture map through the map update (updateTexImage) method of the SurfaceTexture, to obtain the texture map corresponding to each frame of video image.
Specifically, taking the video data processing scenario shown in Fig. 2 as an example: after the video player decodes the video data into individual frames, in the android system of the terminal device an initial texture map (Texture2D) is created through the native video playing control or the currently installed application engine, a SurfaceTexture is then created from the Texture2D, and a Surface is created from the SurfaceTexture and set into the video player.
Further, each frame of video image is called back on a first thread through the onFrameAvailable method of the SurfaceTexture, and on a second thread each frame is bound into the corresponding texture map in the video memory space through the updateTexImage method, yielding the texture map for each frame. Binding here means pointing the identifier of the texture map (i.e., the texture id) at the video data decoded in the preceding step, thereby indicating which data area in the video memory space the video corresponding to that texture id occupies. This unifies the callback threads, lets the bound texture map point at the corresponding video data, and provides the basis for invocation in the subsequent rendering process. Finally, the Texture2D may be passed to the application engine or native player control for video rendering.
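A condensed Java sketch of this android pipeline follows. The SurfaceTexture/Surface calls are the standard Android APIs named above; the use of the platform MediaPlayer as the "android native player" is an assumption for illustration:

```java
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.view.Surface;

final class AndroidTexturePipeline implements SurfaceTexture.OnFrameAvailableListener {
    private final SurfaceTexture surfaceTexture;
    private final Surface surface;

    AndroidTexturePipeline(int textureId, MediaPlayer player) {
        // 1. Wrap the initial texture map (already created in video memory)
        //    in a SurfaceTexture, and derive a Surface from it.
        surfaceTexture = new SurfaceTexture(textureId);
        surfaceTexture.setOnFrameAvailableListener(this);
        surface = new Surface(surfaceTexture);
        // 2. Hand the Surface to the native player; decoded frames now land
        //    in the SurfaceTexture's buffer queue.
        player.setSurface(surface);
    }

    // 3. Called back once per decoded frame (the first thread).
    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // Signal the GL thread that a new frame can be consumed.
    }

    // 4. On the GL thread (the second thread), latch the newest frame into
    //    the texture map so that the texture id now points at this frame.
    void updateOnGlThread() {
        surfaceTexture.updateTexImage();
    }
}
```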
For an iOS device, whether implemented as a mobile phone, wearable device, tablet computer or computer, the operating system type is the iOS system. Based on this, the above step of generating the texture map corresponding to each frame of video image in the video memory space by a video parsing method corresponding to the operating system type can be implemented in two ways, as shown in Fig. 3. The first way is as follows:
if the operating system type is iOS, processing each frame of video image in an iOS native player in the iOS operating system in a hardware decoding mode to obtain hard decoding video data corresponding to each frame of video image; and creating a texture map corresponding to each frame of video Image based on the hard solution video data corresponding to each frame of video Image through a hard solution drawing method (CVOpenGLESTextureCacheCreateTextureFrom Image).
Specifically, taking the video data processing scenario shown in fig. 3 as an example, after the video player decodes the video data to obtain each frame of video image, in the iOS system of the terminal device, each frame of video image is called back and forth through a native video playing control or a currently-loaded application engine, and displayixels is adopted to carry each frame of video image. And then, processing each frame of video image in a hardware decoding mode to obtain the hard-decoded video data corresponding to each frame of video image.
Further, in fig. 3, the hard-decoded video data is CVPixelBufferRef, and based on this, a corresponding Texture map (Texture2D) is created for the hard-decoded video data corresponding to each frame of video image by using the method of cvopenglestexturecacerecreatetexturefrommemage. Finally, the Texture2D may be passed to an application engine or native player control for video rendering.
The second way is specifically as follows: if the operating system type is iOS, each frame of video image is processed in the iOS native player by software decoding to obtain texture data corresponding to each frame, and the texture of each dimension in that texture data is drawn into a texture map in the video memory space to obtain the texture map corresponding to each frame of video image.
Specifically, in Fig. 3, the software-decoding case differs mainly in that pixels are obtained by software decoding as the texture data (such as YUV texture data) corresponding to each frame of video image. The texture data of the three dimensions is bound to three texture maps. An initial Texture2D is then created in the video memory space, and the three texture maps are drawn into it through a texture drawing control (such as a shader) of the native platform, yielding the Texture2D corresponding to each frame of video image. CVPixelBufferRef here is the pixel buffer type of the iOS system.
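The plane-combining step itself is plain OpenGL and platform-neutral; for concreteness, a fragment-shader sketch is shown below as a Java string in Android GLES notation. The BT.601 full-range conversion coefficients are an assumption; a real player must match the stream's actual color space and range:

```java
// Platform-neutral YUV-to-RGB plane combination, written in the GLES shading
// language and carried as a Java string for concreteness. The three plane
// textures (Y, U, V) are assumed to be bound as single-channel textures.
final class YuvShader {
    static final String FRAGMENT_SHADER =
            "precision mediump float;\n"
          + "varying vec2 vTexCoord;\n"
          + "uniform sampler2D yPlane;\n"
          + "uniform sampler2D uPlane;\n"
          + "uniform sampler2D vPlane;\n"
          + "void main() {\n"
          + "  float y = texture2D(yPlane, vTexCoord).r;\n"
          + "  float u = texture2D(uPlane, vTexCoord).r - 0.5;\n"
          + "  float v = texture2D(vPlane, vTexCoord).r - 0.5;\n"
             // BT.601 full-range coefficients -- an assumption; the real
             // conversion must match the stream's color space and range.
          + "  gl_FragColor = vec4(y + 1.402 * v,\n"
          + "                      y - 0.344 * u - 0.714 * v,\n"
          + "                      y + 1.772 * u,\n"
          + "                      1.0);\n"
          + "}\n";
}
```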
In practical applications, the application engine that finally receives the texture map may be the Cocos engine or Unity engine described above, and the native player control may be a customized video playing control in the android system (such as the IHVideoView shown in Fig. 2) or in the iOS system (such as the IHVideoView shown in Fig. 3). Other applications with a video rendering function are of course also possible, and no limitation is made here.
In the foregoing or following embodiments, in 103, the rendering control in the terminal device optionally performs video rendering based on the texture map as follows: the texture map is rendered through the application engine or native player control used by the terminal device to obtain the video image for playing.
The texture map rendering step is implemented differently for terminal devices using different operating systems or application engines; it is described below by operating system type and application engine type.
For an android device, whether implemented as a mobile phone, wearable device, tablet computer or computer, the operating system type is the android system. Taking the texture map rendering scenario shown in Fig. 4 as an example, in the android system the EGL module may create a context parameter (EGLContext), a surface parameter (EGLSurface) and the related configuration parameters from the Surface of the local window system (SurfaceView) or the SurfaceTexture of the texture window system (TextureView). EGL is the intermediate interface layer between Open Graphics Library (OpenGL) rendering and the local window system (SurfaceView in the android system, etc.), and can be used to render vector graphics across languages and platforms. Further, the GLThread module takes the received Texture2D (i.e., the texture map) and draws it onto the EGLSurface in the draw-frame (DrawFrame) step, thereby rendering the Texture2D onto the terminal device screen for display.
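A condensed Java sketch of that EGL setup follows (android EGL14 API; config selection and error handling are simplified, and the single-config choice is an assumption):

```java
import android.opengl.EGL14;
import android.opengl.EGLConfig;
import android.opengl.EGLContext;
import android.opengl.EGLDisplay;
import android.opengl.EGLSurface;
import android.view.Surface;

final class EglSetup {
    // Builds the EGLContext/EGLSurface pair for a window Surface coming from
    // SurfaceView (or from a SurfaceTexture provided by TextureView).
    static EGLSurface bind(Surface windowSurface) {
        EGLDisplay display = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
        int[] version = new int[2];
        EGL14.eglInitialize(display, version, 0, version, 1);

        int[] attribs = {
                EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                EGL14.EGL_NONE
        };
        EGLConfig[] configs = new EGLConfig[1];
        int[] numConfigs = new int[1];
        EGL14.eglChooseConfig(display, attribs, 0, configs, 0, 1, numConfigs, 0);

        int[] ctxAttribs = { EGL14.EGL_CONTEXT_CLIENT_VERSION, 2, EGL14.EGL_NONE };
        EGLContext context = EGL14.eglCreateContext(
                display, configs[0], EGL14.EGL_NO_CONTEXT, ctxAttribs, 0);

        EGLSurface eglSurface = EGL14.eglCreateWindowSurface(
                display, configs[0], windowSurface, new int[]{ EGL14.EGL_NONE }, 0);

        // Make the context current so the GLThread can draw the Texture2D,
        // then swap buffers on each DrawFrame to put the frame on screen.
        EGL14.eglMakeCurrent(display, eglSurface, eglSurface, context);
        return eglSurface;
    }
}
```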
For an iOS device, whether implemented as a mobile phone, wearable device, tablet computer or computer, the operating system type is the iOS system. Taking the texture map rendering scenario shown in Fig. 5 as an example, rendering the Texture2D (i.e., the texture map) can likewise be implemented in two ways in the iOS system. In the first, if the Texture2D was generated through the Metal interface, it can be rendered directly through MTKView.
In the second, if the Texture2D was generated through the OpenGLES interface, it must first be converted into CIImage data (an image type built on top of OpenGL and used for shader processing of images), the CIImage data is then converted into UIImage data (a basic image type), and finally the UIImage is rendered through UIImageView (a basic control of the iOS system).
In practical applications, the first rendering mode is more efficient, while the second can implement the video rendering process when the video player is nested within other applications. The appropriate iOS rendering mode can therefore be chosen according to the actual application requirements.
For a terminal device using the Cocos engine, taking the texture map rendering scenario shown in Fig. 6 as an example, the Cocos engine uses the draw method of a Node to call the OpenGL interface and draw the received Texture2D (i.e., the texture map), thereby rendering the texture onto the terminal device screen for display.
For a terminal device using the Unity engine, taking the texture map rendering scenario shown in Fig. 7 as an example, the Unity engine creates from the received Texture2D a Texture2D supported by Unity, for example through the CreateExternalTexture method of Texture2D, and finally sets it into a RawImage (an image control for displaying non-interactive images such as decorations or icons), so that the RawImage renders the Texture2D onto the terminal device screen for display.
According to this embodiment, a rendering mode that works across operating systems and application engines can be provided, so that texture maps are rendered into video images for playing on different types of terminal devices, further improving the universality of the video playing scheme.
A video playing apparatus according to one or more embodiments of the present invention is described in detail below. Those skilled in the art will appreciate that these apparatuses can be constructed from commercially available hardware components configured through the steps taught in this disclosure.
Fig. 8 is a schematic structural diagram of a video playing apparatus according to an embodiment of the present invention, and as shown in fig. 8, the apparatus is applied to a video player installed in a terminal device. The device includes:
the acquisition module 11 is configured to acquire video data to be played;
a determining module 12, configured to determine a type of an operating system adopted by the terminal device, where the video player supports different types of operating systems;
and a rendering module 13, configured to generate a texture map corresponding to the video data in a video memory space of the terminal device by using a rendering method corresponding to the type of the operating system, so that a rendering control in the terminal device performs video rendering based on the texture map.
Optionally, the rendering module 13 is specifically configured to, in a process of performing video rendering based on the texture map through a rendering control in the terminal device:
and transmitting the identifier corresponding to the texture mapping to the rendering control so that the rendering control reads the texture mapping from the video memory space according to the identifier to perform video rendering.
Optionally, the identifier corresponding to the texture map includes a handle identifier of the texture map in the video memory space.
Optionally, the rendering control includes a native rendering control corresponding to the operating system type, or an application engine connected to the terminal device.
Optionally, the rendering module 13 is configured to, in a process of generating a texture map corresponding to the video data in a video memory space of the terminal device by using a rendering method corresponding to the type of the operating system, specifically:
analyzing each frame of video image from the video data;
and generating a texture map corresponding to each frame of video image in the video memory space by adopting a video analysis method corresponding to the type of the operating system.
Optionally, the rendering module 13 is configured to, in the process of generating a texture map corresponding to each frame of video image in the video memory space by using a video parsing method corresponding to the type of the operating system, specifically:
if the operating system type is android, creating an initial texture map in the video memory space, and creating a surface texture and a memory area corresponding to the surface texture based on the initial texture map;
setting the memory area into an android native player in an android operating system;
in the android native player, calling back each frame of video image through the frame-available method (e.g., the onFrameAvailable method) of the surface texture;
and associating each frame of video image with the corresponding initial texture map through the map update method (e.g., the updateTexImage method) of the surface texture, to obtain the texture map corresponding to each frame of video image.
Optionally, the rendering module 13 is configured to, in the process of generating a texture map corresponding to each frame of video image in the video memory space by using a video parsing method corresponding to the type of the operating system, specifically:
if the operating system type is iOS, processing each frame of video image in the iOS native player of the iOS operating system by hardware decoding to obtain hardware-decoded video data corresponding to each frame of video image;
and creating the texture map corresponding to each frame of video image from the hardware-decoded video data through a hardware-decode drawing method (e.g., the CVOpenGLESTextureCacheCreateTextureFromImage method).
Optionally, the rendering module 13 is configured to, in the process of generating a texture map corresponding to each frame of video image in the video memory space by using a video parsing method corresponding to the type of the operating system, specifically:
if the operating system type is iOS, processing each frame of video image in an iOS native player in the iOS operating system in a software decoding mode to obtain texture data corresponding to each frame of video image;
and drawing each dimension texture in the texture data into a texture map of the video memory space to obtain a texture map corresponding to each frame of video image.
The apparatus shown in fig. 8 can perform the steps described in the foregoing embodiments; for the detailed execution process and technical effects, refer to the descriptions in those embodiments, which are not repeated here.
In one possible design, the structure of the video playing apparatus shown in fig. 8 may be implemented as an electronic device, as shown in fig. 9. The electronic device may include: a processor 21, a memory 22 and a communication interface 23, where the memory 22 stores executable code which, when executed by the processor 21, causes the processor 21 to implement at least the video playing method provided in the foregoing embodiments.
In addition, an embodiment of the present invention provides a non-transitory machine-readable storage medium having stored thereon executable code, which, when executed by a processor of an electronic device, causes the processor to implement at least a video playing method as provided in the foregoing embodiments.
The above-described apparatus embodiments are merely illustrative, and the units described as separate components may or may not be physically separate. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment, which one of ordinary skill in the art can understand and implement without creative effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by means of software plus a necessary general hardware platform, or of course by a combination of hardware and software. Based on this understanding, the above technical solutions, or the parts thereof contributing to the prior art, may be embodied in the form of a computer program product, which may be carried on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM and optical storage) containing computer-usable program code.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.
Claims (10)
1. A video playing method applied to a video player installed in a terminal device, the method comprising:
acquiring video data to be played;
determining the type of an operating system adopted by the terminal equipment, wherein the video player supports different types of operating systems;
and generating a texture map corresponding to the video data in a video memory space of the terminal equipment by a rendering method corresponding to the type of the operating system, so that a rendering control in the terminal equipment performs video rendering based on the texture map.
2. The method of claim 1, wherein the rendering control in the terminal device performs video rendering based on the texture map, and comprises:
and transmitting the identifier corresponding to the texture mapping to the rendering control so that the rendering control reads the texture mapping from the video memory space according to the identifier to perform video rendering.
3. The method of claim 2, wherein the identifier corresponding to the texture map comprises a handle identifier of the texture map in the video memory space.
4. The method of claim 1, wherein the rendering control comprises a native rendering control corresponding to the operating system type or an application engine connected to the terminal device.
5. The method according to claim 1, wherein the generating a texture map corresponding to the video data in a video memory space of the terminal device by a rendering method corresponding to the operating system type comprises:
analyzing each frame of video image from the video data;
and generating a texture map corresponding to each frame of video image in the video memory space by adopting a video analysis method corresponding to the type of the operating system.
6. The method according to claim 5, wherein the generating the texture map corresponding to each frame of the video image in the video memory space by using the video parsing method corresponding to the operating system type comprises:
if the operating system type is android, creating an initial texture map in the video memory space, and creating a surface texture and a memory area corresponding to the surface texture based on the initial texture map;
setting the memory area into an android native player in the android operating system, and calling back each frame of video image by a frame available method of the surface texture;
and associating each frame of video image to the corresponding initial texture map by adopting the surface texture map updating method so as to obtain the corresponding texture map of each frame of video image.
7. The method according to claim 5, wherein the generating the texture map corresponding to each frame of the video image in the video memory space by using the video parsing method corresponding to the operating system type comprises:
if the operating system type is iOS, processing each frame of video image in an iOS native player in a hardware decoding mode to obtain hardware-decoded video data corresponding to each frame of video image;
and creating the texture map corresponding to each frame of video image based on the hardware-decoded video data corresponding to each frame of video image through a hardware-decode drawing method.
8. A video playing apparatus applied to a video player installed in a terminal device, the apparatus comprising:
the acquisition module is used for acquiring video data to be played;
the determining module is used for determining the type of the operating system adopted by the terminal equipment, and the video player supports different types of operating systems;
and the rendering module is used for generating a texture map corresponding to the video data in a video memory space of the terminal equipment through a rendering method corresponding to the type of the operating system so as to enable a rendering control in the terminal equipment to perform video rendering based on the texture map.
9. A video playback device, comprising: a memory, a processor, a communication interface; wherein the memory has stored thereon executable code which, when executed by the processor, causes the processor to perform the video playback method of any one of claims 1 to 7.
10. A non-transitory machine-readable storage medium having stored thereon executable code, which when executed by a processor of an electronic device, causes the processor to perform the video playback method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111510114.7A CN114222185B (en) | 2021-12-10 | 2021-12-10 | Video playing method, terminal equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111510114.7A CN114222185B (en) | 2021-12-10 | 2021-12-10 | Video playing method, terminal equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114222185A true CN114222185A (en) | 2022-03-22 |
CN114222185B CN114222185B (en) | 2024-04-05 |
Family
ID=80700925
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111510114.7A Active CN114222185B (en) | 2021-12-10 | 2021-12-10 | Video playing method, terminal equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114222185B (en) |
- 2021-12-10: application CN202111510114.7A filed in China; granted as patent CN114222185B (status: active)
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2012202491A1 (en) * | 2012-04-30 | 2013-11-14 | Canon Kabushiki Kaisha | Method, system and apparatus for rendering an image on a page |
WO2017024144A1 (en) * | 2015-08-04 | 2017-02-09 | Google Inc. | Systems and methods for interactively presenting a visible portion of a rendering surface on a user device |
CN106507115A (en) * | 2016-11-30 | 2017-03-15 | 上海爱葛网络科技有限公司 | Based on the coding/decoding method of the VR videos of iOS device, device and terminal device |
CN106713937A (en) * | 2016-12-30 | 2017-05-24 | 广州虎牙信息科技有限公司 | Video playing control method and device as well as terminal equipment |
US20190155585A1 (en) * | 2017-11-17 | 2019-05-23 | General Electric Company | Dynamic hybrid rendering |
CN108093293A (en) * | 2018-01-15 | 2018-05-29 | 北京奇艺世纪科技有限公司 | A kind of Video Rendering method and system |
CN109922360A (en) * | 2019-03-07 | 2019-06-21 | 腾讯科技(深圳)有限公司 | Method for processing video frequency, device and storage medium |
CN110602551A (en) * | 2019-08-22 | 2019-12-20 | 福建星网智慧科技股份有限公司 | Media playing method, player, equipment and storage medium of android frame layer |
CN111292387A (en) * | 2020-01-16 | 2020-06-16 | 广州小鹏汽车科技有限公司 | Dynamic picture loading method and device, storage medium and terminal equipment |
CN113457160A (en) * | 2021-07-15 | 2021-10-01 | 腾讯科技(深圳)有限公司 | Data processing method and device, electronic equipment and computer readable storage medium |
CN113672387A (en) * | 2021-08-11 | 2021-11-19 | 上海交通大学 | Remote calling graphics rendering method and system based on drawing programming interface |
CN113674389A (en) * | 2021-10-25 | 2021-11-19 | 深圳须弥云图空间科技有限公司 | Scene rendering method and device, electronic equipment and storage medium |
Non-Patent Citations (3)
Title |
---|
YU-JUNG CHEN et al.: "Distributed rendering: Interaction delay reduction in remote rendering with client-end GPU-accelerated scene warping technique", 2017 IEEE International Conference on Multimedia & Expo Workshops (ICMEW), 7 September 2017 |
DING Liguo et al.: "Research on Web Map vector rendering technology based on HTML5", Engineering of Surveying and Mapping, no. 8, 2 August 2017 |
LI Weinan: "Performance optimization of GPU virtualization scenarios using differential update technology", China Master's Theses Full-text Database, no. 1, 15 January 2020 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023245495A1 (en) * | 2022-06-22 | 2023-12-28 | 云智联网络科技(北京)有限公司 | Method and apparatus for converting rendered data into video stream, and electronic device |
CN115767182A (en) * | 2022-11-21 | 2023-03-07 | 北京新唐思创教育科技有限公司 | Image rendering method and device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN114222185B (en) | 2024-04-05 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||