CN114222185B - Video playing method, terminal equipment and storage medium

Video playing method, terminal equipment and storage medium

Info

Publication number
CN114222185B
CN114222185B (application CN202111510114.7A; also published as CN114222185A)
Authority
CN
China
Prior art keywords
video
texture map
frame
operating system
rendering
Prior art date
Legal status
Active
Application number
CN202111510114.7A
Other languages
Chinese (zh)
Other versions
CN114222185A (en)
Inventor
王志明
常乐
童子奇
叶王毅
Current Assignee
Hongenperfect Beijing Education Technology Development Co ltd
Original Assignee
Hongenperfect Beijing Education Technology Development Co ltd
Priority date
Filing date
Publication date
Application filed by Hongenperfect Beijing Education Technology Development Co ltd
Priority to CN202111510114.7A
Publication of CN114222185A
Application granted
Publication of CN114222185B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443: OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs

Abstract

Embodiments of the invention provide a video playing method, a terminal device and a storage medium, the method comprising: acquiring video data to be played; determining the type of operating system adopted by the terminal device, wherein the video player supports operating systems of different types; and generating, through a rendering method corresponding to the operating system type, a texture map corresponding to the video data in a video memory space of the terminal device, so that a rendering control in the terminal device performs video rendering based on the texture map. Since the texture map in the video memory space can be applied directly by rendering controls in the terminal device, the method lets terminal devices with different operating systems or application engines realize the video playing function, reduces the difficulty of developing and maintaining that function, expands its range of application and ensures its compatibility.

Description

Video playing method, terminal equipment and storage medium
Technical Field
The present invention relates to the field of image processing, and in particular, to a video playing method, a terminal device, and a storage medium.
Background
Video, as a vivid and direct way of presenting content, is becoming one of the main channels of information dissemination. Many applications therefore have to take video playback requirements into account.
Currently, in the related art, the video playing requirements that arise during application development have to be supported separately for each type of operating system and each application engine. Taking a video player as an example, if the player needs to run on multiple types of operating systems or work with multiple application engines, a dedicated version of the player has to be developed and maintained for each operating system or each engine. The video player of the related art therefore clearly suffers from compatibility problems across operating systems and application engines.
Therefore, a video playing scheme is needed that solves the cross-platform compatibility problem of the video player.
Disclosure of Invention
Embodiments of the present invention provide a video playing method, a terminal device and a storage medium, which implement a video playing function across platforms and operating systems, expand the range of application of the video playing mode and ensure the compatibility of the video playing scheme.
In a first aspect, an embodiment of the present invention provides a video playing method applied to a video player installed in a terminal device, the method comprising:
acquiring video data to be played;
determining the type of operating system adopted by the terminal device, wherein the video player supports operating systems of different types;
and generating, through a rendering method corresponding to the operating system type, a texture map corresponding to the video data in a video memory space of the terminal device, so that a rendering control in the terminal device performs video rendering based on the texture map.
In a second aspect, an embodiment of the present invention provides a video playing apparatus applied to a video player installed in a terminal device, the apparatus comprising:
an acquisition module, configured to acquire video data to be played;
a determining module, configured to determine the type of operating system adopted by the terminal device, wherein the video player supports operating systems of different types;
and a rendering module, configured to generate, through a rendering method corresponding to the operating system type, a texture map corresponding to the video data in a video memory space of the terminal device, so that a rendering control in the terminal device performs video rendering based on the texture map.
In a third aspect, an embodiment of the present invention provides an electronic device, including: a memory, a processor, a communication interface; wherein the memory has executable code stored thereon, which when executed by the processor, causes the processor to at least implement the video playing method according to the first aspect.
In a fourth aspect, embodiments of the present invention provide a non-transitory machine-readable storage medium having executable code stored thereon, which when executed by a processor of an electronic device, causes the processor to at least implement a video playback method as described in the first aspect.
In the scheme provided by the embodiments of the present invention, the video data to be played is acquired; the type of operating system adopted by the terminal device is determined, the video player supporting operating systems of different types; and a texture map corresponding to the video data is generated in a video memory space of the terminal device through a rendering method corresponding to the operating system type, so that a rendering control in the terminal device performs video rendering based on the texture map. Since the texture map in the video memory space can be applied directly by rendering controls in the terminal device, terminal devices with different operating systems or application engines can all realize the video playing function, which reduces the difficulty of developing and maintaining it, expands its range of application and ensures its compatibility.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without inventive effort.
Fig. 1 is a flowchart of a video playing method according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a video data processing flow in an android system according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a video data processing flow in an iOS system according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a texture map rendering process in an android system according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a texture map rendering process in an iOS system according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a texture map rendering process in a Cocos engine according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a texture map rendering process in a Unity engine according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a video playing device according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an electronic device corresponding to the video playing apparatus provided in the embodiment shown in fig. 8.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments are described below clearly and completely with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments without inventive effort fall within the scope of the present invention.
In addition, the sequence of steps in the method embodiments described below is only an example and is not strictly limited.
The video playing method provided by the embodiments of the present invention can be executed by an electronic device, which may be a terminal device with data processing capability such as a PC (personal computer), a notebook computer or a smartphone. Alternatively, a server connected to the terminal device may carry out the video playing method and transmit the result to the terminal device for display. The server may be a physical server with an independent host, a virtual server, a cloud server or a server cluster.
With increasingly intelligent devices and the development of related software, the video playing function is used in many interactive scenarios: video conferencing software is invoked for communication during a video conference; in intelligent teaching, subject knowledge is taught by sharing video; in shopping software, goods are presented through video; and in intelligent security software, the real-time state of a monitored object (such as a room or an elderly person living alone) is shown through video. Video playing therefore has to be considered in the development of many applications.
However, in the related art, the video playing requirements that arise during application development have to be supported separately for each type of operating system and each application engine. Taking a video player as an example, if the player needs to run on multiple types of operating systems or work with multiple application engines, a dedicated version of the player has to be developed and maintained for each operating system or each engine. The video player of the related art therefore clearly suffers from compatibility problems across operating systems and application engines.
In addition, in the related art, the video players (or controls) used by the various types of operating systems and application engines have the following problems. The video playing control (VideoView) in the Android operating system does not support nested display with native View components, and switching the application to the background during playback raises resource-release problems that cause a black screen. The iOS video playing control (for example, AVPlayer) may show a black screen, a green screen and the like when playing some videos. The Cocos engine usually uses the system video player, so the video picture cannot be nested with engine-generated content and can only be placed above or below the layer holding the engine content. The video player used by the Unity engine loads slowly, which hurts the user experience. Moreover, these video players and controls support video formats poorly and cannot provide functions such as the buffering rate.
Therefore, how to implement a video playing function across operating systems and engines on different types of devices, under the condition of multiple operating systems and multiple engines, has become a technical problem to be solved.
An embodiment of the present invention provides a video playing method, as shown in fig. 1. Fig. 1 is a flowchart of the video playing method, which may include the following steps:
101. Acquire video data to be played.
102. Determine the type of operating system adopted by the terminal device, wherein the video player supports operating systems of different types.
103. Generate, through a rendering method corresponding to the operating system type, a texture map corresponding to the video data in a video memory space of the terminal device, so that a rendering control in the terminal device performs video rendering based on the texture map.
In this embodiment, the method is applied to a video player installed in a terminal device. The video player may be implemented as independent application software, or as a control with a video playing function nested in other application software. In either form, the essence is to provide developers with a cross-platform video playing function and thereby lower the threshold for developing and maintaining it.
In 101, video data to be played is acquired.
Specifically, the video data carries the image content to be presented. The video data may be produced in advance, such as a teaching video pre-recorded by a teacher in an intelligent education scenario, or captured in real time, such as a video stream shot live by a user in a short-video production scenario. In practical applications, the format and acquisition mode of the video data can be customized to the requirements of the application software hosting it.
In addition, to further improve the compatibility of the video playing function, this embodiment does not restrict the format of the video data. For example, the video data format may be the Moving Picture Experts Group format (MPEG), the Audio Video Interleaved format (AVI), the Movie format (MOV), the Windows Media Video format (WMV), the RealMedia format (RM), or the like.
In 102, the type of operating system adopted by the terminal device is determined. Specifically, in an optional embodiment, the configuration attributes of the terminal device may be read while the video player is being installed, and the operating system type determined from them. In another optional embodiment, an application download tool reads the configuration attributes of the terminal device, determines the operating system type from this configuration data, and downloads and installs the matching version of the video player. Of course, this step may be implemented in other ways, which this embodiment does not limit.
In this embodiment, the video player supports operating systems of different types. Here, the operating system refers to the native platform used by the terminal device, i.e. the operating system the device ships with, such as the iOS operating system or the Android operating system.
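To make this concrete, a minimal sketch follows; it assumes a C++ player core in which the operating system type is resolved through compile-time platform macros rather than the configuration-attribute reading described above, and the enum and function names are illustrative, not part of the patent.

```cpp
// Minimal sketch of the OS-type determination step (assumption: a C++ core
// where the platform is known at compile time; names are illustrative).
enum class OsType { Android, Ios, Other };

OsType detectOsType() {
#if defined(__ANDROID__)
    return OsType::Android;   // Android toolchains define __ANDROID__
#elif defined(__APPLE__)
    return OsType::Ios;       // assumption: Apple builds here target iOS devices
#else
    return OsType::Other;
#endif
}
```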
In 103, a texture map corresponding to the video data is generated in a video memory space of the terminal device through the rendering method corresponding to the operating system type, so that a rendering control in the terminal device performs video rendering based on the texture map.
In this embodiment, the rendering control is either a native rendering control matching the operating system type or an application engine connected to the terminal device. Native rendering controls are, for example, the SurfaceView control on native Android and the UIView control on native iOS; application engines used by the terminal device are, for example, the Unity engine or the Cocos engine. On the terminal device, applications on the native platform or in the application engine can call the underlying resources. The callable interfaces, calling methods and rendering methods differ between operating systems and application engines; see the examples below.
In 103, optionally, the step in which the rendering control in the terminal device performs video rendering based on the texture map may be implemented as follows: the identifier corresponding to the texture map is transmitted to the rendering control, so that the rendering control reads the texture map from the video memory space according to the identifier and performs video rendering. The identifier can be an identifier inside the graphics processing unit (GPU), so that both the processing and the transfer of the texture map are carried out in the GPU; this avoids the extra rendering complexity of converting data between the GPU and the CPU and thereby improves video playing performance. The hand-over of the texture map is thus completed directly in the video memory space via the identifier, which improves the transfer efficiency from the current processing module to the rendering control.
Optionally, the identifier corresponding to the texture map includes, but is not limited to, a handle of the texture map in the video memory space. A handle is a special smart pointer that application software needs when referencing memory areas or objects managed by another system, such as the operating system. Briefly, the handle in this embodiment can be understood as an identifier of the texture map that describes its storage location in the video memory space.
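To make the handle idea concrete, here is a small sketch. It assumes the decoder and the rendering control run OpenGL ES contexts created as shared contexts, so a texture id generated on one side is valid on the other; the function names are illustrative.

```cpp
#include <GLES2/gl2.h>

// Decoder side: allocate texture storage in video memory and return the GL
// texture id, which plays the role of the "handle" described above.
GLuint createVideoTexture(GLsizei width, GLsizei height) {
    GLuint texId = 0;
    glGenTextures(1, &texId);
    glBindTexture(GL_TEXTURE_2D, texId);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, nullptr);   // storage only, no upload
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return texId;   // hand this id to the rendering control
}

// Rendering-control side: no pixel copy and no GPU-to-CPU round trip; just
// bind the shared id and sample from it in the draw call.
void drawWithSharedTexture(GLuint texId) {
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(GL_TEXTURE_2D, texId);
    // ... the host view's or engine's draw call goes here ...
}
```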
In this embodiment, the texture map corresponding to the video data is generated in the video memory space of the terminal device by the rendering method corresponding to the operating system type, and that texture map can be applied directly by the rendering control in the terminal device. Terminal devices with different operating systems or application engines can therefore all realize the video playing function, which reduces the difficulty of developing and maintaining it, expands its range of application and ensures its compatibility.
In the above or the following embodiments, in 103, generating a texture map corresponding to the video data in a video memory space of the terminal device through the rendering method corresponding to the operating system type optionally includes:
parsing each frame of video image from the video data; and generating a texture map corresponding to each frame of video image in the video memory space with a video parsing method corresponding to the operating system type.
The video data must first be parsed frame by frame to obtain each frame of video image. The texture map generation step is then implemented differently on terminal devices with different operating systems; it is described below by operating system type.
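As a sketch of this frame-by-frame parsing step, the fragment below assumes an FFmpeg-based software decoder (the patent does not name a decoding library); decoder flushing and error handling are omitted for brevity.

```cpp
extern "C" {
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
}

// Read the container, feed video packets to an already-opened decoder and hand
// every decoded frame (one video image) to the caller's callback.
void decodeFrames(const char* url, AVCodecContext* dec, int videoStreamIndex,
                  void (*onFrame)(AVFrame*)) {
    AVFormatContext* fmt = nullptr;
    if (avformat_open_input(&fmt, url, nullptr, nullptr) < 0) return;
    avformat_find_stream_info(fmt, nullptr);

    AVPacket* pkt = av_packet_alloc();
    AVFrame* frame = av_frame_alloc();
    while (av_read_frame(fmt, pkt) >= 0) {
        if (pkt->stream_index == videoStreamIndex &&
            avcodec_send_packet(dec, pkt) == 0) {
            while (avcodec_receive_frame(dec, frame) == 0)
                onFrame(frame);   // one frame of video image per callback
        }
        av_packet_unref(pkt);
    }
    av_frame_free(&frame);
    av_packet_free(&pkt);
    avformat_close_input(&fmt);
}
```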
For Android devices, whatever their form (mobile phone, wearable, tablet or computer), the operating system type is Android. On that basis, in the above steps, generating the texture map corresponding to each frame of video image in the video memory space with the video parsing method corresponding to the operating system type may be implemented as follows:
If the operating system type is Android, an initial texture map is created in the video memory space, and a surface texture (SurfaceTexture) and the memory area (Surface) corresponding to it are created based on the initial texture map; the Surface is set on the Android native player of the Android operating system; in the Android native player, each frame of video image is called back through the frame-available (onFrameAvailable) method of the surface texture; and each frame of video image is associated with the corresponding initial texture map through the map-updating (updateTexImage) method of the surface texture, to obtain the texture map corresponding to each frame of video image.
Specifically, taking the video data processing scenario shown in fig. 2 as an example: after the video player decodes the video data into individual frames, in the Android system of the terminal device an initial texture map (Texture2D) is created by the native video playing control or the currently loaded application engine, a SurfaceTexture is created from the Texture2D, a Surface is created from the SurfaceTexture, and the Surface is set on the video player.
Each frame of video image is then called back on a first thread via the onFrameAvailable method of the SurfaceTexture, and bound to its texture map in the video memory space on a second thread via the updateTexImage method of the SurfaceTexture, yielding the texture map corresponding to each frame. Binding here means pointing the identifier of the texture map (the texture map id) at the video data decoded in the previous step; the id thus indicates which data area in the video memory space the frame's data occupies. This step unifies the callback threads, so the bound texture map points at the correct video data and provides the basis for the calls made later during rendering. Finally, the Texture2D can be passed to the application engine or native player control for video rendering.
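The native side of this flow can be sketched with the NDK's ASurfaceTexture wrapper (android/surface_texture.h, API level 28 and above). The sketch assumes the SurfaceTexture is created on the Java side around the initial texture id (new SurfaceTexture(texId)) and that onFrameAvailable is delivered from Java; the struct and function names are illustrative.

```cpp
#include <jni.h>
#include <android/surface_texture.h>
#include <android/surface_texture_jni.h>
#include <android/native_window.h>

struct AndroidVideoTexture {
    ASurfaceTexture* st = nullptr;     // wraps the Java SurfaceTexture
    ANativeWindow* window = nullptr;   // the memory area (Surface) for the player
};

// Wrap the Java SurfaceTexture and obtain the window that gets set on the
// Android native player (e.g. via MediaPlayer.setSurface on the Java side).
AndroidVideoTexture bindSurfaceTexture(JNIEnv* env, jobject javaSurfaceTexture) {
    AndroidVideoTexture out;
    out.st = ASurfaceTexture_fromSurfaceTexture(env, javaSurfaceTexture);
    out.window = ASurfaceTexture_acquireANativeWindow(out.st);
    return out;
}

// Called on the GL thread after Java's onFrameAvailable fires: latch the newest
// decoded frame into the texture the SurfaceTexture was built on, so that the
// texture map id now points at this frame's data in the video memory space.
void onFrameAvailableOnGlThread(AndroidVideoTexture& tex) {
    ASurfaceTexture_updateTexImage(tex.st);
}
```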
For iOS devices, whatever their form (mobile phone, wearable, tablet or computer), the operating system type is iOS. On that basis, in the above steps, generating the texture map corresponding to each frame of video image in the video memory space with the video parsing method corresponding to the operating system type can be implemented in the two ways shown in fig. 3. The first is as follows:
If the operating system type is iOS, each frame of video image is processed by hardware decoding in the iOS native player of the iOS operating system, to obtain the hard-decoded video data corresponding to each frame; through the hard-decode drawing method (CVOpenGLESTextureCacheCreateTextureFromImage), a texture map is then created for each frame of video image based on its hard-decoded video data.
Specifically, taking the video data processing scenario shown in fig. 3 as an example: after the video player decodes the video data into individual frames, the iOS system of the terminal device calls back each frame through the native video playing control or the currently loaded application engine, each frame being carried as a pixel buffer. Each frame is processed by hardware decoding to obtain its hard-decoded video data.
Further, in fig. 3 the hard-decoded video data is a CVPixelBufferRef, so a texture map (Texture2D) can be created for each frame's hard-decoded video data using the CVOpenGLESTextureCacheCreateTextureFromImage method. Finally, the Texture2D can be passed to the application engine or native player control for video rendering.
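This hard-decode path can be sketched with CoreVideo's C API (buildable as Objective-C++). The sketch assumes a single-plane BGRA pixel buffer and a texture cache created beforehand with CVOpenGLESTextureCacheCreate against the current EAGLContext; the function name is illustrative.

```cpp
#include <CoreVideo/CoreVideo.h>
#include <OpenGLES/ES2/gl.h>
#include <OpenGLES/ES2/glext.h>

// Wrap one decoded CVPixelBufferRef as a GL texture without copying pixels.
// The GL name is obtained with CVOpenGLESTextureGetName(texture); the caller
// must CFRelease the returned texture once the frame has been rendered.
CVOpenGLESTextureRef textureFromPixelBuffer(CVOpenGLESTextureCacheRef cache,
                                            CVPixelBufferRef pixelBuffer) {
    CVOpenGLESTextureRef texture = nullptr;
    GLsizei w = (GLsizei)CVPixelBufferGetWidth(pixelBuffer);
    GLsizei h = (GLsizei)CVPixelBufferGetHeight(pixelBuffer);
    CVReturn err = CVOpenGLESTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer,
        nullptr,                   // texture attributes
        GL_TEXTURE_2D, GL_RGBA,    // target / internal format
        w, h,
        GL_BGRA, GL_UNSIGNED_BYTE, // source layout of the pixel buffer
        0,                         // plane index (single-plane BGRA)
        &texture);
    return (err == kCVReturnSuccess) ? texture : nullptr;
}
```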
The other way for iOS devices is as follows: if the operating system type is iOS, each frame of video image is processed by software decoding in the iOS native player of the iOS operating system, to obtain the texture data corresponding to each frame; each dimension of that texture data is then drawn into a texture map in the video memory space, to obtain the texture map corresponding to each frame of video image.
Specifically, in fig. 3, the soft-decode case differs mainly in that software decoding yields pixel data as the texture data (e.g. YUV texture data) of each frame of video image. The three planes of the texture data are bound to three texture maps, an initial Texture2D is created in the video memory space, and the three texture maps are drawn into the initial Texture2D through a texture drawing control of the native platform (such as a shader), giving the Texture2D corresponding to each frame. Here CVPixelBufferRef is the pixel picture parameter in the iOS system.
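The soft-decode path can be sketched as follows, assuming OpenGL ES 2.0 and BT.601 color conversion: the three YUV planes are uploaded into three single-channel textures, and a small fragment shader merges them into the final RGBA Texture2D. The uniform and function names are illustrative.

```cpp
#include <GLES2/gl2.h>
#include <cstdint>

// Fragment shader that merges the three plane textures into one RGB image
// (BT.601 YUV to RGB conversion).
static const char* kYuvToRgbFragment = R"glsl(
precision mediump float;
varying vec2 vTexCoord;
uniform sampler2D uTexY;
uniform sampler2D uTexU;
uniform sampler2D uTexV;
void main() {
    float y = texture2D(uTexY, vTexCoord).r;
    float u = texture2D(uTexU, vTexCoord).r - 0.5;
    float v = texture2D(uTexV, vTexCoord).r - 0.5;
    gl_FragColor = vec4(y + 1.402 * v,
                        y - 0.344 * u - 0.714 * v,
                        y + 1.772 * u,
                        1.0);
}
)glsl";

// Upload one decoded plane (Y, U or V) into its own luminance texture.
void uploadPlane(GLuint texId, const uint8_t* plane, GLsizei width, GLsizei height) {
    glBindTexture(GL_TEXTURE_2D, texId);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   // planes are tightly packed
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, plane);
}
```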
In practical applications, the application engine that finally receives the texture map may be the Cocos engine or the Unity engine described above. The native player control may be a custom video playing control in the Android system (such as the IhVideoView shown in fig. 2) or a custom iOS video playing control in the iOS system (such as the IhVideoView shown in fig. 3). Other applications with video rendering capability are of course also possible; no limitation is intended here.
In the above or the following embodiments, in 103, the rendering control in the terminal device performing video rendering based on the texture map may optionally be implemented as: the texture map is rendered by the application engine or native player control used by the terminal device, to obtain the video image for playback.
The texture map rendering step is implemented differently on terminal devices with different operating systems or application engines; it is described below by operating system type or application engine type.
For Android devices, whatever their form (mobile phone, wearable, tablet or computer), the operating system type is Android. Specifically, as shown in fig. 4, in the Android system the EGL module can create the context parameter (EGLContext), the surface parameter (EGLSurface) and the related configuration parameters from the surface data (Surface) of the local window system (SurfaceView) or the surface texture (SurfaceTexture) of the texture window system (TextureView). EGL is the intermediate interface layer between Open Graphics Library (OpenGL) rendering and the local window system (SurfaceView and the like in the Android system); introducing EGL here allows vector graphics to be rendered across languages and platforms. The GL thread (GLThread) module then takes the received Texture2D (i.e. the texture map) and renders it onto the EGLSurface in the draw-frame (DrawFrame) module, and thus onto the terminal device screen for display.
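The EGL bring-up just described can be sketched in a few calls. The sketch assumes an OpenGL ES 2.0 context over the ANativeWindow obtained from the SurfaceView or TextureView, omits error handling, and uses illustrative function names.

```cpp
#include <EGL/egl.h>
#include <android/native_window.h>

// Bind a GL context to the window surface of the local window system so that
// subsequent draws of the Texture2D end up on the device screen.
EGLSurface initEglOnWindow(ANativeWindow* window,
                           EGLDisplay* outDisplay, EGLContext* outContext) {
    EGLDisplay display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(display, nullptr, nullptr);

    const EGLint configAttrs[] = {
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
        EGL_NONE };
    EGLConfig config;
    EGLint numConfigs = 0;
    eglChooseConfig(display, configAttrs, &config, 1, &numConfigs);

    const EGLint contextAttrs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    EGLContext context =
        eglCreateContext(display, config, EGL_NO_CONTEXT, contextAttrs);
    EGLSurface surface =
        eglCreateWindowSurface(display, config, window, nullptr);

    eglMakeCurrent(display, surface, surface, context);
    *outDisplay = display;
    *outContext = context;
    return surface;
}

// Per frame (the GL thread / DrawFrame step): draw the video texture, present.
void presentFrame(EGLDisplay display, EGLSurface surface) {
    // ... glDraw* calls sampling the Texture2D go here ...
    eglSwapBuffers(display, surface);
}
```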
For iOS devices, whatever their form (mobile phone, wearable, tablet or computer), the operating system type is iOS. As illustrated by the texture map rendering scene in fig. 5, rendering the Texture2D (i.e. the texture map) in the iOS system can likewise be implemented in two ways. In one, if the Texture2D was created through the Metal interface, it can be rendered directly through the MTKView method.
In the other, if the Texture2D was generated through the OpenGL interface, it first has to be converted into CIImage data (an image class built on top of OpenGL that supports shader processing of the image), the CIImage data is then converted into UIImage data (an image class), and the UIImage data is finally rendered through a UIImageView (a basic control of the iOS system).
In practice, the former rendering mode is more efficient, while the latter can implement the video rendering flow when the video player is nested inside other applications, so the appropriate iOS rendering mode can be chosen according to the actual application requirements.
For a terminal device using the Cocos engine, as in the texture map rendering scene shown in fig. 6, the Cocos engine takes the received Texture2D (i.e. the texture map) and calls the OpenGL interface from the drawing (draw) method of a node (Node) to draw it, so that it is rendered onto the terminal device screen for display.
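A sketch of this Cocos path, assuming the cocos2d-x 3.x C++ API: a node overrides draw and submits a CustomCommand that binds the externally produced texture id. The class and member names are illustrative.

```cpp
#include "cocos2d.h"

class VideoNode : public cocos2d::Node {
public:
    GLuint videoTexId = 0;   // texture map id received from the video player

    void draw(cocos2d::Renderer* renderer, const cocos2d::Mat4& transform,
              uint32_t flags) override {
        _drawCommand.init(_globalZOrder);
        _drawCommand.func = [this]() {
            glActiveTexture(GL_TEXTURE0);
            glBindTexture(GL_TEXTURE_2D, videoTexId);
            // ... draw a textured quad through the OpenGL interface here ...
        };
        renderer->addCommand(&_drawCommand);   // executed in the engine's pass
    }

private:
    cocos2d::CustomCommand _drawCommand;
};
```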
For a terminal device using the Unity engine, as in the texture map rendering scene shown in fig. 7, the Unity engine wraps the received texture in a Unity Texture2D, for example through the CreateExternalTexture method of Texture2D, and then assigns that Texture2D to a RawImage (an image control that displays non-interactive graphics such as decorations or icons), through which it is rendered onto the terminal device screen for display.
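For the Unity path only the native-plugin side is shown here in C++: it exports the GL texture id, and the C# side (quoted only in comments, since the patent describes it in prose) wraps the id with Texture2D.CreateExternalTexture and assigns the result to a RawImage. The exported function name is illustrative.

```cpp
#include <cstdint>

static uint32_t g_videoTextureId = 0;   // set by the texture-producing code above

extern "C" {
// Called from Unity C# via [DllImport], roughly:
//   var ptr = new System.IntPtr(GetVideoTextureId());
//   var tex = Texture2D.CreateExternalTexture(width, height,
//                 TextureFormat.RGBA32, false, false, ptr);
//   rawImage.texture = tex;   // the RawImage then renders it to the screen
uint32_t GetVideoTextureId() { return g_videoTextureId; }
}
```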
This embodiment thus provides a rendering mode that spans operating systems and application engines, so that texture maps are rendered into video images for playback on different types of terminal device, further improving the universality of the video playing scheme.
A video playing apparatus of one or more embodiments of the present invention is described in detail below. Those skilled in the art will appreciate that the apparatus can be configured with commercially available hardware components through the steps taught in this solution.
Fig. 8 is a schematic structural diagram of a video playing apparatus according to an embodiment of the present invention. As shown in fig. 8, the apparatus is applied to a video player installed in a terminal device and comprises:
an obtaining module 11, configured to obtain video data to be played;
a determining module 12, configured to determine an operating system type adopted by the terminal device, where the video player supports operating systems of different types;
and the rendering module 13 is configured to generate, in a video memory space of the terminal device, a texture map corresponding to the video data by using a rendering method corresponding to the operating system type, so that a rendering control in the terminal device performs video rendering based on the texture map.
Optionally, in performing video rendering through the rendering control in the terminal device based on the texture map, the rendering module 13 is specifically configured to:
transmit the identifier corresponding to the texture map to the rendering control, so that the rendering control reads the texture map from the video memory space according to the identifier to perform video rendering.
Optionally, the identifier corresponding to the texture map comprises a handle identifier of the texture map in the video memory space.
Optionally, the rendering control includes a native rendering control corresponding to the operating system type, or an application engine connected to the terminal device.
Optionally, in generating the texture map corresponding to the video data in the video memory space of the terminal device through the rendering method corresponding to the operating system type, the rendering module 13 is specifically configured to:
parse each frame of video image from the video data;
and generate a texture map corresponding to each frame of video image in the video memory space with the video parsing method corresponding to the operating system type.
Optionally, in generating the texture map corresponding to each frame of video image in the video memory space with the video parsing method corresponding to the operating system type, the rendering module 13 is specifically configured to:
if the operating system type is Android, create an initial texture map in the video memory space, and create, based on the initial texture map, a surface texture and a memory area corresponding to the surface texture;
set the memory area on an Android native player in the Android operating system;
in the Android native player, call back each frame of video image through the frame-available method (such as the onFrameAvailable method) of the surface texture;
and associate each frame of video image with a corresponding initial texture map through the map-updating method (such as the updateTexImage method) of the surface texture, to obtain the texture map corresponding to each frame of video image.
Optionally, in generating the texture map corresponding to each frame of video image in the video memory space with the video parsing method corresponding to the operating system type, the rendering module 13 is specifically configured to:
if the operating system type is iOS, process each frame of video image by hardware decoding in an iOS native player in the iOS operating system, to obtain hard-decoded video data corresponding to each frame of video image;
and create, through the hard-decode drawing method (for example, the CVOpenGLESTextureCacheCreateTextureFromImage method), a texture map corresponding to each frame of video image based on the hard-decoded video data corresponding to that frame.
Optionally, in generating the texture map corresponding to each frame of video image in the video memory space with the video parsing method corresponding to the operating system type, the rendering module 13 is specifically configured to:
if the operating system type is iOS, process each frame of video image by software decoding in an iOS native player in the iOS operating system, to obtain texture data corresponding to each frame of video image;
and draw each dimension of texture in the texture data into the texture map of the video memory space, to obtain the texture map corresponding to each frame of video image.
The apparatus shown in fig. 8 can perform the steps described in the foregoing embodiments; for the detailed execution process and technical effects, refer to those embodiments, which are not repeated here.
In one possible design, the structure of the video playing device shown in fig. 8 may be implemented as an electronic device, as shown in fig. 9, where the electronic device may include: a processor 21, a memory 22, a communication interface 23. Wherein the memory 22 has stored thereon executable code which, when executed by the processor 21, causes the processor 21 to at least implement the video playing method as provided in the previous embodiments.
In addition, embodiments of the present invention provide a non-transitory machine-readable storage medium having executable code stored thereon, which when executed by a processor of an electronic device, causes the processor to at least implement a video playback method as provided in the previous embodiments.
The apparatus embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment, and a person of ordinary skill in the art can understand and implement them without inventive effort.
From the above description of the embodiments, it will be clear to those skilled in the art that the embodiments can be implemented with the necessary general-purpose hardware platform, or by a combination of hardware and software. Based on this understanding, the essence of the foregoing technical solutions, or the part that contributes to the art, may be embodied in the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM and optical storage) containing computer-usable program code.
Finally, it should be noted that the above embodiments only illustrate, and do not limit, the technical solution of the present invention. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical schemes described there may still be modified, or some of their technical features replaced by equivalents, without departing from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (7)

1. A video playing method applied to a video player installed in a terminal device, the method comprising:
acquiring video data to be played;
determining the type of operating system adopted by the terminal device, wherein the video player supports operating systems of different types;
generating a texture map corresponding to the video data in a video memory space of the terminal device through a rendering method corresponding to the operating system type, so that a rendering control in the terminal device performs video rendering based on the texture map;
wherein the generating, through the rendering method corresponding to the operating system type, a texture map corresponding to the video data in a video memory space of the terminal device comprises: parsing each frame of video image from the video data; and generating a texture map corresponding to each frame of video image in the video memory space by a video parsing method corresponding to the operating system type;
and the generating, by the video parsing method corresponding to the operating system type, a texture map corresponding to each frame of video image in the video memory space comprises:
if the operating system type is Android, creating an initial texture map in the video memory space, and creating, based on the initial texture map, a surface texture and a memory area corresponding to the surface texture; setting the memory area on a player in the Android operating system, and calling back each frame of video image through the frame-available method of the surface texture; and associating each frame of video image with a corresponding initial texture map through the map-updating method of the surface texture, to obtain the texture map corresponding to each frame of video image; or,
if the operating system type is iOS, processing each frame of video image by hardware decoding in a player of the iOS operating system, to obtain hard-decoded video data corresponding to each frame of video image; and creating, through a hard-decode drawing method, a texture map corresponding to each frame of video image based on the hard-decoded video data corresponding to that frame.
2. The method of claim 1, wherein the rendering control in the terminal device performing video rendering based on the texture map comprises:
transmitting the identifier corresponding to the texture map to the rendering control, so that the rendering control reads the texture map from the video memory space according to the identifier to perform video rendering.
3. The method of claim 2, wherein the identifier corresponding to the texture map comprises a handle identifier of the texture map in the video memory space.
4. The method of claim 1, wherein the rendering control comprises a native rendering control corresponding to the operating system type or an application engine connected to the terminal device.
5. A video playing apparatus, characterized by being applied to a video player installed in a terminal device, comprising:
the acquisition module is used for acquiring video data to be played;
the determining module is used for determining the type of operating system adopted by the terminal device, wherein the video player supports operating systems of different types;
the rendering module is used for generating a texture map corresponding to the video data in a video memory space of the terminal device through a rendering method corresponding to the operating system type, so that a rendering control in the terminal device performs video rendering based on the texture map;
the rendering module, in generating a texture map corresponding to the video data in a video memory space of the terminal device through the rendering method corresponding to the operating system type, is specifically used for: parsing each frame of video image from the video data; and generating a texture map corresponding to each frame of video image in the video memory space by a video parsing method corresponding to the operating system type;
the rendering module, in generating a texture map corresponding to each frame of video image in the video memory space by the video parsing method corresponding to the operating system type, is specifically used for: if the operating system type is Android, creating an initial texture map in the video memory space, and creating, based on the initial texture map, a surface texture and a memory area corresponding to the surface texture; setting the memory area on a player in the Android operating system, and calling back each frame of video image through the frame-available method of the surface texture; and associating each frame of video image with a corresponding initial texture map through the map-updating method of the surface texture, to obtain the texture map corresponding to each frame of video image; or, if the operating system type is iOS, processing each frame of video image by hardware decoding in a player of the iOS operating system, to obtain hard-decoded video data corresponding to each frame of video image; and creating, through a hard-decode drawing method, a texture map corresponding to each frame of video image based on the hard-decoded video data corresponding to that frame.
6. A video playing device, comprising: a memory, a processor and a communication interface; wherein the memory has executable code stored thereon which, when executed by the processor, causes the processor to perform the video playing method of any one of claims 1 to 4.
7. A non-transitory machine-readable storage medium having executable code stored thereon which, when executed by a processor of an electronic device, causes the processor to perform the video playing method of any one of claims 1 to 4.
CN202111510114.7A 2021-12-10 2021-12-10 Video playing method, terminal equipment and storage medium Active CN114222185B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111510114.7A CN114222185B (en) 2021-12-10 2021-12-10 Video playing method, terminal equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111510114.7A CN114222185B (en) 2021-12-10 2021-12-10 Video playing method, terminal equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114222185A (en) 2022-03-22
CN114222185B (en) 2024-04-05

Family

ID=80700925

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111510114.7A Active CN114222185B (en) 2021-12-10 2021-12-10 Video playing method, terminal equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114222185B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023245495A1 (en) * 2022-06-22 2023-12-28 云智联网络科技(北京)有限公司 Method and apparatus for converting rendered data into video stream, and electronic device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190155585A1 (en) * 2017-11-17 2019-05-23 General Electric Company Dynamic hybrid rendering

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2012202491A1 (en) * 2012-04-30 2013-11-14 Canon Kabushiki Kaisha Method, system and apparatus for rendering an image on a page
WO2017024144A1 (en) * 2015-08-04 2017-02-09 Google Inc. Systems and methods for interactively presenting a visible portion of a rendering surface on a user device
CN106507115A (en) * 2016-11-30 2017-03-15 上海爱葛网络科技有限公司 Based on the coding/decoding method of the VR videos of iOS device, device and terminal device
CN106713937A (en) * 2016-12-30 2017-05-24 广州虎牙信息科技有限公司 Video playing control method and device as well as terminal equipment
CN108093293A (en) * 2018-01-15 2018-05-29 北京奇艺世纪科技有限公司 A kind of Video Rendering method and system
CN109922360A (en) * 2019-03-07 2019-06-21 腾讯科技(深圳)有限公司 Method for processing video frequency, device and storage medium
CN110602551A (en) * 2019-08-22 2019-12-20 福建星网智慧科技股份有限公司 Media playing method, player, equipment and storage medium of android frame layer
CN111292387A (en) * 2020-01-16 2020-06-16 广州小鹏汽车科技有限公司 Dynamic picture loading method and device, storage medium and terminal equipment
CN113457160A (en) * 2021-07-15 2021-10-01 腾讯科技(深圳)有限公司 Data processing method and device, electronic equipment and computer readable storage medium
CN113672387A (en) * 2021-08-11 2021-11-19 上海交通大学 Remote calling graphics rendering method and system based on drawing programming interface
CN113674389A (en) * 2021-10-25 2021-11-19 深圳须弥云图空间科技有限公司 Scene rendering method and device, electronic equipment and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Yu-Jung Chen et al., "Distributed rendering: Interaction delay reduction in remote rendering with client-end GPU-accelerated scene warping technique", 2017 IEEE International Conference on Multimedia & Expo Workshops (ICMEW), 2017-09-07, full text. *
Ding Liguo et al., "Research on vector rendering technology for Web Maps based on HTML5", Engineering of Surveying and Mapping, no. 8, 2017-08-02, full text. *
Li Weinan, "Performance optimization of GPU virtualization scenarios using differential update technology", China Masters' Theses Full-text Database, no. 1, 2020-01-15, full text. *

Also Published As

Publication number Publication date
CN114222185A (en) 2022-03-22

Similar Documents

Publication Publication Date Title
CN111193876B (en) Method and device for adding special effect in video
CN112235604B (en) Rendering method and device, computer readable storage medium and electronic device
CN111899322B (en) Video processing method, animation rendering SDK, equipment and computer storage medium
EP3311565B1 (en) Low latency application streaming using temporal frame transformation
CN109040792B (en) Processing method for video redirection, cloud terminal and cloud desktop server
CN113457160A (en) Data processing method and device, electronic equipment and computer readable storage medium
CN113946402A (en) Cloud mobile phone acceleration method, system, equipment and storage medium based on rendering separation
CN113368492A (en) Rendering method and device
CN112929740A (en) Method, device, storage medium and equipment for rendering video stream
WO2022218042A1 (en) Video processing method and apparatus, and video player, electronic device and readable medium
CN114222185B (en) Video playing method, terminal equipment and storage medium
CN114570020A (en) Data processing method and system
CN113411660B (en) Video data processing method and device and electronic equipment
Zorrilla et al. HTML5-based system for interoperable 3D digital home applications
CN113411661B (en) Method, apparatus, device, storage medium and program product for recording information
CN112367295B (en) Plug-in display method and device, storage medium and electronic equipment
CN117065357A (en) Media data processing method, device, computer equipment and storage medium
CN111367598B (en) Method and device for processing action instruction, electronic equipment and computer readable storage medium
JPWO2014024255A1 (en) Terminal and video playback program
US9161009B2 (en) System, terminal device, and image capturing method
CN108235144B (en) Playing content obtaining method and device and computing equipment
CN112954452A (en) Video generation method, device, terminal and storage medium
CN114968152B (en) Method for reducing VIRTIO-GPU extra performance loss
CN116527983A (en) Page display method, device, equipment, storage medium and product
CN114430487A (en) Media display method and device and video processing equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant