CN116347158A - Video playing method and device, electronic equipment and computer readable storage medium - Google Patents
- Publication number
- CN116347158A (application CN202310413671.XA)
- Authority
- CN
- China
- Prior art keywords
- video
- data
- target
- encapsulation
- processing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440218—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/234309—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4 or from Quicktime to Realvideo
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4782—Web browsing, e.g. WebTV
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
Abstract
The application belongs to the technical field of video playing, and discloses a video playing method and apparatus, an electronic device, and a computer-readable storage medium. The method includes: when it is determined that a video playing request for a target video of a target device is received, pulling first video encapsulation data of the target video of the target device; decapsulating and processing the first video encapsulation data using a first specified protocol to obtain video processing data; encapsulating the video processing data using a second specified protocol to obtain second video encapsulation data; and sending the second video encapsulation data to a browser client, so that the browser client decapsulates and plays the second video encapsulation data. In this way, video data from different target devices can be uniformly converted into video data in a common format, solving the problem that a single client cannot play videos from different video capture devices.
Description
Technical Field
The present application relates to the field of video playing technologies, and in particular, to a method, an apparatus, an electronic device, and a computer readable storage medium for video playing.
Background
Because different video capture devices differ in model and operating system, a terminal device that plays video from different capture devices usually needs a corresponding client installed for each device.
Obviously, a single client cannot play videos from different video capture devices.
Disclosure of Invention
Embodiments of the present application provide a video playing method and apparatus, an electronic device, and a computer-readable storage medium, so as to solve the problem that a single client cannot play videos from different video capture devices.
In one aspect, a video playing method is provided, including:
when it is determined that a video playing request for a target video of a target device is received, pulling first video encapsulation data of the target video of the target device;
decapsulating and processing the first video encapsulation data using a first specified protocol to obtain video processing data;
encapsulating the video processing data using a second specified protocol to obtain second video encapsulation data;
and sending the second video encapsulation data to a browser client, so that the browser client decapsulates and plays the second video encapsulation data.
In one embodiment, pulling the first video encapsulation data of the target video of the target device when it is determined that the video playing request is received includes:
when it is determined that the video playing request sent by the browser client is received, sending a video pull request for the target video to the target device;
receiving the first video encapsulation data returned by the target device based on the video pull request, wherein the first video encapsulation data is obtained by encapsulating the target video based on a real-time streaming protocol.
In one embodiment, decapsulating and processing the first video encapsulation data using the first specified protocol to obtain the video processing data includes:
decapsulating the first video encapsulation data based on a real-time streaming protocol to obtain a decapsulated video stream;
performing video format processing and audio sampling processing, respectively, on the decapsulated video stream to obtain initial processing data;
and performing audio-video synchronization processing on the initial processing data to obtain the video processing data.
In one embodiment, encapsulating the video processing data using the second specified protocol to obtain the second video encapsulation data includes:
encapsulating the video processing data in the MPEG encapsulation format to generate the second video encapsulation data.
In one embodiment, sending the second video encapsulation data to the browser client includes:
sending the second video encapsulation data to the browser client based on the JSMpeg technology, so that the browser client decodes and plays the second video encapsulation data based on the JSMpeg technology.
In one aspect, a video playing apparatus is provided, including:
a pulling unit, configured to pull the first video encapsulation data of the target video of the target device when it is determined that a video playing request for the target video of the target device is received;
a decapsulation unit, configured to decapsulate and process the first video encapsulation data using a first specified protocol to obtain video processing data;
an encapsulation unit, configured to encapsulate the video processing data using a second specified protocol to obtain second video encapsulation data;
and a sending unit, configured to send the second video encapsulation data to the browser client, so that the browser client decapsulates and plays the second video encapsulation data.
In one embodiment, the pulling unit is configured to:
when it is determined that the video playing request sent by the browser client is received, send a video pull request for the target video to the target device;
receive the first video encapsulation data returned by the target device based on the video pull request, wherein the first video encapsulation data is obtained by encapsulating the target video based on a real-time streaming protocol.
In one embodiment, the decapsulation unit is configured to:
decapsulate the first video encapsulation data based on a real-time streaming protocol to obtain a decapsulated video stream;
perform video format processing and audio sampling processing, respectively, on the decapsulated video stream to obtain initial processing data;
and perform audio-video synchronization processing on the initial processing data to obtain the video processing data.
In one embodiment, the encapsulation unit is configured to:
encapsulate the video processing data in the MPEG encapsulation format to generate the second video encapsulation data.
In one embodiment, the sending unit is configured to:
send the second video encapsulation data to the browser client based on the JSMpeg technology, so that the browser client decodes and plays the second video encapsulation data based on the JSMpeg technology.
In one aspect, an electronic device is provided, including a processor and a memory storing computer-readable instructions that, when executed by the processor, perform the steps of the video playing method provided in any of the alternative implementations described above.
In one aspect, a computer-readable storage medium is provided, on which a computer program is stored which, when executed by a processor, performs the steps of the video playing method provided in any of the alternative implementations described above.
In one aspect, a computer program product is provided which, when run on a computer, causes the computer to perform the steps of the video playing method provided in any of the alternative implementations described above.
In the video playing method and apparatus, the electronic device, and the computer-readable storage medium provided by the embodiments of the application, when it is determined that a video playing request for a target video of a target device is received, the first video encapsulation data of the target video of the target device is pulled; the first video encapsulation data is decapsulated and processed using a first specified protocol to obtain video processing data; the video processing data is encapsulated using a second specified protocol to obtain second video encapsulation data; and the second video encapsulation data is sent to the browser client, so that the browser client decapsulates and plays the second video encapsulation data. In this way, video data from different target devices can be uniformly converted into video data in a common format, solving the problem that a single client cannot play videos from different video capture devices.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application. The objectives and other advantages of the application will be realized and attained by the structure particularly pointed out in the written description and claims thereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and should not be regarded as limiting the scope; other related drawings may be obtained from these drawings without inventive effort by a person skilled in the art.
Fig. 1 is a flowchart of a method for playing video according to an embodiment of the present application;
fig. 2 is a flowchart of a method for video pulling according to an embodiment of the present application;
fig. 3 is a schematic flow chart of video processing according to an embodiment of the present application;
fig. 4 is a block diagram of a video playing device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are only some embodiments of the present application, not all embodiments. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
Some of the terms referred to in the embodiments of the present application will be described first to facilitate understanding by those skilled in the art.
Terminal device: a mobile terminal, fixed terminal, or portable terminal, for example a mobile handset, station, unit, device, multimedia computer, multimedia tablet, internet node, communicator, desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, personal communication system device, personal navigation device, personal digital assistant, audio/video player, digital camera/camcorder, positioning device, television receiver, radio broadcast receiver, electronic book device, or game device, or any combination thereof, including the accessories and peripherals of these devices or any combination thereof. It is also contemplated that the terminal device can support any type of user interface (e.g., a wearable device).
Server: may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, big data, and artificial intelligence platforms.
In order to solve the problem that a single client cannot play videos from different video capture devices, the embodiments of the present application provide a video playing method and apparatus, an electronic device, and a computer-readable storage medium.
In the embodiments of the present application, the execution body is an electronic device, which may be a server or a terminal device.
Referring to fig. 1, a flowchart of a method for playing video according to an embodiment of the present application is shown, where a specific implementation flow of the method is as follows:
step 100: when determining that a video playing request of target video of target equipment is received, pulling first video packaging data of the target video of the target equipment; step 101: unpacking and processing the first video package data by adopting a first appointed protocol to obtain video processing data; step 102: encapsulating the video processing data by adopting a second designated protocol to obtain second video encapsulation data; step 103: and sending the second video encapsulation data to the browser client side, so that the browser client side decapsulates and plays the second video encapsulation data.
In one embodiment, to obtain a target video of a target device, the implementation process of step 100 may include:
s1001: and when the video playing request sent by the browser client is determined to be received, sending a video pulling request aiming at the target video to target equipment.
Specifically, the electronic device may establish a connection with the browser client using WebSocket protocol. And the browser client side responds to the video playing operation of the user, determines a device video source selected by the user, namely a target video of the target device, and sends a video playing request to the electronic device. The electronic device establishes a protocol connection with the target device using a transmission control protocol (TCP, transmission Control Protocol) protocol, and sends a video pull request to the target device based on the received video play request.
WebSocket is a protocol in which full duplex communications are performed over a single TCP connection.
S1002: receive the first video encapsulation data returned by the target device based on the video pull request.
The first video encapsulation data is obtained by encapsulating the target video based on the Real-Time Streaming Protocol (RTSP).
Referring to fig. 2, a flow chart of the video pulling method is shown. Fig. 2 includes the target device, a streaming media layer, and the electronic device. In one embodiment, the target device performs data conversion processing on the target video sequentially through the following modules: a software development kit (SDK) RGB (red-green-blue) or YUV (luma-chroma) module, an SDK H.264 frame module, an SDK MP4 module, an SDK RTSP streaming module, and an RTSP endpoint. The target device exposes the RTSP endpoint so that the standard bitstream produced by the data conversion processing (i.e., the first video encapsulation data) can be transmitted to the electronic device over the RTSP and TCP protocols.
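The pull request sent to the exposed RTSP endpoint begins with an RTSP DESCRIBE message. As a hedged sketch, the following builds such a message using the textual syntax of RFC 2326; the camera URL and the User-Agent string are hypothetical examples, not values from the patent.

```python
# Building the opening message of an RTSP pull session (RFC 2326 syntax).
# The URL and User-Agent below are illustrative assumptions.

def build_describe(url: str, cseq: int) -> str:
    # DESCRIBE asks the target device for the SDP description of its stream;
    # SETUP and PLAY requests would follow in a real session.
    return (
        f"DESCRIBE {url} RTSP/1.0\r\n"
        f"CSeq: {cseq}\r\n"
        "Accept: application/sdp\r\n"
        "User-Agent: demo-puller\r\n"
        "\r\n"
    )

request = build_describe("rtsp://192.168.1.10:554/stream1", 1)
```

Each subsequent request in the session would increment `CSeq`, and the device's replies carry the same sequence number back.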
In one embodiment, to process the video encapsulation data, the implementation process of step 101 may include:
S1011: based on the real-time streaming protocol, decapsulate the first video encapsulation data to obtain a decapsulated video stream.
Specifically, by integrating a streaming media component, streaming protocol conversion is performed; that is, the RTSP protocol is used to decapsulate the first video encapsulation data (i.e., the RTSP video stream) to obtain an H.264 bare stream (i.e., the decapsulated video stream).
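An H.264 bare stream in Annex-B form is a sequence of NAL units separated by 0x000001 or 0x00000001 start codes. As a hedged illustration of what decapsulation yields, the following sketch splits such a stream on start codes; the sample bytes are synthetic, not taken from any real camera stream.

```python
# Splitting a synthetic H.264 Annex-B "bare" stream into NAL units by
# scanning for 3-byte (and, via a leading zero, 4-byte) start codes.

def split_nals(stream: bytes) -> list[bytes]:
    nals, i, start = [], 0, None
    while i < len(stream):
        if stream[i:i + 3] == b"\x00\x00\x01":
            if start is not None:
                # Trailing zeros belong to a 4-byte start code, not the NAL.
                nals.append(stream[start:i].rstrip(b"\x00"))
            i += 3
            start = i
        else:
            i += 1
    if start is not None:
        nals.append(stream[start:])
    return nals

# Synthetic stream: SPS (0x67), PPS (0x68), IDR slice (0x65) headers with dummy payloads.
sample = b"\x00\x00\x00\x01\x67AA" + b"\x00\x00\x01\x68BB" + b"\x00\x00\x01\x65CC"
nals = split_nals(sample)
```

The first byte of each recovered unit carries the NAL type, which is how a downstream processor distinguishes parameter sets from picture slices.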
S1012: perform video format processing and audio sampling processing, respectively, on the decapsulated video stream to obtain initial processing data.
S1013: perform audio-video synchronization processing on the initial processing data to obtain the video processing data.
In one embodiment, to implement unified encapsulation of video, the implementation process of step 102 may include:
encapsulating the video processing data with an MPEG encapsulation component to generate an MPEG video stream (i.e., the second video encapsulation data).
In one embodiment, to implement video playing, the implementation process of step 103 may include:
sending the second video encapsulation data to the browser client based on the JSMpeg video player technology, so that the browser client decodes and plays the second video encapsulation data based on the JSMpeg technology.
Specifically, the browser client and the server transmit video frames (i.e., the second video encapsulation data) through WebSocket and JSMpeg, and the server transmits the second video encapsulation data point-to-point to the corresponding browser client; the browser client decodes and plays the video through the JSMpeg soft-decoding technology.
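Pushing the stream point-to-point over WebSocket amounts to slicing the MPEG stream into binary messages; keeping each message aligned to whole 188-byte transport packets means the browser-side JSMpeg decoder never receives a torn packet. The chunking below is a sketch under that assumption; the seven-packets-per-message figure is an illustrative choice, not a value from the patent.

```python
# Sketch of slicing an MPEG-TS stream into WebSocket binary messages,
# aligned to 188-byte packet boundaries. Chunk size is an assumption.

TS_PACKET_SIZE = 188

def ws_messages(stream: bytes, packets_per_msg: int = 7) -> list[bytes]:
    msg_size = TS_PACKET_SIZE * packets_per_msg
    return [stream[i:i + msg_size] for i in range(0, len(stream), msg_size)]

packet = b"\x47" + b"\x00" * 187   # one synthetic TS packet
stream = packet * 21               # 21 packets of buffered output
messages = ws_messages(stream)
```

In a real relay, each element of `messages` would be sent as one binary WebSocket frame, and the same buffered chunks could be fanned out to every client watching the stream, which is what makes multiplexing one source to many viewers cheap.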
Referring to fig. 3, a flow chart of video processing is shown. Fig. 3 includes the electronic device and the browser client. After receiving, through WebSocket, the video playing request sent by the browser client, the electronic device decapsulates the RTSP video stream using the RTSP protocol to obtain an H.264 bare stream, and performs audio processing, video processing (i.e., video format processing and audio sampling processing), and audio-video synchronization processing on the H.264 bare stream to obtain the video processing data; the video processing data is encoded into an MPEG video stream, and the MPEG video stream is pushed to the browser client.
The browser client renders and plays the MPEG video stream through the JSMpeg soft-decoding technology.
For a user, it is only necessary to open the H5 page in the browser client and select a target device, such as a camera, to see the device's real-time picture. No browser plug-in and no application (APP) from the camera provider needs to be installed, and target devices of any brand and model can be viewed. When the user exits or closes the playing page, streaming stops, saving streaming media resources; and when different users view the same video stream, the stream can be multiplexed, saving CPU resources.
For a developer, there is no need to integrate the commercial SDK of a camera manufacturer, nor to read the corresponding development documentation for debugging and adaptation; the learning cost is low and the solution is easy to pick up. The resolution, format, definition, frame rate, sound quality, and so on of the video stream can be flexibly configured according to user requirements, taking effect promptly, flexibly, and conveniently.
In this embodiment of the application, a terminal device running any operating system can play and preview video in real time without installing plug-ins or other APPs, with no restriction on device model or camera brand, and without re-adaptation when playing video on different platforms. Compatible video playing is thereby achieved, providing convenience for users.
Based on the same inventive concept, the embodiment of the present application further provides a video playing apparatus. Since the principle by which the apparatus solves the problem is similar to that of the video playing method, the implementation of the apparatus can refer to the implementation of the method, and repeated description is omitted.
Fig. 4 is a block diagram of a video playing device according to an embodiment of the present application, including:
a pulling unit 401, configured to pull the first video encapsulation data of the target video of the target device when it is determined that a video playing request for the target video of the target device is received;
a decapsulation unit 402, configured to decapsulate and process the first video encapsulation data using a first specified protocol to obtain video processing data;
an encapsulation unit 403, configured to encapsulate the video processing data using a second specified protocol to obtain second video encapsulation data;
and a sending unit 404, configured to send the second video encapsulation data to the browser client, so that the browser client decapsulates and plays the second video encapsulation data.
In one embodiment, the pulling unit 401 is configured to:
when it is determined that the video playing request sent by the browser client is received, send a video pull request for the target video to the target device;
receive the first video encapsulation data returned by the target device based on the video pull request, wherein the first video encapsulation data is obtained by encapsulating the target video based on a real-time streaming protocol.
In one embodiment, the decapsulation unit 402 is configured to:
decapsulate the first video encapsulation data based on a real-time streaming protocol to obtain a decapsulated video stream;
perform video format processing and audio sampling processing, respectively, on the decapsulated video stream to obtain initial processing data;
and perform audio-video synchronization processing on the initial processing data to obtain the video processing data.
In one embodiment, the encapsulation unit 403 is configured to:
encapsulate the video processing data in the MPEG encapsulation format to generate the second video encapsulation data.
In one embodiment, the sending unit 404 is configured to:
send the second video encapsulation data to the browser client based on the JSMpeg technology, so that the browser client decodes and plays the second video encapsulation data based on the JSMpeg technology.
In the video playing method and apparatus, the electronic device, and the computer-readable storage medium provided by the embodiments of the application, when it is determined that a video playing request for a target video of a target device is received, the first video encapsulation data of the target video of the target device is pulled; the first video encapsulation data is decapsulated and processed using a first specified protocol to obtain video processing data; the video processing data is encapsulated using a second specified protocol to obtain second video encapsulation data; and the second video encapsulation data is sent to the browser client, so that the browser client decapsulates and plays the second video encapsulation data. In this way, video data from different target devices can be uniformly converted into video data in a common format, solving the problem that a single client cannot play videos from different video capture devices.
Fig. 5 shows a schematic structural diagram of an electronic device 5000. Referring to fig. 5, an electronic device 5000 includes: the processor 5010 and the memory 5020, optionally, may also include a power supply 5030, a display unit 5040, and an input unit 5050.
The processor 5010 is a control center of the electronic device 5000, connects the respective components using various interfaces and lines, and performs various functions of the electronic device 5000 by running or executing software programs and/or data stored in the memory 5020, thereby performing overall monitoring of the electronic device 5000.
In the present embodiment, the processor 5010 executes the steps in the above embodiments when calling the computer program stored in the memory 5020.
Optionally, the processor 5010 may include one or more processing units; preferably, the processor 5010 may integrate an application processor, which mainly handles the operating system, user interface, applications, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 5010. In some embodiments, the processor and the memory may be implemented on a single chip; in other embodiments, they may be implemented on separate chips.
The memory 5020 may mainly include a program storage area and a data storage area; the program storage area may store an operating system, various applications, and the like, while the data storage area may store data created according to the use of the electronic device 5000, and the like. In addition, the memory 5020 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device or flash memory device, or other solid-state storage device, or the like.
The electronic device 5000 further includes a power supply 5030 (e.g., a battery) for powering the various components. The power supply 5030 may be logically connected to the processor 5010 through a power management system, so that charging, discharging, and power-consumption management are performed through the power management system.
The display unit 5040 may be used to display information input by a user or information provided to the user, various menus of the electronic device 5000, and the like. In the embodiments of the present application, it is mainly used to display the display interface of each application in the electronic device 5000 and objects such as text and pictures shown in that display interface. The display unit 5040 may include a display panel 5041, which may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like.
The input unit 5050 may be used to receive information such as numbers or characters input by a user. The input unit 5050 may include a touch panel 5051 and other input devices 5052. The touch panel 5051, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations performed by the user on or near the touch panel 5051 using a finger, a stylus, or any other suitable object or accessory).
Specifically, the touch panel 5051 may detect a user's touch operation, detect the signal resulting from the touch operation, convert the signal into touch point coordinates, send the coordinates to the processor 5010, and receive and execute commands sent by the processor 5010. The touch panel 5051 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Other input devices 5052 may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, an on-off key), a trackball, a mouse, and a joystick.
Of course, the touch panel 5051 may cover the display panel 5041. When the touch panel 5051 detects a touch operation on or near it, the operation is passed to the processor 5010 to determine the type of touch event, and the processor 5010 then provides a corresponding visual output on the display panel 5041 according to the type of touch event. Although in fig. 5 the touch panel 5051 and the display panel 5041 are shown as two separate components implementing the input and output functions of the electronic device 5000, in some embodiments the touch panel 5051 and the display panel 5041 may be integrated to implement those functions.
The electronic device 5000 may also include one or more sensors, such as pressure sensors, gravitational acceleration sensors, and proximity light sensors. Of course, the electronic device 5000 may also include other components, such as a camera, as needed in a specific application; since these components are not central to the embodiments of the present application, they are not shown in fig. 5 and are not described in detail.
It will be appreciated by those skilled in the art that fig. 5 is merely an example of an electronic device and is not meant to be limiting, and that more or fewer components than shown may be included, or certain components may be combined, or different components may be included.
In an embodiment of the present application, a computer-readable storage medium has stored thereon a computer program that, when executed by a processor, enables a communication device to perform the steps of the above-described embodiments.
For convenience of description, the above parts are described as separate modules (or units) divided by function. Of course, when implementing the present application, the functions of each module (or unit) may be implemented in the same one or more pieces of software or hardware.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various modifications and variations can be made in the present application without departing from the spirit or scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims and the equivalents thereof, the present application is intended to cover such modifications and variations.
Claims (12)
1. A video playing method, comprising:
when it is determined that a video playing request for a target video of a target device is received, pulling first video encapsulation data of the target video of the target device;
decapsulating and processing the first video encapsulation data using a first specified protocol to obtain video processing data;
encapsulating the video processing data using a second specified protocol to obtain second video encapsulation data; and
sending the second video encapsulation data to a browser client, so that the browser client decapsulates and plays the second video encapsulation data.
2. The method of claim 1, wherein the pulling first video encapsulation data of the target video of the target device when it is determined that the video playing request for the target video of the target device is received comprises:
when it is determined that the video playing request sent by the browser client is received, sending a video pulling request for the target video to the target device; and
receiving the first video encapsulation data returned by the target device based on the video pulling request, wherein the first video encapsulation data is obtained after the target video is encapsulated based on a real-time streaming protocol.
3. The method of claim 2, wherein the decapsulating and processing the first video encapsulation data using the first specified protocol to obtain video processing data comprises:
decapsulating the first video encapsulation data based on a real-time streaming protocol to obtain a decapsulated video stream;
performing video format processing and audio sampling processing, respectively, on the decapsulated video stream to obtain initial processing data; and
performing audio and video synchronization processing on the initial processing data to obtain the video processing data.
4. The method of claim 1, wherein the encapsulating the video processing data using a second specified protocol to obtain second video encapsulation data comprises:
encapsulating the video processing data using an MPEG encapsulation format to generate the second video encapsulation data.
5. The method of any of claims 1-4, wherein the sending the second video encapsulation data to a browser client comprises:
sending the second video encapsulation data to the browser client based on the JSMpeg video player technology, so that the browser client decodes and plays the second video encapsulation data based on the JSMpeg technology.
6. A video playing apparatus, comprising:
a pulling unit, configured to pull first video encapsulation data of a target video of a target device when it is determined that a video playing request for the target video of the target device is received;
a decapsulation unit, configured to decapsulate and process the first video encapsulation data using a first specified protocol to obtain video processing data;
an encapsulation unit, configured to encapsulate the video processing data using a second specified protocol to obtain second video encapsulation data; and
a sending unit, configured to send the second video encapsulation data to a browser client, so that the browser client decapsulates and plays the second video encapsulation data.
7. The apparatus of claim 6, wherein the pulling unit is configured to:
when it is determined that the video playing request sent by the browser client is received, send a video pulling request for the target video to the target device; and
receive the first video encapsulation data returned by the target device based on the video pulling request, wherein the first video encapsulation data is obtained after the target video is encapsulated based on a real-time streaming protocol.
8. The apparatus of claim 7, wherein the decapsulation unit is configured to:
decapsulate the first video encapsulation data based on a real-time streaming protocol to obtain a decapsulated video stream;
perform video format processing and audio sampling processing, respectively, on the decapsulated video stream to obtain initial processing data; and
perform audio and video synchronization processing on the initial processing data to obtain the video processing data.
9. The apparatus of claim 6, wherein the encapsulation unit is configured to:
encapsulate the video processing data using an MPEG encapsulation format to generate the second video encapsulation data.
10. The apparatus according to any of claims 6-9, wherein the sending unit is configured to:
and sending the second video encapsulation data to the browser client based on a JSMpeg technology of a video player, so that the browser client decodes and plays the second video encapsulation data based on the JSMpeg technology.
11. An electronic device comprising a processor and a memory storing computer readable instructions that, when executed by the processor, perform the method of any of claims 1-5.
12. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, performs the method according to any of claims 1-5.
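Claims 4 and 5 re-encapsulate the processed stream in an MPEG format that the JSMpeg player can decode in the browser; JSMpeg consumes MPEG transport streams of fixed 188-byte packets beginning with a 0x47 sync byte. The sketch below shows a heavily simplified version of that packetization step, assuming a single elementary stream; continuity counters, PES headers, and adaptation fields are omitted, and the PID value is an illustrative choice:

```python
TS_PACKET_SIZE = 188  # MPEG-TS packets are a fixed 188 bytes
SYNC_BYTE = 0x47      # every TS packet begins with this sync byte

def packetize(payload: bytes, pid: int = 0x100) -> list[bytes]:
    """Split a payload into fixed-size transport-stream packets.

    Simplified: real TS packets also carry the payload-unit-start flag,
    continuity counters, adaptation fields, and PES framing; here only
    the 4-byte header + stuffing layout is shown.
    """
    body_size = TS_PACKET_SIZE - 4  # 184 payload bytes per packet
    packets = []
    for off in range(0, len(payload), body_size):
        chunk = payload[off:off + body_size]
        header = bytes([
            SYNC_BYTE,
            (pid >> 8) & 0x1F,  # PID high bits (flag bits omitted)
            pid & 0xFF,         # PID low byte
            0x10,               # payload only, continuity counter 0 (simplified)
        ])
        # pad the final chunk with 0xFF stuffing so every packet is 188 bytes
        packets.append(header + chunk.ljust(body_size, b"\xff"))
    return packets
```

On the client side, the browser would feed such packets to the JSMpeg decoder (for example from a WebSocket source) rather than parsing them itself.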
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310413671.XA CN116347158A (en) | 2023-04-12 | 2023-04-12 | Video playing method and device, electronic equipment and computer readable storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116347158A true CN116347158A (en) | 2023-06-27 |
Family
ID=86889459
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117440186A (en) * | 2023-12-22 | 2024-01-23 | 深圳星网信通科技股份有限公司 | Video service integration method, video integration apparatus, and computer-readable storage medium |
CN117440186B (en) * | 2023-12-22 | 2024-05-28 | 深圳星网信通科技股份有限公司 | Video service integration method, video integration apparatus, and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||