CN112511887B - Video playing control method, corresponding device, equipment, system and storage medium


Info

Publication number
CN112511887B
CN112511887B (application number CN202011379312.XA)
Authority
CN
China
Prior art keywords
video
playing
thread
equipment
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011379312.XA
Other languages
Chinese (zh)
Other versions
CN112511887A (en)
Inventor
杨永贵 (Yang Yonggui)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd filed Critical BOE Technology Group Co Ltd
Priority to CN202011379312.XA priority Critical patent/CN112511887B/en
Publication of CN112511887A publication Critical patent/CN112511887A/en
Application granted granted Critical
Publication of CN112511887B publication Critical patent/CN112511887B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/44012Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N21/4402Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/472End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
    • H04N21/47217End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/60Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client 
    • H04N21/63Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
    • H04N21/643Communication protocols
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8547Content authoring involving timestamps for synchronizing content

Abstract

The application provides a video playing control method, and a corresponding apparatus, device, system and storage medium. The video playing control method comprises the following steps: during video playing, acquiring the video playing position of the master device and the video playing position of the present device; when the difference between the video playing position of the master device and the video playing position of the present device is greater than a difference threshold, pausing the playing of the video of the present device by a playing thread, and jumping the playing position of the playing thread from the current playing position to a key frame position corresponding to the video playing position of the master device; starting from the video playing position of the master device, decoding, rendering and playing each video frame through a decoding thread until the key frame position is reached; and starting from the key frame position, resuming the playing of the video of the present device by the playing thread. The application can realize video synchronization at any playing position among different display devices.

Description

Video playing control method, corresponding device, equipment, system and storage medium
Technical Field
The application relates to the technical field of video processing, in particular to a video playing control method and corresponding device, equipment, system and storage medium.
Background
With the increasing demand of digital cinema playing systems for definition, the resolution of the output picture far exceeds the output resolution of a single computer or display device. A plurality of display devices are therefore often required to splice a complete picture of ultra-high resolution together, and ensuring that the pictures output by all devices stay synchronized becomes a key problem.
A common soft synchronization method is the following: the whole video is split into equal parts, the resulting small videos are played on different display devices, and the playing time of the different display devices is synchronized through network communication, so that the ultra-high-resolution picture is spliced synchronously; during local synchronization, the jump is usually realized through the seekTo function (skip function) of the media player (the Android MediaPlayer).
However, this method has the following problem: because the videos distributed to different display devices are different, the key frames and decoding packets they contain are also different. For versions before Android 8.0, the seekTo function of the MediaPlayer only supports jumping to key frames and does not support jumping to an arbitrary playing position, so the key frames of the devices cannot be aligned and synchronous playing cannot be realized across the display devices.
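By way of illustration, the following minimal Java sketch shows the API difference underlying this problem; it is an illustrative sketch, not part of the patented scheme. Before Android 8.0 (API 26), MediaPlayer only exposes seekTo(int msec), which lands on a key frame; the mode-based overload that can reach an arbitrary frame only exists from API 26 onwards.

import android.media.MediaPlayer;
import android.os.Build;

public class SeekHelper {
    // Illustrative helper: frame-accurate seeking is only available from Android 8.0 (API 26).
    public static void seekExact(MediaPlayer player, long positionMs) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            // API 26+: SEEK_CLOSEST can land on an arbitrary (non-key) frame.
            player.seekTo(positionMs, MediaPlayer.SEEK_CLOSEST);
        } else {
            // Pre-O: only key-frame seeking exists, hence the dual-thread scheme described below.
            player.seekTo((int) positionMs);
        }
    }
}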
Disclosure of Invention
Aiming at the above defects of the prior art, the application provides a video playing control method, and a corresponding apparatus, device, system and storage medium, which are used to solve the technical problem that versions before Android 8.0 cannot realize synchronous playing at an arbitrary position.
In a first aspect, an embodiment of the present application provides a video play control method, which is applied to any slave device, including:
in the video playing process, acquiring a video playing position of the main equipment and a video playing position of the equipment;
when the difference value between the video playing position of the main equipment and the video playing position of the equipment is larger than a difference value threshold value, pausing the playing of the video of the equipment by a playing thread, and jumping the playing position of the playing thread from the current playing position to a key frame position corresponding to the video playing position of the main equipment;
starting from the video playing position of the main equipment, decoding, rendering and playing each video frame through a decoding thread until reaching the key frame position;
and starting from the key frame position, resuming the playing thread to play the video of the device.
In a second aspect, an embodiment of the present application provides a video play control apparatus, which is applied to any slave device, including:
the information acquisition module is used for acquiring the video playing position of the main equipment and the video playing position of the equipment in the video playing process;
the first playing control module is used for suspending the playing of the video of the equipment by the playing thread when the difference value between the video playing position of the main equipment and the video playing position of the equipment is larger than a difference value threshold value, and jumping the playing position of the playing thread from the current playing position to a key frame position corresponding to the video playing position of the main equipment;
the second play control module is used for decoding, rendering and playing each video frame through the decoding thread from the video play position of the main equipment until reaching the key frame position;
and the third play control module is used for recovering the play of the video of the equipment by the play thread from the key frame position.
In a third aspect, an embodiment of the present application provides a display apparatus, as a slave apparatus, including:
a memory;
a processor electrically connected to the memory;
the memory stores a computer program that is executed by the processor to implement the video play control method provided in the first aspect of the embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a video playing system, including: a display device as a master device and a display device as a slave device provided in the third aspect of the embodiment of the present application;
each display device is connected in series, and two adjacent display devices are connected in communication.
In a fifth aspect, an embodiment of the present application provides a computer readable storage medium storing a computer program, where the computer program when executed by a processor implements the video play control method provided in the first aspect of the embodiment of the present application.
The technical scheme provided by the embodiment of the application has at least the following beneficial effects:
the embodiment of the application adopts the play configuration of double threads (play threads and decoding threads), and can jump the video play position of the device from the current video play position to the received video play position of the main device through the decoding threads, thereby realizing video synchronization of any play position among different display devices, overcoming the defects of the prior art and solving the problem that the version before Android8.0 can not realize video synchronization of any position among different display devices; most of the existing boards are developed based on the version before Android8.0, and the development based on the version after Android8.0 is less, so that the technical scheme of the embodiment of the application has higher practicability.
Additional aspects and advantages of the application will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application.
Drawings
The foregoing and/or additional aspects and advantages of the application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
fig. 1 is a schematic diagram of a working principle of a video playing system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a communication principle of a master device and a slave device according to an embodiment of the present application;
fig. 3 is a schematic flow chart of a video playing control method according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a two-thread video switch according to an embodiment of the present application;
FIG. 5 is a schematic diagram illustrating a synchronization principle of a decoding thread according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a video playing control device according to an embodiment of the present application;
fig. 7 is a schematic structural frame of a display device according to an embodiment of the present application.
Detailed Description
The present application is described in detail below, and examples of embodiments of the application are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer throughout to the same or similar components, or to components having the same or similar functions. Further, detailed descriptions of known technologies are omitted where they are not necessary for the features of the present application being illustrated. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the application.
It will be understood by those skilled in the art that all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs unless defined otherwise. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise, as understood by those skilled in the art. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" as used herein includes all or any of, and all combinations of, one or more of the associated listed items.
The following describes the technical scheme of the present application and how the technical scheme of the present application solves the above technical problems in detail with specific embodiments.
The embodiment of the application provides a video playing system, which comprises: a display device as a master device and a display device as a slave device; each display device is connected in series, and two adjacent display devices are connected in communication.
Optionally, referring to the example of fig. 1, the master device in the embodiment of the present application is further configured to be communicatively connected to the server, receive a play plan (splice play plan) sent by the server, adjust clock information (i.e. time of the real-time clock RTC) of the board card of the device according to the play plan, make the clock information of the board card of the device coincide with the play plan, generate a timestamp based on the play plan, and send the timestamp to the connected slave device.
In an alternative embodiment, the slave device is configured to: acquiring a time stamp sent by a main device; and adjusting the clock information (namely the time of a real-time clock (RTC)) of the equipment board card according to the time stamp, synchronizing the clock information of the equipment board card (representing the display equipment executing the current step) with the clock information of the master equipment board card, and sending the time stamp to other slave equipment.
In another alternative implementation, referring to the example of fig. 1, the slave device in the embodiment of the present application is further configured to be in communication with a server, receive a play plan sent by the server, and adjust clock information of a board card of the device according to the play plan, so that the clock information of the board card of the device is consistent with the play plan.
Optionally, information transmission between the master device and the server, and between the slave devices and the server, can be realized based on an IOT (Internet of Things) communication mode, wherein the IOT communication mode is a persistent (long-lived) connection based on the MQTT (Message Queuing Telemetry Transport) protocol; the server sends the play plan to the master device and/or the slave devices based on the IOT communication mode, and the master device and/or the slave devices can acquire the clock information of the server (i.e., the time of the real-time clock RTC) from the play plan.
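By way of illustration, a minimal sketch of how a device might receive the play plan over MQTT is given below; the Eclipse Paho client, the broker URI, the topic name and the payload handling are assumptions, since the text only states that an MQTT-based persistent connection is used.

import org.eclipse.paho.client.mqttv3.IMqttMessageListener;
import org.eclipse.paho.client.mqttv3.MqttClient;
import org.eclipse.paho.client.mqttv3.MqttMessage;

public class PlayPlanReceiver {
    // Hypothetical subscriber: listens for the splice play plan pushed by the server.
    public void listenForPlan(String brokerUri, String clientId, String planTopic) throws Exception {
        MqttClient client = new MqttClient(brokerUri, clientId);
        client.connect();
        client.subscribe(planTopic, new IMqttMessageListener() {
            @Override
            public void messageArrived(String topic, MqttMessage message) {
                String plan = new String(message.getPayload());
                // Parse the play plan and adjust the board's RTC from it (not shown).
            }
        });
    }
}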
Alternatively, referring to the example of fig. 1, the transmission of the time stamp may be implemented by serial communication between the master device and the slave device, and between the slave device and the slave device.
Optionally, the master device is configured with a playing thread and a sending thread. The playing thread comprises a MediaPlayer (media player), a TextureView (texture view control) and a MediaExtractor, wherein the MediaPlayer is used for playing the video, the TextureView is used for rendering the video, and the MediaExtractor is used for acquiring the video playing position of the device; the sending thread (Send Thread) is used to send the video playing position to the slave devices at regular intervals.
Optionally, each slave device is configured with a playing thread, a decoding thread and a receiving thread. The playing thread is the same as that of the master device; the decoding thread comprises a MediaCodec (decoder) for decoding the video and a TextureView (texture view control) for rendering the video; the receiving thread (Receive Thread) is used for receiving the video playing position sent by the master device.
Alternatively, referring to the example of fig. 2, transmission of the video playing position between the master device and the slave device may be implemented based on UDP (User Datagram Protocol ), where the sending Thread of the master device is a UDP-based Thread UDP Send Thread, and the receiving Thread of the slave device is a UDP-based Thread UDP Receive Thread.
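A minimal sketch of such a UDP Send Thread on the master is shown below; the port number and the 500 ms reporting interval are illustrative assumptions, the only requirement stated in the text being that the master reports its play position to the slaves at regular intervals.

import android.media.MediaPlayer;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.ByteBuffer;

public class UdpSendThread extends Thread {
    private final MediaPlayer player;      // the play thread's MediaPlayer
    private final InetAddress slaveAddr;   // address of a slave device
    private static final int PORT = 9000;  // assumed port

    public UdpSendThread(MediaPlayer player, InetAddress slaveAddr) {
        this.player = player;
        this.slaveAddr = slaveAddr;
    }

    @Override
    public void run() {
        try (DatagramSocket socket = new DatagramSocket()) {
            while (!isInterrupted()) {
                // Current master play position in milliseconds.
                byte[] payload = ByteBuffer.allocate(Long.BYTES)
                        .putLong(player.getCurrentPosition())
                        .array();
                socket.send(new DatagramPacket(payload, payload.length, slaveAddr, PORT));
                Thread.sleep(500);  // assumed reporting interval
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}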
When receiving the video playing position sent by the master device, the slave device in the embodiment of the application can adjust the video playing position of the slave device based on the video playing position so as to realize synchronous playing with the video of the master device, and the specific principle of the slave device is described in detail in the following method embodiments.
Based on the same inventive concept, an embodiment of the present application provides a video play control method, which can be applied to any slave device, as shown in fig. 3, and the method includes:
s301, acquiring a video playing position of the main equipment and a video playing position of the equipment in the video playing process.
Optionally, before the video playing position of the master device and the video playing position of the device are obtained, any one of the following time synchronization operations is further included:
mode one: acquiring a time stamp sent by a main device; according to the time stamp, the clock information of the equipment board card is adjusted, so that the clock information of the equipment board card is synchronous with the clock information of the master equipment board card, and the time stamp is sent to other slave equipment; the time stamp is generated by the master device based on the play plan.
A second mode, obtaining a playing plan sent by a server; and adjusting the clock information of the equipment board card according to the playing plan so that the clock information of the equipment board card is consistent with the playing plan.
The process can realize time synchronization among different display devices, and is favorable for realizing synchronization of video playing on the basis of time synchronization.
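A minimal sketch of the time synchronization on a slave (mode one above) is given below, assuming the board allows its real-time clock to be set through SystemClock (which requires a privileged system build) and that forwardToNextSlave() writes the timestamp to the serial link toward the next slave; both are assumptions beyond the text.

import android.os.SystemClock;

public class ClockSync {
    // Applies the master's timestamp to this board's RTC and forwards it along the chain.
    public void applyMasterTimestamp(long masterTimestampMs) {
        SystemClock.setCurrentTimeMillis(masterTimestampMs);  // privileged call on most builds
        forwardToNextSlave(masterTimestampMs);
    }

    private void forwardToNextSlave(long timestampMs) {
        // Hypothetical: write the timestamp to the serial port connecting the next slave.
    }
}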
S302, when the difference value between the video playing position of the main device and the video playing position of the device is larger than a difference value threshold, the playing of the video of the device by the playing thread is paused, and the playing position of the playing thread is jumped to a key frame position corresponding to the video playing position of the main device from the current playing position.
The difference threshold value in the embodiment of the application can be set according to actual requirements, and when the difference value between the video playing position of the main equipment and the video playing position of the equipment is smaller than or equal to the difference threshold value, the video playing position of the equipment is not required to be adjusted.
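A short sketch of this decision is given below; the 100 ms threshold is an illustrative assumption, as the text only requires a configurable difference threshold.

import android.media.MediaPlayer;

public class SyncController {
    private static final long DIFF_THRESHOLD_MS = 100;   // assumed threshold value

    // Returns true when the slave must run the catch-up procedure (steps S302 to S304).
    public boolean needsResync(MediaPlayer player, long masterPosMs) {
        long localPosMs = player.getCurrentPosition();
        return Math.abs(masterPosMs - localPosMs) > DIFF_THRESHOLD_MS;
    }
}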
Optionally, referring to Scene1 in fig. 4, suspending the playing of the video of the device by the playing thread includes:
saving the current SurfaceTexture as SurfaceTexture1 (the first surface texture object); invoking onSurfaceTextureDestroyed, and removing TextureView1 (the texture view control of the playing thread) from the ViewGroup.
The above operation suspends the playing of the video of the device by the playing thread, so as to prepare for video playing by the decoding thread and to realize the transition to the decoding thread.
Optionally, in the embodiment of the present application, the playing position of the playing thread may be jumped from the current playing position to the key frame position corresponding to the video playing position of the master device through the seekTo function, where the key frame position corresponding to the video playing position of the master device is specifically the position of the first key frame after the video playing position of the master device.
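A minimal sketch of locating that key frame with MediaExtractor and parking the playing thread there follows; the helper and parameter names are assumptions.

import android.media.MediaExtractor;
import android.media.MediaPlayer;

public class KeyFrameLocator {
    // Returns the position (in microseconds) of the first key frame at or after the master's position.
    public static long nextKeyFrameUs(MediaExtractor extractor, long masterPosMs) {
        extractor.seekTo(masterPosMs * 1000L, MediaExtractor.SEEK_TO_NEXT_SYNC);
        return extractor.getSampleTime();
    }

    // Pauses the play thread's player and parks it on the key frame.
    public static void parkPlayerAtKeyFrame(MediaPlayer player, long keyFrameUs) {
        player.pause();
        // Pre-Android-8.0 seekTo() lands on a key frame anyway, which is exactly
        // the position at which the playing thread must wait (7700 ms in the Fig. 5 example).
        player.seekTo((int) (keyFrameUs / 1000L));
    }
}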
S303, starting from the video playing position of the main equipment, decoding, rendering and playing each video frame through the decoding thread until reaching the key frame position.
Fig. 5 illustrates the working principle of the decoding thread. The part from Video sequence header to Video sequence tail in fig. 5 is the whole video frame sequence of the video of the device; a plurality of GOPs (Groups Of Pictures) lie between Video sequence header and Video sequence tail, and each GOP comprises a plurality of video frames (including I frames, P frames and B frames). The position 7500 ms (milliseconds) pointed to by Seek in fig. 5 is the acquired video playing position of the master device, specifically the position of the first B frame in a GOP; the position 7700 ms pointed to by Result is the position of the key frame (Key Frame in fig. 5) corresponding to the video playing position of the master device, specifically the position of the I frame in the GOP following the GOP in which the video playing position of the master device lies.
After receiving the video playing position of the master device and determining that synchronous adjustment of the device is needed, the playing thread (MediaPlayer + TextureView1 + MediaExtractor) pauses playing at the current video playing position of the device, and jumps to the key frame position 7700 ms through the seekTo function; the decoding thread (MediaCodec + TextureView2) acquires the key frame position through the MediaExtractor, and decodes, renders and plays each video frame starting from the 7500 ms position until the key frame position 7700 ms is reached; the playing thread (MediaPlayer + TextureView1 + MediaExtractor) then resumes the video from the key frame position 7700 ms and continues playing the following video frames.
The decoding thread of the embodiment of the application can thus jump directly from the current video playing position of the device to the video playing position of the master device and start decoding from the video playing position of the master device, which need not be a key frame position.
Optionally, decoding, rendering, and playing each video frame by a decoding thread, including: decoding each video frame by a decoder; rendering the decoded video frame through the texture view control; and calling a display thread to play the rendered video frame.
Optionally, before decoding each video frame by the decoder, the method further comprises: initializing the decoder and the video resource of the decoder. In the embodiment of the application, the video resource of the decoder is the video data to be decoded, namely the video data between the video playing position of the master device and the key frame; the initialization of the video resource comprises obtaining the video data between the video playing position of the master device and the key frame, and after the initialization this video data is asynchronously called back to the decoder for decoding.
Referring to the example of fig. 4, the initialization operation of the decoder (Initialize MediaCodec) may be performed while the playing of the video of the device by the playing thread is paused (see the sketch below).
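By way of illustration, the following Java sketch shows such a catch-up decoding pass with MediaCodec: samples are fed to the decoder starting from the sync frame preceding the master's play position (so that the decoder has a valid reference frame), and only frames from the master's play position up to the key frame position are released to the Surface. Error handling, output-format changes and frame pacing by the display thread are simplified, and the class and parameter names are assumptions rather than the patented implementation.

import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.view.Surface;
import java.nio.ByteBuffer;

public class CatchUpDecoder {
    // Decodes from masterPosUs (microseconds) until keyFrameUs, rendering into the TextureView's Surface.
    public void decodeUpTo(MediaExtractor extractor, MediaFormat format, Surface surface,
                           long masterPosUs, long keyFrameUs) throws Exception {
        MediaCodec codec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
        codec.configure(format, surface, null, 0);
        codec.start();

        // Feed from the closest preceding sync frame so that non-key frames can be reconstructed.
        extractor.seekTo(masterPosUs, MediaExtractor.SEEK_TO_PREVIOUS_SYNC);

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean done = false;
        while (!done) {
            int inIndex = codec.dequeueInputBuffer(10_000);
            if (inIndex >= 0) {
                ByteBuffer inBuf = codec.getInputBuffer(inIndex);
                int size = extractor.readSampleData(inBuf, 0);
                if (size >= 0) {
                    codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                    extractor.advance();
                } else {
                    codec.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10_000);
            if (outIndex >= 0) {
                // Render only the frames at or after the master's play position.
                boolean render = info.presentationTimeUs >= masterPosUs;
                codec.releaseOutputBuffer(outIndex, render);
                if (info.presentationTimeUs >= keyFrameUs
                        || (info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    done = true;  // hand playback back to the playing thread at the key frame
                }
            }
        }
        codec.stop();
        codec.release();
    }
}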
Optionally, before the display thread is invoked to play the rendered video frames, the method further comprises: initializing the display thread mHandler.post(mRunnable), wherein the initialization operation comprises setting the time interval between the playing of two adjacent frames; after initialization, the display thread mHandler.post(mRunnable) can be called whenever video playing needs to be started.
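A minimal sketch of such a display thread is shown below; the 40 ms frame interval (25 fps) and the renderNextFrame() hook are illustrative assumptions.

import android.os.Handler;
import android.os.Looper;

public class DisplayThread {
    private static final long FRAME_INTERVAL_MS = 40;   // assumed interval between two adjacent frames
    private final Handler mHandler = new Handler(Looper.getMainLooper());
    private final Runnable mRunnable = new Runnable() {
        @Override
        public void run() {
            renderNextFrame();                            // hypothetical hook into the decoding thread
            mHandler.postDelayed(this, FRAME_INTERVAL_MS);
        }
    };

    public void start() {
        mHandler.post(mRunnable);   // corresponds to the mHandler.post(mRunnable) call in the text
    }

    public void stop() {
        mHandler.removeCallbacks(mRunnable);
    }

    private void renderNextFrame() {
        // Release the next decoded output buffer to the Surface (see the decoding sketch above).
    }
}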
Optionally, referring to Scene2 in fig. 4, rendering the decoded video frames through the texture view control of the decoding thread includes:
calling onSurfaceTextureAvailable, adding TextureView2 (the texture view control of the decoding thread) to the ViewGroup, and passing SurfaceTexture1 into TextureView2 for rendering.
The above operation renders the decoded video frames, which are then played, so that a natural switch from video playing by the playing thread to video playing by the decoding thread can be realized.
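A minimal sketch of this SurfaceTexture hand-over between the two TextureViews is given below; the container and view names are assumptions, and the SurfaceTextureListener of TextureView1 is assumed to return false from onSurfaceTextureDestroyed so that the saved texture is not released.

import android.graphics.SurfaceTexture;
import android.view.TextureView;
import android.view.ViewGroup;

public class SurfaceSwitcher {
    private SurfaceTexture surfaceTexture1;

    // Scene 1: detach the play thread's rendering target and keep its SurfaceTexture.
    // Scene 2: re-attach the saved texture to the decoding thread's TextureView,
    // so both threads draw into the same on-screen surface.
    public void handOverToDecoder(ViewGroup container, TextureView textureView1,
                                  TextureView textureView2) {
        surfaceTexture1 = textureView1.getSurfaceTexture();  // save as SurfaceTexture1
        container.removeView(textureView1);                  // triggers onSurfaceTextureDestroyed

        container.addView(textureView2);                     // decoding thread's TextureView
        textureView2.setSurfaceTexture(surfaceTexture1);     // decoder renders into the same texture
    }
}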
S304, starting from the key frame position, resuming the playing thread to play the video of the device.
Optionally, referring to scenario one in fig. 4, resuming the playing of the video of the device by the playing thread includes:
saving the current SurfaceTexture as SurfaceTexture2; calling onSurfaceTextureDestroyed, and removing TextureView2 from the ViewGroup; calling onSurfaceTextureAvailable, adding TextureView1 to the ViewGroup, passing SurfaceTexture2 into TextureView1 for rendering, and playing the rendered SurfaceTexture2 through the MediaPlayer.
In this way the decoding thread is naturally switched back to the playing thread, and the subsequent video frames are again rendered and played by the playing thread, so that synchronization at the key frame is realized and synchronization of the subsequent video frames follows.
The decoder initialization operation (Initialize MediaCodec) in fig. 4 need not be performed when resuming the playing of the video of the device by the playing thread.
Based on the same inventive concept, an embodiment of the present application provides a video play control device. As shown in fig. 6, the video play control device 600 comprises: an information acquisition module 601, a first play control module 602, a second play control module 603, and a third play control module 604.
The information obtaining module 601 is configured to obtain the video playing position of the master device and the video playing position of the present device during video playing;
a first play control module 602, configured to suspend the playing of the video of the present device by the playing thread when the difference between the video playing position of the master device and the video playing position of the present device is greater than a difference threshold, and to jump the playing position of the playing thread from the current playing position to the key frame position corresponding to the video playing position of the master device;
a second play control module 603, configured to decode, render, and play each video frame by a decoding thread from the video play position of the master device until the key frame position is reached;
and the third play control module 604 is configured to resume, from the key frame position, playing of the video of the device by the play thread.
Optionally, the video playing control device provided by the embodiment of the present application further includes: and a clock synchronization module.
The clock synchronization module is used for acquiring the time stamp sent by the master device; adjusting clock information of the equipment board card according to the time stamp, synchronizing the clock information of the equipment board card with the clock information of the master equipment board card, and sending the time stamp to other slave equipment; the time stamp is generated by the master device based on a play plan.
Optionally, when the playing thread pauses playing the video of the device, the first playing control module 602 is specifically configured to: saving the current surface texture object as a first surface texture object; invoking a surface texture object elimination control, and removing the texture view control of the playing thread in a view container.
Optionally, the second play control module 603 is specifically configured to: decoding each video frame by the decoder; rendering the decoded video frame through the texture view control; and calling a display thread to play the rendered video frame.
Optionally, when the decoded video frame is rendered through the texture view control, the second play control module 603 is specifically configured to: and calling a surface texture object adding control, adding the texture view control of the decoding thread in a view container, and inputting the first surface texture object into the texture view control of the decoding thread for rendering.
Optionally, when resuming the playing of the video of the device by the playing thread, the third playing control module 604 is specifically configured to: saving the current surface texture object as a second surface texture object; invoking a surface texture object elimination control, removing the texture view control of the decoding thread in a view container; and calling a surface texture object adding control, adding the texture view control of the playing thread in a view container, inputting the second surface texture object into the texture view control of the playing thread for rendering, and playing the rendered second surface texture object through a media player in the playing thread.
The video play control device 600 of this embodiment may execute any video play control method provided in the embodiments of the present application, and its implementation principle is similar; for content not described in detail in this embodiment, reference may be made to the foregoing embodiments, which are not repeated here.
Based on the same inventive concept, an embodiment of the present application provides a display apparatus including: the device comprises a memory and a processor, wherein the memory is electrically connected with the processor.
The memory stores a computer program that is executed by the processor to implement any of the video play control methods provided by the embodiments of the present application.
Those skilled in the art will appreciate that the display devices provided by the embodiments of the present application may be specially designed and manufactured for the required purposes, or may comprise known devices of a general-purpose computer. These devices have computer programs stored therein that are selectively activated or reconfigured. Such a computer program may be stored in a device-readable (e.g., computer-readable) medium, or in any type of medium suitable for storing electronic instructions and coupled to a bus.
In an alternative embodiment, the present application provides a display device. As shown in fig. 7, the display device 700 comprises a memory 701 and a processor 702, which are electrically connected, for example via a bus 703.
Optionally, the memory 701 is used for storing application program codes for executing the inventive arrangements, and the execution is controlled by the processor 702. The processor 702 is configured to execute application program codes stored in the memory 701, so as to implement any one of the video play control methods provided in the embodiments of the present application.
The memory 701 may be, but is not limited to, a ROM (Read-Only Memory) or other type of static storage device capable of storing static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device capable of storing information and instructions, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a CD-ROM (Compact Disc Read-Only Memory) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
The processor 702 may be a CPU (Central Processing Unit), a general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or perform the various exemplary logic blocks, modules and circuits described in connection with this disclosure. The processor 702 may also be a combination implementing computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
Bus 703 may include a path for transferring information between the above components. The bus may be a PCI (Peripheral Component Interconnect) bus or an EISA (Extended Industry Standard Architecture) bus. Buses may be divided into address buses, data buses, control buses, and so on. For ease of illustration, only one thick line is shown in fig. 7, but this does not mean that there is only one bus or one type of bus.
Optionally, the display device 700 may also include a transceiver 704. The transceiver 704 may be used for both reception and transmission of signals, and may allow the display device 700 to communicate wirelessly or by wire with other devices to exchange data. It should be noted that, in practical applications, the number of transceivers 704 is not limited to one.
Optionally, the display device 700 may further include an input unit 705. The input unit 705 may be used to receive input digital, character, image and/or sound information, or to generate key signal inputs related to user settings and function control of the display device 700. The input unit 705 may include, but is not limited to, one or more of a touch screen, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, a camera, a microphone, and the like.
Optionally, the display device 700 may further include an output unit 706. The output unit 706 may be used to output or present information processed by the processor 702. The output unit 706 may include, but is not limited to, one or more of a display device, a speaker, a vibration device, and the like.
While fig. 7 shows a display apparatus 700 having various devices, it is to be understood that not all illustrated devices are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
Based on the same inventive concept, the embodiments of the present application provide a computer readable storage medium having a computer program stored thereon, which when executed by a processor, implements any of the video play control methods provided by the embodiments of the present application.
The computer readable medium includes, but is not limited to, any type of disk including floppy disks, hard disks, optical disks, CD-ROMs, and magneto-optical disks, ROM, RAM, EPROM (Erasable Programmable Read-Only Memory), EEPROMs, flash Memory, magnetic cards, or optical cards. That is, a readable medium includes any medium that stores or transmits information in a form readable by a device (e.g., a computer).
The embodiment of the application provides a computer readable storage medium that is applicable to any of the above video playing control methods, which are not described here again.
By applying the technical scheme of the embodiment of the application, at least the following beneficial effects can be realized:
1) The embodiment of the application adopts a dual-thread playing configuration (a playing thread and a decoding thread), and can jump the video playing position of the device from its current video playing position to the received video playing position of the master device through the decoding thread, thereby realizing video synchronization at any playing position among different display devices, overcoming the defects of the prior art, and solving the problem that versions before Android 8.0 (Android 7.1.2 and below) cannot realize video synchronization at an arbitrary position among different display devices; most existing boards are developed based on versions before Android 8.0, and relatively few are developed based on versions after Android 8.0, so the technical scheme of the embodiment of the application has high practicability.
2) The embodiment of the application can complete the synchronization of the system time among different display devices based on the playing plan, and is beneficial to realizing the synchronization of video playing on the basis of the synchronization of the system time.
3) According to the embodiment of the application, through a double-thread (playing thread and decoding thread) video switching scheme, the naturalness of switching can be ensured, and seamless switching is facilitated.
4) The technical scheme of the embodiment of the application adapts and adjusts only the application layer of the Android system, which increases application flexibility, reduces the adaptation cost at the system layer, and can be applied to most boards, thereby reducing the limitations in terms of board selection.
Those of skill in the art will appreciate that the various operations, methods, steps in the flow, acts, schemes, and alternatives discussed in the present application may be alternated, altered, combined, or eliminated. Further, other steps, means, or steps in a process having various operations, methods, or procedures discussed herein may be alternated, altered, rearranged, disassembled, combined, or eliminated. Further, steps, measures, schemes in the prior art with various operations, methods, flows disclosed in the present application may also be alternated, altered, rearranged, decomposed, combined, or deleted.
In the description of the present application, it should be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying a relative importance or number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
It should be understood that, although the steps in the flowcharts of the figures are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited in order and may be performed in other orders, unless explicitly stated herein. Moreover, at least some of the steps in the flowcharts of the figures may include a plurality of sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, the order of their execution not necessarily being sequential, but may be performed in turn or alternately with other steps or at least a portion of the other steps or stages.
The foregoing is only a partial embodiment of the present application, and it should be noted that it will be apparent to those skilled in the art that modifications and adaptations can be made without departing from the principles of the present application, and such modifications and adaptations are intended to be comprehended within the scope of the present application.

Claims (8)

1. A video play control method applied to any slave device, comprising:
in the video playing process, acquiring a video playing position of the main equipment and a video playing position of the equipment;
when the difference value between the video playing position of the main equipment and the video playing position of the equipment is larger than a difference value threshold value, the current surface texture object is stored as a first surface texture object; invoking a surface texture object elimination control, removing a texture view control of a playing thread in a view container, and jumping the playing position of the playing thread from the current playing position to a key frame position corresponding to the video playing position of the main equipment;
decoding each video frame by a decoder from the video playing position of the main device; calling a surface texture object adding control, adding a texture view control of a decoding thread in a view container, and inputting the first surface texture object into the texture view control of the decoding thread for rendering; calling a display thread to play the rendered video frame until reaching the position of the key frame;
starting from the key frame position, saving the current surface texture object as a second surface texture object; invoking a surface texture object elimination control, and removing a texture view control of the decoding thread in a view container; and calling a surface texture object adding control, adding the texture view control of the playing thread in a view container, inputting the second surface texture object into the texture view control of the playing thread for rendering, and playing the rendered second surface texture object through a media player in the playing thread.
2. The method according to claim 1, wherein before the step of obtaining the video playing position of the master device and the video playing position of the own device, further comprises:
acquiring a time stamp sent by the master device; the time stamp is generated by the master device based on a play plan;
and adjusting the clock information of the equipment board card according to the time stamp, synchronizing the clock information of the equipment board card with the clock information of the master equipment board card, and sending the time stamp to other slave equipment.
3. A video play control device applied to any slave device, comprising:
the information acquisition module is used for acquiring the video playing position of the main equipment and the video playing position of the equipment in the video playing process;
the first playing control module is used for storing the current surface texture object as a first surface texture object when the difference value between the video playing position of the main equipment and the video playing position of the equipment is larger than a difference value threshold value; invoking a surface texture object elimination control, removing a texture view control of a playing thread in a view container, and jumping the playing position of the playing thread from the current playing position to a key frame position corresponding to the video playing position of the main equipment;
the second playing control module is used for decoding each video frame from the video playing position of the main equipment through a decoder; calling a surface texture object adding control, adding a texture view control of a decoding thread in a view container, and inputting the first surface texture object into the texture view control of the decoding thread for rendering; calling a display thread to play the rendered video frame until reaching the position of the key frame;
the third play control module is used for storing the current surface texture object as a second surface texture object from the key frame position; invoking a surface texture object elimination control, and removing a texture view control of the decoding thread in a view container; and calling a surface texture object adding control, adding the texture view control of the playing thread in a view container, inputting the second surface texture object into the texture view control of the playing thread for rendering, and playing the rendered second surface texture object through a media player in the playing thread.
4. The video playback control device according to claim 3, further comprising:
the clock synchronization module is used for acquiring the time stamp sent by the master device; adjusting clock information of the equipment board card according to the time stamp, synchronizing the clock information of the equipment board card with the clock information of the master equipment board card, and sending the time stamp to other slave equipment;
the time stamp is generated by the master device based on a play plan.
5. A display device, as a slave device, characterized by comprising:
a memory;
a processor electrically connected to the memory;
the memory stores a computer program that is executed by the processor to implement the video playback control method according to claim 1 or 2.
6. A video playback system, comprising: a display device as a master device and a display device as a slave device as claimed in claim 5;
each display device is connected in series, and two adjacent display devices are connected in communication.
7. The video playback system of claim 6, wherein,
the master device is further configured to be in communication connection with a server, receive a play plan sent by the server, adjust clock information of a board card of the device according to the play plan, enable the clock information of the board card of the device to be consistent with the play plan, generate a time stamp based on the play plan, and send the time stamp to the connected slave device.
8. A computer-readable storage medium, characterized in that a computer program is stored, which, when executed by a processor, implements the video play control method according to claim 1 or 2.
CN202011379312.XA 2020-11-30 2020-11-30 Video playing control method, corresponding device, equipment, system and storage medium Active CN112511887B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011379312.XA CN112511887B (en) 2020-11-30 2020-11-30 Video playing control method, corresponding device, equipment, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011379312.XA CN112511887B (en) 2020-11-30 2020-11-30 Video playing control method, corresponding device, equipment, system and storage medium

Publications (2)

Publication Number Publication Date
CN112511887A CN112511887A (en) 2021-03-16
CN112511887B true CN112511887B (en) 2023-10-13

Family

ID=74968907

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011379312.XA Active CN112511887B (en) 2020-11-30 2020-11-30 Video playing control method, corresponding device, equipment, system and storage medium

Country Status (1)

Country Link
CN (1) CN112511887B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115297274A (en) * 2022-08-04 2022-11-04 京东方科技集团股份有限公司 Multi-screen video display method, system, playing end and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2479984A1 (en) * 2011-01-19 2012-07-25 Thomson Licensing Device and method for synchronizing content received from different sources
CN107801085A (en) * 2017-09-30 2018-03-13 口碑(上海)信息技术有限公司 A kind of video playing control method and device
CN108650541A (en) * 2018-05-09 2018-10-12 福建星网视易信息系统有限公司 Realize that the method and system of video is played simultaneously in distinct device
CN110636370A (en) * 2018-06-25 2019-12-31 阿里巴巴集团控股有限公司 Video processing method and device, electronic equipment and readable medium
CN110719524A (en) * 2019-10-16 2020-01-21 腾讯科技(深圳)有限公司 Video playing method and device, intelligent playing equipment and storage medium
CN111432260A (en) * 2020-03-31 2020-07-17 腾讯科技(深圳)有限公司 Method, device and equipment for synchronizing multiple paths of video pictures and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10705709B2 (en) * 2018-03-05 2020-07-07 Qingdao Hisense Media Network Technology Co., Ltd. Playing media

Also Published As

Publication number Publication date
CN112511887A (en) 2021-03-16

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant