CN114302202B - Audio and video screen projection method, device, equipment, system and storage medium - Google Patents

Info

Publication number: CN114302202B
Application number: CN202111584667.7A
Authority: CN (China)
Prior art keywords: audio, video, data, plug-in, screen projection
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN114302202A
Inventor: 刘兆祥
Current assignee: Shenzhen Skyworth RGB Electronics Co Ltd
Original assignee: Shenzhen Skyworth RGB Electronics Co Ltd
Application filed by Shenzhen Skyworth RGB Electronics Co Ltd
Priority application: CN202111584667.7A
Related PCT application: PCT/CN2021/142286 (WO2023115618A1)
Publication of application: CN114302202A
Publication of grant: CN114302202B

Classifications

    • G06F 9/445: Program loading or initiating (arrangements for executing specific programs)
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; elementary client operations; client middleware
    • H04N 21/431: Generation of visual interfaces for content selection or interaction; content or additional data rendering
    • H04N 21/434: Disassembling of a multiplex stream, e.g. demultiplexing audio and video streams, extraction of additional data from a video stream; remultiplexing of multiplex streams; extraction or processing of SI
    • H04N 21/439: Processing of audio elementary streams
    • H04N 21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention discloses an audio and video screen projection method, device, equipment, system and storage medium, relating to the technical field of screen projection. The method comprises the following steps: acquiring audio and video data sent by a terminal device; performing type analysis on the audio and video data based on a pre-stored type database to obtain the corresponding coding format; obtaining a target solution composite (i.e. de-multiplexing) plug-in according to the coding format and a pre-stored plug-in registry; creating an audio and video processing thread according to the target solution composite plug-in; and performing de-compositing processing on the audio and video data based on the audio and video processing thread to obtain the audio and the video, which are then played. The invention solves the problem of poor compatibility of audio and video screen projection in the prior art, and achieves the technical effect of supporting screen projection from different terminal devices to the screen projection device.

Description

Audio and video screen projection method, device, equipment, system and storage medium
Technical Field
The invention relates to the technical field of screen projection, in particular to an audio and video screen projection method, device, equipment, system and storage medium.
Background
With the continuous development of network technology, people's requirements on intelligent products keep rising. The television is a core product of the smart home, the mobile phone is an everyday necessity, and interconnecting the two enables screen projection. Televisions on the market currently carry a number of screen projection technologies, but these technologies have compatibility problems: when different mobile phones project to the same television, abnormalities such as a picture with no sound, or sound with no picture, easily occur. The reason is that the television cannot handle the audio and video formats projected by the various mobile phones, so its compatibility is poor.
Disclosure of Invention
The main purpose of the invention is to provide an audio and video screen projection method, device, equipment, system and storage medium, so as to solve the technical problem of poor compatibility of audio and video screen projection in the prior art.
In order to achieve the purpose, the invention adopts the following technical scheme:
in a first aspect, the present invention provides an audio/video screen projection method, which is applied to a screen projection device, and includes:
acquiring audio and video data sent by terminal equipment;
performing type analysis on the audio and video data based on a pre-stored type database to obtain a corresponding coding format;
obtaining a target solution composite plug-in according to the coding format and a pre-stored plug-in registry;
creating an audio and video processing thread according to the target solution composite plug-in;
performing de-compounding processing on the audio and video data based on the audio and video processing thread to obtain audio and video;
and playing the audio and the video.
Optionally, in the audio and video screen projection method, before the step of acquiring the audio and video data sent by the terminal device, the method further includes:
acquiring a connection establishment request sent by the terminal equipment;
responding to the connection establishing request, and distributing a network protocol address for the terminal equipment;
establishing a real-time stream transmission channel between the terminal equipment and the screen projection equipment according to the network protocol address;
the acquiring of the audio and video data sent by the terminal device comprises:
and acquiring audio and video data sent by the terminal equipment through the real-time streaming transmission channel.
Optionally, in the above method for screen projection of an audio/video, the analyzing the type of the audio/video data based on a pre-stored type database to obtain a corresponding encoding format includes:
searching a type corresponding to the audio and video data in a pre-stored type database to obtain a data format of the audio and video data;
and according to the data format of the audio and video data, carrying out composite stream analysis on the audio and video data to obtain a corresponding coding format.
Optionally, in the above audio/video screen projection method, the searching for the type corresponding to the audio/video data in a pre-stored type database to obtain the data format of the audio/video data includes:
acquiring a message field of the audio and video data;
matching the message field with a data type in a pre-stored type database;
if the matching is successful, obtaining the type corresponding to the message field;
determining the type corresponding to the message field as the data format of the audio and video data;
and if the matching fails, performing failure alarm prompt.
Optionally, in the above audio/video screen projection method, the performing composite stream analysis on the audio/video data according to the data format of the audio/video data to obtain a corresponding coding format includes:
acquiring field types corresponding to the data formats of the audio and video data according to preset configuration rules, wherein the preset configuration rules comprise corresponding relations between the data formats and the field types;
and determining the coding format of the audio and video data according to the field type.
Optionally, in the above audio/video screen projection method, before the step of obtaining the target solution composite plug-in according to the encoding format and the pre-stored plug-in registry, the method further includes:
acquiring a pre-stored plug-in and a data format supported by the pre-stored plug-in;
and establishing a mapping relation between the data format and the pre-stored plug-in unit to obtain the pre-stored plug-in unit registry.
Optionally, in the above audio/video screen projection method, the obtaining a target solution composite plug-in according to the coding format and a pre-stored plug-in registry includes:
and searching a solution composite plug-in corresponding to the coding format in a pre-stored plug-in registry to obtain a target solution composite plug-in, wherein the target solution composite plug-in is a plug-in capable of performing solution composite processing on the composite stream data in the coding format.
Optionally, in the above method for audio and video screen projection, the performing de-compounding processing on the audio and video data based on the audio and video processing thread to obtain audio and video includes:
according to the target de-compounding plug-in, de-compounding the audio and video data to obtain audio stream data and video stream data;
respectively carrying out type analysis on the audio stream data and the video stream data to determine an audio decoder and a video decoder;
decoding the audio stream data through the audio decoder to obtain audio;
and decoding the video stream data through the video decoder to obtain a video.
Optionally, in the above method for audio/video screen projection, the analyzing the types of the audio stream data and the video stream data, and determining the audio decoder and the video decoder includes:
respectively carrying out type analysis on the audio stream data and the video stream data to obtain the data type of the audio stream data and the data type of the video stream data;
determining an audio coding format corresponding to the audio stream data according to the data type of the audio stream data;
determining a video coding format corresponding to the video stream data according to the data type of the video stream data;
determining a corresponding audio decoder according to the audio coding format;
and determining a corresponding video decoder according to the video coding format.
Optionally, in the above method for screen projection of audio and video, before the step of playing the audio and the video, the method further includes:
constructing a multimedia architecture, wherein the multimedia architecture comprises an audio player and a video player;
the playing the audio and the video comprises:
and synchronously playing the audio and the video through the multimedia architecture.
In a second aspect, the present invention provides an audio/video screen projection apparatus, including:
the data acquisition module is used for acquiring audio and video data sent by the terminal equipment;
the type analysis module is used for carrying out type analysis on the audio and video data based on a pre-stored type database to obtain a corresponding coding format;
the plug-in searching module is used for obtaining a target solution composite plug-in according to the coding format and a pre-stored plug-in registry;
the thread creating module is used for creating an audio and video processing thread according to the target solution composite plug-in;
the data processing module is used for performing de-compounding processing on the audio and video data based on the audio and video processing thread to obtain audio and video;
and the screen projection playing module is used for playing the audio and the video.
In a third aspect, the present invention provides a screen projection apparatus, comprising:
the memory stores an audio and video screen projection program;
and the processor executes the audio and video screen projection program to realize the audio and video screen projection method.
In a fourth aspect, the present invention provides a screen projection system, comprising:
a terminal device;
a screen projection device as described above;
and the terminal equipment is in communication connection with the screen projection equipment.
In a fifth aspect, the present invention provides a computer-readable storage medium having a computer program stored thereon, the computer program being executable by one or more processors to implement the audio/video screen projection method as described above.
One or more technical solutions provided by the present invention may have the following advantages or at least achieve the following technical effects:
according to the audio and video screen projection method, device, equipment, system and storage medium, the type of audio and video data sent by the terminal equipment is analyzed to obtain a corresponding coding format, a de-compounding plug-in is searched in a pre-stored plug-in registry according to the coding format, after a target de-compounding plug-in is obtained, an audio and video processing thread is established according to the target de-compounding plug-in to perform de-compounding processing on the audio and video data sent by the terminal equipment to obtain audio and video, and finally the audio and video are played to achieve the purpose of projecting the audio and video of the terminal equipment to the screen projection equipment; the method and the device can correspondingly create audio and video processing threads aiming at different types of audio and video data, can perform de-compounding, decoding, playing and other processing on the various types of audio and video data, have higher compatibility, can meet screen projection of different terminal devices, and improve the applicability and market competitiveness of the screen projection equipment.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from the provided drawings without creative effort.
Fig. 1 is a schematic flow chart of a first embodiment of an audio/video screen projection method according to the present invention;
FIG. 2 is a schematic diagram of a hardware structure of a screen projection device according to the present invention;
fig. 3 is a schematic flowchart of a second embodiment of an audio/video screen projection method according to the present invention;
FIG. 4 is a detailed flowchart of step S300 in FIG. 3;
fig. 5 is a functional module schematic diagram of the audio/video screen projection device according to the first embodiment of the present invention.
The implementation, functional features and advantages of the present invention will be further described with reference to the accompanying drawings.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that, in the present invention, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising … …" does not exclude the presence of another like element in a process, method, article, or system that comprises the element. In addition, in the present invention, suffixes such as "module", "part", or "unit" used to represent elements are used only for facilitating the description of the present invention, and have no specific meaning in themselves. Thus, "module", "component" or "unit" may be used mixedly.
The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific situation. In addition, the technical solutions of the respective embodiments may be combined with each other, but only insofar as such combinations can be realized by those skilled in the art; when technical solutions are contradictory or cannot be realized, the combination should be considered not to exist and to fall outside the protection scope of the present invention.
In view of the technical problem of poor compatibility of audio and video screen projection in the prior art, the invention provides an audio and video screen projection method, which has the following general idea:
acquiring audio and video data sent by terminal equipment; performing type analysis on the audio and video data based on a pre-stored type database to obtain a corresponding coding format; acquiring a target solution composite plug-in according to the coding format and a pre-stored plug-in registry; creating an audio and video processing thread according to the target solution composite plug-in; performing de-compounding processing on the audio and video data based on the audio and video processing thread to obtain audio and video; and playing the audio and the video.
According to the technical scheme, the type of the audio and video data sent by the terminal equipment is analyzed to obtain a corresponding coding format, the de-compositing plug-in is searched in a pre-stored plug-in registry according to the coding format, after the target de-compositing plug-in is obtained, an audio and video processing thread is created according to the target de-compositing plug-in to perform de-compositing processing on the audio and video data sent by the terminal equipment to obtain audio and video, and finally the audio and video are played to achieve the purpose of projecting the audio and video of the terminal equipment to the screen projecting equipment; the method and the device can correspondingly create audio and video processing threads aiming at different types of audio and video data, can perform de-compounding, decoding, playing and other processing on the various types of audio and video data, have higher compatibility, can meet screen projection of different terminal devices, and improve the applicability and market competitiveness of the screen projection equipment.
Example one
Referring to the flow diagram of fig. 1, a first embodiment of the audio/video screen projection method of the present invention is provided, and the audio/video screen projection method is applied to a screen projection device. The screen projection equipment is terminal equipment capable of realizing network connection, and can be electronic equipment such as a television, a conference panel, an intelligent display screen and the like. The screen projection device can be in communication connection with the terminal device through a network, for example, the terminal device and the screen projection device are in communication connection with the same wireless network, wherein the terminal device can be an electronic device such as a mobile phone, a tablet computer, a notebook computer and a desktop host.
Fig. 2 is a schematic diagram of a hardware structure of the projection device. The apparatus may include: a processor 1001, such as a CPU (Central Processing Unit), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005.
Those skilled in the art will appreciate that the hardware configuration shown in fig. 2 is not intended to be limiting of the screen projecting device of the present invention and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
Specifically, the communication bus 1002 is used for realizing connection communication among these components;
the user interface 1003 is used for connecting a client and performing data communication with the client, and the user interface 1003 may include an output unit, such as a display screen, and an input unit, such as a key;
the network interface 1004 is used for connecting to the backend server and performing data communication with the backend server, and the network interface 1004 may include an input/output interface, such as a standard wired interface, a wireless interface, such as a Wi-Fi interface;
the memory 1005 is used for storing various types of data, which may include, for example, instructions of any application program or method in the screen projection apparatus and application program-related data, and the memory 1005 may be a high-speed RAM memory, or a stable memory such as a disk memory, and optionally, the memory 1005 may be a storage device independent of the processor 1001;
specifically, with continued reference to fig. 2, the memory 1005 may include an operating system, a network communication module, a user interface module, and an audio/video screen projection program, where the network communication module is mainly used to connect to a terminal device and perform data communication with the terminal device;
the processor 1001 is configured to call the audio/video screen projection program stored in the memory 1005, and perform the following operations:
acquiring audio and video data sent by terminal equipment;
performing type analysis on the audio and video data based on a pre-stored type database to obtain a corresponding coding format;
acquiring a target solution composite plug-in according to the coding format and a pre-stored plug-in registry;
creating an audio and video processing thread according to the target solution composite plug-in;
performing de-compounding processing on the audio and video data based on the audio and video processing thread to obtain audio and video;
and playing the audio and the video.
Based on the screen projection device, the audio/video screen projection method of the present embodiment is described in detail below with reference to the flow diagram shown in fig. 1. The method may comprise the steps of:
step S110: and acquiring audio and video data sent by the terminal equipment.
Specifically, the terminal device may be an electronic device such as a mobile phone, a tablet computer, a notebook computer, a desktop host, and the like, the screen projection device may be an electronic device such as a television, an intelligent display screen, a conference tablet, and the like, and the terminal device and the screen projection device are connected to the same network to realize communication connection. The audio-video data refers to composite stream data containing audio and video. The screen projection equipment can acquire audio and video data sent by the terminal equipment through the real-time stream channel.
Step S120: and performing type analysis on the audio and video data based on a pre-stored type database to obtain a corresponding coding format.
Specifically, the type refers to the data format of the composite stream data of the audio and video data, and the encoding format refers to the encoder format that the terminal device uses when encoding the audio and video data into data packets and sending them to the screen projection device. Type analysis (Typefind) is performed on the acquired audio and video data, i.e. the data format of the audio and video data is looked up in the pre-stored type database and the coding format is then matched according to that data format, thereby obtaining the coding format corresponding to the audio and video data.
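As a concrete illustration of this type-analysis step, the sketch below uses GStreamer's typefind element (GStreamer is the multimedia framework the description later names) to detect the container format of an incoming byte stream. The element and signal names are standard GStreamer; the appsrc wiring and the way the result is consumed are assumptions, not the patented implementation.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# appsrc stands in for the bytes received over the real-time streaming channel;
# typefind emits "have-type" once it recognises the container format.
probe = Gst.parse_launch("appsrc name=src ! typefind name=tf ! fakesink")
typefind = probe.get_by_name("tf")

def on_have_type(tf, probability, caps):
    # e.g. "video/mpegts" or "video/quicktime" for an MP4-like composite stream
    print("detected data format:", caps.to_string(), "probability:", probability)

typefind.connect("have-type", on_have_type)
probe.set_state(Gst.State.PLAYING)
```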
Step S130: and obtaining the target solution composite plug-in according to the coding format and a pre-stored plug-in registry.
Specifically, the pre-stored plug-in registry is a table in which plug-ins and data formats that can be supported by the corresponding plug-ins are stored, wherein the plug-ins and the data formats have a mapping relationship. When screen projection equipment is initialized, a large number of plug-ins are associated with corresponding supportable data formats and then registered in a table to obtain a pre-stored plug-in registry. The plug-ins may include, among other things, de-multiplexing plug-ins, decoding plug-ins, player plug-ins, and the like. De-compositing refers to the decomposition of composite stream data into an audio stream and a video stream. And searching in a pre-stored plug-in registry according to the obtained coding format of the audio and video data to obtain a plug-in capable of de-compounding the audio and video data, and calling the plug-in to obtain a target de-compounding plug-in.
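A minimal sketch of such a registry lookup, using GStreamer's own element registry in place of the patent's pre-stored plug-in registry; the caps string is an illustrative assumption.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Find a de-compositing (demuxer) plug-in whose sink pads accept the detected
# container caps, mirroring the lookup of the target solution composite plug-in.
def find_demuxer(caps_str):
    caps = Gst.Caps.from_string(caps_str)
    factories = Gst.ElementFactory.list_get_elements(
        Gst.ELEMENT_FACTORY_TYPE_DEMUXER, Gst.Rank.MARGINAL)
    usable = Gst.ElementFactory.list_filter(
        factories, caps, Gst.PadDirection.SINK, False)
    return usable[0].create(None) if usable else None

demuxer = find_demuxer("video/quicktime")  # e.g. yields qtdemux for MP4 streams
```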
Step S140: and creating an audio and video processing thread according to the target solution composite plug-in.
Specifically, after the target de-compositing plug-in is determined in steps S120 and S130, a thread (Pipeline) is created based on the plug-in. A thread refers to a model having a series of thread segments, each of which performs an action. The thread in the invention is a specific thread based on the audio and video data after the audio and video data is acquired, and has customization. Creating an audio/video processing thread includes creating a series of thread segments, for example, defining thread segments corresponding to processes for de-compounding audio/video data by using the obtained target de-compounding plug-in, and defining thread segments corresponding to different processes for sequentially performing operations such as queue, decoder selection, decoding, parameter setting, player selection and the like on the audio stream and the video stream. Generally speaking, the purpose of creating the audio and video processing thread is to perform a series of operations on received audio and video data by using the audio and video processing thread, the operations have repeatability and continuity, and the real-time processing can be performed on the audio and video data acquired in real time, so that real-time screen projection of the audio and video on the terminal equipment is realized.
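As an illustration of what such a processing thread can look like in practice, the sketch below builds one GStreamer pipeline whose segments cover the source, de-compositing/decoder selection and the playback branches. decodebin is used here for brevity instead of wiring the chosen demuxer and decoders explicitly, so the exact element choice is an assumption rather than the patented design.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.Pipeline.new("av-projection")
elems = {name: Gst.ElementFactory.make(name, None) for name in
         ("appsrc", "decodebin", "audioconvert", "autoaudiosink",
          "videoconvert", "autovideosink")}
for e in elems.values():
    pipeline.add(e)
elems["appsrc"].link(elems["decodebin"])
elems["audioconvert"].link(elems["autoaudiosink"])   # audio playback segment
elems["videoconvert"].link(elems["autovideosink"])   # video playback segment

def on_pad_added(dbin, pad):
    # route each decoded branch (audio/x-raw or video/x-raw) to its player
    caps = pad.query_caps(None).to_string()
    target = elems["audioconvert"] if caps.startswith("audio/") else elems["videoconvert"]
    pad.link(target.get_static_pad("sink"))

elems["decodebin"].connect("pad-added", on_pad_added)
pipeline.set_state(Gst.State.PLAYING)
```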
Step S150: and performing de-compounding processing on the audio and video data based on the audio and video processing thread to obtain audio and video.
Specifically, after the audio/video processing thread corresponding to the audio/video data acquired in step S110 is created and obtained in step S140, the audio/video processing thread is directly utilized to process the audio/video data acquired in real time, including but not limited to performing operations such as de-compounding, queuing, attribute analysis, decoder selection, decoding, parameter setting, and the like, wherein an execution sequence or an executed object of the operations may be set based on actual requirements, for example, processing of a video stream may be more complex than processing of an audio stream, and the parameters are more set. And after the audio and video data are subjected to de-compounding processing by utilizing the audio and video processing thread, the split audio and video can be obtained.
Step S160: and playing the audio and the video.
Specifically, after the audio and video are obtained in step S150, the screen projection device may select a corresponding audio player and a corresponding video player to play the obtained audio and video, respectively. The audio player and the video player can be the audio player and the video player of the screen projection device, and can also be the audio player and the video player in a multimedia architecture independently created based on the audio and video data. The audio and video can be played synchronously when being played, so that the audio and video synchronization is ensured, the audio and video output is realized, and the audio and video of the terminal equipment can be successfully played on the screen projection equipment.
The audio and video screen projection method provided by this embodiment obtains a corresponding coding format by performing type analysis on audio and video data sent by a terminal device, searches for a de-compositing plug-in a pre-stored plug-in registry according to the coding format, and after a target de-compositing plug-in is obtained, creates an audio and video processing thread according to the target de-compositing plug-in to perform de-compositing processing on the audio and video data sent by the terminal device, obtain audio and video, and finally play the audio and video, thereby achieving the purpose of projecting the audio and video of the terminal device to a screen projection device; the method and the device can correspondingly create audio and video processing threads aiming at different types of audio and video data, can perform de-compounding, decoding, playing and other processing on the various types of audio and video data, have higher compatibility, can meet screen projection of different terminal devices, and improve the applicability and market competitiveness of the screen projection equipment.
Example two
Based on the same inventive concept, referring to fig. 3 to 4, a second embodiment of the audio/video screen projection method of the present invention is proposed, and the audio/video screen projection method is applied to screen projection equipment.
The following describes in detail the audio/video screen projection method according to this embodiment with reference to the flow diagram shown in fig. 3. The method may comprise the steps of:
step S210: and acquiring a connection establishment request sent by the terminal equipment.
Specifically, the terminal device and the screen projection device are in communication connection through a network. In this embodiment the terminal device is described taking a mobile phone as an example and the screen projection device taking a television as an example, that is, the mobile phone and the television are connected to the same WiFi network. When a user initiates a screen projection request on the terminal device, a request for establishing a connection with the screen projection device is generated first and sent to the screen projection device, and the screen projection device acquires this connection establishment request. In this embodiment, the mobile phone sends the connection establishment request to the television through the WiFi network; specifically, identity confirmation and information transfer are implemented through the underlying WPS (Wi-Fi Protected Setup) negotiation and the handshake protocol of the WiFi network. The underlying WPS negotiation and handshake protocol are both prior art and are not described here again.
Step S220: and responding to the connection establishing request, and allocating a network protocol address to the terminal equipment.
Specifically, after receiving the connection establishment request, the screen projection device responds by allocating an Internet Protocol address (IP address) to the terminal device. The IP address and corresponding configuration information are dynamically allocated through the Dynamic Host Configuration Protocol (DHCP) and sent to the terminal device for storage and later use, which means that subsequent connection establishment requests containing the IP address and configuration information can be sent directly to the screen projection device. In this embodiment, after receiving the connection establishment request sent by the mobile phone, the television responds by allocating an IP address and corresponding configuration information to the mobile phone and sending them to it; the mobile phone stores them after reception and can then send a connection establishment request containing the IP address directly to the television, so the television does not need to allocate an IP address again. This saves a step when establishing the connection and allows the real-time streaming channel to be established quickly in the next step.
Step S230: and establishing a real-time stream transmission channel between the terminal equipment and the screen projection equipment according to the network protocol address.
Specifically, the screen projection device allocates an IP address to the terminal device, and after receiving the IP address, the terminal device may send a request for establishing a transmission channel to the screen projection device again, where the request includes the IP address, and after receiving the request for establishing the transmission channel, the screen projection device establishes a Real-Time Streaming transmission channel between the terminal device and the screen projection device through a Real-Time Streaming Protocol (RTSP) according to the IP address in response to the request. In this embodiment, after receiving the IP address allocated to the television, the mobile phone sends a request including the IP address for establishing a transmission channel to the television through the WiFi network, and the television responds to the request to successfully establish a real-time streaming transmission channel between the mobile phone and the television.
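For illustration only, the sketch below shows a receiving side of such a real-time streaming channel pulling the composite stream with GStreamer's rtspsrc. The URL, the latency value and the assumption that the terminal exposes an RTSP endpoint are all hypothetical.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Receive the composite stream over the established RTSP channel; a real
# implementation would replace fakesink with the type-analysis and
# de-compositing stages described in the following steps.
receiver = Gst.parse_launch(
    "rtspsrc location=rtsp://192.168.49.10/stream latency=100 ! fakesink")
receiver.set_state(Gst.State.PLAYING)
```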
Step S240: and acquiring audio and video data sent by the terminal equipment.
Specifically, audio and video data sent by the terminal device through the real-time streaming transmission channel is obtained. After the real-time streaming transmission channel is established in step S230, the terminal device sends real-time audio and video data to the screen projection device through the real-time streaming transmission channel, and the screen projection device can acquire the audio and video data.
Step S250: and performing type analysis on the audio and video data based on a pre-stored type database to obtain a corresponding coding format.
Further, the step S250 may include:
step S251: and searching a type corresponding to the audio and video data in a pre-stored type database to obtain the data format of the audio and video data.
Specifically, after receiving the audio and video data, the screen projection device searches the type of the audio and video data in a pre-stored type database, and matches the type to obtain the type corresponding to the audio and video data, wherein the type is used as the data format of the audio and video data.
Specifically, step S251 may include:
step S251.1: and acquiring the message field of the audio and video data.
Audio and video data is generally transmitted in the form of messages. After the audio and video data message is acquired in step S240, a field of that message, namely the message field of the audio and video data, can be obtained in this step.
Step S251.2: and matching the message field with the data type in a pre-stored type database.
The pre-stored type database is a table containing the correspondence between different fields and different data types, where each field represents a specific data type, so the data type corresponding to the message field can be found by a matching lookup. Although the pre-stored type database can be preset with many correspondences according to actual conditions, it is inevitable that some new data type is missing from the table or that a field is erroneous, so matching may either succeed or fail.
Step S251.3: if the matching is successful, obtaining the type corresponding to the message field;
step S251.4: and determining the type corresponding to the message field as the data format of the audio and video data.
When the data type corresponding to the message field is found in the pre-stored type database, it indicates that the data type of the composite stream data is considered at the beginning of the design of the screen projection device, and correspondingly, a corresponding de-composite plug-in or other plug-ins are also generally stored for the data type. After the type corresponding to the message field is obtained, the type can be determined as the data format of the audio and video data, namely the data format of the composite stream data.
Step S251.5: and if the matching fails, performing failure alarm prompt.
If the message field is incomplete because of transmission errors or other reasons, or the audio/video data is in a brand-new composite stream data format that was not considered in the design of the screen projection device, or there are other defects or faults, the matching in step S251.2 may fail. In that case the screen projection device performs a failure alarm prompt to remind the user to take timely measures, such as re-establishing the connection between the terminal device and the screen projection device or re-establishing the real-time streaming channel.
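A minimal sketch of this lookup, assuming a simple in-memory table as the pre-stored type database; the field values and format names are purely illustrative.

```python
# Hypothetical pre-stored type database: message field -> data format.
TYPE_DATABASE = {"ftyp": "mp4", "moov": "mp4", "EBML": "matroska", "\x47": "mpegts"}

def match_data_format(message_field):
    data_format = TYPE_DATABASE.get(message_field)
    if data_format is None:
        raise RuntimeError("type matching failed: raise failure alarm")  # step S251.5
    return data_format  # steps S251.3 / S251.4
```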
Step S252: and according to the data format of the audio and video data, carrying out composite stream analysis on the audio and video data to obtain a corresponding coding format.
Specifically, after the screen projection device obtains the data format of the audio/video data, since the audio/video data is composite stream data, composite stream analysis can further be performed on it to find out the coding format adopted by the composite stream, thereby obtaining the coding format corresponding to the audio/video data. Composite streams in the same data format may use different coding formats, so the coding format needs to be determined specifically. Existing televisions often cannot handle all of these coding formats, which is why the audio or the video sometimes cannot be played.
Specifically, step 252 may include:
step S252.1: and acquiring a field type corresponding to the data format of the audio and video data according to a preset configuration rule, wherein the preset configuration rule comprises a corresponding relation between each data format and each field type.
Specifically, the preset configuration rule is a table of data format, field type and encoding format. One data format may carry several encoding formats; for example, the mp4 format may carry encoding formats such as H.264, MPEG-4 and VP6. Different encoding formats can be distinguished by field type, that is, one encoding format corresponds to one field type, and the field types associated with a data format in the preset configuration rule are therefore what characterise the audio and video data.
Step S252.2: and determining the coding format of the audio and video data according to the field type.
Specifically, after the field type corresponding to the audio/video data is obtained, a corresponding encoding format can be determined, where the encoding format is the encoding format adopted by the audio/video data.
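A sketch of such a preset configuration rule as a nested table (data format to field type to coding format); the concrete field types and format names are illustrative assumptions, not the patent's actual table.

```python
# Hypothetical preset configuration rule mapping field types to coding formats.
CONFIG_RULE = {
    "mp4":    {"avc1": "H.264", "mp4v": "MPEG-4", "vp6f": "VP6"},
    "mpegts": {"0x1b": "H.264", "0x0f": "AAC"},
}

def coding_format(data_format, field_type):
    return CONFIG_RULE.get(data_format, {}).get(field_type)
```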
Step S260: and acquiring the pre-stored plug-in and the data format supported by the pre-stored plug-in.
Specifically, at the beginning of design, when the method is initialized, some common plug-ins can be directly stored into the screen projection device, and the screen projection device acquires the pre-stored plug-ins. Each plug-in can only process the preset data format correspondingly, so that the data formats supported by the plug-ins are different, and the data formats supported by the plug-ins are synchronously acquired when the pre-stored plug-ins are acquired.
Step S270: and establishing a mapping relation between the data format and the pre-stored plug-in unit to obtain the pre-stored plug-in unit registry.
Specifically, since the data formats that the plug-ins with different functions can support are various, and there may be a case where several plug-ins support the same data format, the corresponding pre-stored plug-ins are set based on the data format, and the mapping relationship between the data format and the pre-stored plug-ins is established. The pre-stored plug-ins can be classified according to different functions, for example, the pre-stored plug-ins corresponding to the data format a include a de-multiplexing plug-in a1 and a decoding plug-in a2, the pre-stored plug-ins corresponding to the data format B include a decoding plug-in B1 and an encrypting plug-in B2, and the pre-stored plug-ins are specifically set according to actual conditions. After the mapping relation between the data format and the pre-stored plug-in is set, the data format, the pre-stored plug-in and the mapping relation are registered in a table, namely a pre-stored plug-in registry is obtained for use when the plug-in is searched or called in the following.
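A small sketch of how such a pre-stored plug-in registry could be assembled at initialisation, keyed by data format and plug-in role; the plug-in names are GStreamer element names used here only as examples.

```python
# data format -> {role: plug-in name}; built once when the device initialises.
PLUGIN_REGISTRY = {}

def register_plugin(name, role, supported_formats):
    for fmt in supported_formats:
        PLUGIN_REGISTRY.setdefault(fmt, {})[role] = name

register_plugin("qtdemux", "de-compositing", ["mp4"])
register_plugin("tsdemux", "de-compositing", ["mpegts"])
register_plugin("avdec_h264", "decoding", ["H.264"])
register_plugin("avdec_aac", "decoding", ["AAC"])
```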
Step S280: and obtaining the target solution composite plug-in according to the coding format and a pre-stored plug-in registry.
Specifically, the step S280 may include:
step S281: and searching a solution composite plug-in corresponding to the coding format in a pre-stored plug-in registry to obtain a target solution composite plug-in, wherein the target solution composite plug-in is a plug-in capable of performing solution composite processing on the composite stream data in the coding format.
Specifically, after the coding format corresponding to the audio/video data is obtained in step S250, the pre-stored plug-in registry is searched for a usable de-compositing plug-in corresponding to that coding format. That is, the coding format is matched against the data formats in the pre-stored plug-in registry; when the matching succeeds, the several plug-ins corresponding to that data format are obtained directly, and the plug-in with the de-compositing function among them is taken as the target solution composite plug-in.
Step S290: and creating an audio and video processing thread according to the target solution composite plug-in.
Specifically, after the target de-compounding plug-in is obtained, an audio/video processing thread dedicated to the audio/video data is created, wherein the audio/video processing thread includes a plurality of thread segments. For example, a thread segment for decoding an audio stream obtained by de-compounding the audio and video data may be set, specifically, analysis of a coding format of the audio stream obtained by de-compounding may be set to obtain the coding format of the audio stream, a pre-stored plug-in registry is used to determine an audio decoder of the audio stream according to the coding format, and finally, the audio stream is transmitted to the audio decoder to be decoded to obtain audio. For another example, setting a thread segment for transmitting the obtained audio stream to a determined audio decoder for decoding may specifically set that the audio stream is firstly subjected to queue sorting, then the audio stream is sequentially input into the audio decoder according to the queue to output audio, and finally parameter setting may be performed on the output audio to obtain higher-quality audio, and the like. It should be noted that, in the process of actually creating an audio/video processing thread, except that the first step is to set the target de-composition plug-in to de-compound the audio/video data acquired in step S240 to obtain the thread segments of the audio stream and the video stream, the thread segments of the audio stream and the video stream that are to be specifically processed respectively later may be set according to actual conditions, but the audio stream and the video stream need to be kept synchronous to prevent the problem of asynchronous sound and picture in the subsequent playing.
In the specific implementation process, the related parameters can be separated when the audio/video processing thread is created, with the parameters negotiated independently over RTSP between the terminal device and the screen projection device. For example, the video projected by the mobile phone may be 360P this time and 720P next time, but whatever the parameters of the audio/video data, the data format and the encoding format of a given terminal device generally do not change. The buffer area where the created audio/video processing thread is located can therefore be maintained, and an additional independent buffer area created for setting the parameters. Thus even if the terminal device reconnects, as long as only the parameters have changed and the formats and IP address have not, the previously created audio/video processing thread can be reused, saving buffer space. The created audio/video processing thread is located in a specific cache area of the screen projection device and can be called again when the same terminal device projects its screen after a later connection, which shortens the screen projection process, reduces delay, prevents stuttering and makes the picture smoother.
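A hedged sketch of this reuse idea: cache the created processing pipeline per terminal, keyed by the values that do not change between connections. The cache key and the build_pipeline helper are hypothetical.

```python
_pipeline_cache = {}

def pipeline_for(terminal_ip, data_format, coding_format):
    # reuse the previously created processing thread if only parameters changed
    key = (terminal_ip, data_format, coding_format)
    if key not in _pipeline_cache:
        _pipeline_cache[key] = build_pipeline(data_format, coding_format)  # hypothetical builder
    return _pipeline_cache[key]
```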
Step S300: and performing de-compounding processing on the audio and video data based on the audio and video processing thread to obtain audio and video.
Further, as shown in the flowchart of fig. 4, the step S300 may include:
step S301: and de-compounding the audio and video data according to the target de-compounding plug-in to obtain audio stream data and video stream data.
Specifically, the audio and video data are processed by using the audio and video processing thread, the audio and video data are de-multiplexed by using the target de-multiplexing plug-in to obtain audio stream data and video stream data, and then the audio and video stream data and the video stream data are respectively processed according to the processing sequence or the set process defined in the audio and video processing thread.
Step S302: and respectively carrying out type analysis on the audio stream data and the video stream data to determine an audio decoder and a video decoder.
Specifically, the step S302 may include:
step S302.1: respectively carrying out type analysis on the audio stream data and the video stream data to obtain the data type of the audio stream data and the data type of the video stream data;
step S302.2: determining an audio coding format corresponding to the audio stream data according to the data type of the audio stream data;
step S302.3: and determining a video coding format corresponding to the video stream data according to the data type of the video stream data.
Specifically, in the actual implementation process, a given composite stream data format may, after de-compositing, yield audio stream data in different formats and video stream data in different formats. The data type of the audio stream data is determined first, and then the coding format corresponding to that data type is obtained, giving the audio coding format corresponding to the audio stream data; for the video stream, the data type of the video stream data is likewise determined and the coding format corresponding to that data type is obtained, giving the video coding format corresponding to the video stream data. Since audio stream data or video stream data in the same data format may use different encoding formats, for example the audio stream data may be encoded with PCM (Pulse Code Modulation) or AC3 (Audio Coding 3), the audio encoding format of the audio stream data and the video encoding format of the video stream data both need to be determined.
Step S302.4: determining a corresponding audio decoder according to the audio coding format;
step S302.5: and determining a corresponding video decoder according to the video coding format.
Specifically, both the audio decoder and the video decoder can be stored in the pre-stored plug-in registry in plug-in form. The audio decoder corresponding to the audio stream data is determined according to the audio coding format: in the pre-stored plug-in registry, the audio coding format is matched with the data formats in the table and a plug-in with a decoding function is found, which is the audio decoder. The video decoder corresponding to the video stream data is determined according to the video coding format and the pre-stored plug-in registry in the same way: the video coding format is matched with the data formats in the table and a plug-in with a decoding function is found, which is the video decoder. For example, a corresponding PCM decoder or AC3 decoder may be selected according to the PCM or AC3 encoding format, and likewise a corresponding video decoder may be selected according to the specific video encoding format.
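Mirroring the registry-based demuxer lookup above, the sketch below picks audio and video decoder factories by matching the encoded-stream caps against GStreamer's decoder registry; the caps strings are illustrative.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

def find_decoder(caps_str):
    caps = Gst.Caps.from_string(caps_str)
    factories = Gst.ElementFactory.list_get_elements(
        Gst.ELEMENT_FACTORY_TYPE_DECODER, Gst.Rank.MARGINAL)
    usable = Gst.ElementFactory.list_filter(
        factories, caps, Gst.PadDirection.SINK, False)
    return usable[0].create(None) if usable else None

audio_decoder = find_decoder("audio/x-ac3")    # e.g. an AC3 audio stream
video_decoder = find_decoder("video/x-h264")   # e.g. an H.264 video stream
```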
Step S303: decoding the audio stream data through the audio decoder to obtain audio;
step S304: and decoding the video stream data through the video decoder to obtain a video.
Specifically, the screen projection device decodes the audio stream data with the audio decoder and the video stream data with the video decoder, thereby obtaining the audio and the video to be played; the durations of the audio and the video are the same and match the duration of the acquired audio and video data.
Step S310: and playing the audio and the video.
Specifically, the screen projection device can utilize a local audio player and a local video player or an audio playing plug-in and a video playing plug-in to synchronously play audio and video, so as to ensure the synchronization of sound and pictures. The audio playing plug-in and the video playing plug-in can be searched and called in a pre-stored plug-in registry correspondingly based on the data format of the audio or the video, so that the situation that the format of the audio or the video is not supported by the local audio player and the local video player is prevented from occurring. Therefore, when an audio and video processing thread is created, the pre-stored plug-in registry can be set to be used for multiple times, and plug-ins required to be used in different processes can be called, so that the method is suitable for calling corresponding plug-ins in different stages, different types or different data formats of composite stream data, audio stream data, video stream data, audio or video and the like, and the compatibility is improved.
In another embodiment, before step S310, the method may further include:
step S320: a multimedia architecture is constructed that includes an audio player and a video player.
Further, the step S310 may include:
and synchronously playing the audio and the video through the multimedia architecture.
In particular, the multimedia architecture (GStreamer, an open-source multimedia framework for building streaming applications) can simplify the development of audio/video applications. By constructing this multimedia framework, the screen projection device can reduce the number of thread segments in the audio/video processing thread and use the framework to play the audio or video locally. Specifically: the audio and video data are de-composited according to the target de-compositing plug-in to obtain audio stream data and video stream data; type analysis is performed on the audio stream data and the video stream data respectively to obtain the corresponding audio coding format and video coding format; a corresponding audio decoder is determined according to the audio coding format and a corresponding video decoder according to the video coding format; the audio stream data is decoded with the audio decoder to obtain the audio and the video stream data is decoded with the video decoder to obtain the video; and finally the multimedia framework is controlled to play the audio and the video simultaneously.
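For comparison, a minimal sketch of letting GStreamer's high-level playbin element perform demuxing, decoder selection and synchronised audio/video playback in one step; the RTSP URI is an assumption.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

player = Gst.ElementFactory.make("playbin", "projection-player")
player.set_property("uri", "rtsp://192.168.49.10/stream")
player.set_state(Gst.State.PLAYING)
# playbin shares one pipeline clock between its audio and video sinks,
# which keeps the projected sound and picture in sync.
```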
It should be noted that for further details of the implementations based on steps S240, S250, and S280 to S310, reference may be made to the description of the implementations based on steps S110 to S160 in the first embodiment; for brevity, they are not repeated here.
In the audio and video screen projection method provided by this embodiment, the audio and video data are transmitted over a securely constructed real-time stream data channel, giving high security and good real-time performance. An audio and video processing thread is created automatically for the audio and video data, which allows a high degree of customization, makes the method more practical, and makes it convenient to process the audio and video formats of the terminal device with the most suitable plug-ins, so that more formats are supported and screen projection compatibility is improved. By automatically searching the pre-stored plug-in registry for suitable plug-ins and automatically creating the audio and video processing thread, the compatibility of the screen projection device is improved, and screen projection delay and stutter are noticeably reduced.
EXAMPLE III
Based on the same inventive concept, referring to fig. 5, a first embodiment of the audio/video screen projection device of the present invention is provided, and the audio/video screen projection device may be a virtual device and is applied to a screen projection device.
The following describes in detail the audio/video screen projection apparatus provided in this embodiment with reference to a schematic functional module diagram shown in fig. 5, where the apparatus may include:
the data acquisition module is used for acquiring audio and video data sent by the terminal equipment;
the type analysis module is used for carrying out type analysis on the audio and video data based on a pre-stored type database to obtain a corresponding coding format;
the plug-in searching module is used for obtaining a target solution composite plug-in according to the coding format and a pre-stored plug-in registry;
the thread creating module is used for creating an audio and video processing thread according to the target solution composite plug-in;
the data processing module is used for performing de-compounding processing on the audio and video data based on the audio and video processing thread to obtain audio and video;
and the screen projection playing module is used for playing the audio and the video.
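Purely as an illustration of how these modules can be wired together, the following sketch composes them into one apparatus object; the class name, field names, and the decision to fold the data processing module into the created thread are assumptions, not the apparatus itself.

```python
# Illustrative wiring of the functional modules into one apparatus object.
# The callables passed in are placeholders; real modules would wrap the logic
# described in the method embodiments (data processing runs inside the created thread).
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class AvCastApparatus:
    acquire: Callable[[], bytes]                    # data acquisition module
    analyze: Callable[[bytes], str]                 # type analysis module
    lookup: Callable[[str], str]                    # plug-in searching module
    create_thread: Callable[[str], Callable[[bytes], Tuple[bytes, bytes]]]  # thread creating module
    play: Callable[[bytes, bytes], None]            # screen projection playing module

    def cast(self) -> None:
        av_data = self.acquire()
        coding_format = self.analyze(av_data)
        demux_plugin = self.lookup(coding_format)
        process = self.create_thread(demux_plugin)  # data processing module's work
        audio, video = process(av_data)
        self.play(audio, video)
```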
Further, the apparatus may further include:
a request acquisition module, configured to acquire a connection establishment request sent by the terminal device;
the address allocation module is used for responding to the connection establishment request and allocating a network protocol address to the terminal equipment;
the channel establishing module is used for establishing a real-time stream transmission channel between the terminal equipment and the screen projection equipment according to the network protocol address;
the data acquisition module is specifically configured to:
and acquiring audio and video data sent by the terminal equipment through the real-time streaming transmission channel.
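A minimal sketch of the connection set-up on the screen projection side is given below; the control port, the address pool, and the reply format are assumptions for illustration, and a real implementation would follow whatever discovery and handshake protocol the devices actually use.

```python
# Illustrative connection set-up: accept a connection establishment request,
# allocate a network protocol address, and reply with the real-time streaming
# channel address. Port numbers, pool, and message format are assumptions.
import socket

ADDRESS_POOL = iter(["192.168.49.10", "192.168.49.11"])  # assumed address pool
CONTROL_PORT = 7236                                      # assumed control port

def serve_one_connection() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("0.0.0.0", CONTROL_PORT))
        srv.listen(1)
        conn, peer = srv.accept()                 # connection establishment request
        with conn:
            conn.recv(1024)                       # read the request payload
            assigned = next(ADDRESS_POOL)         # allocate a network protocol address
            # Reply with the address and the URL of the real-time streaming channel.
            conn.sendall(f"ADDR {assigned}\nSTREAM rtsp://{assigned}:554/cast\n".encode())
```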
Further, the type analysis module may include:
the data format acquisition unit is used for searching a type corresponding to the audio and video data in a pre-stored type database to obtain the data format of the audio and video data;
and the coding format acquisition unit is used for carrying out composite stream analysis on the audio and video data according to the data format of the audio and video data to obtain a corresponding coding format.
Still further, the data format acquiring unit may include:
a field acquiring subunit, configured to acquire a message field of the audio/video data;
the matching subunit is used for matching the message field with the data type in a pre-stored type database;
a matching result subunit, configured to, if matching is successful, obtain a type corresponding to the message field;
and the data format determining subunit is used for determining the type corresponding to the message field as the data format of the audio and video data.
Still further, the encoding format obtaining unit may include:
a field type obtaining subunit, configured to obtain, according to a preset configuration rule, a field type corresponding to a data format of the audio/video data, where the preset configuration rule includes a correspondence between each data format and each field type;
and the coding format determining subunit is used for determining the coding format of the audio and video data according to the field type.
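The two look-ups above can be sketched as follows; the "ftyp" box signature of MP4 and the 0x47 sync byte of MPEG-TS are real container conventions, while the table contents, field names, and matching logic are assumptions for demonstration.

```python
# Illustrative sketch: match the message field against a pre-stored type database
# to get the data format, then use a preset configuration rule to find which
# field carries the coding format. Table contents and field names are assumed.
TYPE_DATABASE = {
    b"ftyp": "mp4",          # ISO BMFF / MP4 box signature in the header field
    b"\x47": "mpegts",       # MPEG transport-stream sync byte
}

CONFIG_RULES = {             # data format -> header field that carries the codec id
    "mp4": "stsd_entry",
    "mpegts": "pmt_stream_type",
}

def detect_data_format(message_field: bytes) -> str:
    """Match the message field against the pre-stored type database."""
    for signature, data_format in TYPE_DATABASE.items():
        if signature in message_field[:16]:
            return data_format
    raise ValueError("unrecognised composite-stream format")

def field_type_for(data_format: str) -> str:
    """Preset configuration rule: which field to parse for the coding format."""
    return CONFIG_RULES[data_format]
```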
Further, the apparatus may further include:
the plug-in pre-storage module is used for acquiring pre-stored plug-ins and data formats supported by the pre-stored plug-ins;
and the registry establishing module is used for establishing a mapping relation between the data format and the pre-stored plug-in to obtain the pre-stored plug-in registry.
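One possible way to build such a registry is sketched below: each pre-stored plug-in declares the data formats it supports, and the mapping is inverted so that a format can later be resolved to a plug-in. The plug-in names and formats are assumptions for illustration.

```python
# Illustrative construction of the pre-stored plug-in registry from pre-stored
# plug-ins and their supported data formats. Names are assumptions.
PRE_STORED_PLUGINS = {
    "mp4_demux_plugin":    {"mp4", "mov"},
    "ts_demux_plugin":     {"mpegts"},
    "aac_decoder_plugin":  {"aac"},
    "h264_decoder_plugin": {"h264"},
}

def build_plugin_registry(plugins: dict) -> dict:
    """Map every supported data format to the plug-in that handles it."""
    registry = {}
    for plugin_name, formats in plugins.items():
        for fmt in formats:
            registry[fmt] = plugin_name
    return registry

FORMAT_TO_PLUGIN = build_plugin_registry(PRE_STORED_PLUGINS)
# e.g. FORMAT_TO_PLUGIN["mp4"] == "mp4_demux_plugin"
```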
Further, the plug-in lookup module may include:
and the plug-in searching unit is used for searching the de-compounding plug-in corresponding to the coding format in a pre-stored plug-in registry to obtain a target de-compounding plug-in, wherein the target de-compounding plug-in is a plug-in capable of de-compounding the composite stream data in the coding format.
Further, the data processing module may include:
the first thread segment unit is used for de-compounding the audio and video data according to the target de-compounding plug-in to obtain audio stream data and video stream data;
a second thread segment unit, configured to perform type analysis on the audio stream data and the video stream data, respectively, and determine an audio decoder and a video decoder;
a third thread segment unit, configured to decode the audio stream data by the audio decoder to obtain an audio;
a fourth thread segment unit, configured to decode the video stream data by the video decoder to obtain a video.
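For illustration, the four thread segment units above can be chained as in the following sketch; every stage body is a stub and every name is an assumption, since only the ordering of the segments is being shown.

```python
# Illustrative chaining of the four thread segments: de-compositing, type
# analysis, audio decoding, and video decoding run as stages of one processing
# thread. All stage implementations are stubs; only the wiring is the point.
from typing import Tuple

def demux_segment(av_data: bytes) -> Tuple[bytes, bytes]:
    return b"audio-es", b"video-es"                      # stub de-compositing

def type_analysis_segment(audio_es: bytes, video_es: bytes) -> Tuple[str, str]:
    return "aac_decoder_plugin", "h264_decoder_plugin"   # stub decoder selection

def audio_decode_segment(decoder: str, audio_es: bytes) -> bytes:
    return b"pcm-audio"                                  # stub decode

def video_decode_segment(decoder: str, video_es: bytes) -> bytes:
    return b"raw-video"                                  # stub decode

def run_processing_thread(av_data: bytes) -> Tuple[bytes, bytes]:
    audio_es, video_es = demux_segment(av_data)               # first thread segment unit
    a_dec, v_dec = type_analysis_segment(audio_es, video_es)  # second thread segment unit
    audio = audio_decode_segment(a_dec, audio_es)             # third thread segment unit
    video = video_decode_segment(v_dec, video_es)             # fourth thread segment unit
    return audio, video
```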
Still further, the second thread segment unit may include:
the type analysis subunit is used for respectively carrying out type analysis on the audio stream data and the video stream data to obtain the data type of the audio stream data and the data type of the video stream data;
the audio coding format determining subunit is configured to determine, according to the data type of the audio stream data, an audio coding format corresponding to the audio stream data;
the video coding format determining subunit is configured to determine, according to the data type of the video stream data, a video coding format corresponding to the video stream data;
the audio decoder determining subunit is used for determining a corresponding audio decoder according to the audio coding format;
and the video decoder determining subunit is used for determining a corresponding video decoder according to the video coding format.
Further, the apparatus may further include:
the multimedia module is used for constructing a multimedia architecture, and the multimedia architecture comprises an audio player and a video player;
the screen projection playing module is specifically used for:
and synchronously playing the audio and the video through the multimedia architecture.
It should be noted that the functions that can be achieved by each module of the audio/video screen projection apparatus provided in this embodiment, and the corresponding technical effects, may refer to the description of the specific implementations in the embodiments of the audio/video screen projection method of the present invention; for brevity, they are not repeated here.
Example four
Based on the same inventive concept, fig. 2 shows a hardware structure diagram of a screen projection device according to the embodiments of the present invention. This embodiment provides a screen projection device, which may include a processor and a memory, where the memory stores an audio/video screen projection program, and when the audio/video screen projection program is executed by the processor, all or part of the steps of each embodiment of the audio/video screen projection method are implemented.
Specifically, the screen projection device is a terminal device capable of network connection, and may be, for example, a television, a conference tablet, an intelligent display screen, and the like.
It will be appreciated that the device may also include a communications bus, a user interface and a network interface.
Wherein the communication bus is used for realizing connection communication among the components.
The user interface is used for connecting the client and performing data communication with the client, and may include an output unit such as a display screen and an input unit such as a keyboard, and optionally may also include other input/output interfaces such as a standard wired interface and a wireless interface.
The network interface is used for connecting to a background server and performing data communication with the background server; the network interface may include input/output interfaces such as standard wired interfaces and wireless interfaces, for example a Wi-Fi interface.
The memory is used to store various types of data, which may include, for example, instructions for any application or method running on the screen projection device, as well as application-related data. The memory may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as Static Random Access Memory (SRAM), Random Access Memory (RAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk; alternatively, the memory may be a storage device independent of the processor.
The Processor may be an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and is configured to call an audio/video screen projecting program stored in the memory and execute the audio/video screen projecting method.
EXAMPLE five
Based on the same inventive concept, the present embodiment provides a screen projection system, which includes:
the system comprises terminal equipment and screen projection equipment;
and the terminal equipment is in communication connection with the screen projection equipment.
For the specific structure of the screen projection device, reference is made to the above embodiments. Since this embodiment adopts all the technical solutions of the above embodiments, it achieves at least all the beneficial effects brought by those technical solutions, which are not described in detail here.
EXAMPLE six
Based on the same inventive concept, the present embodiment provides a computer-readable storage medium, such as a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, a server, an App application store, etc., wherein the storage medium stores thereon a computer program, the computer program is executable by one or more processors, and when the computer program is executed by the processors, the computer program can implement all or part of the steps of the various embodiments of the audiovisual screen projection method of the present invention.
It should be noted that the above-mentioned serial numbers of the embodiments of the present invention are merely for description, and do not represent the merits of the embodiments.
The above description is only an optional embodiment of the present invention and is not intended to limit the scope of the present invention. Any equivalent structural or process modification made using the contents of this specification and the drawings, or any direct or indirect application in other related technical fields, falls within the scope of the present invention.

Claims (11)

1. An audio and video screen projection method is applied to screen projection equipment, and comprises the following steps:
acquiring audio and video data sent by terminal equipment; the audio and video data are composite stream data containing audio and video;
performing type analysis on the audio and video data based on a pre-stored type database to obtain a corresponding coding format;
acquiring a target solution composite plug-in according to the coding format and a pre-stored plug-in registry;
creating an audio and video processing thread according to the target solution composite plug-in;
performing de-compounding processing on the audio and video data based on the audio and video processing thread to obtain audio and video;
playing the audio and the video;
the type analysis of the audio and video data is carried out based on a pre-stored type database, and the obtaining of the corresponding coding format comprises the following steps:
searching a type corresponding to the audio and video data in a pre-stored type database to obtain a data format of the audio and video data; wherein the type is a data format of the composite stream data;
according to the data format of the audio and video data, carrying out composite stream analysis on the audio and video data to obtain a corresponding coding format;
the audio and video data are subjected to de-compounding processing based on the audio and video processing thread, and the obtaining of the audio and the video comprises the following steps:
according to the target de-compounding plug-in, de-compounding the audio and video data to obtain audio stream data and video stream data;
respectively carrying out type analysis on the audio stream data and the video stream data to determine an audio decoder and a video decoder;
decoding the audio stream data through the audio decoder to obtain audio;
decoding the video stream data through the video decoder to obtain a video;
the analyzing the types of the audio stream data and the video stream data respectively, and determining the audio decoder and the video decoder comprises:
respectively carrying out type analysis on the audio stream data and the video stream data to obtain the data type of the audio stream data and the data type of the video stream data;
determining an audio coding format corresponding to the audio stream data according to the data type of the audio stream data;
determining a video coding format corresponding to the video stream data according to the data type of the video stream data;
determining a corresponding audio decoder according to the audio coding format;
and determining a corresponding video decoder according to the video coding format.
2. The audio-video screen projection method according to claim 1, wherein before the step of acquiring the audio-video data sent by the terminal device, the method further comprises:
acquiring a connection establishment request sent by the terminal equipment;
responding to the connection establishing request, and distributing a network protocol address for the terminal equipment;
establishing a real-time stream transmission channel between the terminal equipment and the screen projection equipment according to the network protocol address;
the acquiring of the audio and video data sent by the terminal device comprises:
and acquiring audio and video data sent by the terminal equipment through the real-time streaming transmission channel.
3. The audio-video screen projection method according to claim 1, wherein the step of searching for the type corresponding to the audio-video data in a pre-stored type database to obtain the data format of the audio-video data comprises:
acquiring a message field of the audio and video data;
matching the message field with a data type in a pre-stored type database;
if the matching is successful, obtaining the type corresponding to the message field;
and determining the type corresponding to the message field as the data format of the audio and video data.
4. The audio-video screen projection method according to claim 1, wherein the performing composite stream analysis on the audio-video data according to the data format of the audio-video data to obtain a corresponding encoding format comprises:
acquiring field types corresponding to the data formats of the audio and video data according to preset configuration rules, wherein the preset configuration rules comprise corresponding relations between the data formats and the field types;
and determining the coding format of the audio and video data according to the field type.
5. The audio-video screen projection method according to claim 1, wherein before the step of obtaining the target de-compositing plug-in according to the encoding format and the pre-stored plug-in registry, the method further comprises:
acquiring a pre-stored plug-in and a data format supported by the pre-stored plug-in;
and establishing a mapping relation between the data format and the pre-stored plug-in unit to obtain the pre-stored plug-in unit registry.
6. The audio-video screen projection method according to claim 1, wherein the obtaining of the target solution composite plug-in according to the encoding format and the pre-stored plug-in registry comprises:
and searching a solution composite plug-in corresponding to the coding format in a pre-stored plug-in registry to obtain a target solution composite plug-in, wherein the target solution composite plug-in is a plug-in capable of performing solution composite processing on the composite stream data in the coding format.
7. The audiovisual screen projection method of claim 1, wherein prior to said step of playing said audio and said video, said method further comprises:
constructing a multimedia architecture, wherein the multimedia architecture comprises an audio player and a video player;
the playing the audio and the video comprises:
and synchronously playing the audio and the video through the multimedia architecture.
8. An audio/video screen projection device, characterized in that the device comprises:
the data acquisition module is used for acquiring audio and video data sent by the terminal equipment; the audio and video data are composite stream data containing audio and video;
the type analysis module is used for carrying out type analysis on the audio and video data based on a pre-stored type database to obtain a corresponding coding format;
the plug-in searching module is used for obtaining a target solution composite plug-in according to the coding format and a pre-stored plug-in registry;
the thread creating module is used for creating an audio and video processing thread according to the target solution composite plug-in;
the data processing module is used for performing de-compounding processing on the audio and video data based on the audio and video processing thread to obtain audio and video;
the screen projection playing module is used for playing the audio and the video;
the type analysis module comprises:
the data format acquisition unit is used for searching a type corresponding to the audio and video data in a pre-stored type database to obtain the data format of the audio and video data; wherein the type is a data format of the composite stream data;
the encoding format acquisition unit is used for carrying out composite stream analysis on the audio and video data according to the data format of the audio and video data to obtain a corresponding encoding format;
the data processing module comprises:
the first thread segment unit is used for de-compounding the audio and video data according to the target de-compounding plug-in to obtain audio stream data and video stream data;
a second thread segment unit, configured to perform type analysis on the audio stream data and the video stream data, respectively, and determine an audio decoder and a video decoder;
a third thread segment unit, configured to decode the audio stream data through the audio decoder to obtain an audio;
a fourth thread segment unit, configured to decode the video stream data by the video decoder to obtain a video;
the second thread segment unit includes:
the type analysis subunit is used for respectively carrying out type analysis on the audio stream data and the video stream data to obtain the data type of the audio stream data and the data type of the video stream data;
the audio coding format determining subunit is configured to determine, according to the data type of the audio stream data, an audio coding format corresponding to the audio stream data;
the video coding format determining subunit is configured to determine, according to the data type of the video stream data, a video coding format corresponding to the video stream data;
the audio decoder determining subunit is used for determining a corresponding audio decoder according to the audio coding format;
and the video decoder determining subunit is used for determining a corresponding video decoder according to the video coding format.
9. A screen projection apparatus, comprising:
the memory stores an audio and video screen projection program;
a processor executing the audiovisual screen projection program to implement the audiovisual screen projection method as claimed in any one of claims 1 to 7.
10. A screen projection system, the system comprising:
a terminal device;
the screen projecting device of claim 9;
and the terminal equipment is in communication connection with the screen projection equipment.
11. A computer-readable storage medium, having stored thereon a computer program executable by one or more processors to implement an audiovisual projection method as claimed in any of claims 1 to 7.
CN202111584667.7A 2021-12-22 2021-12-22 Audio and video screen projection method, device, equipment, system and storage medium Active CN114302202B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111584667.7A CN114302202B (en) 2021-12-22 2021-12-22 Audio and video screen projection method, device, equipment, system and storage medium
PCT/CN2021/142286 WO2023115618A1 (en) 2021-12-22 2021-12-29 Audio/video screen projection method, device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111584667.7A CN114302202B (en) 2021-12-22 2021-12-22 Audio and video screen projection method, device, equipment, system and storage medium

Publications (2)

Publication Number Publication Date
CN114302202A CN114302202A (en) 2022-04-08
CN114302202B true CN114302202B (en) 2023-03-14

Family

ID=80969896

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111584667.7A Active CN114302202B (en) 2021-12-22 2021-12-22 Audio and video screen projection method, device, equipment, system and storage medium

Country Status (2)

Country Link
CN (1) CN114302202B (en)
WO (1) WO2023115618A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115087132A (en) * 2022-06-08 2022-09-20 苏州华启智能科技有限公司 Miracast-based screen projection method, electronic device and system
CN115278258A (en) * 2022-07-11 2022-11-01 海南视联通信技术有限公司 Audio and video data processing method and device, terminal and readable storage medium
CN115065852B (en) * 2022-08-22 2022-11-01 深圳创维-Rgb电子有限公司 Sound and picture synchronization method and device, electronic equipment and readable storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101568030A (en) * 2009-06-05 2009-10-28 湖南工程学院 Method and system for decoding self-adaptive multi-standard reconfigurable video
CN101771869A (en) * 2008-12-30 2010-07-07 深圳市万兴软件有限公司 AV (audio/video) encoding and decoding device and method
CN102122300A (en) * 2011-03-08 2011-07-13 天脉聚源(北京)传媒科技有限公司 Method for converting video format
CN102655602A (en) * 2012-04-19 2012-09-05 北京交通大学 Decoding device for adaptive clients and plugin acquisition method
CN104954863A (en) * 2015-06-05 2015-09-30 深圳市九洲电器有限公司 Audio and video decoding component selection method and system
CN107155124A (en) * 2017-06-19 2017-09-12 环球智达科技(北京)有限公司 The method played for multi-source
CN109194698A (en) * 2018-11-01 2019-01-11 浩云科技股份有限公司 A kind of Real-time Video Processing System and method based on GStreamer frame
CN109558105A (en) * 2018-12-14 2019-04-02 广州视源电子科技股份有限公司 Screen method is thrown, screen device is thrown, throws screen equipment
CN110418186A (en) * 2019-02-01 2019-11-05 腾讯科技(深圳)有限公司 Audio and video playing method, apparatus, computer equipment and storage medium
CN113422967A (en) * 2021-06-07 2021-09-21 深圳康佳电子科技有限公司 Screen projection display control method and device, terminal equipment and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7936933B2 (en) * 2005-11-18 2011-05-03 Microsoft Corporation Accelerating video data decoding
CN105681912A (en) * 2015-10-16 2016-06-15 乐视致新电子科技(天津)有限公司 Video playing method and device
CN108668162B (en) * 2018-03-20 2021-06-04 海信视像科技股份有限公司 Video file playing processing method and device and intelligent terminal
CN110784758B (en) * 2019-08-22 2021-11-05 腾讯科技(深圳)有限公司 Screen projection processing method and device, electronic equipment and computer program medium
CN110740363B (en) * 2019-10-25 2022-11-18 华为终端有限公司 Screen projection method and system and electronic equipment
CN113038200A (en) * 2019-12-25 2021-06-25 杭州海康威视数字技术股份有限公司 Device, system and method for realizing wireless screen projection
CN113316013B (en) * 2021-05-31 2022-04-26 烽火通信科技股份有限公司 Video screen projection method and system

Also Published As

Publication number Publication date
CN114302202A (en) 2022-04-08
WO2023115618A1 (en) 2023-06-29

Similar Documents

Publication Publication Date Title
CN114302202B (en) Audio and video screen projection method, device, equipment, system and storage medium
US9906574B2 (en) Method, server, client and system for data presentation in a multiplayer session
US10284644B2 (en) Information processing and content transmission for multi-display
CN107911437B (en) Screen sharing method and device, electronic equipment and storage medium
US20200195842A1 (en) Information pushing method, storage medium, terminal device, and server
WO2015074445A1 (en) Multipath wireless displaying method and device
WO2020151660A1 (en) Stb cloudification method and system, thin stb, virtual stb and platform, and storage medium
US8872886B2 (en) Method, apparatus, and system for establishing multi-cascade channel
US20090225153A1 (en) Apparatus For Synthesizing Information, Apparatus For Controlling Information, And Computer Program Product
WO2017020790A1 (en) Multi-screen control method and device
US20170171496A1 (en) Method and Electronic Device for Screen Projection
US20120240180A1 (en) Set-top box, earphone, and multimedia playing method
CN112153457A (en) Wireless screen projection connection method and device, computer storage medium and electronic equipment
WO2023160361A1 (en) Rtc data processing method and apparatus
CN113926185A (en) Data processing method, device, equipment and storage medium
KR102064284B1 (en) Device, system, and method to perform real-time communication
CN112788384A (en) Wireless digital television screen projection method and device, computer equipment and storage medium
WO2021078208A1 (en) Information acquisition method, cloudified set top box system, entity set top box, and storage medium
WO2013078614A1 (en) Audio compression algorithm selection method, device and communication system in scenario of virtual desktop
CN112770078A (en) Video conference terminal fusion method and device and readable storage medium
WO2019141150A1 (en) Cross-platform content control method and device, terminal, server, and storage medium
CN115802007A (en) Monitoring system control method and device based on RTSP (real time streaming protocol) and readable storage medium
CN108616760B (en) Live broadcasting method based on browser, server, anchor terminal and user terminal
WO2018171567A1 (en) Method, server, and terminal for playing back media stream
JP5961860B2 (en) Media stream playback method and set top box

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant