CN117241054A - Cross-network live broadcast system, method, electronic equipment and storage medium - Google Patents

Cross-network live broadcast system, method, electronic equipment and storage medium

Info

Publication number
CN117241054A
CN117241054A
Authority
CN
China
Prior art keywords
stream data
direct
decoded
data
video
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311331281.4A
Other languages
Chinese (zh)
Inventor
全克球
戴建武
康丽丽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Thunisoft Information Technology Co ltd
Original Assignee
Beijing Thunisoft Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Thunisoft Information Technology Co ltd filed Critical Beijing Thunisoft Information Technology Co ltd
Priority to CN202311331281.4A priority Critical patent/CN117241054A/en
Publication of CN117241054A publication Critical patent/CN117241054A/en
Pending legal-status Critical Current

Landscapes

  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The embodiment of the application provides a cross-network live broadcast system, method, electronic device and storage medium. The cross-network live broadcast system comprises a first video decoder, a video encoder and a second video decoder; the video encoder is connected with the first video decoder through an HDMI (High-Definition Multimedia Interface) line, and the second video decoder is connected with a large-screen display in a second intranet through an HDMI line. The first video decoder, arranged in a first intranet, decodes first live stream data pushed by a live broadcast source in the first intranet; the video encoder encodes the decoded first live stream data output by the first video decoder over the HDMI line to obtain second live stream data; the second video decoder decodes the second live stream data sent by the video encoder over the Internet and sends the decoded second live stream data to the large-screen display through the HDMI line for output by the large-screen display. In this way, the cross-network live broadcast function is realized at lower cost, meeting cross-network live broadcast requirements.

Description

Cross-network live broadcast system, method, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular to a cross-network live broadcast system, method, electronic device, and storage medium.
Background
Because a court's intranet is an internal local area network, related resources such as court trial live broadcasts can be accessed only from within that intranet. However, other courts sometimes want to watch the trial of a related case in real time, and the common practice is to travel to the scene, which increases travel costs. It is therefore necessary to solve the problem of cross-network live broadcast between different intranets.
Disclosure of Invention
Aspects of the application provide a cross-network live broadcast system, method, electronic device, and storage medium, which are used to solve the problem of cross-network live broadcast between different intranets.
The embodiment of the application provides a cross-network live broadcast system, comprising a first video decoder, a video encoder and a second video decoder. The video encoder is connected with the first video decoder through an HDMI (High-Definition Multimedia Interface) line, and the second video decoder is connected with a large-screen display in a second intranet through an HDMI line. The first video decoder is arranged in a first intranet and is used for receiving first live stream data pushed by a live broadcast source in the first intranet and decoding the first live stream data to obtain decoded first live stream data. The video encoder is used for receiving the decoded first live stream data output by the first video decoder via the HDMI line and encoding the decoded first live stream data to obtain second live stream data. The second video decoder is used for receiving the second live stream data sent by the video encoder through the Internet, decoding the second live stream data to obtain decoded second live stream data, and sending the decoded second live stream data to the large-screen display through the HDMI line, so that the large-screen display outputs the decoded second live stream data.
The embodiment of the application also provides a cross-network live broadcast method, applied to a cross-network live broadcast system. The system comprises a first video decoder, a video encoder and a second video decoder; the video encoder is connected with the first video decoder through an HDMI (High-Definition Multimedia Interface) line, the second video decoder is connected with a large-screen display in a second intranet through an HDMI line, and the first video decoder is arranged in a first intranet. In the method, the first video decoder receives first live stream data pushed by a live broadcast source in the first intranet and decodes it to obtain decoded first live stream data; the video encoder receives the decoded first live stream data output by the first video decoder via the HDMI line and encodes it to obtain second live stream data; the second video decoder receives the second live stream data sent by the video encoder through the Internet, decodes it to obtain decoded second live stream data, and sends the decoded second live stream data to the large-screen display through the HDMI line so that the large-screen display outputs it.
An embodiment of the present application provides an electronic device, including a memory and a processor. The memory stores a computer program; the processor, coupled to the memory, executes the computer program to perform the steps of the cross-network live broadcast method.
An embodiment of the application provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the steps of the cross-network live broadcast method.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
fig. 1 is a schematic structural diagram of a cross-network live broadcast system according to an embodiment of the present application;
fig. 2 is a flowchart of a cross-network live broadcast method provided by an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be clearly and completely described below with reference to specific embodiments of the present application and corresponding drawings. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent three cases: A alone, A and B together, and B alone, where A and B may be singular or plural. In the text of the present application, the character "/" generally indicates that the associated objects before and after it are in an "or" relationship. In addition, in the embodiments of the present application, "first", "second", "third", etc. are used only to distinguish different objects and have no other special meaning.
The following briefly introduces several terms involved in the embodiments of the present application:
HDMI (High-Definition Multimedia Interface) line: transmits uncompressed high-definition video and multi-channel audio data with high quality, with a maximum data rate of 5 Gbps. Signal transmission requires no digital-to-analog or analog-to-digital conversion, ensuring the highest-quality video signal transmission. In addition, an HDMI line has the following features: 1. It supports a data rate of 5 Gbps over runs of up to 30 meters, sufficient for a 1080p video signal and an 8-channel audio signal. 2. Because a 1080p video signal plus an 8-channel audio signal requires less than 4 Gbps, the HDMI line has ample headroom. 3. HDMI supports EDID and DDC2B, so HDMI devices are plug-and-play: the signal source and display device negotiate automatically and select the most suitable video/audio format.
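The bandwidth figures above can be sanity-checked with a short calculation. The nominal values below (1080p at 60 fps, 24 bits per pixel, 8 audio channels at 192 kHz / 24-bit) are illustrative assumptions, not figures taken from the patent:

```python
# Back-of-envelope check that uncompressed 1080p video plus 8-channel audio
# stays below the HDMI link rate quoted above.
video_bps = 1920 * 1080 * 24 * 60   # 1080p60, 24 bits per pixel, uncompressed
audio_bps = 8 * 192_000 * 24        # 8 channels, 192 kHz, 24-bit samples
total_gbps = (video_bps + audio_bps) / 1e9
print(f"{total_gbps:.2f} Gbps")     # about 3.02 Gbps, below the 5 Gbps link rate
```

Under these assumptions the payload is roughly 3 Gbps, consistent with the claim that the combined signal needs less than 4 Gbps.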
Video decoder: restores a compressed signal to the original signal; compressed data (such as sound or images) is restored according to the corresponding standard and then output through a sound card or display screen. Video decoders are widely used in fields such as multimedia players and digital television set-top boxes.
Video encoder: a device for rapidly compressing large volumes of video data, making video storage and transmission more efficient and saving bandwidth and storage space. A video encoder can also be used to resize video and change its format, making playback clearer and smoother.
Large-screen display: a display whose screen diagonal is relatively large.
YUV: a pixel encoding format in which the luminance component (Luminance or Luma) and the chrominance components (Chrominance or Chroma) are represented separately.
Because a court's intranet is an internal local area network, related resources such as court trial live broadcasts can be accessed only from within that intranet. However, other courts sometimes want to watch the trial of a related case in real time, and the common practice is to travel to the scene, which increases travel costs. It is therefore necessary to solve the problem of cross-network live broadcast between different intranets.
Based on this, the embodiments of the application provide a cross-network live broadcast system, method, electronic device and storage medium. The cross-network live broadcast system comprises a first video decoder, a video encoder and a second video decoder; the video encoder is connected with the first video decoder through an HDMI (High-Definition Multimedia Interface) line, and the second video decoder is connected with a large-screen display in a second intranet through an HDMI line. The first video decoder is arranged in a first intranet and is used for receiving first live stream data pushed by a live broadcast source in the first intranet and decoding it to obtain decoded first live stream data. The video encoder is used for receiving the decoded first live stream data output by the first video decoder via the HDMI line and encoding it to obtain second live stream data. The second video decoder is used for receiving the second live stream data sent by the video encoder through the Internet, decoding it to obtain decoded second live stream data, and sending the decoded second live stream data to the large-screen display through the HDMI line so that the large-screen display outputs it. In this way, the live stream data of the first intranet can be displayed or played on a large-screen display in the second intranet, realizing the cross-network live broadcast function at lower cost and meeting the cross-network live broadcast requirements of various application scenarios.
The following describes in detail the technical solutions provided by the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic structural diagram of a cross-network live broadcast system according to an embodiment of the present application. Referring to Fig. 1, the system includes a first video decoder, a video encoder, and a second video decoder; the video encoder is connected with the first video decoder through an HDMI line, and the second video decoder is connected with a large-screen display in the second intranet through an HDMI line. For ease of understanding and distinction: the first intranet and the second intranet are two different local area networks; the first video decoder is the video decoder arranged in the first intranet; and the second video decoder is the video decoder that communicates with the large-screen display.
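The three-stage topology above can be sketched as a simple pipeline. The function names and the byte-string "decode"/"encode" stand-ins below are purely illustrative placeholders for the real codec hardware:

```python
# Hypothetical sketch of the pipeline in Fig. 1: intranet-1 decoder ->
# HDMI -> encoder -> Internet -> intranet-2 decoder -> HDMI -> display.
def first_decoder(live_stream: bytes) -> bytes:
    """Decode the first live stream data inside the first intranet (stand-in)."""
    return b"decoded:" + live_stream

def encoder(decoded: bytes) -> bytes:
    """Re-encode the HDMI output into second live stream data (stand-in)."""
    return b"encoded:" + decoded

def second_decoder(second_stream: bytes) -> bytes:
    """Decode the Internet-delivered stream for the large-screen display (stand-in)."""
    return b"decoded2:" + second_stream

# Data flows through the three devices in order:
frame = second_decoder(encoder(first_decoder(b"court-feed")))
```

The point of the sketch is only the ordering of the stages: decode inside the source intranet, re-encode at the HDMI boundary, decode again at the display side.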
In this embodiment, the first video decoder is disposed in the first intranet and is configured to receive first live stream data pushed by a live broadcast source in the first intranet, and to decode the first live stream data to obtain decoded first live stream data.
In practical application, a live broadcast source in the first intranet pushes live stream data to the first video decoder through the intranet; this live stream data is called the first live stream data. The first live stream data may be live stream data of any streaming media transmission protocol, including, for example but not limited to, RTP (Real-time Transport Protocol), RTCP (RTP Control Protocol), and RTSP (Real Time Streaming Protocol).
The first video decoder decodes the received first live stream data to obtain decoded first live stream data. In some optional embodiments, the first video decoder may receive first live stream data pushed by each of a plurality of live broadcast sources in the first intranet and decode the plurality of first live stream data to obtain the decoded first live stream data. Specifically, when decoding the plurality of first live stream data, the first video decoder on the one hand decodes the video data in the plurality of first live stream data to obtain the video data in the decoded first live stream data, and on the other hand decodes the audio data in the plurality of first live stream data to obtain the audio data in the decoded first live stream data.
Illustratively, when the first video decoder decodes the plurality of first live stream data, it creates a blank new YUV picture with a specified resolution, decodes the video data in each first live stream data into YUV data, and superimposes each YUV picture in the YUV data onto the new YUV picture according to that picture's position and picture size within the new YUV picture, thereby obtaining the video data in the decoded first live stream data. The specified resolution is set flexibly as needed.
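The compositing step above can be sketched with NumPy. For brevity only the Y (luminance) plane is shown; the 1280x720 canvas size and the side-by-side layout are assumed example values, not values fixed by the patent:

```python
import numpy as np

def composite(frames, layout, canvas_size=(720, 1280)):
    """Superimpose decoded Y planes onto a blank canvas.

    frames: list of 2-D uint8 Y planes (one per live broadcast source)
    layout: list of (top, left) pixel offsets for each frame on the canvas
    """
    canvas = np.zeros(canvas_size, dtype=np.uint8)   # blank new YUV picture (Y plane)
    for frame, (top, left) in zip(frames, layout):
        h, w = frame.shape
        canvas[top:top + h, left:left + w] = frame   # overlay at its configured position
    return canvas

# Two 360x640 sources tiled side by side at the top of the canvas:
a = np.full((360, 640), 100, dtype=np.uint8)
b = np.full((360, 640), 200, dtype=np.uint8)
out = composite([a, b], [(0, 0), (0, 640)])
```

A full implementation would do the same for the subsampled U and V planes, scaling each source to its target picture size before the overlay.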
Illustratively, when the first video decoder decodes the audio data in the plurality of first live stream data, it decodes the audio data in each first live stream data into PCM (Pulse-Code Modulation) format data, and then mixes the PCM format data corresponding to the plurality of first live stream data using a normalization algorithm to obtain the audio data in the decoded first live stream data.
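The mix-with-normalization step can be sketched as follows. The patent only says "a normalization algorithm"; the peak-scaling approach below is one common choice and is an assumption of this sketch:

```python
import numpy as np

def mix_pcm(streams):
    """Mix equal-length int16 PCM streams, rescaling so the sum cannot clip.

    streams: list of 1-D int16 arrays of decoded PCM samples.
    """
    # Sum in a wider dtype so intermediate values cannot overflow int16.
    acc = np.sum([s.astype(np.int32) for s in streams], axis=0)
    peak = np.max(np.abs(acc))
    if peak > 32767:                       # rescale only if the raw sum would clip
        acc = acc * 32767 // peak
    return acc.astype(np.int16)

# Two sources whose sum would exceed the int16 range get scaled back:
mixed = mix_pcm([np.array([20000, -15000], dtype=np.int16),
                 np.array([20000, -25000], dtype=np.int16)])
```

Here the raw sums (40000 and -40000) exceed the 16-bit range, so the whole mix is scaled down to fit.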
In this embodiment, the video encoder is configured to receive the decoded first live stream data output by the first video decoder via the HDMI line, and to encode the decoded first live stream data to obtain second live stream data.
Specifically, the first video decoder transmits the decoded first live stream data to the video encoder via the HDMI line; the video encoder encodes the decoded first live stream data to obtain second live stream data, which is the live stream data output by the video encoder.
In practical applications, the video encoder encodes according to configured encoding parameters, which include, for example but not limited to, frame rate and bit rate. In practice, the video encoder may provide hardware encoding capability and/or software encoding capability. Software encoding uses the CPU (Central Processing Unit); hardware encoding does not use the CPU but instead uses hardware such as a graphics card GPU (Graphics Processing Unit), a dedicated DSP (Digital Signal Processor) chip, or an FPGA (Field-Programmable Gate Array) chip.
Further optionally, in order to improve the flexibility and reliability of cross-network live broadcast, when the video encoder encodes the decoded first live stream data to obtain the second live stream data, it is specifically configured to: configure the encoding parameters of the video encoder according to a target encoding parameter configuration strategy; determine whether the video format of the decoded first live stream data supports hardware encoding; if so, hardware-encode the decoded first live stream data using a hardware encoding module in the video encoder according to the encoding parameters; and if not, software-encode the decoded first live stream data using a software encoding module in the video encoder according to the encoding parameters.
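The format-based dispatch above can be sketched as follows. The format whitelist and the two encode functions are placeholders of this sketch, not a real encoder API:

```python
# Assumed set of formats the hardware encoding module can handle.
HW_SUPPORTED = {"h264", "h265"}

def encode(decoded_stream, video_format, params):
    """Route to the hardware module when the format allows it, else software."""
    if video_format in HW_SUPPORTED:
        return hw_encode(decoded_stream, params)   # GPU / DSP / FPGA path
    return sw_encode(decoded_stream, params)       # CPU fallback path

def hw_encode(data, params):
    return ("hw", params, data)   # stand-in for the hardware encoding module

def sw_encode(data, params):
    return ("sw", params, data)   # stand-in for the software encoding module
```

Software encoding thus acts as the universal fallback, since it supports formats the hardware module cannot.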
Illustratively, in order to improve the flexibility of cross-network live broadcast, when the video encoder configures its encoding parameters according to the target encoding parameter configuration strategy, it is specifically configured to: obtain, in response to a configuration operation, the target encoding parameter configuration strategy configured for the video encoder; if the target encoding parameter configuration strategy is one in which picture fluency may change and picture definition stays unchanged, configure the encoding parameters with the frame rate variable and the bit rate unchanged as the target; and if the target encoding parameter configuration strategy is one in which picture definition may change and picture fluency stays unchanged, configure the encoding parameters with the bit rate variable and the frame rate unchanged as the target.
In practical application, a user can configure the target encoding parameter configuration strategy as needed, so as to control either picture fluency or picture definition. When the encoding parameters are configured with the frame rate variable and the bit rate unchanged as the target, the video encoder changes the frame rate of the decoded first live stream data while keeping its bit rate unchanged; when the encoding parameters are configured with the bit rate variable and the frame rate unchanged as the target, the video encoder changes the bit rate of the decoded first live stream data while keeping its frame rate unchanged.
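The two strategies can be expressed as a small configuration function. The strategy labels and field names below are assumptions of this sketch; the patent fixes only which parameter may change under each strategy:

```python
def configure(strategy, base_frame_rate, base_bitrate, target):
    """Return encoder parameters under the chosen configuration strategy."""
    if strategy == "variable_fluency":    # frame rate may change, bit rate fixed
        return {"frame_rate": target, "bitrate": base_bitrate}
    if strategy == "variable_clarity":    # bit rate may change, frame rate fixed
        return {"frame_rate": base_frame_rate, "bitrate": target}
    raise ValueError(f"unknown strategy: {strategy}")

# Trade fluency for bandwidth: drop from 30 fps to 25 fps, bit rate untouched.
params = configure("variable_fluency", base_frame_rate=30,
                   base_bitrate=4_000_000, target=25)
```

Each strategy pins one parameter and lets the other move, matching the fluency-versus-definition trade-off described above.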
In practical application, hardware encoding supports only a limited set of live stream video formats, whereas software encoding can support many video formats. Therefore, to improve the reliability of cross-network live broadcast, hardware encoding or software encoding can be selected according to the video format of the decoded first live stream data.
Further optionally, the advantages of hardware encoding and software encoding can both be exploited to improve the effect of cross-network live broadcast: in the case that the video format of the decoded first live stream data supports hardware encoding, the decoded first live stream data is additionally software-encoded by the software encoding module in the video encoder according to the encoding parameters, and the hardware encoding result and the software encoding result are fused to obtain the second live stream data. For example, the fusion may be a weighted summation, averaging, or accumulation of the hardware encoding result and the software encoding result, although the present application is not limited thereto.
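A literal sketch of that fusion step, following the patent's wording (weighted summation of the two encoding results viewed as numeric arrays). The 0.5/0.5 weights are assumed, and a production system would more plausibly select one bitstream than average two, so this illustrates the wording rather than a recommended design:

```python
import numpy as np

def fuse(hw_result, sw_result, w_hw=0.5, w_sw=0.5):
    """Weighted-sum fusion of hardware and software encoding results."""
    return (w_hw * np.asarray(hw_result, dtype=np.float64)
            + w_sw * np.asarray(sw_result, dtype=np.float64))

# Equal weights reduce to an element-wise average of the two results:
fused = fuse([10, 20], [30, 40])
```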
In this embodiment, the second video decoder is configured to receive the second live stream data sent by the video encoder through the Internet, decode the second live stream data to obtain decoded second live stream data, and send the decoded second live stream data to the large-screen display through the HDMI line, so that the large-screen display can output the decoded second live stream data.
Specifically, the video encoder sends the second live stream data to the second video decoder through the Internet, and the second video decoder decodes the received second live stream data to obtain decoded second live stream data. The second video decoder then sends the decoded second live stream data to the large-screen display through the HDMI line, and the large-screen display in the second intranet outputs it.
In practice, the second video decoder may provide hardware decoding capability and/or software decoding capability. Software decoding uses the CPU; hardware decoding does not use the CPU but instead uses hardware such as a graphics card GPU, a dedicated DSP chip, or an FPGA chip.
In practical application, hardware decoding supports only a limited set of live stream video formats, whereas software decoding can support many video formats. Therefore, to improve the reliability of cross-network live broadcast, hardware decoding or software decoding can be selected according to the video format of the second live stream data. Accordingly, when decoding the second live stream data, the second video decoder is specifically configured to: determine whether the video format of the second live stream data supports hardware decoding; if so, hardware-decode the second live stream data using a hardware decoding module in the second video decoder; and if not, software-decode the second live stream data using a software decoding module in the second video decoder.
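Mirroring the encoder-side dispatch, the decode-path choice can be sketched as follows; the format set and function bodies are placeholders of this sketch:

```python
# Assumed set of formats the hardware decoding module can handle.
HW_DECODABLE = {"h264", "h265"}

def decode_second_stream(stream, video_format):
    """Hardware-decode when the format allows it, else fall back to software."""
    if video_format in HW_DECODABLE:
        return ("hw_decode", stream)   # GPU / DSP / FPGA path
    return ("sw_decode", stream)       # CPU fallback path
```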
Further optionally, the advantages of hardware decoding and software decoding can both be exploited to improve the effect of cross-network live broadcast: in the case that the video format of the second live stream data supports hardware decoding, the second live stream data is additionally software-decoded by the software decoding module in the second video decoder, and the hardware-decoded second live stream data and the software-decoded second live stream data are fused to obtain the decoded second live stream data. For example, the fusion may be a weighted summation, averaging, or accumulation of the hardware decoding result and the software decoding result, although the present application is not limited thereto.
The cross-network live broadcast system provided by the embodiment of the application comprises a first video decoder, a video encoder and a second video decoder; the video encoder is connected with the first video decoder through an HDMI line, and the second video decoder is connected with a large-screen display in a second intranet through an HDMI line. The first video decoder is arranged in a first intranet and is used for receiving first live stream data pushed by a live broadcast source in the first intranet and decoding it to obtain decoded first live stream data. The video encoder is used for receiving the decoded first live stream data output by the first video decoder via the HDMI line and encoding it to obtain second live stream data. The second video decoder is used for receiving the second live stream data sent by the video encoder through the Internet, decoding it to obtain decoded second live stream data, and sending the decoded second live stream data to the large-screen display through the HDMI line so that the large-screen display outputs it. In this way, the live stream data of the first intranet can be displayed or played on a large-screen display in the second intranet, realizing the cross-network live broadcast function at lower cost and meeting the cross-network live broadcast requirements of various application scenarios.
Fig. 2 is a flowchart of a cross-network live broadcast method provided by an embodiment of the present application. The method is applied to a cross-network live broadcast system comprising a first video decoder, a video encoder and a second video decoder; the video encoder is connected with the first video decoder through an HDMI (High-Definition Multimedia Interface) line, the second video decoder is connected with a large-screen display in a second intranet through an HDMI line, and the first video decoder is arranged in the first intranet. Referring to Fig. 2, the method includes:
201. The first video decoder receives first live stream data pushed by a live broadcast source in the first intranet, and decodes the first live stream data to obtain decoded first live stream data.
202. The video encoder receives the decoded first live stream data output by the first video decoder via the HDMI line, and encodes the decoded first live stream data to obtain second live stream data.
203. The second video decoder receives the second live stream data sent by the video encoder through the Internet, decodes the second live stream data to obtain decoded second live stream data, and sends the decoded second live stream data to the large-screen display through the HDMI line, so that the large-screen display outputs the decoded second live stream data.
Further optionally, the video encoder encoding the decoded first live stream data to obtain the second live stream data includes: configuring the encoding parameters of the video encoder according to a target encoding parameter configuration strategy; determining whether the video format of the decoded first live stream data supports hardware encoding; if so, hardware-encoding the decoded first live stream data using a hardware encoding module in the video encoder according to the encoding parameters; and if not, software-encoding the decoded first live stream data using a software encoding module in the video encoder according to the encoding parameters.
Further optionally, in the case that the video format of the decoded first live stream data supports hardware encoding, the video encoder additionally software-encodes the decoded first live stream data using the software encoding module in the video encoder according to the encoding parameters, and fuses the hardware encoding result and the software encoding result to obtain the second live stream data.
Further optionally, the second video decoder decoding the second live stream data includes: determining whether the video format of the second live stream data supports hardware decoding; if so, hardware-decoding the second live stream data using a hardware decoding module in the second video decoder; and if not, software-decoding the second live stream data using a software decoding module in the second video decoder.
Further optionally, in the case that the video format of the second live stream data supports hardware decoding, the second live stream data is additionally software-decoded using the software decoding module in the second video decoder, and the hardware-decoded second live stream data and the software-decoded second live stream data are fused to obtain the decoded second live stream data.
Further optionally, the video encoder configuring its encoding parameters according to the target encoding parameter configuration strategy includes: obtaining, in response to a configuration operation, the target encoding parameter configuration strategy configured for the video encoder; if the target encoding parameter configuration strategy is one in which picture fluency may change and picture definition stays unchanged, configuring the encoding parameters with the frame rate variable and the bit rate unchanged as the target; and if the target encoding parameter configuration strategy is one in which picture definition may change and picture fluency stays unchanged, configuring the encoding parameters with the bit rate variable and the frame rate unchanged as the target.
Further optionally, the first video decoder decoding the first live stream data to obtain the decoded first live stream data includes: receiving first live stream data respectively pushed by a plurality of live broadcast sources in the first intranet; creating a blank new YUV picture with a specified resolution, decoding the video data in each first live stream data into YUV data, and superimposing each YUV picture in the YUV data onto the new YUV picture according to that picture's position and picture size within the new YUV picture, so as to obtain the video data in the decoded first live stream data; decoding the audio data in each first live stream data into PCM (Pulse-Code Modulation) format data; and mixing the PCM format data corresponding to the plurality of first live stream data using a normalization algorithm to obtain the audio data in the decoded first live stream data.
The cross-network live broadcast system provided by the embodiment of the application comprises: a first video decoder, a video encoder, and a second video decoder, wherein the first video decoder and the video encoder are connected through a high-definition multimedia interface (HDMI) line, and the second video decoder is connected with a large-screen display in a second intranet through an HDMI line. The first video decoder is arranged in a first intranet and is used for receiving first live stream data pushed by a live source in the first intranet and decoding it to obtain decoded first live stream data. The video encoder is used for receiving, through the HDMI line, the decoded first live stream data output by the first video decoder, and encoding it to obtain second live stream data. The second video decoder is used for receiving, through the Internet, the second live stream data sent by the video encoder, decoding it to obtain decoded second live stream data, and sending the decoded second live stream data to the large-screen display through the HDMI line, so that the large-screen display outputs it. In this way, the live stream data of the first intranet can be displayed or played on a large-screen display in the second intranet, realizing the cross-network live broadcast function at lower cost and meeting the cross-network live broadcast requirements of various application scenarios.
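The three-device chain described above can be modeled as a simple function composition. Each stage below is a symbolic stand-in for a physical device (the `"raw"`/`"enc"` tags are illustrative only); the point is the order of transformations: decode in the first intranet, encode for transport, decode again before the display.

```python
# Toy end-to-end model of the claimed chain:
# first decoder -> (HDMI) -> encoder -> (internet) -> second decoder -> display.

def first_decoder(live_stream):
    # First intranet: decode the pushed live stream into raw frames.
    return [("raw", f) for _, f in live_stream]


def encoder(raw_frames):
    # Receives raw frames over HDMI and encodes them for internet transport.
    return [("enc", f) for _, f in raw_frames]


def second_decoder(encoded_frames):
    # Second intranet: decode the relayed stream for the large-screen display.
    return [("raw", f) for _, f in encoded_frames]


def cross_network_pipeline(live_stream):
    """Compose the three stages in the order the system connects them."""
    return second_decoder(encoder(first_decoder(live_stream)))
```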
It should be noted that the steps of the method provided in the above embodiments may all be executed by the same device, or the method may be executed by different devices. For example, the execution subject of steps 201 to 203 may be device A; alternatively, the execution subject of steps 201 and 202 may be device A, and the execution subject of step 203 may be device B; and so on.
In addition, some of the flows described in the above embodiments and drawings include a plurality of operations appearing in a specific order, but it should be clearly understood that these operations may be performed out of the order in which they appear herein, or in parallel. Sequence numbers such as 201 and 202 are merely used to distinguish the operations and do not represent any order of execution. The flows may also include more or fewer operations, which may be performed sequentially or in parallel. The descriptions of "first" and "second" herein are used to distinguish different messages, devices, modules, etc.; they do not represent a sequence, nor do they require that the "first" and "second" be of different types.
It should be noted that the user information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, displayed data, etc.) involved in the present application are information and data authorized by the user or fully authorized by each party; the collection, use, and processing of the related data must comply with the relevant laws, regulations, and standards of the relevant countries and regions, and corresponding operation entries are provided for the user to choose to authorize or refuse.
Fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application. As shown in fig. 3, the electronic device includes: a memory 31 and a processor 32;
The memory 31 is used to store a computer program and may be configured to store various other data to support operations on the computing platform. Examples of such data include instructions for any application or method operating on the computing platform, contact data, phonebook data, messages, pictures, videos, and the like.
The memory 31 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk.
The processor 32 is coupled to the memory 31 and is configured to execute the computer program in the memory 31 to perform the steps in the cross-network live broadcast method.
Further optionally, as shown in fig. 3, the electronic device further includes: a communication component 33, a display 34, a power component 35, an audio component 36, and other components. Only some of the components are schematically shown in fig. 3, which does not mean that the electronic device includes only the components shown in fig. 3. In addition, the components within the dashed box in fig. 3 are optional components rather than mandatory components, depending on the product form of the electronic device.
The detailed implementation process of each action performed by the processor may refer to the related description in the foregoing method embodiment or the apparatus embodiment, and will not be repeated herein.
Accordingly, the present application also provides a computer readable storage medium storing a computer program, where the computer program is executed to implement the steps executable by the electronic device in the above method embodiments.
Accordingly, embodiments of the present application also provide a computer program product comprising a computer program/instructions which, when executed by a processor, cause the processor to carry out the steps of the above-described method embodiments that are executable by an electronic device.
The communication component is configured to facilitate wired or wireless communication between the device in which it is located and other devices. That device may access a wireless network based on a communication standard, such as WiFi (Wireless Fidelity), a 2G, 3G, 4G/LTE (Long Term Evolution), or 5G mobile communication network, or a combination thereof. In one exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, Infrared Data Association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The display includes a screen, which may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation.
The power supply component provides power for various components of equipment where the power supply component is located. The power components may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the devices in which the power components are located.
The audio component described above may be configured to output and/or input audio signals. For example, the audio component includes a microphone (MIC) configured to receive external audio signals when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signal may be further stored in the memory or transmitted via the communication component. In some embodiments, the audio component further comprises a speaker for outputting audio signals.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-readable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include non-persistent memory, random access memory (RAM), and/or non-volatile memory in a computer-readable medium, such as read-only memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), other types of random-access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the application are to be included in the scope of the claims of the present application.

Claims (10)

1. A cross-network live broadcast system, comprising: a first video decoder, a video encoder, and a second video decoder, wherein the video encoder is connected with the first video decoder through a high-definition multimedia interface (HDMI) line, and the second video decoder is connected with a large-screen display in a second intranet through an HDMI line;
the first video decoder is arranged in a first intranet and is used for receiving first live stream data pushed by a live source in the first intranet, and decoding the first live stream data to obtain decoded first live stream data;
the video encoder is used for receiving, through an HDMI line, the decoded first live stream data output by the first video decoder, and encoding the decoded first live stream data to obtain second live stream data;
the second video decoder is configured to receive the second live stream data sent by the video encoder through the internet, decode the second live stream data to obtain decoded second live stream data, and send the decoded second live stream data to the large screen display through the HDMI line, so that the large screen display outputs the decoded second live stream data.
2. The system according to claim 1, wherein when encoding the decoded first live stream data to obtain the second live stream data, the video encoder is specifically configured to:
configure encoding parameters of the video encoder according to a target encoding parameter configuration policy;
judge whether the video format of the decoded first live stream data supports hardware encoding;
if so, hardware-encode the decoded first live stream data by using a hardware encoding module in the video encoder according to the encoding parameters;
and if not, software-encode the decoded first live stream data by using a software encoding module in the video encoder according to the encoding parameters.
3. The system of claim 2, further comprising:
in the case that the video format of the decoded first live stream data supports hardware encoding, software-encoding the decoded first live stream data by using the software encoding module in the video encoder according to the encoding parameters;
and fusing the hardware encoding result and the software encoding result to obtain the second live stream data.
4. The system according to claim 1, wherein when the second video decoder decodes the second live stream data, the second video decoder is specifically configured to:
judging whether the video format of the second live stream data supports hardware decoding or not;
if so, performing hardware decoding on the second live stream data by using a hardware decoding module in the second video decoder;
and if not, performing software decoding on the second live stream data by using a software decoding module in the second video decoder.
5. The system of claim 4, further comprising:
in the case that the video format of the second live stream data supports hardware decoding, also software-decoding the second live stream data by using a software decoding module in the second video decoder;
and fusing the hardware-decoded second live stream data and the software-decoded second live stream data to obtain the decoded second live stream data.
6. The system according to claim 2, wherein when configuring the encoding parameters of the video encoder according to the target encoding parameter configuration policy, the video encoder is specifically configured to:
in response to a configuration operation, obtaining a target encoding parameter configuration policy configured for the video encoder;
if the target encoding parameter configuration policy is one in which picture fluency may change and picture definition remains unchanged, configure the encoding parameters of the video encoder with the frame rate changeable and the code rate unchanged as targets;
if the target encoding parameter configuration policy is one in which picture definition may change and picture fluency remains unchanged, configure the encoding parameters of the video encoder with the code rate changeable and the frame rate unchanged as targets.
7. The system of claim 2, wherein the first video decoder is specifically configured to:
receive first live stream data respectively pushed by a plurality of live sources in the first intranet;
create a blank new YUV picture with a specified resolution, decode the video data in each first live stream into YUV data, and overlay each YUV picture in the YUV data onto the new YUV picture according to its position and picture size in the new YUV picture, so as to obtain the video data of the decoded first live stream data;
decode the audio data in each first live stream into pulse code modulation (PCM) format data; and mix the PCM format data corresponding to the plurality of first live streams by using a normalization algorithm to obtain the audio data of the decoded first live stream data.
8. A cross-network live broadcast method, applied to a cross-network live broadcast system, the system comprising: a first video decoder, a video encoder, and a second video decoder, wherein the video encoder is connected with the first video decoder through a high-definition multimedia interface (HDMI) line, the second video decoder is connected with a large-screen display in a second intranet through an HDMI line, and the first video decoder is arranged in a first intranet; the method comprising:
the first video decoder receives first live stream data pushed by a live source in the first intranet, and decodes the first live stream data to obtain decoded first live stream data;
the video encoder receives, through an HDMI line, the decoded first live stream data output by the first video decoder, and encodes the decoded first live stream data to obtain second live stream data;
the second video decoder receives the second live stream data sent by the video encoder through the Internet, decodes the second live stream data to obtain decoded second live stream data, and sends the decoded second live stream data to the large screen display through the HDMI line so that the large screen display can output the decoded second live stream data.
9. An electronic device, comprising: a memory and a processor; the memory is used for storing a computer program; the processor is coupled to the memory for executing the computer program for performing the steps in the method of claim 8.
10. A computer readable storage medium storing a computer program, which when executed by a processor causes the processor to carry out the steps in the method of claim 8.
CN202311331281.4A 2023-10-13 2023-10-13 Cross-network live broadcast system, method, electronic equipment and storage medium Pending CN117241054A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311331281.4A CN117241054A (en) 2023-10-13 2023-10-13 Cross-network live broadcast system, method, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN117241054A 2023-12-15

Family

ID=89091204

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311331281.4A Pending CN117241054A (en) 2023-10-13 2023-10-13 Cross-network live broadcast system, method, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117241054A (en)

Similar Documents

Publication Publication Date Title
JP7370360B2 (en) Method and device for adapting video content decoded from an elementary stream to display characteristics
CN109327728B (en) One-to-many same-screen method, device and system, same-screen equipment and storage medium
CN110740363B (en) Screen projection method and system and electronic equipment
JP7011031B2 (en) Chroma prediction method and device
CN110460745B (en) Display device
CN110489073B (en) Conversion method and conversion device
US9514783B2 (en) Video editing with connected high-resolution video camera and video cloud server
WO2019169682A1 (en) Audio-video synthesis method and system
CN108063976B (en) Video processing method and device
CN103841389B (en) A kind of video broadcasting method and player
US20110026591A1 (en) System and method of compressing video content
CN110868625A (en) Video playing method and device, electronic equipment and storage medium
KR102617258B1 (en) Image processing method and apparatus
CN111316625A (en) Method and apparatus for generating a second image from a first image
JP7100052B2 (en) Electronic device and its control method
CN113938470A (en) Method and device for playing RTSP data source by browser and streaming media server
KR20210004702A (en) Artificial intelligence processor and performing neural network operation thereof
US20110085023A1 (en) Method And System For Communicating 3D Video Via A Wireless Communication Link
JP2016052015A (en) Electronic apparatus and gamut determination method
CN117241054A (en) Cross-network live broadcast system, method, electronic equipment and storage medium
CN106412684A (en) High-definition video wireless transmission method and system
KR20120012089A (en) System and method for proving video using scalable video coding
CN113038277B (en) Video processing method and device
CN113747099B (en) Video transmission method and device
US12028510B2 (en) Decoding device and operating method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination