CN116264619A - Resource processing method, device, server, terminal, system and storage medium - Google Patents


Info

Publication number
CN116264619A
Authority
CN
China
Prior art keywords
terminal
multimedia resource
image
node
resource
Prior art date
Legal status
Pending
Application number
CN202111536341.7A
Other languages
Chinese (zh)
Inventor
李志成
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202111536341.7A
Publication of CN116264619A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/21: Server components or server architectures
    • H04N 21/218: Source of audio or video content, e.g. local disk arrays
    • H04N 21/2187: Live feed

Abstract

The application discloses a resource processing method, device, server, terminal, system and storage medium, belonging to the technical field of the Internet. The method comprises the following steps: acquiring a first multimedia resource, the first multimedia resource being obtained by a first terminal encoding multiple captured frames of a first live image according to configuration information of the first terminal and a reference encoding algorithm; acquiring configuration information of at least one second terminal in the same live broadcast room as the first terminal; and adjusting the first multimedia resource based on the configuration information of each second terminal to obtain at least one second multimedia resource. The method is applicable to content distribution scenarios, for example distributing the acquired at least one second multimedia resource. Because the first terminal only needs to generate one multimedia resource, the method places low demands on the real-time encoding capability and uplink bandwidth of the first terminal and can reduce its load, and each obtained second multimedia resource matches its second terminal closely.

Description

Resource processing method, device, server, terminal, system and storage medium
Technical Field
The embodiments of the application relate to the technical field of the Internet, and in particular to a resource processing method, device, server, terminal, system and storage medium.
Background
With the continuous development of Internet technology, webcasting is applied in more and more scenarios. For example, webcasting is used not only in self-media scenarios, but also in news and entertainment, online education, social networking, and so on.
In the related art, after the anchor terminal captures multiple frames of a live image, it encodes them several times to obtain a plurality of live video streams, each with a different resolution and code rate, and uploads these live video streams to a server. A viewer terminal pulls the live video stream matching it from the server and displays the corresponding video.
However, since the anchor terminal must generate and upload a plurality of live video streams, it needs strong real-time encoding capability and uplink network bandwidth. When the real-time encoding capability and/or the uplink network bandwidth of the anchor terminal are poor, the live video stream acquired by the viewer terminal is delayed. Moreover, when none of the live video streams generated by the anchor terminal matches the viewer terminal, the viewer terminal can only pull and display a non-matching live video stream, so the display effect of the video shown on the viewer terminal is poor.
Disclosure of Invention
The embodiments of the application provide a resource processing method, device, server, terminal, system and storage medium, which can solve the problem in the related art that the high demands on the real-time encoding capability and uplink network bandwidth of the first terminal place heavy pressure on it. The technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a resource processing method, where the method includes:
acquiring a first multimedia resource, wherein the first multimedia resource is obtained by a first terminal encoding multiple captured frames of a first live image according to configuration information of the first terminal and a reference encoding algorithm;
acquiring configuration information of at least one second terminal in the same live broadcast room as the first terminal;
and adjusting the first multimedia resource based on the configuration information of each second terminal to obtain at least one second multimedia resource.
In a second aspect, an embodiment of the present application provides a resource processing method, where the method includes:
acquiring multiple frames of a first live image and configuration information of a first terminal;
encoding the multiple frames of the first live image based on the configuration information of the first terminal and a reference encoding algorithm to obtain a first multimedia resource;
and sending the first multimedia resource to a server, the server being used for adjusting the first multimedia resource to obtain at least one second multimedia resource.
In a third aspect, an embodiment of the present application provides a resource processing apparatus, where the apparatus includes:
the acquisition module is used for acquiring a first multimedia resource, the first multimedia resource being obtained by a first terminal encoding multiple captured frames of a first live image according to configuration information of the first terminal and a reference encoding algorithm;
the acquisition module is further used for acquiring configuration information of at least one second terminal in the same live broadcast room as the first terminal;
and the adjustment module is used for adjusting the first multimedia resource based on the configuration information of each second terminal to obtain at least one second multimedia resource.
In a fourth aspect, an embodiment of the present application provides a resource processing apparatus, where the apparatus includes:
the acquisition module is used for acquiring multiple frames of a first live image and configuration information of the first terminal;
the encoding module is used for encoding the multiple frames of the first live image based on the configuration information of the first terminal and a reference encoding algorithm to obtain a first multimedia resource;
and the sending module is used for sending the first multimedia resource to a server, the server being used for adjusting the first multimedia resource to obtain at least one second multimedia resource.
In a fifth aspect, an embodiment of the present application provides a server, where the server includes a processor and a memory, where the memory stores at least one program code, and the at least one program code is loaded and executed by the processor, so that the server implements the resource processing method described in the first aspect.
In a sixth aspect, an embodiment of the present application provides a terminal, where the terminal includes a processor and a memory, where the memory stores at least one program code, and the at least one program code is loaded and executed by the processor, so that the terminal implements the resource processing method in the second aspect.
In a seventh aspect, an embodiment of the present application provides a resource processing system, where the resource processing system includes a server and a terminal, where the server is configured to execute the resource processing method described in the first aspect, and the terminal is configured to execute the resource processing method described in the second aspect.
In an eighth aspect, there is also provided a computer readable storage medium having stored therein at least one program code loaded and executed by a processor to cause a computer to implement any of the above-described resource processing methods.
In a ninth aspect, there is also provided a computer program or computer program product having stored therein at least one computer instruction that is loaded and executed by a processor to cause a computer to implement any of the above-described resource processing methods.
The technical scheme provided by the embodiment of the application at least brings the following beneficial effects:
the technical scheme provided by the embodiment of the application can be realized by generating a first multimedia resource by the first terminal, has low requirements on the real-time coding capacity and the uplink bandwidth of the first terminal, and can lighten the pressure of the first terminal. In addition, the server in the application adjusts the first multimedia resource based on the configuration information of at least one second terminal in the same live broadcast room as the first terminal to obtain at least one second multimedia resource, and the obtained second multimedia resource can meet the requirements of the second terminal.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; other drawings can be derived from them by a person skilled in the art without inventive effort.
FIG. 1 is a schematic diagram of an implementation environment of a resource processing method according to an embodiment of the present application;
FIG. 2 is a flowchart of a resource processing method according to an embodiment of the present application;
FIG. 3 is a flowchart of a resource processing method according to an embodiment of the present application;
FIG. 4 is a flowchart of a resource processing method according to an embodiment of the present application;
FIG. 5 is a flowchart of a resource processing method according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a resource processing device according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of a resource processing device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
For ease of understanding, several terms referred to in the embodiments of the present application are explained first:
cloud computing (Cloud computing) is a computing model that distributes computing tasks across a pool of resources made up of a large number of computers, enabling various applications to acquire computing power, storage space, and information services as needed. The network that provides the resources is referred to as the "cloud". Resources in the cloud are infinitely expandable in the sense of users, and can be acquired at any time, used as needed, expanded at any time and paid for use as needed.
As a basic capability provider of cloud computing, a cloud computing resource pool (abbreviated as a "cloud platform", generally called an IaaS (Infrastructure as a Service) platform) is established, and multiple types of virtual resources are deployed in the pool for external clients to select and use. The cloud computing resource pool mainly comprises computing devices (virtualized machines, including operating systems), storage devices and network devices.
According to logical function division, a PaaS (Platform as a Service) layer can be deployed on the IaaS layer, and a SaaS (Software as a Service) layer can be deployed on the PaaS layer; SaaS can also be deployed directly on IaaS. PaaS is a platform on which software runs, such as a database or a web container. SaaS covers a wide variety of business software, such as web portals and SMS mass senders. Generally, SaaS and PaaS are upper layers relative to IaaS.
Cloud technology refers to a hosting technology that integrates hardware, software, network and other resources in a wide area network or a local area network to realize the computation, storage, processing and sharing of data. It is a general term for the network technology, information technology, integration technology, management platform technology, application technology and so on applied in the cloud computing business model; it can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing technology will become an important support. The background services of technical network systems, such as video websites, picture websites and other portals, require large amounts of computing and storage resources. As the Internet industry develops further, each article may in the future have its own identification mark, which will need to be transmitted to a background system for logical processing; data of different levels will be processed separately, and all kinds of industry data will need strong backend system support, which can only be realized through cloud computing.
Cloud storage is a concept that extends and develops from the concept of cloud computing. A distributed cloud storage system (hereinafter referred to as a storage system) is a storage system that, through functions such as cluster applications, network technology and distributed storage file systems, integrates a large number of storage devices of various types in a network (storage devices are also referred to as storage nodes) to work cooperatively via application software or application interfaces, jointly providing data storage and service access functions to the outside.
At present, the storage method of the storage system is as follows: when logical volumes are created, each logical volume is allocated physical storage space, which may consist of the disks of one or several storage devices. A client stores data on a certain logical volume, that is, the data is stored on a file system. The file system divides the data into several parts; each part is an object, and an object contains not only the data but also additional information such as a data identity (ID). The file system writes each object into the physical storage space of the logical volume and records the storage location information of each object, so that when the client requests access to the data, the file system can let the client access it according to the storage location information of each object.
The process by which the storage system allocates physical storage space to logical volumes is as follows: the physical storage space is divided in advance into stripes according to estimates of the capacity of the objects to be stored on a logical volume (these estimates tend to leave a large margin relative to the capacity actually needed) and the configuration of the redundant array of independent disks (Redundant Array of Independent Disks, RAID); one logical volume can be understood as one stripe, and physical storage space is thereby allocated to the logical volume.
Web Real-Time Communication (WebRTC): an API (Application Programming Interface) that enables web browsers to hold real-time voice or video conversations. It was open-sourced on June 1, 2011 and, with the support of Google, Mozilla (a browser vendor) and Opera (a browser vendor), was incorporated into the World Wide Web Consortium (W3C) recommendations. WebRTC provides the core technologies for video conferencing, including audio and video capture, codecs, network transmission and display, and supports cross-platform use: Windows, Linux, Mac and Android (all operating systems).
Live Event Broadcasting (LEB): also called ultra-low-latency live broadcast, an extension of standard live broadcast to ultra-low-latency playback scenarios. It has lower delay than traditional live protocols and offers viewers a millisecond-level live viewing experience, meeting scenarios with demanding latency requirements such as online education, live sports events and online quizzes.
Transcoding: converting from one file format or audio-video resolution code rate to another file format or audio-video resolution code rate. Such as transcoding an audio Video RTMP (Real time messaging protocol, real time transport protocol) streaming format (h.264/1080P) into an FLV (Flash Video) file format (h.265/720P).
Decoding Time Stamp (DTS): tells the player when to decode this frame of video data.
Presentation Time Stamp (PTS): tells the player when to display this frame of video data.
The Flash Video (FLV) format is a video format that emerged with the launch of Flash MX. Because FLV files are very small, load very quickly and use a simple protocol, and because FLV benefits from persistent-connection transmission based on HTTP/1.1, more than 80% of domestic live platforms mainly use FLV/RTMP.
HTTP Live Streaming (HLS), a dynamic rate adaptation technique, comprises an index file in m3u8 format (a playlist format), TS media slice files and KEY encryption string files.
Dynamic Adaptive Streaming over HTTP (DASH): an adaptive bit rate streaming technique that delivers high-quality streaming media via HTTP (Hyper Text Transfer Protocol).
Content Delivery Network (CDN): a layer of intelligent virtual network, built on the existing Internet from node servers placed throughout the network. Based on comprehensive real-time information such as network traffic, the connectivity and load of each node, the distance to the user and the response time, a CDN redirects the user's request to the service node nearest to the user. The basic idea of a CDN is to avoid, as far as possible, bottlenecks and links on the Internet that may affect the speed and stability of data transmission. Its purpose is to let users obtain the required content nearby, alleviating congestion on the Internet and improving the response speed when users access a website.
Source station: the server to which the video source is uploaded and accessed, storing the most original audio and video data.
Intermediate source: a second-level cache server added in order to reduce the bandwidth consumed by the large number of back-to-source requests to the source station caused by concurrent live downlink traffic.
Bandwidth Estimation (BWE): determines how large a video stream can be sent without causing network congestion, thereby ensuring that video quality does not degrade.
Multipoint Conferencing Unit (MCU): a star topology composed of one server and several terminals. Each terminal sends the audio and video streams it wants to share to the server; the server mixes the audio and video streams of all terminals in the same room into a single mixed stream and sends it to each terminal, so that every terminal can see/hear the audio and video of the other terminals. In effect the server is an audio/video mixer, and the pressure on the server in this scheme is very high.
Selective Forwarding Unit (SFU): likewise composed of one server and several terminals, but unlike an MCU, an SFU does not mix audio and video; after receiving the audio/video data stream shared by a terminal, it directly forwards it to the other terminals in the room. An SFU is in effect an audio/video routing repeater.
Fig. 1 is a schematic diagram of an implementation environment of a resource processing method according to an embodiment of the present application, as shown in fig. 1, where the implementation environment includes: a first terminal 101, a server 102 and a second terminal 103.
The first terminal 101 and the second terminal 103 are located in the same live room, and the number of second terminals 103 is at least one. The first terminal 101 and the second terminal 103 may be, but are not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, and the like. The server 102 may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms. The first terminal 101 and the second terminal 103 may each be directly or indirectly connected to the server 102 through wired or wireless communication, which is not limited here.
It should be understood by those skilled in the art that the above-mentioned first terminal 101, server 102 and second terminal 103 are only examples; other terminals or servers existing now or appearing in the future, if applicable to the present application, are also included in the scope of protection of the present application and are incorporated herein by reference.
Based on the above implementation environment, the embodiments of the application provide a resource processing method. Taking the flowchart of the resource processing method shown in fig. 2 as an example, the method can be illustrated through the interaction between the first terminal 101 and the server 102 in fig. 1. As shown in fig. 2, the method comprises the following steps:
in step 201, the first terminal acquires a plurality of frames of a first direct broadcast image and configuration information of the first terminal.
In an exemplary embodiment of the application, a first application program for acquiring multimedia resources is installed and runs on the first terminal. The first application program may be any type of application program, which is not limited in the embodiments of the application; for example, the first application may be a social application or a short-video application. In response to a selection instruction of the first user for the first application program, a first display page of the first application program is displayed, in which a control for acquiring multimedia resources is shown. Illustratively, the control for acquiring the multimedia resource is a "shoot" control.
In one possible implementation, a camera device for capturing live images is also installed and runs on the first terminal. The camera device comprises at least one of a front camera and a rear camera of the first terminal. In response to the first user's selection instruction for the control for acquiring multimedia resources, the first application program captures multiple frames of the first live image by calling the camera device of the first terminal.
Alternatively, the first terminal may acquire the multiple frames of the first live image in other ways, which is not limited in the embodiments of the application. For example, the first terminal captures the image displayed on the first terminal through a screen-sharing application program and uses the displayed image as the first live image.
Optionally, the configuration information of the first terminal includes at least one of the screen resolution of the first terminal and the terminal code rate. The screen resolution indicates the resolution of the screen display and determines how much information can be displayed on the terminal screen; it is measured in horizontal and vertical pixels. When the screen resolution is low, few pixels are displayed on the screen, but each pixel is relatively large; when the screen resolution is high, many pixels are displayed, each relatively small. For the same screen size, the higher the screen resolution, the finer the display effect. The terminal code rate is the number of data bits transmitted per unit time during data transmission. The code rate, also called bit rate, indicates how many bits per second the encoded video/audio data requires, i.e. the amount of data obtained by compressing the images displayed each second. The code rate is generally measured in kbps (kilobits per second); the larger the code rate per unit time, the higher the precision and the closer the processed file is to the original file, that is, the more picture detail is retained.
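For concreteness, the configuration information described above can be pictured as a small record pairing a screen resolution with a terminal code rate. The following Python sketch is illustrative only; the structure and field names are assumptions and do not appear in the publication.

    from dataclasses import dataclass

    @dataclass
    class ConfigurationInfo:
        resolution: tuple[int, int]  # pixel counts in the two screen directions
        code_rate_kbps: int          # terminal code rate in kbps

    # Example values taken from the description: a 1280*1080 screen
    # whose suggested code rate is 2700 kbps.
    config = ConfigurationInfo((1280, 1080), 2700)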
Alternatively, there are two ways to obtain the screen resolution of the first terminal.
In the first mode, the screen resolution of the first terminal is stored in the first terminal. The first terminal acquires the screen resolution of the first terminal in the storage space of the first terminal.
Illustratively, the screen resolution of the first terminal is 1280×1080. That is, the number of pixels included in the vertical direction of the screen of the first terminal is 1280, and the number of pixels included in the horizontal direction is 1080.
In the second mode, a screen resolution setting page is displayed on the first terminal, and the first user logged in to the first terminal sets the screen resolution of the first terminal through this page. The first terminal obtains its screen resolution based on the setting of the first user.
In one possible implementation, the terminal code rate of the first terminal can be acquired in the following two ways.
In the first acquisition mode, the terminal code rate of the first terminal is determined based on the upload speed of the first terminal.
Optionally, a second application program for determining the upload speed is installed and runs on the first terminal. The first terminal determines its upload speed by calling the second application program, and then determines the terminal code rate of the first terminal based on that upload speed.
In one possible implementation, the terminal code rate B of the first terminal is determined from the upload speed of the first terminal according to the following formula (1):
B = V * 1024        (1)
In formula (1), V is the upload speed of the first terminal.
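As an illustration, formula (1) reduces to a one-line computation. This is a minimal sketch; the publication does not state the units of V and B, so an upload speed in Mbit/s and a code rate in kbit/s are assumed here.

    def terminal_code_rate(upload_speed: float) -> float:
        # Formula (1): B = V * 1024.
        # Unit assumption (not stated in the publication): V in Mbit/s, B in kbit/s.
        return upload_speed * 1024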
In the second acquisition mode, the terminal code rate of the first terminal is determined based on the screen resolution of the first terminal.
In one possible implementation, each screen resolution corresponds to a suggested code rate. The suggested code rate corresponding to the screen resolution of the first terminal is determined based on that screen resolution, and this suggested code rate is taken as the terminal code rate of the first terminal.
Table 1 below shows the correspondence between screen resolution and suggested code rate provided in the embodiments of the application.
Table 1
Screen resolution    Suggested code rate
720*380 534
960*720 1350
1024*1080 2160
1280*1080 2700
1600*1200 3750
As shown in Table 1 above, when the screen resolution is 720*380, the corresponding suggested code rate is 534. For other screen resolutions, the corresponding suggested code rates are shown in Table 1 and are not repeated here.
It should be noted that Table 1 is merely an example of several screen resolutions and suggested code rates provided in the embodiments of the application, and is not intended to limit their number.
Illustratively, the screen resolution of the first terminal is 1280*1080, and the suggested code rate corresponding to this screen resolution is 2700; therefore, the terminal code rate of the first terminal is determined to be 2700.
Alternatively, the frame rate of the first terminal may also be obtained; the frame rate may be set by the first user logged in to the first terminal or by the manufacturer of the first terminal. The terminal code rate of the first terminal is then determined based on the screen resolution of the first terminal and the frame rate of the first terminal.
Table 2 below shows the correspondence between screen resolution, frame rate and code rate provided in the embodiments of the application. The original publication presents Table 2 as an image; the rows spelled out in the surrounding text are reproduced here.
Table 2
Screen resolution    Frame rate    Code rate
720*380              60            1283
720*380              30            641
720*380              15            321
1280*1080            60            6480
As shown in Table 2 above, when the screen resolution is 720*380 and the frame rate is 60, the corresponding code rate is 1283; when the screen resolution is 720*380 and the frame rate is 30, the corresponding code rate is 641; and when the screen resolution is 720*380 and the frame rate is 15, the corresponding code rate is 321. For other combinations of screen resolution and frame rate, the corresponding code rates are shown in Table 2 and are not repeated here.
Illustratively, the screen resolution of the first terminal is 1280*1080 and the frame rate of the first terminal is 60, so the terminal code rate of the first terminal is 6480.
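The second acquisition mode amounts to a table lookup. The sketch below fills in only the (screen resolution, frame rate) rows spelled out above; the table and function names are illustrative assumptions.

    # Rows of Table 2 that appear explicitly in the description.
    CODE_RATE_TABLE = {
        ("720*380", 60): 1283,
        ("720*380", 30): 641,
        ("720*380", 15): 321,
        ("1280*1080", 60): 6480,
    }

    def code_rate_from_table(resolution: str, frame_rate: int) -> int:
        # Look up the code rate matching the terminal's resolution and frame rate.
        return CODE_RATE_TABLE[(resolution, frame_rate)]

    assert code_rate_from_table("1280*1080", 60) == 6480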
In step 202, the first terminal encodes the multiple frames of the first live image based on the configuration information of the first terminal and a reference encoding algorithm to obtain a first multimedia resource.
In one possible implementation, after the first terminal obtains the multiple frames of the first live image and the configuration information of the first terminal, the process of encoding the multiple frames of the first live image based on the configuration information of the first terminal and the reference encoding algorithm includes: processing the image information of the multiple frames of the first live image based on the configuration information of the first terminal to obtain multiple frames of a second live image, the image information of each frame of the second live image being consistent with the configuration information of the first terminal; and encoding the multiple frames of the second live image based on the reference encoding algorithm to obtain the first multimedia resource.
Illustratively, the first multimedia resource comprises the image data of the multiple frames of the second live image. Optionally, the first multimedia resource further includes the decoding time and display time of each frame of the second live image.
Optionally, processing the image information of the multiple frames of the first live image based on the configuration information of the first terminal to obtain the multiple frames of the second live image includes: adjusting the image information of each frame of the first live image to the configuration information of the first terminal. When the configuration information of the first terminal comprises the screen resolution and terminal code rate of the first terminal and the image information comprises the image resolution and image code rate, the image resolution of the second live image is made consistent with the screen resolution of the first terminal, and the image code rate of the second live image is made consistent with the terminal code rate of the first terminal.
Optionally, the process of encoding the multiple frames of the second live image based on the reference encoding algorithm to obtain the first multimedia resource includes: encoding the multiple frames of the second live image based on the reference encoding algorithm to obtain the image data of each frame of the second live image, and obtaining the first multimedia resource based on the image data of the multiple frames of the second live image, the first multimedia resource comprising the image data of each frame of the second live image.
The reference encoding algorithm may be any type of encoding algorithm, which is not limited in the embodiments of the application. Illustratively, the reference encoding algorithm is H.264; as another example, the reference encoding algorithm is H.265.
Optionally, the image data of the first frame of the second live image is data obtained by encoding the content of the first frame in full. The image data of the second frame of the second live image is delta data obtained by encoding the difference between the image contents of the second frame and the first frame. In general, the image data of the Nth frame of the second live image is delta data obtained by encoding the difference between the image contents of the Nth frame and the (N-1)th frame, where N is an integer greater than 1.
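The frame-data layout just described can be sketched as follows: the first frame is encoded in full and every later frame stores only the encoded difference against its predecessor. The encode and diff callables stand in for the reference encoding algorithm (e.g. H.264/H.265); they are assumptions, not a real codec API.

    def encode_live_images(frames, encode, diff):
        # Frame 1: encode the full image content.
        image_data = [encode(frames[0])]
        # Frame N (N > 1): encode only the delta against frame N-1.
        for previous, current in zip(frames, frames[1:]):
            image_data.append(encode(diff(previous, current)))
        return image_data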
In step 203, the first terminal transmits a first multimedia resource to the server.
In one possible implementation, after the first terminal obtains the first multimedia resource, the first terminal sends the first multimedia resource to the server, and the server adjusts the first multimedia resource to obtain at least one second multimedia resource. The first terminal may send the first multimedia resource to the server over a reference transport protocol. The reference transport protocol may be any type of transport protocol, which is not limited in the embodiments of the application.
For example, the reference transport protocol may be the RTMP protocol, the CMAF (Common Media Application Format) protocol, the LLHLS (Low-Latency HLS) protocol or the WebRTC protocol. Of course, the reference transport protocol may also be another transport protocol, which is not limited in the embodiments of the application.
Among these, the RTMP protocol supports TLS (Transport Layer Security)/SSL (Secure Sockets Layer) encryption, and there is even a variant based on UDP (User Datagram Protocol), a real-time media streaming protocol for end-to-end connections. RTMP splits the video stream (the first multimedia resource) into segments whose size can be adjusted dynamically; within a channel, packets carrying audio and video may be interleaved and multiplexed. The RTMP protocol is a TCP-based protocol that uses persistent connections.
The CMAF protocol performs adaptive broadcast over HTTP (adapting the bit rate based on overall network bandwidth changes). In CMAF, fragmented fMP4 (a streaming format based on MPEG-4 Part 12) segments are transported over HTTP, and the same content carries two different playlists for the particular player: iOS (HLS) or Android/Microsoft (MPEG-DASH). The CMAF protocol was not designed for low-delay broadcasting, but as interest in low delay keeps growing, some manufacturers offer extensions such as Low-Latency CMAF. This extension assumes that both the first terminal and the second terminal support two methods:
1. Chunked encoding: a segment is divided into sub-segments (small fragments with moof+mdat MP4 boxes that eventually make up a whole segment suitable for playback), which are sent before the whole segment is assembled.
2. Chunked transfer encoding: the sub-segments are sent to the CDN using HTTP/1.1: only one HTTP POST request is sent every 4 seconds for the entire segment (25 frames per second), and 100 small fragments (one frame per fragment) can be sent within the same session. The player may also attempt to download an incomplete segment, in which case the CDN uses chunked transfer encoding to serve the completed portion and then keeps the connection open until new fragments are appended to the segment being downloaded. Once the entire segment has been formed on the CDN side, the transfer of the segment to the player completes.
The LLHLS protocol consists of the following parts: partial segments with a minimum duration of 200 ms are generated and can be used before the whole segment composed of these parts is complete; outdated partial segments are periodically deleted from the playlist; the server may send the updated playlist with the new segment using HTTP/2 push mode; the server is responsible for holding (blocking) a request until a version of the playlist containing the new segment is available, and blocking playlist reloads in this way eliminates polling; instead of a complete playlist, a playlist delta is sent; the server announces upcoming new partial segments (preload hints); and playlist information for adjacent profiles is loaded simultaneously to speed up switching.
The WebRTC protocol is a set of standards, protocols and JavaScript (a programming language) programming interfaces that implement end-to-end encryption within a point-to-point connection through DTLS (Datagram Transport Layer Security) and SRTP (Secure Real-time Transport Protocol). The technology uses no third-party plug-ins or software, and it passes through firewalls without loss of quality or added delay. Video transmission is typically accomplished using UDP-based WebRTC. The uplink data channel handles multiplexing, sending, congestion control and reliable transmission through SCTP (for application data) and the SRTP protocol; DTLS is used for the handshake exchange and further traffic encryption.
In step 204, the server receives a first multimedia resource sent by the first terminal.
Optionally, the server receives the first multimedia resource sent by the first terminal; that is, the server acquires the first multimedia resource, the first multimedia resource having been obtained by the first terminal encoding the captured first live images according to the configuration information of the first terminal and the reference encoding algorithm.
In step 205, the server obtains configuration information of at least one second terminal in the same live room as the first terminal.
In one possible implementation, the first terminal is the terminal of a first user (e.g., an anchor user) and the second terminal is the terminal of a second user (e.g., a viewer user). The first application program for acquiring multimedia resources is installed and runs on the second terminal. In response to a selection instruction of the second user for the first application program, a second display page of the first application program is displayed. A plurality of live rooms are displayed in the second display page; each live room has one anchor user and at least one viewer user, and the live rooms are virtual rooms. In response to the second user's selection instruction for any live room, the second user enters the selected live room; that is, the second terminal is in the same live room as the first terminal.
Optionally, the server determines the live room in which the first terminal is located, obtains the terminal identifiers of all terminals in that live room except the first terminal, takes all those terminals as second terminals, and then obtains the configuration information of each second terminal.
Optionally, the configuration information of the second terminal includes at least one of the screen resolution of the second terminal and the terminal code rate. The ways in which the server obtains the screen resolution of the second terminal include, but are not limited to, the following two.
The first acquisition mode is that the server acquires the screen resolution of each second terminal based on the terminal identification of the second terminal in the same live broadcast room as the first terminal.
Optionally, the server stores a correspondence between the terminal identifiers and the screen resolutions, and after the server obtains the terminal identifiers of the second terminals, the server determines the screen resolutions of the second terminals based on the terminal identifiers of the second terminals and the correspondence between the terminal identifiers and the screen resolutions.
And the second acquisition mode is that the server sends a screen resolution acquisition request to a second terminal in the same live broadcast room as the first terminal, wherein the screen resolution acquisition request is used for acquiring the screen resolution of the second terminal. The server receives the screen resolution returned by each second terminal, that is, the server acquires the screen resolution of the second terminal in the same live room as the first terminal.
It should be noted that any of the above-mentioned acquiring methods may be selected to acquire the screen resolution of the second terminal, and other methods may also be selected to acquire the screen resolution of the second terminal, which is not limited in this embodiment of the present application.
Optionally, the ways of acquiring the terminal code rate of the second terminal include, but are not limited to, the following two.
In the first mode, the server obtains the terminal code rate of the second terminal based on the screen resolution of the second terminal.
The server determines the suggested code rate corresponding to the screen resolution of the second terminal based on the screen resolution of the second terminal and the corresponding relation between the screen resolution and the suggested code rate, and takes the suggested code rate corresponding to the screen resolution of the second terminal as the terminal code rate of the second terminal.
When obtaining the terminal code rate of the second terminal, the server may also consider the frame rate of the second terminal, which may be set by the second user logged in to the second terminal or by the manufacturer of the second terminal. The server determines the terminal code rate of the second terminal based on the frame rate of the second terminal and the screen resolution of the second terminal.
Optionally, the server stores a correspondence relationship among screen resolution, frame rate, and code rate. The server determines a terminal code rate of the second terminal based on the screen resolution and the frame rate of the second terminal and a correspondence of the screen resolution, the frame rate and the code rate.
And in a second mode, the server sends a code rate acquisition request to the second terminals in the same live broadcast room as the first terminal, wherein the code rate acquisition request is used for acquiring the terminal code rate of each second terminal. And the server receives the code rate returned by each second terminal, thereby obtaining the terminal code rate of each second terminal.
It should be noted that any of the above modes may be selected to obtain the terminal code rate of the second terminal, and other modes may also be selected to obtain the terminal code rate of the second terminal, which is not limited in this embodiment of the present application.
After the server obtains the configuration information of each second terminal, the terminal identifier of each second terminal may be stored in correspondence with the configuration information of that second terminal.
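This bookkeeping amounts to a mapping from terminal identifier to configuration information, which is looked up again in step 207. The following is a minimal sketch under that reading; all names are illustrative assumptions.

    # Terminal identifier -> configuration information of that second terminal.
    terminal_configs: dict[str, dict] = {}

    def store_config(terminal_id: str, resolution: str, code_rate: int) -> None:
        terminal_configs[terminal_id] = {
            "resolution": resolution,
            "code_rate": code_rate,
        }

    def config_for(terminal_id: str) -> dict:
        # Retrieved later (step 207) using the terminal identifier as the key.
        return terminal_configs[terminal_id]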
In step 206, the server adjusts the first multimedia resource based on the configuration information of each second terminal, to obtain at least one second multimedia resource.
In one possible implementation, the server adjusting the first multimedia resource based on the configuration information of each second terminal to obtain at least one second multimedia resource is the process by which the server transcodes the first multimedia resource.
The server decodes the first multimedia resource based on a reference decoding algorithm to obtain the multiple frames of the second live image, adjusts the image information of the multiple frames of the second live image based on the configuration information of the second terminal to obtain multiple frames of a third live image, and encodes the multiple frames of the third live image based on the reference encoding algorithm to obtain a second multimedia resource comprising the image data of the multiple frames of the third live image. The reference decoding algorithm is the decoding algorithm corresponding to the reference encoding algorithm.
The image information includes at least one of the image resolution and the image code rate, and the configuration information of the second terminal includes at least one of the screen resolution and the terminal code rate. In the embodiments of the application, the process of adjusting the image information of the multiple frames of the second live image based on the configuration information of the second terminal to obtain the multiple frames of the third live image is described only for the case where the image information includes the image resolution and the image code rate and the configuration information of the second terminal includes the screen resolution and the terminal code rate.
The acquisition process of the multiple frames of the third live image comprises: adjusting the image resolution of the multiple frames of the second live image based on the screen resolution of the second terminal to obtain multiple frames of a fourth live image, the image resolution of the fourth live image being consistent with the screen resolution of the second terminal; and adjusting the image code rate of the fourth live images based on the terminal code rate of the second terminal to obtain the multiple frames of the third live image, the image code rate of the third live image being consistent with the terminal code rate of the second terminal.
The acquisition process of the multiple frames of the third live image may instead comprise: the server adjusts the code rate of the multiple frames of the second live image based on the terminal code rate of the second terminal to obtain multiple frames of a fifth live image, and then adjusts the image resolution of the multiple frames of the fifth live image based on the screen resolution of the second terminal to obtain the multiple frames of the third live image. The embodiments of the application do not limit the order in which the image resolution and the image code rate are adjusted.
It should be noted that the process by which the server encodes the multiple frames of the third live image based on the reference encoding algorithm to obtain the second multimedia resource is similar to the process by which the first terminal encodes the multiple frames of the second live image based on the reference encoding algorithm to obtain the first multimedia resource, and is not repeated here.
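Taken together, the server side of step 206 is a decode, adjust, re-encode pipeline. In the sketch below the decode, scale, rate_control and encode callables stand in for the reference decoding/encoding algorithms and the image-information adjustments; all are assumptions rather than a concrete codec API, and as noted above the two adjustments may run in either order.

    def transcode(first_resource, config, decode, scale, rate_control, encode):
        # Decode the first multimedia resource back into second live images.
        second_images = decode(first_resource)
        third_images = []
        for image in second_images:
            # Match the image resolution to the second terminal's screen resolution.
            image = scale(image, config["resolution"])
            # Match the image code rate to the second terminal's terminal code rate.
            image = rate_control(image, config["code_rate"])
            third_images.append(image)
        # Re-encode the third live images into a second multimedia resource.
        return encode(third_images)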
After the server acquires the at least one second multimedia resource, the at least one second multimedia resource and the image information of the live images included in each second multimedia resource are stored in correspondence in the storage space of the server, so that the second multimedia resources can later be sent to second terminals.
After the server receives a new first multimedia resource, the server regenerates at least one new second multimedia resource based on the new first multimedia resource. When the server stores the at least one new second multimedia resource, the previously stored at least one second multimedia resource may be deleted, with the new second multimedia resources stored in the storage locations of the previous ones; of course, the server may also store the at least one new second multimedia resource in other storage locations.
In step 207, the server determines, among the at least one second multimedia resource, a target multimedia resource matching the configuration information of a target terminal, the target terminal being any one of the at least one second terminal.
In one possible implementation, the resource processing method provided by the embodiments of the application is suitable for content distribution scenarios, for example the server distributing the acquired at least one second multimedia resource.
Optionally, after the target terminal enters the live room of the first terminal, the target terminal sends a multimedia resource distribution request to the server; the multimedia resource distribution request carries the terminal identifier of the target terminal and is used to acquire a multimedia resource of the first terminal. The target terminal is any one of the at least one second terminal in the same live room as the first terminal. The terminal identifier of the target terminal may be any identifier capable of uniquely identifying a terminal, which is not limited in the embodiments of the application.
After receiving the multimedia resource distribution request sent by the target terminal, the server parses the request to obtain the terminal identifier of the target terminal. Since in step 205 the server stored terminal identifiers in correspondence with configuration information, the server obtains the configuration information of the target terminal based on the terminal identifier of the target terminal and that correspondence.
In one possible implementation, after the target terminal enters the live room of the first terminal, the server actively acquires the configuration information of the target terminal without the target terminal sending a multimedia resource distribution request to the server, and then determines, among the at least one second multimedia resource, the target multimedia resource matching the configuration information of the target terminal.
The image information of the live images included in each of the at least one second multimedia resource generated by the server is different. After the server obtains the configuration information of the target terminal, the target multimedia resource is determined among the at least one second multimedia resource based on the configuration information of the target terminal and the image information of the live images included in each second multimedia resource.
The target multimedia resource matched with the configuration information of the target terminal means that the image information of the live image included in the target multimedia resource is consistent with the configuration information of the target terminal.
Illustratively, the server generates three second multimedia resources: multimedia resource 1, multimedia resource 2 and multimedia resource 3. The image information of the live images included in each second multimedia resource is as follows: for multimedia resource 1, an image resolution of 720*380 and an image code rate of 534; for multimedia resource 2, an image resolution of 960*720 and an image code rate of 1350; and for multimedia resource 3, an image resolution of 1280*1080 and an image code rate of 6480. The configuration information of the target terminal is a screen resolution of 960*720 and a terminal code rate of 1350. Since the image information of the live images included in multimedia resource 2 coincides with the configuration information of the target terminal, multimedia resource 2 is taken as the target multimedia resource matching the configuration information of the target terminal.
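The matching rule of step 207 can be sketched with the numbers from this example; the dictionary layout and resource names are illustrative assumptions.

    second_resources = {
        "multimedia resource 1": {"resolution": "720*380", "code_rate": 534},
        "multimedia resource 2": {"resolution": "960*720", "code_rate": 1350},
        "multimedia resource 3": {"resolution": "1280*1080", "code_rate": 6480},
    }
    target_config = {"resolution": "960*720", "code_rate": 1350}

    # The target multimedia resource is the one whose live-image information
    # coincides with the target terminal's configuration information.
    target = next(
        name for name, info in second_resources.items() if info == target_config
    )
    assert target == "multimedia resource 2"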
In step 208, the server transmits the target multimedia asset to the target terminal.
In one possible implementation, after determining the target multimedia resource, the server sends the target multimedia resource directly to the target terminal. The server may send the target multimedia resource to the target terminal over the reference transport protocol.
Optionally, the server comprises a plurality of nodes, and after the server acquires the at least one second multimedia resource, the server may synchronize the at least one second multimedia resource to each node included in the server. The server determines a target node among the plurality of nodes based on the node state and node information of each node. Because the server has synchronized the at least one second multimedia resource to every node included in the server, the target node holds the target multimedia resource. The server sends a resource forwarding request to the target node, the resource forwarding request carrying the resource identifier of the target multimedia resource and the terminal identifier of the target terminal, and the target multimedia resource is sent to the target terminal through the target node.
In another case, the server does not synchronize the at least one second multimedia resource to each node included in the server after acquiring it; that is, the target node does not hold the target multimedia resource. After determining the target node, the server sends the target multimedia resource and the terminal identifier of the target terminal to the target node, so that the target node obtains the target multimedia resource and sends it to the target terminal.
The node information comprises the node load rate, which indicates the ratio of the number of tasks the node currently needs to transmit to the total number of tasks the node can transmit. The higher the node load rate, the more tasks the node currently needs to transmit; the lower the node load rate, the fewer. The server takes the nodes among the plurality of nodes whose node state meets the state requirement as candidate nodes, and the target node is determined among the candidate nodes based on the node load rate of each candidate node. A node whose state meets the state requirement is a node in the working state; a node in the working state is a normal node that can perform resource transmission, while a node not in the working state is an abnormal node that cannot.
In one possible implementation, the server randomly selects one of the candidate nodes as the target node. Alternatively, the server takes a candidate node whose node load rate satisfies the load requirement as the target node; illustratively, the candidate node with the lowest node load rate.
Optionally, the node information further comprises a response time of the node. After determining the candidate nodes among the plurality of nodes, the server may also determine response times for each candidate node. A target node is determined among the candidate nodes based on the node load rates and response times of the respective candidate nodes.
Determining the response time of each candidate node comprises: determining the distance corresponding to each candidate node based on the position of the target terminal and the position of each candidate node, and then determining the response time of each candidate node based on the corresponding distance. Optionally, each node has a response speed, which may be the same or different across nodes; the response time of a candidate node is the quotient of the distance corresponding to that node and its response speed, and the response time of every candidate node is obtained in the same way.
In one possible implementation, there are three implementations that determine a target node among candidate nodes based on the node load rates of the respective candidate nodes and the response times of the respective candidate nodes.
In one implementation, a first reference node is determined among candidate nodes based on node load rates of the candidate nodes. A target node is determined in the first reference node based on the response time of the first reference node.
Optionally, candidate nodes whose node load rate is smaller than a load rate threshold are taken as first reference nodes, and the first reference node with the shortest response time is taken as the target node. The load rate threshold is set empirically and may be adjusted according to the implementation environment, which is not limited in the embodiments of the present application; illustratively, the load rate threshold is 0.8.
Implementation two, based on response time of each candidate node, determining a second reference node in the candidate nodes. The target node is determined in the second reference node based on the node load rate of the second reference node.
In one possible implementation, a candidate node whose response time is less than the time threshold is taken as the second reference node. And taking the node with the lowest node load rate in the second reference nodes as a target node. The time threshold is set based on experience, and may be adjusted according to the implementation environment, which is not limited in the embodiment of the present application. The time threshold is, for example, 10 milliseconds.
In the third implementation manner, based on the node load rate of each candidate node and the response time of each candidate node, determining an index value of each candidate node, and taking the candidate node with the index value meeting the index requirement as a target node.
In one possible implementation, the index value of each candidate node is determined according to the following formula (2) based on the node load rate of each candidate node and the response time of each candidate node.
P_i = S_i * α + T_i * β    (2)

In formula (2), P_i denotes the index value of the i-th candidate node, S_i denotes the node load rate of the i-th candidate node, T_i denotes the response time of the i-th candidate node, α denotes the load rate weight parameter, and β denotes the time weight parameter.
The load rate weight parameter and the time weight parameter are set empirically and may be adjusted according to the implementation environment.
After the index value of each candidate node is determined, the candidate node with the lowest index value is taken as the target node, and the target node then sends the target multimedia resource to the target terminal (see the sketch after this paragraph). After acquiring the target multimedia resource, the target terminal decodes it according to the reference decoding algorithm to obtain multi-frame third live images together with the display time of each frame, and displays each third live image according to its display time.
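The following is a minimal C++ sketch of implementation three: it keeps only candidate nodes in the working state, derives each candidate's response time as the quotient of its distance to the target terminal and its response speed, computes the index value of formula (2), and returns the candidate with the lowest index. The Node structure, coordinate model, and field names are assumptions for illustration; the patent does not prescribe them.

#include <cmath>
#include <string>
#include <vector>

struct Node {
  std::string id;
  bool working = false;     // node state: true means the node can transmit resources
  double load_rate = 0.0;   // tasks currently transmitted / total transmittable tasks
  double x = 0.0, y = 0.0;  // node position in abstract coordinates
  double speed = 1.0;       // response speed, distance units per millisecond
};

// Response time of a node = distance to the target terminal / response speed.
double ResponseTimeMs(const Node& n, double terminal_x, double terminal_y) {
  return std::hypot(n.x - terminal_x, n.y - terminal_y) / n.speed;
}

// Implementation three: index value P_i = S_i * alpha + T_i * beta, lowest wins.
const Node* PickTargetNode(const std::vector<Node>& nodes,
                           double terminal_x, double terminal_y,
                           double alpha, double beta) {
  const Node* best = nullptr;
  double best_index = 0.0;
  for (const Node& n : nodes) {
    if (!n.working) continue;  // state requirement: keep candidate nodes only
    const double p = n.load_rate * alpha +
                     ResponseTimeMs(n, terminal_x, terminal_y) * beta;
    if (best == nullptr || p < best_index) {
      best = &n;
      best_index = p;
    }
  }
  return best;  // nullptr when no candidate node is in the working state
}

Implementations one and two follow the same shape but first threshold on one quantity (for example, load rate below 0.8 or response time below 10 milliseconds) and then pick within the surviving reference nodes by the other.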
The SDP (Session Description Protocol) negotiation performed when the target terminal plays the target multimedia resource remains consistent with the standard WebRTC simulcast scheme, which is fully compatible with standard browsers, JavaScript clients, and second terminals of various terminal types.
In SDP negotiation, the simulcast interface appears on the video media line as an a=ssrc-group:SIM field of the form a=ssrc-group:SIM stream0 stream1 stream2. Here {stream0, stream1, stream2, ...} are the simulcast layers (video streams); the sequence usually contains at most 3 entries and is ordered from small to large resolution. Illustratively, if the resolution of stream0 is w0xh0, that of stream1 is w1xh1, and that of stream2 is w2xh2, then w0xh0 < w1xh1 < w2xh2.
An example of a simulcast SDP negotiation is as follows:

a=ssrc-group:SIM 3462331267 49866344      // video streams 3462331267 and 49866344 form a SIM (simulcast) group
a=ssrc-group:FID 3462331267 1502500952    // video streams 3462331267 and 1502500952 form an FID group
a=ssrc-group:FID 49866344 241640858       // video streams 49866344 and 241640858 form an FID group
a=ssrc:3462331267 cname:m+kwZezC1JiVXDIB  // the canonical name identifier of video stream 3462331267 is m+kwZezC1JiVXDIB
a=ssrc:49866344 cname:m+kwZezC1JiVXDIB    // the canonical name identifier of video stream 49866344 is m+kwZezC1JiVXDIB
a=ssrc:1502500952 cname:m+kwZezC1JiVXDIB  // the canonical name identifier of video stream 1502500952 is m+kwZezC1JiVXDIB
a=ssrc:241640858 cname:m+kwZezC1JiVXDIB   // the canonical name identifier of video stream 241640858 is m+kwZezC1JiVXDIB
RTCP (Real-time Transport Control Protocol) provides a globally unique canonical name identifier (CNAME) for each video stream; the receiver uses the CNAME to track an RTP (Real-time Transport Protocol) stream.
The line a=ssrc-group:FID 3462331267 1502500952 is typically used to associate a regular RTP stream with its retransmission RTP stream. The line a=ssrc-group:SIM 3462331267 49866344 associates the media streams of one simulcast group, which are encoded at qualities ordered from low to high by resolution.
The following is an example of the source code configuration that governs simulcast layer number changes; each entry is {width, height, maximum layers, maximum code rate, start code rate, minimum code rate}:

{1920, 1080, 3, 5000, 4000, 800}  // 1920x1080: at most 3 simulcast layers, max 5000 Kbps, start 4000 Kbps, min 800 Kbps
{1280, 720, 3, 2500, 2500, 600}   // 1280x720: at most 3 layers, max 2500 Kbps, start 2500 Kbps, min 600 Kbps
{960, 540, 3, 1200, 1200, 350}    // 960x540: at most 3 layers, max 1200 Kbps, start 1200 Kbps, min 350 Kbps
{640, 360, 2, 700, 500, 150}      // 640x360: at most 2 layers, max 700 Kbps, start 500 Kbps, min 150 Kbps
{480, 270, 2, 450, 350, 150}      // 480x270: at most 2 layers, max 450 Kbps, start 350 Kbps, min 150 Kbps
{320, 180, 1, 200, 150, 30}       // 320x180: at most 1 layer, max 200 Kbps, start 150 Kbps, min 30 Kbps
{0, 0, 1, 200, 150, 30}           // 0x0 (fallback): at most 1 layer, max 200 Kbps, start 150 Kbps, min 30 Kbps
If the resolution of the captured video frames sent to the encoder changes, WebRTC triggers a Reconfigure Encoder operation and recalculates the simulcast layer number. For a 1920x1080 capture resolution the maximum allowed layer number is 3, while for 640x360 it is 2, so the simulcast layer number changes when the captured video resolution drops from 1920x1080 to 640x360.
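As a sketch, the configuration above can be modeled as a lookup table consulted on every Reconfigure Encoder: the first row whose resolution does not exceed the capture resolution supplies the allowed layer count. This mirrors the structure of the example configuration; it is an illustration rather than WebRTC's exact source code.

#include <array>

struct SimulcastFormat {
  int width, height, max_layers;
  int max_kbps, start_kbps, min_kbps;
};

// Values copied from the configuration listed above.
constexpr std::array<SimulcastFormat, 7> kFormats{{
    {1920, 1080, 3, 5000, 4000, 800},
    {1280, 720, 3, 2500, 2500, 600},
    {960, 540, 3, 1200, 1200, 350},
    {640, 360, 2, 700, 500, 150},
    {480, 270, 2, 450, 350, 150},
    {320, 180, 1, 200, 150, 30},
    {0, 0, 1, 200, 150, 30},
}};

// Returns the maximum allowed simulcast layer number for a capture resolution:
// the first row not larger than the capture resolution applies, so 1920x1080
// yields 3 layers and 640x360 yields 2, matching the example above.
int MaxSimulcastLayers(int width, int height) {
  for (const auto& f : kFormats) {
    if (width >= f.width && height >= f.height) {
      return f.max_layers;
    }
  }
  return 1;  // unreachable given the 0x0 fallback row
}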
The server used in the above resource processing method may be a CDN (Content Delivery Network), an SFU (Selective Forwarding Unit), or another type of server, which is not limited in the embodiments of the present application. The processing capacity of the server can be scaled out almost without limit; image and audio quality enhancement, super-resolution, frame interpolation, support for additional coding algorithms, and the like can be applied flexibly to the first multimedia resource by adding Filters; the method is not constrained by the uplink network bandwidth or real-time encoding capability of the first terminal; and the resulting second multimedia resources can flexibly adapt to and remain compatible with the configuration information of the second terminals.
The method requires the first terminal to generate only one first multimedia resource, so it places low demands on the first terminal's real-time encoding capability and uplink bandwidth and reduces the load on the first terminal. In addition, the server adjusts the first multimedia resource based on the configuration information of at least one second terminal in the same live room as the first terminal to obtain at least one second multimedia resource, and the resulting second multimedia resources meet the requirements of the second terminals.

In addition, when the server adjusts the first multimedia resource, it adjusts only the image information of the included live images and not their display times, so the display time of each live image in a second multimedia resource is consistent with that in the first multimedia resource, and the second terminal does not experience delay, stuttering, or similar problems when playing the live images included in the second multimedia resource.
Fig. 3 is a flowchart illustrating a resource processing method according to an embodiment of the present application, which may be performed by the server 102 in fig. 1. As shown in fig. 3, the method comprises the steps of:
In step 301, a first multimedia resource is acquired, where the first multimedia resource is obtained by encoding, by a first terminal, an acquired multi-frame first direct broadcast image according to configuration information and a reference encoding algorithm of the first terminal.
In one possible implementation manner, the process of obtaining the first multimedia resource is consistent with the process in step 204, which is not described herein.
In step 302, configuration information of at least one second terminal in the same live room as the first terminal is obtained.
In a possible implementation manner, the process of obtaining the configuration information of at least one second terminal in the same live room as the first terminal is similar to the process of step 205, which is not described herein.
In step 303, the first multimedia resource is adjusted based on the configuration information of each second terminal, so as to obtain at least one second multimedia resource.
In a possible implementation manner, the process of obtaining at least one second multimedia resource is consistent with the process of step 206, which is not described herein.
The method requires the first terminal to generate only one first multimedia resource, so it places low demands on the first terminal's real-time encoding capability and uplink bandwidth and reduces the load on the first terminal. In addition, the server adjusts the first multimedia resource based on the configuration information of at least one second terminal in the same live room as the first terminal to obtain at least one second multimedia resource, and the resulting second multimedia resources meet the requirements of the second terminals.
Fig. 4 is a flowchart of a resource processing method according to an embodiment of the present application, where the method may be performed by the first terminal 101 in fig. 1. As shown in fig. 4, the method comprises the steps of:
in step 401, a plurality of frames of first direct broadcast images and configuration information of a first terminal are acquired.
In a possible implementation manner, the process of acquiring the multi-frame first direct broadcast image and the configuration information of the first terminal is consistent with the process of step 201, which is not described herein.
In step 402, a multi-frame first direct broadcast image is encoded based on configuration information of a first terminal and a reference encoding algorithm to obtain a first multimedia resource.
In one possible implementation manner, the process of obtaining the first multimedia resource is consistent with the process of step 202, which is not described herein.
In step 403, the first multimedia resource is sent to a server, and the server is configured to adjust the first multimedia resource to obtain at least one second multimedia resource.
In a possible implementation manner, the process of sending the first multimedia resource to the server is consistent with the process of step 203, which is not described herein.
The method requires the first terminal to generate only one first multimedia resource, so it places low demands on the first terminal's real-time encoding capability and uplink bandwidth and reduces the load on the first terminal.
Fig. 5 is a flowchart of a resource processing method according to an embodiment of the present application. In fig. 5, the first terminal and three second terminals are in the same live room. The first terminal obtains a first multimedia resource whose live images have the following image information: an image resolution of 1080P, a frame rate of 60 FPS, and a code rate of 10 Mbps. The first terminal sends the first multimedia resource to the server.
The server receives the first multimedia resource and adjusts it to obtain three second multimedia resources, the image information of the live image included in each second multimedia resource corresponding to the configuration information of one second terminal. The live image included in the first second multimedia resource has an image resolution of 1080P, a frame rate of 60 FPS, and a code rate of 4 Mbps; that of the second second multimedia resource has an image resolution of 720P, a frame rate of 30 FPS, and a code rate of 2 Mbps; and that of the third second multimedia resource has an image resolution of 480P, a frame rate of 15 FPS, and a code rate of 800 Kbps.
The server sends a first second multimedia resource matched with the first second terminal to the first second terminal, sends a second multimedia resource matched with the second terminal to the second terminal, and sends a third second multimedia resource matched with the third second terminal to the third second terminal. When the server sends the second multimedia resources matched with each second terminal to each second terminal, the server can also determine the target node matched with each second terminal, and send the second multimedia resources matched with each second terminal to each second terminal through the target node matched with each second terminal.
Fig. 6 is a schematic structural diagram of a resource processing device according to an embodiment of the present application, where, as shown in fig. 6, the device includes:
the acquiring module 601 is configured to acquire a first multimedia resource, where the first multimedia resource is obtained by encoding, by a first terminal, an acquired multi-frame first direct broadcast image according to configuration information and a reference encoding algorithm of the first terminal;
the acquiring module 601 is further configured to acquire configuration information of at least one second terminal in the same live room as the first terminal;
And the adjustment module 602 is configured to adjust the first multimedia resource based on the configuration information of each second terminal, so as to obtain at least one second multimedia resource.
In a possible implementation manner, the adjustment module 602 is configured to decode the first multimedia resource based on a reference decoding algorithm to obtain a multi-frame second live image; based on the configuration information of the second terminal, adjusting the image information of the multi-frame second live image to obtain a multi-frame third live image; and encoding the multi-frame third live image based on a reference encoding algorithm to obtain a second multimedia resource, wherein the second multimedia resource comprises image data of the multi-frame third live image.
In one possible implementation, the configuration information of the second terminal includes a screen resolution and a terminal code rate, and the image information includes an image resolution and an image code rate;
the adjusting module 602 is configured to adjust an image resolution of the multiple frames of the second live image based on a screen resolution of the second terminal, to obtain multiple frames of a fourth live image; and adjusting the image code rate of the fourth live images based on the terminal code rate of the second terminal to obtain a third live image.
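The adjustment pipeline implemented by the adjustment module 602 (decode with the reference decoding algorithm, scale to the screen resolution, re-encode at the terminal code rate, preserving each frame's display time) can be sketched as follows. Frame, Decode, Scale, and Encode are hypothetical placeholders standing in for a real codec; they are not an API defined by the patent.

#include <utility>
#include <vector>

struct Frame {
  int width = 0, height = 0;
  long long display_time_ms = 0;  // preserved through the whole pipeline
};

// Placeholder stages for the reference decode / scale / re-encode steps.
Frame Decode(const Frame& f) { return f; }  // first resource -> second live image
Frame Scale(Frame f, int w, int h) { f.width = w; f.height = h; return f; }  // -> fourth live image
std::vector<Frame> Encode(std::vector<Frame> frames, int terminal_kbps) {
  (void)terminal_kbps;  // a real encoder would target this code rate
  return frames;        // -> third live images / second multimedia resource
}

std::vector<Frame> AdjustResource(const std::vector<Frame>& first_resource,
                                  int screen_w, int screen_h, int terminal_kbps) {
  std::vector<Frame> scaled;
  scaled.reserve(first_resource.size());
  for (const Frame& f : first_resource) {
    scaled.push_back(Scale(Decode(f), screen_w, screen_h));  // display time untouched
  }
  return Encode(std::move(scaled), terminal_kbps);
}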
In a possible implementation manner, the obtaining module 601 is further configured to obtain a terminal code rate of the second terminal based on a screen resolution of the second terminal; or sending a code rate acquisition request to the second terminal, wherein the code rate acquisition request is used for acquiring the terminal code rate of the second terminal, and receiving the terminal code rate of the second terminal returned by the second terminal based on the acquisition request.
In a possible implementation manner, the obtaining module 601 is configured to obtain a frame rate of the second terminal; and determining the terminal code rate of the second terminal based on the frame rate of the second terminal and the screen resolution of the second terminal.
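The patent leaves the mapping from frame rate and screen resolution to terminal code rate unspecified. One common heuristic, shown here purely as an assumption, budgets a fixed number of bits per pixel per frame:

// Hypothetical heuristic, not the patent's method:
// bitrate_kbps = width * height * fps * bits_per_pixel / 1000.
int EstimateTerminalBitrateKbps(int width, int height, int fps) {
  constexpr double kBitsPerPixel = 0.07;  // assumed tuning constant
  return static_cast<int>(width * height * static_cast<double>(fps) *
                          kBitsPerPixel / 1000.0);
}

For example, 1280x720 at 30 FPS with 0.07 bits per pixel yields roughly 1900 Kbps, on the order of the 2 Mbps used for the 720P stream in the fig. 5 example.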
In one possible implementation, the apparatus further includes:
a determining module, configured to determine, from at least one second multimedia resource, a target multimedia resource that matches configuration information of a target terminal, where the target terminal is any one of the at least one second terminal;
and the sending module is used for sending the target multimedia resource to the target terminal.
In one possible implementation, the server includes a plurality of nodes, and the sending module is configured to determine a target node among the plurality of nodes based on the node state and node information of each node; and, in response to the target node including the target multimedia resource, to send a resource forwarding request to the target node, the request carrying the resource identifier of the target multimedia resource and the terminal identifier of the target terminal, so that the target multimedia resource is sent to the target terminal through the target node.
In one possible implementation manner, the determining module is configured to use a node, which is among the plurality of nodes and has a node state meeting a state requirement, as a candidate node; the target node is determined among the candidate nodes based on the node load rates of the respective candidate nodes.
In a possible implementation manner, the determining module is further configured to determine a distance corresponding to each candidate node based on the location of the target terminal and the location of each candidate node; determining response time of each candidate node based on the distance corresponding to each candidate node;
and the determining module is used for determining the target node in the candidate nodes based on the node load rates of the candidate nodes and the response time of the candidate nodes.
The device requires the first terminal to generate only one first multimedia resource, so it places low demands on the first terminal's real-time encoding capability and uplink bandwidth and reduces the load on the first terminal. In addition, the server adjusts the first multimedia resource based on the configuration information of at least one second terminal in the same live room as the first terminal to obtain at least one second multimedia resource, and the resulting second multimedia resources meet the requirements of the second terminals.
Fig. 7 is a schematic structural diagram of a resource processing device according to an embodiment of the present application, where, as shown in fig. 7, the device includes:
an acquiring module 701, configured to acquire a plurality of frames of first direct broadcast images and configuration information of a first terminal;
the encoding module 702 is configured to encode a plurality of frames of the first direct broadcast image based on configuration information of the first terminal and a reference encoding algorithm, so as to obtain a first multimedia resource;
the sending module 703 is configured to send the first multimedia resource to a server, where the server is configured to adjust the first multimedia resource to obtain at least one second multimedia resource.
In a possible implementation manner, the encoding module 702 is configured to adjust image information of the multiple frames of the first live image based on configuration information of the first terminal to obtain multiple frames of the second live image, where the image information includes at least one of an image resolution and an image code rate; and encoding the multi-frame second live image based on a reference encoding algorithm to obtain a first multimedia resource, wherein the first multimedia resource comprises image data of the multi-frame second live image.
The device requires the first terminal to generate only one first multimedia resource, so it places low demands on the first terminal's real-time encoding capability and uplink bandwidth and reduces the load on the first terminal.
It should be understood that, in implementing the functions of the apparatus provided above, only the division of the above functional modules is illustrated, and in practical application, the above functional allocation may be implemented by different functional modules, that is, the internal structure of the device is divided into different functional modules, so as to implement all or part of the functions described above. In addition, the apparatus and the method embodiments provided in the foregoing embodiments belong to the same concept, and specific implementation processes of the apparatus and the method embodiments are detailed in the method embodiments and are not repeated herein.
Fig. 8 shows a block diagram of a terminal 800 according to an exemplary embodiment of the present application. The terminal 800 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The terminal 800 may also be called by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
In general, the terminal 800 includes: a processor 801 and a memory 802.
The processor 801 may include one or more processing cores, such as a 4-core or 8-core processor. The processor 801 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 801 may also include a main processor and a coprocessor: the main processor, also called a CPU (Central Processing Unit), processes data in the awake state, while the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 801 may integrate a GPU (Graphics Processing Unit) responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 801 may also include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
Memory 802 may include one or more computer-readable storage media, which may be non-transitory. Memory 802 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 802 is used to store at least one instruction for execution by processor 801 to implement the resource processing method provided by the method embodiment shown in fig. 4 in the present application.
In some embodiments, the terminal 800 may further optionally include: a peripheral interface 803, and at least one peripheral. The processor 801, the memory 802, and the peripheral interface 803 may be connected by a bus or signal line. Individual peripheral devices may be connected to the peripheral device interface 803 by buses, signal lines, or a circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 804, a display 805, a camera assembly 806, audio circuitry 807, a positioning assembly 808, and a power supply 809.
Peripheral interface 803 may be used to connect at least one Input/Output (I/O) related peripheral to processor 801 and memory 802. In some embodiments, processor 801, memory 802, and peripheral interface 803 are integrated on the same chip or circuit board; in some other embodiments, either or both of the processor 801, the memory 802, and the peripheral interface 803 may be implemented on separate chips or circuit boards, which is not limited in this embodiment.
The radio frequency circuit 804 is configured to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 804 communicates with communication networks and other communication devices via electromagnetic signals, converting electrical signals into electromagnetic signals for transmission and received electromagnetic signals back into electrical signals. Optionally, the radio frequency circuit 804 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 804 may communicate with other terminals via at least one wireless communication protocol, including but not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of each generation (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 804 may also include NFC (Near Field Communication) related circuitry, which is not limited in this application.
The display 805 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display 805 is a touch display, the display 805 also has the ability to collect touch signals at or above the surface of the display 805. The touch signal may be input as a control signal to the processor 801 for processing. At this time, the display 805 may also be used to provide virtual buttons and/or virtual keyboards, also referred to as soft buttons and/or soft keyboards. In some embodiments, the display 805 may be one and disposed on a front panel of the terminal 800; in other embodiments, the display 805 may be at least two, respectively disposed on different surfaces of the terminal 800 or in a folded design; in other embodiments, the display 805 may be a flexible display disposed on a curved surface or a folded surface of the terminal 800. Even more, the display 805 may be arranged in an irregular pattern other than rectangular, i.e., a shaped screen. The display 805 may be made of LCD (Liquid Crystal Display ), OLED (Organic Light-Emitting Diode) or other materials.
The camera assembly 806 is used to capture images or video. Optionally, the camera assembly 806 includes a front camera and a rear camera; typically, the front camera is disposed on the front panel of the terminal 800 and the rear camera on its rear surface. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth camera can be fused for a background blurring function, or the main camera and the wide-angle camera can be fused for panoramic shooting, VR (Virtual Reality) shooting, or other fused shooting functions. In some embodiments, the camera assembly 806 may also include a flash, which may be a single-color-temperature or dual-color-temperature flash; a dual-color-temperature flash combines a warm-light flash and a cold-light flash and can be used for light compensation under different color temperatures.
Audio circuitry 807 may include a microphone and a speaker. The microphone is used for collecting sound waves of users and the environment, converting the sound waves into electric signals, inputting the electric signals to the processor 801 for processing, or inputting the electric signals to the radio frequency circuit 804 for voice communication. For stereo acquisition or noise reduction purposes, a plurality of microphones may be respectively disposed at different portions of the terminal 800. The microphone may also be an array microphone or an omni-directional pickup microphone. The speaker is used to convert electrical signals from the processor 801 or the radio frequency circuit 804 into sound waves. The speaker may be a conventional thin film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, not only the electric signal can be converted into a sound wave audible to humans, but also the electric signal can be converted into a sound wave inaudible to humans for ranging and other purposes. In some embodiments, audio circuit 807 may also include a headphone jack.
The positioning component 808 is used to locate the current geographic position of the terminal 800 to enable navigation or LBS (Location Based Service). The positioning component 808 may be based on the GPS (Global Positioning System) of the United States, the Beidou system of China, or the Galileo system of the European Union.
A power supply 809 is used to power the various components in the terminal 800. The power supply 809 may be an alternating current, direct current, disposable battery, or rechargeable battery. When the power supply 809 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, the terminal 800 also includes one or more sensors 810. The one or more sensors 810 include, but are not limited to: acceleration sensor 811, gyroscope sensor 812, pressure sensor 813, optical sensor 815, and proximity sensor 816.
The acceleration sensor 811 can detect the magnitudes of accelerations on three coordinate axes of the coordinate system established with the terminal 800. For example, the acceleration sensor 811 may be used to detect components of gravitational acceleration in three coordinate axes. The processor 801 may control the display screen 805 to display a user interface in a landscape view or a portrait view based on the gravitational acceleration signal acquired by the acceleration sensor 811. Acceleration sensor 811 may also be used for the acquisition of motion data of a game or user.
The gyro sensor 812 may detect a body direction and a rotation angle of the terminal 800, and the gyro sensor 812 may collect a 3D motion of the user to the terminal 800 in cooperation with the acceleration sensor 811. The processor 801 may implement the following functions based on the data collected by the gyro sensor 812: motion sensing (e.g., changing UI according to a tilting operation by a user), image stabilization at shooting, game control, and inertial navigation.
The pressure sensor 813 may be disposed at a side frame of the terminal 800 and/or at a lower layer of the display 805. When the pressure sensor 813 is disposed on a side frame of the terminal 800, a grip signal of the terminal 800 by a user may be detected, and the processor 801 performs left-right hand recognition or shortcut operation according to the grip signal collected by the pressure sensor 813. When the pressure sensor 813 is disposed at the lower layer of the display screen 805, the processor 801 controls the operability control on the UI interface according to the pressure operation of the user on the display screen 805. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The optical sensor 815 is used to collect the ambient light intensity. In one embodiment, the processor 801 may control the display brightness of the display screen 805 based on the intensity of ambient light collected by the optical sensor 815. Specifically, when the intensity of the ambient light is high, the display brightness of the display screen 805 is turned up; when the ambient light intensity is low, the display brightness of the display screen 805 is turned down. In another embodiment, the processor 801 may also dynamically adjust the shooting parameters of the camera module 806 based on the ambient light intensity collected by the optical sensor 815.
A proximity sensor 816, also referred to as a distance sensor, is typically provided on the front panel of the terminal 800. The proximity sensor 816 is used to collect the distance between the user and the front of the terminal 800. In one embodiment, when the proximity sensor 816 detects that the distance between the user and the front of the terminal 800 gradually decreases, the processor 801 controls the display 805 to switch from the bright screen state to the off screen state; when the proximity sensor 816 detects that the distance between the user and the front surface of the terminal 800 gradually increases, the processor 801 controls the display 805 to switch from the off-screen state to the on-screen state.
Those skilled in the art will appreciate that the structure shown in fig. 8 is not limiting and that more or fewer components than shown may be included or certain components may be combined or a different arrangement of components may be employed.
Fig. 9 is a schematic structural diagram of a server provided in the embodiment of the present application, where the server 900 may have a relatively large difference due to different configurations or performances, and may include one or more processors (Central Processing Units, CPU) 901 and one or more memories 902, where at least one program code is stored in the one or more memories 902, and the at least one program code is loaded and executed by the one or more processors 901 to implement the resource processing method provided in the method embodiment shown in fig. 3. Of course, the server 900 may also have a wired or wireless network interface, a keyboard, an input/output interface, and other components for implementing the functions of the device, which are not described herein.
In an exemplary embodiment, the embodiment of the present application further provides a resource processing system, where the resource processing system includes a server and a terminal, the server is configured to execute the resource processing method shown in fig. 3, and the terminal is configured to execute the resource processing method shown in fig. 4.
In an exemplary embodiment, there is also provided a computer-readable storage medium having stored therein at least one program code loaded and executed by a processor to cause a computer to implement any of the above-described resource processing methods.
Alternatively, the above-mentioned computer readable storage medium may be a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a Read-Only optical disk (CD-ROM), a magnetic tape, a floppy disk, an optical data storage device, and the like.
In an exemplary embodiment, a computer program or computer program product is also provided, in which at least one computer instruction is stored, which is loaded and executed by a processor, to cause a computer to implement any of the above-mentioned resource processing methods.
It should be understood that references herein to "a plurality" are to two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship.
The foregoing embodiment numbers of the present application are merely for describing, and do not represent advantages or disadvantages of the embodiments.
The foregoing descriptions are merely exemplary embodiments of the present application and are not intended to limit the present application. Any modification, equivalent replacement, or improvement made within the principles of the present application shall fall within the protection scope of the present application.

Claims (18)

1. A method of resource processing, the method comprising:
acquiring a first multimedia resource, wherein the first multimedia resource is obtained by encoding an acquired multi-frame first direct broadcast image through a first terminal according to configuration information of the first terminal and a reference encoding algorithm;
acquiring configuration information of at least one second terminal in the same live broadcast room as the first terminal;
and adjusting the first multimedia resources based on the configuration information of each second terminal to obtain at least one second multimedia resource.
2. The method according to claim 1, wherein said adjusting the first multimedia resource based on the configuration information of each second terminal to obtain at least one second multimedia resource comprises:
decoding the first multimedia resource based on a reference decoding algorithm to obtain a multi-frame second live image;
based on the configuration information of the second terminal, adjusting the image information of the multi-frame second live image to obtain a multi-frame third live image;
and encoding the multi-frame third live image based on the reference encoding algorithm to obtain a second multimedia resource, wherein the second multimedia resource comprises image data of the multi-frame third live image.
3. The method of claim 2, wherein the configuration information of the second terminal includes a screen resolution and a terminal code rate, and the image information includes an image resolution and an image code rate;
the step of adjusting the image information of the multi-frame second live image based on the configuration information of the second terminal to obtain a multi-frame third live image includes:
based on the screen resolution of the second terminal, adjusting the image resolution of the multi-frame second live image to obtain a multi-frame fourth live image;
And adjusting the image code rate of the fourth live images based on the terminal code rate of the second terminal to obtain the third live images.
4. The method of claim 3, wherein before the adjusting the image code rate of the multi-frame fourth live image based on the terminal code rate of the second terminal to obtain the multi-frame third live image, the method further comprises:
acquiring a terminal code rate of the second terminal based on the screen resolution of the second terminal;
or sending a code rate acquisition request to the second terminal, wherein the code rate acquisition request is used for acquiring the terminal code rate of the second terminal, and receiving the terminal code rate of the second terminal returned by the second terminal based on the acquisition request.
5. The method of claim 4, wherein the obtaining the terminal code rate of the second terminal based on the screen resolution of the second terminal comprises:
acquiring the frame rate of the second terminal;
and determining a terminal code rate of the second terminal based on the frame rate of the second terminal and the screen resolution of the second terminal.
6. The method according to any one of claims 1 to 5, wherein after adjusting the first multimedia resource based on the configuration information of each second terminal to obtain at least one second multimedia resource, the method further comprises:
Determining a target multimedia resource matched with configuration information of a target terminal in the at least one second multimedia resource, wherein the target terminal is any one of the at least one second terminal;
and sending the target multimedia resource to the target terminal.
7. The method of claim 6, wherein the server comprises a plurality of nodes; the sending the target multimedia resource to the target terminal includes:
determining a target node among the plurality of nodes based on the node status and node information of each node;
and responding to the target node comprising the target multimedia resource, sending a resource forwarding request to the target node, wherein the resource forwarding request carries a resource identifier of the target multimedia resource and a terminal identifier of the target terminal, and sending the target multimedia resource to the target terminal through the target node.
8. The method of claim 7, wherein the node information comprises a node load rate; the determining a target node among the plurality of nodes based on the node status and the node information of each node includes:
Taking a node of the plurality of nodes, the node state of which meets the state requirement, as a candidate node;
a target node is determined among the candidate nodes based on node load rates of the respective candidate nodes.
9. The method of claim 8, wherein the method further comprises, prior to determining the target node among the candidate nodes based on the node load rates of the respective candidate nodes:
determining the distance corresponding to each candidate node based on the position of the target terminal and the position of each candidate node;
determining response time of each candidate node based on the distance corresponding to each candidate node;
the determining the target node in the candidate nodes based on the node load rate of each candidate node comprises the following steps:
a target node is determined among the candidate nodes based on the node load rates of the candidate nodes and the response times of the candidate nodes.
10. A method of resource processing, the method comprising:
acquiring a multi-frame first direct broadcast image and configuration information of a first terminal;
encoding the multi-frame first direct-broadcasting image based on the configuration information of the first terminal and a reference encoding algorithm to obtain a first multimedia resource;
And sending the first multimedia resource to a server, wherein the server is used for adjusting the first multimedia resource to obtain at least one second multimedia resource.
11. The method of claim 10, wherein the encoding the multi-frame first direct broadcast image based on the configuration information of the first terminal and a reference encoding algorithm to obtain a first multimedia resource comprises:
adjusting the image information of the multi-frame first live image based on the configuration information of the first terminal to obtain a multi-frame second live image, wherein the image information comprises at least one of image resolution and image code rate;
and encoding the multi-frame second live image based on the reference encoding algorithm to obtain a first multimedia resource, wherein the first multimedia resource comprises image data of the multi-frame second live image.
12. A resource processing apparatus, the apparatus comprising:
the acquisition module is used for acquiring a first multimedia resource, and the first multimedia resource is obtained by encoding the acquired multi-frame first direct broadcast image through a first terminal according to the configuration information of the first terminal and a reference encoding algorithm;
The acquisition module is further used for acquiring configuration information of at least one second terminal in the same live broadcast room as the first terminal;
and the adjustment module is used for adjusting the first multimedia resources based on the configuration information of each second terminal to obtain at least one second multimedia resource.
13. A resource processing apparatus, the apparatus comprising:
the acquisition module is used for acquiring a plurality of frames of first direct broadcast images and configuration information of the first terminal;
the encoding module is used for encoding the multi-frame first direct-broadcasting image based on the configuration information of the first terminal and a reference encoding algorithm to obtain a first multimedia resource;
the sending module is used for sending the first multimedia resource to a server, and the server is used for adjusting the first multimedia resource to obtain at least one second multimedia resource.
14. A server, characterized in that it comprises a processor and a memory, wherein the memory stores at least one program code, which is loaded and executed by the processor to cause the server to implement the resource processing method according to any of claims 1 to 9.
15. A terminal comprising a processor and a memory, wherein the memory has stored therein at least one program code that is loaded and executed by the processor to cause the terminal to implement the resource processing method of claim 10 or 11.
16. A resource processing system, characterized in that it comprises a server for executing the resource processing method according to any of the preceding claims 1 to 9 and a terminal for executing the resource processing method according to claim 10 or 11.
17. A computer readable storage medium having stored therein at least one program code, the at least one program code being loaded and executed by a processor to cause a computer to implement the resource processing method of any of claims 1 to 11.
18. A computer program product, characterized in that at least one computer instruction is stored in the computer program product, which is loaded and executed by a processor to cause the computer to implement the resource processing method according to any of claims 1 to 11.
CN202111536341.7A 2021-12-15 2021-12-15 Resource processing method, device, server, terminal, system and storage medium Pending CN116264619A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111536341.7A CN116264619A (en) 2021-12-15 2021-12-15 Resource processing method, device, server, terminal, system and storage medium

Publications (1)

Publication Number Publication Date
CN116264619A (en) 2023-06-16

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40088750

Country of ref document: HK