CN111193926A - Encoded data processing method, apparatus, computer device and storage medium

Encoded data processing method, apparatus, computer device and storage medium

Info

Publication number
CN111193926A
Authority
CN
China
Prior art keywords
complexity
data frame
encoded
motion
scene
Prior art date
Legal status
Granted
Application number
CN201811353206.7A
Other languages
Chinese (zh)
Other versions
CN111193926B (en)
Inventor
张清
金飞剑
刘海军
王诗涛
丁飘
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201811353206.7A
Publication of CN111193926A
Application granted
Publication of CN111193926B
Legal status: Active (current)
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N 19/134 Adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N 19/146 Data rate or code amount at the encoder output
    • H04N 19/147 Data rate or code amount at the encoder output according to rate distortion criteria
    • H04N 19/102 Adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N 19/124 Quantisation
    • H04N 19/136 Incoming video signal characteristics or properties
    • H04N 19/14 Coding unit complexity, e.g. amount of activity or edge presence estimation
    • H04N 19/142 Detection of scene cut or scene change

Abstract

The application relates to an encoded data processing method, apparatus, computer device and storage medium. The method acquires a data frame to be encoded; determines the complexity of the data frame to be encoded; and determines an encoding parameter corresponding to the data frame to be encoded according to a first complexity and a second complexity, where the first complexity is the complexity of the data frame to be encoded and the second complexity is the complexity of a preceding data frame, the preceding data frame being at least one data frame that is contiguous with, and comes before, the data frame to be encoded. Because the encoding parameter is determined from the complexity of both the data frame to be encoded and the preceding data frame, different combinations of the first and second complexity can correspond to different encoding parameters, so the code stream obtained by encoding with the obtained encoding parameters can suit different scenes. The adaptability of encoded data processing is thereby improved.

Description

Encoded data processing method, apparatus, computer device and storage medium
Technical Field
The present application relates to the field of communications technologies, and in particular, to a method and an apparatus for processing encoded data, a computer device, and a storage medium.
Background
Encoding is the process by which information is converted from one form or format to another. When data in a multimedia format needs to be communicated or transmitted, it must first be encoded into a code stream suitable for transmission over the Internet, so encoding is widely used in communications.
The conventional encoded data processing method encodes data with fixed encoding parameters, which makes it difficult to meet the requirements of different scenes and therefore results in poor adaptability.
Disclosure of Invention
In view of the above, it is necessary to provide an encoded data processing method, apparatus, computer device and storage medium capable of improving adaptability, so as to address the above technical problem.
An encoded data processing method, the method comprising:
acquiring a data frame to be encoded;
determining the complexity of the data frame to be encoded;
determining an encoding parameter corresponding to the data frame to be encoded according to a first complexity and a second complexity; the first complexity is the complexity of the data frame to be encoded, and the second complexity is the complexity of a preceding data frame; the preceding data frame is at least one data frame that is contiguous with, and comes before, the data frame to be encoded.
In one embodiment, when the first complexity and the second complexity of both a first data frame to be encoded and a second data frame to be encoded are greater than a preset motion complexity value, if the first complexity of the first data frame to be encoded is greater than or equal to that of the second data frame to be encoded and the second complexity of the first data frame to be encoded is greater than or equal to that of the second data frame to be encoded, then the encoding code rate of the first data frame to be encoded is greater than or equal to that of the second data frame to be encoded, and the encoding frame rate of the second data frame to be encoded is greater than or equal to that of the first data frame to be encoded.
In one embodiment, acquiring the data frame to be encoded includes: acquiring the data frame to be encoded based on a screen image.
In one embodiment, after determining the encoding parameter corresponding to the data frame to be encoded according to the first complexity and the second complexity, the method further includes:
encoding the data frame to be encoded according to the encoding parameter corresponding to the data frame to be encoded, to obtain an encoded code stream.
In one embodiment, before acquiring the data frame to be encoded, the method further includes: receiving a wireless screen projection instruction, wherein the wireless screen projection instruction carries a target terminal identifier;
and after encoding the data frame to be encoded according to the encoding parameter corresponding to the data frame to be encoded to obtain the encoded code stream, the method further includes: sending the encoded code stream according to the target terminal identifier.
An encoded data processing apparatus, the apparatus comprising:
a data frame acquisition module, configured to acquire a data frame to be encoded;
a complexity determination module, configured to determine the complexity of the data frame to be encoded;
an encoding parameter determination module, configured to determine an encoding parameter corresponding to the data frame to be encoded according to a first complexity and a second complexity; the first complexity is the complexity of the data frame to be encoded, and the second complexity is the complexity of a preceding data frame; the preceding data frame is at least one data frame that is contiguous with, and comes before, the data frame to be encoded.
A computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the following steps when executing the computer program:
acquiring a data frame to be encoded;
determining the complexity of the data frame to be encoded;
determining an encoding parameter corresponding to the data frame to be encoded according to a first complexity and a second complexity; the first complexity is the complexity of the data frame to be encoded, and the second complexity is the complexity of a preceding data frame; the preceding data frame is at least one data frame that is contiguous with, and comes before, the data frame to be encoded.
A computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
acquiring a data frame to be encoded;
determining the complexity of the data frame to be encoded;
determining an encoding parameter corresponding to the data frame to be encoded according to a first complexity and a second complexity; the first complexity is the complexity of the data frame to be encoded, and the second complexity is the complexity of a preceding data frame; the preceding data frame is at least one data frame that is contiguous with, and comes before, the data frame to be encoded.
With the encoded data processing method, apparatus, computer device and storage medium described above, a data frame to be encoded is acquired; the complexity of the data frame to be encoded is determined; and an encoding parameter corresponding to the data frame to be encoded is determined according to a first complexity and a second complexity, where the first complexity is the complexity of the data frame to be encoded, the second complexity is the complexity of a preceding data frame, and the preceding data frame is at least one data frame that is contiguous with, and comes before, the data frame to be encoded. Because the encoding parameter is determined from the complexity of both the data frame to be encoded and the preceding data frame, different combinations of the first and second complexity can correspond to different encoding parameters, so the code stream obtained by encoding with the obtained encoding parameters can suit different scenes. The adaptability of encoded data processing can thus be improved.
Drawings
FIG. 1 is a diagram of an application environment of a method for processing encoded data according to an embodiment;
FIG. 2 is a flow diagram illustrating a method for encoding data processing according to one embodiment;
FIG. 3 is a flow chart of a method for processing encoded data according to another embodiment;
FIG. 4 is a schematic diagram illustrating an operation of an encoded data processing method according to an embodiment;
FIG. 5 is a block diagram showing a structure of an encoding data processing apparatus according to an embodiment;
FIG. 6 is a block diagram of a computer device in one embodiment;
FIG. 7 is a block diagram of a computer device in another embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
FIG. 1 is a diagram of an application environment of the encoded data processing method according to an embodiment. The encoded data processing method can be applied to the application environment shown in FIG. 1, in which the first terminal 102 and the second terminal 104 communicate with the server 106 through a network.
In one embodiment, the encoded data processing method may run on the first terminal 102. After the first terminal 102 determines the encoding parameter through the method and encodes the data frame to be encoded with that parameter to obtain an encoded code stream, it may send the encoded code stream to the server 106 over the network. The server 106 forwards the encoded code stream to the second terminal 104, and the second terminal 104 decodes and plays it.
In another embodiment, the encoded data processing method may run on the server 106. The server 106 determines the encoding parameter through the method and, after encoding the data frame to be encoded according to that parameter to obtain the encoded code stream, may send the encoded code stream to the first terminal 102 over the network, and the first terminal 102 decodes and plays it.
The first terminal 102 and the second terminal 104 may be, but are not limited to, personal computers, notebook computers, smart phones, tablet computers and portable wearable devices, and the server 106 may be implemented as an independent server or as a server cluster composed of a plurality of servers.
In one embodiment, as shown in FIG. 2, an encoded data processing method is provided. The method may be executed by the first terminal 102 in FIG. 1, or by the server 106 in FIG. 1. The encoded data processing method includes the following steps:
s202, acquiring a data frame to be encoded.
A data frame to be encoded is a data frame on which encoding has not yet been performed. The data frame to be encoded may be captured directly by the execution device, or it may be captured by another device and transmitted to the execution device through various channels; after the execution device stores it, the execution device reads the data frame to be encoded from memory when executing the method. The execution device is the device that executes the method, and may be the first terminal or the server.
S204, determining the complexity of the data frame to be encoded.
The execution device determines the complexity of the data frame to be encoded. The complexity may include a spatio-temporal complexity; for example, it may include the Spatial Information (SI) or/and the Temporal Information (TI) of the data frame to be encoded. Spatial information characterizes the amount of spatial detail in a frame image; spatially more complex scenes typically have higher SI values. Temporal information characterizes the temporal variation of a video sequence; sequences with a higher degree of motion generally have higher TI values.
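As an illustration only (the application does not fix a particular formula), per-frame SI and TI can be computed in the style of the ITU-T P.910 definitions; the function names below and the choice of the luma plane are assumptions:

    import numpy as np
    from scipy import ndimage

    def spatial_information(frame: np.ndarray) -> float:
        # Sobel-filter the luma plane and take the spatial standard deviation (SI).
        gx = ndimage.sobel(frame.astype(np.float64), axis=0)
        gy = ndimage.sobel(frame.astype(np.float64), axis=1)
        return float(np.std(np.hypot(gx, gy)))

    def temporal_information(frame: np.ndarray, prev_frame: np.ndarray) -> float:
        # Standard deviation of the per-pixel luma difference against the preceding frame (TI).
        return float(np.std(frame.astype(np.float64) - prev_frame.astype(np.float64)))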
The complexity may also include a rate-distortion cost (RDCost) based on the data frame to be encoded. Rate-distortion cost is a parameter that jointly measures bit rate and distortion in video coding, for example the best intra-frame and inter-frame RDCost predicted for the data frame to be encoded.
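For reference, the rate-distortion cost used by common encoders is typically of the form RDCost = D + λ · R, where D is the distortion of a candidate prediction (for example the SAD or SSD against the source block), R is the number of bits needed to signal that candidate, and λ is the Lagrange multiplier. A per-frame complexity can then be taken, for instance, as the sum of the best intra/inter RDCost over all blocks of the frame; the exact distortion metric and λ schedule are encoder-specific and are not fixed by this application.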
The complexity may be determined from the data frame to be encoded alone, or from the data frame to be encoded together with its preceding data frame.
S206, determining the coding parameters corresponding to the data frame to be coded according to the first complexity and the second complexity.
The execution device determines the encoding parameter corresponding to the data frame to be encoded according to the first complexity and the second complexity. The first complexity is the complexity of the data frame to be encoded, and the second complexity is the complexity of the preceding data frame; the preceding data frame is at least one data frame that is contiguous with, and comes before, the data frame to be encoded. Assuming that Z1, Z2 and Z3 are successively acquired, consecutive data frames, the preceding data frame of Z3 may be Z2, or Z2 together with Z1, but not Z1 alone.
The encoding parameters may be an encoding frame rate and an encoding code rate. The encoding frame rate is the number of data frames encoded per second, with the basic unit fps (frames per second). The encoding code rate is the code rate of the encoded data frames; the code rate, also referred to as the bit rate, is the number of bits transmitted per second, with the basic unit bps (bits per second). The higher the encoding code rate, the faster the data transmission.
The encoding parameter may also be a quantization parameter (QP). The quantization parameter is the index corresponding to a quantization step size. For example, H.264/MPEG-4 AVC (Advanced Video Coding) defines 52 quantization step sizes; the quantization parameter takes values from 0 to 51, and each quantization parameter identifies one of the 52 quantization step sizes.
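As background (not part of the application itself), in H.264/AVC the quantization step size roughly doubles for every increase of 6 in QP, which is why a small QP change has a large effect on bit rate; a minimal sketch of this approximate relation:

    def h264_quantization_step(qp: int) -> float:
        # Approximate H.264/AVC relation: the step size doubles every 6 QP
        # (about 0.625 at QP 0, about 1.0 at QP 4).
        return 0.625 * 2.0 ** (qp / 6.0)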
The range for the first complexity level and the second complexity level may be divided, with different ranges mapped to different encoding parameters.
A mapping rule may also be set that maps the first complexity and the second complexity to the corresponding encoding parameters. For example, take any two data frames to be encoded, denoted the first data frame to be encoded and the second data frame to be encoded. When the first complexity and the second complexity of both data frames are greater than the first preset motion complexity value, if the first complexity of the first data frame to be encoded is greater than or equal to that of the second data frame to be encoded and the second complexity of the first data frame to be encoded is greater than or equal to that of the second data frame to be encoded, then the encoding code rate of the first data frame to be encoded is greater than or equal to that of the second data frame to be encoded, and the encoding frame rate of the second data frame to be encoded is greater than or equal to that of the first data frame to be encoded.
Therefore, the encoding parameters are determined according to the complexity of the data frame to be encoded and the complexity of the preceding data frame, and different first and second complexities can correspond to different encoding parameters. For example, for scenes with intense motion and complex textures, the encoding code rate can be set to a higher value to avoid picture blurring; for pictures with simpler textures, the encoding code rate can be set to a lower value to avoid quantization parameter underflow and waste of code rate.
With the encoded data processing method of this embodiment, a data frame to be encoded is acquired; the complexity of the data frame to be encoded is determined; and the encoding parameter corresponding to the data frame to be encoded is determined according to the first complexity and the second complexity, where the first complexity is the complexity of the data frame to be encoded, the second complexity is the complexity of the preceding data frame, and the preceding data frame is at least one data frame that is contiguous with, and comes before, the data frame to be encoded. Because the encoding parameters are determined from the complexity of both the data frame to be encoded and the preceding data frame, different first and second complexities can correspond to different encoding parameters, so the code stream obtained by encoding with the obtained encoding parameters can suit different scenes. The adaptability of encoded data processing can thus be improved.
In one embodiment, determining the encoding parameter corresponding to the data frame to be encoded according to the first complexity and the second complexity includes: determining a scene type corresponding to the data frame to be encoded according to the first complexity and the second complexity; and determining the encoding parameter corresponding to the data frame to be encoded based on that scene type.
The execution device determines the scene type corresponding to the data frame to be encoded according to the first complexity and the second complexity, and determines the encoding parameter corresponding to the data frame to be encoded based on that scene type. The scene types may include a motion scene and a continuous static scene. A motion scene may indicate that the content of the data frame to be encoded differs from that of the preceding data frame, for example the image corresponding to the data frame to be encoded differs from the image corresponding to the preceding data frame. A continuous static scene may indicate that the content of the data frame to be encoded is the same as that of the preceding data frame, for example the image of the data frame to be encoded is the same as the image corresponding to the preceding data frame.
Generally, at least one of the first complexity and the second complexity of a data frame corresponding to a motion scene is greater than the first and second complexity of a data frame corresponding to a continuous static scene. Scene types can therefore be distinguished with preset thresholds. A preset static complexity value can be set: when the first complexity and the second complexity are both equal to or less than the preset static complexity value, the scene type corresponding to the data frame to be encoded is determined to be a continuous static scene; when at least one of the first complexity and the second complexity is greater than the preset static complexity value, the scene type is determined to be a motion scene. Alternatively, when at least one of the first complexity and the second complexity is greater than a first preset motion complexity value, the scene type is determined to be a motion scene, where the first preset motion complexity value is greater than or equal to the preset static complexity value.
The encoding parameters corresponding to a motion scene include an encoding code rate and an encoding frame rate, which may be denoted the motion encoding code rate and the motion encoding frame rate; the encoding parameters corresponding to a continuous static scene likewise include an encoding code rate and an encoding frame rate, which may be denoted the static encoding code rate and the static encoding frame rate. The static encoding frame rate is less than the motion encoding frame rate, and the static encoding code rate may be greater than or equal to the motion encoding code rate.
With the encoded data processing method of this embodiment, the scene type of the data frame to be encoded can be determined from the complexity of the data frame to be encoded and of the preceding data frame, and the encoding parameters can be determined from that scene type, so the code stream obtained by encoding with the obtained encoding parameters can suit different types of scene. The adaptability of encoded data processing can thus be improved.
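A minimal sketch of the coarse still/motion decision described above, under the assumption of scalar complexity values and illustrative threshold names (the application does not prescribe concrete values):

    def classify_scene_coarse(c1: float, c2: float,
                              static_threshold: float,
                              motion_threshold: float) -> str:
        # c1: first complexity (frame to be encoded); c2: second complexity (preceding frame).
        # The application assumes motion_threshold >= static_threshold.
        if c1 <= static_threshold and c2 <= static_threshold:
            return "continuous_static"
        if c1 > motion_threshold or c2 > motion_threshold:
            return "motion"
        # Values between the two thresholds fall outside the preset conditions; a later
        # embodiment lets the frame inherit the scene type of the previous data frame.
        return "inherit_previous"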
In one embodiment, determining the scene type corresponding to the data frame to be encoded according to the first complexity and the second complexity includes: when at least one of the first complexity and the second complexity is greater than the first preset motion complexity value, determining that the scene type corresponding to the data frame to be encoded is a motion scene.
When at least one of the first complexity and the second complexity is greater than the first preset motion complexity value, the execution device determines that the scene type corresponding to the data frame to be encoded is a motion scene. Correspondingly, determining the encoding parameter corresponding to the data frame to be encoded based on the scene type corresponding to the data frame to be encoded includes: when the scene type corresponding to the data frame to be encoded is a motion scene, determining the encoding parameter corresponding to the data frame to be encoded to be equal to the encoding parameter corresponding to that scene type.
The motion scene may include at least two scene types, such as at least two of a continuous motion scene, a motion-to-still scene and a still-to-motion scene. As another example, it may include a first continuous motion scene and a second continuous motion scene; on that basis, a third continuous motion scene may further be included between the first continuous motion scene and the second continuous motion scene; still further, the number of third continuous motion scenes may be two or more.
The continuous motion scene may be a motion scene in which both the first complexity and the second complexity are greater than the first preset motion complexity value. The still-to-motion scene may be a motion scene in which the first complexity is greater than the first preset motion complexity value and the second complexity is equal to or less than the preset static complexity value. The motion-to-still scene may be a motion scene in which the second complexity is greater than the first preset motion complexity value and the first complexity is equal to or less than the preset static complexity value.
The second continuous motion scene may be a motion scene in which the first complexity and the second complexity are both greater than a third preset motion complexity value; the first continuous motion scene may be a motion scene in which the first complexity and the second complexity are both greater than a preset small-motion complexity value and less than or equal to a second preset motion complexity value; the third continuous motion scene may be a motion scene in which the first complexity and the second complexity are both greater than the second preset motion complexity value and less than or equal to the third preset motion complexity value. The third preset motion complexity value is greater than or equal to the second preset motion complexity value, and the second preset motion complexity value is greater than or equal to the preset small-motion complexity value. It is to be understood that the first preset motion complexity value may be equal to the preset small-motion complexity value.
The values of the encoding parameters differ between the different motion scenes. For example, the encoding parameters corresponding to the continuous motion scene include an encoding frame rate and an encoding code rate, which may be denoted the first encoding frame rate and the first encoding code rate; the encoding parameters corresponding to the motion-to-still scene include an encoding frame rate and an encoding code rate, denoted the second encoding frame rate and the second encoding code rate, where the second encoding frame rate is greater than or equal to the first encoding frame rate and the second encoding code rate is greater than or equal to the first encoding code rate; and the encoding parameters corresponding to the still-to-motion scene include an encoding frame rate and an encoding code rate, denoted the third encoding frame rate and the third encoding code rate, where the third encoding frame rate is greater than or equal to the first encoding frame rate and the third encoding code rate is greater than or equal to the first encoding code rate. As another example, the encoding parameters corresponding to the first continuous motion scene include an encoding frame rate and an encoding code rate, which may be denoted the small-motion encoding frame rate and the small-motion encoding code rate; the encoding parameters corresponding to the second continuous motion scene include an encoding frame rate and an encoding code rate, denoted the large-motion encoding frame rate and the large-motion encoding code rate; and the encoding parameters corresponding to the third continuous motion scene include an encoding frame rate and an encoding code rate, denoted the medium-motion encoding frame rate and the medium-motion encoding code rate. The small-motion encoding frame rate is greater than or equal to the medium-motion encoding frame rate, and the medium-motion encoding frame rate is greater than or equal to the large-motion encoding frame rate; the large-motion encoding code rate is greater than or equal to the medium-motion encoding code rate, and the medium-motion encoding code rate is greater than or equal to the small-motion encoding code rate.
With the encoded data processing method of this embodiment, when the scene type is a motion scene, the encoding parameter corresponding to the data frame to be encoded may be determined to be equal to the encoding parameter corresponding to that scene type. The code stream obtained by encoding with the obtained encoding parameters can therefore suit different types of scene, so the adaptability of encoded data processing can be improved.
In one embodiment, when at least one of the first complexity and the second complexity is greater than the first preset motion complexity value, determining that the scene type corresponding to the data frame to be encoded is a motion scene includes: when the first complexity and the second complexity are both greater than the first preset motion complexity value, determining that the scene type corresponding to the data frame to be encoded is a continuous motion scene.
When the first complexity and the second complexity are both greater than the first preset motion complexity value, the execution device determines that the scene type corresponding to the data frame to be encoded is a continuous motion scene. A continuous motion scene may represent a motion scene in which the complexity of no fewer than two consecutive data frames is greater than the first preset motion complexity value. The continuous motion scene may be mapped to a single scene type with one set of encoding parameters, or to multiple scene types with multiple sets of encoding parameters. The motion scene thus includes at least a continuous motion scene, so the scene types include at least a continuous motion scene.
In one embodiment, when at least one of the first complexity and the second complexity is greater than the first preset motion complexity value, determining that the scene type corresponding to the data frame to be encoded is a motion scene further includes: when the first complexity is greater than the first preset motion complexity value and the second complexity is equal to or less than the preset static complexity value, determining that the scene type corresponding to the data frame to be encoded is a still-to-motion scene; or/and when the second complexity is greater than the first preset motion complexity value and the first complexity is equal to or less than the preset static complexity value, determining that the scene type corresponding to the data frame to be encoded is a motion-to-still scene; wherein the first preset motion complexity value is greater than or equal to the preset static complexity value.
When the first complexity is greater than the first preset motion complexity value and the second complexity is equal to or less than the preset static complexity value, the execution device determines that the scene type corresponding to the data frame to be encoded is a still-to-motion scene; or/and when the second complexity is greater than the first preset motion complexity value and the first complexity is equal to or less than the preset static complexity value, the execution device determines that the scene type corresponding to the data frame to be encoded is a motion-to-still scene.
It is to be understood that, in this embodiment, the continuous motion scene may be mapped to a single scene type with one set of encoding parameters, or to multiple scene types with multiple sets of encoding parameters. The motion scene thus includes at least a continuous motion scene and a still-to-motion scene or/and a motion-to-still scene, so the scene types include at least these types, which further improves the adaptability of encoded data processing.
Furthermore, the encoding parameters corresponding to the continuous motion scene include the first encoding frame rate and the first encoding code rate. The encoding parameters corresponding to the motion-to-still scene include the second encoding frame rate and the second encoding code rate, where the second encoding frame rate is greater than the first encoding frame rate and the second encoding code rate is greater than the first encoding code rate; or/and the encoding parameters corresponding to the still-to-motion scene include the third encoding frame rate and the third encoding code rate, where the third encoding frame rate is greater than the first encoding frame rate and the third encoding code rate is greater than the first encoding code rate. The continuous motion scene and the still-to-motion scene or/and the motion-to-still scene therefore correspond to different encoding parameter values, further improving the adaptability of encoded data processing.
In one embodiment, the continuous motion scene comprises at least a first continuous motion scene and a second continuous motion scene; when the first complexity and the second complexity are both greater than a first preset motion complexity value and less than or equal to a second preset motion complexity value, the scene type corresponding to the data frame to be coded is a first continuous motion scene; the second preset motion complexity value is greater than or equal to the first preset motion complexity value; when the first complexity and the second complexity are both greater than a third preset motion complexity value, the scene type corresponding to the data frame to be coded is a second continuous motion scene; the third preset motion complexity value is greater than or equal to the second preset motion complexity value.
In this way, the continuous motion scene may be mapped to a plurality of scene types of a plurality of sets of encoding parameters, so that the motion scene at least includes a first continuous motion scene and a second continuous motion scene, thereby further improving the adaptability of encoding data processing.
In this embodiment, the coding parameters corresponding to the first continuous motion scene include a coding frame rate and a coding rate, which may be respectively recorded as a small motion coding frame rate and a small motion coding rate; the coding parameters corresponding to the second continuous motion scene comprise a coding frame rate and a coding rate, which can be respectively recorded as a large motion coding frame rate and a large motion coding rate; the small motion coding frame rate is greater than or equal to the large motion coding frame rate; the large motion coding rate is greater than or equal to the small motion coding rate.
Further, the continuous motion scene also comprises a third continuous motion scene. And when the first complexity and the second complexity are both greater than the second preset motion complexity value and less than or equal to a third preset motion complexity value, the scene type corresponding to the data frame to be coded is a third continuous motion scene. In this way, the motion scene at least includes the third continuous motion scene, the first continuous motion scene and the second continuous motion scene, thereby further improving the adaptability of the encoded data processing. It is to be understood that the number of third continuous motion scenes is at least 1. When the number of the third continuous motion scenes is greater than 1, more than 1 preset complexity value may be set between the first preset motion complexity value and the third preset motion complexity value. Thus, more scene requirements can be met. Thereby further improving the adaptability of the encoded data processing.
Furthermore, the coding parameters corresponding to the first continuous motion scene include a small motion coding frame rate and a small motion coding code rate; the coding parameters corresponding to the second continuous motion scene comprise a large motion coding frame rate and a large motion coding code rate; coding parameters corresponding to the third continuous motion scene comprise a coding frame rate and a coding rate, and can be recorded as a middle motion coding frame rate and a middle motion coding rate; the small motion coding frame rate is greater than or equal to the medium motion coding frame rate, and the medium motion coding frame rate is greater than or equal to the large motion coding frame rate; the large motion coding rate is greater than or equal to the medium motion coding rate, and the medium motion coding rate is greater than or equal to the small motion coding rate.
Therefore, the third continuous motion scene, the first continuous motion scene and the second continuous motion scene respectively correspond to different coding code rates, and the adaptability of coded data processing is further improved.
In one embodiment, determining the scene type corresponding to the data frame to be encoded according to the first complexity and the second complexity includes: when the first complexity and the second complexity are both equal to or less than the preset static complexity value, determining that the scene type corresponding to the data frame to be encoded is a continuous static scene.
When the first complexity and the second complexity are both equal to or less than the preset static complexity value, the execution device determines that the scene type corresponding to the data frame to be encoded is a continuous static scene.
Correspondingly, determining the encoding parameters corresponding to the data frame to be encoded based on the scene type corresponding to the data frame to be encoded, including: and when the scene type corresponding to the data frame to be encoded is a continuous static scene, if the data frame to be encoded meets the static frame skipping condition, returning to the step of acquiring the data frame to be encoded.
The still frame skip condition may include: the time interval between the data frame to be encoded and the previous data frame is less than or equal to a preset time interval, where the preset time interval is a preset maximum time interval for still frame skipping (which may be denoted Tmax). The still frame skip condition may further include: the data frame to be encoded is a still frame, meaning that it is the same as the previous data frame. Being "the same" may mean that the data frame to be encoded has the same content as the previously encoded data frame, or that it has the same characteristics as the previously encoded data frame.
In the encoded data processing method of this embodiment, a data frame to be encoded that satisfies the still frame skip condition does not need to be encoded. In embodiments where subsequent data still needs to be encoded, this can be implemented by returning to the step of acquiring a data frame to be encoded, so that the next data frame to be encoded is acquired and the subsequent complexity determination and encoding parameter determination steps continue; a data frame that satisfies the still frame skip condition can also be left unencoded by determining an encoding frame rate less than or equal to a preset threshold. The adaptability of encoded data processing can thus be improved.
In one embodiment, when the content of the data frame to be encoded is the same as that of the previous data frame, and the time interval between the data frame to be encoded and the previous data frame is less than or equal to the preset time interval, the static frame skipping condition is satisfied. That is, the still frame skipping condition includes: the content of the data frame to be encoded is the same as that of the previous data frame, and the time interval between the data frame to be encoded and the previous data frame is less than or equal to the preset time interval. In this way, the frame skipping condition can be applied to more application scenes, thereby further improving the adaptability of the encoded data processing.
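A minimal sketch of the still frame skip check combining both conditions above; the equality test on pixel data stands in for any content or feature comparison, and Tmax is the preset maximum skip interval:

    import numpy as np

    def should_skip_still_frame(frame: np.ndarray, prev_frame: np.ndarray,
                                t_now: float, t_last_encoded: float,
                                t_max: float) -> bool:
        # Skip encoding only while the content is unchanged and the gap since the
        # last encoded frame has not yet exceeded the maximum still-skip interval Tmax.
        is_still = np.array_equal(frame, prev_frame)  # or compare hashes / features
        return is_still and (t_now - t_last_encoded) <= t_max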
In one embodiment, determining, based on a scene type corresponding to a frame to be encoded, an encoding parameter corresponding to the frame to be encoded further includes: when the scene type corresponding to the data frame to be encoded is a continuous static scene, if the data frame to be encoded does not meet the static frame skipping condition, determining the encoding parameter corresponding to the data frame to be encoded to be equal to the encoding parameter corresponding to the previous data frame.
When the scene type corresponding to the data frame to be encoded is a continuous static scene, if the data frame to be encoded does not meet the static frame skipping condition, the execution device determines the encoding parameter corresponding to the data frame to be encoded to be equal to the encoding parameter corresponding to the previous data frame.
When the still frame skipping condition is not satisfied, the frame to be encoded still needs to be encoded. Because the scene corresponding to the data frame to be encoded is a continuous static scene, the encoding parameter corresponding to the previous data frame can be used, and thus, the data frame to be encoded which does not meet the static frame skipping condition can be encoded by adopting the parameter corresponding to the previous data frame. Thus, the adaptability of the encoded data processing can be improved.
In one embodiment, determining a scene type corresponding to a data frame to be encoded according to a first complexity and a second complexity includes: when the first complexity degree and the second complexity degree meet preset conditions, determining a scene type corresponding to a data frame to be coded according to the met preset conditions; and when the first complexity and the second complexity do not meet the preset condition, determining that the scene type corresponding to the data frame to be encoded is the scene type of a data frame before the data frame to be encoded.
When the first complexity and the second complexity meet preset conditions, the execution equipment determines a scene type corresponding to a data frame to be coded according to the met preset conditions; when the first complexity and the second complexity do not meet the preset condition, the execution device determines that the scene type corresponding to the data frame to be encoded is the scene type of the data frame before the data frame to be encoded.
The preset conditions may include: (1) the first complexity is greater than the first preset motion complexity value and the second complexity is equal to or less than the preset static complexity value, where the first preset motion complexity value is greater than or equal to the preset static complexity value; (2) the second complexity is greater than the first preset motion complexity value and the first complexity is equal to or less than the preset static complexity value; (3) the first complexity and the second complexity are both greater than the first preset motion complexity value and less than or equal to the second preset motion complexity value, where the second preset motion complexity value is greater than or equal to the first preset motion complexity value; (4) the first complexity and the second complexity are both greater than the third preset motion complexity value, where the third preset motion complexity value is greater than or equal to the second preset motion complexity value; (5) the first complexity and the second complexity are both greater than the second preset motion complexity value and less than or equal to the third preset motion complexity value.
Correspondingly, when condition (1) is satisfied, the scene type corresponding to the data frame to be encoded determined from the satisfied condition may be a still-to-motion scene; when condition (2) is satisfied, it may be a motion-to-still scene; when condition (3) is satisfied, it may be a first continuous motion scene; when condition (4) is satisfied, it may be a second continuous motion scene; and when condition (5) is satisfied, it may be a third continuous motion scene.
It is understood that the preset conditions do not cover all possible values of the first complexity and the second complexity. For example, the first complexity and the second complexity may both be greater than the preset static complexity value and less than the preset motion complexity value; or the first complexity may be greater than the preset large-motion complexity value while the second complexity is less than the preset medium-motion complexity value and greater than the preset small-motion complexity value; and so on. In this embodiment, when the first complexity and the second complexity fall outside the preset conditions, the scene type corresponding to the data frame to be encoded follows the scene type of the previous data frame. In this way, on one hand the encoded data processing method becomes more robust, and on the other hand its applicability is further improved.
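A minimal sketch of the scene classification described by the preset conditions above, including the fall-back to the previous frame's scene type; the threshold names and the handling of the continuous static case are assumptions consistent with the earlier embodiments, not values fixed by the application:

    def classify_scene(c1: float, c2: float, prev_scene: str,
                       t_static: float, t_motion1: float,
                       t_motion2: float, t_motion3: float) -> str:
        # Assumes t_static <= t_motion1 <= t_motion2 <= t_motion3.
        if c1 > t_motion1 and c2 <= t_static:
            return "still_to_motion"                      # condition (1)
        if c2 > t_motion1 and c1 <= t_static:
            return "motion_to_still"                      # condition (2)
        if t_motion1 < c1 <= t_motion2 and t_motion1 < c2 <= t_motion2:
            return "first_continuous_motion"              # condition (3), small motion
        if c1 > t_motion3 and c2 > t_motion3:
            return "second_continuous_motion"             # condition (4), large motion
        if t_motion2 < c1 <= t_motion3 and t_motion2 < c2 <= t_motion3:
            return "third_continuous_motion"              # condition (5), medium motion
        if c1 <= t_static and c2 <= t_static:
            return "continuous_static"                    # earlier embodiment
        return prev_scene                                 # outside the preset conditions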
In one embodiment, when the first complexity and the second complexity of both the first data frame to be encoded and the second data frame to be encoded are greater than the first preset motion complexity value, if the first complexity of the first data frame to be encoded is greater than or equal to that of the second data frame to be encoded and the second complexity of the first data frame to be encoded is greater than or equal to that of the second data frame to be encoded, then the encoding code rate of the first data frame to be encoded is greater than or equal to that of the second data frame to be encoded, and the encoding frame rate of the second data frame to be encoded is greater than or equal to that of the first data frame to be encoded.
In other words, of two data frames whose first and second complexities are both greater than the first preset motion complexity value, the encoding code rate of the data frame with the larger first and second complexity is not less than the encoding code rate of the data frame with the smaller first and second complexity, while the encoding frame rate of the data frame with the smaller first and second complexity is not less than the encoding frame rate of the data frame with the larger first and second complexity.
Therefore, frames to be coded with different complexity can correspond to different coding parameters, so that the method is suitable for different scenes and improves the adaptability of coded data processing.
In one embodiment, acquiring a frame to be encoded includes: acquiring a frame of data to be encoded based on a screen image.
The execution device acquires a frame of data to be encoded based on the screen image. The data frame to be encoded based on the screen image may be a video frame played in a full screen mode, or may be a data frame acquired in a screen capture mode, for example, the screen image is captured as the data frame to be encoded according to a preset capture frame rate. The acquisition frame rate is the number of data frames acquired per second, in fps (frames per second).
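A minimal sketch of acquiring data frames to be encoded from the screen at a preset capture frame rate; PIL's ImageGrab is only one possible capture backend, and the luma-only conversion is an assumption made so the frames can feed the SI/TI computation sketched earlier:

    import time
    import numpy as np
    from PIL import ImageGrab  # any screen-capture API can take its place

    def capture_screen_frames(capture_fps: float):
        # Yield the current screen image as the next data frame to be encoded,
        # at the preset capture frame rate.
        period = 1.0 / capture_fps
        while True:
            frame = np.asarray(ImageGrab.grab().convert("L"))  # luma plane only
            yield frame
            time.sleep(period)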
Therefore, when the encoded data processing method is applied to the field of wireless screen projection, where the displayed data frames impose different requirements on real-time performance and definition in different scenes, different complexities can correspond to different encoding parameters, and the improvement in the adaptability of encoded data processing is even more pronounced.
In one embodiment, after determining the encoding parameter corresponding to the frame to be encoded according to the first complexity and the second complexity, the method further includes: and coding the data frame to be coded according to the coding parameters corresponding to the data frame to be coded to obtain a coding code stream.
The execution device encodes the data frame to be encoded according to the encoding parameter corresponding to it, to obtain an encoded code stream. The data frame to be encoded can be encoded by an encoder to obtain the encoded code stream: for example, the encoding parameters corresponding to the data frame to be encoded may be input to the encoder, and the encoder encodes the data frame. Encoding is thus carried out with the obtained encoding parameters to obtain an encoded code stream that can adapt to different scenes, which improves the applicability of the encoded code stream.
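A minimal sketch of handing the selected parameters to an encoder; the encoder interface here is a placeholder rather than a specific library API, and the field names are assumptions:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class EncodingParams:
        frame_rate: float         # encoding frame rate, fps
        code_rate: int            # encoding code rate, bits per second
        qp: Optional[int] = None  # optional fixed quantization parameter

    def encode_frame(encoder, frame, params: EncodingParams) -> bytes:
        # Reconfigure the encoder with the per-frame parameters, then submit the frame;
        # the returned bytes are the encoded code stream for this data frame.
        encoder.reconfigure(bitrate=params.code_rate,
                            framerate=params.frame_rate,
                            qp=params.qp)
        return encoder.encode(frame)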
In one embodiment, after the data frame to be encoded is encoded according to its corresponding encoding parameter to obtain the encoded code stream, the method further includes sending the encoded code stream; that is, the execution device sends the encoded code stream.
The encoded code stream may be sent to a server or to a client terminal. For example, when the execution device is the first terminal 102 shown in fig. 1, the encoded code stream may be sent to the server 106 and forwarded by the server 106 to the second terminal 104. When the execution device is a server as shown in fig. 1, the encoded code stream may be transmitted to the second terminal 104.
Because the encoding parameters change with the complexity of the data frame to be encoded so as to suit different scenes, it is not necessary to keep both the encoding code rate and the encoding frame rate fixed at high values merely to satisfy the scene requirements of high definition and high real-time performance at the same time. Therefore, after the data to be encoded is encoded with these encoding parameters to obtain the encoded code stream, transmitting the encoded code stream can save bandwidth.
Further, before acquiring the data frame to be encoded, the method further includes: and receiving a wireless screen projection instruction, wherein the wireless screen projection instruction carries a target terminal identifier.
Transmitting the encoded code stream, comprising: and transmitting the coded code stream according to the target terminal identification.
The wireless screen projection instruction may be received at the client terminal. For example, the triggering manner that can be adopted for receiving the wireless screen projection instruction may include: triggering a preset button on the touch screen, drawing a preset pattern on the touch screen, and pressing the preset button on the execution equipment.
The target terminal identifier is used to identify the target terminal of the screen projection. It can be understood that the execution device of this embodiment is the sending terminal of the screen projection, i.e. the first terminal. The first terminal receives the wireless screen projection instruction, which carries the target terminal identifier, and sends the encoded code stream to the server according to the target terminal identifier. In wireless screen projection, the encoded code stream sent by the sending terminal can be forwarded by the server, and when the code stream is forwarded it must be forwarded according to the target terminal identifier.
Because the encoding parameters change with the complexity of the data frame to be encoded so as to suit different scenes, it is not necessary to keep both the encoding code rate and the encoding frame rate fixed at high values merely to satisfy the scene requirements of high definition and high real-time performance at the same time. In wireless screen projection, the first terminal has to send the data to be projected to the server, which forwards it to the second terminal. Therefore, after the data to be encoded is encoded with these encoding parameters to obtain the encoded code stream, sending the encoded code stream saves bandwidth; this is particularly suitable for the wireless screen projection scenario, where it saves uplink bandwidth.
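A minimal sketch of sending the encoded code stream together with the target terminal identifier so the server can forward it; the transport and the field names are assumptions, not part of the application:

    import base64
    import json
    import socket

    def send_code_stream(sock: socket.socket, target_terminal_id: str,
                         code_stream: bytes) -> None:
        # Wrap the encoded bitstream with the target terminal identifier so the
        # server can forward it to the intended receiving terminal.
        envelope = {
            "target_id": target_terminal_id,
            "payload": base64.b64encode(code_stream).decode("ascii"),
        }
        sock.sendall(json.dumps(envelope).encode("utf-8") + b"\n")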
As shown in fig. 3 and 4, in one embodiment, the method for processing encoded data includes the following steps:
s302, acquiring a data frame to be encoded;
s304, determining the complexity of the data frame to be coded;
s306, determining a scene type corresponding to the data frame to be coded according to the first complexity and the second complexity;
s308a, when the scene type corresponding to the data frame to be encoded is a motion scene, determining the encoding parameter corresponding to the data frame to be encoded to be equal to the encoding parameter corresponding to the scene type corresponding to the data frame to be encoded;
s308b, when the scene type corresponding to the data frame to be encoded is a continuous static scene, if the data frame to be encoded meets the static frame skipping condition, returning to the step of acquiring the data frame to be encoded;
s308c, when the scene type corresponding to the data frame to be encoded is a continuous static scene, if the data frame to be encoded does not satisfy the static frame skipping condition, determining the encoding parameter corresponding to the data frame to be encoded to be equal to the encoding parameter corresponding to the previous data frame;
s309, coding the data frame to be coded according to the coding parameter corresponding to the data frame to be coded to obtain a coding code stream;
the motion scene comprises a static-to-motion scene, a motion-to-static scene, a first continuous motion scene, a second continuous motion scene and a third continuous motion scene;
when the first complexity and the second complexity are both greater than a first preset motion complexity value, determining that the scene type corresponding to the data frame to be coded is a continuous motion scene; the first preset motion complexity value is greater than or equal to a preset static complexity value;
when the second complexity is larger than the first preset motion complexity value and the first complexity is equal to or smaller than the preset static complexity value, the scene type corresponding to the data frame to be coded is a motion-to-static scene;
when the first complexity and the second complexity are both greater than a first preset motion complexity value and less than or equal to a second preset motion complexity value, the scene type corresponding to the data frame to be coded is a first continuous motion scene; the second preset motion complexity value is greater than or equal to the first preset motion complexity value;
when the first complexity and the second complexity are both greater than a third preset motion complexity value, the scene type corresponding to the data frame to be coded is a second continuous motion scene; the third preset motion complexity value is greater than or equal to the second preset motion complexity value;
when the first complexity and the second complexity are both greater than a second preset motion complexity value and less than or equal to a third preset motion complexity value, the scene type corresponding to the data frame to be coded is a third continuous motion scene;
the coding parameters corresponding to the first continuous motion scene comprise a small motion coding frame rate (represented by fps 3) and a small motion coding rate (represented by Enc 3); the coding parameters corresponding to the second continuous motion scene comprise a large motion coding frame rate (represented by fps 1) and a large motion coding rate (represented by Enc 1); the coding parameters corresponding to the third continuous motion scene comprise a medium motion coding frame rate (represented by fps 2) and a medium motion coding code rate (represented by Enc 2); the small motion coding frame rate is greater than or equal to the medium motion coding frame rate, and the medium motion coding frame rate is greater than or equal to the large motion coding frame rate; the large motion coding rate is greater than or equal to the medium motion coding rate, and the medium motion coding rate is greater than or equal to the small motion coding rate; the coding parameters corresponding to the moving-to-static scene comprise a second coding frame rate (represented by fps 4) and a second coding rate (represented by Enc 4), wherein the second coding frame rate is greater than or equal to the large-moving coding frame rate, and the second coding rate is greater than or equal to the small-moving coding rate; the coding parameters corresponding to the static motion scene comprise a third coding frame rate (represented by fps 5) and a third coding rate (represented by Enc 5), wherein the third coding frame rate is greater than or equal to the large motion coding frame rate, and the third coding rate is greater than or equal to the small motion coding rate. That is, fps3 is not less than fps2 not less than fps1, fps4 not less than fps1, and fps5 not less than fps 1; enc1 is more than or equal to Enc2 is more than or equal to Enc3, Enc4 is more than or equal to Enc3, and Enc5 is more than or equal to Enc 3.
In a set of tests applied to wireless screen projection, the sharpness of a continuous static scene improved from 1030 MTF50P to 1048 MTF50P while the code rate was reduced from 1326 kbps to 410 kbps; in motion scenes, the strategy of determining the corresponding coding parameters according to the scene was adopted. MTF50P denotes the spatial frequency at which the MTF (Modulation Transfer Function) falls to 50% of its peak value, and is a parameter for measuring image sharpness. Therefore, when the method is applied to wireless screen projection, it can adapt to a variety of scenes, ensuring that the user sees a clear and smooth picture while saving unnecessary bandwidth.
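As an aside on the metric, the MTF50P value mentioned above can be read off a measured MTF curve by locating where the curve falls to half of its peak. The interpolation routine below is only an illustrative sketch of that reading, not part of the test setup described here.

```python
def mtf50p(frequencies, mtf_values):
    """Return the spatial frequency at which the MTF curve first falls to 50% of its peak."""
    peak = max(mtf_values)
    target = 0.5 * peak
    for i in range(1, len(mtf_values)):
        lo, hi = mtf_values[i - 1], mtf_values[i]
        if lo >= target >= hi:  # the curve crosses 50% of the peak in this interval
            t = (lo - target) / (lo - hi) if lo != hi else 0.0
            return frequencies[i - 1] + t * (frequencies[i] - frequencies[i - 1])
    return frequencies[-1]      # never dropped to 50% of the peak over the measured range


# Example with a monotonically decreasing MTF curve sampled at four frequencies.
print(mtf50p([0, 500, 1000, 1500], [1.0, 0.8, 0.45, 0.2]))  # about 928.6
```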
It should be understood that although the steps in the flowcharts of fig. 2, 3 and 4 are shown in sequence as indicated by the arrows, they are not necessarily performed in that sequence. Unless explicitly stated otherwise herein, there is no strict order restriction on the execution of these steps, and they may be performed in other orders. Moreover, at least some of the steps in fig. 2, 3 and 4 may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and which are not necessarily performed sequentially but may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 5, there is provided an encoding data processing apparatus operating in the server 106 or the first terminal 102 in fig. 1, including:
a data frame obtaining module 502, configured to obtain a data frame to be encoded;
a complexity determining module 504, configured to determine a complexity of the data frame to be encoded;
a coding parameter determining module 506, configured to determine a coding parameter corresponding to the frame to be coded according to the first complexity and the second complexity; the first complexity is the complexity of the data frame to be coded, and the second complexity is the complexity of the preorder data frame; the preamble data frame is at least one frame data frame which is continuous with the data frame to be encoded and is before the data frame to be encoded.
Based on the encoded data processing apparatus of this embodiment, a data frame to be encoded is acquired; the complexity of the data frame to be encoded is determined; and the encoding parameter corresponding to the data frame to be encoded is determined according to the first complexity and the second complexity, where the first complexity is the complexity of the data frame to be encoded, the second complexity is the complexity of the preamble data frame, and the preamble data frame is at least one data frame that is continuous with and precedes the data frame to be encoded. The encoding parameters are thus determined according to the complexity of the data frame to be encoded and the complexity of the preamble data frame, so that different first and second complexities can correspond to different encoding parameters. The code stream obtained by encoding with the resulting encoding parameters can therefore be adapted to different scenes, improving the adaptability of encoded data processing.
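Purely as an organizational sketch (the class and method names are assumptions, and the preamble data frame is simplified here to the single immediately preceding frame), the three modules could be wired together as follows.

```python
class EncodedDataProcessor:
    """Wires together a data frame obtaining module, a complexity determining module and an
    encoding parameter determining module, tracking the preceding frame's complexity."""

    def __init__(self, acquire_frame, compute_complexity, select_parameters):
        self.acquire_frame = acquire_frame            # data frame obtaining module
        self.compute_complexity = compute_complexity  # complexity determining module
        self.select_parameters = select_parameters    # encoding parameter determining module
        self.previous_complexity = None               # complexity of the preceding data frame

    def step(self):
        frame = self.acquire_frame()
        first_complexity = self.compute_complexity(frame)  # complexity of the frame to be encoded
        second_complexity = self.previous_complexity       # complexity of the preceding data frame
        params = None
        if second_complexity is not None:
            params = self.select_parameters(first_complexity, second_complexity)
        self.previous_complexity = first_complexity
        return frame, params
```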
In one embodiment, the system further comprises a scene determination module;
a scene determining module, configured to determine a scene type corresponding to the data frame to be encoded according to the first complexity and the second complexity;
and the coding parameter determining module is used for determining the coding parameters corresponding to the data frame to be coded based on the scene type corresponding to the data frame to be coded.
In an embodiment of the present invention, the scene determining module is configured to determine that a scene type corresponding to the data frame to be encoded is a motion scene when at least one of the first complexity and the second complexity is greater than a first preset motion complexity value;
and the encoding parameter determining module is used for determining the encoding parameter corresponding to the frame to be encoded to be equal to the encoding parameter corresponding to the scene type corresponding to the frame to be encoded when the scene type corresponding to the frame to be encoded is a motion scene.
In an embodiment of the disclosure, the scene determining module is configured to determine that a scene type corresponding to the data frame to be encoded is a continuous motion scene when the first complexity and the second complexity are both greater than the first preset motion complexity value.
In an embodiment, the scene determining module is further configured to determine that the scene type corresponding to the data frame to be encoded is a static-to-motion scene when the first complexity is greater than the first preset motion complexity value and the second complexity is equal to or less than a preset static complexity value; and/or to determine that the scene type corresponding to the data frame to be encoded is a motion-to-static scene when the second complexity is greater than the first preset motion complexity value and the first complexity is equal to or less than the preset static complexity value; wherein the first preset motion complexity value is greater than or equal to the preset static complexity value.
In one embodiment, the encoding parameters corresponding to the continuous motion scene include a first encoding frame rate and a first encoding rate; the coding parameters corresponding to the motion-to-static scene comprise a second coding frame rate and a second coding rate, where the second coding frame rate is greater than or equal to the first coding frame rate and the second coding rate is greater than or equal to the first coding rate; and/or the coding parameters corresponding to the static-to-motion scene comprise a third coding frame rate and a third coding rate, where the third coding frame rate is greater than or equal to the first coding frame rate and the third coding rate is greater than or equal to the first coding rate.
In one embodiment, the continuous motion scene includes at least a first continuous motion scene and a second continuous motion scene;
when the first complexity and the second complexity are both greater than the first preset motion complexity value and less than or equal to a second preset motion complexity value, the scene type corresponding to the data frame to be coded is a first continuous motion scene; the second preset motion complexity value is greater than or equal to the first preset motion complexity value;
when the first complexity and the second complexity are both greater than a third preset motion complexity value, the scene type corresponding to the data frame to be coded is a second continuous motion scene; the third preset motion complexity value is greater than or equal to the second preset motion complexity value.
In one embodiment, the continuous motion scene further includes a third continuous motion scene;
and when the first complexity and the second complexity are both greater than the second preset motion complexity value and less than or equal to the third preset motion complexity value, the scene type corresponding to the data frame to be coded is a third continuous motion scene.
In one embodiment, the coding parameters corresponding to the first continuous motion scene include a small motion coding frame rate and a small motion coding code rate; the coding parameters corresponding to the second continuous motion scene comprise a large motion coding frame rate and a large motion coding code rate; the coding parameters corresponding to the third continuous motion scene comprise a medium motion coding frame rate and a medium motion coding code rate; the small motion coding frame rate is greater than or equal to the medium motion coding frame rate, and the medium motion coding frame rate is greater than or equal to the large motion coding frame rate; the large motion coding rate is greater than or equal to the medium motion coding rate, and the medium motion coding rate is greater than or equal to the small motion coding rate.
In an embodiment of the present invention, the scene determining module is configured to determine that a scene type corresponding to the data frame to be encoded is a continuous static scene when the first complexity and the second complexity are both equal to or smaller than a preset static complexity value;
and the coding parameter determining module is used for returning to the step of acquiring the frame to be coded if the frame to be coded meets a static frame skipping condition when the scene type corresponding to the frame to be coded is a continuous static scene.
In one embodiment, when the content of the data frame to be encoded is the same as that of a previous data frame, and the time interval between the data frame to be encoded and the previous data frame is less than or equal to a preset time interval, the static frame skipping condition is satisfied.
In an embodiment of the present invention, the encoding parameter determining module is further configured to determine, when the scene type corresponding to the data frame to be encoded is a continuous static scene, if the data frame to be encoded does not satisfy the static frame skipping condition, an encoding parameter corresponding to the data frame to be encoded is equal to an encoding parameter corresponding to a previous data frame.
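A minimal sketch of the static frame-skipping decision described above follows; the hash-based content comparison and the specific interval value are assumptions made only for illustration.

```python
import hashlib

MAX_SKIP_INTERVAL = 0.5  # hypothetical preset time interval, in seconds


def frame_digest(frame_bytes):
    """Cheap stand-in for checking that the content equals that of the previous data frame."""
    return hashlib.md5(frame_bytes).digest()


def should_skip(frame_bytes, frame_time, prev_digest, prev_time):
    """In a continuous static scene, skip encoding when the content matches the previous
    frame and the elapsed time does not exceed the preset interval."""
    same_content = frame_digest(frame_bytes) == prev_digest
    within_interval = (frame_time - prev_time) <= MAX_SKIP_INTERVAL
    return same_content and within_interval
```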
In one embodiment, the scene determining module is configured to determine, when the first complexity and the second complexity meet a preset condition, a scene type corresponding to the data frame to be encoded according to the preset condition that is met; and when the first complexity and the second complexity do not meet a preset condition, determining that the scene type corresponding to the data frame to be encoded is the scene type of a data frame before the data frame to be encoded.
In an embodiment of the disclosure, when both the first complexity and the second complexity of a first to-be-encoded data frame and a second to-be-encoded data frame are greater than a first preset motion complexity value, if the first complexity of the first to-be-encoded data frame is greater than or equal to the first complexity of the second to-be-encoded data frame and the second complexity of the first to-be-encoded data frame is greater than or equal to the second complexity of the second to-be-encoded data frame, an encoding rate of the first to-be-encoded data frame is greater than or equal to an encoding rate of the second to-be-encoded data frame, and an encoding frame rate of the second to-be-encoded data frame is greater than or equal to an encoding frame rate of the first to-be-encoded data frame.
In one embodiment, the data frame acquiring module is configured to acquire a frame of data to be encoded based on a screen image.
In one embodiment, the apparatus further comprises:
and the data coding module is used for coding the data frame to be coded according to the coding parameters corresponding to the data frame to be coded to obtain a coding code stream.
In one embodiment, the apparatus further comprises: a screen projection instruction receiving module and a coding code stream sending module;
the screen projection instruction receiving module is used for receiving a wireless screen projection instruction before the data frame acquiring module acquires the data frame to be encoded, wherein the wireless screen projection instruction carries a target terminal identifier;
and the coding code stream sending module is used for sending the coding code stream according to the target terminal identification.
In one embodiment, a computer device is provided, which may be a server, and its internal structure diagram may be as shown in fig. 6. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an encoded data processing method.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 7. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an encoded data processing method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the configurations shown in fig. 6 and 7 are merely block diagrams of partial structures relevant to the present disclosure and do not constitute a limitation on the computer devices to which the present disclosure is applied; a particular computer device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, which may be a server or a terminal, and its internal structure diagram may be as shown in fig. 6 or 7. The computer device comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the steps of the coding data processing method when executing the computer program.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the above-mentioned encoded data processing method.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments can be implemented by a computer program instructing relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus Direct RAM (RDRAM), Direct Rambus Dynamic RAM (DRDRAM), and Rambus Dynamic RAM (RDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not every possible combination of these technical features has been described; however, as long as a combination of these technical features involves no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the patent. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (15)

1. A method of encoding data processing, the method comprising:
acquiring a data frame to be encoded;
determining the complexity of the data frame to be coded;
determining a coding parameter corresponding to the data frame to be coded according to the first complexity and the second complexity; the first complexity is the complexity of the data frame to be coded, and the second complexity is the complexity of the preorder data frame; the preamble data frame is at least one frame data frame which is continuous with the data frame to be encoded and is before the data frame to be encoded.
2. The method according to claim 1, wherein the determining the encoding parameter corresponding to the data frame to be encoded according to the first complexity and the second complexity comprises:
determining a scene type corresponding to the data frame to be encoded according to the first complexity and the second complexity;
and determining the coding parameters corresponding to the data frame to be coded based on the scene type corresponding to the data frame to be coded.
3. The method of claim 2, wherein:
determining the scene type corresponding to the data frame to be encoded according to the first complexity and the second complexity, including: when at least one of the first complexity and the second complexity is larger than a first preset motion complexity value, determining that the scene type corresponding to the data frame to be coded is a motion scene;
determining the encoding parameters corresponding to the data frame to be encoded based on the scene type corresponding to the data frame to be encoded, including: and when the scene type corresponding to the data frame to be encoded is a motion scene, determining the encoding parameter corresponding to the data frame to be encoded to be equal to the encoding parameter corresponding to the scene type corresponding to the data frame to be encoded.
4. The method according to claim 3, wherein the determining that the scene type corresponding to the data frame to be encoded is a motion scene when at least one of the first complexity level and the second complexity level is greater than a first preset motion complexity value comprises:
and when the first complexity and the second complexity are both greater than the first preset motion complexity value, determining that the scene type corresponding to the data frame to be coded is a continuous motion scene.
5. The method according to claim 4, wherein the determining that the scene type corresponding to the data frame to be encoded is a motion scene when at least one of the first complexity level and the second complexity level is greater than a first preset motion complexity value further comprises:
when the first complexity is larger than the first preset motion complexity value and the second complexity is equal to or smaller than a preset static complexity value, determining that the scene type corresponding to the data frame to be coded is a static-to-motion scene;
and/or,
when the second complexity is greater than the first preset motion complexity value and the first complexity is equal to or less than the preset static complexity value, determining that the scene type corresponding to the data frame to be coded is a motion-to-static scene;
wherein the first preset motion complexity value is greater than or equal to the preset static complexity value.
6. The method of claim 5, wherein the coding parameters corresponding to the continuous motion scene comprise a first coding frame rate and a first coding rate; and,
the coding parameters corresponding to the moving-to-static scene comprise a second coding frame rate and a second coding rate, wherein the second coding frame rate is greater than or equal to the first coding frame rate, and the second coding rate is greater than or equal to the first coding rate;
and/or,
the coding parameters corresponding to the static-to-motion scene comprise a third coding frame rate and a third coding rate, the third coding frame rate is greater than or equal to the first coding frame rate, and the third coding rate is greater than or equal to the first coding rate.
7. The method of claim 4, wherein the continuous motion scene comprises at least a first continuous motion scene and a second continuous motion scene;
when the first complexity and the second complexity are both greater than the first preset motion complexity value and less than or equal to a second preset motion complexity value, the scene type corresponding to the data frame to be coded is the first continuous motion scene; the second preset motion complexity value is greater than or equal to the first preset motion complexity value;
when the first complexity and the second complexity are both greater than a third preset motion complexity value, the scene type corresponding to the data frame to be coded is the second continuous motion scene; the third preset motion complexity value is greater than or equal to the second preset motion complexity value.
8. The method of claim 7, wherein the continuous motion scene further comprises a third continuous motion scene;
and when the first complexity and the second complexity are both greater than the second preset motion complexity value and less than or equal to the third preset motion complexity value, the scene type corresponding to the data frame to be coded is the third continuous motion scene.
9. The method of claim 8, wherein: the coding parameters corresponding to the first continuous motion scene comprise a small motion coding frame rate and a small motion coding code rate; the coding parameters corresponding to the second continuous motion scene comprise a large motion coding frame rate and a large motion coding code rate; the coding parameters corresponding to the third continuous motion scene comprise a medium motion coding frame rate and a medium motion coding code rate; the small motion coding frame rate is greater than or equal to the medium motion coding frame rate, and the medium motion coding frame rate is greater than or equal to the large motion coding frame rate; the large motion coding rate is greater than or equal to the medium motion coding rate, and the medium motion coding rate is greater than or equal to the small motion coding rate.
10. The method of claim 2, wherein:
determining the scene type corresponding to the data frame to be encoded according to the first complexity and the second complexity, including: when the first complexity and the second complexity are both equal to or less than a preset static complexity value, determining that the scene type corresponding to the data frame to be encoded is a continuous static scene;
determining the encoding parameters corresponding to the data frame to be encoded based on the scene type corresponding to the data frame to be encoded, including: when the scene type corresponding to the data frame to be encoded is a continuous static scene, if the data frame to be encoded meets a static frame skipping condition, returning to the step of acquiring the data frame to be encoded;
and when the content of the frame to be encoded is the same as that of the previous encoded data frame, and the time interval between the frame to be encoded and the previous encoded data frame is less than or equal to a preset time interval, the static frame skipping condition is met.
11. The method according to claim 10, wherein the determining the encoding parameters corresponding to the data frame to be encoded based on the scene type corresponding to the data frame to be encoded further comprises:
when the scene type corresponding to the data frame to be encoded is a continuous static scene, if the data frame to be encoded does not meet the static frame skipping condition, determining the encoding parameter corresponding to the data frame to be encoded to be equal to the encoding parameter corresponding to the previous data frame.
12. The method according to claim 2, wherein the determining the scene type corresponding to the data frame to be encoded according to the first complexity and the second complexity comprises:
when the first complexity degree and the second complexity degree meet preset conditions, determining a scene type corresponding to the data frame to be coded according to the met preset conditions;
and when the first complexity and the second complexity do not meet a preset condition, determining that the scene type corresponding to the data frame to be encoded is the scene type of a data frame before the data frame to be encoded.
13. An encoding data processing apparatus, the apparatus comprising:
the data frame acquisition module is used for acquiring a data frame to be encoded;
the complexity determining module is used for determining the complexity of the data frame to be coded;
the encoding parameter determining module is used for determining the encoding parameters corresponding to the data frame to be encoded according to the first complexity and the second complexity; the first complexity is the complexity of the data frame to be coded, and the second complexity is the complexity of the preorder data frame; the preamble data frame is at least one frame data frame which is continuous with the data frame to be encoded and is before the data frame to be encoded.
14. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method according to any of claims 1 to 12.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 12.
CN201811353206.7A 2018-11-14 2018-11-14 Encoded data processing method, apparatus, computer device and storage medium Active CN111193926B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811353206.7A CN111193926B (en) 2018-11-14 2018-11-14 Encoded data processing method, apparatus, computer device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811353206.7A CN111193926B (en) 2018-11-14 2018-11-14 Encoded data processing method, apparatus, computer device and storage medium

Publications (2)

Publication Number Publication Date
CN111193926A true CN111193926A (en) 2020-05-22
CN111193926B CN111193926B (en) 2022-10-25

Family

ID=70710525

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811353206.7A Active CN111193926B (en) 2018-11-14 2018-11-14 Encoded data processing method, apparatus, computer device and storage medium

Country Status (1)

Country Link
CN (1) CN111193926B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100111162A1 (en) * 2008-10-30 2010-05-06 Vixs Systems, Inc. Video transcoding system with drastic scene change detection and method for use therewith
CN102572380A (en) * 2010-12-29 2012-07-11 中国移动通信集团公司 Video monitoring coding method and device
CN102625106A (en) * 2012-03-28 2012-08-01 上海交通大学 Scene self-adaptive screen encoding rate control method and system
US20140153639A1 (en) * 2012-12-03 2014-06-05 Vixs Systems, Inc. Video encoding system with adaptive hierarchical b-frames and method for use therewith
CN105187832A (en) * 2015-09-09 2015-12-23 成都金本华电子有限公司 Mobile video code rate control method based on 2.5G wireless network
CN105847805A (en) * 2016-03-30 2016-08-10 乐视控股(北京)有限公司 Sliding window-based code rate control method and apparatus
CN105959700A (en) * 2016-05-31 2016-09-21 腾讯科技(深圳)有限公司 Video image coding method and device
CN106657998A (en) * 2016-09-20 2017-05-10 杭州比特瑞旺电脑有限公司 KVM video coding quantization parameter range control method
CN107155107A (en) * 2017-03-21 2017-09-12 腾讯科技(深圳)有限公司 Method for video coding and device, video encoding/decoding method and device
CN107820087A (en) * 2017-10-26 2018-03-20 济南中维世纪科技有限公司 According to the method for mobile detection result dynamic regulation code check

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111901603A (en) * 2020-07-28 2020-11-06 上海工程技术大学 Encoding method and decoding method for static background video
CN111901603B (en) * 2020-07-28 2023-06-02 上海工程技术大学 Coding method and decoding method for static background video

Also Published As

Publication number Publication date
CN111193926B (en) 2022-10-25

Similar Documents

Publication Publication Date Title
JP6899448B2 (en) Video coding process, computer equipment and computer programs
CN111193927B (en) Encoded data processing method, apparatus, computer device and storage medium
US11206405B2 (en) Video encoding method and apparatus, video decoding method and apparatus, computer device, and storage medium
US11109038B2 (en) Intra-coded frame rate allocation method, computer device and storage medium
JP5284471B2 (en) Method for encoding a series of digitized images
CN109788316B (en) Code rate control method and device, video transcoding method and device, computer equipment and storage medium
CN108012163B (en) Code rate control method and device for video coding
CN110392284B (en) Video encoding method, video data processing method, video encoding apparatus, video data processing apparatus, computer device, and storage medium
CN106534859B (en) Image transmission method and device based on SPICE protocol
CN111193924B (en) Method and device for determining video code rate, computer equipment and storage medium
CN110248192B (en) Encoder switching method, decoder switching method, screen sharing method and screen sharing system
US9258622B2 (en) Method of accessing a spatio-temporal part of a video sequence of images
JP2016111699A (en) Method and device for real-time encoding
CN111193926B (en) Encoded data processing method, apparatus, computer device and storage medium
US10536726B2 (en) Pixel patch collection for prediction in video coding system
WO2018076370A1 (en) Video frame processing method and device
US10735773B2 (en) Video coding techniques for high quality coding of low motion content
CN114374841A (en) Optimization method and device for video coding rate control and electronic equipment
CN112351282A (en) Image data transmission method and device, nonvolatile storage medium and processor
US9451288B2 (en) Inferred key frames for fast initiation of video coding sessions
WO2024078403A1 (en) Image processing method and apparatus, and device
WO2020181540A1 (en) Video processing method and device, encoding apparatus, and decoding apparatus
CN117425009A (en) Bit number distribution method, device, equipment and storage medium in video coding
CN116866604A (en) Image processing method and device
CN117812268A (en) Video transcoding method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant