CN110381311B - Video frame encoding method, video frame encoding device, computer readable medium and electronic equipment - Google Patents


Info

Publication number
CN110381311B
CN110381311B (application CN201910586661.XA)
Authority
CN
China
Prior art keywords
coding
encoding
decision process
coding unit
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910586661.XA
Other languages
Chinese (zh)
Other versions
CN110381311A (en)
Inventor
张清
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201910586661.XA priority Critical patent/CN110381311B/en
Publication of CN110381311A publication Critical patent/CN110381311A/en
Application granted granted Critical
Publication of CN110381311B publication Critical patent/CN110381311B/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/103: Selection of coding mode or of prediction mode
    • H04N19/85: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

Embodiments of the present application provide a video frame encoding method, an apparatus, a computer readable medium, and an electronic device. The video frame encoding method comprises the following steps: determining an encoding region where an encoding unit in a video frame to be encoded is located, wherein the encoding region is obtained by dividing the video frame to be encoded; performing, on the coding unit, a coding mode decision process corresponding to the type of the coding region where the coding unit is located; if the coding mode decision process performed on the coding unit shows that the coding effect of a target coding mode on the coding unit reaches a set effect, skipping the decision processes of the other coding modes; and encoding the coding unit through the target coding mode. The technical scheme can reduce the computational complexity of the coding mode decision process, thereby shortening the encoding time of a video frame and improving the real-time performance of video frame encoding.

Description

Video frame encoding method, video frame encoding device, computer readable medium and electronic equipment
Technical Field
The present invention relates to the field of computers and communication technologies, and in particular, to a video frame encoding method, apparatus, computer readable medium and electronic device.
Background
The block-based video coding schemes proposed in the related art mainly perform, for a given CU (Coding Unit), decisions among modes such as normal inter prediction (if the slice containing the CU is an inter-predicted slice), intra prediction, Intra Block Copy (if the IBC (Intra Block Copy) tool is enabled), and Palette coding (if the PLT (Palette Mode) tool is enabled), and then find the best coding mode for the current CU by comparing the Rate-Distortion Cost (RD Cost) of all candidate coding modes, so as to achieve compression. However, this approach has high computational complexity and therefore poor real-time performance when encoding video frames.
Disclosure of Invention
The embodiment of the application provides a video frame coding method, a video frame coding device, a computer readable medium and electronic equipment, so that the calculation complexity of a coding mode decision process can be reduced at least to a certain extent, the coding time of a video frame is shortened, and the real-time performance of video frame coding is improved.
Other features and advantages of the present application will be apparent from the following detailed description, or may be learned in part by the practice of the application.
According to an aspect of an embodiment of the present application, there is provided a method for encoding a video frame, including: determining an encoding region where an encoding unit in a video frame to be encoded is located, wherein the encoding region is obtained by dividing the video frame to be encoded; according to the type of the coding region where the coding unit is located, performing a coding mode decision process corresponding to the type of the coding region on the coding unit; if the coding mode decision process performed on the coding unit shows that the coding effect of the target coding mode on the coding unit reaches the set effect, skipping the decision process of other coding modes; and carrying out coding processing on the coding unit through the target coding mode.
According to an aspect of an embodiment of the present application, there is provided an encoding apparatus of a video frame, including: the determining unit is used for determining an encoding area where the encoding unit in the video frame to be encoded is located, wherein the encoding area is obtained by dividing the video frame to be encoded; the processing unit is used for carrying out a coding mode decision process corresponding to the type of the coding region on the coding unit according to the type of the coding region where the coding unit is located, wherein if the coding mode decision process carried out on the coding unit indicates that the coding effect of the target coding mode on the coding unit reaches a set effect, the decision process of other coding modes is skipped; and the coding unit is used for carrying out coding processing on the coding unit through the target coding mode.
In some embodiments of the present application, based on the foregoing solution, the processing unit is configured to: and if the type of the coding region where the coding unit is located is an intra-frame coding type, performing a directional intra-frame prediction mode decision process on at least the coding unit.
In some embodiments of the present application, based on the foregoing solution, the processing unit is configured to: calculating Discrete Cosine Transform (DCT) coefficients corresponding to an optimal prediction mode of the directional intra-frame prediction by the coding unit, if the proportion of non-zero coefficients in the DCT coefficients corresponding to the optimal prediction mode of the directional intra-frame prediction is smaller than a proportion threshold value, determining that the coding effect of the directional intra-frame prediction mode on the coding unit reaches a set effect, and determining the directional intra-frame prediction mode as the target coding mode.
In some embodiments of the present application, based on the foregoing scheme, the proportion threshold is positively correlated with the residual coefficient of the directional intra-prediction mode.
In some embodiments of the present application, based on the foregoing solution, the processing unit is configured to: calculate the rate distortion cost corresponding to the optimal directional intra-frame prediction mode of the coding unit; if that cost is smaller than a first threshold, determine that the coding effect of the directional intra-frame prediction mode on the coding unit reaches the set effect, and determine the directional intra-frame prediction mode as the target coding mode.
In some embodiments of the present application, based on the foregoing solution, the processing unit is configured to: and if the type of the coding region where the coding unit is located is not the intra-frame coding type, at least one of an inter-frame prediction mode decision process and a directional intra-frame prediction mode decision process is carried out on the coding unit.
In some embodiments of the present application, based on the foregoing solution, the processing unit is configured to: calculate the rate distortion cost corresponding to the optimal inter-frame prediction mode of the coding unit; if that cost is smaller than a second threshold, determine that the coding effect of the inter-frame prediction mode on the coding unit reaches the set effect, and determine the inter-frame prediction mode as the target coding mode.
In some embodiments of the present application, based on the foregoing scheme, the processing unit performs at least one of an inter prediction mode decision process and a directional intra prediction mode decision process on the coding unit, including: firstly, carrying out an inter-frame prediction mode decision process on the coding unit, and then carrying out a directional intra-frame prediction mode decision process on the coding unit; or firstly, carrying out a directional intra-frame prediction mode decision process on the coding unit, and then carrying out an inter-frame prediction mode decision process on the coding unit.
In some embodiments of the present application, based on the foregoing solution, the processing unit is configured to: and sequentially carrying out each coding mode decision process corresponding to the type of the coding region on the coding units according to a preset sequence.
In some embodiments of the present application, based on the foregoing scheme, the coding region includes a Slice or Tile.
In some embodiments of the present application, based on the foregoing scheme, the video frames to be encoded comprise video frames generated from electronic device screen content, and the other encoding modes comprise the palette encoding mode.
In some embodiments of the present application, based on the foregoing solution, in a case where the optimal coding mode decision process of the coding unit is bottom-up, the processing unit is further configured to: and determining whether to skip the decision process of the palette coding mode for the coding unit according to the optimal coding prediction mode corresponding to each sub coding unit of the coding unit.
In some embodiments of the present application, based on the foregoing solution, the processing unit is configured to: if the optimal coding prediction mode of a predetermined number of sub-coding units of the coding unit is not the palette coding mode, determining to skip a decision process of the palette coding mode for the coding unit.
According to an aspect of the embodiments of the present application, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements a method of encoding video frames as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic device including: one or more processors; and a storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of encoding video frames as described in the above embodiments.
In the technical schemes provided by some embodiments of the present application, a coding mode decision process corresponding to the type of the coding region is performed on the coding unit according to the type of the coding region where the coding unit is located; when the decision process indicates that the coding effect of a target coding mode on the coding unit reaches the set effect, the decision processes of the other coding modes are skipped and the coding unit is encoded through the target coding mode. In this way, once a coding mode is determined to encode the coding unit well enough, the decision processes of the remaining coding modes can be omitted, so that the computational complexity of the coding mode decision process is effectively reduced while the coding effect is ensured, the encoding time of video frames is shortened, and the real-time performance of video frame encoding is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application. It is apparent that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art. In the drawings:
FIG. 1 shows a schematic diagram of an exemplary system architecture to which the technical solutions of embodiments of the present application may be applied;
FIG. 2 illustrates a flow chart of a method of encoding a video frame according to one embodiment of the present application;
FIG. 3 illustrates a flow chart of a method of fast encoding of video frames according to one embodiment of the present application;
FIG. 4 illustrates a block diagram of an encoding device for video frames according to one embodiment of the present application;
fig. 5 shows a schematic diagram of a computer system suitable for use in implementing the electronic device of the embodiments of the present application.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present application. One skilled in the relevant art will recognize, however, that the aspects of the application can be practiced without one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known methods, devices, implementations, or operations are not shown or described in detail to avoid obscuring aspects of the application.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
Fig. 1 shows a schematic diagram of an exemplary system architecture to which the technical solutions of the embodiments of the present application may be applied.
As shown in fig. 1, the system architecture may include a terminal device (such as one or more of the smartphone 101, tablet 102, and portable computer 103 shown in fig. 1, but of course, a desktop computer, etc.), a network 104, and a server 105. The network 104 is the medium used to provide communication links between the terminal devices and the server 105. The network 104 may include various connection types, such as wired communication links, wireless communication links, and the like.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, the server 105 may be a server cluster formed by a plurality of servers.
In an embodiment of the present application, the terminal device may obtain the video to be encoded from the server 105 through the network 104. After obtaining the video, the terminal device may determine the coding region where a coding unit of a video frame to be encoded is located; the coding region may be a slice (Slice), a tile (Tile, a rectangular region), or the like. It may then determine the type of that coding region, for example an intra-frame coding type or an inter-frame coding type, and perform on the coding unit a coding mode decision process corresponding to that type. For example, if the type of the coding region where the coding unit is located is an intra-frame coding type, a directional intra-frame prediction mode decision process may be performed; if the type is an inter-frame coding type, an inter-frame prediction mode decision process, a directional intra-frame prediction mode decision process, and the like may be performed.
In one embodiment of the present application, when performing a coding mode decision process corresponding to the type of the coding region on the coding unit, if the coding mode decision process performed on the coding unit indicates that the coding effect of the target coding mode on the coding unit can reach the set effect, the decision process of other coding modes may be skipped, and then the coding process is performed on the coding unit directly through the target coding mode. Therefore, the technical scheme of the embodiment of the application can ensure the coding effect without carrying out decision processes of all coding modes, thereby effectively reducing the calculation complexity of the coding mode decision process and being beneficial to improving the coding efficiency of video frames.
It should be noted that, the method for encoding a video frame provided in the embodiment of the present application may be performed by a terminal device, and accordingly, the apparatus for encoding a video frame may be disposed in the terminal device. However, in other embodiments of the present application, the server 105 may also have similar functions as the terminal device, so as to perform the encoding scheme of the video frame provided in the embodiments of the present application.
The implementation details of the technical solutions of the embodiments of the present application are described in detail below:
fig. 2 shows a flow chart of a method of encoding a video frame according to an embodiment of the present application, which may be performed by a device having a calculation processing function, such as the terminal device shown in fig. 1. Referring to fig. 2, the method for encoding a video frame at least includes steps S210 to S240, which are described in detail as follows:
in step S210, an encoding region where an encoding unit in a video frame to be encoded is located is determined, where the encoding region is obtained by dividing the video frame to be encoded.
In one embodiment of the present application, the video frame to be encoded may be a video frame generated from the screen content of an electronic device, such as a frame of a shared screen-content picture. A slice (Slice) consists of a series of Coding Tree Units (CTUs), each containing Coding Units (CUs).
In one embodiment of the present application, the coding region may be a Slice (Slice) or a Tile (Tile is a rectangular region), and the type of the coding region may be an intra-coding type, an inter-coding type, or the like. For example, if the encoded region is a Slice, then the Slice type may be an intra-coded Slice type (i.e., I Slice), an inter-coded Slice type, etc., where inter-coded Slice types include a unidirectional inter-coded Slice type (i.e., P Slice) and a bidirectional inter-coded Slice type (i.e., B Slice).
With continued reference to fig. 2, in step S220, according to the type of the coding region in which the coding unit is located, a coding mode decision process corresponding to the type of the coding region is performed on the coding unit.
In one embodiment of the present application, if the type of the coding region in which the coding unit is located is an intra-frame coding type, at least the directional intra-prediction mode decision process is performed on the coding unit. I.e. the coding mode decision process corresponding to the intra coding type comprises at least a directional intra prediction mode decision process. Of course, intra block copy mode, palette coding mode, etc. may also be included.
In one embodiment of the present application, performing a directional intra-prediction mode decision process on a coding unit may include: calculating the Discrete Cosine Transform (DCT) coefficients corresponding to the optimal directional intra-prediction mode of the coding unit. Alternatively, the rate distortion cost corresponding to the optimal directional intra-prediction mode of the coding unit may be calculated.
In one embodiment of the present application, if the type of coding region in which the coding unit is located is not an intra-coding type (e.g., may be an inter-coding type), at least one of an inter-prediction mode decision process and a directional intra-prediction mode decision process is performed on the coding unit. I.e. the coding mode decision process corresponding to the non-intra coding type comprises at least one of an inter prediction mode decision process and a directional intra prediction mode decision process. Of course, intra block copy mode, palette coding mode, etc. may also be included.
In one embodiment of the present application, if the coding mode decision process corresponding to the type of the coding region includes a plurality of coding mode decision processes, each coding mode decision process corresponding to the type of the coding region may be sequentially performed on the coding units according to a predetermined order, so that when it is determined that the coding effect of a certain coding mode on the coding units is better, the decision process of other coding modes may be skipped, and further, the computational complexity may be reduced and the coding efficiency may be improved. For example, if an inter-prediction mode decision process and a directional intra-prediction mode decision process are required to be performed on the coding unit, the inter-prediction mode decision process may be performed on the coding unit first, and then the directional intra-prediction mode decision process may be performed on the coding unit; or the directional intra-frame prediction mode decision process can be performed on the coding unit first, and then the inter-frame prediction mode decision process can be performed on the coding unit.
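The predetermined-order decision with early exit described above can be sketched as follows. The step representation (a list of name/function pairs, each returning an RD cost and an "effect reached" flag) is an assumption made for the illustration, not the patent's interface.

```python
def choose_coding_mode(cu, decision_steps):
    """Run the coding mode decision processes for `cu` in a predetermined
    order. Each step is a (mode_name, decide_fn) pair where decide_fn(cu)
    returns (rd_cost, reaches_set_effect). As soon as one mode reaches the
    set effect, the remaining decision processes are skipped; otherwise the
    mode with the lowest RD cost among those tried is chosen."""
    best_mode, best_cost = None, float("inf")
    for mode_name, decide_fn in decision_steps:
        rd_cost, reaches_set_effect = decide_fn(cu)
        if rd_cost < best_cost:
            best_mode, best_cost = mode_name, rd_cost
        if reaches_set_effect:
            return mode_name  # target mode found: skip remaining decisions
    return best_mode
```

With steps ordered, say, inter first and directional intra second, a good-enough intra result means the palette (or any later) decision never runs, which is exactly the complexity saving the embodiment describes.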
In one embodiment of the present application, performing an inter prediction mode decision process on a coding unit may include: and calculating the rate distortion cost corresponding to the optimal prediction mode of the inter-frame prediction by the coding unit.
With continued reference to fig. 2, in step S230, if the coding mode decision process performed on the coding unit indicates that the coding effect of the target coding mode on the coding unit reaches the set effect, the decision process of other coding modes is skipped.
In one embodiment of the present application, if the DCT coefficients corresponding to the optimal directional intra-prediction mode of the coding unit are calculated, and the proportion of non-zero coefficients among those DCT coefficients is smaller than a proportion threshold, it is determined that the coding effect of the directional intra-prediction mode on the coding unit reaches the set effect; the directional intra-prediction mode is then determined as the target coding mode, and the decision processes of the other coding modes are skipped. The proportion threshold is positively correlated with the residual coefficient of the directional intra-prediction mode.
In one embodiment of the present application, if the rate distortion cost corresponding to the optimal directional intra-prediction mode of the coding unit is calculated, and that cost is smaller than a first threshold, it is determined that the coding effect of the directional intra-prediction mode on the coding unit reaches the set effect; the directional intra-prediction mode is then determined as the target coding mode, and the decision processes of the other coding modes are skipped.
In one embodiment of the present application, if the rate distortion cost corresponding to the optimal inter-prediction mode of the coding unit is calculated, and that cost is smaller than a second threshold, it is determined that the coding effect of the inter-prediction mode on the coding unit reaches the set effect; the inter-prediction mode is then determined as the target coding mode, and the decision processes of the other coding modes are skipped.
In one embodiment of the present application, if the video frame to be encoded is generated from electronic device screen content, such as a frame of a shared screen-content picture, the other coding mode may be the palette coding mode. In the palette coding mode, the colors of the pixels in a CU are clustered into a palette, each pixel in the CU is then mapped to a color index into the palette, and finally the palette and the color indices are encoded.
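As a toy illustration of the palette idea (not the patent's encoder), the sketch below builds an exact-match palette and index map for a CU's pixels. Real palette coding clusters similar colors and handles escape pixels; the size limit here is an arbitrary assumption.

```python
def palette_encode(cu_pixels, max_palette_size=8):
    """Toy palette coding: collect the distinct colors of a CU into a palette
    and map every pixel to its palette index. Screen content typically has
    few distinct colors, so the palette stays small; a CU with too many
    colors is a poor fit for palette mode and returns None."""
    palette = []
    indices = []
    for px in cu_pixels:
        if px not in palette:
            palette.append(px)
        indices.append(palette.index(px))
    if len(palette) > max_palette_size:
        return None  # too many colors: palette mode is a poor fit
    return palette, indices
```

This also shows why skipping the palette decision is attractive: for camera-captured content the color count explodes and the mode rarely wins, yet its RDO search is still expensive.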
With continued reference to fig. 2, in step S240, the encoding unit is subjected to encoding processing by the target encoding mode.
The technical scheme of the embodiment shown in fig. 2 means that, when determining the coding mode of a coding unit, the decision processes of all coding modes need not be performed: as long as some coding mode is determined to encode the coding unit well enough, the remaining decisions can be skipped. The computational complexity of the coding mode decision process is thus effectively reduced while the coding effect is ensured, the encoding time of a video frame is shortened, and the real-time performance of video frame encoding is improved.
In one embodiment of the present application, if the optimal coding mode decision process of the coding unit is performed from bottom to top, it may be determined whether to skip the decision process of the palette coding mode for the coding unit according to the optimal coding prediction mode corresponding to each sub coding unit of the coding unit. For example, if the optimal coding prediction modes of a predetermined number of sub-coding units in the sub-coding units of the coding unit are not palette coding modes, determining to skip the decision process of the palette coding mode for the coding unit, and further, reducing the computational complexity of the coding mode decision process by skipping the palette coding mode, thereby shortening the coding duration of the video frame.
The following takes the encoding processing of the content pictures generated by screen sharing (such as the content pictures generated by document sharing of office software PPT, word, etc.) as an example, and the technical scheme of the embodiment of the present application will be described in detail:
in one embodiment of the present application, the screen sharing generated content frames have a significant difference from the natural image captured by the camera—the screen sharing generated content frames are typically noise free, and have discrete hues, distinct lines and sharp edges; while video captured by a camera is typically noisy, continuous in hue, and relatively noisy in texture. In order to improve the coding performance of content pictures generated by screen sharing, a coding technology, i.e. a palette coding mode, specially designed for screen content is proposed in the related art. The palette coding mode is to cluster the colors of pixels in the coding unit CU into one palette, then map each pixel in the CU into a color index in the palette, and finally code the palette and the color index. The palette coding mode is suitable for an area with relatively small color number and non-flat texture, can represent information in the CU with a small code rate, and has relatively high algorithm complexity and difficulty in assembly optimization although the palette coding mode can obviously improve the coding efficiency of the video of the screen content, so that the palette coding mode has defects in real-time application. The coding scheme provided in the embodiment of the application can determine whether the current CU to be coded is suitable for palette coding, and for the CU unsuitable for palette coding, the decision process of the palette coding mode can be skipped, so that a great amount of RDO (Rate-Distortion Optimization, code Rate-distortion optimization) decision time is saved, and the coding speed can be improved without reducing the coding performance.
As shown in fig. 3, a method for fast encoding of video frames according to one embodiment of the present application includes the steps of:
step S301, after the CU to be encoded is input, determine whether the type of the slice in which the CU to be encoded is located is I Slice; if so, execute step S303; otherwise, execute step S302.
Step S302, a process of inter prediction mode decision is performed.
In one embodiment of the present application, a normal inter prediction mode decision is performed on the CU to be encoded, including deciding the optimal PU (Prediction Unit) partition of the CU and the optimal prediction mode of each PU, and calculating the RD Cost corresponding to the optimal prediction mode. The RD Cost corresponding to the optimal inter prediction mode of the CU is denoted min_inter_cost.
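The RD Cost recorded in this step is the usual Lagrangian cost J = D + λ·R; a minimal sketch of how such a cost is computed and minimized over prediction candidates (the λ value and the candidate numbers below are illustrative, not values from the application):

```python
def rd_cost(distortion, rate_bits, lam):
    """Lagrangian rate-distortion cost J = D + lambda * R."""
    return distortion + lam * rate_bits

def best_mode_cost(candidates, lam):
    """Minimum RD cost over candidate (distortion, rate) pairs, as recorded
    for min_inter_cost / min_intra_cost in the mode decision steps."""
    return min(rd_cost(d, r, lam) for d, r in candidates)

# Two hypothetical prediction candidates at lambda = 1.0: one with low
# distortion but high rate, one with the opposite trade-off.
min_inter_cost = best_mode_cost([(100.0, 10), (50.0, 40)], 1.0)
```

In a real encoder λ is derived from the QP, so the same pair of candidates can win or lose depending on the target quality point.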
Step S303, perform the normal intra prediction mode decision process, where the normal intra prediction mode refers to the directional intra prediction modes.
In one embodiment of the present application, a normal intra prediction mode decision is performed on the CU to be encoded, including deciding the optimal PU partition of the CU and the optimal prediction mode of each PU, and calculating the RD Cost corresponding to the optimal prediction mode. The RD Cost corresponding to the optimal normal intra prediction mode of the CU is denoted min_intra_cost. In addition, the DCT coefficients corresponding to the optimal mode can be calculated during the normal intra prediction mode decision.
It should be noted that, if the slice type of the CU to be encoded is not I Slice, the inter prediction mode decision process may be performed first and then the normal intra prediction mode decision process, as shown in fig. 3; alternatively, the normal intra prediction mode decision process may be performed first, followed by the inter prediction mode decision process.
Step S304, quickly determine, based on the inter prediction mode decision process and the normal intra prediction mode decision process, whether to skip the decision process of the palette (PLT) coding mode; if so, execute step S306; otherwise, execute step S305.
In one embodiment of the present application, the conditions for skipping the PLT coding mode decision process are as follows (the decision process can be skipped if any one of them is satisfied):
a) If the proportion of non-zero coefficients (the number of non-zero coefficients / the CU block size) among the DCT coefficients corresponding to the optimal normal intra prediction mode is smaller than a threshold NON_ZERO_DCT_TH, the residual distribution of the current block under the normal intra prediction mode is concentrated and the prediction effect is good, so the PLT mode need not be tried further. NON_ZERO_DCT_TH may be a value related to the QP (Quantization Parameter, here the one used to quantize the residual coefficients): the larger the QP, the larger NON_ZERO_DCT_TH.
b) If min_intra_cost < min_intra_th, the normal intra prediction effect is good, so the PLT mode decision process can be skipped.
c) If min_inter_cost < min_inter_th, the inter prediction effect is good, so the PLT mode decision process can be skipped.
d) When the optimal mode decision process of the CU to be encoded proceeds bottom-up (i.e., from 8x8 to 64x64), the optimal RD costs and optimal modes already decided for the four sub-CUs of the CU to be encoded (for example, a 32x32 CU contains four 16x16 sub-CUs) can be used to quickly determine whether to skip the PLT decision process. For example, if the optimal prediction mode of at least N of the four sub-CUs is not PLT, the PLT mode decision process is skipped for the CU to be encoded.
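Conditions a) through d) can be combined into a single skip test; the sketch below is a straightforward transcription of those conditions (the threshold values, mode labels, and the parameter N used here are placeholders, not values given in the application):

```python
def should_skip_plt_decision(non_zero_ratio, non_zero_dct_th,
                             min_intra_cost, min_intra_th,
                             min_inter_cost, min_inter_th,
                             sub_cu_best_modes=None, n=3):
    """Return True if any of conditions a)-d) holds, i.e. the PLT coding
    mode decision process may be skipped for this CU."""
    if non_zero_ratio < non_zero_dct_th:      # a) concentrated intra residual
        return True
    if min_intra_cost < min_intra_th:         # b) normal intra already good
        return True
    if min_inter_cost < min_inter_th:         # c) inter already good
        return True
    if sub_cu_best_modes is not None:         # d) bottom-up sub-CU evidence
        non_plt = sum(1 for m in sub_cu_best_modes if m != 'PLT')
        if non_plt >= n:
            return True
    return False

# d) fires here: three of the four already-decided sub-CUs did not pick PLT.
skip = should_skip_plt_decision(0.5, 0.1, 100, 50, 100, 50,
                                sub_cu_best_modes=['INTRA', 'INTER', 'PLT', 'INTRA'],
                                n=3)
```

Because the conditions are checked in order and any single one suffices, the cheap DCT-coefficient test in a) can short-circuit the decision before the costs in b) and c) are even consulted.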
Step S305, the RD Cost corresponding to the PLT coding mode is calculated and denoted min_plt_cost.
Step S306, determine the optimal prediction mode according to the RD Cost corresponding to each coding mode. For example, if the inter prediction effect is worse than the normal intra prediction effect, and min_plt_cost indicates that the PLT coding mode predicts well, the PLT coding mode may be determined to be the optimal prediction mode.
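The selection in step S306 amounts to an arg-min over the RD costs of the modes that were actually evaluated; a sketch (the cost values are invented, and modes whose decision process was skipped simply contribute no entry):

```python
def pick_optimal_mode(costs):
    """Return the coding mode with the smallest recorded RD cost.
    `costs` maps mode name -> RD cost for every mode that was evaluated;
    skipped modes (e.g. a skipped PLT decision) have no entry."""
    return min(costs, key=costs.get)

# PLT was evaluated and is cheapest:
best = pick_optimal_mode({'inter': 120.0, 'intra': 95.5, 'plt': 80.2})
# PLT decision was skipped, so the choice falls to the remaining modes:
best_no_plt = pick_optimal_mode({'inter': 120.0, 'intra': 95.5})
```

This also makes explicit why skipping the PLT decision saves time without changing the framework: the final arg-min is simply taken over a smaller candidate set.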
It should be noted that the technical solution of the embodiment shown in fig. 3 is described taking the case where the coding region containing the coding unit CU is a slice as an example; in other embodiments of the present application, the coding region may also take other division forms such as Tile.
The above technical solution can reduce the computational complexity while preserving the compression efficiency as much as possible. Experiments show that the technical solution of the embodiments of the present application improves the encoding speed by about 30% at a BD-Rate loss of about 1.6% (BD-Rate measures the average difference between the RD curves of two algorithms: a curve is fitted through several, typically 4, test points for each algorithm, the difference between the curves is computed, and the average is taken). The technical solution has been successfully applied in TSE (Tencent Screen Encoder), where it significantly improves the encoding efficiency of screen-content sequences, and it can also be applied to wireless screen-casting projects.
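The BD-Rate figure quoted above follows the Bjøntegaard metric; a compact sketch of that computation, assuming four (bitrate, PSNR) points per encoder as the text describes (the sample points below are invented for illustration):

```python
import numpy as np

def bd_rate(rates_ref, psnr_ref, rates_test, psnr_test):
    """Bjontegaard delta-rate: fit cubic polynomials of log-rate as a
    function of PSNR, integrate their gap over the overlapping PSNR range,
    and convert the mean log-rate difference to a percentage
    (positive = the test codec needs more bits for the same quality)."""
    lr_ref = np.log(np.asarray(rates_ref, dtype=float))
    lr_test = np.log(np.asarray(rates_test, dtype=float))
    p_ref = np.polyfit(psnr_ref, lr_ref, 3)
    p_test = np.polyfit(psnr_test, lr_test, 3)
    lo = max(min(psnr_ref), min(psnr_test))
    hi = min(max(psnr_ref), max(psnr_test))
    area_ref = np.polyval(np.polyint(p_ref), hi) - np.polyval(np.polyint(p_ref), lo)
    area_test = np.polyval(np.polyint(p_test), hi) - np.polyval(np.polyint(p_test), lo)
    return (np.exp((area_test - area_ref) / (hi - lo)) - 1) * 100

# Sanity check: doubling the bitrate at every quality point reads as +100%.
rates = [100, 200, 400, 800]
psnr = [30.0, 33.0, 36.0, 39.0]
delta = bd_rate(rates, psnr, [2 * r for r in rates], psnr)
```

A BD-Rate "loss of about 1.6%" thus means the fast encoder needs roughly 1.6% more bits on average for the same PSNR, in exchange for the ~30% speedup.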
The following describes an embodiment of an apparatus of the present application, which may be used to perform the method of encoding video frames in the above-described embodiments of the present application. For details not disclosed in the embodiments of the apparatus of the present application, please refer to the embodiments of the video frame encoding method described in the present application.
Fig. 4 shows a block diagram of an encoding apparatus of video frames according to an embodiment of the present application.
Referring to fig. 4, an apparatus 400 for encoding video frames according to an embodiment of the present application includes: a determining unit 402, a processing unit 404 and an encoding unit 406.
The determining unit 402 is configured to determine an encoding area where an encoding unit in a video frame to be encoded is located, where the encoding area is obtained by dividing the video frame to be encoded; the processing unit 404 is configured to perform, for the coding unit, a coding mode decision process corresponding to a type of the coding region according to the type of the coding region where the coding unit is located, where if the coding mode decision process performed for the coding unit indicates that the coding effect of the target coding mode on the coding unit reaches a set effect, skipping decision processes of other coding modes; the encoding unit 406 is configured to perform encoding processing on the encoding unit by using the target encoding mode.
In some embodiments of the present application, the processing unit 404 is configured to: and if the type of the coding region where the coding unit is located is an intra-frame coding type, performing a directional intra-frame prediction mode decision process on at least the coding unit.
In some embodiments of the present application, the processing unit 404 is configured to: calculating Discrete Cosine Transform (DCT) coefficients corresponding to an optimal prediction mode of the directional intra-frame prediction by the coding unit, if the proportion of non-zero coefficients in the DCT coefficients corresponding to the optimal prediction mode of the directional intra-frame prediction is smaller than a proportion threshold value, determining that the coding effect of the directional intra-frame prediction mode on the coding unit reaches a set effect, and determining the directional intra-frame prediction mode as the target coding mode.
In some embodiments of the present application, the proportion threshold is positively correlated with the quantization parameter of the residual coefficients in the directional intra prediction mode.
In some embodiments of the present application, the processing unit 404 is configured to: and calculating the rate distortion cost corresponding to the optimal prediction mode of the directional intra-frame prediction by the coding unit, if the rate distortion cost corresponding to the optimal prediction mode of the directional intra-frame prediction is smaller than a first threshold value, determining that the coding effect of the directional intra-frame prediction mode on the coding unit reaches a set effect, and determining the directional intra-frame prediction mode as the target coding mode.
In some embodiments of the present application, the processing unit 404 is configured to: and if the type of the coding region where the coding unit is located is not the intra-frame coding type, at least one of an inter-frame prediction mode decision process and a directional intra-frame prediction mode decision process is carried out on the coding unit.
In some embodiments of the present application, the processing unit 404 is configured to: calculate the rate-distortion cost corresponding to the optimal prediction mode of the coding unit for inter prediction; and if the rate-distortion cost corresponding to the optimal prediction mode of the inter prediction is smaller than a second threshold, determine that the coding effect of the inter prediction mode on the coding unit reaches the set effect, and determine the inter prediction mode as the target coding mode.
In some embodiments of the present application, the processing unit 404 performs at least one of an inter prediction mode decision process and a directional intra prediction mode decision process on the coding unit, including: firstly, carrying out an inter-frame prediction mode decision process on the coding unit, and then carrying out a directional intra-frame prediction mode decision process on the coding unit; or firstly, carrying out a directional intra-frame prediction mode decision process on the coding unit, and then carrying out an inter-frame prediction mode decision process on the coding unit.
In some embodiments of the present application, the processing unit 404 is configured to: and sequentially carrying out each coding mode decision process corresponding to the type of the coding region on the coding units according to a preset sequence.
In some embodiments of the present application, based on the foregoing scheme, the coding region includes a Slice or Tile.
In some embodiments of the present application, the video frames to be encoded comprise video frames generated from electronic device screen content, and the other encoding modes comprise palette encoding modes.
In some embodiments of the present application, where the optimal coding mode decision process of the coding unit is bottom-up, the processing unit 404 is further configured to: and determining whether to skip the decision process of the palette coding mode for the coding unit according to the optimal coding prediction mode corresponding to each sub coding unit of the coding unit.
In some embodiments of the present application, the processing unit 404 is configured to: if the optimal coding prediction mode of a predetermined number of sub-coding units of the coding unit is not the palette coding mode, determining to skip a decision process of the palette coding mode for the coding unit.
Fig. 5 shows a schematic diagram of a computer system suitable for use in implementing the electronic device of the embodiments of the present application.
It should be noted that, the computer system 500 of the electronic device shown in fig. 5 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present application.
As shown in fig. 5, the computer system 500 includes a central processing unit (Central Processing Unit, CPU) 501, which can perform various appropriate actions and processes, such as performing the methods described in the above embodiments, according to a program stored in a Read-Only Memory (ROM) 502 or a program loaded from a storage section 508 into a random access Memory (Random Access Memory, RAM) 503. In the RAM 503, various programs and data required for the system operation are also stored. The CPU 501, ROM 502, and RAM 503 are connected to each other through a bus 504. An Input/Output (I/O) interface 505 is also connected to bus 504.
The following components are connected to the I/O interface 505: an input section 506 including a keyboard, a mouse, and the like; an output section 507 including a cathode ray tube (CRT), a liquid crystal display (LCD), a speaker, and the like; a storage section 508 including a hard disk and the like; and a communication section 509 including a network interface card such as a LAN (Local Area Network) card, a modem, and the like. The communication section 509 performs communication processing via a network such as the Internet. A drive 510 is also connected to the I/O interface 505 as needed. A removable medium 511, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 510 as needed, so that a computer program read therefrom is installed into the storage section 508 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising a computer program for performing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication portion 509, and/or installed from the removable media 511. When executed by a Central Processing Unit (CPU) 501, performs the various functions defined in the system of the present application.
It should be noted that, the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with a computer-readable computer program embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
A computer program embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Where each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units involved in the embodiments of the present application may be implemented by means of software, or may be implemented by means of hardware, and the described units may also be provided in a processor. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
As another aspect, the present application also provides a computer-readable medium that may be contained in the electronic device described in the above embodiment; or may exist alone without being incorporated into the electronic device. The computer-readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement the methods described in the above embodiments.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit, in accordance with embodiments of the present application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a usb disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a touch terminal, or a network device, etc.) to perform the method according to the embodiments of the present application.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (12)

1. A method of encoding a video frame, comprising:
determining an encoding region where an encoding unit in a video frame to be encoded generated according to screen content of electronic equipment is located, wherein the encoding region is obtained by dividing the video frame to be encoded;
according to the type of the coding region where the coding unit is located, performing a coding mode decision process corresponding to the type of the coding region on the coding unit;
if the coding mode decision process performed on the coding unit shows that the coding effect of the target coding mode on the coding unit reaches the set effect, skipping the decision process of the palette coding mode;
performing encoding processing on the coding unit through the target coding mode;
wherein, in case the optimal coding mode decision process of the coding unit is bottom-up, the coding method further comprises: if the optimal coding prediction mode of a predetermined number of sub-coding units of the coding unit is not the palette coding mode, determining to skip a decision process of the palette coding mode for the coding unit.
2. The method according to claim 1, wherein performing a coding mode decision process corresponding to a type of a coding region on the coding unit according to the type of the coding region on which the coding unit is located, comprises:
and if the type of the coding region where the coding unit is located is an intra-frame coding type, performing a directional intra-frame prediction mode decision process on at least the coding unit.
3. The method of encoding a video frame according to claim 2, wherein performing a directional intra prediction mode decision process on the encoding unit comprises: calculating discrete cosine transform DCT coefficients corresponding to an optimal prediction mode of the directional intra-frame prediction by the coding unit;
The video frame encoding method further comprises the following steps: and if the proportion of the non-zero coefficient in the DCT coefficient corresponding to the optimal prediction mode of the directional intra-frame prediction is smaller than a proportion threshold value, determining that the coding effect of the directional intra-frame prediction mode on the coding unit reaches a set effect, and determining the directional intra-frame prediction mode as the target coding mode.
4. A method of encoding a video frame as claimed in claim 3, wherein the proportion threshold is positively correlated with the quantization parameter of the residual coefficients in the directional intra prediction mode.
5. The method of encoding a video frame according to claim 2, wherein performing a directional intra prediction mode decision process on the encoding unit comprises: calculating the rate distortion cost corresponding to the optimal prediction mode of the directional intra-frame prediction by the coding unit;
the video frame encoding method further comprises the following steps: and if the rate distortion cost corresponding to the optimal prediction mode of the directional intra-frame prediction is smaller than a first threshold value, determining that the coding effect of the directional intra-frame prediction mode on the coding unit reaches a set effect, and determining the directional intra-frame prediction mode as the target coding mode.
6. The method according to claim 1, wherein performing a coding mode decision process corresponding to a type of a coding region on the coding unit according to the type of the coding region on which the coding unit is located, comprises:
and if the type of the coding region where the coding unit is located is not the intra-frame coding type, at least one of an inter-frame prediction mode decision process and a directional intra-frame prediction mode decision process is carried out on the coding unit.
7. The method of encoding a video frame as claimed in claim 6, wherein performing an inter prediction mode decision process on the encoding unit comprises: calculating rate distortion cost corresponding to an optimal prediction mode of the coding unit for inter-frame prediction;
the video frame coding method further comprises the following steps: if the rate distortion cost corresponding to the optimal prediction mode of the inter-frame prediction is smaller than a second threshold, determining that the coding effect of the inter-frame prediction mode on the coding unit reaches a set effect, and determining the inter-frame prediction mode as the target coding mode.
8. The encoding method of a video frame according to claim 1, wherein performing an encoding mode decision process corresponding to the type of the encoding region on the encoding unit comprises:
And sequentially carrying out each coding mode decision process corresponding to the type of the coding region on the coding units according to a preset sequence.
9. The method of encoding a video frame according to claim 1, wherein the encoding region comprises a Slice (Slice) or a rectangular block (Tile).
10. An apparatus for encoding a video frame, comprising:
the device comprises a determining unit, a coding unit and a coding unit, wherein the determining unit is used for determining a coding area where the coding unit is located in a video frame to be coded generated according to the screen content of the electronic equipment, the coding area is obtained by dividing the video frame to be coded, and the video frame to be coded comprises a video frame generated according to the screen content of the electronic equipment;
the processing unit is used for carrying out a coding mode decision process corresponding to the type of the coding region on the coding unit according to the type of the coding region where the coding unit is located, wherein if the coding mode decision process carried out on the coding unit indicates that the coding effect of the target coding mode on the coding unit reaches the set effect, the decision process of the palette coding mode is skipped;
the coding unit is used for carrying out coding processing on the coding unit through the target coding mode;
Wherein the processing unit is further configured to: and if the optimal coding prediction mode of a preset number of sub coding units in the sub coding units of the coding unit is not the palette coding mode under the condition that the optimal coding mode decision process of the coding unit is from bottom to top, determining to skip the decision process of the palette coding mode for the coding unit.
11. A computer readable medium on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the method of encoding video frames according to any of claims 1 to 9.
12. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which when executed by the one or more processors cause the one or more processors to implement the method of encoding a video frame as claimed in any one of claims 1 to 9.
CN201910586661.XA 2019-07-01 2019-07-01 Video frame encoding method, video frame encoding device, computer readable medium and electronic equipment Active CN110381311B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910586661.XA CN110381311B (en) 2019-07-01 2019-07-01 Video frame encoding method, video frame encoding device, computer readable medium and electronic equipment

Publications (2)

Publication Number Publication Date
CN110381311A CN110381311A (en) 2019-10-25
CN110381311B true CN110381311B (en) 2023-06-30

Family

ID=68251597

Country Status (1)

Country Link
CN (1) CN110381311B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117241033A (en) 2019-08-06 2023-12-15 北京字节跳动网络技术有限公司 Video encoding and decoding using screen content encoding tools
CN114342400B (en) 2019-09-02 2024-06-28 北京字节跳动网络技术有限公司 Color format based codec mode determination
JP7332795B2 (en) 2019-09-21 2023-08-23 北京字節跳動網絡技術有限公司 Base size constraints for chroma intra mode
WO2021204295A1 (en) * 2020-04-10 2021-10-14 Beijing Bytedance Network Technology Co., Ltd. Minimum allowed quantization for transform skip blocks in video coding
CN113747162B (en) * 2020-05-29 2023-09-29 北京金山云网络技术有限公司 Video processing method and device, storage medium and electronic device
CN111988607B (en) * 2020-08-07 2023-03-24 北京奇艺世纪科技有限公司 Encoding unit processing method and device, electronic equipment and storage medium
CN112153385B (en) * 2020-11-25 2021-03-02 腾讯科技(深圳)有限公司 Encoding processing method, device, equipment and storage medium
CN115474055B (en) * 2021-06-10 2024-07-09 腾讯科技(深圳)有限公司 Video encoding method, encoder, medium and electronic device

Citations (3)

Publication number Priority date Publication date Assignee Title
CN106797478A (en) * 2014-11-06 2017-05-31 联发科技股份有限公司 The method of the palette coding based on sub-block
CN107277509A (en) * 2017-08-03 2017-10-20 重庆邮电大学 A kind of fast intra-frame predicting method based on screen content
CN107623850A (en) * 2017-09-26 2018-01-23 杭州电子科技大学 A kind of quick screen contents encoding method based on temporal correlation

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
US8155189B2 (en) * 2005-10-19 2012-04-10 Freescale Semiconductor, Inc. System and method of coding mode decision for video encoding
US8559512B2 (en) * 2010-05-05 2013-10-15 Ceva D.S.P. Ltd. Device, system, and method for predicting residual data for intra and inter frame encoding of image or video data
CN102186070B (en) * 2011-04-20 2013-06-05 北京工业大学 Method for realizing rapid video coding by adopting hierarchical structure anticipation
US11223832B2 (en) * 2014-07-24 2022-01-11 Qualcomm Incorporated Methods and apparatus for encoding video data using block palettes and sub-block and pixel scanning orders
GB2531003A (en) * 2014-10-06 2016-04-13 Canon Kk Method and apparatus for vector encoding in video coding and decoding
CN105933717B (en) * 2016-04-25 2019-03-12 北京奇艺世纪科技有限公司 A kind of mode decision method and apparatus of coding unit
CN107623848B (en) * 2017-09-04 2019-11-19 浙江大华技术股份有限公司 A kind of method for video coding and device

Similar Documents

Publication Publication Date Title
CN110381311B (en) Video frame encoding method, video frame encoding device, computer readable medium and electronic equipment
US12010302B2 (en) Image processing apparatuses including CNN-based in-loop filter
US10841583B2 (en) Coding unit depth determining method and apparatus
US20190149828A1 (en) Image encoding/decoding method and apparatus using intra-screen prediction
CN110036637B (en) Method and device for denoising reconstructed images
KR102123958B1 (en) Method for reducing real-time video noise in coding process, terminal, and computer-readable non-volatile storage medium
CN111711825B (en) Deblocking filtering method, device, equipment and medium in video coding and decoding
US20150063452A1 (en) High efficiency video coding (hevc) intra prediction encoding apparatus and method
WO2016180129A1 (en) Prediction mode selection method, apparatus and device
JP2005524346A (en) Chroma deblocking filter
CN103891278A (en) Method, device, and program for encoding and decoding image
US20230024834A1 (en) Video decoding method, video encoding method, and related apparatuses
WO2019114225A1 (en) Coding unit division determining method and device, computing device and readable storage medium
US20230056211A1 (en) Video encoding method and apparatus, video decoding method and apparatus, computer program product, computer-readable storage medium, and electronic device
CN114303380B (en) Encoder, decoder and corresponding methods for CABAC coding of indices of geometric partition flags
CN112106372A (en) Method and apparatus for hybrid intra prediction
US20170201767A1 (en) Video encoding device and video encoding method
KR20100136883A (en) Method and apparatus for filtering image by using pseudo-random filter
Han et al. Quadtree-based non-local Kuan’s filtering in video compression
KR20180099369A (en) Method for video rate control, and video coding method and apparatus the same
US20150312590A1 (en) Methods for encoding and decoding a picture and corresponding devices
CN115834882A (en) Intra-frame prediction method and device, electronic equipment and storage medium
CN110249630B (en) Deblocking filter apparatus, method and storage medium
US20210185314A1 (en) Image decoding device, image coding device, image processing system, and program
CN115442617A (en) Video processing method and device based on video coding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant