CN113473135A - Intra-frame prediction method, device and medium for non-linear texture - Google Patents
- Publication number
- CN113473135A
- Authority
- CN
- China
- Prior art keywords
- prediction
- intra
- quadratic function
- reference pixel
- linear texture
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/157—Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
- H04N19/159—Prediction type, e.g. intra-frame, inter-frame or bidirectional frame prediction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/182—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a pixel
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
Abstract
The present disclosure relates to an intra prediction method, apparatus, and medium for non-linear texture. The method is used in the intra prediction module in the field of image and video coding and includes: determining a current intra prediction mode; predicting, in the intra prediction module, a non-linear texture using a prediction model that includes a quadratic function; and deriving the reference pixel position from the result of predicting the non-linear texture and generating a predicted pixel value by interpolating the reference pixels. The method addresses efficient prediction modeling of non-linear texture content in the intra prediction modules of image and video codec standards: it can generate a high-fidelity prediction signal close to the original signal, reduce the prediction residual, and improve coding efficiency.
Description
Technical Field
The present disclosure relates to the field of intra prediction technologies, and in particular, to a method, an apparatus, and a medium for intra prediction for non-linear texture.
Background
In image and video coding, intra prediction is the main tool for reducing the spatial redundancy of a signal. Mainstream video codec standards (e.g., VVC, AVS3) define a series of angular intra prediction modes in the intra prediction module to generate prediction content. However, the existing angular prediction modes can only generate linear textures; non-linear textures cannot be modeled accurately and efficiently. The present invention therefore proposes an intra prediction algorithm for non-linear textures that improves both intra prediction efficiency and coding efficiency.
VVC defines 67 intra prediction modes: DC mode, Planar mode, and 65 angular prediction modes. An angular mode defines the direction of intra prediction, i.e., the direction along which the reference pixels are projected onto the corresponding positions of the prediction block to form the prediction. Taking Fig. 1 as an example, the angle α is the tangential angle of a given intra prediction direction, and the pixel intensities at the positions that direction passes through are assumed to be the same. Thus, for a predicted pixel p[x0][y0], the corresponding reference pixel position c can be derived by equation (1) below, and its intensity interpolated with a multi-tap filter as in equation (2).
c = x0 - tanα * y0 (1)
p[x0][y0] = f[0]*p[c][0] + f[1]*p[c][1] + f[2]*p[c][2] + f[3]*p[c][3] (2)
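The linear angular projection in equations (1) and (2) can be sketched in Python as follows. This is an illustrative floating-point sketch, not the normative integer-arithmetic derivation of any standard; the reference row `ref` and the 4-tap coefficients `f` are hypothetical placeholders.

```python
# Illustrative sketch of linear angular intra prediction for one pixel.
# The reference row and filter coefficients are assumptions, not the
# normative VVC integer arithmetic.
import math

def predict_pixel(ref, x0, y0, alpha_deg, f=(0.0, 0.5, 0.5, 0.0)):
    """Project pixel (x0, y0) along direction alpha onto the reference row
    and interpolate its intensity with a 4-tap filter (equations (1)-(2))."""
    c = x0 - math.tan(math.radians(alpha_deg)) * y0   # equation (1)
    i = math.floor(c)                                 # integer part of c
    # 4-tap interpolation over ref[i-1 .. i+2], clamping indices at borders.
    taps = [ref[min(max(i - 1 + k, 0), len(ref) - 1)] for k in range(4)]
    return sum(fk * t for fk, t in zip(f, taps))      # equation (2)

ref = [100, 110, 120, 130, 140, 150, 160, 170]  # hypothetical reference row
p = predict_pixel(ref, x0=3, y0=2, alpha_deg=45.0)  # projects onto c = 1
```

With α = 45°, the pixel at (3, 2) projects back to reference position c = 1, and the placeholder filter averages the two nearest reference samples.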
Disclosure of Invention
The present disclosure aims to solve the technical problem that prior-art image and video coding standards cannot generate non-linear texture content close to the original signal in the intra prediction module.
To achieve the above technical object, the present disclosure provides a non-linear texture-oriented intra prediction method, including:
determining a current intra prediction mode;
predicting, in the intra-prediction module, a non-linear texture using prediction modeling including a quadratic function;
the position of the reference pixel is derived from the result of predicting the non-linear texture, and a predicted pixel value is generated from the reference pixel interpolation.
Further, the intra prediction modes specifically include:
a normal prediction mode close to the reference pixel and an extended prediction mode far from the reference pixel.
Further, the predictive modeling is a model using a quadratic function or a model using a linear combination of a linear function and a quadratic function.
Further, the predicting the non-linear texture in the intra prediction module using prediction modeling including a quadratic function specifically includes:
the predictive modeling including a quadratic function is represented using two angular prediction modes to predict a non-linear texture.
Further, the two angular prediction modes belong to the same vertical prediction mode set or the same horizontal prediction mode set.
Further, the deriving the reference pixel according to the result of predicting the non-linear texture specifically includes:
when the two angular prediction modes belong to the same set of vertical prediction modes,
using the formula [rendered as an image in the original; not reproduced here]
Determining a reference pixel position c;
wherein, the angle alpha corresponds to the tangential direction when the quadratic function enters the prediction block, and the angle beta corresponds to the tangential direction when the quadratic function leaves the prediction block;
x0 denotes the abscissa, y0 denotes the ordinate, and h denotes the height of the intra prediction block;
using the formula
p[x0][y0]=f[0]*p[c][0]+f[1]*p[c][1]+f[2]*p[c][2]+f[3]*p[c][3]
Determining a location of an intra-predicted pixel;
wherein p [ x0] [ y0] represents an intra-predicted pixel;
when the two angular prediction modes belong to the same set of horizontal prediction modes,
using the formula [rendered as an image in the original; not reproduced here]
Determining a reference pixel position c;
wherein, the angle alpha corresponds to the tangential direction when the quadratic function enters the prediction block, and the angle beta corresponds to the tangential direction when the quadratic function leaves the prediction block;
x0 denotes the abscissa, y0 denotes the ordinate, and w denotes the width of the intra prediction block;
using the formula
p[x0][y0]=f[0]*p[0][c]+f[1]*p[0][c+1]+f[2]*p[0][c+2]+f[3]*p[0][c+3]
Determining a location of an intra-predicted pixel;
where p [ x0] [ y0] denotes an intra prediction pixel.
Further, the predicted pixel value is generated from the position of the reference pixel using Gaussian interpolation or cubic spline interpolation.
Further, the method for combining the angular prediction modes may be based on a statistical rule model or on a data-driven method using deep learning or machine learning.
To achieve the above technical objects, the present disclosure also provides a computer storage medium having a computer program stored thereon; when executed by a processor, the computer program implements the steps of the above non-linear texture-oriented intra prediction method.
To achieve the above technical objects, the present disclosure further provides an electronic device including a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the non-linear texture-oriented intra prediction method.
The beneficial effect of this disclosure does:
the method solves the problem of efficient prediction modeling of the nonlinear texture content oriented intra-frame prediction module of the image and video coding and decoding standard. The method and the device can generate a high-fidelity prediction signal close to the original signal, reduce the prediction residual error and improve the coding efficiency.
Drawings
FIG. 1 illustrates a schematic diagram of a related art intra prediction method;
fig. 2 shows a schematic flow diagram of embodiment 1 of the present disclosure;
fig. 3 shows a schematic diagram of an intra prediction method of embodiment 1 of the present disclosure;
fig. 4 shows a schematic structural diagram of embodiment 3 of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
Various structural schematics according to embodiments of the present disclosure are shown in the figures. The figures are not drawn to scale, wherein certain details are exaggerated and possibly omitted for clarity of presentation. The shapes of various regions, layers, and relative sizes and positional relationships therebetween shown in the drawings are merely exemplary, and deviations may occur in practice due to manufacturing tolerances or technical limitations, and a person skilled in the art may additionally design regions/layers having different shapes, sizes, relative positions, as actually required.
The first embodiment is as follows:
as shown in fig. 2:
the present disclosure provides a non-linear texture-oriented intra prediction method, including:
determining a current intra prediction mode;
predicting, in the intra-prediction module, a non-linear texture using prediction modeling including a quadratic function;
reference pixels are derived from the results of predicting the non-linear texture, and predicted pixel values are generated from the locations of the reference pixels.
Further, the intra prediction modes specifically include:
a normal prediction mode close to the reference pixel and an extended prediction mode far from the reference pixel.
Further, the predictive modeling is a model using a quadratic function or a model using a linear combination of a linear function and a quadratic function.
Further, the predicting the non-linear texture in the intra prediction module using prediction modeling including a quadratic function specifically includes:
the predictive modeling including a quadratic function is represented using two angular prediction modes to predict a non-linear texture.
Further, the two angular prediction modes belong to the same vertical prediction mode set or the same horizontal prediction mode set.
Further, the deriving the reference pixel according to the result of predicting the non-linear texture specifically includes:
when the two angular prediction modes belong to the same set of vertical prediction modes,
using the formula [rendered as an image in the original; not reproduced here]
Determining a reference pixel position c;
wherein, the angle alpha corresponds to the tangential direction when the quadratic function enters the prediction block, and the angle beta corresponds to the tangential direction when the quadratic function leaves the prediction block;
x0 denotes the abscissa, y0 denotes the ordinate, and h denotes the height of the intra prediction block;
using the formula
p[x0][y0]=f[0]*p[c][0]+f[1]*p[c][1]+f[2]*p[c][2]+f[3]*p[c][3]
Determining a location of an intra-predicted pixel;
wherein p [ x0] [ y0] represents an intra-predicted pixel;
when the two angular prediction modes belong to the same set of horizontal prediction modes,
using the formula [rendered as an image in the original; not reproduced here]
Determining a reference pixel position c;
wherein, the angle alpha corresponds to the tangential direction when the quadratic function enters the prediction block, and the angle beta corresponds to the tangential direction when the quadratic function leaves the prediction block;
x0 denotes the abscissa, y0 denotes the ordinate, and w denotes the width of the intra prediction block;
using the formula
p[x0][y0]=f[0]*p[0][c]+f[1]*p[0][c+1]+f[2]*p[0][c+2]+f[3]*p[0][c+3]
Determining a location of an intra-predicted pixel;
where p [ x0] [ y0] denotes an intra prediction pixel.
Further, the predicted pixel value is generated from the position of the reference pixel using Gaussian interpolation or cubic spline interpolation.
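As a concrete illustration of the cubic interpolation option, a 4-tap cubic kernel can interpolate the reference row at a fractional position. The Catmull-Rom coefficients below are an assumption chosen for illustration; the disclosure does not specify the actual filter coefficients.

```python
# Hypothetical 4-tap cubic (Catmull-Rom) interpolation at a fractional
# reference position. The coefficients are an assumption, not the filter
# actually used by the disclosure.
def cubic_weights(t):
    """Catmull-Rom weights for taps at offsets -1, 0, +1, +2, fraction t in [0, 1)."""
    return (
        -0.5 * t**3 + 1.0 * t**2 - 0.5 * t,
         1.5 * t**3 - 2.5 * t**2 + 1.0,
        -1.5 * t**3 + 2.0 * t**2 + 0.5 * t,
         0.5 * t**3 - 0.5 * t**2,
    )

def interpolate(ref, c):
    """Interpolate ref at real-valued position c with the 4-tap filter."""
    i = int(c)  # integer part; assumes c >= 1 so all four taps are in range
    t = c - i
    w = cubic_weights(t)
    return sum(wk * ref[i - 1 + k] for wk, k in zip(w, range(4)))
```

The kernel reproduces the samples exactly at integer positions (weights (0, 1, 0, 0) at t = 0) and its weights always sum to 1, so flat regions are preserved.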
Further, the method for combining the angular prediction modes may be based on a statistical rule model or on a data-driven method using deep learning or machine learning.
The non-linear texture-oriented intra prediction method proposed by the present invention is shown in Fig. 3. Through the form of a quadratic function, the invention represents a non-linear texture using two angular prediction modes. In Fig. 3, both angular prediction modes belong to the vertical mode set. Specifically, the prediction mode close to the reference pixels is called the regular prediction mode, and its angle α corresponds to the tangential direction of the quadratic function as it enters the prediction block; the prediction mode far from the reference pixels is called the extended prediction mode, and its angle β corresponds to the tangential direction of the quadratic function as it leaves the prediction block. Points on the same quadratic curve have the same pixel intensity. Thus, for a predicted pixel p[x0][y0], the corresponding reference pixel position c can be derived by the following equations, and its intensity interpolated with a multi-tap filter.
When both angular prediction modes belong to the vertical mode set, the reference pixel position and the pixel intensity are derived as follows (the position formula appears only as an image in the original and is not reproduced here):
p[x0][y0] = f[0]*p[c][0] + f[1]*p[c][1] + f[2]*p[c][2] + f[3]*p[c][3] (2)
where the angle α corresponds to the tangential direction of the quadratic function entering the prediction block, and the angle β corresponds to the tangential direction of the quadratic function leaving the prediction block;
x0 denotes the abscissa, y0 denotes the ordinate, and h denotes the height of the intra prediction block.
Accordingly, when both angular prediction modes belong to the horizontal mode set, the reference pixel position and the pixel intensity are derived as follows (the position formula appears only as an image in the original):
p[x0][y0] = f[0]*p[0][c] + f[1]*p[0][c+1] + f[2]*p[0][c+2] + f[3]*p[0][c+3] (4)
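For the vertical case, the omitted position formula can be plausibly reconstructed from the stated boundary conditions. The sketch below is our hedged reconstruction, assuming the curve's slope varies linearly from tan α at the reference row to tan β at row h, which collapses to equation (1) when β = α; it is not the patent's verbatim formula.

```python
# Hedged reconstruction of the quadratic reference-position formula (the
# original renders it only as an image). Assumption: along the traced
# quadratic curve, dx/dy varies linearly from tan(alpha) at y = 0 to
# tan(beta) at y = h.
import math

def quadratic_ref_position(x0, y0, alpha_deg, beta_deg, h):
    """Reconstructed reference position for the vertical mode set.

    Integrating the linearly varying slope gives
        c = x0 - tan(alpha)*y0 - (tan(beta) - tan(alpha)) * y0^2 / (2*h),
    which reduces to equation (1), c = x0 - tan(alpha)*y0, when beta == alpha.
    """
    ta = math.tan(math.radians(alpha_deg))
    tb = math.tan(math.radians(beta_deg))
    return x0 - ta * y0 - (tb - ta) * y0 * y0 / (2.0 * h)

# Sanity check: beta == alpha degenerates to the linear angular mode.
assert quadratic_ref_position(3, 2, 45.0, 45.0, h=8) == 3 - math.tan(math.radians(45.0)) * 2
```

The extra quadratic term bends the projection: for β ≠ α, rows farther from the reference row are displaced progressively more than the linear mode would predict.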
In this disclosure, the two angular prediction modes must belong to the same vertical mode set or the same horizontal mode set. To reduce the number of searches, the regular prediction mode is selected only from the Most Probable Modes (MPM) set. Further, for each regular prediction mode, only the 4 extended prediction modes with the highest statistical occurrence frequency are searched. Under the VTM 10.0 All Intra (AI) configuration, the invention achieves an average coding performance gain of 0.1%.
Table 1 VTM 10.0-based encoding performance of the present disclosure
Example two:
the present disclosure can also provide a computer storage medium having stored thereon a computer program for implementing the steps of the non-linear texture-oriented intra prediction method described above when executed by a processor.
The computer storage medium of the present disclosure may be implemented with a semiconductor memory, a magnetic core memory, a magnetic drum memory, or a magnetic disk memory.
Semiconductor memory is the mainstay of computer memory elements and comes in two types, MOS and bipolar. MOS elements offer high integration and a simple process but lower speed; bipolar elements involve a complex process, high power consumption, and low integration, but high speed. The introduction of NMOS and CMOS made MOS memory dominant in semiconductor memory: NMOS is fast (e.g., 45 ns for Intel's 1K-bit SRAM), while CMOS has low power consumption (an access time of 300 ns for a 4K-bit CMOS static memory). The semiconductor memories above are all random access memories (RAM), which can be read and written at any time during operation. A semiconductor read-only memory (ROM), by contrast, can be read randomly but not written during operation and is used to store fixed programs and data. ROM is divided into non-rewritable fuse-type ROM and PROM, and rewritable EPROM.
Magnetic core memory is low-cost and highly reliable, with more than 20 years of practical experience. It was widely used as main memory until the mid-1970s. Its storage capacity can reach more than 10 bits, with access times as fast as 300 ns. A typical international magnetic core memory has a capacity of 4 MB to 8 MB and an access cycle of 1.0 to 1.5 μs. After semiconductor memory developed rapidly and replaced magnetic core memory as main memory, core memory remained applicable as large-capacity expansion memory.
Drum memory is an external memory based on magnetic recording. Despite its fast information access and stable, reliable operation, it has been replaced by disk memory, though it remains in use as external memory for real-time process-control computers and medium and large computers. To meet the needs of small and micro computers, subminiature drums emerged that are small, lightweight, highly reliable, and convenient to use.
Disk memory is an external memory based on magnetic recording. It combines the advantages of drum and tape storage: its capacity is larger than a drum's, its access is faster than tape's, and it can be stored offline, so disks are widely used as large-capacity external storage in all kinds of computer systems. Disks fall into two main categories, hard disk and floppy disk memories.
Hard disk memories come in many varieties. Structurally they are divided into replaceable and fixed types: replaceable disk packs can be swapped out, while fixed disks cannot. Both types exist as multi-platter assemblies and single-platter structures, and both are further divided into fixed-head and movable-head types. Fixed-head disks have small capacity, low recording density, high access speed, and high cost. Movable-head disks have high recording density (1,000 to 6,250 bits per inch) and therefore large capacity, but lower access speed than fixed-head disks. Disk products can reach several hundred megabytes of storage with a bit density of 6,250 bits per inch and a track density of 475 tracks per inch. Because the disk packs of a replaceable-pack memory can be swapped, such memories offer large offline capacity as well as large capacity and high speed; they can store large volumes of information and are widely used in online information retrieval systems and database management systems.
Example three:
the present disclosure also provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the steps of the non-linear texture oriented intra prediction method described above are implemented.
Fig. 4 is a schematic diagram of the internal structure of the electronic device in one embodiment. As shown in Fig. 4, the electronic device includes a processor, a storage medium, a memory, and a network interface connected through a system bus. The storage medium stores an operating system, a database, and computer-readable instructions; the database can store control information sequences, and the computer-readable instructions, when executed by the processor, cause the processor to implement a non-linear texture-oriented intra prediction method. The processor of the electronic device provides the computing and control capabilities that support the operation of the entire device. The memory may store computer-readable instructions that, when executed by the processor, cause the processor to perform the non-linear texture-oriented intra prediction method. The network interface is used to connect and communicate with terminals. Those skilled in the art will appreciate that the architecture shown in Fig. 4 is merely a block diagram of some of the structures associated with the disclosed aspects and does not limit the computing devices to which the disclosed aspects apply; particular computing devices may include more or fewer components than shown, combine certain components, or arrange components differently.
The electronic device includes, but is not limited to, a smartphone, a computer, a tablet, a wearable smart device, an artificial intelligence device, a mobile power source, and the like.
The processor may in some embodiments consist of an integrated circuit, for example a single packaged integrated circuit, or of a plurality of integrated circuits packaged with the same or different functions, including one or more central processing units (CPUs), microprocessors, digital processing chips, graphics processors, and combinations of various control chips. The processor is the control unit of the electronic device: it connects the various components using various interfaces and lines, and executes the functions and processes the data of the electronic device by running or executing the programs or modules stored in the memory (for example, a remote data read/write program) and calling the data stored in the memory.
The bus may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. The bus is arranged to enable connected communication between the memory and at least one processor or the like.
Fig. 4 shows only an electronic device having components, and those skilled in the art will appreciate that the structure shown in fig. 4 does not constitute a limitation of the electronic device, and may include fewer or more components than those shown, or some components may be combined, or a different arrangement of components.
For example, although not shown, the electronic device may further include a power supply (such as a battery) for supplying power to each component, and preferably, the power supply may be logically connected to the at least one processor through a power management device, so that functions such as charge management, discharge management, and power consumption management are implemented through the power management device. The power supply may also include any component of one or more dc or ac power sources, recharging devices, power failure detection circuitry, power converters or inverters, power status indicators, and the like. The electronic device may further include various sensors, a bluetooth module, a Wi-Fi module, and the like, which are not described herein again.
Further, the electronic device may further include a network interface, and optionally, the network interface may include a wired interface and/or a wireless interface (such as a WI-FI interface, a bluetooth interface, etc.), which are generally used to establish a communication connection between the electronic device and other electronic devices.
Optionally, the electronic device may further comprise a user interface, which may be a Display (Display), an input unit (such as a Keyboard), and optionally a standard wired interface, a wireless interface. Alternatively, in some embodiments, the display may be an LED display, a liquid crystal display, a touch-sensitive liquid crystal display, an OLED (Organic Light-Emitting Diode) touch device, or the like. The display, which may also be referred to as a display screen or display unit, is suitable, among other things, for displaying information processed in the electronic device and for displaying a visualized user interface.
Further, the computer usable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the blockchain node, and the like.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus, device and method can be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules is only one logical functional division, and other divisions may be realized in practice.
The modules described as separate parts may or may not be physically separate, and parts displayed as modules may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment.
In addition, functional modules in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional module.
The embodiments of the present disclosure have been described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be devised by those skilled in the art without departing from the scope of the present disclosure, and such alternatives and modifications are intended to be within the scope of the present disclosure.
Claims (10)
1. A non-linear texture-oriented intra-frame prediction method is used in an intra-frame prediction module in the field of image and video coding, and is characterized by comprising the following steps:
determining a current intra prediction mode;
predicting, in the intra-prediction module, a non-linear texture using prediction modeling including a quadratic function;
the position of the reference pixel is derived from the result of predicting the non-linear texture, and a predicted pixel value is generated from the reference pixel interpolation.
2. The method according to claim 1, wherein the intra prediction modes specifically comprise:
a normal prediction mode close to the reference pixel and an extended prediction mode far from the reference pixel.
3. The method of claim 1, wherein the predictive modeling is a model using a quadratic function or a model using a linear combination of a linear function and a quadratic function.
4. The method according to claim 1, wherein the predicting the non-linear texture in the intra prediction module using prediction modeling including a quadratic function comprises:
the predictive modeling including a quadratic function is represented using two angular prediction modes to predict a non-linear texture.
5. The method according to claim 4, wherein the two angular prediction modes belong to the same set of vertical prediction modes or the same set of horizontal prediction modes.
6. The method according to claim 5, wherein deriving the reference pixel based on the result of predicting the non-linear texture comprises:
when the two angular prediction modes belong to the same set of vertical prediction modes,
using the formula [rendered as an image in the original; not reproduced here]
Determining a reference pixel position c;
wherein, the angle alpha corresponds to the tangential direction when the quadratic function enters the prediction block, and the angle beta corresponds to the tangential direction when the quadratic function leaves the prediction block;
x0 denotes the abscissa, y0 denotes the ordinate, and h denotes the height of the intra prediction block;
using the formula
p[x0][y0]=f[0]*p[c][0]+f[1]*p[c][1]+f[2]*p[c][2]+f[3]*p[c][3]
Determining a location of an intra-predicted pixel;
wherein p [ x0] [ y0] represents an intra-predicted pixel;
when the two angular prediction modes belong to the same set of horizontal prediction modes,
using the formula [rendered as an image in the original; not reproduced here]
Determining a reference pixel position c;
wherein, the angle alpha corresponds to the tangential direction when the quadratic function enters the prediction block, and the angle beta corresponds to the tangential direction when the quadratic function leaves the prediction block;
x0 denotes the abscissa, y0 denotes the ordinate, and w denotes the width of the intra prediction block;
using the formula
p[x0][y0]=f[0]*p[0][c]+f[1]*p[0][c+1]+f[2]*p[0][c+2]+f[3]*p[0][c+3]
Determining a location of an intra-predicted pixel;
where p [ x0] [ y0] denotes an intra prediction pixel.
7. The method according to claim 6, wherein the predicted pixel value is generated from the position of the reference pixel using Gaussian interpolation or cubic spline interpolation.
8. The method according to any one of claims 4 to 7, wherein the method for combining the angular prediction modes is based on a statistical rule model or on a data-driven method based on deep learning or machine learning.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps corresponding to the non-linear texture oriented intra prediction method as claimed in any one of claims 1 to 8 when executing the computer program.
10. A computer storage medium having computer program instructions stored thereon, wherein the program instructions, when executed by a processor, are adapted to implement the corresponding steps of the non-linear texture oriented intra prediction method as claimed in any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110577503.5A CN113473135B (en) | 2021-05-26 | 2021-05-26 | Intra-frame prediction method, device and medium for nonlinear texture |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113473135A true CN113473135A (en) | 2021-10-01 |
CN113473135B CN113473135B (en) | 2023-09-01 |
Family
ID=77871681
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110577503.5A Active CN113473135B (en) | 2021-05-26 | 2021-05-26 | Intra-frame prediction method, device and medium for nonlinear texture |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113473135B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102984523A (en) * | 2012-12-14 | 2013-03-20 | 北京大学 | Multi-directional intra-frame forecast encoding and decoding method and device |
JP2013090015A (en) * | 2011-10-13 | 2013-05-13 | Nippon Hoso Kyokai <Nhk> | Intra prediction apparatus, encoder, decoder and program |
JP2018107692A (en) * | 2016-12-27 | 2018-07-05 | Kddi株式会社 | Moving image decoder, moving image decoding method, moving image encoder, moving image encoding method and computer readable recording medium |
US20180278954A1 (en) * | 2015-09-25 | 2018-09-27 | Thomson Licensing | Method and apparatus for intra prediction in video encoding and decoding |
US20180288425A1 (en) * | 2017-04-04 | 2018-10-04 | Arris Enterprises Llc | Memory Reduction Implementation for Weighted Angular Prediction |
WO2019081925A1 (en) * | 2017-10-27 | 2019-05-02 | Sony Corporation | Image data encoding and decoding |
US20200304832A1 (en) * | 2019-03-21 | 2020-09-24 | Qualcomm Incorporated | Generalized reference sample derivation methods for intra prediction in video coding |
CN112640458A (en) * | 2019-01-16 | 2021-04-09 | Oppo广东移动通信有限公司 | Information processing method and device, equipment and storage medium |
Non-Patent Citations (2)
Title |
---|
LIANG ZHAO et al.: "Wide angular intra prediction for versatile video coding", DATA COMPRESSION CONFERENCE, pages 1-10 *
R. FERNANDES et al.: "Efficient HEVC intra-frame prediction using curved angular modes", ELECTRONICS LETTERS, pages 1-3 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113569508B (en) | Database model construction method and device for data indexing and access based on ID | |
US20180357166A1 (en) | Method and apparatus for system resource management | |
Edstrom et al. | Data-pattern enabled self-recovery low-power storage system for big video data | |
WO2022252565A1 (en) | Target detection system, method and apparatus, and device and medium | |
Han et al. | A hybrid display frame buffer architecture for energy efficient display subsystems | |
CN110888861A (en) | Novel big data storage method | |
CN112911285B (en) | Hardware encoder intra mode decision circuit, method, apparatus, device and medium | |
CN113473135B (en) | Intra-frame prediction method, device and medium for nonlinear texture | |
CN113806539A (en) | Text data enhancement system, method, device and medium | |
CN112911309B (en) | avs2 encoder motion vector processing system, method, apparatus, device and medium | |
CN114882444A (en) | Image fusion processing method, device and medium | |
CN113866638A (en) | Battery parameter inference method, device, equipment and medium | |
CN113516368A (en) | Method, device, equipment and medium for predicting uncertainty risk of city and community | |
CN112685189B (en) | Method, device, equipment and medium for realizing data processing | |
CN114896422A (en) | Knowledge graph complementing method, device, equipment and medium | |
CN114882489B (en) | Method, device, equipment and medium for horizontally correcting rotating license plate | |
CN117061749A (en) | Multi-transformation coding and decoding method, system, medium and equipment | |
Cho | Fast memory and storage architectures for the big data era | |
CN117240841A (en) | Electronic whiteboard document storage method, system, medium and device based on face recognition | |
CN113792119A (en) | Article originality evaluation system, method, device and medium | |
CN112929665A (en) | Target tracking method, device, equipment and medium combining super-resolution and video coding | |
CN115221122A (en) | Mobile terminal log system, log management method, medium and device | |
CN114943085A (en) | Device, method, equipment and medium for protecting data of visual signal | |
CN112232115A (en) | Calculation factor implantation method, medium and equipment | |
CN115421712A (en) | Software development method, device, equipment and medium based on component model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||