CN104769950B - Cross-plane filtering for chroma signal enhancement in video coding - Google Patents

Cross-plane filtering for chroma signal enhancement in video coding

Info

Publication number
CN104769950B
CN104769950B CN201380050776.8A CN201380050776A CN104769950B CN 104769950 B CN104769950 B CN 104769950B CN 201380050776 A CN201380050776 A CN 201380050776A CN 104769950 B CN104769950 B CN 104769950B
Authority
CN
China
Prior art keywords
filter
cross-plane filter
video signal
video
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201380050776.8A
Other languages
Chinese (zh)
Other versions
CN104769950A (en)
Inventor
董洁 (Jie Dong)
贺玉文 (Yuwen He)
叶琰 (Yan Ye)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
InterDigital Madison Patent Holdings SAS
Original Assignee
Vid Scale Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vid Scale Inc
Priority to CN201811230342.7A (publication CN109327704B)
Priority to CN202110601265.7A (publication CN113518228A)
Publication of CN104769950A
Application granted
Publication of CN104769950B
Legal status: Active


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/80: Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
    • H04N 19/85: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
    • H04N 19/86: Methods or arrangements using pre-processing or post-processing involving reduction of coding artifacts, e.g. of blockiness
    • H04N 19/70: Methods or arrangements characterised by syntax aspects related to video coding, e.g. related to compression standards
    • H04N 19/10: Methods or arrangements using adaptive coding
    • H04N 19/102: Methods or arrangements using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N 19/117: Filters, e.g. for pre-processing or post-processing
    • H04N 19/169: Methods or arrangements using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N 19/186: Methods or arrangements using adaptive coding characterised by the coding unit, the unit being a colour or a chrominance component
    • H04N 19/30: Methods or arrangements using hierarchical techniques, e.g. scalability

Abstract

Cross-plane filtering may be used to restore blurred edges and/or textures in one or both chroma planes using information from the corresponding luma plane. Adaptive cross-plane filters may be implemented. Cross-plane filter coefficients may be quantized and/or signaled such that overhead in the bitstream minimizes performance degradation. Cross-plane filtering may be applied to selected regions of a video image (e.g., edge regions). Cross-plane filters may be implemented in single-layer video coding systems and/or multi-layer video coding systems.

Description

Cross-plane filtering for chroma signal enhancement in video coding
Cross reference to related applications
This application claims the benefit of U.S. Provisional Application No. 61/707,682, filed September 28, 2012; U.S. Provisional Application No. 61/762,611, filed February 8, 2013; U.S. Provisional Application No. 61/778,218, filed March 12, 2013; and U.S. Provisional Application No. 61/845,792, filed July 12, 2013; the contents of which are incorporated herein by reference.
Background
Video coding systems are used to compress digital video signals, e.g., in order to reduce the storage space consumed and/or to reduce bandwidth consumption associated with such signals. Block-based hybrid video coding systems, for example, are widely deployed and frequently used.
A digital video signal typically has three color planes: a luma plane, a blue-difference chroma plane, and a red-difference chroma plane. Pixels of the chroma planes typically have smaller dynamic ranges than pixels of the luma plane, so the chroma planes of a video image are typically smoother and/or have less detail than the luma plane. A chroma block of a video image may therefore be easier to predict accurately, e.g., consuming fewer resources and/or producing less prediction error.
However, video coding using known chroma prediction techniques may produce video images with significantly blurred edges and/or textures in the chroma planes.
Summary
Cross-plane filtering may be used to restore blurred edges and/or textures in one or both chroma planes using information from the corresponding luma plane. Adaptive cross-plane filters may be implemented. Cross-plane filter coefficients may be quantized and/or signaled such that overhead in the bitstream is tolerable (e.g., reduced and/or minimized) without incurring a performance penalty. One or more characteristics of a cross-plane filter (e.g., size, separability, symmetry, etc.) may be determined such that overhead in the bitstream is tolerable (e.g., reduced and/or minimized) without incurring a performance penalty. Cross-plane filtering may be applied to video with a variety of color subsampling formats (e.g., 4:4:4, 4:2:2, and 4:2:0). Cross-plane filtering may be applied to selected regions of a video image, e.g., to edge regions and/or to regions specified by one or more parameters signaled in the bitstream. Cross-plane filters may be implemented in single-layer video coding systems and/or multi-layer video coding systems.
An example video decoding process using cross-plane filtering may include receiving a video signal and a cross-plane filter associated with the video signal. The video decoding process may include applying the cross-plane filter to a luma plane pixel of the video signal to determine a chroma offset. The video decoding process may include adding the chroma offset to a corresponding chroma plane pixel of the video signal.
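The decoding steps above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the 3x3 high-pass-style filter taps, the 4:4:4 plane layout, and the 8-bit clipping range are all chosen for the example.

```python
import numpy as np

def apply_cross_plane_filter(luma, chroma, coeffs):
    """Derive a chroma offset from co-located luma pixels and add it
    to the chroma plane (4:4:4 layout assumed for simplicity)."""
    h, w = chroma.shape
    kh, kw = coeffs.shape
    pad_y, pad_x = kh // 2, kw // 2
    padded = np.pad(luma.astype(np.float64),
                    ((pad_y, pad_y), (pad_x, pad_x)), mode="edge")
    out = chroma.astype(np.float64).copy()
    for y in range(h):
        for x in range(w):
            support = padded[y:y + kh, x:x + kw]  # luma support region
            out[y, x] += np.sum(support * coeffs)  # cross-plane chroma offset
    return np.clip(out, 0, 255).astype(np.uint8)

# A zero-DC-gain kernel: flat luma areas contribute no chroma offset,
# while luma edges produce an offset that sharpens the chroma edge.
coeffs = np.array([[0.0, -0.25, 0.0],
                   [-0.25, 1.0, -0.25],
                   [0.0, -0.25, 0.0]])
luma = np.tile(np.array([0, 0, 0, 255, 255, 255], dtype=np.uint8), (6, 1))
chroma = np.full((6, 6), 128, dtype=np.uint8)
enhanced = apply_cross_plane_filter(luma, chroma, coeffs)
```

Note that with a zero-sum kernel the chroma plane is modified only near the luma edge (columns 2 and 3 here); flat regions pass through unchanged, which matches the idea of restoring edges rather than re-filtering the whole plane.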
A video coding device may be configured for cross-plane filtering. The video coding device may include a network interface configured to receive a video signal and a cross-plane filter associated with the video signal. The video coding device may include a processor configured to apply the cross-plane filter to a luma plane pixel of the video signal to determine a chroma offset. The processor may be configured to add the chroma offset to a corresponding chroma plane pixel of the video signal.
An example video encoding process using cross-plane filtering may include receiving a video signal. The video encoding process may include generating a cross-plane filter using components of the video signal. The video encoding process may include quantizing filter coefficients associated with the cross-plane filter. The video encoding process may include encoding the filter coefficients into a bitstream that represents the video signal. The video encoding process may include transmitting the bitstream.
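The coefficient quantization step can be sketched as follows. The uniform step size of 1/64 and the 8-bit signed code range are assumptions made for illustration; this passage does not prescribe particular quantizer parameters.

```python
def quantize_coefficients(coeffs, step=1.0 / 64):
    """Uniformly quantize real-valued filter coefficients to integer
    codes for signaling, and reconstruct the values a decoder would use."""
    codes = [max(-128, min(127, round(c / step))) for c in coeffs]
    reconstructed = [code * step for code in codes]
    return codes, reconstructed

# Example coefficients from a trained cross-plane filter (made up here).
coeffs = [-0.26, 0.015, 0.994, 0.02, -0.27]
codes, recon = quantize_coefficients(coeffs)
```

Only the small integer codes need to be entropy coded into the bitstream; the trade-off between the step size (reconstruction accuracy) and the code range (signaling overhead) is exactly the overhead-versus-performance balance discussed above.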
A video coding device may be configured for cross-plane filtering. The video coding device may include a network interface configured to receive a video signal. The video coding device may include a processor configured to generate a cross-plane filter using components of the video signal. The processor may be configured to quantize filter coefficients associated with the cross-plane filter. The processor may be configured to encode the filter coefficients into a bitstream that represents the video signal. The processor may be configured to transmit the bitstream, e.g., via the network interface.
Description of the drawings
Fig. 1 is a block diagram illustrating an example block-based video encoder;
Fig. 2 is a block diagram illustrating an example block-based video decoder;
Fig. 3 is a block diagram illustrating an example two-layer spatial scalable video encoder;
Fig. 4 is a block diagram illustrating an example two-layer spatial scalable video decoder;
Fig. 5 is a block diagram of an example inter-layer prediction processing and management unit;
Fig. 6A depicts an example 4:4:4 color subsampling format;
Fig. 6B depicts an example 4:2:2 color subsampling format;
Fig. 6C depicts an example 4:2:0 color subsampling format;
Fig. 7 is a block diagram of an example of cross-plane filtering;
Figs. 8A and 8B are block diagrams of another example of cross-plane filtering;
Figs. 9A and 9B are block diagrams of another example of cross-plane filtering;
Fig. 10A depicts example sizes and support regions of the cross-plane filters (filter_Y4Cb and filter_Y4Cr) for selected chroma pixels in 4:4:4;
Fig. 10B depicts example sizes and support regions of the cross-plane filters (filter_Y4Cb and filter_Y4Cr) for selected chroma pixels in 4:2:2;
Fig. 10C depicts example sizes and support regions of the cross-plane filters (filter_Y4Cb and filter_Y4Cr) for selected chroma pixels in 4:2:0;
Fig. 11A depicts example unified sizes and support regions of the cross-plane filters (filter_Y4Cb and filter_Y4Cr) for selected chroma pixels in 4:4:4;
Fig. 11B depicts example unified sizes and support regions of the cross-plane filters (filter_Y4Cb and filter_Y4Cr) for selected chroma pixels in 4:2:2;
Fig. 11C depicts example unified sizes and support regions of the cross-plane filters (filter_Y4Cb and filter_Y4Cr) for selected chroma pixels in 4:2:0;
Fig. 12A depicts an example cross-plane filter lacking symmetry properties;
Fig. 12B depicts example horizontal and vertical symmetry properties of an example cross-plane filter;
Fig. 12C depicts example vertical symmetry properties of an example cross-plane filter;
Fig. 12D depicts example horizontal symmetry properties of an example cross-plane filter;
Fig. 12E depicts example point symmetry properties of an example cross-plane filter;
Fig. 13A depicts example horizontal and vertical one-dimensional filters without symmetry;
Fig. 13B depicts example horizontal and vertical one-dimensional filters with symmetry;
Fig. 14 is an example syntax table illustrating signaling of a set of cross-plane filter coefficients;
Figs. 15A and 15B depict example arrangements of cross-plane filter coefficients;
Fig. 16 is an example syntax table illustrating signaling of multiple sets of cross-plane filter coefficients;
Fig. 17 is an example syntax table illustrating signaling of information that defines a region for cross-plane filtering;
Fig. 18 depicts an example of multiple picture regions detected in accordance with region-based cross-plane filtering;
Fig. 19 is an example syntax table illustrating signaling of information about multiple regions together with multiple sets of cross-plane filter coefficients;
Fig. 20 depicts an example picture-level selection algorithm for cross-plane filtering;
Fig. 21A depicts a system diagram of an example communications system in which one or more disclosed embodiments may be implemented;
Fig. 21B depicts a system diagram of an example wireless transmit/receive unit (WTRU) that may be used within the communications system illustrated in Fig. 21A;
Fig. 21C depicts a system diagram of an example radio access network and an example core network that may be used within the communications system illustrated in Fig. 21A;
Fig. 21D depicts a system diagram of an example radio access network and an example core network that may be used within the communications system illustrated in Fig. 21A;
Fig. 21E depicts a system diagram of an example radio access network and an example core network that may be used within the communications system illustrated in Fig. 21A.
Detailed description
Fig. 1 illustrates an example block-based video encoder. An input video signal 102 may be processed (e.g., block by block). A video block unit may include 16x16 pixels; such a block unit may be referred to as a macroblock (MB). The video block unit size may be extended, e.g., to 64x64 pixels. Extended-size video blocks may be used to compress high-definition video signals (e.g., 1080p video signals and beyond). An extended block size may be referred to as a coding unit (CU). A CU may be partitioned into one or more prediction units (PUs), and separate prediction methods may be applied to the one or more PUs.
For one or more input video blocks (e.g., each input video block), such as an MB or a CU, spatial prediction 160 and/or temporal prediction 162 may be performed. Spatial prediction 160 (which may be referred to as intra prediction) may use pixels from one or more already-coded neighboring blocks and/or slices in a video picture, e.g., to predict a video block. Spatial prediction 160 may reduce spatial redundancy that may be inherent in a video signal. Temporal prediction 162 (which may be referred to as inter prediction and/or motion-compensated prediction) or motion prediction (estimation and compensation) 162 may use pixels from one or more already-coded video pictures, e.g., to predict a video block. Temporal prediction may reduce temporal redundancy that may be inherent in a video signal. A temporal prediction signal for a video block may include one or more motion vectors and/or one or more reference picture indices (e.g., if multiple reference pictures are used) to identify from which reference picture in the reference picture store 164 the temporal prediction signal originates.
After spatial and/or temporal prediction is performed, a mode decision block 180 (e.g., in the encoder), which may be referred to as mode decision and other encoder control logic 180, may select a prediction mode, e.g., based on a rate-distortion optimization method. The prediction block may be subtracted from the video block 116. The prediction residual may be transformed 104 and/or quantized 106. One or more quantized residual coefficients may be inverse quantized 110 and/or inverse transformed 112, e.g., to form a reconstructed residual. The reconstructed residual may be added to the prediction block 126, e.g., to form a reconstructed video block.
Further, in-loop filtering, e.g., one or more deblocking filters and/or adaptive loop filters (e.g., loop filter 166), may be applied to the reconstructed video block before it is stored in the reference picture store 164 and/or used to code subsequent video blocks. To form the output video bitstream 120, the coding mode (e.g., inter or intra), prediction mode information, motion information, and/or quantized residual coefficients may be sent to an entropy coding unit 108, e.g., to be further compressed and/or packed to form the bitstream 120.
Fig. 2 illustrates an example block-based video decoder, which may correspond to the block-based encoder depicted in Fig. 1. A video bitstream 202 may be unpacked and/or entropy decoded, e.g., at an entropy decoding unit 208. The coding mode and/or prediction information may be sent to a spatial prediction unit 260 (e.g., for intra coding) or to a temporal prediction unit 262 (e.g., for inter coding), e.g., to form a prediction block. Temporal prediction 262 may be referred to as motion-compensated prediction 262. One or more residual transform coefficients may be sent to an inverse quantization unit 210 and/or an inverse transform unit 212, e.g., to reconstruct a residual block. The prediction block and the residual block may be added together at 226, e.g., to form a reconstructed block. The reconstructed block may be processed by in-loop filtering (e.g., using loop filter 266) before being added to the reconstructed output video 220 to be transmitted (e.g., to a display device) and/or before being stored in the reference picture store 264, e.g., for use in predicting one or more subsequent video blocks.
Video may be consumed on devices with varying capabilities in terms of computing power, register and/or memory size, display resolution, display frame rate, etc., e.g., on smartphones and/or tablets. Networks and/or transmission channels may have varying characteristics in terms of packet loss rate, available channel bandwidth, burst error rate, etc. Video data may be transmitted over a combination of wired and/or wireless networks, which may make one or more underlying video transmission channel characteristics complex. In such scenarios, scalable video coding may improve the video quality provided by video applications, e.g., by video applications running on devices with different capabilities over heterogeneous networks.
Scalable video coding may encode a video signal according to its highest representation (e.g., temporal resolution, spatial resolution, quality, etc.), but may allow decoding from respective subsets of one or more video streams, e.g., according to a specified rate and/or representation used by one or more applications running on a client device. Scalable video coding may save bandwidth and/or storage.
Fig. 3 illustrates an example two-layer scalable video coding system with one base layer (BL) and one enhancement layer (EL). The spatial resolutions of the two layers may differ, such that spatial scalability may be applied. A base layer encoder (e.g., a high efficiency video coding (HEVC) encoder) may encode a base layer video input, e.g., block by block, and may generate a base layer bitstream (e.g., in accordance with the block diagram depicted in Fig. 1). An enhancement layer encoder may encode an enhancement layer video input, e.g., block by block, and may generate an enhancement layer bitstream (e.g., in accordance with the block diagram depicted in Fig. 1). The coding efficiency of the scalable video coding system (e.g., the coding efficiency of enhancement layer coding) may be improved, for example, by using signal correlation from the base layer reconstructed video to improve prediction accuracy.
The base layer reconstructed video may be processed such that at least a portion of one or more processed base layer pictures may be inserted into an enhancement layer decoded picture buffer (EL DPB) and/or used to predict the enhancement layer video input. The base layer video and the enhancement layer video may be substantially the same video source represented at respective different spatial resolutions, such that they correspond to each other via, e.g., a downsampling process. Inter-layer prediction (ILP) processing may be carried out by an inter-layer processing and/or management unit, e.g., an upsampling operation that may be used to align the spatial resolution of the base layer reconstruction with the spatial resolution of the enhancement layer video. A scalable video coding bitstream may include the base layer bitstream and the enhancement layer bitstream generated by the base layer and enhancement layer encoders, and/or inter-layer prediction information.
Inter-layer prediction information may be generated by the ILP processing and management unit. For example, ILP information may include one or more of the following: the type of inter-layer processing applied; one or more parameters used in the processing (e.g., which upsampling filters are used); which of the one or more processed base layer pictures should be inserted into the EL DPB; etc. The base layer and enhancement layer bitstreams and/or the ILP information may be multiplexed together, e.g., to form a scalable bitstream (e.g., an SHVC bitstream).
Fig. 4 illustrates an example two-layer scalable video decoder that may correspond to the scalable encoder depicted in Fig. 3. The decoder may perform one or more operations, e.g., in an order reverse to that of the encoder. The scalable bitstream may be demultiplexed into the base layer bitstream, the enhancement layer bitstream, and/or the ILP information. The base layer decoder may decode the base layer bitstream and/or may generate the base layer reconstruction.
The ILP processing and management unit may receive the ILP information and/or may process the base layer reconstruction, e.g., in accordance with the received ILP information. The ILP processing and management unit may selectively insert one or more processed base layer pictures into the EL DPB, e.g., in accordance with the received ILP information. The enhancement layer decoder may decode the enhancement layer bitstream, e.g., with a combination of temporal reference pictures and/or inter-layer reference pictures (e.g., one or more processed base layer pictures), so as to reconstruct the enhancement layer video. For the purposes of this disclosure, the terms "inter-layer reference picture" and "processed base layer picture" may be used interchangeably.
Fig. 5 depicts an example inter-layer prediction processing and management unit, e.g., as may be implemented in the example two-layer spatial scalable video encoder illustrated in Fig. 3 and/or the example two-layer spatial scalable video decoder illustrated in Fig. 4. The inter-layer prediction processing and management unit may include one or more stages (e.g., the three stages depicted in Fig. 5). In a first stage (e.g., stage 1), the BL reconstructed picture may be enhanced (e.g., before it is upsampled). In a second stage (e.g., stage 2), upsampling may be performed (e.g., when the resolution of the BL is lower than the resolution of the EL in spatial scalability). The output of the second stage has substantially the same resolution as the EL, with an aligned sample grid. Enhancement may be performed in a third stage (e.g., stage 3), e.g., before the upsampled picture is placed in the EL DPB, which may improve inter-layer reference picture quality.
The inter-layer prediction processing and management unit may perform one or more of the above three stages, or may perform none of the stages. For example, in signal-to-noise ratio (SNR) scalability (where the resolution of a BL picture is substantially the same as the resolution of a lower-quality EL picture), one or more of the above three stages (e.g., all of the stages) may be skipped, e.g., such that the BL reconstructed picture may be inserted into the EL DPB and used directly for inter-layer prediction. In spatial scalability, the second stage may be performed, e.g., such that the upsampled BL reconstructed picture has a sample grid aligned with that of the EL picture. The first and third stages may be performed to improve inter-layer reference picture quality, e.g., which may help to achieve higher efficiency in EL coding.
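For 2x spatial scalability, the stage-2 upsampling described above can be sketched with a simple two-tap averaging interpolation. This is purely illustrative: the upsampling filters actually used are signaled as part of the ILP information and are not specified in this passage, and real codecs use longer interpolation filters.

```python
def upsample_2x(plane):
    """Upsample a BL plane (list of lists of ints) by 2x in each
    dimension, averaging the nearest BL neighbors at half-pel positions."""
    h, w = len(plane), len(plane[0])
    out = [[0] * (2 * w) for _ in range(2 * h)]
    for y in range(2 * h):
        for x in range(2 * w):
            by, bx = y // 2, x // 2            # co-located BL sample
            ny = min(by + (y % 2), h - 1)      # vertical neighbor (clamped)
            nx = min(bx + (x % 2), w - 1)      # horizontal neighbor (clamped)
            out[y][x] = (plane[by][bx] + plane[by][nx]
                         + plane[ny][bx] + plane[ny][nx] + 2) // 4
    return out

bl = [[0, 100], [100, 200]]
el_ref = upsample_2x(bl)  # 4x4 picture on the EL sample grid
```

After this step the upsampled picture sits on the EL sample grid, so the stage-3 enhancement (e.g., cross-plane filtering) can operate with luma and chroma planes spatially aligned to the EL.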
Performing picture-level ILP in a scalable video coding system (e.g., as illustrated in Figs. 3 and 4) may reduce implementation complexity, e.g., because the respective base layer and/or enhancement layer encoder and/or decoder logic (e.g., at the block level) may be reused, at least in part, without changes. High-level (e.g., picture-level and/or slice-level) configuration may govern the insertion of one or more respective processed base layer pictures into the enhancement layer DPB. To improve coding efficiency, one or more block-level changes may be allowed in the scalable system, e.g., to facilitate block-level inter-layer prediction, which may be performed in addition to picture-level inter-layer prediction.
The single-layer and/or multi-layer video coding systems described herein may be used to code color video. In color video, each pixel carrying luminance and chrominance information may be made up of a combination of respective intensities of primary colors (e.g., YCbCr, RGB, or YUV). Each video frame of color video may be composed of three rectangular arrays corresponding to three color channels. The samples in one or more of the color channels (e.g., each color channel) may have discrete and/or finite amplitudes, which may be represented using 8-bit values in digital video applications. Red, green, and blue (RGB) primaries may be used in video capture and/or display systems.
In video coding and/or transmission, video signals in RGB space may be converted into one or more other color spaces (e.g., with luminance and/or chrominance coordinates), such as YUV for PAL and SECAM TV systems and YIQ for NTSC TV systems, for example to reduce bandwidth consumption and/or to achieve compatibility with monochrome video applications. The value of the Y component may represent the brightness of a pixel, while the other two components (e.g., Cb and Cr) may carry chrominance information. A digital color space (e.g., YCbCr) may be a scaled and/or shifted version of an analog color space (e.g., YUV). A transform matrix for deriving YCbCr coordinates from RGB coordinates may be expressed as equation (1).
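Equation (1) is not reproduced in this text. The transform referenced likely resembles the commonly used ITU-R BT.601 form of the RGB-to-YCbCr conversion (shown here as an assumption, with analog-range coefficients and no offset terms):

```latex
\begin{bmatrix} Y \\ C_b \\ C_r \end{bmatrix} =
\begin{bmatrix}
 0.299 &  0.587 &  0.114 \\
-0.169 & -0.331 &  0.500 \\
 0.500 & -0.419 & -0.081
\end{bmatrix}
\begin{bmatrix} R \\ G \\ B \end{bmatrix}
```

The first row is the luma weighting of the primaries; the second and third rows are scaled blue-difference (B - Y) and red-difference (R - Y) terms, matching the blue-difference and red-difference chroma planes introduced earlier.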
Because the human visual system (HVS) is less sensitive to color than to brightness, the chroma components Cb and Cr may be subsampled with little degradation of perceived video quality. A color subsampling format may be indicated by a triplet of digits separated by colons. For example, in the 4:2:2 color subsampling format, the horizontal sampling rate of the chroma components may be halved while the vertical sampling rate is unchanged. In the 4:2:0 color subsampling format, in order to reduce the associated data rate, the sampling rate of the chroma components may be halved in both the horizontal and vertical directions. In the 4:4:4 color subsampling format, which may be used for applications requiring very high video quality, the chroma components may have substantially the same sampling rate as the luma component. Figs. 6A-6C depict example sampling grids of luma and chroma samples for the above color subsampling formats, respectively.
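As a small worked example of these formats, the chroma plane dimensions implied by each triplet can be computed as follows (the helper name and the even-dimension assumption are illustrative, not from the patent):

```python
def chroma_plane_size(luma_w, luma_h, fmt):
    """Return (width, height) of each chroma plane for a given
    color subsampling format; luma dimensions assumed even."""
    if fmt == "4:4:4":   # chroma sampled at the full luma rate
        return luma_w, luma_h
    if fmt == "4:2:2":   # horizontal rate halved, vertical unchanged
        return luma_w // 2, luma_h
    if fmt == "4:2:0":   # both horizontal and vertical rates halved
        return luma_w // 2, luma_h // 2
    raise ValueError(f"unsupported format: {fmt}")

sizes = {f: chroma_plane_size(1920, 1080, f)
         for f in ("4:4:4", "4:2:2", "4:2:0")}
```

For 1080p this gives chroma planes of 1920x1080, 960x1080, and 960x540, which is why the cross-plane filter support regions in Figs. 10A-10C differ per format: the luma-to-chroma sample correspondence changes with the subsampling ratio.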
The Y, Cb, and Cr color planes of a frame in a video sequence may be correlated (e.g., highly correlated) in content, but the two chroma planes may exhibit less texture and/or fewer edges than the luma plane. The three color planes may share the same motion. When a block-based hybrid video coding system (e.g., in accordance with Figs. 1 and 2) is applied to a color block, the three planes of the block may not be coded independently. If the color block is coded with inter prediction, the two chroma blocks may reuse the motion information of the luma block, such as motion vectors and/or reference indices. If the color block is coded with intra prediction, the luma block may have more prediction directions available than one or both of the two chroma blocks, e.g., because luma blocks may have more varied and/or stronger edges.
For example, in H.264/AVC intra prediction, a luma block may have nine candidate directions while a chroma block may have four candidate directions. In HEVC intra prediction, a chroma block may have four candidate directions, and a luma block may have more than four candidate directions (e.g., 35 candidate directions). The respective transform and/or quantization processes for the luma and/or chroma prediction errors may be performed independently, e.g., after intra or inter prediction. At low bitrates (e.g., where the QP for luma is greater than 34), chroma may have finer (lighter) quantization than the corresponding luma (e.g., a smaller quantization step size), e.g., because edges and/or textures in the chroma planes may be more delicate and more susceptible to heavy quantization, which may cause visible artifacts such as color bleeding.
A device configured to perform video coding (e.g., to encode and/or decode video signals) may be referred to as a video coding device. Such video coding devices may include video-capable devices, e.g., a television, a digital media player, a DVD player, a Blu-ray player, a networked media player device, a desktop computer, a laptop personal computer, a tablet device, a mobile phone, a video conferencing system, a hardware- and/or software-based video coding system, etc. Such video coding devices may also include wireless communications network elements, such as a wireless transmit/receive unit (WTRU), a base station, a gateway, or other network elements.
A video coding device may be configured to receive video signals (e.g., video bitstreams) via a network interface. A video coding device may have a wireless network interface, a wired network interface, or any combination thereof. For example, if the video coding device is a wireless communications network element (e.g., a wireless transmit/receive unit (WTRU)), the network interface may be a transceiver of the WTRU. As another example, if the video coding device is a video-capable device that is not configured for wireless communication (e.g., a back-end rack encoder), the network interface may be a wired network connection (e.g., a fiber-optic connection). As another example, the network interface may be an interface configured to communicate with a physical storage medium (e.g., an optical disc drive, a memory card interface, a direct interface to a video camera, etc.). It should be appreciated that the network interface is not limited to these examples, and the network interface may include other interfaces that enable a video coding device to receive video signals.
A video coding device may be configured to perform cross-plane filtering on one or more video signals (e.g., a source video signal received by the network interface of the video coding device).
Cross-plane filtering may be used, for example, to restore blurred edges and/or textures in one or both chroma planes using information from the corresponding luma plane. Adaptive cross-plane filters may be implemented. Cross-plane filter coefficients may be quantized and/or signaled such that overhead in a bitstream reduces (e.g., minimizes) performance penalty, e.g., in accordance with a threshold level of transmission performance of a bitstream associated with the video signal. Cross-plane filter coefficients may be transmitted in the bitstream (e.g., an output video bitstream) and/or may be transmitted out-of-band with respect to the bitstream.
One or more characteristics of a cross-plane filter (e.g., size, separability, symmetry, etc.) may be determined such that the overhead in the bitstream is affordable without degrading performance. Cross-plane filtering may be applied to videos with various color subsampling formats (e.g., including 4:4:4, 4:2:2, and 4:2:0). Cross-plane filtering may be applied to select regions of a video image (e.g., to edge regions and/or to one or more regions that may be signaled in the bitstream). Cross-plane filters may be implemented in single-layer video coding systems. Cross-plane filters may be implemented in multi-layer video coding systems.
The luma plane may be used as guidance to improve the quality of one or both chroma planes. For example, one or more portions of information pertaining to the luma plane may be blended into the corresponding chroma planes. For ease of disclosure, the three color planes of an original (e.g., uncoded) video image may be denoted Y_org, Cb_org, and Cr_org, respectively, and the three color planes of the coded version of the original video image may be denoted Y_rec, Cb_rec, and Cr_rec, respectively.
FIG. 7 illustrates an example of using cross-plane filtering, e.g., whereby Y_rec, Cb_rec, and Cr_rec are converted back to the RGB space using an inverse process (e.g., the process (1) shown above), where the three planes in the RGB space are denoted R_rec, G_rec, and B_rec, respectively. Y_org, Cb_org, and Cr_org may also be converted back to the RGB space (e.g., substantially simultaneously), such that respective original RGB planes, denoted R_org, G_org, and B_org, may be obtained. A least squares (LS) training method may take the (R_org, R_rec), (G_org, G_rec), and (B_org, B_rec) plane pairs as training data sets to train three filters for the R, G, and B planes, respectively, where the three filters are denoted filter_R, filter_G, and filter_B. By filtering R_rec, G_rec, and B_rec using filter_R, filter_G, and filter_B, respectively, three improved RGB planes, denoted R_imp, G_imp, and B_imp, may be obtained, and/or, as compared with the respective distortions between R_org and R_rec, G_org and G_rec, and B_org and B_rec, the distortions between R_org and R_imp, G_org and G_imp, and B_org and B_imp may be respectively reduced (e.g., minimized). R_imp, G_imp, and B_imp may be converted to the YCbCr space, and Y_imp, Cb_imp, and Cr_imp may be obtained, where Cb_imp and Cr_imp may be the output of the cross-plane filtering.
Converting the color space back and forth, e.g., as illustrated in FIG. 7, may consume computational resources (e.g., an undesirably large amount of computational resources) on the encoder side, the decoder side, or both. Because the space conversion processes and the filtering are all linear, at least a part of the illustrated cross-plane filtering may be approximated, for example, using a simplified process in which one or more (e.g., all) of the described operations are performed in the YCbCr space.
As shown in FIG. 8A, in order to improve the quality of Cb_rec, an LS training module may take Y_rec, Cb_rec, Cr_rec, and Cb_org as a training data set, and the jointly derived optimal filters filter_Y4Cb, filter_Cb4Cb, and filter_Cr4Cb may be applied to Y_rec, Cb_rec, and Cr_rec, respectively. The respective outputs of filtering the three planes may be added together, e.g., to obtain an improved Cb plane denoted Cb_imp. The three optimal filters may be trained by the LS method so as to, for example, minimize the distortion between Cb_imp and Cb_org in accordance with equation (2):
(filter_Y4Cb, filter_Cb4Cb, filter_Cr4Cb)_opt = arg min E[(Y_rec ⊗ filter_Y4Cb + Cb_rec ⊗ filter_Cb4Cb + Cr_rec ⊗ filter_Cr4Cb − Cb_org)²] (2)
where ⊗ denotes two-dimensional (2-D) convolution, + and − denote matrix addition and matrix subtraction, respectively, and E[(X)²] denotes the mean of the squares of the elements in matrix X.
As shown in FIG. 8B, in order to improve the quality of Cr_rec, an LS training module may take Y_rec, Cb_rec, Cr_rec, and Cr_org as a training data set, and the jointly derived optimal filters filter_Y4Cr, filter_Cb4Cr, and filter_Cr4Cr may be applied to Y_rec, Cb_rec, and Cr_rec, respectively. The respective outputs of filtering the three planes may be added together, e.g., to obtain an improved Cr plane denoted Cr_imp. The three optimal filters may be trained by the LS method so as to, for example, minimize the distortion between Cr_imp and Cr_org in accordance with equation (3):
(filter_Y4Cr, filter_Cb4Cr, filter_Cr4Cr)_opt = arg min E[(Y_rec ⊗ filter_Y4Cr + Cb_rec ⊗ filter_Cb4Cr + Cr_rec ⊗ filter_Cr4Cr − Cr_org)²] (3)
Cr may contribute little to improving Cb. Cb may contribute little to improving Cr.
The cross-plane filtering techniques illustrated in FIG. 8A and FIG. 8B may be simplified. For example, as shown in FIG. 9A, the quality of the Cb plane may be improved by employing the Y and Cb planes, but not the Cr plane, in the LS training, such that two filters, namely filter_Y4Cb and filter_Cb4Cb, may be jointly derived and may be applied to Y and Cb, respectively. The respective outputs of the filters may be added together, e.g., to obtain an improved Cb plane denoted Cb_imp.
As shown in FIG. 9B, the quality of the Cr plane may be improved by employing the Y and Cr planes, but not the Cb plane, in the LS training, such that two filters, namely filter_Y4Cr and filter_Cr4Cr, may be jointly derived and may be applied to Y and Cr, respectively. The respective outputs of the filters may be added together, e.g., to obtain an improved Cr plane denoted Cr_imp.
The cross-plane filtering techniques illustrated in FIG. 9A and FIG. 9B may reduce the respective computational complexity of training and/or filtering, and/or may reduce the number of overhead bits for transmitting the cross-plane filter coefficients to the decoder side, such that the performance penalty may be small.
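The simplified FIG. 9A training may be sketched as an ordinary least-squares fit: each chroma pixel contributes one linear equation whose unknowns are the taps of filter_Y4Cb plus the single 1x1 tap of filter_Cb4Cb. The sketch below is illustrative only — the function name, the raveling order, and the border handling (edge pixels are skipped) are assumptions, not part of the disclosure:

```python
import numpy as np

def train_cross_plane_filter(y_rec, cb_rec, cb_org, size=(3, 3)):
    """LS training of a cross-plane filter (simplified FIG. 9A variant:
    only the Y and Cb planes are inputs, Cb_org is the target)."""
    kh, kw = size
    h, w = y_rec.shape
    rows, targets = [], []
    # One linear equation per interior chroma pixel; the unknowns are the
    # kh*kw taps of filter_Y4Cb plus the single 1x1 tap of filter_Cb4Cb.
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            patch = y_rec[i:i + kh, j:j + kw].ravel()
            center = cb_rec[i + kh // 2, j + kw // 2]
            rows.append(np.concatenate([patch, [center]]))
            targets.append(cb_org[i + kh // 2, j + kw // 2])
    coeffs, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    filter_y4cb = coeffs[:-1].reshape(kh, kw)
    filter_cb4cb = coeffs[-1]
    return filter_y4cb, filter_cb4cb
```

When the reconstructed chroma differs from the original exactly by a high-pass function of the luma, the fit recovers that filter and a unity filter_Cb4Cb tap.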
In order to implement cross-plane filtering in a video coding system, one or more of the following may be addressed: determining the cross-plane filter size; quantizing and/or transmitting (e.g., signaling) the cross-plane filter coefficients; and adapting cross-plane filtering to one or more local areas.
In order to train an optimal cross-plane filter, a suitable filter size may be determined. The size of a filter may be roughly proportional to the size of the overhead associated with the filter and/or to the computational complexity of the filter. For example, a 3x3 filter may have nine filter coefficients to be transmitted, and nine multiplications and eight additions may be used to accomplish filtering one pixel. A 5x5 filter may have twenty-five filter coefficients to be transmitted, and twenty-five multiplications and twenty-four additions may be used to accomplish filtering one pixel. A larger filter may achieve a lower minimum distortion (e.g., in accordance with equations (2) and (3)) and/or may provide better performance. The filter size may be selected so as to balance, for example, computational complexity, overhead size, and/or performance.
Trained filters that are applied to a plane itself, such as filter_Cb4Cb and filter_Cr4Cr, may be implemented as low-pass filters. Trained filters that are used across planes, such as filter_Y4Cb, filter_Y4Cr, filter_Cb4Cr, and filter_Cr4Cb, may be implemented as high-pass filters. Using different filters of different sizes may have little effect on the performance of the corresponding video coding system. The size of a cross-plane filter may be kept small (e.g., as small as possible), e.g., such that the performance loss is negligible. For example, the cross-plane filter size may be selected such that substantially no performance loss is observed. Cross-plane filters of larger size (e.g., MxN cross-plane filters, where M and N may be integers) may be implemented.
For example, for low-pass filters, such as filter_Cb4Cb and filter_Cr4Cr, the filter size may be implemented as 1x1, such that the filter has one coefficient that is multiplied with each pixel to be filtered. The filter coefficient of the 1x1 filter_Cb4Cb and filter_Cr4Cr may be fixed to 1.0, such that filter_Cb4Cb and filter_Cr4Cr may be omitted (e.g., not applied and/or not signaled).
For high-pass filters, such as filter_Y4Cb and filter_Y4Cr, the filter size may depend on, or may be independent of, the color sampling format. The cross-plane filter size may depend on the color sampling format. For example, cross-plane filters (e.g., filter_Y4Cb and filter_Y4Cr) may be implemented with a size and/or support region defined for a selected chroma pixel, e.g., as shown in FIGS. 10A-10C, where circles may represent the respective positions of luma samples, black triangles may represent the respective positions of chroma samples, and the luma samples used for filtering a selected chroma sample (e.g., represented by a hollow triangle) may be represented by gray circles. As shown, the filter size of filter_Y4Cb and filter_Y4Cr may be 3x3 for the 4:4:4 and 4:2:2 color formats, and may be 4x3 for the 4:2:0 color format. The filter size may be independent of the color format, e.g., as shown in FIGS. 11A-11C, for example, in accordance with the size for the 4:2:0 format, such that the filter size may be 4x3.
Cross-plane filtering may apply a trained high-pass filter to the Y plane and may use the filtering result (denoted Y_offset4Cb and Y_offset4Cr) as an offset that is added to the corresponding pixel of a chroma plane, e.g., in accordance with equations (4) and (5):
Y_offset4Cb = Y_rec ⊗ filter_Y4Cb and Y_offset4Cr = Y_rec ⊗ filter_Y4Cr (4)
Cb_imp = Cb_rec + Y_offset4Cb and Cr_imp = Cr_rec + Y_offset4Cr (5)
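A minimal sketch of equations (4) and (5), assuming 4:4:4 sampling so that the luma and chroma planes have the same size (the function names and the border handling, which leaves edge pixels unfiltered, are assumptions for illustration):

```python
import numpy as np

def cross_plane_offset(y_rec, hp_filter):
    """Apply a trained high-pass filter to the luma plane and return the
    per-pixel offset (Y_offset4Cb / Y_offset4Cr) of equation (4)."""
    kh, kw = hp_filter.shape
    h, w = y_rec.shape
    offset = np.zeros_like(y_rec)
    for i in range(h - kh + 1):
        for j in range(w - kw + 1):
            offset[i + kh // 2, j + kw // 2] = np.sum(
                y_rec[i:i + kh, j:j + kw] * hp_filter)
    return offset

def enhance_chroma(cb_rec, y_rec, filter_y4cb):
    # Cb_imp = Cb_rec + Y_offset4Cb  (equation (5))
    return cb_rec + cross_plane_offset(y_rec, filter_y4cb)
```

Because the high-pass taps sum to zero, a flat luma plane produces a zero offset and leaves the chroma plane unchanged.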
Cross-plane filter coefficients may be quantized. For example, before being transmitted, the trained cross-plane filters may have real-valued coefficients that are quantized. For example, filter_Y4Cb may be roughly approximated by an integer filter denoted filter_int. The elements in filter_int may have a small dynamic range (e.g., from -8 to 7, in accordance with a 4-bit representation). A second coefficient, denoted coeff., may be used to make filter_int more accurately approach filter_Y4Cb, e.g., in accordance with equation (6):
filter_Y4Cb ≈ filter_int × coeff. (6)
The coeff. in equation (6), which is a real-valued number, may be approximated by M/2^N, where M and N are integers, e.g., as shown in equation (7):
filter_Y4Cb ≈ filter_int × M/2^N (7)
In order to transmit filter_Y4Cb, the coefficients of filter_int may be coded into the bitstream, for example, together with M and N. The quantization technique described above may be extended, for example, to quantize filter_Y4Cr.
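The quantization of equation (7) can be sketched as follows. The choice of scaling the largest-magnitude tap to 7 and the fixed shift N are illustrative assumptions, not a normative procedure:

```python
import numpy as np

def quantize_cross_plane_filter(coeffs, shift=10):
    """Approximate real-valued taps as filter_int * M / 2**N (eq. (6)-(7)).
    filter_int uses a 4-bit dynamic range (-8..7); M and N are integers
    that would be signalled alongside filter_int."""
    peak = np.max(np.abs(coeffs))
    # Map the largest coefficient near the top of the integer range.
    scale = peak / 7.0 if peak > 0 else 1.0
    filter_int = np.clip(np.round(coeffs / scale), -8, 7).astype(int)
    # Represent the real-valued scale as M / 2**N.
    N = shift
    M = int(round(scale * (1 << N)))
    return filter_int, M, N

def dequantize(filter_int, M, N):
    return filter_int * M / (1 << N)
```

A round trip through quantization keeps the reconstructed taps close to the originals while restricting filter_int to the 4-bit range.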
Cross-plane filters (e.g., filter_Y4Cb and/or filter_Y4Cr) may have flexible separability and/or symmetry. The cross-plane filter properties introduced herein may be described with respect to an exemplary 4x3 cross-plane filter (e.g., in accordance with FIGS. 10A-10C or FIGS. 11A-11C), but the cross-plane filter properties may be applied to other filter sizes.
A cross-plane filter may have various symmetry properties, e.g., as depicted in FIGS. 12A-12E. A cross-plane filter may have no symmetry, e.g., as depicted in FIG. 12A. Each square may represent one filter coefficient and may be labeled with a unique index, where the unique index may indicate that its value may be different from the values of the remaining filter coefficients. A cross-plane filter may have horizontal and vertical symmetry, e.g., as depicted in FIG. 12B, such that a coefficient may have the same value as one or more corresponding coefficients in one or more other quadrants. A cross-plane filter may have vertical symmetry, e.g., as depicted in FIG. 12C. A cross-plane filter may have horizontal symmetry, e.g., as depicted in FIG. 12D. A cross-plane filter may have point symmetry, e.g., as depicted in FIG. 12E.
Cross-plane filters may not be limited to the symmetries depicted in FIGS. 12A-12E, and may have one or more other symmetries. A cross-plane filter may have symmetry if at least two coefficients in the filter have the same value (e.g., at least two coefficients may be labeled with the same index). For high-pass cross-plane filters (e.g., filter_Y4Cb and filter_Y4Cr), it may be beneficial, for example, to apply no symmetry to one or more (e.g., all) coefficients along the boundary of the filter support region, but to apply some symmetry (e.g., horizontal and vertical symmetry, horizontal symmetry, vertical symmetry, or point symmetry) to one or more (e.g., all) coefficients in the interior of the filter support region.
A cross-plane filter may be separable. For example, cross-plane filtering using a 4x3 two-dimensional filter may be equivalent to applying a 1x3 horizontal filter to the rows (e.g., during a first stage) and applying a 4x1 vertical filter to the columns of the output of the first stage (e.g., during a second stage). The order of the first stage and the second stage may be changed. Symmetry may be applied to the 1x3 horizontal filter and/or the 4x1 vertical filter. FIGS. 13A and 13B depict two one-dimensional filters without and with symmetry, respectively.
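The separability claim above can be checked numerically: filtering the rows with a 1x3 filter and then the columns with a 4x1 filter equals filtering once with their 4x3 outer product (the tap values below are illustrative, not from the disclosure):

```python
import numpy as np

def filter2d_valid(img, kernel):
    # Correlation-style 2-D filtering over the "valid" region.
    kh, kw = kernel.shape
    out = np.empty((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# A separable 4x3 filter: outer product of a 4x1 vertical filter and a
# 1x3 horizontal filter (illustrative tap values).
vert = np.array([[-1.0], [3.0], [3.0], [-1.0]])
horiz = np.array([[1.0, -2.0, 1.0]])
full_2d = vert @ horiz                        # the full 4x3 kernel

img = np.random.default_rng(1).standard_normal((8, 9))
stage1 = filter2d_valid(img, horiz)           # first stage: rows
stage2 = filter2d_valid(stage1, vert)         # second stage: columns
two_d = filter2d_valid(img, full_2d)          # direct 4x3 filtering
```

Either stage order works; only 4 + 3 = 7 multiplications per pixel are needed instead of 12.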
Whether or not a cross-plane filter is separable and/or symmetric, coding the filter coefficients into the bitstream may be confined to the filter coefficients with unique values. For example, in accordance with the cross-plane filter depicted in FIG. 12A, twelve filter coefficients (indexed 0 to 11) may be coded. In accordance with the cross-plane filter depicted in FIG. 12B, four filter coefficients (indexed 0 to 3) may be coded. Implementing symmetry in a cross-plane filter may reduce the overhead size (e.g., in a video signal bitstream).
The sum of the filter coefficients of a cross-plane filter may be equal to zero, for example, in a case where the cross-plane filter (e.g., filter_Y4Cb and filter_Y4Cr) is a high-pass filter. In accordance with this property (which may be a constraint), the magnitude of a coefficient (e.g., at least one coefficient) in the cross-plane filter may be equal to the sum of the other coefficients but may have the opposite sign. If a cross-plane filter has X coefficients to be transmitted (e.g., X equal to twelve, as depicted in FIG. 12A), X-1 coefficients may be coded into the bitstream (e.g., explicitly coded). A decoder may receive the X-1 coefficients and may derive the value of the remaining coefficient (e.g., implicitly derive it), for example, based on the zero-summation constraint.
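A sketch of the zero-summation constraint: the encoder transmits X-1 coefficients and the decoder derives the last one (the function names are illustrative):

```python
def encode_coeffs(coeffs):
    """Zero-sum constraint: only X-1 of the X coefficients are written
    to the bitstream; the decoder derives the remaining one."""
    assert abs(sum(coeffs)) < 1e-9  # high-pass: taps sum to zero
    return coeffs[:-1]

def decode_coeffs(transmitted):
    # The omitted coefficient is the negated sum of the received ones.
    return list(transmitted) + [-sum(transmitted)]
```

For the FIG. 12A filter (X = 12), this saves one coefficient per signalled filter set.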
Cross-plane filter coefficients may be signaled, for example, in a video bitstream. The exemplary syntax table of FIG. 14 illustrates an example of signaling a set of two-dimensional, non-separable, asymmetric cross-plane filter coefficients for a chroma plane (e.g., Cb or Cr). The following may apply to the entries in the exemplary syntax table. The entry num_coeff_hori_minus1 plus 1 (+1) may indicate the number of coefficients in the horizontal direction of the cross-plane filter. The entry num_coeff_vert_minus1 plus 1 (+1) may indicate the number of coefficients in the vertical direction of the cross-plane filter. The entry num_coeff_reduced_flag equal to 0 may indicate that the number of cross-plane filter coefficients may be equal to (num_coeff_hori_minus1+1) × (num_coeff_vert_minus1+1), e.g., as depicted in FIG. 15A, in which num_coeff_hori_minus1 is equal to 2 and num_coeff_vert_minus1 is equal to 3.
The entry num_coeff_reduced_flag equal to 1 may indicate that the number of cross-plane filter coefficients, which may typically be equal to (num_coeff_hori_minus1+1) × (num_coeff_vert_minus1+1), may be reduced to (num_coeff_hori_minus1+1) × (num_coeff_vert_minus1+1) − 4, for example, by removing the coefficients at the four corners, e.g., as depicted in FIG. 15B. The support region of the cross-plane filter may be reduced, e.g., by removing the coefficients at the four corners. Employing num_coeff_reduced_flag may provide enhanced flexibility, e.g., regardless of whether the filter coefficients are reduced.
The entry filter_coeff_plus8[i] minus 8 may correspond to the i-th cross-plane filter coefficient. The values of the filter coefficients may be in a range of, for example, -8 to 7. In this case, the entry filter_coeff_plus8[i] may be in the range of 0 to 15 and may be coded, e.g., in accordance with 4-bit fixed-length coding (FLC). The entries scaling_factor_abs_minus1 and scaling_factor_sign may together specify the value of the scaling factor (e.g., M in equation (7)) as follows:
M = (1 − 2 * scaling_factor_sign) * (scaling_factor_abs_minus1 + 1)
The entry bit_shifting may specify the number of bits to be right-shifted after the scaling process. This entry may represent N in equation (7).
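The scaling-factor semantics above can be sketched directly. The M formula follows the text; apply_filter_coeff, which combines the integer coefficient with the scale and right shift, is an assumed illustration of how a decoder might use the signalled values (it is not a normative filtering process):

```python
def decode_scaling_factor(scaling_factor_abs_minus1, scaling_factor_sign,
                          bit_shifting):
    """Recover the scale from the signalled syntax elements:
    M = (1 - 2*sign) * (abs_minus1 + 1), applied as a right shift by N."""
    M = (1 - 2 * scaling_factor_sign) * (scaling_factor_abs_minus1 + 1)
    return M, bit_shifting

def apply_filter_coeff(filter_coeff_plus8, M, N, sample):
    coeff_int = filter_coeff_plus8 - 8   # back to the -8..7 range
    return (sample * coeff_int * M) >> N # right shift after scaling
```

Note that Python's `>>` floors negative products, which matches common fixed-point codec arithmetic, though the exact rounding here is an assumption.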
Different regions of a picture may have different statistical properties. Deriving cross-plane filter coefficients for one or more such regions (e.g., for each such region) may improve chroma coding performance. To illustrate, different sets of cross-plane filter coefficients may be applied to different regions of a picture or a slice, for which multiple sets of cross-plane filter coefficients may be transmitted at the picture level (e.g., in an adaptation parameter set (APS)) and/or at the slice level (e.g., in a slice header).
If cross-plane filtering is used in a post-processing implementation, e.g., applied to the reconstructed video before the video is displayed, one or more sets of filter coefficients may be transmitted as a supplemental enhancement information (SEI) message. For each color plane, the total number of filter sets may be signaled. If the number is greater than zero, one or more sets of cross-plane filter coefficients may be transmitted, e.g., sequentially.
The exemplary syntax table of FIG. 16 illustrates an example of signaling multiple sets of cross-plane filter coefficients in an SEI message that may be referred to as cross_plane_filter(). The following may apply to the entries in the exemplary syntax table. The entry cross_plane_filter_enabled_flag equal to one (1) may specify that cross-plane filtering is enabled. Conversely, the entry cross_plane_filter_enabled_flag equal to zero (0) may specify that cross-plane filtering is disabled.
The entry cb_num_of_filter_sets may specify the number of cross-plane filter coefficient sets that may be used for coding the Cb plane of the current picture. The entry cb_num_of_filter_sets equal to zero (0) may indicate that cross-plane filtering is not applied to the Cb plane of the current picture. The entry cb_filter_coeff[i] may be the i-th cross-plane filter coefficient set for the Cb plane. cb_filter_coeff may be a data structure and may include one or more of the following: num_coeff_hori_minus1, num_coeff_vert_minus1, num_coeff_reduced_flag, filter_coeff_plus8, scaling_factor_abs_minus1, scaling_factor_sign, and bit_shifting.
The entry cr_num_of_filter_sets may specify the number of cross-plane filter coefficient sets that may be used for coding the Cr plane of the current picture. The entry cr_num_of_filter_sets equal to zero (0) may indicate that cross-plane filtering is not applied to the Cr plane of the current picture. The entry cr_filter_coeff[i] may be the i-th cross-plane filter coefficient set for the Cr plane. cr_filter_coeff may be a data structure and may include one or more of the following: num_coeff_hori_minus1, num_coeff_vert_minus1, num_coeff_reduced_flag, filter_coeff_plus8, scaling_factor_abs_minus1, scaling_factor_sign, and bit_shifting.
Region-based cross-plane filtering may be implemented. Cross-plane filtering may be adapted for filtering one or more local areas of a video image, for example, in a case where it is desired to restore (e.g., under the guidance of the luma plane) a loss of high-frequency information in an associated chroma plane. For example, cross-plane filtering may be applied to regions rich in edges and/or textures. Edge detection may be performed first, e.g., in order to find one or more regions to which cross-plane filters may be applied. A high-pass filter (e.g., filter_Y4Cb and/or filter_Y4Cr) may first be applied to the Y plane.
The magnitude of the filtering result may imply whether a filtered pixel is located in a high-frequency area. A large magnitude may indicate sharp edges in the region of the filtered pixel. A magnitude close to zero may indicate that the filtered pixel is located in a homogeneous area. A threshold may be employed to measure the filtering output of filter_Y4Cb and/or filter_Y4Cr. The filtering output may be added to the corresponding pixel in the chroma plane, e.g., in a case where the output is greater than the threshold. For example, the chroma pixels in smooth regions may not be changed, which may avoid random filtering noise. Region-based cross-plane filtering may reduce video coding complexity while maintaining coding performance. For example, region information, which may include one or more regions, may be signaled to the decoder.
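A sketch of the thresholding rule, under the assumed reading that the high-pass offset is added only where its magnitude exceeds the threshold (the precise comparison rule is not specified in the text):

```python
import numpy as np

def region_gated_enhancement(cb_rec, y_offset, threshold):
    """Add the high-pass luma offset to the chroma plane only where its
    magnitude exceeds a threshold, leaving smooth regions untouched and
    avoiding random filtering noise (illustrative, not normative)."""
    mask = np.abs(y_offset) > threshold
    return np.where(mask, cb_rec + y_offset, cb_rec)
```

Pixels whose offset magnitude falls below the threshold are treated as lying in a homogeneous area and keep their reconstructed chroma values.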
In an implementation of region-based cross-plane filtering, one or more regions with different statistical properties (e.g., smooth, colorful, texture-rich, and/or edge-rich regions) may be detected, e.g., at the encoder side. Multiple cross-plane filters may be derived and may be applied to corresponding ones of the one or more regions. Information pertaining to corresponding ones of the one or more regions may be transmitted to the decoder side. Such information may include, for example, the area of a region, the position of a region, and/or the specific cross-plane filter to be applied to a region.
The exemplary syntax table of FIG. 17 illustrates an example of signaling information pertaining to a specific region. The following may apply to the entries in the exemplary syntax table. The entries top_offset, left_offset, right_offset, and bottom_offset may specify the area and/or the position of the current region. These entries may represent respective distances, e.g., in pixels, from the top, left, right, and bottom sides of the current region to the corresponding four sides of the associated picture, e.g., as depicted in FIG. 18.
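The four offsets can be converted to a pixel rectangle as sketched below; treating the right and bottom edges as exclusive is an assumption, since the precise boundary convention is not stated in the text:

```python
def region_bounds(pic_width, pic_height, top_offset, left_offset,
                  right_offset, bottom_offset):
    """Convert the four signalled offsets (distances in pixels from the
    region's sides to the corresponding picture sides) into a rectangle
    (x0, y0, x1, y1) with assumed exclusive right/bottom edges."""
    x0, y0 = left_offset, top_offset
    x1 = pic_width - right_offset     # exclusive right edge (assumption)
    y1 = pic_height - bottom_offset   # exclusive bottom edge (assumption)
    assert 0 <= x0 < x1 <= pic_width and 0 <= y0 < y1 <= pic_height
    return x0, y0, x1, y1
```

All four offsets equal to zero would select the entire picture.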
cross_plane_filtering_region_info() may include information pertaining to cross-plane filtering of a specified region of the Cb plane, cross-plane filtering of a specified region of the Cr plane, or cross-plane filtering of respective specified regions of both the Cb plane and the Cr plane.
The entry cb_filtering_enabled_flag equal to one (1) may indicate that cross-plane filtering for the current region of the Cb plane is enabled. The entry cb_filtering_enabled_flag equal to zero (0) may indicate that cross-plane filtering for the current region of the Cb plane is disabled. The entry cb_filter_idx may specify that the cross-plane filter cb_filter_coeff[cb_filter_idx] (e.g., cb_filter_coeff signaled as depicted in FIG. 16) may be applied to the current region of the Cb plane.
The entry cr_filtering_enabled_flag equal to one (1) may indicate that cross-plane filtering for the current region of the Cr plane is enabled. The entry cr_filtering_enabled_flag equal to zero (0) may indicate that cross-plane filtering for the current region of the Cr plane is disabled. The entry cr_filter_idx may specify that the cross-plane filter cr_filter_coeff[cr_filter_idx] (e.g., cr_filter_coeff signaled as depicted in FIG. 16) may be applied to the current region of the Cr plane.
Information pertaining to one or more regions may be transmitted at the picture level (e.g., in an APS or an SEI message) or at the slice level (e.g., in a slice header). The exemplary syntax table of FIG. 19 illustrates an example of signaling multiple cross-plane filters together with multiple regions in an SEI message that may be referred to as cross_plane_filter(). The information pertaining to regions is indicated in italics.
The following may apply to the entries in the exemplary syntax table. The entry cb_num_of_regions_minus1 plus 1 (+1) may specify the number of regions in the Cb plane. Each region may be filtered by a corresponding cross-plane filter. The entry cb_num_of_regions_minus1 equal to zero (0) may indicate that the entirety of the Cb plane may be filtered by one cross-plane filter. The entry cb_region_info[i] may be the i-th region information in the Cb plane. The entry cb_region_info may be a data structure and may include one or more of the following: top_offset, left_offset, right_offset, bottom_offset, cb_filtering_enabled_flag, and cb_filter_idx.
The entry cr_num_of_regions_minus1 plus 1 (+1) may specify the number of regions in the Cr plane. Each region may be filtered by a corresponding cross-plane filter. The entry cr_num_of_regions_minus1 equal to zero (0) may indicate that the entirety of the Cr plane may be filtered by one cross-plane filter. The entry cr_region_info[i] may be the i-th region information in the Cr plane. The entry cr_region_info may be a data structure and may include one or more of the following: top_offset, left_offset, right_offset, bottom_offset, cr_filtering_enabled_flag, and cr_filter_idx.
Cross-plane filtering may be applied in a single-layer video coding system and/or in a multi-layer video coding system. In accordance with single-layer video coding (e.g., as depicted in FIG. 1 and FIG. 2), cross-plane filtering may be applied, for example, to improve reference pictures (e.g., pictures stored in reference picture stores 164 and/or 264), such that one or more subsequent frames may be better predicted (e.g., with respect to the chroma planes).
Cross-plane filtering may be used as a post-processing method. For example, cross-plane filtering may be applied to the reconstructed output video 220 (e.g., before it is displayed). Although such filtering is not part of the MCP loop and thus may not affect the coding of subsequent pictures, the post-processing may improve (e.g., directly improve) the quality of the video for display. For example, cross-plane filtering may be applied in HEVC post-processing by using supplemental enhancement information (SEI) signaling. Cross-plane filter information estimated at the encoder side may be conveyed (e.g., in an SEI message).
In accordance with an example using multi-layer video coding (e.g., as depicted in FIG. 3 and FIG. 4), cross-plane filtering may be applied to one or more upsampled BL pictures, for use in predicting higher-layer pictures, e.g., before the one or more pictures are placed in the EL DPB buffer (e.g., the reference picture list). As shown in FIG. 5, cross-plane filtering may be performed in the third stage. In order to improve the quality of one or both chroma planes of an upsampled base layer reconstructed picture (e.g., an ILP picture), the corresponding luma plane involved in the training and/or the filtering may be the luma plane of the same ILP picture, where the training and/or filtering processes may be the same as those used in single-layer video coding.
In accordance with another example using multi-layer video coding, the corresponding luma plane of the base layer reconstructed picture that has not been upsampled may be used (e.g., directly used) to support cross-plane training and/or filtering, e.g., to enhance the chroma planes in the ILP picture. For example, in accordance with 2X spatial SVC with a 4:2:0 video source, the size of the base layer luma plane may be substantially the same as (e.g., exactly the same as) the size of one or both of the corresponding chroma planes of the ILP picture. The sampling grids of the two kinds of planes may be different. The luma plane in the base layer picture may be filtered by a phase correction filter, e.g., so as to be aligned (e.g., exactly aligned) with the sampling grid of the chroma planes in the ILP picture. One or more subsequent operations may be the same as the operations described elsewhere herein (e.g., the operations for single-layer video coding). The color format may be regarded as 4:4:4 (e.g., in accordance with FIG. 10A or FIG. 11A). Using the base layer luma plane to support cross-plane filtering for the chroma planes in the ILP picture may be extended, e.g., by simple derivation, to spatial scaling of other ratios and/or to other color formats.
In accordance with another example using multi-layer video coding, cross-plane filtering may be applied to a reconstructed base layer picture that has not yet been upsampled. The output of the cross-plane filtering may be upsampled. As shown in FIG. 5, cross-plane filtering may be performed in the first stage. In spatial scaling (e.g., where the BL has a lower resolution than the EL), applying cross-plane filtering to fewer pixels may involve less computational complexity than one or more other multi-layer video coding examples described herein. Equations (2) and (3) may not apply directly, e.g., because, with reference to equation (2), Y_rec ⊗ filter_Y4Cb + Cb_rec ⊗ filter_Cb4Cb + Cr_rec ⊗ filter_Cr4Cb and Cb_org may have different dimensions and may not be directly subtracted. Y_rec, Cb_rec, and Cr_rec may have the same resolution, namely that of the base layer picture. Cb_org may have the resolution of the enhancement layer picture. The derivation of exemplary cross-plane filter coefficients in accordance with multi-layer video coding may be achieved using equations (8) and (9):
(filter_Y4Cb, filter_Cb4Cb, filter_Cr4Cb)_opt = arg min E[(U(Y_rec ⊗ filter_Y4Cb + Cb_rec ⊗ filter_Cb4Cb + Cr_rec ⊗ filter_Cr4Cb) − Cb_org)²] (8)
(filter_Y4Cr, filter_Cb4Cr, filter_Cr4Cr)_opt = arg min E[(U(Y_rec ⊗ filter_Y4Cr + Cb_rec ⊗ filter_Cb4Cr + Cr_rec ⊗ filter_Cr4Cr) − Cr_org)²] (9)
where U may be an upsampling function that takes a base layer picture as input and outputs an upsampled picture having the enhancement layer resolution.
According to the cross-plane filtering techniques shown in FIG. 9A and FIG. 9B, a chroma plane may be enhanced by the luma plane and by itself (e.g., excluding the other chroma plane), and equations (8) and (9) may be simplified, for example as shown in equations (10) and (11).
Based on the cross-plane filtering techniques shown in FIG. 9A and FIG. 9B, the size of filter_Cb4Cb and/or filter_Cr4Cr may be reduced to 1x1, and the value of the filter coefficient may be set to 1.0. Equations (10) and (11) may be simplified, for example as shown in equations (12) and (13).
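The simplified case above (chroma passed through a 1x1 filter with coefficient 1.0, luma passed through a trained cross-plane filter whose output is added as an offset) can be sketched as follows. This is a minimal illustration, assuming 4:4:4-aligned planes held as lists of lists and a hypothetical luma kernel; the trained coefficients of the patent's equations are not reproduced here.

```python
def filter2d(plane, kernel):
    """Apply a 2D FIR filter to a plane (zero padding at the borders)."""
    kh, kw = len(kernel), len(kernel[0])
    oy, ox = kh // 2, kw // 2
    h, w = len(plane), len(plane[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for j in range(kh):
                for i in range(kw):
                    yy, xx = y + j - oy, x + i - ox
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += kernel[j][i] * plane[yy][xx]
            out[y][x] = acc
    return out

def enhance_chroma(cb_rec, y_rec, luma_kernel):
    """Cb passes through the 1x1 identity filter (coefficient 1.0); the
    luma-derived offset from the cross-plane filter is added on top."""
    offset = filter2d(y_rec, luma_kernel)
    return [[c + o for c, o in zip(row_c, row_o)]
            for row_c, row_o in zip(cb_rec, offset)]
```

In practice the luma kernel would be trained (e.g., by least squares against the original chroma plane) so that its output carries the high-frequency detail missing from the reconstructed chroma.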
Cross-plane filtering may be applied adaptively. For example, when applied to multi-layer video coding, cross-plane filtering may be applied adaptively in the first and/or third stages, for example as shown in FIG. 5.
Cross-plane filtering may be applied adaptively at one or more coding levels, including for example one or more of the sequence level, the picture level, the slice level, and the block level. According to sequence-level adaptation, for example, an encoder may determine to use cross-plane filtering in the first stage and/or the third stage to encode a portion of a video sequence (e.g., the entirety of the video sequence). This determination may be expressed, for example, as a binary flag that may be included in a sequence header and/or in one or more sequence-level parameter sets (e.g., a video parameter set (VPS) and/or a sequence parameter set (SPS)).
According to picture-level adaptation, for example, an encoder may determine to use cross-plane filtering in the first stage and/or the third stage to encode one or more EL pictures (e.g., each EL picture of the video sequence). This determination may be expressed as a binary flag that may be included, for example, in a picture header and/or in one or more picture-level parameter sets (e.g., an adaptation parameter set (APS) and/or a picture parameter set (PPS)).
According to slice-level adaptation, for example, an encoder may determine to use cross-plane filtering in the first stage and/or the third stage to encode one or more EL video slices (e.g., each EL slice). This determination may be expressed, for example, as a binary flag that may be included in a slice header. Such adaptation may be implemented according to (e.g., extended to) one or more other levels, for example with signaling mechanisms such as those described above.
For example, picture-based cross-plane filtering may be implemented for multi-layer video coding. Information related to this cross-plane filtering may be signaled. For example, one or more flags (e.g., uplane_filtering_flag and/or vplane_filtering_flag) may be coded (e.g., coded once per picture) and may be transmitted to the decoder. The flags uplane_filtering_flag and/or vplane_filtering_flag may indicate, for example, whether cross-plane filtering should be applied to the Cb plane and/or to the Cr plane, respectively. The encoder may determine whether to enable or disable cross-plane filtering for the chroma planes of one or more pictures (e.g., on a picture-by-picture basis). The encoder may be configured to make this determination, for example, so as to improve coding performance and/or in accordance with a desired tradeoff between coding performance and complexity (e.g., enabling cross-plane filtering may increase decoding complexity).
The encoder may be configured to use one or more techniques to determine whether to apply picture-based cross-plane filtering to one or more chroma planes. For example, according to an example of performing picture-level selection, the Cb planes before and after filtering (e.g., Cb_rec and Cb_imp) may be compared with the original Cb plane in the EL picture (e.g., Cb_org). Mean squared error (MSE) values before and after filtering (which may be referred to as MSE_rec and MSE_imp, respectively) may be calculated and compared. For example, if MSE_imp is smaller than MSE_rec, which may indicate that applying cross-plane filtering may reduce distortion, cross-plane filtering may be enabled for the Cb plane. If MSE_imp is not smaller than MSE_rec, cross-plane filtering may be disabled for the Cb plane. According to this technique, the MSE may be calculated over the entire picture, which may mean that a single weighting factor may be applied to one or more pixels (e.g., each pixel) in the MSE calculation.
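The whole-picture selection rule just described can be sketched as follows; a minimal illustration, assuming planes stored as lists of lists (the function names are illustrative, not from the patent).

```python
def mse(plane_a, plane_b):
    """Mean squared error over two equally sized planes."""
    acc, n = 0.0, 0
    for row_a, row_b in zip(plane_a, plane_b):
        for a, b in zip(row_a, row_b):
            acc += (a - b) ** 2
            n += 1
    return acc / n

def select_cross_plane_filtering(cb_rec, cb_imp, cb_org):
    """Enable the filter for this picture only if it reduces distortion,
    i.e., MSE_imp < MSE_rec as described above."""
    return mse(cb_imp, cb_org) < mse(cb_rec, cb_org)
```

The same comparison would be run independently for the Cr plane to drive the per-plane flags (e.g., uplane_filtering_flag and vplane_filtering_flag).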
According to another example of performing picture-level selection, the MSE may be calculated based on one or more pixels involved in ILP, for example based only on those pixels involved in ILP. When the encoder determines whether to apply cross-plane filtering to the Cb plane, however, the ILP map for the picture may not yet be available. For example, the determination may be made before the EL picture is encoded, whereas the ILP map may become available only once the EL picture has been encoded.
According to another example of performing picture-level selection, a multi-pass coding strategy may be utilized. In a first pass, the EL picture may be encoded and the ILP map may be recorded. In a second pass, whether to apply cross-plane filtering may be determined, for example according to an MSE calculation that may be confined to the ILP blocks marked in the ILP map, and the picture may be encoded according to this determination. Such multi-pass coding may be time-consuming and may involve greater computational complexity than single-pass coding.
Moving objects in respective pictures (e.g., respective pictures of a video sequence) may be more likely than non-moving objects to be coded using ILP pictures. The ILP maps of successive pictures (e.g., successive pictures of a video sequence) may be correlated (e.g., may exhibit a high degree of correlation). Such successive ILP maps may exhibit one or more displacements relative to each other (e.g., relatively small displacements). Such displacements may be due, for example, to the respective different time instances of the pictures.
According to another example of performing picture-level selection, the ILP maps of one or more previously coded EL pictures may be used to predict the ILP map of the current EL picture to be coded. The predicted ILP map may be used to locate one or more blocks that are likely to be used for ILP in the process of coding the current EL picture. Such blocks that may be used may be referred to as potential ILP blocks. One or more potential ILP blocks may be included in calculating the MSE (e.g., according to the description above), and/or the MSE so calculated may be used to determine whether to apply cross-plane filtering.
The dimension of the ILP map may depend, for example, on the granularity selected by the encoder. If the dimension of the picture is, for example, W x H (e.g., in terms of luma resolution), the dimension of the ILP map may be W x H, in which case an entry may represent whether a corresponding pixel is used for ILP. Alternatively, the dimension of the ILP map may be (W/M) x (H/N), in which case an entry may represent whether a corresponding block of size M x N is used for ILP. According to an exemplary implementation, M = N = 4 may be selected.
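The block-granularity map dimensions can be computed as follows; a small sketch under the assumption that a partial block at the right or bottom edge still gets its own entry (the patent does not state the rounding behavior).

```python
def ilp_map_shape(width, height, m=4, n=4):
    """Shape (cols, rows) of a (W/M) x (H/N) ILP map for a W x H
    luma picture, rounding up so edge blocks are covered."""
    return ((width + m - 1) // m, (height + n - 1) // n)
```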
An accurate ILP map (e.g., one recorded after the EL picture has been coded) may be a binary map, such that the value of an entry (e.g., each entry) may be limited to one of two possible values (e.g., zero (0) or one (1)), which may indicate whether the entry is used for ILP. For example, the value 1 may indicate that the entry is used for ILP, and the value 0 may indicate that it is not.
The predicted ILP map may be a multi-level map. According to such an ILP map, each entry may have one of multiple possible values, which may represent multiple levels of confidence that the block will be used for ILP. A larger value may indicate a higher level of confidence. According to an exemplary implementation, possible values from 0 to 128 may be used for the predicted ILP map, where 128 represents the highest confidence and 0 represents the lowest confidence.
FIG. 20 depicts an example picture-level selection algorithm 2000 for cross-plane filtering. The illustrated picture-level selection algorithm may be applied, for example, to the Cb plane and/or the Cr plane. At 2010, for example before the first picture is encoded, the predicted ILP map, which may be labeled PredILPMap, may be initialized. According to the depicted algorithm, it may be assumed that each block has an equal chance of being used for ILP, and the value of each entry of PredILPMap may be set to 128.
At 2020, the encoder may determine whether to apply cross-plane filtering. An enhanced Cb plane, Cb_imp, may be generated by the cross-plane filtering. A weighted MSE may be calculated, for example using equations (14) and (15).
In equations (14) and (15), Cb_rec and Cb_imp may represent the Cb planes before and after cross-plane filtering, Cb_org may represent the original Cb plane of the current EL picture to be coded, and (x, y) may represent the position of a pixel in the luma plane grid. As shown, equations (14) and (15) assume 4:2:0 color subsampling and ILP map entries representing 4x4 block sizes, so the corresponding positions in the Cb plane and in PredILPMap may be (x/2, y/2) and (x/4, y/4), respectively. For each pixel, the squared error (Cb_imp(x/2, y/2) - Cb_org(x/2, y/2))^2 or (Cb_rec(x/2, y/2) - Cb_org(x/2, y/2))^2 may be weighted by the corresponding factor in PredILPMap, for example before the error is accumulated into Weighted_MSE_imp or Weighted_MSE_rec. This may mean that distortion on one or more pixels that are more likely to be used for ILP may carry a higher weight in the weighted MSE.
Alternatively or additionally, at 2020, an enhanced Cr plane, Cr_imp, may be generated by the cross-plane filtering. A weighted MSE may be calculated, for example using equations (16) and (17).
In equations (16) and (17), Cr_rec and Cr_imp may represent the Cr planes before and after cross-plane filtering, Cr_org may represent the original Cr plane of the current EL picture to be coded, and (x, y) may represent the position of a pixel in the luma plane grid. As shown, equations (16) and (17) assume 4:2:0 color subsampling and ILP map entries representing 4x4 block sizes, so the corresponding positions in the Cr plane and in PredILPMap may be (x/2, y/2) and (x/4, y/4), respectively. For each pixel, the squared error (Cr_imp(x/2, y/2) - Cr_org(x/2, y/2))^2 or (Cr_rec(x/2, y/2) - Cr_org(x/2, y/2))^2 may be weighted by the corresponding factor in PredILPMap, for example before the error is accumulated into Weighted_MSE_imp or Weighted_MSE_rec. This may mean that distortion on one or more pixels that are more likely to be used for ILP may carry a higher weight in the weighted MSE.
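The weighted-MSE accumulation described for equations (14)-(17) can be sketched as follows. This is an illustration under stated assumptions: (x, y) walks the luma grid, the 4:2:0 chroma sample sits at (x/2, y/2), the weight comes from the 4x4-block entry of PredILPMap at (x/4, y/4), and the sum is normalized by the pixel count (the patent's exact normalization, in the equation images, is not reproduced here).

```python
def weighted_mse(chroma, chroma_org, pred_ilp_map):
    """PredILPMap-weighted MSE of a chroma plane against the original,
    indexed on the luma grid as in equations (14)-(17)."""
    luma_h = len(chroma) * 2          # 4:2:0: luma grid is 2x the chroma grid
    luma_w = len(chroma[0]) * 2
    acc, count = 0.0, 0
    for y in range(luma_h):
        for x in range(luma_w):
            diff = chroma[y // 2][x // 2] - chroma_org[y // 2][x // 2]
            acc += pred_ilp_map[y // 4][x // 4] * diff ** 2
            count += 1
    return acc / count

def weighted_decision(cb_rec, cb_imp, cb_org, pred_ilp_map):
    """Enable cross-plane filtering if Weighted_MSE_imp < Weighted_MSE_rec."""
    return (weighted_mse(cb_imp, cb_org, pred_ilp_map)
            < weighted_mse(cb_rec, cb_org, pred_ilp_map))
```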
Weighted_MSE_imp and Weighted_MSE_rec may be compared with each other. If Weighted_MSE_imp is smaller than Weighted_MSE_rec, which may indicate that cross-plane filtering may reduce distortion (e.g., the distortion of one or more of the potential ILP blocks), cross-plane filtering may be enabled. If Weighted_MSE_imp is not smaller than Weighted_MSE_rec, cross-plane filtering may be disabled.
Once the determination has been made at 2020, the current EL picture may be encoded at 2030, and the current ILP map, which may be labeled CurrILPMap, may be recorded at 2040. The current ILP map may be used, for example, with the EL picture following the current EL picture. The current ILP map may be accurate rather than predicted, and may be binary. If a corresponding block is used for ILP, the value of the entry for that block may be set to 128. If the corresponding block is not used for ILP, the value of the entry for that block may be set to zero (0).
At 2050, the current ILP map may be used to update the predicted ILP map, for example as shown in equation (18). According to an exemplary update process, the sum of the previously predicted ILP map (e.g., PredILPMap(x, y)) and the current ILP map (e.g., CurrILPMap(x, y)) may be divided by 2, which may mean that an ILP map associated with an earlier picture may have a relatively small influence on the updated predicted ILP map.
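The equation (18)-style update can be sketched as follows; a minimal illustration assuming integer division (the exact arithmetic of the patent's equation image is not reproduced), with CurrILPMap binary-valued in {0, 128} as described above.

```python
def update_pred_ilp_map(pred_map, curr_map):
    """PredILPMap = (PredILPMap + CurrILPMap) / 2, entry by entry.
    Older pictures' contributions halve with each update, so values
    stay in 0..128 when curr_map entries are 0 or 128."""
    return [[(p + c) // 2 for p, c in zip(pred_row, curr_row)]
            for pred_row, curr_row in zip(pred_map, curr_map)]
```

For example, a block used for ILP in every recent picture converges to confidence 128, while one that stops being used decays toward 0.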
At 2060, it may be determined whether the end of the video sequence has been reached. If the end of the video sequence has not yet been reached, one or more of the operations described above (e.g., 2020 to 2060) may be repeated, for example so as to encode subsequent EL pictures. If the end of the video sequence has been reached, the example picture-level selection algorithm 2000 may terminate at 2070.
The video coding techniques described herein (e.g., utilizing cross-plane filtering) may be implemented in accordance with transporting video in a wireless communication system, such as the example wireless communication system 2100 and its components depicted in FIGS. 21A-21E.
FIG. 21A is a diagram of an example communication system 2100 in which one or more disclosed embodiments may be implemented. For example, a wireless network (e.g., a wireless network comprising one or more components of the communication system 2100) may be configured such that bearers that extend beyond the wireless network (e.g., beyond a walled garden associated with the wireless network) may be assigned QoS characteristics.
The communication system 2100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users. The communication system 2100 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth. For example, the communication system 2100 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and the like.
As shown in FIG. 21A, the communication system 2100 may include at least one wireless transmit/receive unit (WTRU) (e.g., WTRUs 2102a, 2102b, 2102c, and 2102d), a radio access network (RAN) 2104, a core network 2106, a public switched telephone network (PSTN) 2108, the Internet 2110, and other networks 2112, though it should be appreciated that the disclosed embodiments contemplate any number of WTRUs, base stations, networks, and/or network elements. Each of the WTRUs 2102a, 2102b, 2102c, 2102d may be any type of device configured to operate and/or communicate in a wireless environment. By way of example, the WTRUs 2102a, 2102b, 2102c, 2102d may be configured to transmit and/or receive wireless signals and may include user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, consumer electronics, and the like.
The communication system 2100 may also include a base station 2114a and a base station 2114b. Each of the base stations 2114a, 2114b may be any type of device configured to wirelessly interface with at least one of the WTRUs 2102a, 2102b, 2102c, 2102d in order to facilitate access to one or more communication networks, such as the core network 2106, the Internet 2110, and/or the networks 2112. By way of example, the base stations 2114a, 2114b may be a base transceiver station (BTS), a Node B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and the like. While the base stations 2114a, 2114b are each depicted as a single element, it should be appreciated that the base stations 2114a, 2114b may include any number of interconnected base stations and/or network elements.
The base station 2114a may be part of the RAN 2104, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc. The base station 2114a and/or the base station 2114b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown). The cell may further be divided into cell sectors. For example, the cell associated with the base station 2114a may be divided into three sectors. Thus, in one embodiment, the base station 2114a may include three transceivers, e.g., one for each sector of the cell. In another embodiment, the base station 2114a may employ multiple-input multiple-output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell.
The base stations 2114a, 2114b may communicate with one or more of the WTRUs 2102a, 2102b, 2102c, 2102d over an air interface 2116, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, etc.). The air interface 2116 may be established using any suitable radio access technology (RAT).
More specifically, as noted above, the communication system 2100 may be a multiple access system and may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like. For example, the base station 2114a in the RAN 2104 and the WTRUs 2102a, 2102b, 2102c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 2116 using wideband CDMA (WCDMA). WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+). HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).
In another embodiment, the base station 2114a and the WTRUs 2102a, 2102b, 2102c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 2116 using LTE and/or LTE-Advanced (LTE-A).
In other embodiments, the base station 2114a and the WTRUs 2102a, 2102b, 2102c may implement radio technologies such as IEEE 802.16 (i.e., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.
The base station 2114b in FIG. 21A may be, for example, a wireless router, a Home Node B, a Home eNode B, or an access point, and may utilize any suitable RAT to facilitate wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and the like. In one embodiment, the base station 2114b and the WTRUs 2102c and/or 2102d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN). In another embodiment, the base station 2114b and the WTRUs 2102c, 2102d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN). In yet another embodiment, the base station 2114b and the WTRUs 2102c, 2102d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, etc.) to establish a picocell or femtocell. As shown in FIG. 21A, the base station 2114b may have a direct connection to the Internet 2110. Thus, the base station 2114b may not be required to access the Internet 2110 via the core network 2106.
The RAN 2104 may be in communication with the core network 2106, which may be any type of network configured to provide voice, data, applications, and/or voice over Internet protocol (VoIP) services to one or more of the WTRUs 2102a, 2102b, 2102c, 2102d. For example, the core network 2106 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication. Although not shown in FIG. 21A, it should be appreciated that the RAN 2104 and/or the core network 2106 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 2104 or a different RAT. For example, in addition to being connected to the RAN 2104, which may utilize an E-UTRA radio technology, the core network 2106 may also be in communication with another RAN (not shown) employing a GSM radio technology.
The core network 2106 may also serve as a gateway for the WTRUs 2102a, 2102b, 2102c, 2102d to access the PSTN 2108, the Internet 2110, and/or other networks 2112. The PSTN 2108 may include circuit-switched telephone networks that provide plain old telephone service (POTS). The Internet 2110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP), and IP in the TCP/IP Internet protocol suite. The networks 2112 may include wired or wireless communication networks owned and/or operated by other service providers. For example, the networks 2112 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 2104 or a different RAT.
Some or all of the WTRUs 2102a, 2102b, 2102c, 2102d in the communication system 2100 may include multi-mode capabilities, i.e., the WTRUs 2102a, 2102b, 2102c, 2102d may include multiple transceivers for communicating with different wireless networks over different wireless links. For example, the WTRU 2102c shown in FIG. 21A may be configured to communicate with the base station 2114a, which may employ a cellular-based radio technology, and with the base station 2114b, which may employ an IEEE 802 radio technology.
FIG. 21B is a system diagram of an example WTRU 2102. As shown in FIG. 21B, the WTRU 2102 may include a processor 2118, a transceiver 2120, a transmit/receive element 2122, a speaker/microphone 2124, a keypad 2126, a display/touchpad 2128, non-removable memory 2130, removable memory 2132, a power source 2134, a global positioning system (GPS) chipset 2136, and/or other peripherals 2138. It should be appreciated that the WTRU 2102 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment.
The processor 2118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) circuit, any other type of integrated circuit (IC), a state machine, and the like. The processor 2118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 2102 to operate in a wireless environment. The processor 2118 may be coupled to the transceiver 2120, which may be coupled to the transmit/receive element 2122. While FIG. 21B depicts the processor 2118 and the transceiver 2120 as separate components, it should be appreciated that the processor 2118 and the transceiver 2120 may be integrated together in an electronic package or chip.
The transmit/receive element 2122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 2114a) over the air interface 2116. For example, in one embodiment, the transmit/receive element 2122 may be an antenna configured to transmit and/or receive RF signals. In another embodiment, the transmit/receive element 2122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example. In yet another embodiment, the transmit/receive element 2122 may be configured to transmit and receive both RF and light signals. It should be appreciated that the transmit/receive element 2122 may be configured to transmit and/or receive any combination of wireless signals.
In addition, although the transmit/receive element 2122 is depicted in FIG. 21B as a single element, the WTRU 2102 may include any number of transmit/receive elements 2122. More specifically, the WTRU 2102 may employ MIMO technology. Thus, in one embodiment, the WTRU 2102 may include two or more transmit/receive elements 2122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 2116.
The transceiver 2120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 2122 and/or to demodulate the signals that are received by the transmit/receive element 2122. As noted above, the WTRU 2102 may have multi-mode capabilities. Thus, the transceiver 2120 may include multiple transceivers for enabling the WTRU 2102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, for example.
The processor 2118 of the WTRU 2102 may be coupled to, and may receive user input data from, the speaker/microphone 2124, the keypad 2126, and/or the display/touchpad 2128 (e.g., a liquid crystal display (LCD) display unit or an organic light-emitting diode (OLED) display unit). The processor 2118 may also output user data to the speaker/microphone 2124, the keypad 2126, and/or the display/touchpad 2128. In addition, the processor 2118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 2130 and/or the removable memory 2132. The non-removable memory 2130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 2132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 2118 may access information from, and store data in, memory that is not physically located on the WTRU 2102, such as on a server or a home computer (not shown).
The processor 2118 may receive power from the power source 2134 and may be configured to distribute and/or control the power to the other components in the WTRU 2102. The power source 2134 may be any suitable device for powering the WTRU 2102. For example, the power source 2134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
The processor 2118 may also be coupled to the GPS chipset 2136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 2102. In addition to, or in lieu of, the information from the GPS chipset 2136, the WTRU 2102 may receive location information over the air interface 2116 from a base station (e.g., base stations 2114a, 2114b) and/or determine its location based on the timing of signals being received from two or more nearby base stations. It should be appreciated that the WTRU 2102 may acquire location information by way of any suitable location-determination implementation while remaining consistent with an embodiment.
The processor 2118 may further be coupled to other peripherals 2138, which may include one or more software and/or hardware modules that provide additional features, functionality, and/or wired or wireless connectivity. For example, the peripherals 2138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands-free headset, a Bluetooth module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
FIG. 21C is a system diagram of the communication system 2100 including a RAN 2104a and a core network 2106a, which comprise example implementations of the RAN 2104 and the core network 2106, respectively. As noted above, the RAN 2104 (e.g., the RAN 2104a) may employ a UTRA radio technology to communicate with the WTRUs 2102a, 2102b, 2102c over the air interface 2116. The RAN 2104a may also be in communication with the core network 2106a. As shown in FIG. 21C, the RAN 2104a may include Node Bs 2140a, 2140b, 2140c, which may each include one or more transceivers for communicating with the WTRUs 2102a, 2102b, 2102c over the air interface 2116. Each of the Node Bs 2140a, 2140b, 2140c may be associated with a particular cell (not shown) within the RAN 2104a. The RAN 2104a may also include RNCs 2142a, 2142b. It should be appreciated that the RAN 2104a may include any number of Node Bs and RNCs while remaining consistent with an embodiment.
As shown in FIG. 21C, the Node Bs 2140a, 2140b may be in communication with the RNC 2142a. Additionally, the Node B 2140c may be in communication with the RNC 2142b. The Node Bs 2140a, 2140b, 2140c may communicate with the respective RNCs 2142a, 2142b via an Iub interface. The RNCs 2142a, 2142b may be in communication with one another via an Iur interface. Each of the RNCs 2142a, 2142b may be configured to control the respective Node Bs 2140a, 2140b, 2140c to which it is connected. In addition, each of the RNCs 2142a, 2142b may be configured to carry out or support other functionality, such as outer loop power control, load control, admission control, packet scheduling, handover control, macrodiversity, security functions, data encryption, and the like.
The core network 2106a shown in FIG. 21C may include a media gateway (MGW) 2144, a mobile switching center (MSC) 2146, a serving GPRS support node (SGSN) 2148, and/or a gateway GPRS support node (GGSN) 2150. While each of the foregoing elements is depicted as part of the core network 2106a, it should be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
The RNC 2142a in the RAN 2104a may be connected to the MSC 2146 in the core network 2106a via an IuCS interface. The MSC 2146 may be connected to the MGW 2144. The MSC 2146 and the MGW 2144 may provide the WTRUs 2102a, 2102b, 2102c with access to circuit-switched networks, such as the PSTN 2108, to facilitate communications between the WTRUs 2102a, 2102b, 2102c and traditional landline communication devices.
The RNC 2142a in the RAN 2104a may also be connected to the SGSN 2148 in the core network 2106a via an IuPS interface. The SGSN 2148 may be connected to the GGSN 2150. The SGSN 2148 and the GGSN 2150 may provide the WTRUs 2102a, 2102b, 2102c with access to packet-switched networks, such as the Internet 2110, to facilitate communications between the WTRUs 2102a, 2102b, 2102c and IP-enabled devices.
As noted above, the core network 2106a may also be connected to the networks 2112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
FIG. 21D is a system diagram of the communication system 2100 including a RAN 2104b and a core network 2106b, which comprise example implementations of the RAN 2104 and the core network 2106, respectively. As noted above, the RAN 2104b may employ an E-UTRA radio technology to communicate with the WTRUs 2102a, 2102b, 2102c over the air interface 2116. The RAN 2104b may also be in communication with the core network 2106b.
The RAN 2104b may include eNode Bs 2140d, 2140e, 2140f, though it should be appreciated that the RAN 2104b may include any number of eNode Bs while remaining consistent with an embodiment. Each of the eNode Bs 2140d, 2140e, 2140f may include one or more transceivers for communicating with the WTRUs 2102a, 2102b, 2102c over the air interface 2116. In one embodiment, the eNode Bs 2140d, 2140e, 2140f may implement MIMO technology. Thus, the eNode B 2140d, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 2102a.
Each of the eNode Bs 2140d, 2140e, 2140f may be associated with a particular cell (not shown) and may be configured to handle radio resource management decisions, handover decisions, scheduling of users in the uplink and/or downlink, and the like. As shown in FIG. 21D, the eNode Bs 2140d, 2140e, 2140f may communicate with one another over an X2 interface.
Core net 2106b shown in Figure 21 D may include mobility management entity (MME) 2143, gateway 2145, With packet data network (PDN) gateway 2147 etc..Although each of foregoing units is shown as a part of core net 2106b, It is to be understood that any one in these units can be possessed by other entities other than core network operators and/ Or operation.
The MME 2143 may be connected to each of the eNode Bs 2140d, 2140e, 2140f in the RAN 2104b via an S1 interface and may serve as a control node. For example, the MME 2143 may be responsible for authenticating users of the WTRUs 2102a, 2102b, 2102c, bearer activation/deactivation, selecting a particular serving gateway during an initial attach of the WTRUs 2102a, 2102b, 2102c, and the like. The MME 2143 may also provide a control plane function for switching between the RAN 2104b and other RANs (not shown) that employ other radio technologies, such as GSM or WCDMA.
The serving gateway 2145 may be connected to each of the eNode Bs 2140d, 2140e, 2140f in the RAN 2104b via the S1 interface. The serving gateway 2145 may generally route and forward user data packets to/from the WTRUs 2102a, 2102b, 2102c. The serving gateway 2145 may also perform other functions, such as anchoring user planes during inter-eNode-B handovers, triggering paging when downlink data is available for the WTRUs 2102a, 2102b, 2102c, managing and storing contexts of the WTRUs 2102a, 2102b, 2102c, and the like.
The serving gateway 2145 may also be connected to the PDN gateway 2147, which may provide the WTRUs 2102a, 2102b, 2102c with access to packet-switched networks, such as the Internet 2110, to facilitate communications between the WTRUs 2102a, 2102b, 2102c and IP-enabled devices.
The core network 2106b may facilitate communications with other networks. For example, the core network 2106b may provide the WTRUs 2102a, 2102b, 2102c with access to circuit-switched networks, such as the PSTN 2108, to facilitate communications between the WTRUs 2102a, 2102b, 2102c and traditional landline communications devices. For example, the core network 2106b may include, or may communicate with, an IP gateway (e.g., an IP multimedia subsystem (IMS) server) that serves as an interface between the core network 2106b and the PSTN 2108. In addition, the core network 2106b may provide the WTRUs 2102a, 2102b, 2102c with access to the networks 2112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
Figure 21E is a system diagram of the communication system 2100 that includes a RAN 2104c and a core network 2106c, which comprise example implementations of the RAN 2104 and the core network 2106, respectively. The RAN 2104 (e.g., the RAN 2104c) may be an access service network (ASN) that employs IEEE 802.16 radio technology to communicate with the WTRUs 2102a, 2102b, and 2102c over the air interface 2116. As described below, the communication links between the different functional entities of the WTRUs 2102a, 2102b, and/or 2102c, the RAN 2104c, and the core network 2106c may be defined as reference points.
As shown in Figure 21E, the RAN 2104c may include base stations 2140g, 2140h, 2140i and an ASN gateway 2141, though it will be appreciated that the RAN 2104c may include any number of base stations and ASN gateways while remaining consistent with an embodiment. The base stations 2140g, 2140h, 2140i may each be associated with a particular cell (not shown) in the RAN 2104c and may each include one or more transceivers for communicating with the WTRUs 2102a, 2102b, 2102c over the air interface 2116. In one embodiment, the base stations 2140g, 2140h, 2140i may implement MIMO technology. Thus, the base station 2140g, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 2102a. The base stations 2140g, 2140h, 2140i may also provide mobility management functions, such as handoff triggering, tunnel establishment, radio resource management, traffic classification, quality of service (QoS) policy enforcement, and the like. The ASN gateway 2141 may serve as a traffic aggregation point and may be responsible for paging, caching of subscriber profiles, routing to the core network 2106c, and the like.
The air interface 2116 between the WTRUs 2102a, 2102b, 2102c and the RAN 2104c may be defined as an R1 reference point that implements the IEEE 802.16 specification. In addition, each of the WTRUs 2102a, 2102b, 2102c may establish a logical interface (not shown) with the core network 2106c. The logical interface between the WTRUs 2102a, 2102b, 2102c and the core network 2106c may be defined as an R2 reference point, which may be used for authentication, authorization, IP host management, and/or mobility management.
The communication link between each of the base stations 2140g, 2140h, 2140i may be defined as an R8 reference point, which may include protocols for facilitating WTRU handovers and the transfer of data between base stations. The communication link between the base stations 2140g, 2140h, 2140i and the ASN gateway 2141 may be defined as an R6 reference point. The R6 reference point may include protocols for facilitating mobility management based on mobility events associated with each of the WTRUs 2102a, 2102b, 2102c.
As shown in Figure 21E, the RAN 2104c may be connected to the core network 2106c. The communication link between the RAN 2104c and the core network 2106c may be defined as an R3 reference point, which includes protocols for facilitating, for example, data transfer and mobility management capabilities. The core network 2106c may include a mobile IP home agent (MIP-HA) 2154, an authentication, authorization, accounting (AAA) server 2156, and/or a gateway 2158. While each of the foregoing elements is depicted as part of the core network 2106c, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
The MIP-HA may be responsible for IP address management and may enable the WTRUs 2102a, 2102b, 2102c to roam between different ASNs and/or different core networks. The MIP-HA 2154 may provide the WTRUs 2102a, 2102b, 2102c with access to packet-switched networks, such as the Internet 2110, to facilitate communications between the WTRUs 2102a, 2102b, 2102c and IP-enabled devices. The AAA server 2156 may be responsible for user authentication and for supporting user services. The gateway 2158 may provide the WTRUs 2102a, 2102b, 2102c with access to circuit-switched networks, such as the PSTN 2108, to facilitate communications between the WTRUs 2102a, 2102b, 2102c and traditional landline communications devices. For example, the gateway 2158 may provide the WTRUs 2102a, 2102b, 2102c with access to the networks 2112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
Although not shown in Figure 21E, it will be appreciated that the RAN 2104c may be connected to other ASNs and that the core network 2106c may be connected to other core networks. The communication link between the RAN 2104c and the other ASNs may be defined as an R4 reference point, which may include protocols for coordinating the mobility of the WTRUs 2102a, 2102b, 2102c between the RAN 2104c and the other ASNs. The communication link between the core network 2106c and the other core networks may be defined as an R5 reference point, which may include protocols for facilitating interworking between home core networks and visited core networks.
The processes described above may be implemented in a computer program, software, and/or firmware incorporated in a computer-readable medium for execution by a computer and/or processor. Examples of computer-readable media include electronic signals (transmitted over wired and/or wireless connections) and computer-readable storage media. Examples of computer-readable storage media include, but are not limited to, read-only memory (ROM), random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media (such as, but not limited to, internal hard disks and removable disks), magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, a terminal, a base station, an RNC, or any host computer. Features and/or elements described herein in accordance with one or more example embodiments may be used in combination with features and/or elements described in accordance with one or more other example embodiments.

Claims (24)

1. A video decoding method, the method comprising:
receiving a video signal and a high-pass filter that is designed for the video signal;
applying the high-pass filter to a luma plane pixel of the video signal to determine a chroma offset; and
adding the chroma offset to a corresponding chroma plane pixel of the video signal.
2. The method of claim 1, further comprising:
receiving an indication of a region of the video signal to which the high-pass filter is to be applied; and
applying the high-pass filter to the region.
3. The method of claim 1, further comprising: receiving an indication that the high-pass filter is to be applied at at least one of a sequence level, a picture level, a slice level, or a block level of the video signal.
4. The method of claim 1, wherein applying the high-pass filter is performed as a post-processing step in a single-layer video coding process.
5. The method of claim 1, wherein applying the high-pass filter is performed in a multi-layer video coding process, the luma plane pixel is an upsampled base layer luma plane pixel, and the chroma plane pixel is an upsampled base layer chroma plane pixel.
6. The method of claim 1, wherein applying the high-pass filter is performed in a multi-layer video coding process, the luma plane pixel is a base layer luma plane pixel that has not yet been upsampled, and the chroma plane pixel is an upsampled base layer chroma plane pixel.
7. A video decoding device, the video decoding device comprising:
a network interface configured to:
receive a video signal and a high-pass filter that is designed for the video signal; and
a processor configured to:
apply the high-pass filter to a luma plane pixel of the video signal to determine a chroma offset; and
add the chroma offset to a corresponding chroma plane pixel of the video signal.
8. The video decoding device of claim 7, wherein the processor is further configured to:
receive, via the network interface, an indication of a region of the video signal to which the high-pass filter is to be applied; and
apply the high-pass filter to the region.
9. The video decoding device of claim 7, wherein the processor is further configured to: receive, via the network interface, an indication that the high-pass filter is to be applied at at least one of a sequence level, a picture level, a slice level, or a block level of the video signal.
10. The video decoding device of claim 7, wherein the processor is configured to apply the high-pass filter as a post-processing step in a single-layer video coding process.
11. The video decoding device of claim 7, wherein the processor is configured to apply the high-pass filter in a multi-layer video coding process, the luma plane pixel is an upsampled base layer luma plane pixel, and the chroma plane pixel is an upsampled base layer chroma plane pixel.
12. The video decoding device of claim 7, wherein the processor is configured to apply the high-pass filter in a multi-layer video coding process, the luma plane pixel is a base layer luma plane pixel that has not yet been upsampled, and the chroma plane pixel is an upsampled base layer chroma plane pixel.
13. A method of encoding a video signal, the method comprising:
generating a high-pass filter using components of the video signal, wherein the high-pass filter is designed to be applied to a luma plane component of the video signal so as to generate an output that is applicable to a chroma plane component of the video signal;
quantizing filter coefficients associated with the high-pass filter;
encoding the filter coefficients into a bitstream that represents the video signal; and
transmitting the bitstream.
14. The method of claim 13, wherein the high-pass filter is generated according to a training set.
15. The method of claim 14, wherein the training set includes a coded luma component of the video signal, a coded chroma component of the video signal, and an original chroma component of the video signal.
16. The method of claim 13, further comprising: determining a characteristic of the high-pass filter according to at least one of a coding performance or a color subsampling format.
17. The method of claim 16, wherein the characteristic is at least one of a size of the high-pass filter, a separability of the high-pass filter, or a symmetry of the high-pass filter.
18. The method of claim 13, further comprising:
identifying a region in a coded picture of the video signal; and
transmitting an indication that the high-pass filter is to be applied to the region.
19. A video encoding device, the video encoding device comprising:
a network interface configured to:
receive a video signal; and
a processor configured to:
generate a high-pass filter using components of the video signal, including designing the high-pass filter to be applied to a luma plane component of the video signal so as to generate an output that is applicable to a chroma plane component of the video signal;
quantize filter coefficients associated with the high-pass filter;
encode the filter coefficients into a bitstream that represents the video signal; and
transmit the bitstream via the network interface.
20. The video encoding device of claim 19, wherein the processor is configured to generate the high-pass filter according to a training set.
21. The video encoding device of claim 20, wherein the training set includes a coded luma component of the video signal, a coded chroma component of the video signal, and an original chroma component of the video signal.
22. The video encoding device of claim 19, wherein the processor is further configured to determine a characteristic of the high-pass filter according to at least one of a coding performance or a color subsampling format.
23. The video encoding device of claim 22, wherein the characteristic is at least one of a size of the high-pass filter, a separability of the high-pass filter, or a symmetry of the high-pass filter.
24. The video encoding device of claim 19, wherein the processor is further configured to:
identify a region in a coded picture of the video signal; and
transmit, via the network interface, an indication that the high-pass filter is to be applied to the region.
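For illustration only, the cross-plane operations recited in claim 1 (decoder side) and claims 13-15 (encoder side) can be sketched in a few lines. This is a hedged approximation, not the patented implementation: it assumes luma and chroma planes of equal dimensions (e.g., 4:4:4; in 4:2:0 the filter support and sample phase would differ), a square filter of odd size, and a least-squares derivation of the filter from a training set of coded luma, coded chroma, and original chroma samples. Coefficient quantization and bitstream signaling (claim 13) are omitted, and the function names are invented for this sketch.

```python
import numpy as np


def train_cross_plane_filter(coded_luma, coded_chroma, orig_chroma, size=3):
    """Derive a cross-plane filter from a training set (claims 13-15, sketch).

    For every interior chroma sample, the regressors are the size x size
    coded-luma neighborhood and the target is the chroma coding error
    (original minus coded chroma). Ordinary least squares then yields the
    filter whose luma response best predicts that error.
    """
    pad = size // 2
    h, w = coded_chroma.shape
    rows, targets = [], []
    for y in range(pad, h - pad):
        for x in range(pad, w - pad):
            rows.append(coded_luma[y - pad:y + pad + 1, x - pad:x + pad + 1].ravel())
            targets.append(orig_chroma[y, x] - coded_chroma[y, x])
    A = np.asarray(rows, dtype=np.float64)
    b = np.asarray(targets, dtype=np.float64)
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs.reshape(size, size)


def apply_cross_plane_filter(coded_luma, coded_chroma, filt):
    """Apply the filter to the luma plane and offset the chroma plane (claim 1).

    The filter response over each luma neighborhood is the chroma offset,
    which is added to the collocated chroma sample. Border samples are
    left unfiltered in this sketch.
    """
    pad = filt.shape[0] // 2
    out = coded_chroma.astype(np.float64).copy()
    h, w = coded_chroma.shape
    for y in range(pad, h - pad):
        for x in range(pad, w - pad):
            patch = coded_luma[y - pad:y + pad + 1, x - pad:x + pad + 1]
            out[y, x] += float(np.sum(patch * filt))
    return out
```

On the decoder side, `apply_cross_plane_filter` computes the high-pass response of the luma neighborhood and adds it as an offset to the collocated chroma sample; on the encoder side, `train_cross_plane_filter` picks the coefficients that minimize the squared error between the enhanced and the original chroma, which is one plausible reading of how a filter "designed for the video signal" could be obtained from a training set.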
CN201380050776.8A 2012-09-28 2013-09-27 Cross-plane filtering for chroma signal enhancement in video coding Active CN104769950B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811230342.7A CN109327704B (en) 2012-09-28 2013-09-27 Cross-plane filtering for chroma signal enhancement in video coding
CN202110601265.7A CN113518228A (en) 2012-09-28 2013-09-27 Cross-plane filtering for chroma signal enhancement in video coding

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
US201261707682P 2012-09-28 2012-09-28
US61/707,682 2012-09-28
US201361762611P 2013-02-08 2013-02-08
US61/762,611 2013-02-08
US201361778218P 2013-03-12 2013-03-12
US61/778,218 2013-03-12
US201361845792P 2013-07-12 2013-07-12
US61/845,792 2013-07-12
PCT/US2013/062133 WO2014052731A2 (en) 2012-09-28 2013-09-27 Cross-plane filtering for chroma signal enhancement in video coding

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN202110601265.7A Division CN113518228A (en) 2012-09-28 2013-09-27 Cross-plane filtering for chroma signal enhancement in video coding
CN201811230342.7A Division CN109327704B (en) 2012-09-28 2013-09-27 Cross-plane filtering for chroma signal enhancement in video coding

Publications (2)

Publication Number Publication Date
CN104769950A CN104769950A (en) 2015-07-08
CN104769950B true CN104769950B (en) 2018-11-13

Family

ID=50385193

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201380050776.8A Active CN104769950B (en) Cross-plane filtering for chroma signal enhancement in video coding
CN202110601265.7A Pending CN113518228A (en) 2012-09-28 2013-09-27 Cross-plane filtering for chroma signal enhancement in video coding
CN201811230342.7A Active CN109327704B (en) 2012-09-28 2013-09-27 Cross-plane filtering for chroma signal enhancement in video coding

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN202110601265.7A Pending CN113518228A (en) 2012-09-28 2013-09-27 Cross-plane filtering for chroma signal enhancement in video coding
CN201811230342.7A Active CN109327704B (en) 2012-09-28 2013-09-27 Cross-plane filtering for chroma signal enhancement in video coding

Country Status (7)

Country Link
US (4) US10397616B2 (en)
EP (2) EP2901703A2 (en)
JP (4) JP6175505B2 (en)
KR (2) KR102028244B1 (en)
CN (3) CN104769950B (en)
TW (1) TWI652935B (en)
WO (1) WO2014052731A2 (en)

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013164922A1 (en) * 2012-05-02 2013-11-07 ソニー株式会社 Image processing device and image processing method
WO2014052731A2 (en) 2012-09-28 2014-04-03 Vid Scale, Inc. Cross-plane filtering for chroma signal enhancement in video coding
US9648353B2 (en) 2013-04-04 2017-05-09 Qualcomm Incorporated Multiple base layer reference pictures for SHVC
US20160065974A1 (en) * 2013-04-05 2016-03-03 Samsung Electronics Co., Ltd. Method and apparatus for encoding and decoding video with respect to filtering
US10708588B2 (en) * 2013-06-19 2020-07-07 Apple Inc. Sample adaptive offset control
US9294766B2 (en) 2013-09-09 2016-03-22 Apple Inc. Chroma quantization in video coding
GB201500719D0 (en) * 2015-01-15 2015-03-04 Barco Nv Method for chromo reconstruction
EP3284259A1 (en) * 2015-04-17 2018-02-21 VID SCALE, Inc. Chroma enhancement filtering for high dynamic range video coding
EP3320684A1 (en) * 2015-07-08 2018-05-16 VID SCALE, Inc. Enhanced chroma coding using cross plane filtering
EP3329679A1 (en) 2015-07-28 2018-06-06 VID SCALE, Inc. High dynamic range video coding architectures with multiple operating modes
US10009622B1 (en) * 2015-12-15 2018-06-26 Google Llc Video coding with degradation of residuals
WO2017123487A1 (en) 2016-01-15 2017-07-20 Vid Scale, Inc. System and method for enhanced motion compensation using adaptive filtering
EP4221201A1 (en) 2018-01-29 2023-08-02 InterDigital VC Holdings, Inc. Encoding and decoding with refinement of the reconstructed picture
JP2021528004A (en) * 2018-06-21 2021-10-14 インターデジタル ヴイシー ホールディングス, インコーポレイテッド Improved mode processing in video coding and decoding
WO2020139039A1 (en) * 2018-12-27 2020-07-02 인텔렉추얼디스커버리 주식회사 Image encoding/decoding method and device
JP7119236B2 (en) * 2019-01-09 2022-08-16 ベイジン、ターチア、インターネット、インフォメーション、テクノロジー、カンパニー、リミテッド Video Coding Using Cross-Component Linear Models
GB2586484B (en) * 2019-08-20 2023-03-08 Canon Kk A filter
US11234010B2 (en) 2019-08-28 2022-01-25 Qualcomm Incorporated Cross-component adaptive loop filtering for video coding
US11356707B2 (en) * 2019-09-23 2022-06-07 Qualcomm Incorporated Signaling filters for video processing
JP2022554307A (en) 2019-10-29 2022-12-28 北京字節跳動網絡技術有限公司 Cross-Component Adaptive Loop Filter Signaling
US11303936B2 (en) * 2020-02-21 2022-04-12 Tencent America LLC Method and apparatus for filtering
CN112514401A (en) * 2020-04-09 2021-03-16 北京大学 Method and device for loop filtering
WO2021247883A1 (en) * 2020-06-03 2021-12-09 Beijing Dajia Internet Information Technology Co., Ltd. Chroma coding enhancement in cross-component correlation
EP4218235A4 (en) * 2020-09-23 2024-04-03 Beijing Dajia Internet Information Tech Co Ltd Chroma coding enhancement in cross-component sample adaptive offset with virtual boundary
WO2022164757A1 (en) * 2021-02-01 2022-08-04 Beijing Dajia Internet Information Technology Co., Ltd. Chroma coding enhancement in cross-component sample adaptive offset
WO2022170073A1 (en) * 2021-02-08 2022-08-11 Beijing Dajia Internet Information Technology Co., Ltd. Cross-component adaptive loop filter
CN113099221B (en) * 2021-02-22 2023-06-02 浙江大华技术股份有限公司 Cross-component sample point self-adaptive compensation method, coding method and related device
JP2024507857A (en) * 2021-02-22 2024-02-21 ベイジン、ターチア、インターネット、インフォメーション、テクノロジー、カンパニー、リミテッド Coding enhancements in inter-component sample adaptive offsets
KR20230156790A (en) * 2021-03-18 2023-11-14 베이징 다지아 인터넷 인포메이션 테크놀로지 컴퍼니 리미티드 Coding improvements in cross-component sample adaptive offset

Family Cites Families (54)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4701783A (en) * 1982-09-14 1987-10-20 New York Institute Of Technology Technique for encoding and decoding video with improved separation of chrominance and luminance
US7102669B2 (en) * 2002-04-02 2006-09-05 Freescale Semiconductor, Inc. Digital color image pre-processing
KR20040043750A (en) * 2002-11-19 2004-05-27 엘지전자 주식회사 Method for implementing cross-filtering effect in a digital displayer
US7333544B2 (en) * 2003-07-16 2008-02-19 Samsung Electronics Co., Ltd. Lossless image encoding/decoding method and apparatus using inter-color plane prediction
US7724827B2 (en) * 2003-09-07 2010-05-25 Microsoft Corporation Multi-layer run level encoding and decoding
US7822286B2 (en) 2003-11-07 2010-10-26 Mitsubishi Electric Research Laboratories, Inc. Filtering artifacts in images with 3D spatio-temporal fuzzy filters
KR100754388B1 (en) 2003-12-27 2007-08-31 삼성전자주식회사 Residue image down/up sampling method and apparatus, image encoding/decoding method and apparatus using residue sampling
US7397515B2 (en) * 2004-01-30 2008-07-08 Broadcom Corporation Method and system for cross-chrominance removal using motion detection
US8391672B2 (en) * 2004-02-06 2013-03-05 Panasonic Corporation Recording medium, reproduction device, program, and reproduction method
KR100884149B1 (en) * 2004-06-02 2009-02-17 파나소닉 주식회사 Recording medium capable of performing a high-speed random access in a slide show, reproduction device, computer readable medium, recording method, and reproduction method
EP1800494A1 (en) 2004-10-13 2007-06-27 Thomson Licensing Method and apparatus for complexity scalable video encoding and decoding
KR100679022B1 (en) 2004-10-18 2007-02-05 삼성전자주식회사 Video coding and decoding method using inter-layer filtering, video encoder and decoder
WO2006108654A2 (en) 2005-04-13 2006-10-19 Universität Hannover Method and apparatus for enhanced video coding
TWI314720B (en) * 2005-05-31 2009-09-11 Himax Tech Inc 2d yc separation device and yc separation system
US7551232B2 (en) 2005-11-14 2009-06-23 Lsi Corporation Noise adaptive 3D composite noise reduction
CN101009842B (en) 2006-01-11 2012-02-01 华为技术有限公司 Method and device for interpolation in hierarchical video compression
US7579670B2 (en) * 2006-07-03 2009-08-25 Semiconductor Components Industries, L.L.C. Integrated filter having ground plane structure
KR101266168B1 (en) * 2006-08-16 2013-05-21 삼성전자주식회사 Method and apparatus for encoding, decoding video
US9001899B2 (en) * 2006-09-15 2015-04-07 Freescale Semiconductor, Inc. Video information processing system with selective chroma deblock filtering
EP2119236A1 (en) * 2007-03-15 2009-11-18 Nokia Corporation System and method for providing improved residual prediction for spatial scalability in video coding
US8270472B2 (en) * 2007-11-09 2012-09-18 Thomson Licensing Methods and apparatus for adaptive reference filtering (ARF) of bi-predictive pictures in multi-view coded video
KR20100133006A (en) * 2008-07-04 2010-12-20 가부시끼가이샤 도시바 Dynamic image encoding/decoding method and device
US8270466B2 (en) 2008-10-03 2012-09-18 Sony Corporation Adaptive decimation filter
CN101404765B (en) * 2008-10-24 2010-12-08 宁波大学 Interactive multi-view point video encoding method
CN101778371B (en) 2009-01-09 2012-08-29 电信科学技术研究院 Paging method and device
CN102474606B (en) * 2009-07-20 2014-09-24 三星电子株式会社 Method and apparatus for coding and decoding color channels in layered video coding and decoding
JPWO2011033643A1 (en) 2009-09-17 2013-02-07 株式会社東芝 Video encoding method and video decoding method
WO2011043797A2 (en) * 2009-10-05 2011-04-14 Thomson Licensing Methods and apparatus for adaptive filtering of prediction pixels for chroma components in video encoding and decoding
KR101682147B1 (en) 2010-04-05 2016-12-05 삼성전자주식회사 Method and apparatus for interpolation based on transform and inverse transform
CN201726499U (en) * 2010-05-04 2011-01-26 武汉光华芯科技有限公司 Composite video signal luminance and chrominance separation system
CN101902653B (en) * 2010-06-25 2013-04-24 杭州爱威芯科技有限公司 Luminance sample direction prediction-based in-field luminance and chrominance (YC) separation method
US20120008687A1 (en) 2010-07-06 2012-01-12 Apple Inc. Video coding using vector quantized deblocking filters
US9693070B2 (en) 2011-06-24 2017-06-27 Texas Instruments Incorporated Luma-based chroma intra-prediction for video coding
CN104980756A (en) 2011-06-28 2015-10-14 三星电子株式会社 Video decoding method using offset adjustments according to pixel classification and apparatus therefor
US9641866B2 (en) 2011-08-18 2017-05-02 Qualcomm Incorporated Applying partition-based filters
US9807403B2 (en) 2011-10-21 2017-10-31 Qualcomm Incorporated Adaptive loop filtering for chroma components
WO2013070629A1 (en) 2011-11-07 2013-05-16 Huawei Technologies Co., Ltd. New angular table for improving intra prediction
EP2804377A4 (en) 2012-01-13 2015-12-09 Sharp Kk Image decoding device, image encoding device, and data structure of encoded data
US9380302B2 (en) 2012-02-27 2016-06-28 Texas Instruments Incorporated Sample adaptive offset (SAO) parameter signaling
WO2013164922A1 (en) * 2012-05-02 2013-11-07 ソニー株式会社 Image processing device and image processing method
US20140086316A1 (en) * 2012-09-24 2014-03-27 Louis Joseph Kerofsky Video compression with color space scalability
WO2014052731A2 (en) * 2012-09-28 2014-04-03 Vid Scale, Inc. Cross-plane filtering for chroma signal enhancement in video coding
JP6788346B2 (en) 2012-10-01 2020-11-25 ジーイー ビデオ コンプレッション エルエルシー Scalable video coding using subpartition derivation of subblocks for prediction from the base layer
US9357211B2 (en) 2012-12-28 2016-05-31 Qualcomm Incorporated Device and method for scalable and multiview/3D coding of video information
WO2014115283A1 (en) 2013-01-24 2014-07-31 シャープ株式会社 Image decoding device and image encoding device
CN105009585B (en) 2013-04-02 2018-09-25 明达半导体股份有限公司 Method for processing video frequency and video process apparatus
US9503732B2 (en) 2013-04-10 2016-11-22 Arris Enterprises, Inc. Re-sampling with phase offset adjustment for luma and chroma to select filters in scalable video coding
US8810727B1 (en) 2013-05-07 2014-08-19 Qualcomm Technologies, Inc. Method for scaling channel of an image
US9686561B2 (en) * 2013-06-17 2017-06-20 Qualcomm Incorporated Inter-component filtering
WO2015003753A1 (en) 2013-07-12 2015-01-15 Nokia Solutions And Networks Oy Redirection of m2m devices
US10129542B2 (en) 2013-10-17 2018-11-13 Futurewei Technologies, Inc. Reference pixel selection and filtering for intra coding of depth map
WO2015062098A1 (en) 2013-11-01 2015-05-07 华为技术有限公司 Network selection method and core network device
TW201642655A (en) 2015-04-21 2016-12-01 Vid衡器股份有限公司 Artistic intent based video coding
JP6750234B2 (en) 2016-01-28 2020-09-02 横浜ゴム株式会社 Tire operation service system and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Intra prediction method based on the linear relationship between the channels for YUV 4:2:0 intra coding; Sang Heon Lee et al.; 2009 16th IEEE International Conference on Image Processing (ICIP); 2009-11-10; sections 2.1-2.3 *
New intra chroma prediction using inter-channel correlation; Jungsun Kim; Joint Collaborative Team on Video Coding (JCT-VC); 2010-07-28; section 2 *

Also Published As

Publication number Publication date
KR20160105944A (en) 2016-09-07
EP2901703A2 (en) 2015-08-05
JP2017200235A (en) 2017-11-02
JP6175505B2 (en) 2017-08-02
US20140092999A1 (en) 2014-04-03
CN109327704B (en) 2021-06-18
US20220286712A1 (en) 2022-09-08
EP3661215A1 (en) 2020-06-03
TWI652935B (en) 2019-03-01
KR20150065766A (en) 2015-06-15
JP7433019B2 (en) 2024-02-19
JP2020036353A (en) 2020-03-05
US10397616B2 (en) 2019-08-27
US11356708B2 (en) 2022-06-07
WO2014052731A3 (en) 2014-10-23
CN113518228A (en) 2021-10-19
JP2023011047A (en) 2023-01-20
JP6671321B2 (en) 2020-03-25
WO2014052731A2 (en) 2014-04-03
US20190327494A1 (en) 2019-10-24
CN109327704A (en) 2019-02-12
CN104769950A (en) 2015-07-08
JP2015531569A (en) 2015-11-02
KR101654814B1 (en) 2016-09-22
KR102028244B1 (en) 2019-10-02
US20200404341A1 (en) 2020-12-24
US10798423B2 (en) 2020-10-06
TW201436530A (en) 2014-09-16

Similar Documents

Publication Publication Date Title
CN104769950B (en) Cross-plane filtering for chroma signal enhancement in video coding
CN104685877B (en) Adaptive upsampling for multi-layer video coding
CN105765979B (en) Inter-layer prediction for scalable video coding
CN106233726B (en) Systems and methods for RGB video coding enhancement
CN105900432B (en) Two-dimensional palette coding for screen content coding
CN103797792B (en) Systems and methods for spatial prediction
CN107836116A (en) Enhanced chroma coding using cross-plane filtering
CN105874793B (en) Method and apparatus for combined scalability processing for multi-layer video coding
CN105122805B (en) Device for inter-layer reference picture enhancement for multi-layer video coding
CN107211147A (en) Palette coding for non-4:4:4 screen content video
CN104704831B (en) Use of sampling grid information for spatial layers in multi-layer video coding
CN107548556A (en) Artistic-intent-based video coding
CN107534769A (en) Chroma enhancement filtering for high dynamic range video coding
CN106797469A (en) Improved palette coding for screen content coding
CN107079157A (en) Inter-component decorrelation for video coding
CN107211146A (en) One-dimensional transform modes and coefficient scan order
CN108337519A (en) Video encoding device and method
CN108156463A (en) Method and apparatus for motion vector prediction for scalable video coding

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20201217

Address after: Paris, France

Patentee after: InterDigital Madison Patent Holdings

Address before: Delaware, USA

Patentee before: VID SCALE, Inc.