US20220329831A1 - Enhanced chroma coding using cross plane filtering - Google Patents
Enhanced chroma coding using cross plane filtering
- Publication number
- US20220329831A1 (U.S. application Ser. No. 17/848,190)
- Authority
- US
- United States
- Prior art keywords
- luma
- chroma
- samples
- sample
- cross
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/117—Filters, e.g. for pre-processing or post-processing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/132—Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/136—Incoming video signal characteristics or properties
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/157—Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
- H04N19/159—Prediction type, e.g. intra-frame, inter-frame or bidirectional frame prediction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/176—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a block, e.g. a macroblock
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/182—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a pixel
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/186—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/46—Embedding additional information in the video signal during the compression process
- H04N19/467—Embedding additional information in the video signal during the compression process characterised by the embedded information being invisible, e.g. watermarking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/70—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/80—Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/80—Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
- H04N19/82—Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation involving filtering within a prediction loop
Definitions
- Video coding systems may be used to compress digital video signals. For example, video coding systems may reduce storage space consumed and/or reduce transmission bandwidth consumption associated with video signals. For example, block-based hybrid video coding systems may be used.
- Digital video signals may include three color planes.
- the three color planes may include a luma plane, a blue-difference chroma plane, and a red-difference chroma plane.
- Pixels of the chroma planes may have smaller dynamic ranges than pixels of the luma plane.
- the chroma planes of a video image may be smoother and/or have less detail than the luma plane of the video image.
- a chroma block of a video image may be easier to predict (e.g., accurately predict). For example, prediction of the chroma block may consume fewer resources and/or result in less prediction error.
- a high dynamic range (HDR) video may offer a wider dynamic range than a standard dynamic range (SDR) video.
- the dynamic range of HDR video may be closer to the capacities of the human eye.
- Chroma artifacts in HDR video may be more visible against a brighter background than chroma artifacts in SDR video.
- HDR video coding may include preprocessing, coding, decoding, and/or post-processing.
- An indication of a cross-plane filter associated with a current picture may be received.
- the indication may include one or more filter coefficients associated with the cross-plane filter.
- the current picture may include an intra-coded video block and a plurality of reference samples.
- the plurality of reference samples may be used to predict the intra-coded video block.
- a luma sample region may be determined in the current picture.
- the luma sample region may include a plurality of luma samples.
- the luma sample region may be a 3×3 block of luma samples.
- the plurality of luma samples may include predicted luma samples such that the cross-plane filter is applied to the predicted luma samples prior to reconstruction of the luma samples.
- the plurality of luma samples may include reconstructed luma samples such that the cross-plane filter may be applied to the reconstructed luma samples after reconstruction.
- the luma sample region may be determined based on a selected intra prediction mode.
- the luma sample region may be determined for enhancing a corresponding chroma sample in the current picture.
- the corresponding chroma sample may be a predicted chroma sample or a reconstructed chroma sample.
- the enhanced chroma sample may be used for prediction.
- when the corresponding chroma sample is a reconstructed chroma sample before in-loop filtering, the enhanced chroma sample may be used to replace the corresponding chroma sample before in-loop filtering is applied.
- the corresponding chroma sample may be a reference chroma sample used to predict one or more chroma samples in the intra-coded video block.
- a chroma enhancement indicator may be received at a block level.
- the cross-plane filter may be applied to a plurality of luma samples in the luma sample region to determine an offset.
- the cross-plane filter may be a high pass filter.
- the offset may be applied to the corresponding chroma sample to determine an enhanced chroma sample.
- the luma sample region may include an unavailable luma sample.
- the unavailable luma sample may be replaced with a neighboring available luma sample, for example, prior to applying the cross-plane filter to the plurality of luma samples.
- the cross-plane filter to apply may be determined based on the selected intra prediction mode.
- a plurality of luma sample regions may be determined for a current picture. For example, a first luma sample region and a second luma sample region may be determined.
- the first luma sample region may include a first plurality of luma samples.
- the second luma sample region may include a second plurality of luma samples.
- the first luma sample region may neighbor a first corresponding chroma sample in the current picture.
- the second luma sample region may neighbor a second corresponding chroma sample in the current picture.
- the cross-plane filter may be applied to the first plurality of luma samples and the second plurality of luma samples to determine a first offset and a second offset, respectively.
- the first offset may be applied to the first corresponding chroma sample to determine a first enhanced chroma sample.
- the second offset may be applied to the second corresponding chroma sample to determine a second enhanced chroma sample.
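The per-sample flow described above can be sketched as follows. The 3×3 high-pass kernel, the normalizing shift, and the sample values are illustrative stand-ins, since the actual filter coefficients would arrive with the signaled cross-plane filter indication.

```python
# Illustrative zero-sum 3x3 high-pass kernel; the actual coefficients
# would arrive with the signaled cross-plane filter indication.
KERNEL = [[-1, -1, -1],
          [-1,  8, -1],
          [-1, -1, -1]]

def pad_unavailable(region, available):
    """Replace unavailable luma samples with the nearest available
    sample in the same row (a simple stand-in for the replacement
    step described above)."""
    out = [row[:] for row in region]
    for i in range(3):
        for j in range(3):
            if not available[i][j]:
                cols = [c for c in range(3) if available[i][c]]
                out[i][j] = region[i][min(cols, key=lambda c: abs(c - j))]
    return out

def chroma_offset(region, shift=4):
    """Apply the cross-plane (high-pass) filter to a 3x3 luma region,
    producing the offset for the corresponding chroma sample."""
    total = sum(KERNEL[i][j] * region[i][j] for i in range(3) for j in range(3))
    return total >> shift

# First luma sample region: a local luma peak yields a nonzero offset.
region1 = [[100, 100, 100],
           [100, 120, 100],
           [100, 100, 100]]
all_avail = [[True] * 3 for _ in range(3)]
off1 = chroma_offset(pad_unavailable(region1, all_avail))
enh1 = 128 + off1          # first enhanced chroma sample

# Second region: its left column is unavailable and is padded before
# filtering; the padded region is flat, so the offset is zero.
region2 = [[50, 100, 100]] * 3
avail2 = [[False, True, True]] * 3
off2 = chroma_offset(pad_unavailable(region2, avail2))
enh2 = 128 + off2          # second enhanced chroma sample
```

With these placeholder values, the first region produces offset (8·120 − 800) >> 4 = 10, enhancing chroma 128 to 138, while the padded second region is flat and leaves its chroma sample unchanged.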
- FIG. 1 shows an example of a block based hybrid video encoder.
- FIG. 2 shows an example of a block based hybrid video decoder.
- FIG. 3 shows an example of reference samples R x,y used for prediction to obtain predicted samples P x,y for a block size of N×N samples.
- FIG. 4 shows an example of partitioning modes for an intra prediction unit.
- FIG. 5 shows an example of angular intra prediction modes.
- FIG. 6 shows an example of an intra boundary filter.
- FIG. 7 shows an example of different partitions for HEVC inter-prediction coding.
- FIG. 8A shows an example of motion compensated prediction using motion vectors.
- FIG. 8B shows an example of fractional pixel interpolation.
- FIG. 9 shows an example of chroma reference sample enhancement with cross plane filtering that does not use reconstructed luma samples from the current block.
- FIG. 10 shows an example of chroma reference sample enhancement with cross plane filtering using reconstructed luma samples from a current block.
- FIG. 11 shows an example of a block based hybrid encoder with predicted chroma sample enhancement.
- FIG. 12 shows an example of enhancing predicted chroma samples with cross plane filtering.
- FIG. 13 shows an example of a block based hybrid encoder with chroma enhancement on reconstructed chroma samples.
- FIG. 14 shows an example of chroma reconstructed sample enhancement with cross plane filtering.
- FIG. 15A is a system diagram of an example communications system in which one or more disclosed embodiments may be implemented.
- FIG. 15B is a system diagram of an example wireless transmit/receive unit (WTRU) that may be used within the communications system illustrated in FIG. 15A .
- FIG. 15C is a system diagram of an example radio access network and an example core network that may be used within the communications system illustrated in FIG. 15A .
- FIG. 15D is a system diagram of another example radio access network and an example core network that may be used within the communications system illustrated in FIG. 15A .
- FIG. 15E is a system diagram of another example radio access network and an example core network that may be used within the communications system illustrated in FIG. 15A .
- Video coding systems may compress digital video signals, for example, to reduce the storage and/or transmission bandwidth of digital video signals.
- Video coding systems include block-based, wavelet-based, and object-based systems, as well as block-based hybrid video coding systems.
- Examples of block-based video coding systems include H.261, Moving Picture Experts Group (MPEG)-1, MPEG-2, H.263, H.264/Advanced Video Coding (AVC), and H.265/High Efficiency Video Coding (HEVC).
- FIG. 1 shows an example of a block based hybrid video encoder.
- Spatial prediction (e.g., intra prediction) and/or temporal prediction (e.g., inter prediction) may be performed.
- a prediction block generated from intra or inter prediction may be subtracted from a current video block.
- a resulting prediction residual may be transformed and quantized.
- a residual may be reconstructed, for example, by inverse quantizing and inverse transforming quantized residual coefficients.
- a reconstructed residual may be added to a prediction block, e.g., to form a reconstructed video block.
- In-loop filtering may be applied to a reconstructed video block.
- a filtered reconstructed video block which may be stored in a decoded picture buffer, may be used to code one or more next video blocks.
- FIG. 2 shows an example of a block based hybrid video decoder.
- the decoder in FIG. 2 may correspond to the encoder in FIG. 1 .
- An encoded video bitstream may be parsed and entropy decoded.
- a coding mode and associated prediction information may be passed, for example, to spatial prediction or motion compensated prediction, e.g., to form a prediction block.
- Residual transform coefficients may be inverse quantized and inverse transformed, for example, to reconstruct a residual block.
- a prediction block and a reconstructed residual block may be added together, e.g., to form a reconstructed block.
- In-loop filtering may be applied to a reconstructed video block.
- a filtered reconstructed video block which may be stored in a reference picture buffer, may be used to predict future video blocks.
- Intra coding may be used, for example, to eliminate spatial correlation in some image and video coding techniques, such as Joint Photographic Experts Group (JPEG), H.261, MPEG-1, MPEG-2, H.263, H.264/AVC and H.265/HEVC.
- Directional intra prediction may be used, for example, in H.264/AVC and H.265/HEVC, e.g., to improve coding efficiency.
- Intra prediction modes may utilize a set of reference samples, e.g., from above and to the left of a current block to be predicted. Reference samples may be denoted as R x,y .
- a position (x, y) may have its origin one pixel above and to the left of a block's top-left corner.
- a predicted sample value at the position (x, y) may be denoted as P x,y .
- FIG. 3 shows an example of reference samples R x,y used for prediction to obtain predicted samples P x,y for a block size of N×N samples.
- FIG. 4 shows an example of partitioning modes for an intra prediction unit (PU).
- HEVC intra coding may support multiple types of PU division, e.g., PART_2N×2N and PART_N×N, which may split a coding unit (CU) into one or four equal size PUs, respectively.
- PART_N×N may be available only when a CU size equals the configured minimum CU size.
- An 8×8 CU split into four 4×4 PUs may have four luma prediction blocks (PBs), for example, for 4:2:0 chroma formats. There may be one 4×4 PB per chroma channel for intra coded blocks, for example, to avoid high throughput caused by 2×2 chroma intra prediction blocks.
- a CU may be split into multiple transform units (TUs).
- Intra prediction may be applied sequentially to TUs, for example, rather than at the PU level. Sequential intra prediction may permit intra prediction to use neighboring reference samples from previously reconstructed TUs that are closer to the current TU samples.
- FIG. 5 shows an example of angular intra prediction modes. HEVC may support one or more (e.g., 35) intra prediction modes.
- the one or more intra prediction modes may include a DC mode, a planar mode, and/or 33 directional or ‘angular’ intra prediction modes.
- Angular intra prediction may be used to efficiently model different directional structures in video and image content.
- the number and angularity of prediction directions may be selected based on a trade-off between encoding complexity and coding efficiency.
- a predicted sample P x,y may be obtained, for example, by projecting its location to a reference row or column of pixels, applying a selected prediction direction, and interpolating a predicted value for the sample at 1/32 pixel accuracy. Interpolation may be performed linearly utilizing the two closest reference samples, e.g., R i,0 and R i+1,0 for vertical prediction (e.g., modes 18-34 as shown in FIG. 5 ) and R 0,i and R 0,i+1 for horizontal prediction (e.g., modes 2-17 as shown in FIG. 5 ).
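The projection-and-interpolation step can be sketched for a vertical angular mode as follows. The reference-array indexing here is a simplified stand-in for the full HEVC derivation, and `angle` is the mode-dependent displacement in 1/32-sample units.

```python
def predict_vertical(ref, x, y, angle):
    """Interpolate one predicted sample at 1/32-pel accuracy for a
    vertical angular mode. `ref` is the top reference row (ref[0] is
    above-left of the block); `angle` is the mode-dependent
    displacement in 1/32-sample units (simplified indexing)."""
    pos = (y + 1) * angle          # projected displacement, 1/32 units
    idx = pos >> 5                 # integer part of the displacement
    frac = pos & 31                # fractional part, 0..31
    # Linear interpolation between the two closest reference samples,
    # with rounding offset 16 before the >> 5.
    return ((32 - frac) * ref[x + idx + 1] + frac * ref[x + idx + 2] + 16) >> 5

ref_row = [100, 102, 104, 106, 108, 110, 112, 114, 116]
p = predict_vertical(ref_row, 0, 0, 16)   # halfway between two references
```

With `angle = 16`, the projection lands halfway between two reference samples (102 and 104), so the interpolated value is 103; with `angle = 0` the sample directly above (102) is copied.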
- HEVC may support one or more intra prediction modes for luma intra prediction for a variety of (e.g. all) PU sizes.
- HEVC may define multiple (e.g., three) most probable modes (MPMs) for a (e.g., each) PU, for example, based on the modes of one or more neighboring PUs.
- a current intra prediction mode may be equal to one of the elements in a set of MPMs.
- An index in the set may be transmitted to the decoder.
- a code (e.g. a 5-bit fixed length code) may be used to determine a selected mode outside the set of MPMs.
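The signaling choice above can be sketched as follows; the particular mapping of non-MPM modes onto 5-bit codewords is an illustrative assumption, not the exact HEVC binarization.

```python
def encode_luma_mode(mode, mpms):
    """Signal an intra mode: an index into the 3-entry MPM set when
    possible, otherwise a 5-bit fixed-length code over the 32
    remaining modes (35 total minus the 3 MPMs).  The sorted-order
    mapping of remaining modes to codewords is illustrative."""
    if mode in mpms:
        return ("mpm_idx", mpms.index(mode))
    rest = sorted(m for m in range(35) if m not in mpms)
    return ("rem_mode_5bit", rest.index(mode))

mpms = [0, 1, 26]                      # e.g., planar, DC, vertical
sig1 = encode_luma_mode(26, mpms)      # hits the MPM set
sig2 = encode_luma_mode(10, mpms)      # falls back to the 5-bit code
```

Since 32 = 2^5 modes remain after removing the three MPMs, the fixed-length fallback needs exactly 5 bits.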
- Reference samples may be smoothed.
- a 3-tap smoothing filter may be applied to one or more reference samples.
- the smoothing may be applied, for example, when an intra_smoothing_disabled_flag is set to 0. Filtering may be controlled, for example, by the given intra prediction mode and/or transform block size.
- angular modes may use filtered reference samples, for example, except horizontal and vertical angular modes.
- modes not using filtered reference samples may be extended to four modes (e.g. 9, 11, 25, 27) closest to horizontal and vertical as shown in FIG. 5 .
- diagonal modes (2, 18, 34) may use filtered reference samples.
- Intra prediction may be applied for a chroma component.
- an intra prediction mode may be specified, e.g., as planar, DC, horizontal, vertical, ‘DM_CHROMA’ mode, or diagonal mode (34), for example, for one or more prediction blocks (PBs) associated with chroma.
- Table 1 shows an example mapping between an intra prediction mode and an intra prediction direction for chroma.
- a chroma color channel intra prediction mode may be based on a corresponding luma PB intra prediction mode and/or an intra_chroma_pred_mode syntax element.
- Table 2 shows an example specification of intra prediction mode for 4:2:2 chroma format, e.g., when a DM_CHROMA mode is selected and a 4:2:2 chroma format is in use.
- An intra prediction mode for a chroma PB may be derived, for example, from an intra prediction mode for a corresponding luma PB, e.g., as specified in Table 2.
- FIG. 6 shows an example of an intra boundary filter.
- An intra-boundary filter may be used, for example, when reconstructing intra-predicted transform blocks (TBs).
- An intra-boundary filter may be used, for example, to filter predicted luma samples along the left and/or top edges of the TB for PBs using horizontal, vertical, and/or DC intra prediction modes, e.g., as shown in FIG. 6 .
- An intra boundary filter may be defined, for example, based on an array of predicted samples p as an input and/or predSamples as an output.
- Intra boundary filtering provided by Eq. (3), Eq. (4) and Eq. (5) may be used to generate predSamples as an output, for example, for DC intra-prediction applied to luma transform blocks of size (nTbS) less than 32×32 with a DC predictor dcVal.
- An improvement may be provided by boundary smoothing, e.g. 0.4% average improvement.
- An intra boundary filter may be applied on a luma component.
- An intra boundary filter may not be applied on a chroma component, e.g., because prediction for chroma components tends to be smooth.
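Since Eq. (3)-(5) are not reproduced above, the DC-mode boundary filtering can be sketched using the weighted forms commonly given in HEVC: the corner sample blends both neighbors with the DC value, and the remaining first-row/first-column samples blend one neighbor with it. Treat the weights as the standard HEVC choices rather than a quote from this document.

```python
def dc_boundary_filter(pred, left, top, dc_val):
    """Filter the top row and left column of a DC-predicted luma
    block in place. `left[y]` and `top[x]` are reconstructed
    neighboring samples; `pred` is an NxN block pre-filled with
    dc_val."""
    n = len(pred)
    # Corner: average of both neighbors and twice the DC value.
    pred[0][0] = (left[0] + 2 * dc_val + top[0] + 2) >> 2
    for x in range(1, n):   # rest of the top row
        pred[0][x] = (top[x] + 3 * dc_val + 2) >> 2
    for y in range(1, n):   # rest of the left column
        pred[y][0] = (left[y] + 3 * dc_val + 2) >> 2
    return pred

dc = 100
block = [[dc] * 4 for _ in range(4)]
top = [104, 104, 104, 104]    # reconstructed row above the block
left = [96, 96, 96, 96]       # reconstructed column left of the block
out = dc_boundary_filter(block, left, top, dc)
```

Only the boundary samples move toward their neighbors; interior samples keep the plain DC value, which matches the stated purpose of smoothing the prediction edges.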
- HEVC intra mode residual coding may utilize intra mode dependent transforms and/or coefficient scanning to code residual information.
- a discrete sine transform (DST) may be selected for 4×4 luma blocks.
- a discrete cosine transform (DCT) may be selected/used for other types of blocks.
- a linear-model (LM) based chroma intra prediction mode may be used, for example, to predict chroma samples from collocated reconstructed luma samples using a linear model (LM), e.g., in accordance with Eq. (6):
- Pred C (x,y)=α·Rec L (x,y)+β (Eq. 6)
- Pred C may indicate predicted chroma samples in a block and Rec L may indicate corresponding reconstructed luma samples in a block.
- Parameters α and β may be derived from causal reconstructed luma and chroma samples around a current block.
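The derivation of α and β can be sketched as a least-squares fit over the causal neighboring samples; this uses floating point for clarity, whereas a real codec would use fixed-point arithmetic.

```python
def derive_lm_params(rec_luma_nbr, rec_chroma_nbr):
    """Least-squares fit of chroma = alpha * luma + beta over causal
    neighboring reconstructed samples (floating point for clarity)."""
    n = len(rec_luma_nbr)
    sum_l = sum(rec_luma_nbr)
    sum_c = sum(rec_chroma_nbr)
    sum_ll = sum(l * l for l in rec_luma_nbr)
    sum_lc = sum(l * c for l, c in zip(rec_luma_nbr, rec_chroma_nbr))
    alpha = (n * sum_lc - sum_l * sum_c) / (n * sum_ll - sum_l * sum_l)
    beta = (sum_c - alpha * sum_l) / n
    return alpha, beta

# Neighbors that follow chroma = 0.5 * luma + 10 exactly.
luma_nbr = [100, 120, 140, 160]
chroma_nbr = [60, 70, 80, 90]
alpha, beta = derive_lm_params(luma_nbr, chroma_nbr)
pred_c = alpha * 130 + beta   # Eq. (6) applied to a collocated luma sample
```

For these synthetic neighbors the fit recovers α = 0.5 and β = 10 exactly, so a collocated reconstructed luma sample of 130 predicts a chroma value of 75.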
- Linear model chroma intra prediction may improve coding efficiency.
- experimental results in a test configuration indicate average Bjøntegaard delta rate (BD-rate) reductions for the Y, Cb, and Cr components of 1.3%, 6.5%, and 5.5%, respectively.
- a similar level of coding efficiency improvement for chroma components may be provided in a test configuration.
- FIG. 7 shows an example of different partitions for HEVC inter-prediction coding.
- Inter coding may be used, for example, to remove or reduce temporal redundancy.
- HEVC inter-prediction coding may support more PB partition shapes than intra-prediction coding (e.g., intra-coding).
- Intra prediction may support, for example, partitions PART_2N×2N and PART_N×N.
- Inter-picture prediction may support, for example, partitions PART_2N×2N, PART_2N×N, PART_N×2N, PART_N×N and asymmetric motion partitions PART_2N×nU, PART_2N×nD, PART_nL×2N, and PART_nR×2N.
- An (e.g. each) inter-predicted PU may have a set of motion parameters comprising one or more motion vectors and one or more reference picture indices.
- a P slice may, for example, use one reference picture list and a B slice may, for example, use two reference picture lists.
- Inter-prediction samples of a PB may be determined from one or more samples of a corresponding block region in a reference picture identified by a reference picture index.
- the corresponding block region may be at a position displaced by the horizontal and vertical components of a motion vector (MV).
- FIG. 8A shows an example of motion compensated prediction using motion vectors (MVs). Horizontal and vertical motion vector components may be denoted as d x and d y , respectively.
- FIG. 8B shows an example of fractional sample interpolation.
- Fractional sample interpolation may be used, for example, when a motion vector has a fractional value. Fractional sample interpolation may generate prediction samples for non-integer sample positions.
- HEVC may support MVs, for example, with units of 1/4 of the distance between luma samples.
- HEVC may support MVs, for example, with units of 1/8 of the distance between chroma samples, e.g., in 4:2:0 format.
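Quarter-sample MV units imply a simple split into an integer displacement and a fractional interpolation phase before motion compensation, sketched here:

```python
def split_mv_component(mv_qpel):
    """Split a luma MV component given in quarter-sample units into
    its integer displacement and fractional phase (0, 1/4, 1/2, 3/4).
    The arithmetic right shift floors toward negative infinity, so
    negative MVs also yield a phase in 0..3."""
    return mv_qpel >> 2, mv_qpel & 3

int_dx, frac_dx = split_mv_component(9)    # 9/4  =  2 + 1/4
int_dy, frac_dy = split_mv_component(-6)   # -6/4 = -2 + 2/4
```

Integer phases address full-sample positions directly; nonzero phases select one of the fractional interpolation filters, as illustrated by FIG. 8B.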
- Motion vector prediction may exploit spatial-temporal correlation of a motion vector with neighboring PUs.
- a merge mode may be used, for example, to reduce motion vector signaling cost.
- a merge mode merge candidate list may comprise a list of motion vector candidates from neighboring PU positions (e.g. spatial neighbors and/or temporal neighbors) and/or zero vectors.
- An encoder may select a (e.g. the best) predictor from a merge candidate list and may transmit a corresponding index indicating the predictor chosen from the merge candidate list.
- Cross plane filtering may use high frequency information from luma, for example, to improve and enhance chroma quality.
- High frequency information may be extracted from luma, for example, by applying a high pass filter on the luma component.
- Luma and chroma components may have some correlations, such as object contours and edge areas.
- Cross-plane filtering for chroma enhancement may include applying high pass filtering to a luma component. An output of the high pass filtering may be added to the chroma component to determine an enhanced chroma component. The output of the high pass filtering may be an offset.
- Eq. 7 and Eq. 8 indicate an example of a chroma enhancement:
- Y offset =cross_plane_filter⊗Y rec , where ⊗ denotes convolution (Eq. 7)
- C enh =C rec +Y offset (Eq. 8)
- Y is luma and C is chroma.
- cross_plane_filter is a filter applied to the luma signal.
- Y rec is the reconstructed luma signal.
- Y offset is the output of the filtering.
- C rec is the reconstructed chroma signal, which may be a Cb or Cr component.
- C enh is the enhanced chroma signal.
- the filter may be a 1D or 2D filter.
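- The enhancement described by Eq. 7 and Eq. 8 can be sketched in code: a high pass filter is applied to the reconstructed luma plane and its output is added as an offset to the reconstructed chroma. The sketch assumes 4:4:4 sampling (so luma and chroma positions coincide) and uses an illustrative zero-sum kernel rather than a trained cross plane filter:

```python
def high_pass_3x3(luma, x, y, coeffs):
    """Apply a 3x3 filter to the luma plane centered at (x, y).
    Out-of-bounds neighbors are padded by clamping to the edge."""
    h, w = len(luma), len(luma[0])
    acc = 0.0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            yy = min(max(y + dy, 0), h - 1)
            xx = min(max(x + dx, 0), w - 1)
            acc += coeffs[dy + 1][dx + 1] * luma[yy][xx]
    return acc

def enhance_chroma(luma_rec, chroma_rec, coeffs):
    """Eq. 7/8 sketch (4:4:4 assumed): Y_offset = filter(Y_rec),
    C_enh = C_rec + Y_offset, evaluated per sample."""
    return [[chroma_rec[y][x] + high_pass_3x3(luma_rec, x, y, coeffs)
             for x in range(len(chroma_rec[0]))]
            for y in range(len(chroma_rec))]

# Illustrative zero-sum high-pass kernel; a real cross plane filter
# would be derived, e.g., by a Least Square method as described above.
HP = [[0, -0.25, 0], [-0.25, 1, -0.25], [0, -0.25, 0]]
flat_luma = [[100] * 4 for _ in range(4)]   # no edges -> offset is 0
chroma = [[50] * 4 for _ in range(4)]
print(enhance_chroma(flat_luma, chroma, HP)[0][0])  # 50.0
```

- On a flat luma region the high pass output is zero, so chroma passes through unchanged; only contours and edges contribute offsets.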
- a cross plane filter may be derived, for example, based on original chroma and luma and/or reconstructed chroma and luma, e.g., using a Least Square method.
- a luma prediction mode may be utilized in DM_CHROMA mode for intra prediction, for example, to derive the chroma prediction mode and thereby reduce the signaling overhead of the chroma prediction mode.
- the chroma prediction mode may be signaled as DM_CHROMA, for example, when a chroma prediction block (PB) utilizes the same prediction mode as a corresponding luma PB.
- a linear model (LM) chroma prediction mode may predict one or more chroma samples from collocated reconstructed luma samples, for example, by a linear model.
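- The linear model in LM chroma prediction can be illustrated with a floating-point least-squares fit of chroma = alpha * luma + beta over neighboring reconstructed samples. This is a sketch: real codecs derive the parameters with integer arithmetic, and the function name is ours:

```python
def fit_linear_model(luma_neighbors, chroma_neighbors):
    """Least-squares fit of chroma = alpha * luma + beta over the
    neighboring reconstructed samples (floating point for clarity)."""
    n = len(luma_neighbors)
    sum_l = sum(luma_neighbors)
    sum_c = sum(chroma_neighbors)
    sum_ll = sum(l * l for l in luma_neighbors)
    sum_lc = sum(l * c for l, c in zip(luma_neighbors, chroma_neighbors))
    denom = n * sum_ll - sum_l * sum_l
    if denom == 0:          # flat luma neighborhood: fall back to mean
        return 0.0, sum_c / n
    alpha = (n * sum_lc - sum_l * sum_c) / denom
    beta = (sum_c - alpha * sum_l) / n
    return alpha, beta

# Neighbors that follow chroma = 0.5 * luma + 10 exactly.
alpha, beta = fit_linear_model([20, 40, 60, 80], [20, 30, 40, 50])
print(alpha, beta)  # 0.5 10.0
```

- The fitted (alpha, beta) would then predict each chroma sample from its collocated reconstructed luma sample.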
- Reference chroma samples (R x,y ), corresponding chroma prediction samples (P x,y ), and reconstructed chroma samples may be processed independently of their corresponding luma component(s). For example, for inter prediction, chroma prediction samples and chroma reconstructed samples may be generated independently from their corresponding luma component(s). Cross-plane filtering may be applied at different stages of the intra coding process.
- Cross-plane filtering may be used to enhance one or more neighboring chroma samples.
- the one or more neighboring chroma samples may neighbor the current video block.
- the neighboring chroma samples may include reference chroma samples, reconstructed chroma samples, and/or predicted chroma samples.
- a predicted chroma sample may be used for prediction.
- Neighboring reference chroma samples, R x,y may be used to generate predicted chroma samples P x,y , for intra coding, e.g., as shown in FIG. 3 .
- Cross-plane chroma filtering may derive high fidelity information from one or more luma samples that correspond to a chroma sample of the one or more neighboring chroma samples, for example, to improve chroma quality in the coding color space.
- a cross plane filter may be applied to one or more neighboring luma samples.
- the neighboring luma sample(s) may include reference luma samples, reconstructed luma samples, and/or predicted luma samples. Derived high pass information may be used, for example, to enhance the quality of chroma reference samples. Enhanced reference chroma samples may be used to generate predicted chroma samples.
- One or more enhanced chroma samples may be determined using cross plane filtering.
- a cross plane filter may be applied to one or more neighboring luma samples.
- the cross plane filter may be a high pass filter.
- the one or more neighboring luma samples may correspond to a chroma sample to be enhanced.
- the cross plane filter may be applied to available and/or unavailable luma samples.
- a predicted luma sample and/or a reconstructed luma sample (e.g., before or after loop filtering) may be an available luma sample.
- a non-reconstructed luma sample and/or a non-predicted luma sample may be an unavailable luma sample.
- FIG. 9 is an example of chroma reference sample enhancement 900 with cross plane filtering that does not use reconstructed luma samples from a current video block.
- the current video block may include a 4 ⁇ 4 block of samples, e.g., as defined by a solid line in FIG. 9 .
- One or more luma sample regions such as 902 A, 902 B may be determined for a current picture.
- a luma sample region such as 902 A, 902 B may include a plurality of luma samples that neighbor a corresponding chroma sample and/or a luma sample that is collocated to the corresponding chroma sample.
- a sample may neighbor another sample if it is above, below, to the left of, to the right of, and/or diagonal to the other sample.
- a neighboring sample may be next to the corresponding sample.
- Collocated samples may include a luma sample at the same location as a chroma sample.
- the luma sample region 902 A may be determined to include one or more neighboring luma samples and/or a collocated luma sample.
- the luma sample region 902 A may be determined such that a chroma sample for enhancement (e.g., enhanced chroma sample 912 A) is at the center of each of the one or more luma sample regions 902 A, 902 B, respectively.
- the chroma sample that is located at the center of a luma sample region may be enhanced using the cross plane filtering.
- the luma samples in a luma sample region such as 902 A may include one or more luma sample(s) 904 A, reference luma sample(s) 906 A, and/or predicted luma sample(s) 908 A.
- the reference luma samples 906 A may be reconstructed luma samples used to replace predicted luma sample(s) 908 A.
- each of the one or more predicted luma samples 908 A, 908 B may be replaced by (e.g., padded from) a respective neighboring reconstructed luma sample of the one or more reconstructed luma samples 906 A, 906 B.
- a luma sample region 902 A, 902 B may be an M×N window of luma samples, e.g., such as the 3×3 window highlighted by a dashed box in FIG. 9 .
- the luma samples in the M×N window may correspond to a chroma sample location.
- a cross plane filter may be applied to the plurality of luma samples of the luma sample region 902 A, 902 B.
- An offset may be determined as an output of applying the cross plane filter to the plurality of luma samples of the luma sample region 902 A, 902 B.
- the offset may be applied 910 A, 910 B (e.g., added) to the corresponding chroma sample, for example, to determine an enhanced chroma sample, such as 912 A, 912 B.
- the cross plane filter may be applied to a plurality of luma sample regions, such as 902 A, 902 B in the current picture to determine a plurality of enhanced chroma samples 912 A, 912 B.
- An enhanced chroma sample, such as enhanced chroma samples 912 A, 912 B may be used as reference samples for intra-prediction of the current video block.
- the plurality of luma samples in an M×N luma sample region may include (e.g. only) reconstructed luma samples before or after in-loop filtering.
- the plurality of luma samples in the M×N luma sample region may be available.
- one or more luma samples in an M×N luma sample region may not have been reconstructed, e.g., as shown in FIG. 9 .
- Luma samples that have not been reconstructed may be unavailable luma samples.
- the unavailable luma samples may be replaced by (e.g., padded using) a neighboring available (e.g., reconstructed) luma sample. Prediction and reconstruction of luma and chroma samples in the current block may be performed in parallel.
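- The padding of unavailable (not yet reconstructed) luma samples can be sketched as below. The pad-from-left-then-top order is an illustrative assumption:

```python
def pad_unavailable(luma, available):
    """Replace unavailable (not yet reconstructed) luma samples with a
    neighboring available sample, preferring the sample to the left,
    then the one above."""
    h, w = len(luma), len(luma[0])
    padded = [row[:] for row in luma]
    for y in range(h):
        for x in range(w):
            if not available[y][x]:
                if x > 0:
                    padded[y][x] = padded[y][x - 1]
                elif y > 0:
                    padded[y][x] = padded[y - 1][x]
                # else: no reconstructed neighbor; leave as-is
    return padded

luma = [[10, 20, 30],
        [40,  0,  0],   # 0s mark samples not yet reconstructed
        [70,  0,  0]]
avail = [[True, True, True],
         [True, False, False],
         [True, False, False]]
print(pad_unavailable(luma, avail)[1])  # [40, 40, 40]
```

- Padding removes the dependency on current-block reconstruction, which is what allows luma and chroma processing to proceed in parallel.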
- FIG. 10 is an example of chroma reference sample enhancement 1000 with cross plane filtering using reconstructed luma samples from a current block.
- the luma samples in a current block may be reconstructed before the respective corresponding chroma samples, for example, when higher latency between different color channels may be tolerated.
- Reconstructed luma samples in an M×N (e.g., 3×3) luma sample region 1002 A, 1002 B may be available, e.g., without padding, when the cross plane filter is applied.
- the luma sample region 1002 A, 1002 B may be an M×N (e.g., 3×3) window.
- a luma sample region such as 1002 A, 1002 B may be determined for a current picture.
- a luma sample region such as 1002 A, 1002 B may include a plurality of luma samples that neighbor a corresponding chroma sample and/or a luma sample that is collocated with the corresponding chroma sample.
- the plurality of luma samples in each of the luma sample regions 1002 A, 1002 B may include one or more reconstructed luma samples 1004 A, 1004 B that are outside the current block, and/or one or more reconstructed luma samples 1008 A, 1008 B that are within the current block.
- Cross plane filtering may be applied to the plurality of luma samples in the one or more luma sample regions 1002 A, 1002 B.
- the luma samples in the one or more luma sample regions 1002 A, 1002 B may include some luma samples within a current block (e.g., as shown in FIG. 9 ).
- the luma samples in the current block may include one or more reconstructed luma samples, e.g., before or after loop filtering.
- One or more enhanced reference chroma samples 1012 A, 1012 B may be generated, for example, in accordance with Eq. 9.
- R C [x][y] may be reconstructed reference chroma samples before enhancement
- R C_enh [x][y] may be enhanced reference chroma samples
- S L (x L , y L ) may be an array of reconstructed luma samples centering at position (x L , y L ).
- the one or more enhanced reference chroma samples 1012 A, 1012 B may correspond to a center of the one or more luma sample regions 1002 A, 1002 B.
- a luma sample position (x L , y L ) of a corresponding chroma sample position (x, y) may be calculated based on chroma format, for example, in accordance with Eq. 10:
- scaleX and scaleY may be, for example, (2,2), (2,1), and (1,1), respectively, for chroma formats 4:2:0, 4:2:2, and 4:4:4.
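- The position mapping of Eq. 10 can be sketched assuming it scales each chroma coordinate by the per-axis subsampling factor of the chroma format:

```python
# Per-axis luma/chroma resolution ratios for common chroma formats.
SCALE = {"4:2:0": (2, 2), "4:2:2": (2, 1), "4:4:4": (1, 1)}

def chroma_to_luma_pos(x, y, chroma_format):
    """Map chroma position (x, y) to the collocated luma position
    (x_L, y_L), assuming Eq. 10 multiplies each axis by the
    subsampling factor of the chosen format."""
    scale_x, scale_y = SCALE[chroma_format]
    return x * scale_x, y * scale_y

print(chroma_to_luma_pos(3, 5, "4:2:0"))  # (6, 10)
print(chroma_to_luma_pos(3, 5, "4:2:2"))  # (6, 5)
print(chroma_to_luma_pos(3, 5, "4:4:4"))  # (3, 5)
```

- The resulting (x_L, y_L) is the center of the luma window to which the cross plane filter is applied for that chroma sample.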
- Enhanced chroma reference samples R C_enh [x][y] may be used in the intra prediction process, for example, using a directional/DC/planar intra prediction mode, e.g., as shown in FIG. 5 .
- a two-dimensional (2-D) cross plane filter may be applied to a luma sample region (e.g., an N×N region of luma samples as shown in FIGS. 9 and 10 ).
- a one-dimensional (1-D) cross plane filter may be applied to one or more luma samples in a 1×N or N×1 luma sample region.
- N horizontal luma reference samples above a current block may be filtered, for example, when an N×1 luma sample region is used in cross plane filtering.
- N vertical luma reference samples to the left of a current block may be filtered, for example, when a 1×N luma sample region is used in cross plane filtering.
- a luma sample region may be adaptively selected, for example, based on an intra prediction mode.
- the intra prediction mode may be a directional (e.g., vertical, horizontal, etc.) intra prediction mode, a DC intra prediction mode, a planar intra prediction mode, or any other intra prediction mode.
- the cross plane filter to apply may be determined based on which intra prediction mode is selected.
- a different set of cross-plane filters may be selected, for example, to match the edge and/or boundary characteristics of various different modes.
- a vertical prediction mode may use the top row of reference samples. When the vertical prediction mode is selected, one or more chroma reference samples above the current video block may be enhanced.
- a luma sample region may be selected such that it includes one or more luma samples that neighbor a corresponding chroma reference sample above the current video block.
- the luma sample region may include a luma sample that is collocated with the corresponding chroma reference sample above the current video block.
- one or more chroma reference samples to the left or right of the current video block may be enhanced.
- a luma sample region may be selected such that it includes one or more luma samples that neighbor a corresponding chroma reference sample to the left or right of the current video block.
- the luma sample region may include a luma sample that is collocated with the corresponding chroma reference sample to the left or right of the current video block.
- An edge may occur vertically, for example, when a vertical prediction mode is selected.
- a luma sample region may be selected with a horizontal rectangular shape, for example, in comparison to a square 2-D luma sample region depicted in examples shown in FIGS. 9 and 10 .
- a 1-D horizontal luma sample region may be selected and/or applied for a vertical prediction mode, e.g., 1-D horizontal filtering using the top neighboring luma samples.
- a left column of reference samples may be used.
- An edge may occur horizontally, for example, when horizontal prediction mode is selected.
- a luma sample region with a vertical rectangular shape may be used.
- a 1-D vertical luma sample region may be selected and/or applied.
- a 1D vertical filtering (e.g. using the left neighboring luma samples) may retrieve the horizontal high pass edge information more effectively and/or reduce filtering complexity.
- a luma sample region may be selected with an ‘L’ shape such that the luma sample region corresponds to the chroma samples in the current video block.
- in a DC intra-prediction mode, the mean of the one or more reference samples to the left and above the current video block may be used to predict the one or more samples in the current video block.
- in a planar intra-prediction mode, a linear function of the one or more reference samples to the left and above the current video block may be used to predict the one or more samples in the current video block.
- the luma sample region may be selected such that the one or more reference chroma samples to the left and above the current video block are enhanced.
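- The adaptive selection of a luma sample region by intra prediction mode can be sketched as a mode-to-window-shape mapping. This is a simplification: the 'L' shape described for DC/planar modes is approximated here by a square window, and the mode names are illustrative:

```python
def select_luma_region_shape(intra_mode, n=3):
    """Choose a luma sample region for cross plane filtering based on
    the intra prediction mode, returned as a (width, height) window:
    a 1-D horizontal window for vertical prediction, a 1-D vertical
    window for horizontal prediction, and a square 2-D window
    otherwise (standing in for the 'L' shape of DC/planar modes)."""
    if intra_mode == "vertical":
        return (n, 1)   # filter the top neighboring luma row
    if intra_mode == "horizontal":
        return (1, n)   # filter the left neighboring luma column
    return (n, n)       # generic 2-D region

print(select_luma_region_shape("vertical"))    # (3, 1)
print(select_luma_region_shape("horizontal"))  # (1, 3)
print(select_luma_region_shape("planar"))      # (3, 3)
```

- Matching the window shape to the expected edge orientation is what lets 1-D filtering capture the relevant high pass information at lower complexity.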
- a chroma enhancement indicator may be signaled, for example, to indicate whether to enable or disable chroma enhancement.
- the chroma enhancement indicator may be signaled, for example, at slice level, picture level, group of picture level, or sequence level.
- a chroma enhancement indicator may be signaled at a block level, for example, to indicate whether chroma enhancement processing is applied to a current coding block.
- the chroma enhancement indicator may be signaled, for example, when a current block is an intra coded block.
- One or more cross plane filter coefficients may be derived at an encoder and transmitted to a decoder.
- a filter coefficient training, e.g., for enhancement filters applied to intra coded blocks to improve reference samples, may use intra coded samples.
- One or more cross plane filter coefficients may be transmitted, for example, at slice level, picture level, group of picture level, or sequence level. The one or more cross plane filter coefficients may be transmitted at a block level.
- FIG. 11 is an example of a block based hybrid video coding device 1100 (e.g., an encoder or a decoder) with predicted chroma sample enhancement.
- a prediction block 1114 may be generated, for example, using intra prediction 1104 and/or inter prediction 1106 .
- a prediction residual 1108 may be determined, for example by subtracting a prediction block 1114 from an original block of the input video 1102 .
- the prediction residual 1108 may be transformed, for example, using DCT and/or DST block transforms, and/or quantized 1110 .
- a prediction residual 1108 may be transformed for luma and/or chroma components of the input video 1102 . Improving the accuracy of predicted chroma samples may improve coding efficiency.
- Chroma enhancement 1112 may be implemented, for example, by using a cross plane (e.g., high pass) filter to enhance one or more predicted chroma samples. Chroma samples may be enhanced, for example, for object contours and/or edge areas.
- a cross plane filter may be used on one or more (e.g. all) predicted chroma samples of a current block, for example, using reconstructed luma samples of the same and neighboring blocks.
- the one or more predicted chroma samples of a current block may be enhanced by applying a cross plane filter to a plurality of luma samples that correspond to the one or more predicted chroma samples.
- the current block may be an intra-coded video block.
- An enhanced predicted chroma sample may be generated, for example, as indicated in Eq. 11:
- P C [x][y] may be a predicted chroma sample generated at chroma position (x,y).
- a predicted chroma sample P C [x][y] may be predicted using intra prediction or inter prediction.
- a cross plane filter used in chroma enhancement may vary, for example, depending on the coding mode (e.g. intra or inter) for a current block.
- a plurality of (e.g. two) sets of cross plane filters may be trained separately.
- a first set of cross plane filters may be applied to one or more intra predicted chroma samples.
- a second set of cross plane filters may be applied to one or more inter predicted chroma samples.
- S L (x L , y L ) may be reconstructed luma samples at position (x L , y L ), where (x L , y L ) may be calculated based on chroma format, for example, as indicated in Eq. 12:
- (scaleX, scaleY) may be (2,2), (2,1) and (1,1), respectively, for chroma format 4:2:0, 4:2:2 and 4:4:4.
- a cross plane filter may be applied to one or more neighboring corresponding reference luma samples, for example, to improve the accuracy of one or more predicted chroma samples.
- the cross plane filter may enhance the quality of the one or more predicted chroma samples. Prediction residual information may be smaller, leading to improved coding performance, for example, when enhanced predicted chroma samples are used to generate a prediction residual.
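- The use of enhanced predicted chroma samples to shrink the prediction residual can be sketched as follows, with the cross plane filter output abstracted into precomputed offsets (the array values are illustrative):

```python
def enhance_predicted_chroma(pred_chroma, luma_offset):
    """Eq. 11 sketch: P_C_enh[x][y] = P_C[x][y] + offset, where the
    offset is the cross plane filter output at the collocated luma
    position (the filter itself is abstracted away here)."""
    return [[p + o for p, o in zip(prow, orow)]
            for prow, orow in zip(pred_chroma, luma_offset)]

def residual(original, predicted):
    """Per-sample prediction residual: original minus prediction."""
    return [[a - b for a, b in zip(orow, prow)]
            for orow, prow in zip(original, predicted)]

original = [[52, 61], [55, 70]]
pred = [[50, 58], [54, 66]]
# Offsets as a high pass filter on luma edges might produce them.
offsets = [[2, 3], [1, 4]]
print(residual(original, enhance_predicted_chroma(pred, offsets)))
# [[0, 0], [0, 0]]: smaller residual than against pred alone
```

- A smaller residual costs fewer bits after transform and quantization, which is the coding gain the text attributes to enhanced prediction.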
- Cross plane filtering may be used to derive one or more enhanced chroma samples.
- a cross plane filter may be applied to a neighborhood of luma samples (e.g. a luma sample region) that correspond to a current chroma sample location.
- FIG. 12 is an example of enhancing predicted chroma samples with cross plane filtering.
- a plurality of luma sample regions 1202 A, 1202 B, 1202 C may be determined. Each of the plurality of luma sample regions 1202 A, 1202 B, 1202 C may correspond to a predicted chroma sample.
- a cross plane filter may be applied to a plurality of luma samples within a luma sample region 1202 A, 1202 B, 1202 C (e.g. a 3×3 window of luma samples in the dashed box shown in FIG. 12 ).
- the cross plane filter may be applied to reconstructed luma samples within a current block (e.g. as indicated by vertical striped circles).
- the current block may be an intra-coded video block.
- An output may be determined.
- the output of applying the cross plane filter may be an offset, for example given by Eq. 11.
- An enhanced predicted chroma sample 1206 A, 1206 B, 1206 C may be determined, for example, by applying (e.g., adding) the offset 1204 A, 1204 B, 1204 C to a corresponding predicted chroma sample.
- Requiring corresponding reconstructed luma samples of the current block to be available for filtering, for use in enhancing predicted chroma samples, may introduce a coding latency between the luma and chroma components of a particular block.
- a cross plane filter may be applied on luma samples that are not yet reconstructed 1208 .
- Luma samples that are not yet reconstructed 1208 may be padded, for example, using reconstructed luma samples that neighbor (e.g., to the left or top of) the not yet reconstructed luma samples 1208 .
- an unavailable luma sample may be replaced by a neighboring reconstructed luma sample.
- Separate (e.g., different) filters may be applied, for example, depending on a prediction mode (e.g. intra or inter prediction). Other techniques (e.g. procedures) may be used to classify and apply different cross plane filters.
- a cross plane filter may be further classified or subclassified for applicability, for example, depending on whether integer or fractional (e.g. half or quarter) pixel motion vectors are used.
- a cross plane filter may be classified and/or subclassified for applicability, for example, depending on which reference picture is used in inter prediction.
- the cross plane filter may be adaptively selected and/or applied based on these and other coding parameters.
- FIG. 13 is an example of a block based hybrid video coding device 1300 (e.g., an encoder or a decoder) with chroma enhancement on reconstructed chroma samples.
- chroma enhancement may be applied after prediction and reconstruction.
- a reconstructed block 1304 (e.g. before in-loop filtering) of an input video 1302 may be generated, for example, by adding a reconstructed residual block from the inverse quantization/inverse transform 1306 to a prediction block 1308 .
- Enhancement of reconstructed chroma samples may improve the overall picture quality and/or may improve coding efficiency of the following blocks or pictures.
- Chroma enhancement 1312 may be implemented, for example, by applying a cross plane filter. Applying the cross plane filter may enhance one or more reconstructed chroma samples, e.g., at object contours and/or edge areas. The one or more reconstructed chroma samples may be enhanced before or after in-loop filtering.
- a cross plane filter may be applied to reconstructed luma samples, S L [x][y], for example, to enhance the reconstructed chroma samples of a current block.
- Enhanced reconstructed chroma samples may be calculated, for example, in accordance with Eq. 13.
- S C [x][y] may be reconstructed chroma samples and S C_enh [x][y] may be enhanced reconstructed chroma samples.
- FIG. 14 is an example of enhancement of one or more chroma reconstructed samples with cross plane filtering.
- a plurality of luma sample regions 1402 , 1404 may be determined.
- a first luma sample region 1402 may include a plurality of reconstructed luma samples from a current block 1410 .
- a second luma sample region 1404 may include a plurality of reconstructed luma samples from the current block 1410 and a plurality of reconstructed luma samples from one or more previous blocks.
- a cross plane filter may be applied to one or more reconstructed luma samples S L (x L , y L ).
- One or more reconstructed chroma samples S C [x][y] of a current block 1410 may be enhanced, for example, by applying 1406 , 1408 (e.g., adding) the output of the selected and applied cross plane filter to a corresponding reconstructed chroma sample to generate one or more enhanced reconstructed chroma samples 1412 , 1414 , S C_enh [x][y].
- a luma sample region such as luma sample region 1402 may include one or more reconstructed luma samples from a current block 1410 and one or more reconstructed luma samples from one or more previous blocks.
- Cross plane filter classification, adaptive selection, and application may be applicable to enhancement of reconstructed chroma samples.
- a cross plane filter classification may depend, for example, on a block prediction mode (e.g. intra or inter), a motion vector precision, and/or a reference picture, etc.
- One or more sets of cross plane filters may be signaled in the bitstream.
- the one or more sets of cross plane filters may be signaled based on the filter classification methods utilized.
- a number of filter sets to be signaled may be denoted as N.
- the filter coefficients of N sets of cross plane filters may be transmitted over slice level, picture level, group of picture level, or sequence level.
- a decoder may select one or more appropriate cross plane filters based on the coding mode, motion vector precision, and/or a reference picture of the current block.
- the decoder may apply one or more appropriate cross plane filters, for example, based on the coding mode, motion vector precision, and/or reference picture of the current block.
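- The decoder-side selection among N signaled filter sets can be sketched as a lookup keyed by the classification criteria named above (coding mode, motion vector precision, reference picture). The key structure and set names are assumptions:

```python
def select_filter_set(filter_sets, coding_mode, mv_is_fractional, ref_idx):
    """Pick a cross plane filter set using the classification criteria
    named in the text: coding mode, motion vector precision, and
    reference picture. A bitstream would signal N sets together with
    the classification rule in use."""
    if coding_mode == "intra":
        return filter_sets["intra"]
    key = ("inter", "frac" if mv_is_fractional else "int", ref_idx)
    # Fall back to a default inter filter set if no exact match.
    return filter_sets.get(key, filter_sets["inter_default"])

sets = {
    "intra": "F0",
    ("inter", "frac", 0): "F1",
    "inter_default": "F2",
}
print(select_filter_set(sets, "intra", False, 0))   # F0
print(select_filter_set(sets, "inter", True, 0))    # F1
print(select_filter_set(sets, "inter", False, 1))   # F2
```

- Because the decoder derives the same key from decoded syntax, no per-block filter index needs to be transmitted beyond the N signaled sets.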
- FIG. 15A is a diagram of an example communications system 100 in which one or more disclosed embodiments may be implemented.
- the communications system 100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users.
- the communications system 100 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth.
- the communications system 100 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and the like.
- the communications system 100 may include wireless transmit/receive units (WTRUs) 102 a , 102 b , 102 c , and/or 102 d (which generally or collectively may be referred to as WTRU 102 ), a radio access network (RAN) 103 / 104 / 105 , a core network 106 / 107 / 109 , a public switched telephone network (PSTN) 108 , the Internet 110 , and other networks 112 , though it will be appreciated that the disclosed embodiments contemplate any number of WTRUs, base stations, networks, and/or network elements.
- Each of the WTRUs 102 a , 102 b , 102 c , 102 d may be any type of device configured to operate and/or communicate in a wireless environment.
- the WTRUs 102 a , 102 b , 102 c , 102 d may be configured to transmit and/or receive wireless signals and may include user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, consumer electronics, and the like.
- the communications system 100 may also include a base station 114 a and a base station 114 b .
- Each of the base stations 114 a , 114 b may be any type of device configured to wirelessly interface with at least one of the WTRUs 102 a , 102 b , 102 c , 102 d to facilitate access to one or more communication networks, such as the core network 106 / 107 / 109 , the Internet 110 , and/or the networks 112 .
- the base stations 114 a , 114 b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and the like. While the base stations 114 a , 114 b are each depicted as a single element, it will be appreciated that the base stations 114 a , 114 b may include any number of interconnected base stations and/or network elements.
- the base station 114 a may be part of the RAN 103 / 104 / 105 , which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc.
- the base station 114 a and/or the base station 114 b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown).
- the cell may further be divided into cell sectors.
- the cell associated with the base station 114 a may be divided into three sectors.
- the base station 114 a may include three transceivers, e.g., one for each sector of the cell.
- the base station 114 a may employ multiple-input multiple-output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell.
- the base stations 114 a , 114 b may communicate with one or more of the WTRUs 102 a , 102 b , 102 c , 102 d over an air interface 115 / 116 / 117 , which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, etc.).
- the air interface 115 / 116 / 117 may be established using any suitable radio access technology (RAT).
- the communications system 100 may be a multiple access system and may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like.
- the base station 114 a in the RAN 103 / 104 / 105 and the WTRUs 102 a , 102 b , 102 c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 115 / 116 / 117 using wideband CDMA (WCDMA).
- WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+).
- HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).
- the base station 114 a and the WTRUs 102 a , 102 b , 102 c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 115 / 116 / 117 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).
- the base station 114 a and the WTRUs 102 a , 102 b , 102 c may implement radio technologies such as IEEE 802.16 (e.g., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.
- the base station 114 b in FIG. 15A may be a wireless router, Home Node B, Home eNode B, or access point, for example, and may utilize any suitable RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and the like.
- the base station 114 b and the WTRUs 102 c , 102 d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN).
- the base station 114 b and the WTRUs 102 c , 102 d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN)
- the base station 114 b and the WTRUs 102 c , 102 d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, etc.) to establish a picocell or femtocell.
- the base station 114 b may have a direct connection to the Internet 110 .
- the base station 114 b may not be required to access the Internet 110 via the core network 106 / 107 / 109 .
- the RAN 103 / 104 / 105 may be in communication with the core network 106 / 107 / 109 , which may be any type of network configured to provide voice, data, applications, and/or voice over internet protocol (VoIP) services to one or more of the WTRUs 102 a , 102 b , 102 c , 102 d .
- the core network 106 / 107 / 109 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication.
- the RAN 103 / 104 / 105 and/or the core network 106 / 107 / 109 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 103 / 104 / 105 or a different RAT.
- the core network 106 / 107 / 109 may also be in communication with another RAN (not shown) employing a GSM radio technology.
- the core network 106 / 107 / 109 may also serve as a gateway for the WTRUs 102 a , 102 b , 102 c , 102 d to access the PSTN 108 , the Internet 110 , and/or other networks 112 .
- the PSTN 108 may include circuit-switched telephone networks that provide plain old telephone service (POTS).
- the Internet 110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and the internet protocol (IP) in the TCP/IP internet protocol suite.
- the networks 112 may include wired or wireless communications networks owned and/or operated by other service providers.
- the networks 112 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 103 / 104 / 105 or a different RAT.
- One or more of the WTRUs 102 a , 102 b , 102 c , 102 d in the communications system 100 may include multi-mode capabilities, e.g., the WTRUs 102 a , 102 b , 102 c , 102 d may include multiple transceivers for communicating with different wireless networks over different wireless links.
- the WTRU 102 c shown in FIG. 15A may be configured to communicate with the base station 114 a , which may employ a cellular-based radio technology, and with the base station 114 b , which may employ an IEEE 802 radio technology.
- FIG. 15B is a system diagram of an example WTRU 102 .
- the WTRU 102 may include a processor 118 , a transceiver 120 , a transmit/receive element 122 , a speaker/microphone 124 , a keypad 126 , a display/touchpad 128 , non-removable memory 130 , removable memory 132 , a power source 134 , a global positioning system (GPS) chipset 136 , and other peripherals 138 .
- GPS global positioning system
- the base stations 114 a and 114 b , and/or the nodes that base stations 114 a and 114 b may represent, such as but not limited to a base transceiver station (BTS), a Node-B, a site controller, an access point (AP), a home Node-B, an evolved home Node-B (eNodeB), a home evolved Node-B (HeNB), a home evolved Node-B gateway, and proxy nodes, among others, may include one or more of the elements depicted in FIG. 15B and described herein.
- the processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like.
- the processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment.
- the processor 118 may be coupled to the transceiver 120 , which may be coupled to the transmit/receive element 122 . While FIG. 15B depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip.
- the transmit/receive element 122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 114 a ) over the air interface 115 / 116 / 117 .
- the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals.
- the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example.
- the transmit/receive element 122 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals.
- the WTRU 102 may include any number of transmit/receive elements 122 . More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 115 / 116 / 117 .
- the transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122 .
- the WTRU 102 may have multi-mode capabilities.
- the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, for example.
- the processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124 , the keypad 126 , and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or an organic light-emitting diode (OLED) display unit).
- the processor 118 may also output user data to the speaker/microphone 124 , the keypad 126 , and/or the display/touchpad 128 .
- the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132 .
- the non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device.
- the removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like.
- the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102 , such as on a server or a home computer (not shown).
- the processor 118 may receive power from the power source 134 , and may be configured to distribute and/or control the power to the other components in the WTRU 102 .
- the power source 134 may be any suitable device for powering the WTRU 102 .
- the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
- the processor 118 may also be coupled to the GPS chipset 136 , which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102 .
- the WTRU 102 may receive location information over the air interface 115 / 116 / 117 from a base station (e.g., base stations 114 a , 114 b ) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
- the processor 118 may further be coupled to other peripherals 138 , which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
- FIG. 15C is a system diagram of the RAN 103 and the core network 106 according to an embodiment.
- the RAN 103 may employ a UTRA radio technology to communicate with the WTRUs 102 a , 102 b , 102 c over the air interface 115 .
- the RAN 103 may also be in communication with the core network 106 .
- the RAN 103 may include Node-Bs 140 a , 140 b , 140 c , which may each include one or more transceivers for communicating with the WTRUs 102 a , 102 b , 102 c over the air interface 115 .
- the Node-Bs 140 a , 140 b , 140 c may each be associated with a particular cell (not shown) within the RAN 103 .
- the RAN 103 may also include RNCs 142 a , 142 b . It will be appreciated that the RAN 103 may include any number of Node-Bs and RNCs while remaining consistent with an embodiment.
- the Node-Bs 140 a , 140 b may be in communication with the RNC 142 a . Additionally, the Node-B 140 c may be in communication with the RNC 142 b .
- the Node-Bs 140 a , 140 b , 140 c may communicate with the respective RNCs 142 a , 142 b via an Iub interface.
- the RNCs 142 a , 142 b may be in communication with one another via an Iur interface.
- Each of the RNCs 142 a , 142 b may be configured to control the respective Node-Bs 140 a , 140 b , 140 c to which it is connected.
- the core network 106 shown in FIG. 15C may include a media gateway (MGW) 144 , a mobile switching center (MSC) 146 , a serving GPRS support node (SGSN) 148 , and/or a gateway GPRS support node (GGSN) 150 . While each of the foregoing elements are depicted as part of the core network 106 , it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
- the RNC 142 a in the RAN 103 may be connected to the MSC 146 in the core network 106 via an IuCS interface.
- the MSC 146 may be connected to the MGW 144 .
- the MSC 146 and the MGW 144 may provide the WTRUs 102 a , 102 b , 102 c with access to circuit-switched networks, such as the PSTN 108 , to facilitate communications between the WTRUs 102 a , 102 b , 102 c and land-line communications devices.
- the RNC 142 a in the RAN 103 may also be connected to the SGSN 148 in the core network 106 via an IuPS interface.
- the SGSN 148 may be connected to the GGSN 150 .
- the SGSN 148 and the GGSN 150 may provide the WTRUs 102 a , 102 b , 102 c with access to packet-switched networks, such as the Internet 110 , to facilitate communications between the WTRUs 102 a , 102 b , 102 c and IP-enabled devices.
- the core network 106 may also be connected to the networks 112 , which may include other wired or wireless networks that are owned and/or operated by other service providers.
- FIG. 15D is a system diagram of the RAN 104 and the core network 107 according to an embodiment.
- the RAN 104 may employ an E-UTRA radio technology to communicate with the WTRUs 102 a , 102 b , 102 c over the air interface 116 .
- the RAN 104 may also be in communication with the core network 107 .
- the RAN 104 may include eNode-Bs 160 a , 160 b , 160 c , though it will be appreciated that the RAN 104 may include any number of eNode-Bs while remaining consistent with an embodiment.
- the eNode-Bs 160 a , 160 b , 160 c may each include one or more transceivers for communicating with the WTRUs 102 a , 102 b , 102 c over the air interface 116 .
- the eNode-Bs 160 a , 160 b , 160 c may implement MIMO technology.
- the eNode-B 160 a for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102 a.
- Each of the eNode-Bs 160 a , 160 b , 160 c may be associated with a particular cell (not shown) and may be configured to handle radio resource management decisions, handover decisions, scheduling of users in the uplink and/or downlink, and the like. As shown in FIG. 15D , the eNode-Bs 160 a , 160 b , 160 c may communicate with one another over an X2 interface.
- the core network 107 shown in FIG. 15D may include a mobility management entity (MME) 162 , a serving gateway 164 , and a packet data network (PDN) gateway 166 . While each of the foregoing elements are depicted as part of the core network 107 , it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
- the MME 162 may be connected to each of the eNode-Bs 160 a , 160 b , 160 c in the RAN 104 via an S1 interface and may serve as a control node.
- the MME 162 may be responsible for authenticating users of the WTRUs 102 a , 102 b , 102 c , bearer activation/deactivation, selecting a particular serving gateway during an initial attach of the WTRUs 102 a , 102 b , 102 c , and the like.
- the MME 162 may also provide a control plane function for switching between the RAN 104 and other RANs (not shown) that employ other radio technologies, such as GSM or WCDMA.
- the serving gateway 164 may be connected to each of the eNode-Bs 160 a , 160 b , 160 c in the RAN 104 via the S1 interface.
- the serving gateway 164 may generally route and forward user data packets to/from the WTRUs 102 a , 102 b , 102 c .
- the serving gateway 164 may also perform other functions, such as anchoring user planes during inter-eNode B handovers, triggering paging when downlink data is available for the WTRUs 102 a , 102 b , 102 c , managing and storing contexts of the WTRUs 102 a , 102 b , 102 c , and the like.
- the serving gateway 164 may also be connected to the PDN gateway 166 , which may provide the WTRUs 102 a , 102 b , 102 c with access to packet-switched networks, such as the Internet 110 , to facilitate communications between the WTRUs 102 a , 102 b , 102 c and IP-enabled devices.
- the PDN gateway 166 may provide the WTRUs 102 a , 102 b , 102 c with access to packet-switched networks, such as the Internet 110 , to facilitate communications between the WTRUs 102 a , 102 b , 102 c and IP-enabled devices.
- the core network 107 may facilitate communications with other networks.
- the core network 107 may provide the WTRUs 102 a , 102 b , 102 c with access to circuit-switched networks, such as the PSTN 108 , to facilitate communications between the WTRUs 102 a , 102 b , 102 c and land-line communications devices.
- the core network 107 may include, or may communicate with, an IP gateway (e.g., an IP multimedia subsystem (IMS) server) that serves as an interface between the core network 107 and the PSTN 108 .
- the core network 107 may provide the WTRUs 102 a , 102 b , 102 c with access to the networks 112 , which may include other wired or wireless networks that are owned and/or operated by other service providers.
- FIG. 15E is a system diagram of the RAN 105 and the core network 109 according to an embodiment.
- the RAN 105 may be an access service network (ASN) that employs IEEE 802.16 radio technology to communicate with the WTRUs 102 a , 102 b , 102 c over the air interface 117 .
- the communication links between the different functional entities of the WTRUs 102 a , 102 b , 102 c , the RAN 105 , and the core network 109 may be defined as reference points.
- the RAN 105 may include base stations 180 a , 180 b , 180 c , and an ASN gateway 182 , though it will be appreciated that the RAN 105 may include any number of base stations and ASN gateways while remaining consistent with an embodiment.
- the base stations 180 a , 180 b , 180 c may each be associated with a particular cell (not shown) in the RAN 105 and may each include one or more transceivers for communicating with the WTRUs 102 a , 102 b , 102 c over the air interface 117 .
- the base stations 180 a , 180 b , 180 c may implement MIMO technology.
- the base station 180 a may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102 a .
- the base stations 180 a , 180 b , 180 c may also provide mobility management functions, such as handoff triggering, tunnel establishment, radio resource management, traffic classification, quality of service (QoS) policy enforcement, and the like.
- the ASN gateway 182 may serve as a traffic aggregation point and may be responsible for paging, caching of subscriber profiles, routing to the core network 109 , and the like.
- the air interface 117 between the WTRUs 102 a , 102 b , 102 c and the RAN 105 may be defined as an R1 reference point that implements the IEEE 802.16 specification.
- each of the WTRUs 102 a , 102 b , 102 c may establish a logical interface (not shown) with the core network 109 .
- the logical interface between the WTRUs 102 a , 102 b , 102 c and the core network 109 may be defined as an R2 reference point, which may be used for authentication, authorization, IP host configuration management, and/or mobility management.
- the communication link between each of the base stations 180 a , 180 b , 180 c may be defined as an R8 reference point that includes protocols for facilitating WTRU handovers and the transfer of data between base stations.
- the communication link between the base stations 180 a , 180 b , 180 c and the ASN gateway 182 may be defined as an R6 reference point.
- the R6 reference point may include protocols for facilitating mobility management based on mobility events associated with each of the WTRUs 102 a , 102 b , 102 c.
- the RAN 105 may be connected to the core network 109 .
- the communication link between the RAN 105 and the core network 109 may be defined as an R3 reference point that includes protocols for facilitating data transfer and mobility management capabilities, for example.
- the core network 109 may include a mobile IP home agent (MIP-HA) 184 , an authentication, authorization, accounting (AAA) server 186 , and a gateway 188 . While each of the foregoing elements are depicted as part of the core network 109 , it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
- the MIP-HA may be responsible for IP address management, and may enable the WTRUs 102 a , 102 b , 102 c to roam between different ASNs and/or different core networks.
- the MIP-HA 184 may provide the WTRUs 102 a , 102 b , 102 c with access to packet-switched networks, such as the Internet 110 , to facilitate communications between the WTRUs 102 a , 102 b , 102 c and IP-enabled devices.
- the AAA server 186 may be responsible for user authentication and for supporting user services.
- the gateway 188 may facilitate interworking with other networks.
- the gateway 188 may provide the WTRUs 102 a , 102 b , 102 c with access to circuit-switched networks, such as the PSTN 108 , to facilitate communications between the WTRUs 102 a , 102 b , 102 c and land-line communications devices.
- the gateway 188 may provide the WTRUs 102 a , 102 b , 102 c with access to the networks 112 , which may include other wired or wireless networks that are owned and/or operated by other service providers.
- the RAN 105 may be connected to other ASNs and the core network 109 may be connected to other core networks.
- the communication link between the RAN 105 and the other ASNs may be defined as an R4 reference point, which may include protocols for coordinating the mobility of the WTRUs 102 a , 102 b , 102 c between the RAN 105 and the other ASNs.
- the communication link between the core network 109 and the other core networks may be defined as an R5 reference point, which may include protocols for facilitating interworking between home core networks and visited core networks.
- a reference, predicted and/or reconstructed chroma sample may be enhanced, for example, using information derived from cross plane filtering of a 1-D or 2-D M×N window of luma samples (a filter support region) corresponding to the reference, predicted or reconstructed chroma sample, respectively.
- Luma samples may be reconstructed or padded.
- a filter support region may be adaptively selected, for example, based on a directional intra prediction mode.
- Cross plane filters may be classified, e.g., for applicability, and may be adaptively selected, for example, based on a filter support region, whether intra or inter prediction mode is used for a current block, whether integer or fractional pixel motion vectors are used and/or whether a reference picture is used in inter prediction. Signaling may be provided to a decoder, for example, to indicate at least one of whether chroma enhancement is enabled, whether chroma enhancement is applied to a current block, a cross plane filter type, a cross plane filter (e.g., a set of filters) and corresponding cross plane filter coefficients.
- a decoder may select a cross plane filter to apply to a filter support region based on received signaling.
- Examples of computer-readable storage media include a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs).
- a processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.
Abstract
Cross plane filtering may be used for enhanced chroma coding. An indication of a cross-plane filter associated with a current picture may be received. The current picture may include an intra-coded video block and a plurality of reference samples. The plurality of reference samples may be used to predict the intra-coded video block. A luma sample region may be determined in the current picture. The luma sample region may be determined to enhance a corresponding chroma sample in the current picture. The cross-plane filter may be applied to a plurality of luma samples in the luma sample region to determine an offset. The cross-plane filter may be a high pass filter. The offset may be applied to the corresponding chroma sample to determine an enhanced chroma sample.
Description
- This application claims priority to U.S. provisional patent application No. 62/190,008, filed Jul. 8, 2015, which is incorporated herein by reference in its entirety.
- Video coding systems may be used to compress digital video signals. For example, video coding systems may reduce storage space consumed and/or reduce transmission bandwidth consumption associated with video signals. For example, block-based hybrid video coding systems may be used.
- Digital video signals may include three color planes. The three color planes may include a luma plane, a blue-difference chroma plane, and a red-difference chroma plane. Pixels of the chroma planes may have smaller dynamic ranges than pixels of the luma plane. For example, the chroma planes of a video image may be smoother and/or have less detail than the luma plane of the video image. A chroma block of a video image may be easier to predict (e.g., accurately predict) than a corresponding luma block. For example, prediction of the chroma block may consume fewer resources and/or result in less prediction error.
- A high dynamic range (HDR) video may offer a wider dynamic range than a standard dynamic range (SDR) video. The dynamic range of HDR video may be closer to the capabilities of the human eye. Chroma artifacts in HDR video may be more visible against a brighter background than chroma artifacts in SDR video. HDR video coding may include preprocessing, coding, decoding, and/or post-processing.
- Systems, methods, and instrumentalities are disclosed for enhanced chroma coding using cross plane filtering. An indication of a cross-plane filter associated with a current picture may be received. The indication may include one or more filter coefficients associated with the cross-plane filter. The current picture may include an intra-coded video block and a plurality of reference samples. The plurality of reference samples may be used to predict the intra-coded video block. A luma sample region may be determined in the current picture. The luma sample region may include a plurality of luma samples. For example, the luma sample region may be a 3×3 block of luma samples. The plurality of luma samples may include predicted luma samples such that the cross-plane filter is applied to the predicted luma samples prior to reconstruction of the luma samples. The plurality of luma samples may include reconstructed luma samples such that the cross-plane filter may be applied to the reconstructed luma samples after reconstruction. The luma sample region may be determined based on a selected intra prediction mode.
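As a rough illustration of the luma sample region determination described above, the sketch below maps a chroma position to its co-located luma position for 4:2:0 content and extracts a 3×3 luma window. The 2:1 subsampling ratio, the window size, and the function name are illustrative assumptions, not the normative procedure.

```python
def luma_region_for_chroma(luma_plane, cx, cy):
    """Return the 3x3 luma window (as a list of rows) around the luma
    position co-located with the chroma sample at (cx, cy), assuming
    4:2:0 subsampling so the chroma sample maps to luma (2*cx, 2*cy)."""
    lx, ly = 2 * cx, 2 * cy
    return [[luma_plane[ly + dy][lx + dx] for dx in (-1, 0, 1)]
            for dy in (-1, 0, 1)]
```

In a full implementation the window would also be clipped or padded at picture and block boundaries, as discussed below for unavailable samples.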
- The luma sample region may be determined for enhancing a corresponding chroma sample in the current picture. The corresponding chroma sample may be a predicted chroma sample or a reconstructed chroma sample. When the corresponding chroma sample is a predicted chroma sample in the intra-coded video block, the enhanced chroma sample may be used for prediction. When the corresponding chroma sample is a reconstructed chroma sample before in-loop filtering, the enhanced chroma sample may be used to replace the corresponding chroma sample before in-loop filtering is applied. The corresponding chroma sample may be a reference chroma sample used to predict one or more chroma samples in the intra-coded video block. A determination of whether to apply the cross-plane filter may be made based on a received chroma enhancement indicator. The received chroma enhancement indicator may be received at a block level. The cross-plane filter may be applied to a plurality of luma samples in the luma sample region to determine an offset. The cross-plane filter may be a high pass filter. The offset may be applied to the corresponding chroma sample to determine an enhanced chroma sample. The luma sample region may include an unavailable luma sample. The unavailable luma sample may be replaced with a neighboring available luma sample, for example, prior to applying the cross-plane filter to the plurality of luma samples. The cross-plane filter to apply may be determined based on the selected intra prediction mode.
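A minimal sketch of the filtering step described above, assuming a hypothetical 3×3 high-pass filter whose coefficients sum to zero (in practice the coefficients would be signaled in the bitstream) and simple replacement padding for unavailable luma samples:

```python
import numpy as np

# Hypothetical high-pass cross-plane filter; the coefficients sum to zero,
# so a flat luma region contributes no offset. Real coefficients would be
# signaled to the decoder.
CROSS_PLANE_FILTER = np.array([[-1, -2, -1],
                               [-2, 12, -2],
                               [-1, -2, -1]], dtype=np.int64)

def enhance_chroma(chroma_sample, luma_region, available=None, shift=4):
    """Filter the 3x3 luma region to derive an offset, then add the offset
    to the corresponding chroma sample and clip to the 8-bit sample range.

    `available`, if given, is a boolean mask; unavailable luma samples are
    replaced with the center sample, which is assumed to be available.
    """
    luma_region = np.asarray(luma_region, dtype=np.int64)
    if available is not None:
        luma_region = luma_region.copy()
        luma_region[~np.asarray(available)] = luma_region[1, 1]
    offset = int((CROSS_PLANE_FILTER * luma_region).sum()) >> shift
    return int(min(max(chroma_sample + offset, 0), 255))
```

Because the filter is high-pass, only luma detail (edges, texture) around the co-located position shifts the chroma value; smooth regions leave the chroma sample unchanged.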
- A plurality of luma sample regions may be determined for a current picture. For example, a first luma sample region and a second luma sample region may be determined. The first luma sample region may include a first plurality of luma samples. The second luma sample region may include a second plurality of luma samples. The first luma sample region may neighbor a first corresponding chroma sample in the current picture. The second luma sample region may neighbor a second corresponding chroma sample in the current picture. The cross-plane filter may be applied to the first plurality of luma samples and the second plurality of luma samples to determine a first offset and a second offset, respectively. The first offset may be applied to the first corresponding chroma sample to determine a first enhanced chroma sample. The second offset may be applied to the second corresponding chroma sample to determine a second enhanced chroma sample.
- The patent or application file contains at least one drawing executed in color. Copies of this patent or patent application publication with color drawing(s) will be provided by the Office upon request and payment of the necessary fee.
-
FIG. 1 shows an example of a block based hybrid video encoder. -
FIG. 2 shows an example of a block based hybrid video decoder. -
FIG. 3 shows an example of reference samples Rx,y used for prediction to obtain predicted samples Px,y for a block size of N×N samples. -
FIG. 4 shows an example of partitioning modes for an intra prediction unit. -
FIG. 5 shows an example of angular intra prediction modes. -
FIG. 6 shows an example of an intra boundary filter. -
FIG. 7 shows an example of different partitions in HEVC inter-prediction coding. -
FIG. 8A shows an example of motion compensated prediction using motion vectors. -
FIG. 8B shows an example of fractional pixel interpolation. -
FIG. 9 shows an example of chroma reference sample enhancement with cross plane filtering that does not use reconstructed luma samples from the current block. -
FIG. 10 shows an example of chroma reference sample enhancement with cross plane filtering using reconstructed luma samples from a current block. -
FIG. 11 shows an example of a block based hybrid encoder with predicted chroma sample enhancement. -
FIG. 12 shows an example of enhancing predicted chroma samples with cross plane filtering. -
FIG. 13 shows an example of a block based hybrid encoder with chroma enhancement on reconstructed chroma samples. -
FIG. 14 shows an example of chroma reconstructed sample enhancement with cross plane filtering. -
FIG. 15A is a system diagram of an example communications system in which one or more disclosed embodiments may be implemented. -
FIG. 15B is a system diagram of an example wireless transmit/receive unit (WTRU) that may be used within the communications system illustrated in FIG. 15A . -
FIG. 15C is a system diagram of an example radio access network and an example core network that may be used within the communications system illustrated in FIG. 15A . -
FIG. 15D is a system diagram of another example radio access network and an example core network that may be used within the communications system illustrated in FIG. 15A . -
FIG. 15E is a system diagram of another example radio access network and an example core network that may be used within the communications system illustrated in FIG. 15A . - A detailed description of illustrative embodiments will now be provided with reference to the various figures. Although this description provides a detailed example of possible implementations, it should be noted that the details are intended to be exemplary and in no way limit the scope of the application.
- Video coding systems may compress digital video signals, for example, to reduce the storage and/or transmission bandwidth of digital video signals. There are a variety of video coding systems, such as block-based, wavelet-based, and object-based systems, and block-based hybrid video coding systems. Examples of block-based video coding systems are H.261, MPEG-1 and MPEG-2 (Moving Picture Experts Group), H.263, H.264/Advanced Video Coding (AVC), and H.265/High Efficiency Video Coding (HEVC).
-
FIG. 1 shows an example of a block based hybrid video encoder. Spatial prediction (e.g., intra prediction) or temporal prediction (e.g. inter prediction) may be performed, for example, for a (e.g., each) video block, e.g., to reduce spatial and temporal redundancy in video blocks. A prediction block generated from intra or inter prediction may be subtracted from a current video block. A resulting prediction residual may be transformed and quantized. A residual may be reconstructed, for example, by inverse quantizing and inverse transforming quantized residual coefficients. A reconstructed residual may be added to a prediction block, e.g., to form a reconstruction video block. In-loop filtering may be applied to a reconstructed video block. A filtered reconstructed video block, which may be stored in a decoded picture buffer, may be used to code one or more next video blocks. -
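The residual path of the encoder loop above can be sketched as follows. This is a toy illustration under stated assumptions: the transform stage is omitted and a single uniform quantization step `qstep` stands in for the real quantizer, so the function names and simplifications are not the encoder's actual design.

```python
import numpy as np

def encode_residual(block, prediction, qstep=8):
    """Form the prediction residual and quantize it (transform omitted)."""
    residual = block.astype(np.int64) - prediction.astype(np.int64)
    return np.rint(residual / qstep).astype(np.int64)

def reconstruct_block(quantized, prediction, qstep=8):
    """Inverse quantize the residual and add it back to the prediction,
    mirroring the reconstruction loop shared by the encoder and decoder."""
    return prediction.astype(np.int64) + quantized * qstep
```

With `qstep=1` the round trip is lossless; larger steps bound the reconstruction error by about half a step while cutting the data to be entropy coded, which is the basic rate-distortion trade-off of the hybrid loop.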
FIG. 2 shows an example of a block based hybrid video decoder. The decoder in FIG. 2 may correspond to the encoder in FIG. 1 . An encoded video bitstream may be parsed and entropy decoded. A coding mode and associated prediction information may be passed, for example, to spatial prediction or motion compensated prediction, e.g., to form a prediction block. Residual transform coefficients may be inverse quantized and inverse transformed, for example, to reconstruct a residual block. A prediction block and a reconstructed residual block may be added together, e.g., to form a reconstructed block. In-loop filtering may be applied to a reconstructed video block. A filtered reconstructed video block, which may be stored in a reference picture buffer, may be used to predict future video blocks. - Intra coding may be used, for example, to eliminate spatial correlation in some image and video coding techniques, such as Joint Photographic Experts Group (JPEG), H.261, MPEG-1, MPEG-2, H.263, H.264/AVC and H.265/HEVC. Directional intra prediction may be used, for example, in H.264/AVC and H.265/HEVC, e.g., to improve coding efficiency. Intra prediction modes may utilize a set of reference samples, e.g., from above and to the left of a current block to be predicted. Reference samples may be denoted as Rx,y. In an example, a position (x, y) may have its origin one pixel above and to the left of a block's top-left corner. A predicted sample value at the position (x, y) may be denoted as Px,y.
-
FIG. 3 shows an example of reference samples Rx,y used for prediction to obtain predicted samples Px,y for a block size of N×N samples. -
FIG. 4 shows an example of partitioning modes for an intra prediction unit (PU). HEVC intra coding may support multiple types of PU division, e.g., PART_2N×2N and PART_N×N, which may split a coding unit (CU) into one or four equal size PUs, respectively. PART_2N×2N may be available when a CU size is a configured minimum CU size. - An 8×8 CU split into four 4×4 PUs may have four luma prediction blocks (PBs), for example, for 4:2:0 chroma formats. There may be one 4×4 PB per chroma channel for intra coded blocks, for example, to avoid high throughput caused by 2×2 chroma intra prediction blocks.
- A CU may be split into multiple transform units (TUs). Intra prediction may be applied sequentially to TUs, for example, as compared to applying intra prediction at the PU level. Sequential intra prediction may permit the use, in intra prediction, of neighboring reference samples from previously reconstructed TUs that are closer to the current TU samples.
-
FIG. 5 shows an example of angular intra prediction modes. HEVC may support one or more (e.g. 35) intra prediction modes. The one or more intra prediction modes may include a DC mode, a planar mode, and/or 33 directional or ‘angular’ intra prediction modes. - Angular intra prediction may be used to efficiently model different directional structures in video and image content. The number and angularity of prediction directions may be selected based on a trade-off between encoding complexity and coding efficiency.
- A predicted sample Px,y may be obtained, for example, by projecting its location to a reference row or column of pixels, applying a selected prediction direction, and interpolating a predicted value for the sample at 1/32 pixel accuracy. Interpolation may be performed linearly utilizing the two closest reference samples, e.g., Ri,0 and Ri+1,0 for vertical prediction (e.g. mode 18-34 as shown in
FIG. 5 ) and R0,i and R0,i+1 for horizontal prediction (e.g. mode 2-17 as shown in FIG. 5 ). - HEVC may support one or more intra prediction modes for luma intra prediction for a variety of (e.g. all) PU sizes. HEVC may define multiple (e.g. three) most probable modes (MPMs) for a (e.g. each) PU, for example, based on the modes of one or more neighboring PUs. A current intra prediction mode may be equal to one of the elements in a set of MPMs. An index in the set may be transmitted to the decoder. A code (e.g. a 5-bit fixed length code) may be used to determine a selected mode outside the set of MPMs.
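The 1/32-pel linear interpolation between the two closest reference samples described above can be sketched as follows. This is an illustrative helper under assumed conventions (a 1-D list of reference samples and a fractional offset in units of 1/32 sample), not pseudocode taken from the text.

```python
def interpolate_sample(ref, i, frac):
    """Linearly interpolate a predicted sample at 1/32-pel accuracy
    from the two closest reference samples ref[i] and ref[i + 1].
    `frac` is the fractional part of the projected position, 0..31."""
    # Weighted average of the two neighbors with rounding.
    return ((32 - frac) * ref[i] + frac * ref[i + 1] + 16) >> 5
```

With frac equal to 0 the predicted value is the reference sample itself; with frac equal to 16 it is the rounded midpoint of the two neighbors.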
- Reference samples may be smoothed. In an example, a 3-tap smoothing filter may be applied to one or more reference samples. The smoothing may be applied, for example, when an intra_smoothing_disabled_flag is set to 0. Filtering may be controlled, for example, by a given intra prediction mode and/or transform block size. In an example, e.g., for 32×32 blocks, angular modes may use filtered reference samples, for example, except horizontal and vertical angular modes. In another example, e.g., for 16×16 blocks, modes not using filtered reference samples may be extended to four modes (e.g. 9, 11, 25, 27) closest to horizontal and vertical as shown in
FIG. 5 . In another example, e.g., for 8×8 blocks, diagonal modes (2, 18, 34) may use filtered reference samples. - Intra prediction may be applied for a chroma component. In an example, an intra prediction mode may be specified, e.g., as planar, DC, horizontal, vertical, ‘DM_CHROMA’ mode, or diagonal mode (34), for example, for one or more prediction blocks (PBs) associated with chroma.
- Table 1 shows an example mapping between an intra prediction mode and an intra prediction direction for chroma. A chroma color channel intra prediction mode may be based on a corresponding luma PB intra prediction mode and/or an intra_chroma_pred_mode syntax element.
-
TABLE 1

                          Luma intra prediction direction, X
intra_chroma_pred_mode    0    26    10    1    Otherwise (0 <= X <= 34)
0                         34   0     0     0    0
1                         26   34    26    26   26
2                         10   10    34    10   10
3                         1    1     1     34   1
4 (DM_CHROMA)             0    26    10    1    X

- Table 2 shows an example specification of intra prediction mode for 4:2:2 chroma format, e.g., when a DM_CHROMA mode is selected and a 4:2:2 chroma format is in use. An intra prediction mode for a chroma PB may be derived, for example, from an intra prediction mode for a corresponding luma PB, e.g., as specified in Table 2.
-
TABLE 2

intra pred mode                     0  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16 17
intra pred mode for 4:2:2 chroma    0  1  2  2  2  2  3  5  7  8 10 12 13 15 17 18 19 20

intra pred mode                    18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34
intra pred mode for 4:2:2 chroma   21 22 23 23 24 24 25 25 26 27 27 28 28 29 29 30 31

-
FIG. 6 shows an example of an intra boundary filter. An intra-boundary filter may be used, for example, when reconstructing intra-predicted transform blocks (TBs). An intra-boundary filter may be used, for example, to filter predicted luma samples along the left and/or top edges of the TB for PBs using horizontal, vertical, and/or DC intra prediction modes, e.g., as shown in FIG. 6 . - An intra boundary filter may be defined, for example, based on an array of predicted samples p as an input and/or predSamples as an output.
- Intra boundary filtering provided by Eq. (1) may be used to generate predSamples as an output, for example, for horizontal intra-prediction applied to luma transform blocks of size (nTbS) less than 32×32, disableIntraBoundaryFilter equal to 0, where x=0 . . . nTbS−1, y=0:
-
predSamplesx,y=Clip1Y(P −1,y+((P x,−1 −P −1,−1)>>1)) (1) - Intra boundary filtering provided by Eq. (2) may be used to generate predSamples as an output, for example, for vertical intra-prediction applied to luma transform blocks of size (nTbS) less than 32×32, disableIntraBoundaryFilter equal to 0, where x=0, y=0 . . . nTbS−1:
-
predSamplesx,y=Clip1Y(P x,−1+((P −1,y −P −1,−1)>>1)) (2) - Intra boundary filtering provided by Eq. (3), Eq. (4) and Eq. (5) may be used to generate predSamples as an output, for example, for DC intra-prediction applied to luma transform blocks of size (nTbS) less than 32×32 and a DC predictor dcVal:
-
predSamples0,0=(P −1,0+2*dcVal+P 0,−1+2)>>2 (3) -
predSamplesx,0=(P x,−1+3*dcVal+2)>>2,with x=1 . . . nTbS−1 (4) -
predSamples0,y=(P −1,y+3*dcVal+2)>>2,with y=1 . . . nTbS−1 (5) - An improvement may be provided by boundary smoothing, e.g. 0.4% average improvement. An intra boundary filter may be applied on a luma component. An intra boundary filter may not be applied on a chroma component, e.g., because prediction for chroma components tends to be smooth.
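The DC-mode boundary filtering of Eq. (3) through Eq. (5) can be sketched as follows. The dict-based layout of neighboring reference samples and the function name are assumptions for illustration, not the text's own notation.

```python
def dc_boundary_filter(p, n_tbs, dc_val):
    """Apply the DC intra boundary filter of Eq. (3)-(5).
    `p` maps (x, y) to neighboring reference samples, with x == -1
    or y == -1 denoting the left column / top row of neighbors."""
    # Start from the flat DC prediction, then filter the block edges.
    pred = [[dc_val] * n_tbs for _ in range(n_tbs)]  # indexed pred[x][y]
    pred[0][0] = (p[(-1, 0)] + 2 * dc_val + p[(0, -1)] + 2) >> 2   # Eq. (3)
    for x in range(1, n_tbs):
        pred[x][0] = (p[(x, -1)] + 3 * dc_val + 2) >> 2            # Eq. (4)
    for y in range(1, n_tbs):
        pred[0][y] = (p[(-1, y)] + 3 * dc_val + 2) >> 2            # Eq. (5)
    return pred
```

The interior samples keep the plain DC value; only the top row and left column are blended toward the reference samples.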
- HEVC intra mode residual coding may utilize intra mode dependent transforms and/or coefficient scanning to code residual information. A discrete sine transform (DST) may be selected for 4×4 luma blocks. A discrete cosine transform (DCT) may be selected/used for other types of blocks.
- A linear-model (LM) based chroma intra prediction mode may be used, for example, to predict chroma samples from collocated reconstructed luma samples using a linear model (LM), e.g., in accordance with Eq. (6):
-
PredC[x,y]=α·Rec L[x,y]+β (6) - where PredC may indicate predicted chroma samples in a block and RecL may indicate corresponding reconstructed luma samples in a block. Parameters α and β may be derived from causal reconstructed luma and chroma samples around a current block.
- Linear model chroma intra prediction may improve coding efficiency. As an example, experimental results in a test configuration indicate average Bjøntegaard delta rate (BD-rate) reductions of 1.3%, 6.5%, and 5.5% for the Y, Cb, and Cr components, respectively. In an example, a similar level of coding efficiency improvement of chroma components may be provided in a test configuration.
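The parameters α and β of Eq. (6) can be derived from the causal neighboring samples by a least-squares fit, for example as in the floating-point sketch below. A real codec would use an integer formulation; the function name and fallback behavior are illustrative assumptions.

```python
def derive_lm_params(rec_luma, rec_chroma):
    """Least-squares fit of Pred_C = alpha * Rec_L + beta (Eq. (6))
    from corresponding causal reconstructed luma/chroma neighbors."""
    n = len(rec_luma)
    sum_l = sum(rec_luma)
    sum_c = sum(rec_chroma)
    sum_ll = sum(l * l for l in rec_luma)
    sum_lc = sum(l * c for l, c in zip(rec_luma, rec_chroma))
    denom = n * sum_ll - sum_l * sum_l
    if denom == 0:
        # Flat luma neighborhood: fall back to predicting the chroma mean.
        return 0.0, sum_c / n
    alpha = (n * sum_lc - sum_l * sum_c) / denom
    beta = (sum_c - alpha * sum_l) / n
    return alpha, beta
```

When the neighbors follow an exact linear relationship, the fit recovers it; otherwise it minimizes the squared prediction error over the neighborhood.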
-
FIG. 7 shows an example of different partitions for HEVC inter-prediction coding. Inter coding may be used, for example, to remove or reduce temporal redundancy. HEVC inter-prediction coding may support more PB partition shapes than intra-prediction coding (e.g., intra-coding). Intra-picture prediction may support, for example, partitions PART_2N×2N and PART_N×N. Inter-picture prediction may support, for example, partitions PART_2N×2N, PART_2N×N, PART_N×2N, PART_N×N and asymmetric motion partitions PART_2N×nU, PART_2N×nD, PART_nL×2N, and PART_nR×2N. - An (e.g. each) inter-predicted PU may have a set of motion parameters comprising one or more motion vectors and one or more reference picture indices. A P slice may, for example, use one reference picture list and a B slice may, for example, use two reference picture lists. Inter-prediction samples of a PB may be determined from one or more samples of a corresponding block region in a reference picture identified by a reference picture index. The corresponding block region may be at a position displaced by the horizontal and vertical components of a motion vector (MV).
-
FIG. 8A shows an example of motion compensated prediction using motion vectors (MVs). Horizontal and vertical motion vectors may be denoted as dx and dy, respectively. -
FIG. 8B shows an example of fractional sample interpolation. Fractional sample interpolation may be used, for example, when a motion vector has a fractional value. Fractional sample interpolation may generate prediction samples for non-integer sample positions. HEVC may support MVs, for example, with units of ¼ of the distance between luma samples. HEVC may support MVs, for example, with units of ⅛ of the distance between chroma samples, e.g., in 4:2:0 format. - Motion vector prediction may exploit spatial-temporal correlation of a motion vector with neighboring PUs. A merge mode may be used, for example, to reduce motion vector signaling cost. A merge mode merge candidate list may comprise a list of motion vector candidates from neighboring PU positions (e.g. spatial neighbors and/or temporal neighbors) and/or zero vectors. An encoder may select a (e.g. the best) predictor from a merge candidate list and may transmit a corresponding index indicating the predictor chosen from the merge candidate list.
- Cross plane filtering may use high frequency information from luma, for example, to improve and enhance chroma quality. High frequency information may be extracted from luma, for example, by applying a high pass filter on the luma component.
- Luma and chroma components may have some correlations, such as object contours and edge areas. Cross-plane filtering for chroma enhancement may include applying high pass filtering to a luma component. An output of the high pass filtering may be added to the chroma component to determine an enhanced chroma component. The output of the high pass filtering may be an offset. Eq. 7 and Eq. 8 indicate an example of a chroma enhancement:
-
Y offset =Y rec⊗cross_plane_filter (7) -
C enh =C rec +Y offset (8) - where Y denotes luma, C denotes chroma, cross_plane_filter is a filter applied to the luma signal, Yrec is the reconstructed luma signal, Yoffset is the output of the filtering, Crec is the reconstructed chroma signal (which may be a Cb or Cr component), and Cenh is the enhanced chroma signal. The filter may be a 1D or 2D filter. A cross plane filter may be derived, for example, based on original chroma and luma and/or reconstructed chroma and luma, e.g., using a Least Square method.
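Eq. (7) and Eq. (8) can be sketched as below for a case where the luma and chroma planes align one-to-one (e.g., 4:4:4). The edge padding and the example kernel are assumptions for illustration, not taken from the text.

```python
def cross_plane_enhance(y_rec, c_rec, hp_filter):
    """Convolve the reconstructed luma plane with a high-pass filter
    (Eq. (7)) and add the resulting offset to the reconstructed chroma
    plane (Eq. (8)).  Out-of-bounds luma samples are edge-padded."""
    h, w = len(y_rec), len(y_rec[0])
    kh, kw = len(hp_filter), len(hp_filter[0])
    c_enh = []
    for y in range(h):
        row = []
        for x in range(w):
            offset = 0  # Y_offset at this position
            for j in range(kh):
                for i in range(kw):
                    yy = min(max(y + j - kh // 2, 0), h - 1)
                    xx = min(max(x + i - kw // 2, 0), w - 1)
                    offset += hp_filter[j][i] * y_rec[yy][xx]
            row.append(c_rec[y][x] + offset)  # C_enh = C_rec + Y_offset
        c_enh.append(row)
    return c_enh
```

Because a high-pass kernel's taps sum to zero, flat luma regions contribute no offset, so chroma is only adjusted near luma contours and edges.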
- A luma prediction mode may be utilized in DM_CHROMA mode for intra prediction, for example, to derive a chroma prediction mode that reduces a signaling overhead of a chroma prediction mode. The chroma prediction mode may be signaled as DM_CHROMA, for example, when a chroma prediction block (PB) utilizes the same prediction mode as a corresponding luma PB. A linear model (LM) chroma prediction mode may predict one or more chroma samples from collocated reconstructed luma samples, for example, by a linear model.
- Reference chroma samples (Rx,y), corresponding chroma prediction samples (Px,y), and reconstructed chroma samples may be processed independently of their corresponding luma component(s). For example, for inter prediction, chroma prediction samples and chroma reconstructed samples may be generated independently from their corresponding luma component(s). Cross-plane filtering may be applied at different stages of the intra coding process.
- Cross-plane filtering may be used to enhance one or more neighboring chroma samples. The one or more neighboring chroma samples may neighbor the current video block. The neighboring chroma samples may include reference chroma samples, reconstructed chroma samples, and/or predicted chroma samples. For example, a predicted chroma sample may be used for prediction. Neighboring reference chroma samples, Rx,y, may be used to generate predicted chroma samples Px,y, for intra coding, e.g., as shown in
FIG. 3 . Cross-plane chroma filtering may derive high-frequency information from one or more luma samples that correspond to a chroma sample of the one or more neighboring chroma samples, for example, to improve chroma quality in the coding color space. - A cross plane filter may be applied to one or more neighboring luma samples. The neighboring luma sample(s) may include reference luma samples, reconstructed luma samples, and/or predicted luma samples. Derived high pass information may be used, for example, to enhance the quality of chroma reference samples. Enhanced reference chroma samples may be used to generate predicted chroma samples.
- One or more enhanced chroma samples may be determined using cross plane filtering. For example, a cross plane filter may be applied to one or more neighboring luma samples. The cross plane filter may be a high pass filter. The one or more neighboring luma samples may correspond to a chroma sample to be enhanced. The cross plane filter may be applied to available and/or unavailable luma samples. A predicted luma sample and/or a reconstructed luma sample (e.g., before or after loop filtering) may be an available luma sample. A non-reconstructed luma sample and/or a non-predicted luma sample may be an unavailable luma sample.
-
FIG. 9 is an example of chroma reference sample enhancement 900 with cross plane filtering that does not use reconstructed luma samples from a current video block. The current video block may include a 4×4 block of samples, e.g., as defined by a solid line in FIG. 9 . - One or more luma sample regions such as 902A, 902B may be determined for a current picture. As shown, a luma sample region such as 902A, 902B may include a plurality of luma samples that neighbor a corresponding chroma sample and/or a luma sample that is collocated with the corresponding chroma sample. A sample may neighbor another sample if it is above, below, to the left of, to the right of, and/or diagonal to the other sample. For example, a neighboring sample may be next to the corresponding sample. Collocated samples may include a luma sample at the same location as a chroma sample. The
luma sample region 902A may be determined to include one or more neighboring luma samples and/or a collocated luma sample. The luma sample region 902A may be determined such that a chroma sample for enhancement (e.g., enhanced chroma sample 912A) is at the center of each of the one or more luma sample regions 902A, 902B. - The luma samples in a luma sample region such as 902A may include one or more luma sample(s) 904A, reference luma sample(s) 906A, and/or predicted luma sample(s) 908A. The reference luma samples 906A may be reconstructed luma samples used to replace predicted luma sample(s) 908A. For example, each of the one or more predicted luma samples 908A, 908B may be replaced by (e.g., padded from) a respective neighboring reconstructed luma sample of the one or more reconstructed luma samples 906. For example, a luma sample region 902A, 902B may be an M×N (e.g., 3×3) window, e.g., as shown in FIG. 9 . The luma samples in the M×N window may correspond to a chroma sample location. A cross plane filter may be applied to the plurality of luma samples of the luma sample region 902A, 902B, for example, to enhance the corresponding chroma samples 912A, 912B. An enhanced chroma sample, such as enhanced chroma samples 912A, 912B, may be used as reference samples for intra-prediction of the current video block. - In an example, the plurality of luma samples in an M×N luma sample region, e.g., a 3×3 window, may include (e.g. only) reconstructed luma samples before or after in-loop filtering. For example, the plurality of luma samples in the M×N luma sample region may be available. In another example, one or more luma samples in an M×N luma sample region may not have been reconstructed, e.g., as shown in
FIG. 9 . Luma samples that have not been reconstructed may be unavailable luma samples. The unavailable luma samples may be replaced by (e.g., padded using) a neighboring available (e.g., reconstructed) luma sample. Prediction and reconstruction of luma and chroma samples in the current block may be performed in parallel. -
FIG. 10 is an example of chroma reference sample enhancement 1000 with cross plane filtering using reconstructed luma samples from a current block. In an example, the luma samples in a current block may be reconstructed before the respective corresponding chroma samples, for example, when higher latency between different color channels may be tolerated. Reconstructed luma samples in an M×N (e.g. 3×3) luma sample region 1002A, 1002B may be available, e.g. without padding, when the cross plane filter is applied. The luma sample region 1002A, 1002B may be an M×N (e.g., 3×3) window. - For example, a luma sample region such as 1002A, 1002B may be determined for a current picture. A luma sample region such as 1002A, 1002B may include a plurality of luma samples that neighbor a corresponding chroma sample and/or a luma sample that is collocated with the corresponding chroma sample. The plurality of luma samples in each of the
luma sample regions 1002A, 1002B may include one or more reconstructed luma samples, including reconstructed luma samples 1008A, 1008B that are within the current block. - Cross plane filtering may be applied to the plurality of luma samples in the one or more
luma sample regions 1002A, 1002B. The luma samples in the one or more luma sample regions 1002A, 1002B may include some luma samples within a current block (e.g., as shown in FIG. 9 ). The luma samples in the current block may include one or more reconstructed luma samples, e.g., before or after loop filtering. - One or more enhanced
reference chroma samples 1012A, 1012B may be generated, for example, in accordance with Eq. 9. -
R C_enh[x][y]=R C[x][y]+S L(x L ,y L)⊗cross_plane_filter (9) - where RC[x][y] may be reconstructed reference chroma samples before enhancement, RC_enh [x][y] may be enhanced reference chroma samples, SL(xL, yL) may be an array of reconstructed luma samples centering at position (xL, yL). The one or more enhanced
reference chroma samples 1012A, 1012B may correspond to a center of the one or more luma sample regions 1002A, 1002B. - A luma sample position (xL, yL) of a corresponding chroma sample position (x, y) may be calculated based on chroma format, for example, in accordance with Eq. 10:
-
x L=scaleX*x -
y L=scaleY*y (10) - where scaleX and scaleY may be, for example, (2,2), (2,1) (1,1), respectively, for chroma format 4:2:0, 4:2:2, and 4:4:4. Enhanced chroma reference samples RC_enh [x][y] may be used in the intra prediction process, for example, using a directional/DC/planar intra prediction mode, e.g., as shown in
FIG. 5 . - A two-dimensional (2-D) cross plane filter may be applied to a luma sample region (e.g., an N×N region of luma samples as shown in
FIGS. 9 and 10 ). A one-dimensional (1-D) cross plane filter may be applied to one or more luma samples in a 1×N or N×1 luma sample region. As an example, N horizontal luma reference samples above a current block may be filtered, for example, when an N×1 luma sample region is used in cross plane filtering. As an example, N vertical luma reference samples to the left of a current block may be filtered, for example, when a 1×N luma sample region is used in cross plane filtering. - In an example, a luma sample region may be adaptively selected, for example, based on an intra prediction mode. The intra prediction mode may be a directional (e.g., vertical, horizontal, etc.) intra prediction mode, a DC intra prediction mode, a planar intra prediction mode, or any other intra prediction mode. The cross plane filter to apply may be determined based on which intra prediction mode is selected. A different set of cross-plane filters may be selected, for example, to match the edge and/or boundary characteristics of various different modes. As an example, a vertical prediction mode may use the top row of reference samples. When the vertical prediction mode is selected, one or more chroma reference samples above the current video block may be enhanced. For example, a luma sample region may be selected such that it includes one or more luma samples that neighbor a corresponding chroma reference sample above the current video block. The luma sample region may include a luma sample that is collocated with the corresponding chroma reference sample above the current video block. As another example, when horizontal prediction mode is selected, one or more chroma reference samples to the left or right of the current video block may be enhanced. A luma sample region may be selected such that it includes one or more luma samples that neighbor a corresponding chroma reference samples to the left or right of the current video block. 
The luma sample region may include a luma sample that is collocated with the corresponding chroma reference sample to the left or right of the current video block. An edge may occur vertically, for example, when a vertical prediction mode is selected.
- A luma sample region may be selected with a horizontal rectangular shape, for example, in comparison to a square 2-D luma sample region depicted in examples shown in
FIGS. 9 and 10 . For example, a 1-D horizontal luma sample region may be selected and/or applied for a vertical prediction mode. 1-D horizontal filtering (e.g., using the top neighboring luma samples) may be used, for example, to retrieve the vertical high pass edge information more effectively and/or reduce filtering complexity. In an example of adaptive filtering for a horizontal prediction mode, a left column of reference samples may be used. An edge may occur horizontally, for example, when horizontal prediction mode is selected. A luma sample region with a vertical rectangular shape may be used. For example, a 1-D vertical luma sample region may be selected and/or applied. 1-D vertical filtering (e.g. using the left neighboring luma samples) may retrieve the horizontal high pass edge information more effectively and/or reduce filtering complexity. - A luma sample region may be selected with an ‘L’ shape such that the luma sample region corresponds to the chroma samples in the current video block. For example, when DC intra-prediction mode is selected, the mean of the one or more reference samples to the left and above the current video block may be used to predict the one or more samples in the current video block. As another example, when planar intra-prediction mode is selected, a linear function of the one or more reference samples to the left and above the current video block may be used to predict the one or more samples in the current video block. When DC intra-prediction and/or planar intra-prediction mode is selected, the luma sample region may be selected such that the one or more reference chroma samples to the left and above the current video block are enhanced.
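The mode-adaptive choice of luma sample region shape described above can be sketched as follows. The mode names, the default window size, and the (width, height) convention (N×1 meaning one row of N top neighbors and 1×N meaning one column of N left neighbors, matching the text) are illustrative assumptions.

```python
def select_luma_region(intra_mode, n=3):
    """Pick a luma sample region shape for cross plane filtering
    based on the intra prediction mode, per the adaptation above."""
    if intra_mode == 'vertical':
        return (n, 1)      # N x 1: N horizontal top neighboring luma samples
    if intra_mode == 'horizontal':
        return (1, n)      # 1 x N: N vertical left neighboring luma samples
    if intra_mode in ('dc', 'planar'):
        return 'L-shape'   # left and above reference samples together
    return (n, n)          # default: square 2-D window
```

A 1-D region keeps only the neighbors that carry the edge information relevant to the selected direction, which reduces filtering complexity relative to the full 2-D window.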
- A chroma enhancement indicator may be signaled, for example, to indicate whether to enable or disable chroma enhancement. The chroma enhancement indicator may be signaled, for example, at slice level, picture level, group of picture level, or sequence level.
- A chroma enhancement indicator may be signaled at a block level, for example, to indicate whether chroma enhancement processing is applied to a current coding block. The chroma enhancement indicator may be signaled, for example, when a current block is an intra coded block.
- One or more cross plane filter coefficients may be derived at an encoder and transmitted to a decoder. A filter coefficient training, e.g., for enhancement filters applied to intra coded blocks to improve reference samples, may use intra coded samples. One or more cross plane filter coefficients may be transmitted, for example, at slice level, picture level, group of picture level, or sequence level. The one or more cross plane filter coefficients may be transmitted at a block level.
-
FIG. 11 is an example of a block based hybrid video coding device 1100 (e.g., an encoder or a decoder) with predicted chroma sample enhancement. A prediction (e.g., intra prediction 1104 and/or inter prediction 1106) may be performed on an input video 1102. A prediction residual 1108 may be determined, for example, by subtracting a prediction block 1114 from an original block of the input video 1102. The prediction residual 1108 may be transformed, for example, using DCT and/or DST block transforms, and/or quantized 1110. A prediction residual 1108 may be transformed for luma and/or chroma components of the input video 1102. Improving the accuracy of predicted chroma samples may improve coding efficiency. Chroma enhancement 1112 may be implemented, for example, by using a cross plane (e.g., high pass) filter to enhance one or more predicted chroma samples. Chroma samples may be enhanced, for example, for object contours and/or edge areas.
-
P C_enh[x][y]=P C[x][y]+S L(x L ,y L)⊗cross_plane_filter (11) - where PC[x][y] may be a predicted chroma sample generated at chroma position (x,y). A predicted chroma sample PC[x][y] may be predicted using intra prediction or inter prediction. Although not shown in equation (11), a cross plane filter used in chroma enhancement may vary, for example, depending on the coding mode (e.g. intra or inter) for a current block. In an example, a plurality of (e.g. two) sets of cross plane filters may be trained separately. A first set of cross plane filters may be applied to one or more intra predicted chroma samples. A second set of cross plane filters may be applied to one or more inter predicted chroma samples.
- SL(xL, yL) may be reconstructed luma samples at position (xL, yL), where (xL, yL) may be calculated based on chroma format, for example, as indicated in Eq. 12:
-
x L=scaleX*x -
y L=scaleY*y (12) - where (scaleX, scaleY) may be (2,2), (2,1) and (1,1), respectively, for chroma format 4:2:0, 4:2:2 and 4:4:4.
- A cross plane filter may be applied to one or more neighboring corresponding reference luma samples, for example, to improve the accuracy of one or more predicted chroma samples.
- The cross plane filter may enhance the quality of the one or more predicted chroma samples. Prediction residual information may be smaller, leading to improved coding performance, for example, when enhanced predicted chroma samples are used to generate a prediction residual.
- Cross plane filtering may be used to derive one or more enhanced chroma samples. For example, a cross plane filter may be applied to a neighborhood of luma samples (e.g. a luma sample region) that correspond to a current chroma sample location.
-
FIG. 12 is an example of enhancing predicted chroma samples with cross plane filtering. A plurality of luma sample regions 1202A, 1202B, 1202C may be determined. Each of the plurality of luma sample regions 1202A, 1202B, 1202C may include a plurality of luma samples that correspond to a predicted chroma sample. A cross plane filter may be applied to the plurality of luma samples in a luma sample region 1202A, 1202B, 1202C (e.g., as shown in FIG. 12 ). The cross plane filter may be applied to reconstructed luma samples within a current block (e.g. as indicated by vertical striped circles). The current block may be an intra-coded video block. An output may be determined. The output of applying the cross plane filter may be an offset, for example given by Eq. 11. An enhanced predicted chroma sample may be determined, for example, by adding the offset to a corresponding predicted chroma sample to generate an enhanced chroma sample. - A cross plane filter may be applied on luma samples that are not yet reconstructed 1208. Luma samples that are not yet reconstructed 1208 may be padded, for example, using reconstructed luma samples that neighbor (e.g., to the left or top of) the not yet reconstructed
luma samples 1208. For example, an unavailable luma sample may be replaced by a neighboring reconstructed luma sample. - Separate (e.g., different) filters may be applied, for example, depending on a prediction mode (e.g. intra or inter prediction). Other techniques (e.g. procedures) may be used to classify and apply different cross plane filters. As an example, a cross plane filter may be further classified or subclassified for applicability, for example, depending on whether integer or fractional (e.g. half or quarter) pixel motion vectors are used. A cross plane filter may be classified and/or subclassified for applicability, for example, depending on which reference picture is used in inter prediction. The cross plane filter may be adaptively selected and/or applied based on these and other coding parameters.
-
FIG. 13 is an example of a block based hybrid video coding device 1300 (e.g., an encoder or a decoder) with chroma enhancement on reconstructed chroma samples. For example, chroma enhancement may be applied after prediction and reconstruction. A reconstructed block 1304 (e.g. before in-loop filtering) of an input video 1302 may be generated, for example, by adding a reconstructed residual block from the inverse quantization/inverse transform 1306 to a prediction block 1308. Enhancement of reconstructed chroma samples may improve the overall picture quality and/or may improve coding efficiency of the following blocks or pictures. Chroma enhancement 1312 may be implemented, for example, by applying a cross plane filter. Applying the cross plane filter may enhance one or more reconstructed chroma samples, e.g., at object contours and/or edge areas. The one or more reconstructed chroma samples may be enhanced before or after in-loop filtering.
-
S C_enh[x][y]=S C[x][y]+S L(x L ,y L)⊗cross_plane_filter (13)
-
FIG. 14 is an example of enhancement of one or more chroma reconstructed samples with cross plane filtering. - A plurality of
luma sample regions 1402, 1404 may be determined. A first luma sample region 1402 may include a plurality of reconstructed luma samples from a current block 1410. A second luma sample region 1404 may include a plurality of reconstructed luma samples from the current block 1410 and a plurality of reconstructed luma samples from one or more previous blocks. - A cross plane filter may be applied to one or more reconstructed luma samples SL (xL, yL). One or more reconstructed chroma samples SC[x][y] of a
current block 1410 may be enhanced, for example, by applying 1406, 1408 (e.g., adding) the output of the selected and applied cross plane filter to a corresponding reconstructed chroma sample to generate one or more enhanced reconstructed chroma samples. The first luma sample region 1402 may include one or more reconstructed luma samples from a current block 1410 and one or more reconstructed luma samples from one or more previous blocks.
- Signaling of filter coefficients may be performed, for example, as described herein. One or more sets of cross plane filters may be signaled in the bitstream. The one or more sets of cross plane tilters may be signaled based on the filter classification methods utilized. A number of filter sets to be signaled may be denoted as N. The filter coefficients of N sets of cross plane filters may be transmitted over slice level, picture level, group of picture level, or sequence level. A decoder may select one or more appropriate cross plane filters based on the coding mode, motion vector precision, and/or a reference picture of the current block. The decoder may apply one or more appropriate cross plane filters, for example, based on the coding mode, motion vector precision, and/or reference picture of the current block.
-
FIG. 15A is a diagram of an example communications system 100 in which one or more disclosed embodiments may be implemented. The communications system 100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users. The communications system 100 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth. For example, the communications system 100 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and the like. - As shown in
FIG. 15A, the communications system 100 may include wireless transmit/receive units (WTRUs) 102 a, 102 b, 102 c, and/or 102 d (which generally or collectively may be referred to as WTRU 102), a radio access network (RAN) 103/104/105, a core network 106/107/109, a public switched telephone network (PSTN) 108, the Internet 110, and other networks 112, though it will be appreciated that the disclosed embodiments contemplate any number of WTRUs, base stations, networks, and/or network elements. Each of the WTRUs 102 a, 102 b, 102 c, 102 d may be any type of device configured to operate and/or communicate in a wireless environment. - The
communications system 100 may also include a base station 114 a and a base station 114 b. Each of the base stations 114 a, 114 b may be any type of device configured to wirelessly interface with at least one of the WTRUs 102 a, 102 b, 102 c, 102 d to facilitate access to one or more communication networks, such as the core network 106/107/109, the Internet 110, and/or the networks 112. By way of example, the base stations 114 a, 114 b may be a base transceiver station (BTS), a Node-B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and the like. While the base stations 114 a, 114 b are each depicted as a single element, it will be appreciated that the base stations 114 a, 114 b may include any number of interconnected base stations and/or network elements. - The base station 114 a may be part of the
RAN 103/104/105, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc. The base station 114 a and/or the base station 114 b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown). The cell may further be divided into cell sectors. For example, the cell associated with the base station 114 a may be divided into three sectors. Thus, in one embodiment, the base station 114 a may include three transceivers, e.g., one for each sector of the cell. In another embodiment, the base station 114 a may employ multiple-input multiple-output (MIMO) technology and, therefore, may utilize multiple transceivers for each sector of the cell. - The
base stations 114 a, 114 b may communicate with one or more of the WTRUs 102 a, 102 b, 102 c, 102 d over an air interface 115/116/117, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, etc.). The air interface 115/116/117 may be established using any suitable radio access technology (RAT). - More specifically, as noted above, the
communications system 100 may be a multiple access system and may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like. For example, the base station 114 a in the RAN 103/104/105 and the WTRUs 102 a, 102 b, 102 c may establish the air interface 115/116/117 using wideband CDMA (WCDMA). WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+). HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA). - In another embodiment, the base station 114 a and the
WTRUs 102 a, 102 b, 102 c may establish the air interface 115/116/117 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A). - In other embodiments, the base station 114 a and the
WTRUs 102 a, 102 b, 102 c may implement radio technologies such as CDMA2000 1×, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like. - The
base station 114 b in FIG. 15A may be a wireless router, Home Node B, Home eNode B, or access point, for example, and may utilize any suitable RAT for facilitating wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and the like. In one embodiment, the base station 114 b and the WTRUs 102 c, 102 d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN). In another embodiment, the base station 114 b and the WTRUs 102 c, 102 d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN). In yet another embodiment, the base station 114 b and the WTRUs 102 c, 102 d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, etc.) to establish a picocell or femtocell. As shown in FIG. 15A, the base station 114 b may have a direct connection to the Internet 110. Thus, the base station 114 b may not be required to access the Internet 110 via the core network 106/107/109. - The
RAN 103/104/105 may be in communication with the core network 106/107/109, which may be any type of network configured to provide voice, data, applications, and/or voice over internet protocol (VoIP) services to one or more of the WTRUs 102 a, 102 b, 102 c, 102 d. For example, the core network 106/107/109 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication. Although not shown in FIG. 15A, it will be appreciated that the RAN 103/104/105 and/or the core network 106/107/109 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 103/104/105 or a different RAT. For example, in addition to being connected to the RAN 103/104/105, which may be utilizing an E-UTRA radio technology, the core network 106/107/109 may also be in communication with another RAN (not shown) employing a GSM radio technology. - The
core network 106/107/109 may also serve as a gateway for the WTRUs 102 a, 102 b, 102 c, 102 d to access the PSTN 108, the Internet 110, and/or other networks 112. The PSTN 108 may include circuit-switched telephone networks that provide plain old telephone service (POTS). The Internet 110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the transmission control protocol (TCP), user datagram protocol (UDP) and the internet protocol (IP) in the TCP/IP internet protocol suite. The networks 112 may include wired or wireless communications networks owned and/or operated by other service providers. For example, the networks 112 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 103/104/105 or a different RAT. - One or more of the
WTRUs 102 a, 102 b, 102 c, 102 d in the communications system 100 may include multi-mode capabilities, e.g., the WTRUs 102 a, 102 b, 102 c, 102 d may include multiple transceivers for communicating with different wireless networks over different wireless links. For example, the WTRU 102 c shown in FIG. 15A may be configured to communicate with the base station 114 a, which may employ a cellular-based radio technology, and with the base station 114 b, which may employ an IEEE 802 radio technology. -
FIG. 15B is a system diagram of an example WTRU 102. As shown in FIG. 15B, the WTRU 102 may include a processor 118, a transceiver 120, a transmit/receive element 122, a speaker/microphone 124, a keypad 126, a display/touchpad 128, non-removable memory 130, removable memory 132, a power source 134, a global positioning system (GPS) chipset 136, and other peripherals 138. It will be appreciated that the WTRU 102 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment. Also, embodiments contemplate that the base stations 114 a and 114 b, and/or the nodes that base stations 114 a and 114 b may represent, such as but not limited to a base transceiver station (BTS), a Node-B, a site controller, an access point (AP), a home node-B, an evolved home node-B (eNodeB), a home evolved node-B (HeNB), a home evolved node-B gateway, and proxy nodes, among others, may include one or more of the elements depicted in FIG. 15B and described herein. - The
processor 118 may be a general purpose processor, a special purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Array (FPGA) circuits, any other type of integrated circuit (IC), a state machine, and the like. The processor 118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 102 to operate in a wireless environment. The processor 118 may be coupled to the transceiver 120, which may be coupled to the transmit/receive element 122. While FIG. 15B depicts the processor 118 and the transceiver 120 as separate components, it will be appreciated that the processor 118 and the transceiver 120 may be integrated together in an electronic package or chip. - The transmit/receive
element 122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 114 a) over the air interface 115/116/117. For example, in one embodiment, the transmit/receive element 122 may be an antenna configured to transmit and/or receive RF signals. In another embodiment, the transmit/receive element 122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example. In yet another embodiment, the transmit/receive element 122 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 122 may be configured to transmit and/or receive any combination of wireless signals. - In addition, although the transmit/receive
element 122 is depicted in FIG. 15B as a single element, the WTRU 102 may include any number of transmit/receive elements 122. More specifically, the WTRU 102 may employ MIMO technology. Thus, in one embodiment, the WTRU 102 may include two or more transmit/receive elements 122 (e.g., multiple antennas) for transmitting and receiving wireless signals over the air interface 115/116/117. - The
transceiver 120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 122 and to demodulate the signals that are received by the transmit/receive element 122. As noted above, the WTRU 102 may have multi-mode capabilities. Thus, the transceiver 120 may include multiple transceivers for enabling the WTRU 102 to communicate via multiple RATs, such as UTRA and IEEE 802.11, for example. - The
processor 118 of the WTRU 102 may be coupled to, and may receive user input data from, the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128 (e.g., a liquid crystal display (LCD) display unit or organic light-emitting diode (OLED) display unit). The processor 118 may also output user data to the speaker/microphone 124, the keypad 126, and/or the display/touchpad 128. In addition, the processor 118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 130 and/or the removable memory 132. The non-removable memory 130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 118 may access information from, and store data in, memory that is not physically located on the WTRU 102, such as on a server or a home computer (not shown). - The
processor 118 may receive power from the power source 134, and may be configured to distribute and/or control the power to the other components in the WTRU 102. The power source 134 may be any suitable device for powering the WTRU 102. For example, the power source 134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like. - The
processor 118 may also be coupled to the GPS chipset 136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 102. In addition to, or in lieu of, the information from the GPS chipset 136, the WTRU 102 may receive location information over the air interface 115/116/117 from a base station (e.g., base stations 114 a, 114 b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment. - The
processor 118 may further be coupled to other peripherals 138, which may include one or more software and/or hardware modules that provide additional features, functionality and/or wired or wireless connectivity. For example, the peripherals 138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands-free headset, a Bluetooth® module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like. -
FIG. 15C is a system diagram of the RAN 103 and the core network 106 according to an embodiment. As noted above, the RAN 103 may employ a UTRA radio technology to communicate with the WTRUs 102 a, 102 b, 102 c over the air interface 115. The RAN 103 may also be in communication with the core network 106. As shown in FIG. 15C, the RAN 103 may include Node-Bs 140 a, 140 b, 140 c, which may each include one or more transceivers for communicating with the WTRUs 102 a, 102 b, 102 c over the air interface 115. The Node-Bs 140 a, 140 b, 140 c may each be associated with a particular cell (not shown) within the RAN 103. The RAN 103 may also include RNCs 142 a, 142 b. It will be appreciated that the RAN 103 may include any number of Node-Bs and RNCs while remaining consistent with an embodiment. - As shown in
FIG. 15C, the Node-Bs 140 a, 140 b may be in communication with the RNC 142 a. Additionally, the Node-B 140 c may be in communication with the RNC 142 b. The Node-Bs 140 a, 140 b, 140 c may communicate with the respective RNCs 142 a, 142 b via an Iub interface. The RNCs 142 a, 142 b may be in communication with one another via an Iur interface. Each of the RNCs 142 a, 142 b may be configured to control the respective Node-Bs 140 a, 140 b, 140 c to which it is connected. In addition, each of the RNCs 142 a, 142 b may be configured to carry out or support other functionality, such as outer loop power control, load control, admission control, packet scheduling, handover control, macrodiversity, security functions, data encryption, and the like. - The
core network 106 shown in FIG. 15C may include a media gateway (MGW) 144, a mobile switching center (MSC) 146, a serving GPRS support node (SGSN) 148, and/or a gateway GPRS support node (GGSN) 150. While each of the foregoing elements are depicted as part of the core network 106, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator. - The
RNC 142 a in the RAN 103 may be connected to the MSC 146 in the core network 106 via an IuCS interface. The MSC 146 may be connected to the MGW 144. The MSC 146 and the MGW 144 may provide the WTRUs 102 a, 102 b, 102 c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102 a, 102 b, 102 c and traditional land-line communications devices. - The
RNC 142 a in the RAN 103 may also be connected to the SGSN 148 in the core network 106 via an IuPS interface. The SGSN 148 may be connected to the GGSN 150. The SGSN 148 and the GGSN 150 may provide the WTRUs 102 a, 102 b, 102 c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102 a, 102 b, 102 c and IP-enabled devices. - As noted above, the
core network 106 may also be connected to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers. -
FIG. 15D is a system diagram of the RAN 104 and the core network 107 according to an embodiment. As noted above, the RAN 104 may employ an E-UTRA radio technology to communicate with the WTRUs 102 a, 102 b, 102 c over the air interface 116. The RAN 104 may also be in communication with the core network 107. - The
RAN 104 may include eNode-Bs 160 a, 160 b, 160 c, though it will be appreciated that the RAN 104 may include any number of eNode-Bs while remaining consistent with an embodiment. The eNode-Bs 160 a, 160 b, 160 c may each include one or more transceivers for communicating with the WTRUs 102 a, 102 b, 102 c over the air interface 116. In one embodiment, the eNode-Bs 160 a, 160 b, 160 c may implement MIMO technology. Thus, the eNode-B 160 a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102 a. - Each of the eNode-
Bs 160 a, 160 b, 160 c may be associated with a particular cell (not shown) and may be configured to handle radio resource management decisions, handover decisions, scheduling of users in the uplink and/or downlink, and the like. As shown in FIG. 15D, the eNode-Bs 160 a, 160 b, 160 c may communicate with one another over an X2 interface. - The
core network 107 shown in FIG. 15D may include a mobility management gateway (MME) 162, a serving gateway 164, and a packet data network (PDN) gateway 166. While each of the foregoing elements are depicted as part of the core network 107, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator. - The
MME 162 may be connected to each of the eNode-Bs 160 a, 160 b, 160 c in the RAN 104 via an S1 interface and may serve as a control node. For example, the MME 162 may be responsible for authenticating users of the WTRUs 102 a, 102 b, 102 c, bearer activation/deactivation, selecting a particular serving gateway during an initial attach of the WTRUs 102 a, 102 b, 102 c, and the like. The MME 162 may also provide a control plane function for switching between the RAN 104 and other RANs (not shown) that employ other radio technologies, such as GSM or WCDMA. - The serving
gateway 164 may be connected to each of the eNode-Bs 160 a, 160 b, 160 c in the RAN 104 via the S1 interface. The serving gateway 164 may generally route and forward user data packets to/from the WTRUs 102 a, 102 b, 102 c. The serving gateway 164 may also perform other functions, such as anchoring user planes during inter-eNode B handovers, triggering paging when downlink data is available for the WTRUs 102 a, 102 b, 102 c, managing and storing contexts of the WTRUs 102 a, 102 b, 102 c, and the like. - The serving
gateway 164 may also be connected to the PDN gateway 166, which may provide the WTRUs 102 a, 102 b, 102 c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102 a, 102 b, 102 c and IP-enabled devices. - The
core network 107 may facilitate communications with other networks. For example, the core network 107 may provide the WTRUs 102 a, 102 b, 102 c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102 a, 102 b, 102 c and traditional land-line communications devices. For example, the core network 107 may include, or may communicate with, an IP gateway (e.g., an IP multimedia subsystem (IMS) server) that serves as an interface between the core network 107 and the PSTN 108. In addition, the core network 107 may provide the WTRUs 102 a, 102 b, 102 c with access to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers. -
FIG. 15E is a system diagram of the RAN 105 and the core network 109 according to an embodiment. The RAN 105 may be an access service network (ASN) that employs IEEE 802.16 radio technology to communicate with the WTRUs 102 a, 102 b, 102 c over the air interface 117. As will be further discussed below, the communication links between the different functional entities of the WTRUs 102 a, 102 b, 102 c, the RAN 105, and the core network 109 may be defined as reference points. - As shown in
FIG. 15E, the RAN 105 may include base stations 180 a, 180 b, 180 c and an ASN gateway 182, though it will be appreciated that the RAN 105 may include any number of base stations and ASN gateways while remaining consistent with an embodiment. The base stations 180 a, 180 b, 180 c may each be associated with a particular cell (not shown) in the RAN 105 and may each include one or more transceivers for communicating with the WTRUs 102 a, 102 b, 102 c over the air interface 117. In one embodiment, the base stations 180 a, 180 b, 180 c may implement MIMO technology. Thus, the base station 180 a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 102 a. The ASN gateway 182 may serve as a traffic aggregation point and may be responsible for paging, caching of subscriber profiles, routing to the core network 109, and the like. - The
air interface 117 between the WTRUs 102 a, 102 b, 102 c and the RAN 105 may be defined as an R1 reference point that implements the IEEE 802.16 specification. In addition, each of the WTRUs 102 a, 102 b, 102 c may establish a logical interface (not shown) with the core network 109. The logical interface between the WTRUs 102 a, 102 b, 102 c and the core network 109 may be defined as an R2 reference point, which may be used for authentication, authorization, IP host configuration management, and/or mobility management. - The communication link between each of the
base stations 180 a, 180 b, 180 c may be defined as an R8 reference point that includes protocols for facilitating WTRU handovers and the transfer of data between base stations. The communication link between the base stations 180 a, 180 b, 180 c and the ASN gateway 182 may be defined as an R6 reference point. The R6 reference point may include protocols for facilitating mobility management based on mobility events associated with each of the WTRUs 102 a, 102 b, 102 c. - As shown in
FIG. 15E, the RAN 105 may be connected to the core network 109. The communication link between the RAN 105 and the core network 109 may be defined as an R3 reference point that includes protocols for facilitating data transfer and mobility management capabilities, for example. The core network 109 may include a mobile IP home agent (MIP-HA) 184, an authentication, authorization, accounting (AAA) server 186, and a gateway 188. While each of the foregoing elements are depicted as part of the core network 109, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator. - The MIP-HA may be responsible for IP address management, and may enable the WTRUs 102 a, 102 b, 102 c to roam between different ASNs and/or different core networks. The MIP-
HA 184 may provide the WTRUs 102 a, 102 b, 102 c with access to packet-switched networks, such as the Internet 110, to facilitate communications between the WTRUs 102 a, 102 b, 102 c and IP-enabled devices. The AAA server 186 may be responsible for user authentication and for supporting user services. The gateway 188 may facilitate interworking with other networks. For example, the gateway 188 may provide the WTRUs 102 a, 102 b, 102 c with access to circuit-switched networks, such as the PSTN 108, to facilitate communications between the WTRUs 102 a, 102 b, 102 c and traditional land-line communications devices. In addition, the gateway 188 may provide the WTRUs 102 a, 102 b, 102 c with access to the networks 112, which may include other wired or wireless networks that are owned and/or operated by other service providers. - Although not shown in
FIG. 15E, it will be appreciated that the RAN 105 may be connected to other ASNs and the core network 109 may be connected to other core networks. The communication link between the RAN 105 and the other ASNs may be defined as an R4 reference point, which may include protocols for coordinating the mobility of the WTRUs 102 a, 102 b, 102 c between the RAN 105 and the other ASNs. The communication link between the core network 109 and the other core networks may be defined as an R5 reference point, which may include protocols for facilitating interworking between home core networks and visited core networks. - Systems, methods, and instrumentalities have been disclosed for enhanced chroma coding using cross plane filtering. A reference, predicted, and/or reconstructed chroma sample may be enhanced, for example, using information derived from cross plane filtering of a 1-D or 2-D M×N window of luma samples (a filter support region) corresponding to the reference, predicted, or reconstructed chroma sample, respectively. Luma samples may be reconstructed or padded. A filter support region may be adaptively selected, for example, based on a directional intra prediction mode. Cross plane filters may be classified, e.g., for applicability, and may be adaptively selected, for example, based on a filter support region, whether intra or inter prediction mode is used for a current block, whether integer or fractional pixel motion vectors are used, and/or whether a reference picture is used in inter prediction. Signaling may be provided to a decoder, for example, to indicate at least one of whether chroma enhancement is enabled, whether chroma enhancement is applied to a current block, a cross plane filter type, a cross plane filter (e.g., a set of filters), and corresponding cross plane filter coefficients. A decoder may select a cross plane filter to apply to a filter support region based on received signaling.
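The signaling summarized above (whether chroma enhancement is enabled, a cross plane filter type, and a number of filter sets with their coefficients) may be sketched as a toy slice-level parser. All syntax element names, bit widths, and the MSB-first BitReader below are illustrative assumptions, not the normative bitstream syntax.

```python
class BitReader:
    """Minimal MSB-first bit reader over a byte string (illustration only)."""
    def __init__(self, data):
        self.bits = "".join(f"{b:08b}" for b in data)
        self.pos = 0

    def read_uint(self, n):
        v = int(self.bits[self.pos:self.pos + n], 2)
        self.pos += n
        return v

def _read_s8(reader):
    # 8-bit two's-complement filter coefficient
    c = reader.read_uint(8)
    return c - 256 if c >= 128 else c

def parse_cross_plane_params(reader, coeffs_per_filter=9):
    """Parse a hypothetical payload: 1-bit enable flag, 2-bit filter type,
    4-bit count N, then N filters of signed 8-bit coefficients."""
    params = {"enabled": reader.read_uint(1) == 1}
    if params["enabled"]:
        params["filter_type"] = reader.read_uint(2)
        n = reader.read_uint(4)  # number N of signaled filter sets
        params["filters"] = [
            [_read_s8(reader) for _ in range(coeffs_per_filter)]
            for _ in range(n)
        ]
    return params
```

When the enable flag is zero, parsing stops immediately, mirroring the idea that chroma enhancement may be switched off for a slice or picture without any further syntax cost.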
- Although features and elements are described above in particular combinations, one of ordinary skill in the art will appreciate that each feature or element can be used alone or in any combination with the other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable media include electronic signals (transmitted over wired or wireless connections) and computer-readable storage media. Examples of computer-readable storage media include, but are not limited to, a read only memory (ROM), a random access memory (RAM), a register, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.
Claims (21)
1-26. (canceled)
27. A method comprising:
obtaining a cross-plane filter indication, wherein the cross-plane filter indication is configured to indicate that a chroma sample is to be predicted using an offset;
determining a luma region associated with predicting the chroma sample, wherein the luma region comprises a plurality of luma samples;
deriving the offset based on the plurality of luma samples in the determined luma region;
predicting the chroma sample using the derived offset; and
reconstructing a video block based on the predicted chroma sample.
28. The method of claim 27, wherein the luma region is determined based on at least one of an intra prediction mode or an inter prediction mode.
29. The method of claim 27, wherein the luma region is determined such that the luma region comprises at least one of a luma sample that neighbors a corresponding chroma sample or the luma sample that is collocated with the corresponding chroma sample.
30. The method of claim 27, wherein the video block comprises the chroma sample.
31. An apparatus comprising:
a processor configured to:
obtain a cross-plane filter indication, wherein the cross-plane filter indication is configured to indicate that a chroma sample is to be predicted using an offset;
determine a luma region associated with predicting the chroma sample, the luma region comprising a plurality of luma samples;
derive the offset based on the plurality of luma samples in the determined luma region;
predict the chroma sample using the derived offset; and
reconstruct a video block based on the predicted chroma sample.
32. The apparatus of claim 31, wherein the luma region is determined based on at least one of an intra prediction mode or an inter prediction mode.
33. The apparatus of claim 31, wherein the luma region is determined such that the luma region comprises at least one of a luma sample that neighbors a corresponding chroma sample or the luma sample that is collocated with the corresponding chroma sample.
34. The apparatus of claim 31, wherein the video block comprises the chroma sample.
35. The apparatus of claim 31, wherein the apparatus further comprises a memory.
36. A method comprising:
determining a luma region associated with predicting a chroma sample, wherein the luma region comprises a plurality of luma samples;
deriving an offset based on the plurality of luma samples in the determined luma region;
generating a cross-plane filter indication, wherein the cross-plane filter indication is configured to indicate that the chroma sample is to be predicted using the derived offset; and
including the cross-plane filter indication in a bitstream that is representative of a video signal.
37. The method of claim 36, wherein the luma region is determined based on at least one of an intra prediction mode or an inter prediction mode.
38. The method of claim 36, wherein the luma region is determined such that the luma region comprises at least one of a luma sample that neighbors a corresponding chroma sample or the luma sample that is collocated with the corresponding chroma sample.
39. The method of claim 36, wherein the method comprises:
generating a cross-plane filter number indication, wherein the cross-plane filter number indication indicates a number of cross-plane filters to be signaled; and
including the cross-plane filter number indication in the bitstream.
40. The method of claim 36, wherein the cross-plane filter indication is included in a slice level or a picture level.
41. An apparatus comprising:
a processor configured to:
determine a luma region associated with predicting a chroma sample, wherein the luma region comprises a plurality of luma samples;
derive an offset based on the plurality of luma samples in the determined luma region;
generate a cross-plane filter indication, wherein the cross-plane filter indication is configured to indicate that the chroma sample is to be predicted using the derived offset; and
include the cross-plane filter indication in a bitstream that is representative of a video signal.
42. The apparatus of claim 41, wherein the luma region is determined based on at least one of an intra prediction mode or an inter prediction mode.
43. The apparatus of claim 41, wherein the luma region is determined such that the luma region comprises at least one of a luma sample that neighbors a corresponding chroma sample or the luma sample that is collocated with the corresponding chroma sample.
44. The apparatus of claim 41, wherein the processor is configured to:
generate a cross-plane filter number indication, wherein the cross-plane filter number indication indicates a number of cross-plane filters to be signaled; and
include the cross-plane filter number indication in the bitstream.
45. The apparatus of claim 41, wherein the cross-plane filter indication is included in a slice level or a picture level.
46. The apparatus of claim 41, wherein the apparatus further comprises a memory.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/848,190 US20220329831A1 (en) | 2015-07-08 | 2022-06-23 | Enhanced chroma coding using cross plane filtering |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562190008P | 2015-07-08 | 2015-07-08 | |
PCT/US2016/041445 WO2017007989A1 (en) | 2015-07-08 | 2016-07-08 | Enhanced chroma coding using cross plane filtering |
US201815742836A | 2018-01-08 | 2018-01-08 | |
US17/848,190 US20220329831A1 (en) | 2015-07-08 | 2022-06-23 | Enhanced chroma coding using cross plane filtering |
Related Parent Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2016/041445 Continuation WO2017007989A1 (en) | 2015-07-08 | 2016-07-08 | Enhanced chroma coding using cross plane filtering |
US15/742,836 Continuation US11438605B2 (en) | 2015-07-08 | 2016-07-08 | Enhanced chroma coding using cross plane filtering |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220329831A1 true US20220329831A1 (en) | 2022-10-13 |
Family
ID=56555775
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/742,836 Active US11438605B2 (en) | 2015-07-08 | 2016-07-08 | Enhanced chroma coding using cross plane filtering |
US17/848,190 Pending US20220329831A1 (en) | 2015-07-08 | 2022-06-23 | Enhanced chroma coding using cross plane filtering |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/742,836 Active US11438605B2 (en) | 2015-07-08 | 2016-07-08 | Enhanced chroma coding using cross plane filtering |
Country Status (7)
Country | Link |
---|---|
US (2) | US11438605B2 (en) |
EP (1) | EP3320684A1 (en) |
JP (3) | JP2018527784A (en) |
KR (4) | KR102570911B1 (en) |
CN (2) | CN107836116B (en) |
TW (1) | TWI715598B (en) |
WO (1) | WO2017007989A1 (en) |
Families Citing this family (51)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10368107B2 (en) * | 2016-08-15 | 2019-07-30 | Qualcomm Incorporated | Intra video coding using a decoupled tree structure |
US10999602B2 (en) | 2016-12-23 | 2021-05-04 | Apple Inc. | Sphere projected motion estimation/compensation and mode decision |
US11259046B2 (en) | 2017-02-15 | 2022-02-22 | Apple Inc. | Processing of equirectangular object data to compensate for distortion by spherical projections |
US10924747B2 (en) | 2017-02-27 | 2021-02-16 | Apple Inc. | Video coding techniques for multi-view video |
US11093752B2 (en) | 2017-06-02 | 2021-08-17 | Apple Inc. | Object tracking in multi-view video |
WO2019059107A1 (en) * | 2017-09-20 | 2019-03-28 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Encoding device, decoding device, encoding method and decoding method |
WO2019066384A1 (en) * | 2017-09-26 | 2019-04-04 | 삼성전자 주식회사 | Video decoding method and device using cross-component prediction, and video encoding method and device using cross-component prediction |
EP3737093A4 (en) * | 2017-11-28 | 2022-02-09 | Electronics and Telecommunications Research Institute | Image encoding/decoding method and device, and recording medium stored with bitstream |
CN115695790A (en) * | 2018-01-15 | 2023-02-03 | 三星电子株式会社 | Encoding method and apparatus thereof, and decoding method and apparatus thereof |
EP4221201A1 (en) | 2018-01-29 | 2023-08-02 | InterDigital VC Holdings, Inc. | Encoding and decoding with refinement of the reconstructed picture |
CN112690000B (en) | 2018-09-21 | 2022-02-18 | 华为技术有限公司 | Apparatus and method for inverse quantization |
WO2020057506A1 (en) * | 2018-09-21 | 2020-03-26 | 华为技术有限公司 | Prediction method and device for chroma block |
MX2021003853A (en) * | 2018-10-05 | 2021-09-08 | Huawei Tech Co Ltd | Intra prediction method and device. |
CN112956199B (en) | 2018-11-06 | 2023-07-28 | Beijing Bytedance Network Technology Co., Ltd. | Simplified parameter derivation for intra prediction |
WO2020108591A1 (en) | 2018-12-01 | 2020-06-04 | Beijing Bytedance Network Technology Co., Ltd. | Parameter derivation for intra prediction |
JP2022514824A (en) * | 2018-12-07 | 2022-02-16 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Encoders, decoders, and methods to enhance the computational robustness of intercomponent linear model parameters |
BR112021010428A2 (en) | 2018-12-07 | 2021-08-24 | Beijing Bytedance Network Technology Co., Ltd. | Method for processing video, apparatus in a video system, and computer program product |
CN116456083A (en) * | 2018-12-25 | 2023-07-18 | Oppo广东移动通信有限公司 | Decoding prediction method, device and computer storage medium |
SG11202108209YA (en) | 2019-02-22 | 2021-08-30 | Beijing Bytedance Network Technology Co Ltd | Neighbouring sample selection for intra prediction |
MX2021009894A (en) | 2019-02-24 | 2022-05-18 | Beijing Bytedance Network Tech Co Ltd | Parameter derivation for intra prediction. |
WO2020192642A1 (en) * | 2019-03-24 | 2020-10-01 | Beijing Bytedance Network Technology Co., Ltd. | Conditions in parameter derivation for intra prediction |
CN113632474B (en) * | 2019-03-26 | 2022-12-09 | Beijing Bytedance Network Technology Co., Ltd. | Parameter derivation for inter-prediction |
CN117221532B (en) * | 2019-04-09 | 2024-03-29 | Beijing Dajia Internet Information Technology Co., Ltd. | Method, apparatus and storage medium for video decoding |
WO2020207493A1 (en) | 2019-04-12 | 2020-10-15 | Beijing Bytedance Network Technology Co., Ltd. | Transform coding based on matrix-based intra prediction |
KR20210145757A (en) | 2019-04-16 | 2021-12-02 | Beijing Bytedance Network Technology Co., Ltd. | Matrix derivation in intra coding mode |
JP2022531147A (en) | 2019-05-01 | 2022-07-06 | Beijing Bytedance Network Technology Co., Ltd. | Matrix-based intra-prediction with filtering |
WO2020221372A1 (en) | 2019-05-01 | 2020-11-05 | Beijing Bytedance Network Technology Co., Ltd. | Context coding for matrix-based intra prediction |
CN117412039A (en) | 2019-05-22 | 2024-01-16 | Beijing Bytedance Network Technology Co., Ltd. | Matrix-based intra prediction using upsampling |
EP3959876A4 (en) | 2019-05-31 | 2022-06-29 | Beijing Bytedance Network Technology Co., Ltd. | Restricted upsampling process in matrix-based intra prediction |
WO2020244610A1 (en) | 2019-06-05 | 2020-12-10 | Beijing Bytedance Network Technology Co., Ltd. | Context determination for matrix-based intra prediction |
EP3987778B1 (en) * | 2019-06-21 | 2023-11-22 | Vid Scale, Inc. | Precision refinement for motion compensation with optical flow |
WO2021025167A1 (en) * | 2019-08-08 | 2021-02-11 | Panasonic Intellectual Property Corporation Of America | System and method for video coding |
KR20220038670A (en) * | 2019-08-08 | 2022-03-29 | 파나소닉 인텔렉츄얼 프로퍼티 코포레이션 오브 아메리카 | Systems and methods for video coding |
US11197030B2 (en) * | 2019-08-08 | 2021-12-07 | Panasonic Intellectual Property Corporation Of America | System and method for video coding |
WO2021025168A1 (en) * | 2019-08-08 | 2021-02-11 | Panasonic Intellectual Property Corporation Of America | System and method for video coding |
CN114270850A (en) | 2019-08-08 | 2022-04-01 | Panasonic Intellectual Property Corporation of America | System and method for video encoding |
GB2586484B (en) * | 2019-08-20 | 2023-03-08 | Canon Kk | A filter |
WO2021040458A1 (en) * | 2019-08-28 | 2021-03-04 | KT Corporation | Video signal processing method and device |
CA3152831A1 (en) * | 2019-08-29 | 2021-03-04 | Lg Electronics Inc. | Image encoding and decoding using adaptive loop and cross-component filtering |
WO2021049593A1 (en) | 2019-09-11 | 2021-03-18 | Panasonic Intellectual Property Corporation Of America | System and method for video coding |
JP7328445B2 (en) | 2019-09-19 | 2023-08-16 | Beijing Bytedance Network Technology Co., Ltd. | Derivation of reference sample positions in video coding |
KR20220065758A (en) | 2019-09-20 | 2022-05-20 | Beijing Bytedance Network Technology Co., Ltd. | Scaling process of coding blocks |
WO2021061814A1 (en) * | 2019-09-23 | 2021-04-01 | Vid Scale, Inc. | Joint component video frame filtering |
WO2021083188A1 (en) | 2019-10-28 | 2021-05-06 | Beijing Bytedance Network Technology Co., Ltd. | Syntax signaling and parsing based on colour component |
JP2022554307A (en) * | 2019-10-29 | 2022-12-28 | Beijing Bytedance Network Technology Co., Ltd. | Cross-component adaptive loop filter signaling |
CN113132739B (en) * | 2019-12-31 | 2022-11-01 | Hangzhou Hikvision Digital Technology Co., Ltd. | Boundary strength determination method, boundary strength determination device, boundary strength encoding and decoding device and equipment |
US11303936B2 (en) * | 2020-02-21 | 2022-04-12 | Tencent America LLC | Method and apparatus for filtering |
US11575909B2 (en) * | 2020-04-07 | 2023-02-07 | Tencent America LLC | Method and apparatus for video coding |
CN114007067B (en) * | 2020-07-28 | 2023-05-23 | Beijing Dajia Internet Information Technology Co., Ltd. | Method, apparatus and medium for decoding video signal |
WO2022035687A1 (en) * | 2020-08-13 | 2022-02-17 | Beijing Dajia Internet Information Technology Co., Ltd. | Chroma coding enhancement in cross-component sample adaptive offset |
WO2023097064A1 (en) * | 2021-11-25 | 2023-06-01 | Beijing Dajia Internet Information Technology Co., Ltd. | Cross-component sample adaptive offset |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080019605A1 (en) * | 2003-11-07 | 2008-01-24 | Sehoon Yea | Filtering Artifacts in Images with 3D Spatio-Temporal Fuzzy Filters |
US20110122308A1 (en) * | 2009-11-20 | 2011-05-26 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
US20130114696A1 (en) * | 2011-11-07 | 2013-05-09 | Futurewei Technologies, Co. | Angular Table for Improving Intra Prediction |
US20140314142A1 (en) * | 2011-11-04 | 2014-10-23 | Infobridge Pte. Ltd. | Apparatus of decoding video data |
US20140369426A1 (en) * | 2013-06-17 | 2014-12-18 | Qualcomm Incorporated | Inter-component filtering |
US20150003524A1 (en) * | 2012-01-13 | 2015-01-01 | Sharp Kabushiki Kaisha | Image decoding device, image encoding device, and data structure of encoded data |
US20150178892A1 (en) * | 2010-04-05 | 2015-06-25 | Samsung Electronics Co., Ltd. | Method and apparatus for performing interpolation based on transform and inverse transform |
US20150281687A1 (en) * | 2013-01-24 | 2015-10-01 | Sharp Kabushiki Kaisha | Image decoding apparatus and image coding apparatus |
US20150373330A1 (en) * | 2013-04-02 | 2015-12-24 | Chips & Media, Inc. | Method and apparatus for processing video |
Family Cites Families (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4701783A (en) | 1982-09-14 | 1987-10-20 | New York Institute Of Technology | Technique for encoding and decoding video with improved separation of chrominance and luminance |
US5450216A (en) * | 1994-08-12 | 1995-09-12 | International Business Machines Corporation | Color image gamut-mapping system with chroma enhancement at human-insensitive spatial frequencies |
US6462790B1 (en) * | 1999-03-12 | 2002-10-08 | Fortel Dtv, Inc. | Digital comb filter for decoding composite video signals |
US7102669B2 (en) | 2002-04-02 | 2006-09-05 | Freescale Semiconductor, Inc. | Digital color image pre-processing |
US7333544B2 (en) | 2003-07-16 | 2008-02-19 | Samsung Electronics Co., Ltd. | Lossless image encoding/decoding method and apparatus using inter-color plane prediction |
US7397515B2 (en) | 2004-01-30 | 2008-07-08 | Broadcom Corporation | Method and system for cross-chrominance removal using motion detection |
US7724949B2 (en) * | 2004-06-10 | 2010-05-25 | Qualcomm Incorporated | Advanced chroma enhancement |
WO2006108654A2 (en) | 2005-04-13 | 2006-10-19 | Universität Hannover | Method and apparatus for enhanced video coding |
TWI314720B (en) | 2005-05-31 | 2009-09-11 | Himax Tech Inc | 2d yc separation device and yc separation system |
CN101009842B (en) | 2006-01-11 | 2012-02-01 | Huawei Technologies Co., Ltd. | Method and device for value insertion in the hierarchical video compression |
KR101266168B1 (en) | 2006-08-16 | 2013-05-21 | Samsung Electronics Co., Ltd. | Method and apparatus for encoding, decoding video |
US8270472B2 (en) | 2007-11-09 | 2012-09-18 | Thomson Licensing | Methods and apparatus for adaptive reference filtering (ARF) of bi-predictive pictures in multi-view coded video |
KR20100133006A (en) | 2008-07-04 | 2010-12-20 | Toshiba Corporation | Dynamic image encoding/decoding method and device |
US8270466B2 (en) | 2008-10-03 | 2012-09-18 | Sony Corporation | Adaptive decimation filter |
CN101778371B (en) | 2009-01-09 | 2012-08-29 | 电信科学技术研究院 | Paging method and device |
JPWO2011033643A1 (en) | 2009-09-17 | 2013-02-07 | Toshiba Corporation | Video encoding method and video decoding method |
US8638342B2 (en) * | 2009-10-20 | 2014-01-28 | Apple Inc. | System and method for demosaicing image data using weighted gradients |
CN201726499U (en) | 2010-05-04 | 2011-01-26 | 武汉光华芯科技有限公司 | Composite video signal luminance and chrominance separation system |
CN101902653B (en) | 2010-06-25 | 2013-04-24 | 杭州爱威芯科技有限公司 | Luminance sample direction prediction-based in-field luminance and chrominance (YC) separation method |
US20120008687A1 (en) | 2010-07-06 | 2012-01-12 | Apple Inc. | Video coding using vector quantized deblocking filters |
US9154788B2 (en) | 2011-06-03 | 2015-10-06 | Panasonic Intellectual Property Corporation Of America | Image coding method and image decoding method |
US9693070B2 (en) * | 2011-06-24 | 2017-06-27 | Texas Instruments Incorporated | Luma-based chroma intra-prediction for video coding |
CN104980756A (en) * | 2011-06-28 | 2015-10-14 | 三星电子株式会社 | Video decoding method using offset adjustments according to pixel classification and apparatus therefor |
US9641866B2 (en) | 2011-08-18 | 2017-05-02 | Qualcomm Incorporated | Applying partition-based filters |
WO2013155662A1 (en) | 2012-04-16 | 2013-10-24 | Mediatek Singapore Pte. Ltd. | Methods and apparatuses of simplification for intra chroma lm mode |
WO2013164922A1 (en) | 2012-05-02 | 2013-11-07 | Sony Corporation | Image processing device and image processing method |
US20140086316A1 (en) | 2012-09-24 | 2014-03-27 | Louis Joseph Kerofsky | Video compression with color space scalability |
WO2014052731A2 (en) * | 2012-09-28 | 2014-04-03 | Vid Scale, Inc. | Cross-plane filtering for chroma signal enhancement in video coding |
JP6788346B2 (en) * | 2012-10-01 | 2020-11-25 | GE Video Compression, LLC | Scalable video coding using subpartition derivation of subblocks for prediction from the base layer |
US9357211B2 (en) * | 2012-12-28 | 2016-05-31 | Qualcomm Incorporated | Device and method for scalable and multiview/3D coding of video information |
US9503732B2 (en) | 2013-04-10 | 2016-11-22 | Arris Enterprises, Inc. | Re-sampling with phase offset adjustment for luma and chroma to select filters in scalable video coding |
US8810727B1 (en) * | 2013-05-07 | 2014-08-19 | Qualcomm Technologies, Inc. | Method for scaling channel of an image |
WO2015003753A1 (en) | 2013-07-12 | 2015-01-15 | Nokia Solutions And Networks Oy | Redirection of m2m devices |
US10129542B2 (en) * | 2013-10-17 | 2018-11-13 | Futurewei Technologies, Inc. | Reference pixel selection and filtering for intra coding of depth map |
WO2015062098A1 (en) | 2013-11-01 | 2015-05-07 | Huawei Technologies Co., Ltd. | Network selection method and core network device |
JP6750234B2 (en) | 2016-01-28 | 2020-09-02 | The Yokohama Rubber Co., Ltd. | Tire operation service system and method |
2016
- 2016-07-08 EP EP16745284.6A patent/EP3320684A1/en active Pending
- 2016-07-08 KR KR1020227037335A patent/KR102570911B1/en active IP Right Grant
- 2016-07-08 CN CN201680040052.9A patent/CN107836116B/en active Active
- 2016-07-08 CN CN202110824891.2A patent/CN113810691A/en active Pending
- 2016-07-08 JP JP2018500441A patent/JP2018527784A/en not_active Ceased
- 2016-07-08 KR KR1020237028450A patent/KR20230128138A/en not_active Application Discontinuation
- 2016-07-08 US US15/742,836 patent/US11438605B2/en active Active
- 2016-07-08 KR KR1020217025704A patent/KR102460912B1/en active IP Right Grant
- 2016-07-08 KR KR1020187002181A patent/KR102291835B1/en active IP Right Grant
- 2016-07-08 WO PCT/US2016/041445 patent/WO2017007989A1/en active Application Filing
- 2016-07-11 TW TW105121738A patent/TWI715598B/en active
2021
- 2021-03-22 JP JP2021047297A patent/JP7208288B2/en active Active
2022
- 2022-06-23 US US17/848,190 patent/US20220329831A1/en active Pending
2023
- 2023-01-05 JP JP2023000576A patent/JP2023030205A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
KR102291835B1 (en) | 2021-08-23 |
KR102570911B1 (en) | 2023-08-24 |
TWI715598B (en) | 2021-01-11 |
TW201724854A (en) | 2017-07-01 |
KR102460912B1 (en) | 2022-10-28 |
KR20230128138A (en) | 2023-09-01 |
CN113810691A (en) | 2021-12-17 |
CN107836116B (en) | 2021-08-06 |
JP2023030205A (en) | 2023-03-07 |
KR20180039052A (en) | 2018-04-17 |
JP2021103889A (en) | 2021-07-15 |
WO2017007989A1 (en) | 2017-01-12 |
EP3320684A1 (en) | 2018-05-16 |
US11438605B2 (en) | 2022-09-06 |
KR20210104929A (en) | 2021-08-25 |
JP7208288B2 (en) | 2023-01-18 |
KR20220146717A (en) | 2022-11-01 |
JP2018527784A (en) | 2018-09-20 |
CN107836116A (en) | 2018-03-23 |
US20180220138A1 (en) | 2018-08-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220329831A1 (en) | Enhanced chroma coding using cross plane filtering | |
US20210250592A1 (en) | Intra block copy mode for screen content coding | |
US20240015321A1 (en) | Systems and methods for spatial prediction | |
US20210274203A1 (en) | Systems and methods for rgb video coding enhancement | |
US10484686B2 (en) | Palette coding modes and palette flipping | |
US20230179795A1 (en) | 2023-06-08 | BI-PREDICTION FOR VIDEO CODING | |
US10469847B2 (en) | Inter-component de-correlation for video coding | |
US10516882B2 (en) | Intra-block copy searching | |
US10972731B2 (en) | Systems and methods for coding in super-block based video coding framework | |
US20170374384A1 (en) | Palette coding for non-4:4:4 screen content video | |
US9438898B2 (en) | Reference picture lists modification | |
US20150256828A1 (en) | Adaptive Upsampling For Multi-Layer Video Coding | |
US20190014333A1 (en) | Inter-layer prediction for scalable video coding | |
US20150288856A1 (en) | Temporal Filter For Denoising A High Dynamic Range Video |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |