CN105830440B - Color gamut scalable video coding device and method for the phase alignment of luma and chroma using interpolation - Google Patents

Color gamut scalable video coding device and method for the phase alignment of luma and chroma using interpolation

Info

Publication number
CN105830440B
CN105830440B (granted from application CN201480068000.3A)
Authority
CN
China
Prior art keywords
component
sampling location
chromatic
sampling
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201480068000.3A
Other languages
Chinese (zh)
Other versions
CN105830440A (en)
Inventor
叶琰 (Yan Ye)
董洁 (Jie Dong)
贺玉文 (Yuwen He)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vid Scale Inc
Original Assignee
Vid Scale Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vid Scale Inc filed Critical Vid Scale Inc
Priority to CN201810287571.6A (publication CN108337519B)
Publication of CN105830440A
Application granted
Publication of CN105830440B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/64 Circuits for processing colour signals
    • H04N 9/67 Circuits for processing colour signals for matrixing
    • H04N 19/597 Predictive coding specially adapted for multi-view video sequence encoding
    • H04N 1/64 Systems for the transmission or the storage of the colour picture signal; details therefor, e.g. coding or decoding means therefor
    • H04N 11/24 High-definition television systems
    • H04N 13/139 Format conversion, e.g. of frame-rate or size
    • H04N 13/15 Processing image signals for colour aspects of image signals
    • H04N 13/161 Encoding, multiplexing or demultiplexing different image signal components
    • H04N 19/186 Adaptive coding characterised by the coding unit being a colour or a chrominance component
    • H04N 19/30 Coding using hierarchical techniques, e.g. scalability
    • H04N 19/59 Predictive coding involving spatial sub-sampling or interpolation, e.g. alteration of picture size or resolution
    • H04N 19/80 Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
    • H04N 19/85 Coding using pre-processing or post-processing specially adapted for video compression

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)
  • Color Television Systems (AREA)

Abstract

A video coding device and methods are described for receiving a picture associated with a first color space, where the picture includes a first component at a first sampling location (C0), a second component at a second sampling location (L0), and the second component at a third sampling location (L4). A first interpolation filter is applied to the second component at the second sampling location (L0) and the second component at the third sampling location (L4) to determine the second component at the first sampling location (L(C0)). The second component at the first sampling location (L(C0)) may be associated with the first color space. A color conversion model is applied to the first component at the first sampling location (C0) and the second component at the first sampling location (L(C0)) to translate the first component at the first sampling location to a second color space.

Description

Color gamut scalable video coding device and method for the phase alignment of luma and chroma using interpolation
Cross reference to related applications
This application claims the benefit of U.S. Provisional Application No. 61/915,892, filed December 13, 2013, the entire contents of which are hereby incorporated by reference.
Background
The phases of the luma and chroma sample positions of an input video stream may not be aligned. This misalignment between luma and chroma sample positions may affect the accuracy of 3D LUT interpolation, and thus the estimated 3D LUT.
Summary
Systems and/or methods are described for estimating color conversion components. A video coding device may receive a picture associated with a first color space. The picture may include a first component at a first sampling location, a second component at a second sampling location, and the second component at a third sampling location. The video coding device may apply a first interpolation filter to the second component at the second sampling location and the second component at the third sampling location to determine the second component at the first sampling location. The second component at the first sampling location may be associated with the first color space. The video coding device may apply a color conversion model to the first component at the first sampling location and the second component at the first sampling location to translate the first component at the first sampling location to a second color space. The first component may be a luma component, and the second component may be a first chroma component (e.g., a red-difference chroma component and/or a blue-difference chroma component) or a second chroma component (e.g., a red-difference chroma component and/or a blue-difference chroma component). Alternatively, the first component may be the first chroma component (e.g., a red-difference chroma component and/or a blue-difference chroma component) or the second chroma component, and the second component may be the luma component.
The video coding device may apply the first interpolation filter. The first interpolation filter may include: multiplying the second component at the second sampling location by 3; adding the multiplied second component at the second sampling location, the second component at the third sampling location, and 2 to determine a sum; and dividing the sum by 4. The first interpolation filter may alternatively include: adding the second component at the second sampling location, the second component at the third sampling location, and 1 to determine a sum; and dividing the sum by 2.
The picture may include the second component at a fourth sampling location and the second component at a fifth sampling location. The video coding device may apply the first interpolation filter to the second component at the second sampling location, the second component at the third sampling location, the second component at the fourth sampling location, and the second component at the fifth sampling location to determine the second component at the first sampling location. The first interpolation filter may include: adding the second component at the second sampling location and the second component at the third sampling location to determine a first sum; adding the second component at the fourth sampling location and the second component at the fifth sampling location to determine a second sum; multiplying the second sum by 3 to determine a third sum; adding the first sum, the third sum, and 4 to determine a fourth sum; and dividing the fourth sum by 8. A minimal sketch of these filters appears below.
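For illustration only, a minimal Python sketch of the three filters just described, assuming 8-bit integer samples and the rounding offsets stated above (the function names are hypothetical, not from the patent):

```python
def filter_3_1(a, b):
    # 2-tap [3, 1] filter: multiply a by 3, add b and offset 2, divide by 4
    return (3 * a + b + 2) // 4

def filter_1_1(a, b):
    # 2-tap [1, 1] filter: add a, b, and offset 1, divide by 2
    return (a + b + 1) // 2

def filter_1_1_3_3(a, b, c, d):
    # 4-tap filter: first sum a+b, second sum c+d multiplied by 3,
    # offset 4, divide by 8, i.e., weights [1, 1, 3, 3] / 8
    return ((a + b) + 3 * (c + d) + 4) // 8

# Example: deriving a sample at an intermediate position from its neighbors
print(filter_3_1(100, 104))               # 101
print(filter_1_1(100, 104))               # 102
print(filter_1_1_3_3(98, 100, 102, 104))  # 102
```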
The picture may include a third component at the second sampling location and the third component at the third sampling location. The first component may be a luma component, the second component may be a red-difference chroma component, and the third component may be a blue-difference chroma component. The video coding device may apply the first interpolation filter to the third component at the second sampling location and the third component at the third sampling location to determine the third component at the first sampling location. The third component at the first sampling location may be associated with the first color space. The video coding device may apply the color conversion model to the first component at the first sampling location, the second component at the first sampling location, and the third component at the first sampling location to translate the first component at the first sampling location to the second color space.
The picture may include a third component at the first sampling location. The video coding device may apply the color conversion model to the first component at the first sampling location, the third component at the first sampling location, and the second component at the first sampling location to translate the first component at the first sampling location to the second color space. The first component may be a first chroma component, the second component may be a luma component, and the third component may be a second chroma component. Alternatively, the first component may be the second chroma component, the second component may be the luma component, and the third component may be the first chroma component.
The picture may be characterized by a 4:2:0 chroma format. The color conversion model may be based on a 3-dimensional look-up table (3D LUT).
The video coding device may be further configured to receive a scalable bitstream. The scalable bitstream may include a base layer and an enhancement layer, where the base layer includes the picture, the base layer is associated with the first color space, and the enhancement layer is associated with the second color space.
A video coding device may receive a picture associated with the first color space. The picture may include a first chroma component at a first sampling location, a second chroma component at the first sampling location, a luma component at a second sampling location, a luma component at a third sampling location, a luma component at a fourth sampling location, and a luma component at a fifth sampling location. The video coding device may apply a first interpolation filter to two or more of the following to determine the luma component at the first sampling location: the luma component at the second sampling location, the luma component at the third sampling location, the luma component at the fourth sampling location, and the luma component at the fifth sampling location, where the luma component at the first sampling location is associated with the first color space. The video coding device may apply a color conversion model to the first chroma component at the first sampling location, the second chroma component at the first sampling location, and the luma component at the first sampling location to translate the first chroma component at the first sampling location to the second color space. The video coding device may apply the color conversion model to the first chroma component at the first sampling location, the second chroma component at the first sampling location, and the luma component at the first sampling location to translate the second chroma component at the first sampling location to the second color space. The first chroma component and/or the second chroma component may be a red-difference chroma component and/or a blue-difference chroma component.
A video coding device may receive a picture associated with the first color space. The picture may include a luma component at a first sampling location, a first chroma component at a second sampling location, a second chroma component at the second sampling location, a first chroma component at a third sampling location, a second chroma component at the third sampling location, a first chroma component at a fourth sampling location, a second chroma component at the fourth sampling location, a first chroma component at a fifth sampling location, and a second chroma component at the fifth sampling location. The video coding device may apply a first interpolation filter to two or more of the following to determine the first chroma component at the first sampling location: the first chroma component at the second sampling location, the first chroma component at the third sampling location, the first chroma component at the fourth sampling location, and the first chroma component at the fifth sampling location, where the first chroma component at the first sampling location is associated with the first color space. The video coding device may apply the first interpolation filter to two or more of the following to determine the second chroma component at the first sampling location: the second chroma component at the second sampling location, the second chroma component at the third sampling location, the second chroma component at the fourth sampling location, and the second chroma component at the fifth sampling location, where the second chroma component at the first sampling location is associated with the first color space. The video coding device may apply a color conversion model to the luma component at the first sampling location, the first chroma component at the first sampling location, and the second chroma component at the first sampling location to translate the luma component at the first sampling location to the second color space. The first chroma component and/or the second chroma component may be a red-difference chroma component and/or a blue-difference chroma component.
Brief Description of the Drawings
FIG. 1 is a block diagram of an example scalable video coding system with one or more layers (e.g., N layers).
FIG. 2 is an example of temporal and inter-layer prediction for stereoscopic (e.g., 2-view) video coding using MVC.
FIG. 3 is an example comparison of the BT.709 (HDTV) and BT.2020 (UHDTV) color spaces in the CIE color definition.
FIG. 4A is an example, for an end user, of a picture color graded in BT.709 and rendered on a BT.709 display.
FIG. 4B is an example, for an end user, of the visual difference when a picture color graded in BT.2020 is rendered on a BT.709 display.
FIG. 5 is an example of color gamut scalability (CGS) coding with picture-level inter-layer prediction (ILP).
FIG. 6 is an example of a 3D look-up table for an 8-bit YUV signal.
FIG. 7 is an example of the weight calculation in trilinear or tetrahedral interpolation (e.g., as used in 3D LUT estimation).
FIG. 8 is an example of tetrahedral interpolation (e.g., as used in 3D LUT estimation).
FIGS. 9A-9F are examples of the tetrahedra that may encompass the point to be interpolated (e.g., as used in 3D LUT estimation).
FIG. 10 is an example of the phase shift between the luma and chroma components of a chroma format (e.g., the 4:2:0 chroma format), in which squares represent the luma pixel grid and circles represent the chroma grid.
FIG. 11A is a diagram of an example communications system in which one or more disclosed embodiments may be implemented.
FIG. 11B is a system diagram of an example wireless transmit/receive unit (WTRU) that may be used within the communications system shown in FIG. 11A.
FIG. 11C is a system diagram of an example radio access network and an example core network that may be used within the communications system shown in FIG. 11A.
FIG. 11D is a system diagram of another example radio access network and core network that may be used within the communications system shown in FIG. 11A.
FIG. 11E is a system diagram of another example radio access network and core network that may be used within the communications system shown in FIG. 11A.
Detailed Description
Digital video compression technologies, such as H.261, MPEG-1, MPEG-2, H.263, MPEG-4 Part 2, and H.264/MPEG-4 Part 10 AVC, may enable efficient digital video communication, distribution, and consumption.
Compared with traditional digital video services (e.g., sending TV signals over satellite, cable, and terrestrial transmission channels), more and more video applications (e.g., IPTV, video chat, mobile video, and streaming video) may be deployed in heterogeneous environments. For example, video applications may provide video streams of different sizes across networks. Heterogeneity may exist in clients and in networks. For example, on the client side, an N-screen scenario, i.e., consuming video content on devices with varying screen sizes and display capabilities (including smart phones, tablets, PCs, and TVs), may be provided and/or used. On the network side, video may be transmitted across the Internet, WiFi networks, mobile (3G and 4G) networks, and/or any combination of them.
Scalable video coding may encode a signal once at its highest resolution. Scalable video coding may enable decoding from subsets of the stream depending on the specific rate and resolution required by certain applications and/or supported by the client device. The resolution may be defined by a number of video parameters, including but not limited to spatial resolution (e.g., picture size), temporal resolution (e.g., frame rate), and video quality (e.g., subjective quality such as MOS, and/or objective quality such as PSNR, SSIM, or VQM). Other commonly used video parameters may include chroma format (e.g., YUV420, YUV422, or YUV444), bit depth (e.g., 8-bit or 10-bit video), complexity, view, color gamut, and aspect ratio (e.g., 16:9 or 4:3). International video standards (e.g., MPEG-2 Video, H.263, MPEG-4 Visual, and H.264) may have tools and/or profiles that support scalability modes.
Scalable video coding may enable the transmission and decoding of partial bitstreams. The transmission and decoding of partial bitstreams may allow a scalable video coding (SVC) system to provide video services with lower temporal and/or spatial resolutions or reduced fidelity, while retaining a relatively high reconstruction quality (e.g., given the respective rates of the partial bitstreams). SVC may be implemented with single-loop decoding, such that an SVC decoder may set up one motion compensation loop at the layer being decoded, and may not set up motion compensation loops at one or more other lower layers. For example, a bitstream may include two layers: a first layer (e.g., layer 1) that may be a base layer, and a second layer (e.g., layer 2) that may be an enhancement layer. When such an SVC decoder reconstructs the layer-2 video, the setup of the decoded picture buffer and motion compensated prediction may be limited to layer 2. In such an implementation of SVC, the respective reference pictures from lower layers may not be fully reconstructed, which may reduce memory consumption and/or computational complexity at the decoder.
Single-loop decoding may be achieved by constrained inter-layer texture prediction, where, for a current block in a given layer, spatial texture prediction from a lower layer may be permitted if the corresponding lower-layer block is coded in intra mode. This may be referred to as constrained intra prediction. When a lower-layer block is coded in intra mode, it may be reconstructed without motion compensation operations and/or a decoded picture buffer.
SVC may implement one or more additional inter-layer prediction techniques (e.g., motion vector prediction, residual prediction, mode prediction, etc. from one or more lower layers). This may improve the rate-distortion efficiency of an enhancement layer. An SVC implementation with single-loop decoding may exhibit reduced memory consumption and/or reduced computational complexity at the decoder, and may exhibit increased implementation complexity, for example due to its reliance on block-level inter-layer prediction. To compensate for the performance penalty incurred by imposing the single-loop decoding constraint, encoder design and computational complexity may be increased to achieve the desired performance. SVC may not support the coding of interlaced content.
FIG. 1 depicts a simplified block diagram of an example block-based hybrid scalable video coding (SVC) system. The spatial and/or temporal signal resolution to be represented by layer 1 (e.g., the base layer) may be generated by downsampling the input video signal. In a subsequent encoding stage, a setting of the quantizer (e.g., Q1) may determine the quality level of the base information. One or more subsequent higher layers may be encoded and/or decoded using the base-layer reconstruction Y1, which may represent an approximation of the higher-layer resolution levels. An upsampling unit may perform upsampling of the base-layer reconstruction signal to the layer-2 resolution. Downsampling and/or upsampling may be performed across multiple layers (e.g., N layers: layers 1, 2, ... N). Downsampling and/or upsampling ratios may be different, for example depending on the dimension of the scalability between two layers.
In the example scalable video coding system of FIG. 1, for a given higher layer n (e.g., 2 ≤ n ≤ N, where N is the total number of layers), a differential signal may be generated by subtracting an upsampled lower-layer signal (e.g., the layer n-1 signal) from the current layer-n signal. The differential signal may be encoded. If the video signals represented by two layers (n1 and n2) have the same spatial resolution, the corresponding downsampling and/or upsampling operations may be bypassed. A given layer n (e.g., 1 ≤ n ≤ N), or a plurality of layers, may be decoded without using any decoded information from higher layers. The differential signal generation is written out as a formula below.
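Expressed as a formula (notation ours, consistent with the description above), the difference signal for a given higher layer n may be written as

$$d_n = x_n - U(\hat{x}_{n-1}), \qquad 2 \le n \le N,$$

where $x_n$ is the layer-n input signal, $\hat{x}_{n-1}$ is the reconstructed lower-layer signal, and $U(\cdot)$ is the upsampling operator; the difference signal $d_n$ is then encoded.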
Relying on the coding of the residual signal (e.g., the differential signal between two layers) for layers other than the base layer, for example with the example SVC system in FIG. 1, may cause visual artifacts. Such visual artifacts may be due to, for example, quantization and/or normalization of the residual signal to restrict its dynamic range, and/or the quantization performed while coding the residual signal. One or more higher-layer encoders may adopt motion estimation and/or motion compensated prediction as respective coding modes. Motion estimation and/or compensation in a residual signal may differ from conventional motion estimation and may be prone to visual artifacts. In order to reduce (e.g., minimize) the occurrence of visual artifacts, a more sophisticated residual quantization may be implemented, for example together with a joint quantization process that may include both quantization and/or normalization of the residual signal to restrict its dynamic range and quantization performed while coding the residual. Such a quantization process may increase the complexity of the SVC system.
Multi-view video coding (MVC) may provide view scalability. In an example of view scalability, a base-layer bitstream may be decoded to reconstruct a conventional two-dimensional (2D) video, and one or more additional enhancement layers may be decoded to reconstruct other view representations of the same video signal. When such views are combined together and displayed by a proper three-dimensional (3D) display, 3D video with the appropriate depth perception may be produced.
FIG. 2 depicts an example prediction structure for using MVC to code a stereoscopic video with a left view (e.g., layer 1) and a right view (e.g., layer 2). The left-view video may be coded with an I-B-B-P prediction structure, and the right-view video may be coded with a P-B-B-B prediction structure. As shown in FIG. 2, in the right view, the first picture collocated with the first I picture in the left view may be coded as a P picture, and subsequent pictures in the right view may be coded as B pictures, with a first prediction coming from temporal references in the right view and a second prediction coming from inter-layer references in the left view. MVC may not support the single-loop decoding feature. For example, as shown in FIG. 2, decoding of the right-view (e.g., layer 2) video may be conditioned on the availability of the full pictures in the left view (e.g., layer 1), with one or more (e.g., each) layers (e.g., views) having a respective compensation loop. An implementation of MVC may include high-level syntax changes and may not include block-level changes. This may ease the implementation of MVC. For example, MVC may be implemented by configuring reference pictures at the slice and/or picture level. MVC may support the coding of more than two views, for example by extending the example shown in FIG. 2 to perform inter-layer prediction across multiple views.
MPEG frame-compatible (MFC) video coding may provide a scalable extension to 3D video coding. For example, MFC may provide a scalable extension to frame-compatible base-layer video (e.g., two views packed into the same frame), and may provide one or more enhancement layers to recover full-resolution views. Stereoscopic 3D video may have two views, including a left view and a right view. Stereoscopic 3D content may be delivered by packing and/or multiplexing the two views into one frame, and by compressing and transmitting the packed video. At the receiver side, after decoding, the frames may be unpacked and displayed as two views. Such multiplexing of the views may be performed in the temporal domain or the spatial domain. When performed in the spatial domain, in order to keep the same picture size, the two views may be spatially downsampled (e.g., by a factor of 2) and packed in accordance with one or more arrangements. For example, a side-by-side arrangement may place the downsampled left view on the left half of the picture and the downsampled right view on the right half of the picture. Other arrangements may include top-and-bottom, line-by-line, checkerboard, and the like. The arrangement used to achieve frame-compatible 3D video may be conveyed, for example, by one or more frame packing arrangement SEI messages. Although such arrangements may achieve 3D delivery with a minimal increase in bandwidth consumption, spatial downsampling may cause aliasing in the views and/or may reduce the visual quality and user experience of 3D video.
Video applications (e.g., IPTV, video chat, mobile video, and/or streaming video) may be deployed in heterogeneous environments. Heterogeneity may exist on the client side. Heterogeneity may exist in the network. An N-screen scenario may include consuming video content on devices with varying screen sizes and/or display capabilities, including smart phones, tablets, PCs, and/or TVs, and the like. N-screen may contribute to heterogeneity, for example on the client side. Video may be transmitted across the Internet, WiFi networks, mobile networks (e.g., 3G and/or 4G), and/or any combination of these networks, for example on the network side. Scalable video coding may improve the user experience and/or video service quality. Scalable video coding may include encoding a signal once at the highest resolution. Scalable video coding may include enabling decoding from subsets of the stream, for example depending on the available network bandwidth and/or video resolution used by certain applications and/or supported by the client device. Resolution may be characterized by a number of video parameters. Video parameters may include one or more of the following: spatial resolution, temporal resolution, video quality, chroma format, bit depth, complexity, view, color gamut, and/or aspect ratio, etc. Spatial resolution may include picture size. Temporal resolution may include frame rate. Video quality may include subjective quality (e.g., MOS) and/or objective quality (e.g., PSNR, SSIM, or VQM). Chroma format may include YUV420, YUV422, YUV444, etc. Bit depth may include 8-bit video, 10-bit video, etc. Aspect ratio may include 16:9, 4:3, etc. HEVC scalable extensions may support at least spatial scalability (e.g., the scalable bitstream may include signals at more than one spatial resolution), quality scalability (e.g., the scalable bitstream may include signals at more than one quality level), and/or standard scalability (e.g., the scalable bitstream may include a base layer coded using H.264/AVC and one or more enhancement layers coded using HEVC). In spatial scalability, the scalable bitstream may include signals at one or more spatial resolutions. In quality scalability, the scalable bitstream may include signals at one or more quality levels. In standard scalability, the scalable bitstream may include a base layer coded using, for example, H.264/AVC, and one or more enhancement layers coded using, for example, HEVC. Quality scalability may also be referred to as SNR scalability. View scalability may support 3D video applications. In view scalability, the scalable bitstream may include both 2D and 3D video signals.
A video coding system (e.g., a video coding system in accordance with the scalable extensions of High Efficiency Video Coding (SHVC)) may include one or more devices configured to perform video coding. A device configured to perform video coding (e.g., to encode and/or decode video signals) may be referred to as a video coding device. Such video coding devices may include video-capable devices, for example a television, a digital media player, a DVD player, a Blu-ray™ player, a networked media player device, a desktop computer, a laptop personal computer, a tablet device, a mobile phone, a video conferencing system, a hardware- and/or software-based video coding system, or the like. Such video coding devices may include wireless communications network elements, such as a wireless transmit/receive unit (WTRU), a base station, a gateway, or other network elements.
Scalable enhancements of HEVC are discussed herein. One or more targets may have been established, for example, for spatial scalability. For example, compared with using non-scalable coding, targets of 25% bit-rate reduction for 2x spatial scalability and 50% bit-rate reduction for 1.5x spatial scalability, measured for higher-resolution video, may be achieved. Scalability may be used, for example, to broaden the range of use cases for scalable HEVC. Scalability may refer to the type of scalability in which the base layer is coded with H.264/AVC or MPEG-2, while one or more enhancement layers may be coded using, for example, HEVC. Scalability may provide backward compatibility for legacy content coded using H.264/AVC or MPEG-2, and may enhance the quality of the legacy content with one or more enhancement layers coded using HEVC, which may provide better coding efficiency.
3D scalable video coding technologies may be referred to as 3D video coding, or 3DV. 3DV is discussed herein. 3DV may develop various flavors of view scalability targeted for autostereoscopic applications. Autostereoscopic displays and applications may allow or enable people to experience 3D without wearing cumbersome glasses. In order to achieve a suitable or good 3D experience without glasses, more than two views may be provided and/or used. Coding many views (e.g., 9 views or 10 views) may be expensive. 3DV may provide and/or use a hybrid approach of coding a few views (e.g., 2 or 3 views) with relatively large disparity, together with depth maps that may provide the depth information of the views. At the display side, the coded views and depth maps may be decoded, and the remaining views may be generated from the decoded views and their depth maps using view synthesis technologies. 3DV may consider a variety of methods to code the views and the depth maps, for example coding them using a combination of different technologies (e.g., H.264/AVC, MVC, and HEVC), including coding the base layer with one technology (e.g., H.264/AVC) and coding one or more enhancement layers with another technology (e.g., HEVC). 3DV may provide a menu of different options from which applications may choose.
Table 1 summarizes examples of the different types of scalabilities discussed herein. At the bottom of Table 1, bit-depth scalability and chroma format scalability may be tied to video formats used primarily by professional video applications (e.g., higher than 8-bit video, and/or chroma sampling formats higher than YUV4:2:0). Bit-depth scalability and chroma format scalability may be used. Aspect ratio scalability and color gamut scalability may be provided and/or used as desirable scalabilities (e.g., but may not be planned, provided, and/or used in the first phase of scalable HEVC development).
FIG. 3 shows a comparison between BT.709 (HDTV) and BT.2020 (UHDTV) in the CIE color definition. With advanced display technologies, ultra high definition TV (UHDTV) may support larger resolution, larger bit depth, higher frame rate, and wider color gamut compared with the HDTV specification (e.g., BT.709). The user experience may be greatly improved due to the high-fidelity quality that BT.2020 may provide. UHDTV may support up to 4K (3840x2160) and 8K (7680x4320) resolution, with a frame rate of up to 120 Hz and a picture sample bit depth of 10 bits or 12 bits. The color space of UHDTV 310 may be defined by BT.2020. The color space of HDTV 320 may be defined by BT.709. The volume of colors rendered in BT.2020 (the UHDTV color space 310) may be broader than the volume of colors rendered in BT.709 (the HDTV color space 320), which means that more visible color information may be rendered using the UHDTV specification.
Table 1. Examples of the different types of scalabilities.
Color gamut scalability. Color gamut scalable (CGS) coding may be multi-layer coding where two or more layers may have different color gamuts and bit depths. For example, as shown in Table 1, in a 2-layer scalable system, the base layer may use the HDTV color gamut as defined in BT.709, and the enhancement layer may use the UHDTV color gamut as defined in BT.2020. The P3 color gamut is another usable color gamut. The P3 color gamut may be used in digital cinema applications. The inter-layer process in CGS coding may use color gamut conversion methods to convert the base-layer color gamut to the enhancement-layer color gamut. After color gamut conversion is applied, the generated inter-layer reference pictures may be used to predict the enhancement-layer pictures with better or improved accuracy.
FIG. 4A and FIG. 4B depict examples of the visual difference to an end user between the BT.709 color gamut and the BT.2020 color gamut, respectively. In FIG. 4A and FIG. 4B, the same content may be color graded twice using different color gamuts. For example, the content in FIG. 4A may be color graded in BT.709 and rendered/displayed on a BT.709 display. The content in FIG. 4B may be color graded in BT.2020 and rendered/displayed on a BT.709 display. As shown, the color difference between the two pictures may be noticeable.
FIG. 5 shows an example of color gamut scalability (CGS) coding with picture-level inter-layer prediction. According to an example embodiment, FIG. 4A may be coded in the base layer, and FIG. 4B may be coded in the enhancement layer. Additional inter-layer processing may be provided and/or used to improve enhancement-layer coding efficiency, for example using the CGS coding system in FIG. 5. Color gamut conversion may be used in the inter-layer processing for CGS. Through the use of color gamut conversion, the colors in the BT.709 space may be translated to the BT.2020 space. The colors in the BT.709 space may be used to predict the enhancement-layer signal in the BT.2020 space more effectively.
As shown in FIG. 5, the base-layer (BL) video input 530 may be an HD video signal, and the enhancement-layer (EL) video input 502 may be a UHD video signal. The HD video signal 530 and the UHD video signal 502 may correspond to each other through one or more of the following: one or more downsampling parameters (e.g., spatial scalability); one or more color grading parameters (e.g., color gamut scalability); and one or more tone mapping parameters (e.g., bit-depth scalability) 528.
The BL encoder 518 may include, for example, a High Efficiency Video Coding (HEVC) video encoder or an H.264/AVC video encoder. The BL encoder 518 may be configured to generate the BL bitstream 532 using one or more BL reconstructed pictures for prediction (e.g., stored in the BL DPB 520). The EL encoder 504 may include, for example, an HEVC encoder. The EL encoder 504 may include, for example, one or more high-level syntax modifications to support inter-layer prediction by adding inter-layer reference pictures to the EL DPB. The EL encoder 504 may be configured to generate the EL bitstream 508 using one or more EL reconstructed pictures for prediction (e.g., stored in the EL DPB 506).
One or more reconstructed BL pictures in the BL DPB 520 may be processed, at the inter-layer processing (ILP) unit 522, using one or more picture-level inter-layer processing techniques, including one or more of the following: upsampling (e.g., for spatial scalability), color gamut conversion (e.g., for color gamut scalability), and inverse tone mapping (e.g., for bit-depth scalability). The one or more processed reconstructed BL pictures may be used as reference pictures for EL coding. Inter-layer processing may be performed based on enhancement video information 514 received from the EL encoder 504 and/or base video information 516 received from the BL encoder 518. This may improve EL coding efficiency.
At 526, the EL bitstream 508, the BL bitstream 532, and the parameters used in inter-layer processing (e.g., the ILP information 524) may be multiplexed together into a scalable bitstream 512. For example, the scalable bitstream 512 may include an SHVC bitstream.
Even if the BL color gamut and the EL color gamut are fixed (e.g., the BL is in BT.709 and the EL is in BT.2020), the model parameters for color gamut conversion may be different for different content. These parameters may depend on the color grading process during post-production of content generation, where colorists may apply different grading parameters to different spaces and/or different content to reflect their artistic intent. The input video for color grading may include high-fidelity pictures. In a scalable coding system, coding of the BL pictures may introduce quantization noise. With coding structures such as the hierarchical prediction structure, the quantization level may be adjusted per picture and/or per group of pictures. The model parameters generated from color grading may not be accurate enough for coding purposes. It may be more effective, according to an embodiment, for the encoder to compensate for the coding noise by estimating the model parameters on the fly. The encoder may estimate the model parameters per picture or per group of pictures. These model parameters (e.g., generated during the color grading process and/or generated by the encoder) may be signaled at the sequence and/or picture level, so that the decoder may perform the same color gamut conversion process during inter-layer prediction.
Examples of color gamut conversion may include, but are not limited to, linear or piece-wise linear color conversion. In the film industry, a 3D look-up table (3D LUT) may be used for color gamut conversion from one color gamut method or technology to another. Additionally, 3D LUTs for CGS coding may be provided and/or used. FIG. 5 shows an example CGS coding scheme with picture-level inter-layer prediction (ILP). The ILP includes color gamut conversion from the base-layer (BL) color gamut to the enhancement-layer (EL) color gamut, upsampling from the BL spatial resolution to the EL spatial resolution, and/or inverse tone mapping from the BL sample bit depth to the EL sample bit depth.
FIG. 6 shows an example 3D look-up table for an 8-bit YUV signal. FIG. 7 shows an example weight calculation in trilinear or tetrahedral interpolation. As described herein, a color conversion model (e.g., a 3D LUT) may be used for color gamut conversion. For example, (y, u, v) may denote the sample triplet in the color gamut of the base layer, and (Y, U, V) may denote the triplet in the EL color gamut. In a 3D LUT, the range of the BL color space may be segmented into equal octants, as shown in FIG. 6. A sketch of this octant partitioning follows.
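As an illustration, a minimal sketch (hypothetical helper, not from the patent) of how a BL sample (y, u, v) may be mapped to its octant and local offsets, assuming 8-bit samples and a uniform octant size of 16 as in FIG. 6:

```python
OCTANT_SIZE = 16  # assumed uniform octant edge length for 8-bit samples

def locate_octant(y, u, v, size=OCTANT_SIZE):
    """Return the octant indices (yi, ui, vi) and the offsets
    (dy, du, dv) of (y, u, v) inside that octant."""
    yi, ui, vi = y // size, u // size, v // size
    dy, du, dv = y - yi * size, u - ui * size, v - vi * size
    return (yi, ui, vi), (dy, du, dv)

# Example: sample (70, 42, 130) lies in octant (4, 2, 8)
print(locate_octant(70, 42, 130))  # ((4, 2, 8), (6, 10, 2))
```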
The input of the 3D LUT may be (y, u, v) in the BL color gamut, and the output of the 3D LUT may be the mapped triplet (Y, U, V) in the EL color gamut. For example, referring to FIG. 7, the input may be the index (y, u, v) located inside octant 700. During the conversion process, if the input (y, u, v) overlaps with one of the vertices of the octant, the output (Y, U, V) may be derived by referencing one of the 3D LUT entries directly, e.g., where the components (y, u, v) each overlap their respective vertex. If the input (y, u, v) (or any one of the input components) is inside the octant (but not on one of the vertices of the octant), such as the index (y, u, v) in FIG. 7, an interpolation process may be applied. For example, trilinear and/or tetrahedral interpolation methods may be applied. Trilinear interpolation may be applied with reference to the nearest 8 vertices, as shown in FIG. 7. The trilinear interpolation may be carried out using one or more of the following equations:
$$Y = K \times \sum_{i=0,1} \sum_{j=0,1} \sum_{k=0,1} s_i(y) \times s_j(u) \times s_k(v) \times \mathrm{LUT}[y_i][u_j][v_k].Y \quad (1)$$

$$U = K \times \sum_{i=0,1} \sum_{j=0,1} \sum_{k=0,1} s_i(y) \times s_j(u) \times s_k(v) \times \mathrm{LUT}[y_i][u_j][v_k].U \quad (2)$$

$$V = K \times \sum_{i=0,1} \sum_{j=0,1} \sum_{k=0,1} s_i(y) \times s_j(u) \times s_k(v) \times \mathrm{LUT}[y_i][u_j][v_k].V \quad (3)$$
Referring to equations (1)-(3) and FIG. 7, $(y_i, u_j, v_k)$ may represent the vertices of the BL color gamut (i.e., the inputs to the 3D LUT). $\mathrm{LUT}[y_i][u_j][v_k]$ may represent the vertices of the EL color gamut (i.e., the 3D LUT output at entry $(y_i, u_j, v_k)$). $\mathrm{LUT}[y_i][u_j][v_k].Y$, $\mathrm{LUT}[y_i][u_j][v_k].U$, and $\mathrm{LUT}[y_i][u_j][v_k].V$ may represent the Y, U, and V components of $\mathrm{LUT}[y_i][u_j][v_k]$, respectively. $i, j, k = \{0, 1\}$, and $s_0(y) = y_1 - y$, $s_1(y) = y - y_0$, $s_0(u) = u_1 - u$, $s_1(u) = u - u_0$, $s_0(v) = v_1 - v$, $s_1(v) = v - v_0$ may be the weights applied, as shown in FIG. 7. A sketch of this trilinear interpolation follows.
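A sketch of equations (1)-(3) in Python. It assumes that K normalizes by the octant volume (y1-y0)(u1-u0)(v1-v0), an assumption, since K is not spelled out above, and stores the LUT as a NumPy array with one (Y, U, V) triple per vertex:

```python
import numpy as np

def trilinear(lut, y, u, v, size=16):
    """Trilinear 3D LUT interpolation per equations (1)-(3).
    lut: shape (17, 17, 17, 3), holding (Y, U, V) at each vertex.
    Assumes K = 1 / ((y1-y0) * (u1-u0) * (v1-v0))."""
    yi, ui, vi = y // size, u // size, v // size
    y0, u0, v0 = yi * size, ui * size, vi * size
    s_y = (y0 + size - y, y - y0)  # s0(y), s1(y)
    s_u = (u0 + size - u, u - u0)  # s0(u), s1(u)
    s_v = (v0 + size - v, v - v0)  # s0(v), s1(v)
    out = np.zeros(3)
    for i in (0, 1):
        for j in (0, 1):
            for k in (0, 1):
                out += s_y[i] * s_u[j] * s_v[k] * lut[yi + i, ui + j, vi + k]
    return out / float(size ** 3)  # K = 1 / octant volume

# Identity LUT: each vertex maps to its own (y, u, v) coordinates, so the
# interpolation should return the input unchanged.
grid = np.arange(0, 257, 16)
lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1).astype(float)
print(trilinear(lut, 70, 42, 130))  # ~[ 70.  42. 130.]
```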
FIG. 8 shows an example of tetrahedral interpolation. FIGS. 9A, 9B, 9C, 9D, 9E, and 9F show the types of tetrahedra that may encompass the point to be interpolated. Tetrahedral interpolation may use the four vertices of the tetrahedron containing the point P(y, u, v) for the calculation. The input point P (i.e., P(y, u, v)) in FIG. 8 may be enclosed inside the tetrahedron whose vertices may be P0, P1, P5, and P7. The tetrahedral interpolation may be calculated for each component with equations (4), (5), and (6). There may be 6 possible choices of the tetrahedron containing the point P to be interpolated. FIGS. 9A-9F may show or list these 6 possible cases. In an example, the vertices P0 and P7 may be included in the tetrahedron. A sketch of this case follows the equations.
$$Y = T_y \times \big( (y_1 - y_0) \times \mathrm{LUT}[P_0].Y + dy \times (\mathrm{LUT}[P_1].Y - \mathrm{LUT}[P_0].Y) \big) + T_u \times du \times (\mathrm{LUT}[P_5].Y - \mathrm{LUT}[P_1].Y) + T_v \times dv \times (\mathrm{LUT}[P_7].Y - \mathrm{LUT}[P_5].Y) \quad (4)$$

$$U = T_y \times \big( (y_1 - y_0) \times \mathrm{LUT}[P_0].U + dy \times (\mathrm{LUT}[P_1].U - \mathrm{LUT}[P_0].U) \big) + T_u \times du \times (\mathrm{LUT}[P_5].U - \mathrm{LUT}[P_1].U) + T_v \times dv \times (\mathrm{LUT}[P_7].U - \mathrm{LUT}[P_5].U) \quad (5)$$

$$V = T_y \times \big( (y_1 - y_0) \times \mathrm{LUT}[P_0].V + dy \times (\mathrm{LUT}[P_1].V - \mathrm{LUT}[P_0].V) \big) + T_u \times du \times (\mathrm{LUT}[P_5].V - \mathrm{LUT}[P_1].V) + T_v \times dv \times (\mathrm{LUT}[P_7].V - \mathrm{LUT}[P_5].V) \quad (6)$$
The 3D LUT may be estimated by the encoder, for example using the original signal in one color space and the corresponding signal in the other color space. A least squares (LS) estimation method may be used to estimate the 3D LUT, for example if the 3D LUT interpolation process is linear. An iterative technique based on gradient descent may also be used for the estimation. As described herein, the 3D LUT estimation may be performed using the LS estimation method.
3D LUT estimation with LS may present challenges. For example, the size of the 3D LUT parameters to be estimated may be large. In FIG. 6, the sample bit depth may be 8 bits. If the unit octant size is 16x16x16, there may be 17x17x17 entries in the 3D LUT table. One or more (e.g., each) entries of the 3D LUT may include three components. There may be 4913 (17x17x17) unknown parameters for each component. Such a large-scale linear system estimation may use a large amount of memory and may invoke a large number of computations.
The 3D LUT may not be fully used for the color gamut conversion of a given video input. For example, in a statistical analysis of one or more test sequences used in the core experiments on color gamut scalability, the percentage of 3D LUT entries used may be less than 20%. In such an embodiment, the LS estimation method may not be applied directly, because there may be one or more entries that cannot be measured.
The distribution of BL pixels in the 3D color space may be uneven. For example, such BL pixels may cluster tightly around some colors (e.g., the major colors) and may be distributed sparsely around other colors. As described herein, this imbalanced characteristic may relate to the stability of the LS estimation.
FIG. 10 shows an example of the luma and chroma sample locations of a YUV 4:2:0 video. The phases of the luma and chroma of the input video may not be aligned. In order to estimate and apply the 3D LUT for one or more (e.g., each) components, a triplet formed by the 3 components may be used. A triplet may refer to a luma component and two chroma components located at the same sampling location (e.g., a luma component, a red-difference chroma component, and a blue-difference chroma component located at the same sampling location). This misalignment of the luma and chroma sample locations may affect the accuracy of the 3D LUT interpolation.
To address one or more of these challenges, systems and/or methods for improved 3D LUT estimation may be provided. For example, BT.709 to BT.2020 color gamut conversion is described herein. For example, the input signal in the 3D LUT estimation may be the BT.709 compressed/uncompressed video, and the output signal may be the BT.2020 video (which may be the training reference or target). Equation (7) may be used to describe the color gamut conversion process with the 3D LUT:
$$z_i(c) = f_{P(c)}(x_i), \quad i \in [0, N-1] \quad (7)$$
where x may denote the input signal in the form of the triplet (y, u, v) in BT.709. z(c) may be the output signal of the component c, where c may be Y, U, or V in BT.2020. P(c) may be the parameters of the component c to be estimated; P(c) may be the 3D LUT output of the component c. $f_{P(c)}$ may be the interpolation function. $f_{P(c)}$ may be a linear function, such as the trilinear or tetrahedral function stated herein. i may be the index of the input pixel. N may be the total number of input pixels. Equation (7) may be rewritten in matrix form as follows:
$$z_i(c) = w_i * P(c), \quad i \in [0, N-1] \quad (8)$$

where * in equation (8) may be matrix multiplication. $w_i$ may be the weight vector of the i-th input pixel. $w_{i,j}$ may be the weight of the j-th output entry of the 3D LUT for the i-th input pixel. According to an example, $w_{i,j}$ may be calculated according to equations (1)-(3) for trilinear interpolation and equations (4)-(6) for tetrahedral interpolation. In one example, the weight vector may be represented as follows:

$$w_i = [w_{i,0}, w_{i,1}, \dots, w_{i,M-1}]$$
P(c) may be the parameter vector to be estimated, which may be the output entries of the 3D LUT, and may be represented as:

$$P(c) = [p_0, \dots, p_{M-1}]$$
M may be the number of 3D LUT output entries. For example, M may be 4913 for a 3D LUT of size 17x17x17. According to an example, because the 3D LUT of one or more (e.g., each) components may be estimated independently, the component c may be omitted in the following equations. Aggregating equation (8) over one or more (e.g., all) pixels, the following may be defined or provided:
$$Z = W * P \quad (9)$$

$$Z = [z_0, \dots, z_{N-1}]^T$$

$$W = [w_0, \dots, w_{N-1}]^T \quad (10)$$
Using least squares estimation, the solution may be as follows:
$$P = H^{-1} * (W^T * Z) \quad (11)$$

$$H = W^T * W \quad (12)$$
where H may be the autocorrelation matrix.
3D LUT estimation is described herein. For example, for an input video signal (e.g., BT.709), the color conversion with the 3D LUT may use only a fraction of the 3D LUT entries (e.g., 20% of the 3D LUT entries). This means that the matrix W in equation (10) may be sparse, and one or more of its elements may be zero. The autocorrelation matrix H may be defined in equation (12). The autocorrelation matrix H may be sparse. The autocorrelation matrix H may not be invertible. The solution in equation (11) may not be available for such an autocorrelation matrix H. According to an example, the matrix W may be compacted by considering only the referenced entries of the 3D LUT. To compact the matrix, the input pixels (y, u, v) of the input video may be scanned. A 3D LUT vertex may be masked if it is used in the 3D LUT interpolation process. A compacted parameter set P' may be determined, computed, or generated by removing the unused vertices. The mapping from P' to P, used to rebuild P after P' is estimated, may be estimated and/or built as follows:
$$P' = \mathrm{compact}(P)$$
W' and H' may be calculated using the compacted P', where the unused vertices are removed. The solution may be defined as follows:
$$P' = H'^{-1} * (W'^T * Z) \quad (13)$$

$$H' = W'^T * W' \quad (14)$$
The sparsity of the matrix W in the 3D LUT estimation may thus be reduced. After compaction, because the size of H' may be smaller than the size of H, the memory used to store the autocorrelation matrix H for the 3D LUT estimation may be reduced. A sketch of this compaction follows.
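A sketch of the compaction step, under the assumption that a "used" vertex is one whose column in W is referenced (nonzero) for at least one input pixel (helper names hypothetical):

```python
import numpy as np

def compact_system(W):
    """Drop 3D LUT vertices (columns of W) that no pixel references,
    returning the reduced matrix W' and the index map needed later to
    restore the compacted estimate P' to the full-size P."""
    used = np.flatnonzero(W.any(axis=0))
    return W[:, used], used

# Toy system: 4 pixels, 6 LUT vertices, only vertices 0, 2, 5 referenced.
W = np.zeros((4, 6))
W[0, 0], W[1, 0], W[2, 2], W[3, 5] = 1.0, 0.5, 1.0, 1.0
W_c, used = compact_system(W)
print(W_c.shape, used)  # (4, 3) [0 2 5]
```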
As described herein, the color distribution of the input video may be uneven. For example, pixels may have similar colors. Colors with high rates of occurrence may be major colors. This may cause an imbalance problem in W'. For example, elements in W' that correspond to major colors may have large values. Other elements in W' that correspond to colors occurring rarely (e.g., relatively infrequently or very rarely) may have relatively low or small values. As a result, the dynamic range of the elements in the autocorrelation matrix H' may be large, which may cause the inversion process of H' to become unstable. The estimation of P' may become unstable. To reduce this problem, a constraint may be provided and/or used to establish a trade-off between an accurate estimation result and the stability of the estimation process. For example:
$$H' = (W'^T * W') + \lambda I, \quad \lambda \ge 0 \quad (15)$$
where I may be the identity matrix, and λ may be a factor for balancing the accuracy and the stability of the estimation process. A larger λ means that more bias may be placed on the stability of the method or process. The value of λ may be determined based on the extent of the imbalance in W'. A sketch of this regularized solve follows.
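Combining equations (13)-(15), a sketch of the regularized least squares solve, continuing the toy system from the compaction sketch above (the value of λ here is arbitrary, chosen only for illustration):

```python
def solve_compacted(W_c, Z, lam=0.01):
    """P' = (W'^T * W' + lambda*I)^-1 * (W'^T * Z), per (13)-(15).
    Larger lam biases the solution toward stability over accuracy."""
    H = W_c.T @ W_c + lam * np.eye(W_c.shape[1])
    return np.linalg.solve(H, W_c.T @ Z)

Z = np.array([10.0, 5.0, 20.0, 30.0])  # target EL samples (illustrative)
P_c = solve_compacted(W_c, Z)
print(P_c.round(2))  # one estimated output value per used vertex
```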
The original parameter vector P may be obtained by mapping the estimated vertices from P' to P, for example after the compacted parameter vector P' is estimated. For example:
$$P = \mathrm{decompact}(P') \quad (16)$$
Unused vertices in P may be filled using the corresponding vertices in P', for example using the interpolation process in 3D LUT coding (e.g., trilinear or tetrahedral), as sketched below.
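A sketch of equation (16), continuing the example above: the compacted estimate is scattered back to full size, and unused vertices are filled here from the most recent used vertex in scan order, a deliberate simplification of the trilinear/tetrahedral fill described above:

```python
def decompact(P_c, used, M):
    """P = decompact(P'): restore the full-size parameter vector and
    fill vertices that were never estimated."""
    P = np.full(M, np.nan)
    P[used] = P_c
    last = P_c[0]
    for m in range(M):
        if np.isnan(P[m]):
            P[m] = last  # fill from the most recent used vertex
        else:
            last = P[m]
    return P

print(decompact(P_c, used, 6))  # full 6-entry parameter vector
```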
As described herein, FIG. 10 shows the phase shift between the luma and chroma components of the 4:2:0 chroma format. Luma and chroma phase alignment in 3D LUT estimation is described herein. For example, for the trilinear interpolation in equations (1)-(3) or the tetrahedral interpolation in equations (4)-(6), the 3D LUT interpolation of one or more (e.g., each) output components may use the three input components of the input signal.
As shown in Figure 10 and described herein, according to an example, the luma component sampling positions and the chroma component sampling positions may not be aligned. Figure 10 depicts the 4:2:0 chroma format. Although the component conversion is described with reference to Figure 10 and/or the 4:2:0 chroma format, the examples described herein may be used with the 4:1:0 chroma format, the 4:2:2 chroma format, the 4:4:4 chroma format, etc. Although Figure 10 is described with reference to the YCbCr format, other color formats may be used.
In Figure 10, sampling positions L0-L15 may indicate the sampling positions of the luma component. In Figure 10, L may indicate the luma component, and the numbers (e.g., 0-15) may indicate the sampling positions. Sampling positions C0-C3 may indicate the sampling positions of one or more (e.g., two) overlapping chroma components (e.g., the red difference chroma component and the blue difference chroma component). In Figure 10, C may indicate the one or more (e.g., two) overlapping chroma components (e.g., the red difference chroma component and the blue difference chroma component), and the numbers (e.g., 0-3) may indicate the sampling positions.
Figure 10 may be a grid with an x-axis and a y-axis, where the x-axis may be the horizontal axis and the y-axis may be the vertical axis. The luma components at sampling positions L0-L15 may have x and y coordinates. The one or more (e.g., two) overlapping chroma components (e.g., the red difference chroma component and the blue difference chroma component) at sampling positions C0-C3 may have x and y coordinates.
Sampling position misalignment may reduce the accuracy of the 3D LUT estimation. For example, as shown in Figure 10, the luma component sampling positions L0-L15 do not overlap the chroma component sampling positions C0-C3. Sampling position misalignment appears in chroma formats in which the chroma components are subsampled in both directions (e.g., 4:2:0, where for every four luma components there are one red difference chroma sample and one blue difference chroma sample) or subsampled in the horizontal direction (e.g., 4:2:2). As a result of the chroma subsampling process, the sampling positions of the luma and chroma components may become misaligned.
For luma component interpolation, multiple sampling positions of one or more chroma components (e.g., the red difference chroma component and/or the blue difference chroma component) may be used to align the chroma components to the luma component sampling positions. For chroma component interpolation, one or more sampling positions of the luma component may be used to align the luma component to the chroma component sampling positions. Once aligned, a color component transformation model (e.g., a 3D LUT) may be used to convert a component (e.g., luma or chroma) from one color space to another color space. The conversion from one color space to another color space may be performed by determining the component in the second color space (e.g., using the components in the first color space, e.g., at a particular sampling position).
A video coding device may receive a scalable bitstream. The scalable bitstream may include a base layer and an enhancement layer. The base layer may include a picture, and the base layer may be associated with a first color space. The enhancement layer may be associated with a second color space.
The input of the 3D LUT conversion may be the components (e.g., (y, u, v)) signaled in one color space (e.g., BT.709), and the output of the 3D LUT conversion may be the components (e.g., (Y, U, V)) in another color space (e.g., BT.2020). For chroma component conversion, the luma component y may be adjusted to y' to align it to the chroma component sampling position. The interpolation filter may be one of equations (17)-(18). The input of the 3D LUT conversion of a chroma component may be (y', u, v), and its output may be U or V. The interpolation filter may be a 2-tap filter [1, 1], a 4-tap filter, and/or the like.
The video coding device may receive a picture associated with the first color space. The picture may include a first chroma component at a first sampling position, a second chroma component at the first sampling position, a luma component at a second sampling position, a luma component at a third sampling position, a luma component at a fourth sampling position, and a luma component at a fifth sampling position. The video coding device may apply a first interpolation filter to two or more of the following to determine the luma component at the first sampling position: the luma component at the second sampling position, the luma component at the third sampling position, the luma component at the fourth sampling position, and the luma component at the fifth sampling position, where the luma component at the first sampling position is associated with the first color space. The video coding device may apply a color transformation model to the first chroma component at the first sampling position, the second chroma component at the first sampling position, and the luma component at the first sampling position to translate the first chroma component at the first sampling position to the second color space. The video coding device may apply the color transformation model to the first chroma component at the first sampling position, the second chroma component at the first sampling position, and the luma component at the first sampling position to translate the second chroma component at the first sampling position to the second color space. For example, when the YCbCr format is used, the first chroma component and/or the second chroma component may be the red difference chroma component and/or the blue difference chroma component. For example, when the YCgCo format is used, the first chroma component and/or the second chroma component may be the green difference chroma component and/or the orange difference chroma component. It should be noted that the description herein is applicable to color spaces of other formats.
As shown in Figure 10, for example, one or more interpolation filters (e.g., those shown in equations (17)-(18)) may be used to align the luma component to the sampling positions of the misaligned chroma components. Once aligned, a transformation model (e.g., a 3D LUT) may be used to convert the chroma components from the first color space to the second color space. For example, if the chroma components have the same sampling position, the input of the 3D LUT conversion of a chroma component may be (y', u, v), where y' is the adjusted luma component (e.g., the luma component at the sampling location that coincides with the sampling position of the chroma components u and v). The components (y', u, v) may be associated with the first color space. The output of the 3D LUT may be U or V, which may relate to the chroma component U or V in the second color space.
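For illustration, the 3D LUT application step may be sketched as follows, in the spirit of the trilinear interpolation referenced for equations (1)-(3). The uniform lattice spacing, the lut array layout, and the function name apply_3d_lut are assumptions of this sketch, not taken from the specification:

```python
import numpy as np

def apply_3d_lut(lut, y, u, v, size=17, bit_depth=8):
    """Trilinear 3D LUT lookup sketch for one output component.

    lut : (size, size, size) array of output values, e.g., U in the
          second color space
    y, u, v : aligned input components in the first color space,
              e.g., (y', u, v) for chroma conversion
    """
    step = (1 << bit_depth) / (size - 1)      # lattice spacing, assumed uniform
    fy, fu, fv = y / step, u / step, v / step
    iy, iu, iv = int(fy), int(fu), int(fv)    # base vertex of enclosing octant
    dy, du, dv = fy - iy, fu - iu, fv - iv    # fractional offsets in [0, 1)

    out = 0.0
    for oy in (0, 1):                         # trilinear blend of 8 vertices
        for ou in (0, 1):
            for ov in (0, 1):
                w = ((dy if oy else 1 - dy) *
                     (du if ou else 1 - du) *
                     (dv if ov else 1 - dv))
                out += w * lut[min(iy + oy, size - 1),
                               min(iu + ou, size - 1),
                               min(iv + ov, size - 1)]
    return out
```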
An interpolation filter (e.g., as shown in equations (17)-(18)) may be used to align the luma component to the chroma component sampling positions, whereupon the transformation model may be used. For example, as described with respect to equations (17)-(18), the luma component at a chroma component sampling position (e.g., sampling position C0) may be determined by applying an interpolation filter to the luma components at two or more luma sampling positions (e.g., sampling positions L0, L1, L4, and/or L5). The sampling position of the chroma components may include two or more chroma components, such as the red difference chroma component Cr0 and the corresponding blue difference chroma component Cb0. The luma component at sampling position C0 and the chroma components at sampling position C0 may be used to convert the chroma components at position C0 from the first color space to the second color space.
For example, as described herein, when converting a chroma component from one color space to another color space, the luma component value at the chroma component sampling position may be determined. To determine the luma component value at the sampling position of the chroma components (e.g., sampling position C0), the video coding device may use a 4-tap interpolation filter or a 2-tap interpolation filter. The video coding device may determine which interpolation filter to use according to the sampling position of the chroma components on the x-y axes in Figure 10. For example, the video coding device may determine the x and y coordinates of the chroma component sampling position. The video coding device may then divide the x coordinate of the chroma component sampling position by 2, and the video coding device may divide the y coordinate of the chroma component sampling position by 2. Whether the remainders of dividing the x and y coordinates by 2 are (1, 1), (0, 1), (1, 0), or (0, 0), the video coding device may use the interpolation filter in equation (17) or equation (18) to determine the luma component at the sampling position of the chroma components. That is, the video coding device may use equations (17)-(18) interchangeably to determine the value of the luma component at the sampling position of the (e.g., aligned) chroma components.
For example, the luma component at sampling position C0 may be determined using a 4-tap filter, as shown in equation (17):
L(C0) = ((L0 + L4) * 3 + (L1 + L5) + 4) >> 3    (17)
where >>3 means that the sum ((L0+L4)*3 + (L1+L5) + 4) is divided by 2^3 and/or computed with a right shift by 3; if the quotient is not an integer, the fractional part may be discarded. In equation (17), to determine the luma component at sampling position C0, the video coding device may apply an interpolation filter in which the luma component at sampling position L0 and the luma component at a different sampling position L4 are added to determine a sum. The video coding device may then multiply the sum by 3, and add the multiplied sum to the luma component at sampling position L1, the luma component at sampling position L5, and 4 to determine a final sum. The integer value of the final sum may be determined. The video coding device may then divide the integer sum by 8 to determine the luma component at sampling position C0. The luma components at sampling positions C1, C2, and C3 may be determined using equation (17) with the appropriate luma components.
The luma component at sampling position C0 may also be determined using a 2-tap filter. An example of a usable 2-tap filter is provided in equation (18):
L(C0) = (L0 + L4 + 1) >> 1    (18)
where >>1 means that the sum (L0 + L4 + 1) is divided by 2^1 and/or computed with a right shift by 1; if the quotient is not an integer, the fractional part may be discarded. In equation (18), to determine the luma component at sampling position C0, the video coding device may apply a 2-tap interpolation filter in which the luma component at sampling position L0, the luma component at another sampling position L4, and 1 are added to determine a sum. The integer value of the sum may be determined. The video coding device may then divide the integer value of the sum by 2 to determine the luma component at sampling position C0. The luma components at sampling positions C1, C2, and C3 may be determined using equation (18) with the appropriate luma components.
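As an illustration, the two luma alignment filters of equations (17)-(18) may be sketched in integer arithmetic as below; the function names and the way the four luma neighbors of C0 are passed in are assumptions of this sketch, not part of the specification:

```python
def luma_at_chroma_4tap(l0, l1, l4, l5):
    """Equation (17): weights [3, 1, 3, 1] / 8 with rounding offset 4."""
    return ((l0 + l4) * 3 + (l1 + l5) + 4) >> 3

def luma_at_chroma_2tap(l0, l4):
    """Equation (18): weights [1, 1] / 2 with rounding offset 1."""
    return (l0 + l4 + 1) >> 1

# Aligning the luma plane to chroma position C0 from its luma neighbors
print(luma_at_chroma_4tap(100, 102, 104, 106))  # ((100+104)*3 + (102+106) + 4) >> 3 -> 103
print(luma_at_chroma_2tap(100, 104))            # (100+104+1) >> 1 -> 102
```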
The video coding device may use the luma component at sampling position C0 in the first color space, together with the transformation model (e.g., the 3D LUT described herein), to convert (e.g., determine) the two chroma components at sampling position C0 in the first color space into the chroma components at sampling position C0 in the second color space. As described above, the luma component at sampling position C0 may be determined using an interpolation filter (e.g., as shown in equation (17) or (18)).
The video coding device may receive a picture associated with the first color space. The picture may include a luma component at a first sampling position, a first chroma component at a second sampling position, a second chroma component at the second sampling position, a first chroma component at a third sampling position, a second chroma component at the third sampling position, a first chroma component at a fourth sampling position, a second chroma component at the fourth sampling position, a first chroma component at a fifth sampling position, and a second chroma component at the fifth sampling position. The video coding device may apply an interpolation filter to two or more of the following to determine the first chroma component at the first sampling position: the first chroma component at the second sampling position, the first chroma component at the third sampling position, the first chroma component at the fourth sampling position, and the first chroma component at the fifth sampling position, where the first chroma component at the first sampling position is associated with the first color space. The video coding device may apply the interpolation filter to two or more of the following to determine the second chroma component at the first sampling position: the second chroma component at the second sampling position, the second chroma component at the third sampling position, the second chroma component at the fourth sampling position, and the second chroma component at the fifth sampling position, where the second chroma component at the first sampling position is associated with the first color space. The video coding device may apply a color transformation model to the luma component at the first sampling position, the first chroma component at the first sampling position, and the second chroma component at the first sampling position to translate the luma component at the first sampling position to the second color space. The first chroma component and/or the second chroma component may be the red difference chroma component and/or the blue difference chroma component.
As shown in Figure 10, for example, one or more interpolation filters (e.g., those shown in equations (19)-(22)) may be used to align one or more chroma components to the sampling positions of the misaligned luma component. Once aligned, a transformation model (e.g., a 3D LUT) may be used to convert the luma component from the first color space to the second color space. The input of the 3D LUT conversion of the luma component may be (y, u', v'), where u' and v' are the adjusted chroma components (e.g., the chroma components at the sampling location that coincides with the sampling position of the luma component y). The components (y, u', v') may be associated with the first color space. The output of the 3D LUT may be Y, which may relate to the luma component in the second color space.
An interpolation filter (e.g., as shown in equations (19)-(22)) may be used to align the chroma components to the luma component sampling positions, whereupon the transformation model may be used. For example, the chroma components at a luma sampling position (e.g., sampling positions L4, L5, L8, and/or L9) may be determined by applying a resampling filter to the chroma components at two or more sampling positions (e.g., sampling positions C0, C1, C2, and/or C3). As a result, the resampled value of a component (e.g., the value of the component at another sampling position) may be determined using the components at multiple other sampling positions. For example, the luma components at sampling positions L4, L5, L8, and/or L9 in Figure 10 may be interpolated using the 3D LUT. To interpolate the luma components at sampling positions L4, L5, L8, and/or L9, the chroma components (e.g., u, v) at sampling positions L4, L5, L8, and/or L9 may be determined. As described herein, the chroma components at sampling positions L4, L5, L8, and L9 may be derived using one or more resampling filters (e.g., equations (19)-(22)). The luma components at sampling positions L4, L5, L8, and/or L9 and the chroma components at sampling positions L4, L5, L8, and/or L9 may be used to convert the luma components at positions L4, L5, L8, and/or L9 from the first color space to the second color space.
For example, as discussed herein, when converting the luma component from one color space to another color space, the values of the chroma components (e.g., the red difference chroma component and/or the blue difference chroma component) at the luma component sampling position may be determined. To determine the values of the chroma components at a luma sampling position (e.g., sampling position L0, L1, L4, or L5), the video coding device may use a 4-tap interpolation filter or a 2-tap interpolation filter. The video coding device may determine which interpolation filter to use according to the sampling position of the luma component on the x-y axes in Figure 10. For example, the video coding device may determine the x and y coordinates of the luma sampling position. The video coding device may then divide the x coordinate of the luma sampling position by 2, and the video coding device may divide the y coordinate of the luma sampling position by 2. If the remainder of dividing the x coordinate by 2 is 0 and the remainder of dividing the y coordinate by 2 is 1, the video coding device may use the interpolation filter in equation (19) to determine the chroma components (e.g., the red difference chroma component and/or the blue difference chroma component) at the luma sampling position. As shown in Figure 10, equation (19) may be used to determine the chroma components at luma sampling positions L4, L6, L12, and L14. If the remainder of dividing the x coordinate by 2 is 1 and the remainder of dividing the y coordinate by 2 is 1, the video coding device may use the interpolation filter in equation (21) to determine the chroma components at the luma sampling position. As shown in Figure 10, equation (21) may be used to determine the chroma components at luma sampling positions L5, L7, L13, and L15. If the remainder of dividing the x coordinate by 2 is 0 and the remainder of dividing the y coordinate by 2 is 0, the video coding device may use the interpolation filter in equation (20) to determine the chroma components at the luma sampling position. As shown in Figure 10, equation (20) may be used to determine the chroma components at luma sampling positions L0, L2, L8, and L10. If the remainder of dividing the x coordinate by 2 is 1 and the remainder of dividing the y coordinate by 2 is 0, the video coding device may use the interpolation filter in equation (22) to determine the chroma components at the luma sampling position. As shown in Figure 10, equation (22) may be used to determine the chroma components at luma sampling positions L1, L3, L9, and L11.
The chroma components at sampling position L4 may be derived using equation (19):
C(L4) = (C0 * 3 + C2 + 2) >> 2    (19)
where >>2 means that the sum (C0*3 + C2 + 2) is divided by 2^2 and/or computed with a right shift by 2; if the quotient is not an integer, the fractional part may be discarded. In equation (19), to determine a chroma component at sampling position L4, the video coding device may apply an interpolation filter in which the chroma component at sampling position C0 (e.g., Cr0 or Cb0) is multiplied by 3, the product is added to the chroma component at another chroma sampling position C2 (e.g., Cr2 or Cb2), and that sum is added to 2 to determine a final sum. The integer value of the final sum may be determined. The interpolation filter may divide the integer sum by 4 to determine the chroma component at sampling position L4. The values of multiple chroma components at sampling position L4 (e.g., Cr and Cb, u and v, etc.) may be determined using the interpolation filter (e.g., equation (19)). The chroma components at the other sampling positions (e.g., sampling positions L6, L12, and L14) may be determined using equation (19) with the chroma components at the appropriate sampling positions.
The chroma components at sampling position L8 may be derived using equation (20). The derivation of the chroma components at sampling position L8 may be similar to that of the chroma components at sampling position L4. Equation (20) is provided as follows:
C(L8) = (C0 + C2 * 3 + 2) >> 2    (20)
where >>2 means that the sum (C0 + C2*3 + 2) is divided by 2^2 and/or computed with a right shift by 2; if the quotient is not an integer, the fractional part may be discarded. In equation (20), to determine a chroma component at sampling position L8, the video coding device may apply an interpolation filter in which the chroma component at sampling position C2 (e.g., Cr2 or Cb2) is multiplied by 3, the product is added to the chroma component at another chroma sampling position C0 (e.g., Cr0 or Cb0), and that sum is added to 2 to determine a final sum. The integer value of the final sum may be determined. The interpolation filter may divide the integer sum by 4 to determine the chroma component at sampling position L8. The chroma components at sampling positions L0, L2, and L10 may be determined using equation (20) with the chroma components at the appropriate sampling positions. The values of multiple chroma components at sampling position L8 (e.g., Cr and Cb, u and v, etc.) may be determined using the interpolation filter (e.g., equation (20)).
The chroma components at sampling position L5 may be determined using equation (21), as follows:
C(L5) = ((C0 + C1) * 3 + (C2 + C3) + 4) >> 3    (21)
where >>3 means that the sum ((C0+C1)*3 + (C2+C3) + 4) is divided by 2^3 and/or computed with a right shift by 3; if the quotient is not an integer, the fractional part may be discarded. In equation (21), to determine a chroma component at sampling position L5, the video coding device may apply an interpolation filter in which the chroma component at sampling position C0 (e.g., Cr0 or Cb0) is added to the chroma component at another sampling position C1 (e.g., Cr1 or Cb1). The video coding device may multiply the sum of the components at sampling positions C0 and C1 by 3, and then add the product to the chroma component at sampling position C2 (e.g., Cr2 or Cb2), the chroma component at sampling position C3 (e.g., Cr3 or Cb3), and 4 to determine a final sum. The integer value of the final sum may be determined. The video coding device may divide the integer value by 8 to determine the chroma component at sampling position L5. The chroma components at sampling positions L7, L13, and L15 may be determined using equation (21) with the chroma components at the appropriate sampling positions. The values of multiple chroma components at sampling position L5 (e.g., Cr and Cb, u and v, etc.) may be determined using the interpolation filter (e.g., equation (21)).
The derivation of the chroma components at luma sampling position L9 may be similar to the derivation of the chroma components at luma sampling position L5. The chroma components at sampling position L9 may be determined using equation (22), as follows:
C(L9) = ((C0 + C1) + (C2 + C3) * 3 + 4) >> 3    (22)
where >>3 means that the sum ((C0+C1) + (C2+C3)*3 + 4) is divided by 2^3 and/or computed with a right shift by 3; if the quotient is not an integer, the fractional part may be discarded. In equation (22), to determine a chroma component at sampling position L9, the video coding device may apply an interpolation filter in which the chroma component at sampling position C0 (e.g., Cr0 or Cb0) is added to the chroma component at sampling position C1 (e.g., Cr1 or Cb1). The video coding device may add the chroma component at sampling position C2 (e.g., Cr2 or Cb2) to the chroma component at another sampling position C3 (e.g., Cr3 or Cb3). The video coding device may multiply the sum of the chroma components at sampling positions C2 and C3 by 3, and add the product to the sum of the chroma components at sampling positions C0 and C1 and to 4 to determine a final sum. The integer value of the final sum may be determined. The video coding device may then divide the integer value by 8 to determine the chroma component at sampling position L9. The chroma components at sampling positions L1, L3, and L11 may be determined using equation (22) with the chroma components at the appropriate sampling positions.
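A combined sketch of equations (19)-(22), with the parity-based filter selection described above, might look as follows; the function name and the convention that c0, c1, c2, c3 are the four chroma neighbors from Figure 10 are assumptions of this sketch:

```python
def chroma_at_luma(x, y, c0, c1, c2, c3):
    """Resample one chroma plane value (Cr or Cb) to the luma sampling
    position (x, y), selecting among equations (19)-(22) by the parity
    of the luma coordinates."""
    if x % 2 == 0 and y % 2 == 1:          # e.g., L4, L6, L12, L14
        return (c0 * 3 + c2 + 2) >> 2      # equation (19)
    if x % 2 == 1 and y % 2 == 1:          # e.g., L5, L7, L13, L15
        return ((c0 + c1) * 3 + (c2 + c3) + 4) >> 3   # equation (21)
    if x % 2 == 0 and y % 2 == 0:          # e.g., L0, L2, L8, L10
        return (c0 + c2 * 3 + 2) >> 2      # equation (20)
    return ((c0 + c1) + (c2 + c3) * 3 + 4) >> 3       # equation (22): L1, L3, L9, L11

# Example: align Cr to luma position L5 (x=1, y=1) from neighbors C0..C3
print(chroma_at_luma(1, 1, 60, 62, 64, 66))  # ((60+62)*3 + (64+66) + 4) >> 3 -> 62
```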
The interpolation filters for the luma sampling positions L4 and L8 may be 2-tap filters, e.g., the 2-tap filters [3, 1] and [1, 3], respectively. For example, the interpolation filters for the luma sampling positions L4 and L8 may be the interpolation filters described with reference to equation (19) and equation (20), respectively. The interpolation filters for the luma sampling positions L5 and L9 may be 4-tap filters, e.g., the 4-tap filters [3, 3, 1, 1] and [1, 1, 3, 3], respectively. For example, the interpolation filters for the luma sampling positions L5 and L9 may be the interpolation filters described with reference to equation (21) and equation (22), respectively.
The video coding device may be configured to apply a first interpolation filter to the chroma components at a first sampling position, a second interpolation filter to the chroma components at a second sampling position, and so on. For example, the video coding device may be configured to apply one or more of equations (17)-(18) to one or more of the two overlapping chroma components (e.g., the red difference chroma component and/or the blue difference chroma component) at one or more sampling positions. For example, the video coding device may apply equation (17) to the chroma components at a first sampling position, then apply equation (18) to the chroma components at a second sampling position, then apply equation (17) to the chroma components at a third sampling position, and so on. Similarly, the video coding device may be configured to apply a first interpolation filter to the luma component at a first sampling position, a second interpolation filter to the luma component at a second sampling position, and so on. For example, the video coding device may be configured to apply one or more of equations (19)-(22) to the luma components at one or more sampling positions. For example, the video coding device may apply equation (19) to the luma component at a first sampling position, equation (20) to the luma component at a second sampling position, equation (21) to the luma component at a third sampling position, equation (22) to the luma component at a fourth sampling position, and so on.
Figure 11A is a diagram of an example communications system 1100 in which one or more disclosed embodiments may be implemented and/or used. The communications system 1100 may be a multiple access system that provides content, such as voice, data, video, messaging, broadcast, etc., to multiple wireless users. The communications system 1100 may enable multiple wireless users to access such content through the sharing of system resources, including wireless bandwidth. For example, the communications system 1100 may employ one or more channel access methods, such as code division multiple access (CDMA), time division multiple access (TDMA), frequency division multiple access (FDMA), orthogonal FDMA (OFDMA), single-carrier FDMA (SC-FDMA), and the like.
As shown in Figure 11A, the communications system 1100 may include wireless transmit/receive units (WTRUs) 1102a, 1102b, 1102c, and/or 1102d (which generally or collectively may be referred to as WTRU 1102), a radio access network (RAN) 1103/1104/1105, a core network 1106/1107/1109, a public switched telephone network (PSTN) 1108, the Internet 1110, and other networks 1112, though it will be appreciated that any number of WTRUs, base stations, networks, and/or network elements may be implemented. Each of the WTRUs 1102a, 1102b, 1102c, and/or 1102d may be any type of device configured to operate and/or communicate in a wireless environment. By way of example, the WTRUs 1102a, 1102b, 1102c, and/or 1102d may be configured to transmit and/or receive wireless signals and may include user equipment (UE), a mobile station, a fixed or mobile subscriber unit, a pager, a cellular telephone, a personal digital assistant (PDA), a smartphone, a laptop, a netbook, a personal computer, a wireless sensor, consumer electronics, and the like.
The communications system 1100 may also include a base station 1114a and a base station 1114b. Each of the base stations 1114a, 1114b may be any type of device configured to wirelessly interface with at least one of the WTRUs 1102a, 1102b, 1102c, and/or 1102d to facilitate access to one or more communication networks (e.g., the core network 1106/1107/1109, the Internet 1110, and/or the networks 1112). By way of example, the base stations 1114a and/or 1114b may be a base transceiver station (BTS), a Node B, an eNode B, a Home Node B, a Home eNode B, a site controller, an access point (AP), a wireless router, and the like. While the base stations 1114a, 1114b are each depicted as a single element, it will be appreciated that the base stations 1114a, 1114b may include any number of interconnected base stations and/or network elements.
The base station 1114a may be part of the RAN 1103/1104/1105, which may also include other base stations and/or network elements (not shown), such as a base station controller (BSC), a radio network controller (RNC), relay nodes, etc. The base station 1114a and/or the base station 1114b may be configured to transmit and/or receive wireless signals within a particular geographic region, which may be referred to as a cell (not shown). The cell may further be divided into cell sectors. For example, the cell associated with the base station 1114a may be divided into three sectors. Thus, in one embodiment, the base station 1114a may include three transceivers, i.e., one for each sector of the cell. In another embodiment, the base station 1114a may employ multiple-input multiple-output (MIMO) technology and, therefore, may use multiple transceivers for each sector of the cell.
The base stations 1114a and/or 1114b may communicate with one or more of the WTRUs 1102a, 1102b, 1102c, and/or 1102d over an air interface 1115/1116/1117, which may be any suitable wireless communication link (e.g., radio frequency (RF), microwave, infrared (IR), ultraviolet (UV), visible light, etc.). The air interface 1115/1116/1117 may be established using any suitable radio access technology (RAT).
More specifically, as noted above, the communications system 1100 may be a multiple access system and may employ one or more channel access schemes, such as CDMA, TDMA, FDMA, OFDMA, SC-FDMA, and the like. For example, the base station 1114a in the RAN 1103/1104/1105 and the WTRUs 1102a, 1102b, and/or 1102c may implement a radio technology such as Universal Mobile Telecommunications System (UMTS) Terrestrial Radio Access (UTRA), which may establish the air interface 1115/1116/1117 using wideband CDMA (WCDMA). WCDMA may include communication protocols such as High-Speed Packet Access (HSPA) and/or Evolved HSPA (HSPA+). HSPA may include High-Speed Downlink Packet Access (HSDPA) and/or High-Speed Uplink Packet Access (HSUPA).
In another embodiment, the base station 1114a and the WTRUs 1102a, 1102b, and/or 1102c may implement a radio technology such as Evolved UMTS Terrestrial Radio Access (E-UTRA), which may establish the air interface 1115/1116/1117 using Long Term Evolution (LTE) and/or LTE-Advanced (LTE-A).
In other embodiments, the base station 1114a and the WTRUs 1102a, 1102b, and/or 1102c may implement radio technologies such as IEEE 802.16 (i.e., Worldwide Interoperability for Microwave Access (WiMAX)), CDMA2000, CDMA2000 1X, CDMA2000 EV-DO, Interim Standard 2000 (IS-2000), Interim Standard 95 (IS-95), Interim Standard 856 (IS-856), Global System for Mobile communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), GSM EDGE (GERAN), and the like.
The base station 1114b in Figure 11A may be, for example, a wireless router, a Home Node B, a Home eNode B, or an access point, and may utilize any suitable RAT to facilitate wireless connectivity in a localized area, such as a place of business, a home, a vehicle, a campus, and the like. The base station 1114b and the WTRUs 1102c, 1102d may implement a radio technology such as IEEE 802.11 to establish a wireless local area network (WLAN). In another embodiment, the base station 1114b and the WTRUs 1102c, 1102d may implement a radio technology such as IEEE 802.15 to establish a wireless personal area network (WPAN). In yet another embodiment, the base station 1114b and the WTRUs 1102c, 1102d may utilize a cellular-based RAT (e.g., WCDMA, CDMA2000, GSM, LTE, LTE-A, etc.) to establish a picocell or femtocell. As shown in Figure 11A, the base station 1114b may have a direct connection to the Internet 1110. Thus, the base station 1114b may not be required to access the Internet 1110 via the core network 1106/1107/1109.
The RAN 1103/1104/1105 may be in communication with the core network 1106/1107/1109, which may be any type of network configured to provide voice, data, applications, and/or voice over Internet protocol (VoIP) services to one or more of the WTRUs 1102a, 1102b, 1102c, and/or 1102d. For example, the core network 1106/1107/1109 may provide call control, billing services, mobile location-based services, pre-paid calling, Internet connectivity, video distribution, etc., and/or perform high-level security functions, such as user authentication. Although not shown in Figure 11A, it will be appreciated that the RAN 1103/1104/1105 and/or the core network 1106/1107/1109 may be in direct or indirect communication with other RANs that employ the same RAT as the RAN 1103/1104/1105 or a different RAT. For example, in addition to being connected to the RAN 1103/1104/1105, which may be utilizing an E-UTRA radio technology, the core network 1106/1107/1109 may also be in communication with another RAN (not shown) employing a GSM radio technology.
The core network 1106/1107/1109 may also serve as a gateway for the WTRUs 1102a, 1102b, 1102c, and/or 1102d to access the PSTN 1108, the Internet 1110, and/or the other networks 1112. The PSTN 1108 may include circuit-switched telephone networks that provide plain old telephone service (POTS). The Internet 1110 may include a global system of interconnected computer networks and devices that use common communication protocols, such as the Transmission Control Protocol (TCP), User Datagram Protocol (UDP), and Internet Protocol (IP) in the TCP/IP Internet protocol suite. The networks 1112 may include wired or wireless communications networks owned and/or operated by other service providers. For example, the networks 1112 may include another core network connected to one or more RANs, which may employ the same RAT as the RAN 1103/1104/1105 or a different RAT.
Some or all of the WTRUs 1102a, 1102b, 1102c, and/or 1102d in the communications system 1100 may include multi-mode capabilities, i.e., the WTRUs 1102a, 1102b, 1102c, and/or 1102d may include multiple transceivers for communicating with different wireless networks over different communication links. For example, the WTRU 1102c shown in Figure 11A may be configured to communicate with the base station 1114a, which may employ a cellular-based radio technology, and with the base station 1114b, which may employ an IEEE 802 radio technology.
Figure 11B is a system diagram of an example WTRU 1102. As shown in Figure 11B, the WTRU 1102 may include a processor 1118, a transceiver 1120, a transmit/receive element 1122, a speaker/microphone 1124, a keypad 1126, a display/touchpad 1128, a non-removable memory 1130, a removable memory 1132, a power source 1134, a global positioning system (GPS) chipset 1136, and other peripherals 1138. It will be appreciated that the WTRU 1102 may include any sub-combination of the foregoing elements while remaining consistent with an embodiment. Also, embodiments contemplate that the base stations 1114a and 1114b, and/or the nodes that the base stations 1114a and 1114b may represent (such as, but not limited to, a transceiver station (BTS), a Node B, a site controller, an access point (AP), a Home Node B, an evolved Home Node B (eNode B), a Home evolved Node B (HeNB), a Home evolved Node B gateway, a proxy node, and the like), may include some or all of the elements depicted in Figure 11B and described herein.
The processor 1118 may be a general-purpose processor, a special-purpose processor, a conventional processor, a digital signal processor (DSP), a plurality of microprocessors, one or more microprocessors in association with a DSP core, a controller, a microcontroller, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) circuit, any other type of integrated circuit (IC), a state machine, and the like. The processor 1118 may perform signal coding, data processing, power control, input/output processing, and/or any other functionality that enables the WTRU 1102 to operate in a wireless environment. The processor 1118 may be coupled to the transceiver 1120, which may be coupled to the transmit/receive element 1122. While Figure 11B depicts the processor 1118 and the transceiver 1120 as separate components, the processor 1118 and the transceiver 1120 may be integrated together in an electronic package or chip.
The transmit/receive element 1122 may be configured to transmit signals to, or receive signals from, a base station (e.g., the base station 1114a) over the air interface 1115/1116/1117. For example, in one embodiment, the transmit/receive element 1122 may be an antenna configured to transmit and/or receive RF signals. In another embodiment, the transmit/receive element 1122 may be an emitter/detector configured to transmit and/or receive IR, UV, or visible light signals, for example. In yet another embodiment, the transmit/receive element 1122 may be configured to transmit and receive both RF and light signals. It will be appreciated that the transmit/receive element 1122 may be configured to transmit and/or receive any combination of wireless signals.
In addition, although the transmit/receive element 1122 is depicted in Figure 11B as a single element, the WTRU 1102 may include any number of transmit/receive elements 1122. More specifically, the WTRU 1102 may employ MIMO technology. Thus, in one embodiment, the WTRU 1102 may include two or more transmit/receive elements 1122 (e.g., multiple antennas) for transmitting and/or receiving wireless signals over the air interface 1115/1116/1117.
The transceiver 1120 may be configured to modulate the signals that are to be transmitted by the transmit/receive element 1122 and to demodulate the signals that are received by the transmit/receive element 1122. As noted above, the WTRU 1102 may have multi-mode capabilities. Thus, the transceiver 1120 may include multiple transceivers for enabling the WTRU 1102 to communicate via multiple RATs, such as UTRA and IEEE 802.11.
The processor 1118 of the WTRU 1102 may be coupled to, and may receive user input data from, the speaker/microphone 1124, the keypad 1126, and/or the display/touchpad 1128 (e.g., a liquid crystal display (LCD) display unit or an organic light-emitting diode (OLED) display unit). The processor 1118 may also output user data to the speaker/microphone 1124, the keypad 1126, and/or the display/touchpad 1128. In addition, the processor 1118 may access information from, and store data in, any type of suitable memory, such as the non-removable memory 1130 and/or the removable memory 1132. The non-removable memory 1130 may include random-access memory (RAM), read-only memory (ROM), a hard disk, or any other type of memory storage device. The removable memory 1132 may include a subscriber identity module (SIM) card, a memory stick, a secure digital (SD) memory card, and the like. In other embodiments, the processor 1118 may access information from, and store data in, memory that is not physically located on the WTRU 1102, such as on a server or a home computer (not shown).
The processor 1118 may receive power from the power source 1134 and may be configured to distribute and/or control the power to the other components in the WTRU 1102. The power source 1134 may be any suitable device for powering the WTRU 1102. For example, the power source 1134 may include one or more dry cell batteries (e.g., nickel-cadmium (NiCd), nickel-zinc (NiZn), nickel metal hydride (NiMH), lithium-ion (Li-ion), etc.), solar cells, fuel cells, and the like.
The processor 1118 may also be coupled to the GPS chipset 1136, which may be configured to provide location information (e.g., longitude and latitude) regarding the current location of the WTRU 1102. In addition to, or in lieu of, the information from the GPS chipset 1136, the WTRU 1102 may receive location information over the air interface 1115/1116/1117 from a base station (e.g., base stations 1114a, 1114b) and/or determine its location based on the timing of the signals being received from two or more nearby base stations. It will be appreciated that the WTRU 1102 may acquire location information by way of any suitable location-determination method while remaining consistent with an embodiment.
The processor 1118 may further be coupled to other peripherals 1138, which may include one or more software and/or hardware modules that provide additional features, functionality, and/or wired or wireless connectivity. For example, the peripherals 1138 may include an accelerometer, an e-compass, a satellite transceiver, a digital camera (for photographs or video), a universal serial bus (USB) port, a vibration device, a television transceiver, a hands-free headset, a Bluetooth module, a frequency modulated (FM) radio unit, a digital music player, a media player, a video game player module, an Internet browser, and the like.
Figure 11C is a system diagram of the RAN 1103 and the core network 1106 according to an embodiment. As noted above, the RAN 1103 may employ a UTRA radio technology to communicate with the WTRUs 1102a, 1102b, and/or 1102c over the air interface 1115. The RAN 1103 may also be in communication with the core network 1106. As shown in Figure 11C, the RAN 1103 may include Node Bs 1140a, 1140b, and/or 1140c, which may each include one or more transceivers for communicating with the WTRUs 1102a, 1102b, and/or 1102c over the air interface 1115. The Node Bs 1140a, 1140b, and/or 1140c may each be associated with a particular cell (not shown) within the RAN 1103. The RAN 1103 may also include RNCs 1142a, 1142b. It will be appreciated that the RAN 1103 may include any number of Node Bs and RNCs while remaining consistent with an embodiment.
As shown in Figure 11C, the Node Bs 1140a, 1140b may be in communication with the RNC 1142a. Additionally, the Node B 1140c may be in communication with the RNC 1142b. The Node Bs 1140a, 1140b, and/or 1140c may communicate with the respective RNCs 1142a, 1142b via an Iub interface. The RNCs 1142a, 1142b may be in communication with one another via an Iur interface. Each of the RNCs 1142a, 1142b may be configured to control the respective Node Bs 1140a, 1140b, and/or 1140c to which it is connected. In addition, each of the RNCs 1142a, 1142b may be configured to carry out or support other functionality, such as outer loop power control, load control, admission control, packet scheduling, handover control, macrodiversity, security functions, data encryption, and the like.
The core network 1106 shown in Figure 11C may include a media gateway (MGW) 1144, a mobile switching center (MSC) 1146, a serving GPRS support node (SGSN) 1148, and/or a gateway GPRS support node (GGSN) 1150. While each of the foregoing elements is depicted as part of the core network 1106, any one of these elements may be owned and/or operated by an entity other than the core network operator.
The RNC 1142a in the RAN 1103 may be connected to the MSC 1146 in the core network 1106 via an IuCS interface. The MSC 1146 may be connected to the MGW 1144. The MSC 1146 and the MGW 1144 may provide the WTRUs 1102a, 1102b, and/or 1102c with access to circuit-switched networks, such as the PSTN 1108, to facilitate communications between the WTRUs 1102a, 1102b, and/or 1102c and traditional landline communications devices.
The RNC 1142a in the RAN 1103 may also be connected to the SGSN 1148 in the core network 1106 via an IuPS interface. The SGSN 1148 may be connected to the GGSN 1150. The SGSN 1148 and the GGSN 1150 may provide the WTRUs 1102a, 1102b, and/or 1102c with access to packet-switched networks, such as the Internet 1110, to facilitate communications between the WTRUs 1102a, 1102b, and/or 1102c and IP-enabled devices.
As noted above, the core network 1106 may also be connected to the networks 1112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
Figure 11D is a system diagram of the RAN 1104 and the core network 1107 according to an embodiment. As noted above, the RAN 1104 may employ an E-UTRA radio technology to communicate with the WTRUs 1102a, 1102b, and/or 1102c over the air interface 1116. The RAN 1104 may also be in communication with the core network 1107.
The RAN 1104 may include eNode Bs 1160a, 1160b, and/or 1160c, though it will be appreciated that the RAN 1104 may include any number of eNode Bs while remaining consistent with an embodiment. The eNode Bs 1160a, 1160b, and/or 1160c may each include one or more transceivers for communicating with the WTRUs 1102a, 1102b, and/or 1102c over the air interface 1116. In one embodiment, the eNode Bs 1160a, 1160b, and/or 1160c may implement MIMO technology. Thus, the eNode B 1160a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 1102a.
Each of the eNode Bs 1160a, 1160b, and/or 1160c may be associated with a particular cell (not shown) and may be configured to handle radio resource management decisions, handover decisions, scheduling of users in the uplink and/or downlink, and the like. As shown in Figure 11D, the eNode Bs 1160a, 1160b, and/or 1160c may communicate with one another over an X2 interface.
The core network 1107 shown in Figure 11D may include a mobility management entity (MME) 1162, a serving gateway 1164, and a packet data network (PDN) gateway 1166. While each of the foregoing elements is depicted as part of the core network 1107, any one of these elements may be owned and/or operated by an entity other than the core network operator.
The MME 1162 may be connected to each of the eNode Bs 1160a, 1160b, and/or 1160c in the RAN 1104 via an S1 interface and may serve as a control node. For example, the MME 1162 may be responsible for authenticating users of the WTRUs 1102a, 1102b, and/or 1102c, bearer activation/deactivation, selecting a particular serving gateway during an initial attach of the WTRUs 1102a, 1102b, and/or 1102c, and the like. The MME 1162 may also provide a control plane function for switching between the RAN 1104 and other RANs (not shown) that employ other radio technologies, such as GSM or WCDMA.
The serving gateway 1164 may be connected to each of the eNode Bs 1160a, 1160b, and/or 1160c in the RAN 1104 via the S1 interface. The serving gateway 1164 may generally route and forward user data packets to/from the WTRUs 1102a, 1102b, and/or 1102c. The serving gateway 1164 may also perform other functions, such as anchoring user planes during inter-eNode B handovers, triggering paging when downlink data is available for the WTRUs 1102a, 1102b, and/or 1102c, managing and storing contexts of the WTRUs 1102a, 1102b, and/or 1102c, and the like.
The serving gateway 1164 may also be connected to the PDN gateway 1166, which may provide the WTRUs 1102a, 1102b, and/or 1102c with access to packet-switched networks, such as the Internet 1110, to facilitate communications between the WTRUs 1102a, 1102b, and/or 1102c and IP-enabled devices.
The core network 1107 may facilitate communications with other networks. For example, the core network 1107 may provide the WTRUs 1102a, 1102b, and/or 1102c with access to circuit-switched networks, such as the PSTN 1108, to facilitate communications between the WTRUs 1102a, 1102b, and/or 1102c and traditional landline communications devices. For example, the core network 1107 may include, or may communicate with, an IP gateway (e.g., an IP multimedia subsystem (IMS) server) that serves as an interface between the core network 1107 and the PSTN 1108. In addition, the core network 1107 may provide the WTRUs 1102a, 1102b, and/or 1102c with access to the networks 1112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
Figure 11E is a system diagram of the RAN 1105 and the core network 1109 according to an embodiment. The RAN 1105 may be an access service network (ASN) that employs IEEE 802.16 radio technology to communicate with the WTRUs 1102a, 1102b, and/or 1102c over the air interface 1117. As described in more detail below, the communication links between the different functional entities of the WTRUs 1102a, 1102b, and/or 1102c, the RAN 1105, and the core network 1109 may be defined as reference points.
As shown in Figure 11E, the RAN 1105 may include base stations 1180a, 1180b, and/or 1180c and an ASN gateway 1182, though it will be appreciated that the RAN 1105 may include any number of base stations and ASN gateways while remaining consistent with an embodiment. The base stations 1180a, 1180b, and/or 1180c may each be associated with a particular cell (not shown) in the RAN 1105 and may each include one or more transceivers for communicating with the WTRUs 1102a, 1102b, and/or 1102c over the air interface 1117. In one embodiment, the base stations 1180a, 1180b, and/or 1180c may implement MIMO technology. Thus, the base station 1180a, for example, may use multiple antennas to transmit wireless signals to, and receive wireless signals from, the WTRU 1102a. The base stations 1180a, 1180b, and/or 1180c may also provide mobility management functions, such as handoff triggering, tunnel establishment, radio resource management, traffic classification, quality of service (QoS) policy enforcement, and the like. The ASN gateway 1182 may serve as a traffic aggregation point and may be responsible for paging, caching of subscriber profiles, routing to the core network 1109, and the like.
The air interface 1117 between the WTRUs 1102a, 1102b, and/or 1102c and the RAN 1105 may be defined as an R1 reference point that implements the IEEE 802.16 specification. In addition, each of the WTRUs 1102a, 1102b, and/or 1102c may establish a logical interface (not shown) with the core network 1109. The logical interface between the WTRUs 1102a, 1102b, and/or 1102c and the core network 1109 may be defined as an R2 reference point, which may be used for authentication, authorization, IP host configuration management, and/or mobility management.
The communication link between each of the base stations 1180a, 1180b, and/or 1180c may be defined as an R8 reference point that includes protocols for facilitating WTRU handovers and the transfer of data between base stations. The communication link between the base stations 1180a, 1180b, and/or 1180c and the ASN gateway 1182 may be defined as an R6 reference point. The R6 reference point may include protocols for facilitating mobility management based on mobility events associated with each of the WTRUs 1102a, 1102b, and/or 1102c.
As shown in Figure 11E, the RAN 1105 may be connected to the core network 1109. The communication link between the RAN 1105 and the core network 1109 may be defined as an R3 reference point that includes protocols for facilitating data transfer and mobility management capabilities, for example. The core network 1109 may include a mobile IP home agent (MIP-HA) 1184, an authentication, authorization, accounting (AAA) server 1186, and a gateway 1188. While each of the foregoing elements is depicted as part of the core network 1109, it will be appreciated that any one of these elements may be owned and/or operated by an entity other than the core network operator.
The MIP-HA 1184 may be responsible for IP address management and may enable the WTRUs 1102a, 1102b, and/or 1102c to roam between different ASNs and/or different core networks. The MIP-HA 1184 may provide the WTRUs 1102a, 1102b, and/or 1102c with access to packet-switched networks, such as the Internet 1110, to facilitate communications between the WTRUs 1102a, 1102b, and/or 1102c and IP-enabled devices. The AAA server 1186 may be responsible for user authentication and for supporting user services. The gateway 1188 may facilitate interworking with other networks. For example, the gateway 1188 may provide the WTRUs 1102a, 1102b, and/or 1102c with access to circuit-switched networks, such as the PSTN 1108, to facilitate communications between the WTRUs 1102a, 1102b, and/or 1102c and traditional landline communications devices. In addition, the gateway 1188 may provide the WTRUs 1102a, 1102b, and/or 1102c with access to the networks 1112, which may include other wired or wireless networks that are owned and/or operated by other service providers.
Although not shown in Figure 11E, it will be appreciated that the RAN 1105 may be connected to other ASNs and the core network 1109 may be connected to other core networks. The communication link between the RAN 1105 and the other ASNs may be defined as an R4 reference point, which may include protocols for coordinating the mobility of the WTRUs 1102a, 1102b, and/or 1102c between the RAN 1105 and the other ASNs. The communication link between the core network 1109 and the other core networks may be defined as an R5 reference point, which may include protocols for facilitating interworking between home core networks and visited core networks.
Although features and elements are described above in particular combinations, each feature or element may be used alone without the other features and elements or in various combinations with other features and elements. In addition, the methods described herein may be implemented in a computer program, software, or firmware incorporated in a computer-readable medium for execution by a computer or processor. Examples of computer-readable media include electronic signals (transmitted over wired or wireless connections) and computer-readable storage media. Examples of computer-readable storage media include, but are not limited to, read-only memory (ROM), random-access memory (RAM), registers, cache memory, semiconductor memory devices, magnetic media such as internal hard disks and removable disks, magneto-optical media, and optical media such as CD-ROM disks and digital versatile disks (DVDs). A processor in association with software may be used to implement a radio frequency transceiver for use in a WTRU, UE, terminal, base station, RNC, or any host computer.

Claims (26)

1. A video coding device comprising:
a processor configured to:
receive a picture associated with a first color space that covers a first color volume, wherein the picture comprises a first component of a first sampling location, a second component of a second sampling location, and the second component of a third sampling location;
apply a first interpolation filter to the second component of the second sampling location and the second component of the third sampling location to determine the second component of the first sampling location, wherein the second component of the first sampling location is associated with the first color space, wherein when the first component is a luma component, the first sampling location is a luma sampling location and the first interpolation filter is applied to chroma components of the second sampling location and the third sampling location to determine a chroma component of the luma sampling location, and when the first component is a chroma component, the first sampling location is a chroma sampling location and the first interpolation filter is applied to luma components of the second sampling location and the third sampling location to determine a luma component of the chroma sampling location; and
apply a color conversion model to the first component of the first sampling location and the second component of the first sampling location to translate the first component of the first sampling location to a second color space, the second color space covering a larger color volume than the first color volume.
2. The video coding device of claim 1, wherein the first component is a luma component and the second component is a first chroma component or a second chroma component; or wherein the first component is the first chroma component or the second chroma component and the second component is the luma component.
3. The video coding device of claim 2, wherein the processor being configured to apply the first interpolation filter comprises the processor being configured to:
add the second component of the second sampling location, the second component of the third sampling location, and 1 to determine a sum; and
divide the sum by 2.
4. The video coding device of claim 1, wherein the processor being configured to apply the first interpolation filter comprises the processor being configured to:
multiply the second component of the second sampling location by 3;
add the multiplied second component of the second sampling location, the second component of the third sampling location, and 2 to determine a sum; and
divide the sum by 4.
5. The video coding device of claim 1, wherein the picture comprises the second component of a fourth sampling location and the second component of a fifth sampling location; and
wherein the processor is configured to apply the first interpolation filter to the second component of the second sampling location, the second component of the third sampling location, the second component of the fourth sampling location, and the second component of the fifth sampling location to determine the second component of the first sampling location.
6. The video coding device of claim 5, wherein the processor being configured to apply the first interpolation filter comprises the processor being configured to:
add the second component of the second sampling location and the second component of the third sampling location to determine a first sum;
add the second component of the fourth sampling location and the second component of the fifth sampling location to determine a second sum;
multiply the second sum by 3 to determine a third sum;
add the first sum, the third sum, and 4 to determine a fourth sum; and
divide the fourth sum by 8.
7. The video coding device of claim 1, wherein the picture comprises a third component of the second sampling location and the third component of the third sampling location, wherein the first component is a luma component, the second component is a first chroma component, and the third component is a second chroma component, and wherein the processor is further configured to:
apply the first interpolation filter to the third component of the second sampling location and the third component of the third sampling location to determine the third component of the first sampling location, wherein the third component of the first sampling location is associated with the first color space; and
apply the color conversion model to the first component of the first sampling location, the second component of the first sampling location, and the third component of the first sampling location to translate the first component of the first sampling location to the second color space.
8. The video coding device of claim 1, wherein the picture comprises a third component of the first sampling location, and wherein the processor is further configured to:
apply the color conversion model to the first component of the first sampling location, the third component of the first sampling location, and the second component of the first sampling location to translate the first component of the first sampling location to the second color space.
9. The video coding device of claim 8, wherein the first component is a first chroma component, the second component is a luma component, and the third component is a second chroma component; or wherein the first component is the second chroma component, the second component is the luma component, and the third component is the first chroma component.
10. The video coding device of claim 1, wherein the picture is characterized by a 4:2:0 chroma format.
11. The video coding device of claim 1, wherein the color conversion model is based on a three-dimensional look-up table (LUT).
12. The video coding device of claim 1, wherein the processor is further configured to receive a scalable bitstream, the scalable bitstream comprising a base layer and an enhancement layer, wherein the base layer comprises the picture, the base layer is associated with the first color space, and the enhancement layer is associated with the second color space.
13. A video coding device comprising:
a processor configured to:
receive a picture associated with a first color space that covers a first color volume, wherein the picture comprises a first chroma component of a first sampling location, a second chroma component of the first sampling location, a luma component of a second sampling location, the luma component of a third sampling location, the luma component of a fourth sampling location, and the luma component of a fifth sampling location;
apply an interpolation filter to two or more of the following to determine the luma component of the first sampling location of the first chroma component: the luma component of the second sampling location, the luma component of the third sampling location, the luma component of the fourth sampling location, and the luma component of the fifth sampling location, wherein the luma component of the first sampling location is associated with the first color space;
apply a color conversion model to the first chroma component of the first sampling location, the second chroma component of the first sampling location, and the luma component of the first sampling location to translate the first chroma component of the first sampling location to a second color space, the second color space covering a larger color volume than the first color volume; and
apply the color conversion model to the first chroma component of the first sampling location, the second chroma component of the first sampling location, and the luma component of the first sampling location to translate the second chroma component of the first sampling location to the second color space.
14. A video coding method comprising:
receiving a picture associated with a first color space that covers a first color volume, wherein the picture comprises a first component of a first sampling location, a second component of a second sampling location, and the second component of a third sampling location;
applying a first interpolation filter to the second component of the second sampling location and the second component of the third sampling location to determine the second component of the first sampling location, wherein the second component of the first sampling location is associated with the first color space, wherein when the first component is a luma component, the first sampling location is a luma sampling location and the first interpolation filter is applied to chroma components of the second sampling location and the third sampling location to determine a chroma component of the luma sampling location, and when the first component is a chroma component, the first sampling location is a chroma sampling location and the first interpolation filter is applied to luma components of the second sampling location and the third sampling location to determine a luma component of the chroma sampling location; and
applying a color conversion model to the first component of the first sampling location and the second component of the first sampling location to translate the first component of the first sampling location to a second color space, the second color space covering a larger color volume than the first color volume.
15. The video coding method of claim 14, wherein the first component is a luma component and the second component is a first chroma component or a second chroma component; or wherein the first component is the first chroma component or the second chroma component and the second component is the luma component.
16. The video coding method of claim 15, wherein applying the first interpolation filter comprises:
adding the second component of the second sampling location, the second component of the third sampling location, and 1 to determine a sum; and
dividing the sum by 2.
17. The video coding method of claim 14, wherein applying the first interpolation filter comprises:
multiplying the second component of the second sampling location by 3;
adding the multiplied second component of the second sampling location, the second component of the third sampling location, and 2 to determine a sum; and
dividing the sum by 4.
18. The video coding method of claim 14, wherein the picture comprises the second component of a fourth sampling location and the second component of a fifth sampling location, and wherein the method further comprises:
applying the first interpolation filter to the second component of the second sampling location, the second component of the third sampling location, the second component of the fourth sampling location, and the second component of the fifth sampling location to determine the second component of the first sampling location.
19. The video coding method of claim 18, wherein applying the first interpolation filter comprises:
adding the second component of the second sampling location and the second component of the third sampling location to determine a first sum;
adding the second component of the fourth sampling location and the second component of the fifth sampling location to determine a second sum;
multiplying the second sum by 3 to determine a third sum;
adding the first sum, the third sum, and 4 to determine a fourth sum; and
dividing the fourth sum by 8.
20. The video coding method of claim 14, wherein the picture comprises a third component of the second sampling location and the third component of the third sampling location, wherein the first component is a luma component, the second component is a first chroma component, and the third component is a second chroma component, and wherein the method further comprises:
applying the first interpolation filter to the third component of the second sampling location and the third component of the third sampling location to determine the third component of the first sampling location, wherein the third component of the first sampling location is associated with the first color space; and
applying the color conversion model to the first component of the first sampling location, the second component of the first sampling location, and the third component of the first sampling location to translate the first component of the first sampling location to the second color space.
21. The video coding method of claim 14, wherein the picture comprises a third component of the first sampling location, and wherein the method further comprises:
applying the color conversion model to the first component of the first sampling location, the third component of the first sampling location, and the second component of the first sampling location to translate the first component of the first sampling location to the second color space.
22. The video coding method of claim 21, wherein the first component is a first chroma component, the second component is a luma component, and the third component is a second chroma component; or wherein the first component is the second chroma component, the second component is the luma component, and the third component is the first chroma component.
23. The video coding method of claim 14, wherein the picture is characterized by a 4:2:0 chroma format.
24. The video coding method of claim 14, wherein the color conversion model is based on a three-dimensional look-up table (LUT).
25. The video coding method of claim 14, further comprising:
receiving a scalable bitstream, the scalable bitstream comprising a base layer and an enhancement layer, wherein the base layer comprises the picture, the base layer is associated with the first color space, and the enhancement layer is associated with the second color space.
26. A video coding method comprising:
receiving a picture associated with a first color space that covers a first color volume, wherein the picture comprises a first chroma component of a first sampling location, a second chroma component of the first sampling location, a luma component of a second sampling location, the luma component of a third sampling location, the luma component of a fourth sampling location, and the luma component of a fifth sampling location;
applying an interpolation filter to two or more of the following to determine the luma component of the first sampling location of the first chroma component: the luma component of the second sampling location, the luma component of the third sampling location, the luma component of the fourth sampling location, and the luma component of the fifth sampling location, wherein the luma component of the first sampling location is associated with the first color space;
applying a color conversion model to the first chroma component of the first sampling location, the second chroma component of the first sampling location, and the luma component of the first sampling location to translate the first chroma component of the first sampling location to a second color space, the second color space covering a larger color volume than the first color volume; and
applying the color conversion model to the first chroma component of the first sampling location, the second chroma component of the first sampling location, and the luma component of the first sampling location to translate the second chroma component of the first sampling location to the second color space.
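The two-tap filters recited step by step in claims 3/16 and 4/17 reduce to simple weighted averages. The Python sketch below is illustrative only: the function names are invented here, integer sample values are assumed, and the right-shifts are an equivalent rewriting of the recited add-offset-then-divide arithmetic.

```python
def interp_2tap_average(c2, c3):
    # Claims 3/16: (c2 + c3 + 1) / 2, i.e. a [1, 1] / 2 filter; the +1
    # makes the integer division round to nearest.
    return (c2 + c3 + 1) >> 1

def interp_2tap_weighted(c2, c3):
    # Claims 4/17: (3 * c2 + c3 + 2) / 4, i.e. a [3, 1] / 4 filter; the
    # sample weighted by 3 is presumably the one nearer the target
    # sampling location, and the +2 is again a rounding offset.
    return (3 * c2 + c3 + 2) >> 2
```

The [1, 1] / 2 filter suits a target location midway between the two input samples; the [3, 1] / 4 filter suits a target location at a quarter-sample offset.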
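Claims 6/19 recite a four-tap filter as a sequence of named sums. The same sequence in Python, again with invented names; the composite weights work out to [1, 1, 3, 3] / 8:

```python
def interp_4tap(c2, c3, c4, c5):
    # Claims 6/19: ((c2 + c3) + 3 * (c4 + c5) + 4) / 8.
    first_sum = c2 + c3                      # unweighted pair
    second_sum = c4 + c5                     # pair weighted by 3,
    third_sum = 3 * second_sum               # presumably the nearer neighbors
    fourth_sum = first_sum + third_sum + 4   # +4 is the rounding offset
    return fourth_sum >> 3                   # divide by 8
```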
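Claims 11/24 state only that the color conversion model is based on a 3D LUT; the grid size and the interpolation between LUT vertices are not fixed by these claims. The sketch below assumes a uniform grid and trilinear interpolation, a common choice for gamut mapping; apply_3d_lut, its argument layout, and the normalization are assumptions made here for illustration, not the patent's specified procedure.

```python
import numpy as np

def apply_3d_lut(y, cb, cr, lut, bit_depth=8):
    """Map one (Y, Cb, Cr) triplet through a 3D LUT with trilinear
    interpolation. lut has shape (S, S, S, 3): an output triplet is
    stored at each of the S x S x S grid vertices."""
    size = lut.shape[0]
    max_val = (1 << bit_depth) - 1
    # Continuous position of the input inside the LUT grid.
    coords = [c / max_val * (size - 1) for c in (y, cb, cr)]
    base = [min(int(c), size - 2) for c in coords]  # lower cube corner
    frac = [c - b for c, b in zip(coords, base)]    # offsets inside the cube
    out = np.zeros(3)
    for corner in range(8):                         # 8 corners of the cube
        weight, idx = 1.0, []
        for d in range(3):
            bit = (corner >> d) & 1
            idx.append(base[d] + bit)
            weight *= frac[d] if bit else 1.0 - frac[d]
        out += weight * lut[idx[0], idx[1], idx[2]]
    return out
```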
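Claims 13/26 interpolate a luma value at the chroma sampling location before the color conversion model is applied, so that all three inputs to the model are phase aligned; the claims themselves require only that two or more of four neighboring luma samples be filtered. The sketch below assumes the common 4:2:0 siting in which chroma shares the left luma column and sits vertically halfway between two luma rows, and reuses the claims-3/16 averaging filter; the function name and indexing convention are invented here.

```python
def luma_at_chroma_site(luma, x_c, y_c):
    # Assumed 4:2:0 siting: chroma sample (x_c, y_c) aligns with luma
    # column 2*x_c and lies between luma rows 2*y_c and 2*y_c + 1.
    x = 2 * x_c
    above = luma[2 * y_c][x]          # luma sample above the chroma site
    below = luma[2 * y_c + 1][x]      # luma sample below the chroma site
    return (above + below + 1) >> 1   # claims-3/16 averaging filter
```

When four neighboring luma samples are used instead of two, the four-tap filter of claims 6/19 applies at this step in the same way.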
CN201480068000.3A 2013-12-13 2014-12-12 Color gamut scalable video coding device and method for the phase alignment of luma and chroma using interpolation Active CN105830440B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810287571.6A CN108337519B (en) 2013-12-13 2014-12-12 Video encoding apparatus and method

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201361915892P 2013-12-13 2013-12-13
US61/915,892 2013-12-13
PCT/US2014/069898 WO2015089352A1 (en) 2013-12-13 2014-12-12 Color gamut scalable video coding device and method for the phase alignment of luma and chroma using interpolation

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201810287571.6A Division CN108337519B (en) 2013-12-13 2014-12-12 Video encoding apparatus and method

Publications (2)

Publication Number Publication Date
CN105830440A CN105830440A (en) 2016-08-03
CN105830440B true CN105830440B (en) 2018-06-12

Family

ID=52345528

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810287571.6A Active CN108337519B (en) 2013-12-13 2014-12-12 Video encoding apparatus and method
CN201480068000.3A Active CN105830440B (en) 2013-12-13 2014-12-12 The colour gamut scalable video decoding device and method of the phase alignment of brightness and coloration are carried out using interpolation

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201810287571.6A Active CN108337519B (en) 2013-12-13 2014-12-12 Video encoding apparatus and method

Country Status (9)

Country Link
US (3) US9912925B2 (en)
EP (1) EP3080985B1 (en)
JP (3) JP6240331B2 (en)
KR (2) KR20170113713A (en)
CN (2) CN108337519B (en)
AU (1) AU2014362246B2 (en)
MX (1) MX358758B (en)
TW (1) TWI628959B (en)
WO (1) WO2015089352A1 (en)

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9948916B2 (en) 2013-10-14 2018-04-17 Qualcomm Incorporated Three-dimensional lookup table based color gamut scalability in multi-layer video coding
US10531105B2 (en) 2013-12-17 2020-01-07 Qualcomm Incorporated Signaling partition information for 3D lookup table for color gamut scalability in multi-layer video coding
US9756337B2 (en) 2013-12-17 2017-09-05 Qualcomm Incorporated Signaling color values for 3D lookup table for color gamut scalability in multi-layer video coding
US10448029B2 (en) * 2014-04-17 2019-10-15 Qualcomm Incorporated Signaling bit depth values for 3D color prediction for color gamut scalability
US10455230B2 (en) 2015-01-09 2019-10-22 Avago Technologies International Sales Pte. Limited Methods for improving low-cost video/image compression
US10127887B2 (en) * 2015-01-14 2018-11-13 Intel Corporation Acceleration of color conversion
US11102495B2 (en) * 2016-05-17 2021-08-24 Qualcomm Incorporated Methods and systems for generating and processing content color volume messages for video
US11778231B2 (en) 2016-05-26 2023-10-03 Vid Scale, Inc. Geometric conversion for 360-degree video coding
US10225536B2 (en) 2016-07-01 2019-03-05 Intel Corporation Dynamic fidelity updates for encoded displays
US10219002B2 (en) * 2016-07-01 2019-02-26 Intel Corporation Dynamic fidelity updates for encoded displays
US11449256B2 (en) 2018-05-15 2022-09-20 Samsung Electronics Co., Ltd. Method for accelerating image storing and retrieving differential latency storage devices based on access rates
US10949087B2 (en) 2018-05-15 2021-03-16 Samsung Electronics Co., Ltd. Method for rapid reference object storage format for chroma subsampled images
WO2019229683A1 (en) 2018-05-31 2019-12-05 Beijing Bytedance Network Technology Co., Ltd. Concept of interweaved prediction
CN113597760A (en) 2019-01-02 2021-11-02 北京字节跳动网络技术有限公司 Method for video processing
CN113728633B (en) 2019-04-01 2022-12-20 北京字节跳动网络技术有限公司 Using interpolation filters for history-based motion vector prediction
WO2020207493A1 (en) 2019-04-12 2020-10-15 Beijing Bytedance Network Technology Co., Ltd. Transform coding based on matrix-based intra prediction
WO2020211807A1 (en) 2019-04-16 2020-10-22 Beijing Bytedance Network Technology Co., Ltd. Matrix derivation in intra coding mode
CN117097912A (en) 2019-05-01 2023-11-21 北京字节跳动网络技术有限公司 Matrix-based intra-prediction context coding
WO2020221373A1 (en) 2019-05-01 2020-11-05 Beijing Bytedance Network Technology Co., Ltd. Matrix-based intra prediction using filtering
EP3954115A4 (en) 2019-05-22 2023-04-19 Beijing Bytedance Network Technology Co., Ltd. Matrix-based intra prediction using upsampling
CN114051735A (en) 2019-05-31 2022-02-15 北京字节跳动网络技术有限公司 One-step downsampling process in matrix-based intra prediction
EP3963885A4 (en) 2019-06-05 2022-12-14 Beijing Bytedance Network Technology Co., Ltd. Context determination for matrix-based intra prediction
CN117579830A (en) 2019-06-21 2024-02-20 北京字节跳动网络技术有限公司 Selective use of adaptive intra-annular color space conversion and other codec tools
TWI737364B (en) 2019-06-28 2021-08-21 聯發科技股份有限公司 Method and apparatus of matrix based intra prediction in image and video processing
JP7359942B2 (en) 2019-08-20 2023-10-11 北京字節跳動網絡技術有限公司 Selective use of alternative interpolation filters in video processing
CN114641997A (en) 2019-10-28 2022-06-17 北京字节跳动网络技术有限公司 Color component based syntax signaling and parsing
CN115152219A (en) 2019-11-07 2022-10-04 抖音视界有限公司 Quantization characteristics of adaptive in-loop color space transforms for video coding
CN112017105B (en) * 2020-08-13 2024-03-26 深圳市洲明科技股份有限公司 Color correction FPGA implementation device and method, color correction equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5737032A (en) * 1995-09-05 1998-04-07 Videotek, Inc. Serial digital video processing with concurrent adjustment in RGB and luminance/color difference
CN1620150A (en) * 2003-11-20 2005-05-25 三星电子株式会社 Apparatus and method for controlling the colors of a color image
CN1859576A (en) * 2005-10-11 2006-11-08 华为技术有限公司 Top sampling method and its system for space layered coding video image
CN101005620A (en) * 2004-09-03 2007-07-25 微软公司 Reform in decoding macroblock and movement information for interlaced scanning and progressive seanning video code
CN101449476A (en) * 2005-03-18 2009-06-03 夏普株式会社 Methods and systems for reducing blocking artifacts with reduced complexity for spatially-scalable video coding
CN101977316A (en) * 2010-10-27 2011-02-16 无锡中星微电子有限公司 Telescopic coding method
CN103167295A (en) * 2011-12-16 2013-06-19 三星电子株式会社 Method and apparatus for processing image signal

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5583656A (en) 1992-12-31 1996-12-10 Eastman Kodak Company Methods and apparatus for attaching compressed look-up table (LUT) representations of N to M-dimensional transforms to image data and for processing image data utilizing the attached compressed LUTs
JP2933487B2 (en) * 1994-07-15 1999-08-16 松下電器産業株式会社 How to convert chroma format
US5712687A (en) * 1996-04-25 1998-01-27 Tektronix, Inc. Chrominance resampling for color images
US6400843B1 (en) 1999-04-22 2002-06-04 Seiko Epson Corporation Color image reproduction with accurate inside-gamut colors and enhanced outside-gamut colors
US6301393B1 (en) 2000-01-21 2001-10-09 Eastman Kodak Company Using a residual image formed from a clipped limited color gamut digital image to represent an extended color gamut digital image
RU2190306C2 (en) 2000-05-10 2002-09-27 Государственное унитарное предприятие "Калужский научно-исследовательский институт телемеханических устройств" Method and device for processing half-tone color picture using vector diffusion of chroma error
US7929610B2 (en) * 2001-03-26 2011-04-19 Sharp Kabushiki Kaisha Methods and systems for reducing blocking artifacts with reduced complexity for spatially-scalable video coding
US6741263B1 (en) * 2001-09-21 2004-05-25 Lsi Logic Corporation Video sampling structure conversion in BMME
US7643675B2 (en) 2003-08-01 2010-01-05 Microsoft Corporation Strategies for processing image information using a color information data structure
US7659911B2 (en) 2004-04-21 2010-02-09 Andreas Wittenstein Method and apparatus for lossless and minimal-loss color conversion
US8774269B2 (en) 2006-10-25 2014-07-08 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Quality scalable coding with mapping different ranges of bit depths
US8233536B2 (en) * 2007-01-23 2012-07-31 Sharp Laboratories Of America, Inc. Methods and systems for multiplication-free inter-layer image prediction
JP2008211310A (en) * 2007-02-23 2008-09-11 Seiko Epson Corp Image processing apparatus and image display device
US8625676B2 (en) 2007-06-29 2014-01-07 Pai Kung Limited Liability Company Video bitstream decoding using least square estimates
US7684084B2 (en) * 2007-09-25 2010-03-23 Xerox Corporation Multiple dimensional color conversion to minimize interpolation error
US8446961B2 (en) * 2008-07-10 2013-05-21 Intel Corporation Color gamut scalability techniques
US9538176B2 (en) 2008-08-08 2017-01-03 Dolby Laboratories Licensing Corporation Pre-processing for bitdepth and color format scalable video coding
JP2012520045A (en) 2009-03-09 2012-08-30 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Multi-primary conversion
US9185422B2 (en) 2010-07-15 2015-11-10 Qualcomm Incorporated Variable localized bit-depth increase for fixed-point transforms in video coding
JP5663093B2 (en) 2010-10-01 2015-02-04 ドルビー ラボラトリーズ ライセンシング コーポレイション Optimized filter selection for reference picture processing
GB2495468B (en) 2011-09-02 2017-12-13 Skype Video coding
US11184623B2 (en) 2011-09-26 2021-11-23 Texas Instruments Incorporated Method and system for lossless coding mode in video coding
WO2013106190A1 (en) 2012-01-09 2013-07-18 Dolby Laboratories Licensing Corporation Hybrid reference picture reconstruction method for single and multiple layered video coding systems
FR2989856B1 (en) * 2012-04-23 2014-11-28 Assistance Tech Et Etude De Materiels Electroniques PROGRESSIVE COMPRESSION / DECOMPRESSION OF VIDEO DIGITAL STREAM COMPRISING AT LEAST ONE INTERLACED IMAGE
PL2941872T3 (en) * 2013-01-02 2019-03-29 Dolby Laboratories Licensing Corporation Backward-compatible coding for ultra high definition video signals with enhanced dynamic range
US9673936B2 (en) 2013-03-15 2017-06-06 Google Inc. Method and system for providing error correction to low-latency streaming video
KR102481406B1 (en) 2013-04-08 2022-12-27 돌비 인터네셔널 에이비 Method for encoding and method for decoding a lut and corresponding devices
EP3386179A1 (en) 2013-09-20 2018-10-10 VID SCALE, Inc. Systems and methods for providing 3d look-up table coding for color gamut scalability
US9756337B2 (en) 2013-12-17 2017-09-05 Qualcomm Incorporated Signaling color values for 3D lookup table for color gamut scalability in multi-layer video coding

Also Published As

Publication number Publication date
JP6463820B2 (en) 2019-02-06
KR20170113713A (en) 2017-10-12
US10440340B2 (en) 2019-10-08
JP2019050640A (en) 2019-03-28
CN108337519B (en) 2021-12-03
EP3080985B1 (en) 2023-08-02
EP3080985A1 (en) 2016-10-19
WO2015089352A1 (en) 2015-06-18
KR101786414B1 (en) 2017-10-17
TWI628959B (en) 2018-07-01
AU2014362246B2 (en) 2018-01-25
JP2018038076A (en) 2018-03-08
US20150172616A1 (en) 2015-06-18
JP2017505028A (en) 2017-02-09
US20180146181A1 (en) 2018-05-24
CN105830440A (en) 2016-08-03
MX2016007619A (en) 2016-12-08
CN108337519A (en) 2018-07-27
US9912925B2 (en) 2018-03-06
US20190379870A1 (en) 2019-12-12
MX358758B (en) 2018-09-04
KR20160096712A (en) 2016-08-16
JP6240331B2 (en) 2017-11-29
TW201537992A (en) 2015-10-01
AU2014362246A1 (en) 2016-07-07

Similar Documents

Publication Publication Date Title
CN105830440B (en) Color gamut scalable video coding device and method for the phase alignment of luma and chroma using interpolation
US10986370B2 (en) Combined scalability processing for multi-layer video coding
CN105556943B (en) Systems and methods for providing 3D look-up table coding for color gamut scalability
EP3090540B1 (en) Color space conversion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant