GB2539241B - Video processing system - Google Patents


Info

Publication number
GB2539241B
GB2539241B
Authority
GB
United Kingdom
Prior art keywords
resolution
pixel data
reference frame
frame
data representing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
GB1510168.6A
Other versions
GB201510168D0 (en)
GB2539241A (en)
Inventor
Edsö Tomas
Hugosson Ola
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
ARM Ltd
Original Assignee
ARM Ltd
Advanced Risc Machines Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ARM Ltd and Advanced Risc Machines Ltd
Priority to GB1510168.6A priority Critical patent/GB2539241B/en
Publication of GB201510168D0 publication Critical patent/GB201510168D0/en
Priority to KR1020160070722A priority patent/KR20160146542A/en
Priority to US15/177,685 priority patent/US10440360B2/en
Priority to EP16174023.8A priority patent/EP3104613A1/en
Priority to CN201610407439.5A priority patent/CN106254877B/en
Publication of GB2539241A publication Critical patent/GB2539241A/en
Application granted granted Critical
Publication of GB2539241B publication Critical patent/GB2539241B/en
Legal status: Active


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/107 Selection of coding mode or of prediction mode between spatial and temporal predictive coding, e.g. picture refresh
    • H04N19/59 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving spatial sub-sampling or interpolation, e.g. alteration of picture size or resolution
    • H04N19/115 Selection of the code volume for a coding unit prior to coding
    • H04N19/132 Sampling, masking or truncation of coding units, e.g. adaptive resampling, frame skipping, frame interpolation or high-frequency transform coefficient masking
    • H04N19/136 Incoming video signal characteristics or properties
    • H04N19/172 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, the unit being an image region, the region being a picture, frame or field
    • H04N19/176 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, the unit being an image region, the region being a block, e.g. a macroblock
    • H04N19/182 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, the unit being a pixel
    • H04N19/33 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability in the spatial domain
    • H04N19/40 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video transcoding, i.e. partial or full decoding of a coded input stream followed by re-encoding of the decoded output stream
    • H04N19/423 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, characterised by memory arrangements
    • H04N19/433 Hardware specially adapted for motion estimation or compensation characterised by techniques for memory access
    • H04N19/53 Multi-resolution motion estimation; Hierarchical motion estimation
    • H04N19/85 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
    • H04N19/44 Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
    • H04N19/63 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding using sub-band based transform, e.g. wavelets
    • H04N19/96 Tree coding, e.g. quad-tree coding

Description

Video Processing System
The present invention relates to the processing of video data. More particularly, this invention relates to a method of and apparatus for processing frames of video data.
It is known in the art for video processing systems to generate frames of video data, typically for provision on an electronic display. A video frame is typically represented as a rectangular array of pixels (picture elements) representing an image, where the colour value to be used for each pixel is indicated using an appropriate colour space, e.g. the RGB colour space.
Storing the pixel data for a large number of video frames requires a large amount of memory. Accordingly, a number of video encoding methods have been developed to allow the pixel data representing the video frames to be stored in a compressed form.
According to many such video encoding methods, frames in a sequence of video frames are differentially encoded (i.e. in terms of their differences) relative to a so-called “reference frame”, which is a frame for which full resolution pixel data is stored, and which is not defined with reference to another frame. Typically, plural frames of a sequence of video frames are relatively defined with respect to a single reference frame, and this arrangement is repeated over the sequence of video frames.
One such video encoding method uses so-called “motion estimation”, wherein a given frame is divided into plural blocks of, e.g., 16x16 pixels, and each block of pixels is encoded with a vector value (the so-called “motion vector”) pointing to a corresponding block of pixels in the reference frame, and data (the so-called “residual”) describing the differences between the current frame pixel block and the corresponding pixel block in the reference frame. This thereby allows the pixel data for the pixel block of the current frame to be constructed from the pixel data for the pixel block in the reference frame that is pointed to by the motion vector and the residual data describing the differences between that pixel data and the pixel data of the current video frame.
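By way of illustration only, the block reconstruction described above can be sketched as follows. This is a simplified, hypothetical example and not the claimed implementation: it uses 2x2 blocks of single greyscale values, integer motion vectors, and illustrative names throughout.

```python
# Illustrative sketch of motion-compensated block reconstruction:
# pixel block = reference block pointed to by the motion vector + residual.
# All names, the 2x2 block size and the greyscale values are hypothetical.

BLOCK = 2  # 2x2 blocks for brevity; codecs typically use e.g. 16x16


def reconstruct_block(reference, block_x, block_y, motion_vector, residual):
    """Rebuild one pixel block of the current frame from the reference
    frame block indicated by the motion vector, plus the residual."""
    mvx, mvy = motion_vector
    out = []
    for y in range(BLOCK):
        row = []
        for x in range(BLOCK):
            ref_pixel = reference[block_y + mvy + y][block_x + mvx + x]
            row.append(ref_pixel + residual[y][x])
        out.append(row)
    return out


# Reference frame (4x4 greyscale values) and an encoded block whose best
# match lies one pixel to the right in the reference frame.
ref = [[10, 20, 30, 40],
       [11, 21, 31, 41],
       [12, 22, 32, 42],
       [13, 23, 33, 43]]
block = reconstruct_block(ref, 0, 0, (1, 0), [[1, -1], [0, 2]])
```

In a real codec the blocks would typically be larger (e.g. 16x16 pixels), per colour channel, and motion vectors may have sub-pixel precision.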
Employing video encoding methods which use reference frames can lead to a significant reduction in memory requirements compared to arrangements where the raw pixel values are stored for each and every frame in a sequence of video frames.
Typically, a new reference frame is generated periodically, e.g. for every N frames in a sequence of frames. The reference frame is then stored in memory for use by the system.
For example, the pixel data for the current reference frame will be generated and stored for use when encoding a sequence of video frames, e.g. to derive the motion vectors and residual values for the frames being encoded in dependence on the reference frame. (Typically, a new reference frame will replace the existing reference frame in memory.)
When decoding a sequence of video frames, e.g. to display a sequence of video frames, the current reference frame will be generated from the encoded video data and, e.g., displayed, and also stored for use when decoding other frames in the sequence that are dependent on the reference frame. (In order to correctly decode differentially encoded video frames, a full resolution reference frame is generated and stored in memory to be used during the decoding process.)
The Applicants believe that there remains scope for improvements to methods of and apparatus for processing frames of video data in a video processing system.
According to a first aspect of the present invention, there is provided a method of processing frames of video data in a video processing system as claimed in claim 1.
According to a second aspect of the present invention, there is provided an apparatus for processing frames of video data in a video processing system as claimed in claim 14.
The present invention relates to a method of and apparatus for processing frames of video data in a video processing system that employs differential encoding, i.e. where one or more frames in a sequence of video frames are defined with respect to a reference frame in the sequence. In the present invention, when all or part of a reference frame is required for decoding a sequence of video frames, pixel data representing all or part of the reference frame at a first resolution is generated and stored in memory, e.g., so that it can be used to define and/or decode other (e.g. subsequent) frames in the sequence of video frames in terms of their differences (residuals). However, in addition to generating and storing pixel data representing the reference frame at a first resolution, the present invention generates and stores pixel data representing all or part of the same reference frame (or part thereof) at one or more different resolutions to the first resolution.
The embodiments described herein generally refer to generating pixel data, inter alia, for the whole reference frame. However, this is not required. In the present invention, the pixel data that is generated and stored in memory can (and in some embodiments does) represent only a part of the reference frame at a first resolution and at least one different resolution.
As will be discussed further below, the Applicants have recognised that providing pixel data representing the same reference frame at at least two different resolutions can be advantageous at various stages of a video processing system, and can provide, e.g., an overall more efficient video processing system. For example, as will be discussed further below, generating pixel data representing the reference frame at not only a first resolution, but also at at least one second, e.g. lower, resolution can be used to facilitate more efficient compression of subsequently generated frames, or to reduce the number of memory accesses required when scaling a given frame for display, for example.
The sequence of video frames comprises one or more frames defined (encoded) with respect to a reference frame. The frames in the sequence can be in any suitable and desired order. However, in preferred embodiments the frames are in the order in which they are to be displayed.
The reference frame itself can be any frame from which one or more frames in the sequence of video frames are defined. In preferred embodiments, the reference frame can be any frame in the sequence of video frames. For example, the reference frame can be before or after the frame or frames in the sequence of frames that are defined with respect to the reference frame. In other arrangements, however, the reference frame is after one or more frames and before one or more other frames in the sequence of frames that are defined with respect to the reference frame.
As will be appreciated by those skilled in the art, there may be a set of many video frames to be decoded, including plural reference frames and respective other frames defined with respect to a respective reference frame. In this case, each reference frame and its related frames can be, and are preferably, treated as a sequence of frames in the manner of the present invention. (Correspondingly, a given overall set or sequence of video frames may be made up of (and treated as) plural sequences of video frames of the form of the present invention.)
The pixel data representing the reference frame to be used when decoding the sequence of video frames is generated (and stored in memory) as and when a reference frame is required. According to an embodiment of the present invention, decoding the sequence of video frames comprises decoding the one or more dependently encoded video frames in the sequence, e.g. to provide the frame(s) for display, (preferably) using the reference frame pixel data.
The pixel data representing the reference frame at a first resolution can be generated in any suitable and desired manner.
In preferred embodiments, the step of generating pixel data representing the reference frame at a first resolution comprises determining appropriate values (e.g. colour values) to be used for the pixels of the reference frame.
In one preferred embodiment, generating pixel data representing the reference frame at a first resolution comprises rendering the pixel values, e.g. using a texture map and/or other rendering processes.
In another preferred embodiment the step of generating pixel data for the reference frame comprises decoding encoded data representing the reference frame and generating therefrom pixel data representing the reference frame at a first resolution. This may be appropriate where the current reference frame is stored in memory in an encoded form. Thus, in preferred embodiments, generating pixel data representing the reference frame at a first resolution comprises first reading existing reference frame data from memory.
It will be appreciated that the resolution of the reference frame is dependent on the number of individual pixels within the reference frame, which in turn defines the amount of detail with which the reference frame can be represented.
The first resolution, at which the reference frame is represented by the generated pixel data, can be any suitable or desired resolution. However, the Applicants have recognised that it will normally be useful to generate pixel data representing the reference frame at the greatest amount of detail (the maximum resolution) that is possible, so that it can be used later on by the video processing system for decoding purposes, for example. Thus, according to preferred embodiments of the present invention, the first resolution is the highest resolution at which pixel data for the reference frame is generated. It is preferably the maximum resolution possible, or the resolution at which the reference frame is represented with the greatest detail (e.g. depending on the available memory etc.). This can be, and is preferably, for example, the resolution at which pixel data would have been generated if the video processing system was configured to generate pixel data representing the reference frame at only a single resolution.
The pixel data representing the reference frame at the at least one different resolution to the first resolution can be generated in any suitable and desired manner.
The pixel data representing the reference frame at at least one different resolution can be generated separately and/or independently from the pixel data representing the reference frame at the first resolution. For example, where the reference frame is generated for the first time, it is possible to individually render the pixel data representing the reference frame at each resolution. Thus, according to an embodiment of the present invention, the method comprises generating, e.g. rendering, pixel data representing the reference frame at a first resolution separately to generating, e.g. rendering, the pixel data representing the reference frame at the at least one different resolution to the first resolution.
However, in preferred embodiments, the pixel data representing the reference frame at at least one different resolution to the first resolution is generated based on the pixel data that has been generated for (and which represents) the reference frame at the first resolution. For example, the reference frame generation stage can be (and preferably is) configured to first generate pixel data representing the reference frame at a first resolution (as discussed above), and to generate pixel data representing the reference frame at at least one different resolution that is different to the first resolution using (based on) the pixel data representing the reference frame at the first resolution.
The present invention can generate pixel data representing the reference frame at any, e.g. given or selected, number of different resolutions, and each resolution that is different to the first resolution can be any suitable or desired resolution (so long as it is not the first resolution). In preferred embodiments, the present invention generates and stores in memory pixel data representing at least one less detailed version of the reference frame than the first resolution version of the reference frame. Thus, according to preferred embodiments, the at least one different resolution to the first resolution comprises at least one lower resolution than the first resolution. For example, the pixel data generated for the reference frame could represent the reference frame at a first resolution of 1080x920 pixels and a second resolution of 720x576 pixels.
Accordingly, the pixel data representing the lower resolution version of the reference frame is preferably a consolidated (and preferably a downsampled (downscaled)) version of the pixel data representing the reference frame at the first resolution. Thus, according to preferred embodiments, the step of generating pixel data representing the reference frame at at least one different resolution to the first resolution comprises consolidating the pixel data representing the reference frame at the first resolution to generate consolidated pixel data representing the reference frame at at least one lower resolution than the first resolution.
Consolidating the pixel data can be done in any suitable or desired way. The consolidated pixel data representing the reference frame at one or more or all of the at least one lower resolutions than the first resolution can be generated from the pixel data representing the reference frame at the first resolution (i.e. the unconsolidated data). Alternatively, where consolidated pixel data representing the reference frame at two or more different, lower resolutions than the first resolution is to be generated, then the consolidated pixel data representing the reference frame at a lower resolution can be generated from the consolidated pixel data representing the reference frame at the preceding higher resolution, for example, and if desired.

In preferred embodiments, consolidating the pixel data comprises averaging or otherwise combining (filtering) the pixel data used to represent the reference frame at a higher resolution for a given, e.g. selected, number of pixels of that higher resolution reference frame. For example, the pixel data can be divided into a plurality of pixel blocks, with each such block of, e.g., four pixels, then (and preferably) being represented by a single pixel having the average pixel (colour) value of the pixels in the block. However, other techniques, such as subsampling the pixel data representing the higher resolution reference frame, can also or instead be used, if desired.

In a preferred embodiment, pixel data representing the reference frame at at least three different resolutions is generated and stored, i.e. at the first resolution and at at least two different (and preferably lower) resolutions to the first resolution.

In a particularly preferred embodiment, the pixel data that is generated and stored, and which represents the reference frame at a first and at least one different resolution, is in the form of two or more or all levels of a mipmap set representing the reference frame.
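The consolidation step described above (averaging each block of four pixels into a single pixel) can be sketched as follows. This is a minimal illustration using single greyscale values, not the claimed implementation; a real system would operate per colour channel and might use other filters or subsampling instead.

```python
# Minimal sketch of pixel-data consolidation: each 2x2 block of pixels in
# the higher-resolution frame is replaced by one pixel holding the average
# value, halving the width and height. Greyscale values are hypothetical.

def downsample_2x2(frame):
    """Average each 2x2 pixel block into a single pixel, producing a
    half-width, half-height frame (dimensions assumed to be even)."""
    out = []
    for y in range(0, len(frame), 2):
        row = []
        for x in range(0, len(frame[0]), 2):
            avg = (frame[y][x] + frame[y][x + 1] +
                   frame[y + 1][x] + frame[y + 1][x + 1]) / 4
            row.append(avg)
        out.append(row)
    return out


full = [[0, 4, 8, 12],
        [0, 4, 8, 12],
        [2, 6, 10, 14],
        [2, 6, 10, 14]]
half = downsample_2x2(full)  # 2x2 consolidated version of the 4x4 frame
```

Applying the same function to its own output yields successively lower resolutions, which is one way the "preceding higher resolution" chain mentioned above could be realised.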
Equally, in preferred embodiments, the reference frame generation stage is configured to generate pixel data representing the reference frame as a mipmap set. A mipmap set comprises a sequence of frames, each of which is a progressively lower resolution (less detailed) representation of the (same) frame image. (Each version of the frame in the mipmap set is referred to herein as a mipmap level. Unless otherwise indicated, references herein to higher and lower mipmap levels refer to less and more detailed mipmaps, respectively.)
In a preferred embodiment, the height and width of each level of the mipmap set is a factor of two smaller than that of the previous level in the set, and each level has a resolution that is one fourth (in terms of the number of pixels) the resolution of the previous level. For example, if the reference frame has a size of 256 by 256 pixels at the first resolution, then the associated mipmap set may contain a series of 8 levels (different versions of the reference frame), wherein each level is one-fourth the total area of the previous one: 128x128 pixels, 64x64, 32x32, 16x16, 8x8, 4x4, 2x2, 1x1 (a single pixel).
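The level dimensions implied by this factor-of-two scheme can be enumerated as follows (an illustrative sketch only; the function name is hypothetical):

```python
# Enumerate the (width, height) of every mipmap level, assuming the
# factor-of-two scaling per level described above, down to a single pixel.

def mipmap_levels(width, height):
    """Return the list of level dimensions, highest resolution first."""
    levels = [(width, height)]
    while width > 1 or height > 1:
        width = max(1, width // 2)
        height = max(1, height // 2)
        levels.append((width, height))
    return levels


levels = mipmap_levels(256, 256)
# levels[0] is the first-resolution frame; levels[1:] are the 8 lower
# levels: 128x128, 64x64, 32x32, 16x16, 8x8, 4x4, 2x2, 1x1.
```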
Other scaling factors could be used for the mipmap set if desired.
The Applicants have recognised in this regard that although a mipmap set, wherein each level of the mipmap has a strict scaling order, may not necessarily be the most ideal set of resolutions at which to represent the reference frame, it can still be beneficial to generate (and use) pixel data representing the reference frame as a mipmap set.
For example, although the resolution of a mipmap level may not match exactly the resolution of a display that the reference frame is to be displayed on, the Applicants have recognised that the relative ease of generating and handling or processing the representation of the reference frame as respective mipmap levels (compared to other, arbitrary resolutions) should outweigh any potential disadvantages that providing the reference frame in this form may have.
The pixel data that is generated and stored according to the present invention may represent an entire mipmap set for the reference frame (i.e. starting from the highest resolution version and including respective lower resolution versions for each mipmap level down to a single pixel (or its equivalent)). However, it is not required for the pixel data to represent each possible level of the mipmap.
In an embodiment of the present invention, pixel data representing the reference frame at two or more levels of the mipmap is generated and stored in memory. In some embodiments, pixel data representing the reference frame at every other level of the mipmap, i.e. every other version of the reference frame in the sequence of reference frames that would make up a complete mipmap, is generated and stored in memory. In other embodiments, the reference frame generation stage is configured to generate pixel data representing only a particular, e.g. desired, e.g. selected, subset of the levels of the mipmap for the reference frame. In one preferred embodiment, pixel data representing the whole reference frame at at least one different resolution to the first resolution is generated.
However, this is not required. Thus, in other embodiments of the present invention, the pixel data representing the reference frame at at least one different resolution to the first resolution represents only a region or a part of the whole reference frame (image). For example, the reference frame generation stage can be configured to consolidate only a subset of the pixel data representing the reference frame at the first resolution.
The video processing system of the present invention can be configured to implement the embodiments described above (in relation to generating pixel data representing the reference frame at at least one different resolution to the first resolution) in any suitable or desired manner. For example, the process of generating pixel data representing the reference frame at at least one different resolution to the first resolution can be (and preferably is) triggered and configured by the appropriate driver on the CPU, based on its knowledge of the overall video processing system. In a preferred embodiment, the reference frame pixel data generation is configured based on how the pixel data for the reference frame is to be used by the video processing system.
For example, and preferably, one or more or all of the at least one different resolution to the first resolution that the reference frame is generated at is selected based on the resolution of an electronic display or displays on which the video sequence is to be displayed. In one preferred embodiment, the at least one different resolution to the first resolution corresponds to the resolution of the output frame to be displayed.
Additionally or alternatively (and preferably additionally), the number of different resolutions at which the reference frame is to be represented is preferably selected based on knowledge of the system. For example, the reference frame generation stage can be configured to generate pixel data representing the reference frame at a single resolution that is different to the first resolution, up to, e.g., as many different resolutions as there are different displays in the overall video processing system on which to display the video sequence.
The pixel data representing the reference frame can be generated by any suitable and desired component of the video processing system. In a preferred embodiment, the pixel data is generated by a suitable processor, such as a graphics processor, a video processor (video engine), a compositor or a display processor of the video processing system. Thus, the reference frame generation stage preferably comprises a graphics processor, a video processor (video engine), a composition engine (a compositor) or a display controller. The reference frame generation stage may comprise more than one processor if desired, and one or more or all of the processors may be operable in the manner of the present invention.
The reference frame generation stage may also include other components, such as a decompression stage (a decoding unit) and/or compression stage (an encoding unit), if desired (and in a preferred embodiment this is the case). In preferred embodiments, the pixel data representing the reference frame at the at least one different resolution that is different to the first resolution is generated by the same processor of the video processing system that generated the pixel data representing the reference frame at the first resolution. In other embodiments, however, the task of generating pixel data representing the reference frame at a first and at least one different resolution is divided amongst plural processors, and then performed substantially in parallel. In this regard, any desired number of reference frame generation processors can be provided, such as, for example, two reference frame generation processors, up to, e.g., as many reference frame generation processors as the number of different resolutions at which a given reference frame is to be represented. In preferred embodiments, the pixel data representing the reference frame at at least one different resolution is generated at substantially the same time as the first resolution pixel data, regardless of whether the pixel data is generated by a single or a plurality of reference frame generation stages. In this way, the video processing system should only need to read/access the full resolution reference frame once, rather than each time a different resolution frame is generated.
In a particularly preferred embodiment, the generation of pixel data at at least one different resolution to the first resolution is performed by an existing stage of the video processing system, preferably by a stage that would otherwise normally perform a scaling operation as part of its “normal” processing operations (that can then also be used to generate pixel data representing the same reference frame at different resolutions).
In this regard, the Applicants have recognised that for certain methods of motion estimation, such as hierarchical motion estimation, frames will undergo a scaling process wherein the size and resolution of the frames are reduced. For example, several versions of the same image are constructed, each having both dimensions reduced (scaled down) by a certain factor.
Thus, in a particularly preferred embodiment, the video processing system of the present invention is one which comprises hardware for performing scaling operations during motion estimation encoding and that hardware is used to generate the pixel data for the reference frame at at least one different resolution to the first resolution. Thus, in a preferred embodiment, the reference frame generation stage comprises scaling hardware that is used for hierarchical motion estimation encoding.
The scaling hardware used for hierarchical motion estimation encoding is a particularly preferred component of the video processing system for generating the pixel data representing the reference frame at different resolutions, as the scaling hardware already operates to perform scaling operations on video data. In this way, the scaling hardware used in hierarchical motion estimation encoding can be shared between (and re-used by) various components of the video processing system (such as one or more reference frame generation processors of the video processing system).
The generated pixel data for the reference frame is stored in memory. It may be stored in any suitable memory, such as, and preferably in, an appropriate frame buffer, so that it may then be read for later use by the video processing system.
In one embodiment, the generated pixel data is stored in local memory, preferably in an appropriate frame buffer, for the reference frame generation stage. This is preferably the case where the reference frame is immediately required, e.g. to decode a sequence of differentially encoded video frames. In another embodiment, the pixel data is stored in a main memory of the video processing system, such as a main memory of the host system that is using the video processing system, such as a hard disk or disks, or a solid state disk, some other form of external storage medium, such as an optical disk (DVD or CD-ROM), flash memory, hard drive, or stored remotely and accessed over a network connection, etc. In a preferred embodiment, the pixel data for the reference frame is stored in an encoded (and preferably compressed) form. In particularly preferred embodiments, the pixel data representing the reference frame is encoded using a block-based encoding format. For example, the reference frame can be (and preferably is) divided into a plurality of blocks (macroblocks) of pixels of the frame (e.g. 16x16 pixel blocks in the case of MPEG encoding) and the pixel data for each block is encoded individually. Preferably, the reference frame is encoded according to any one of the methods disclosed in the Applicants' US patent application US 2013/036290, and/or US patent US 8542939.
Thus, in preferred embodiments, the pixel data that represents the reference frame at the first and at at least one other resolution is stored in a compressed form. Correspondingly, the two or more or all levels of a mipmap set representing the reference frame are preferably stored in memory in a compressed form. In this regard, the pixel data (e.g. mipmap set) may be compressed in any suitable and desired way. In preferred embodiments, the pixel data (the layer (e.g. the lower layer) of the mipmap set) which represents the reference frame at full resolution (i.e. which represents the bit-exact version of the reference frame) is compressed using a lossless compression scheme (so that it can be used to decode subsequent frames). For the other resolutions (layer(s)), which represent the reference frame at the one or more other, e.g. lower, resolutions, a lossy compression scheme can be (and in preferred embodiments is) used.
Each level of the mipmap set may be compressed with any, e.g. selected or desired, compression scheme and/or compression rate. In a preferred embodiment, one or more of the methods disclosed in the Applicants' US patent application US 2013/036290, and/or US patent US 8542939 are used to compress the pixel data. The compression scheme and/or compression rate used to compress the reference frame data may be (and in some embodiments is) selected based on the resolution at which the pixel data being compressed represents the reference frame (e.g. which level of the mipmap is being compressed). In some embodiments the compression stage (the encoding unit) of the reference frame generation stage is configured to compress different resolution versions of the reference frame (different levels of the mipmap set) with different compression rates. For example, if the full resolution layer in compressed form is represented using 10 bits of data, then the other layers could (and in some embodiments would) use fewer bits, such as 8 bits (in effect, a guaranteed compression of 20%). Similarly, the lower resolution layer(s) could remove chroma information by, e.g., converting from 4:2:2 sampling to 4:2:0 sampling (in effect, a guaranteed compression of 25%).
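The 4:2:2 to 4:2:0 chroma reduction mentioned above could be sketched as follows. This is a hypothetical illustration only (the patent does not specify the conversion filter): each vertical pair of chroma rows is averaged, halving the stored chroma data, whereas real encoders may use more sophisticated filters:

```python
def chroma_422_to_420(chroma_plane):
    """Convert a 4:2:2 chroma plane (one row of chroma samples per
    luma row) to 4:2:0 by averaging each vertical pair of chroma
    rows, halving the chroma data. Averaging adjacent rows is one
    simple filter choice for illustration."""
    return [[(top + bot) // 2 for top, bot in zip(r0, r1)]
            for r0, r1 in zip(chroma_plane[0::2], chroma_plane[1::2])]
```

Since chroma accounts for half the data in 4:2:2 and a third in 4:2:0, halving the chroma rows yields the roughly 25% overall saving referred to above.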
Compressing the pixel data (e.g. mipmap set) representing the reference frame according to the embodiments described above is advantageous in that it reduces memory bandwidth further. It is also a particularly suitable arrangement for dealing with video content, given that video content is typically already compressed (which means high-frequency content has been removed, making it more suitable for compression).
As mentioned above, pixel data representing the reference frame at at least one different resolution to the first resolution can be used advantageously at various stages of the video processing system.
Thus, according to preferred embodiments, the method of the present invention further comprises processing at least one frame in the sequence of video frames, e.g. for display, using the pixel data representing the reference frame at at least one different resolution to the first resolution.
Accordingly, in preferred embodiments, the apparatus of the present invention further comprises a frame processing stage configured to process at least one frame in the sequence of video frames using the pixel data representing the reference frame at at least one different resolution to the first resolution. In this regard, the Applicants have recognised that pixel data representing the reference frame at different resolutions can be used advantageously on the decoding side of the video process, e.g. where the sequence of video frames is being processed for display. For example, it may be desirable to use pixel data representing the reference frame at at least one different (and preferably lower) resolution than the first resolution when providing an output frame or frames for display (and in a preferred embodiment, this is done).
This may be particularly true in arrangements where the current frame in the sequence of video frames is to be displayed at a resolution that is different to the first resolution of the reference frame. This may be the case, for example, when providing an output frame or frames for display on a display that has a resolution that is lower than the first resolution of the reference frame. Another example would be when providing an output frame or frames that are to be displayed in a portion (e.g. a window) of the overall display, wherein the portion of the display has a resolution that is lower than the first resolution of the reference frame. In such cases, the output frame representing the current video frame will correspond to a set of pixels having a lower resolution than the first resolution of the reference frame. (Thus the output frame that is being generated may, e.g., completely fill the display in question, or may be for a window that is being displayed on a display.)
As is known in the art, providing an output frame for display comprises, for example, loading pixel data into a suitable frame buffer, from where it can then be read and provided to a display. In many cases this will involve generating pixel data for the output frame based on at least the full resolution pixel data representing the corresponding reference frame.
However, according to an embodiment of the present invention, where the current frame is to be displayed at a lower resolution than the first resolution of the reference frame, instead of generating pixel data for the output frame using the pixel data for the first, i.e. full, resolution reference frame, pixel data representing a lower resolution reference frame can be (and preferably is) used for this purpose. Thus, according to an embodiment of the present invention, the step of processing at least one frame in the sequence of video frames comprises generating pixel data representing an output frame for display using the pixel data representing the reference frame at at least one lower resolution than the first resolution.
In some embodiments, the pixel data for the lower resolution reference frame can be used as the pixel data for the output frame. For example, where the current frame to be displayed in the sequence of video frames is (e.g. flagged as) the reference frame itself, then in order to generate the output frame, the video processing system (particularly the frame processing stage) will use only the lower resolution pixel data for the reference frame. (It will be appreciated that although in this embodiment only the lower resolution pixel data for the reference frame is used as an output frame, the video processing system (e.g. the video processing engine) may, and preferably does, still generate (and store in memory) pixel data representing the reference frame at the first, i.e. full, resolution so that it can be used to decode other frames in the sequence of video frames).
This could also be (and in some embodiments is) the case in arrangements where the current frame to be displayed is a differentially encoded frame and where the residuals for the current frame are equal to zero. For example, if a given block of the current frame is unchanged compared to the reference frame, then the video processing system can (and preferably will) use the pixel data for the corresponding block in the (lower resolution version of the) reference frame as the pixel data for the output frame. In an embodiment, one of the at least one lower resolution than the first resolution, at which pixel data for the reference frame is generated, corresponds to the resolution required for the output frame to be displayed. In such cases, pixel data representing the reference frame at the at least one lower resolution that corresponds to the output frame resolution can be (and preferably is) used (and e.g. outputted) directly as the pixel data for the output frame.
Thus, according to an embodiment of the present invention, generating pixel data representing an output frame for display using the pixel data representing the reference frame at at least one lower resolution than the first resolution comprises outputting pixel data representing the reference frame at one of the at least one lower resolution than the first resolution directly as the output frame.
However, in arrangements where the generated lower resolution version of the reference frame does not correspond to the resolution required for the output frame (e.g. display), it may be necessary to perform a scaling operation to derive the final pixel data for the output frame. Thus, according to an embodiment of the present invention, generating pixel data representing an output frame for display using the pixel data representing the reference frame at at least one lower resolution than the first resolution comprises performing a scaling operation on the pixel data representing the reference frame at at least one lower resolution than the first resolution when deriving the final pixel data for the output frame. In this case, the lower resolution version of the reference frame that is used is preferably selected so as to reduce, e.g. minimise, the amount of scaling processing that may be required, e.g. by using the resolution that is closest to the actually required resolution.
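Selecting the stored resolution closest to the required output resolution, so as to minimise the remaining scaling work, might be sketched as follows. This is a hypothetical illustration (the patent does not define a selection rule); choosing by absolute width difference alone is one simple heuristic:

```python
def closest_resolution(available_resolutions, target):
    """Pick the stored reference-frame resolution closest to the
    required output resolution, minimising the remaining scaling
    work. Resolutions are (width, height) tuples; comparing
    widths only is an illustrative heuristic."""
    return min(available_resolutions,
               key=lambda r: abs(r[0] - target[0]))
```

For example, with stored versions at 1920x1080, 960x540 and 480x270 and a 1280x720 output window, the 960x540 version would be selected and then scaled up to the output resolution.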
Where the current frame to be displayed is a differentially encoded frame, i.e. a frame that has been encoded with respect to the full resolution pixel data representing the reference frame (and that does not have zero residuals), then in order to generate correctly an output frame for display, the differentially encoded frame to be displayed will normally be decoded from the differentially encoded pixel data representing the current frame and the pixel data representing the full resolution reference frame. The decoded pixel data for the frame will then normally be stored in memory, from which it can be fetched, e.g., by a display unit and scaled down to fit the output frame resolution if necessary.
However, according to embodiments of the present invention, instead of first decoding the pixel data for the current frame (at full resolution) and storing it in memory, the frame processing stage is configured not to store full resolution pixel data representing the (decoded) current frame in memory. Instead, the frame processing stage is configured to generate and store pixel data representing the (decoded) current frame at a lower resolution only.
This can be (and preferably is) done by (e.g. the frame processing stage) reading in and using the full resolution pixel data representing the reference frame to decode the current frame to be displayed, and using the (decoded) pixel data representing the current frame at full resolution internally to generate pixel data representing the current frame at one or more different resolutions to the full resolution. Only the pixel data representing the current frame at the one or more different resolutions to the full resolution is then stored in memory (for use later in the system).
Thus, according to an embodiment of the present invention, the method further comprises, if the current frame to be displayed is encoded with respect to the reference frame: decoding the current frame using the encoded pixel data for the current frame and the pixel data representing the reference frame at the first resolution; generating and storing in memory pixel data representing the current frame at at least one different resolution to the first resolution; and generating pixel data representing an output frame using the pixel data representing the current frame at the at least one different resolution than the first resolution. In preferred embodiments, generating pixel data representing the current frame at at least one different resolution to the first (e.g. full) resolution comprises performing one or more scaling operations on the pixel data representing the current frame at the first resolution.
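The scheme above (decode at full resolution internally, store only the scaled versions) could be sketched as follows. This is an illustrative simplification: the "encoded" data is modelled as a plane of residuals added to the reference frame, standing in for real motion-compensated decoding, and nearest-sample downscaling stands in for a proper scaling filter:

```python
def decode_and_store_scaled(residuals, reference_full, scales, store):
    """Decode the current frame at full resolution using the
    full-resolution reference, use the result only internally,
    and store just the downscaled versions in `store` (a dict
    mapping scale factor -> scaled frame). The full-resolution
    decoded frame itself is never written to memory."""
    # Simplified decode: reference plus residuals, element-wise.
    decoded = [[r + e for r, e in zip(rrow, erow)]
               for rrow, erow in zip(reference_full, residuals)]
    for s in scales:
        h, w = len(decoded) // s, len(decoded[0]) // s
        # Nearest-sample downscale by factor `s` (illustrative).
        store[s] = [[decoded[y * s][x * s] for x in range(w)]
                    for y in range(h)]
    # `decoded` goes out of scope here: only scaled versions persist.
```

A display controller can then read only the small stored version that matches (or is nearest to) the output resolution, saving memory accesses and bandwidth as described above.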
The Applicants have recognised that although this embodiment does not utilise the pixel data representing the reference frame at a lower resolution than the first resolution, there is still an advantage to using the full resolution reference frame to generate pixel data representing multiple resolution versions of the current frame to be displayed. For example, having lower resolution versions of the current frame to be displayed allows a display controller to read in only the smaller version of the current frame (not the full resolution version) in order to derive the pixel data for an output frame to be displayed, thereby saving on the number of memory accesses, and memory bandwidth, for example. This is particularly advantageous where the at least one lower resolution corresponds to the resolution required by the display.
In other embodiments, the video processing system (e.g. the frame processing stage) is configured to first scale down the (differentially) encoded pixel data for the current frame (i.e. by consolidating the motion vectors and residuals data) to a lower resolution and then derive and generate final, i.e. decoded, pixel data for the output frame based on the “scaled down” version of the current frame and the pixel data representing the reference frame at an at least one lower resolution.
Thus, according to an embodiment, generating an output frame for display comprises consolidating the (differentially encoded) pixel data representing a current frame to be displayed at a first resolution to generate consolidated pixel data representing the current frame at a second, different (e.g. lower) resolution than the first resolution, and deriving final pixel data for the output frame based on the consolidated data representing the current frame and the pixel data representing the reference frame at one of the at least one lower resolution at which pixel data for the reference frame has been generated.
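The consolidation of motion vectors mentioned above could be sketched as follows. This is a hypothetical illustration (the patent does not specify a consolidation rule): one vector is kept per group of blocks and its components are divided by the scale factor, matching the reduced frame dimensions:

```python
def consolidate_motion_vectors(mvs, factor=2):
    """Scale down a 2D grid of per-block motion vectors by
    `factor`: keep one vector per factor x factor group of blocks
    and divide its components by `factor`, so the vectors match
    the reduced frame size. Keeping the top-left vector of each
    group is an illustrative consolidation rule."""
    return [[(vx // factor, vy // factor)
             for vx, vy in row[::factor]]
            for row in mvs[::factor]]
```

The residuals data would be scaled down similarly, after which the consolidated frame can be decoded directly against the matching lower resolution version of the reference frame.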
In preferred embodiments, the first resolution of the current (differentially encoded) frame is the same as the first resolution of the reference frame and/or the second resolution of the current (differentially encoded) frame is the same as one of the at least one lower resolution at which pixel data for the reference frame has been generated.
According to embodiments, deriving final pixel data for the output frame based on the consolidated data representing the current frame and the pixel data representing the reference frame at the one of the at least one lower resolution comprises decoding the consolidated (and differentially encoded) pixel data for the current frame using the pixel data representing the reference frame at the one of the at least one lower resolution. The decoded (consolidated) pixel data can then be (and in some embodiments is) used directly as the final pixel data for the output frame. In other embodiments, however, the decoded pixel data is scaled in order to derive the final pixel data for the output frame. This is required, for example, where the resolution of the decoded pixel data is not the same as the output frame resolution.
Thus, according to an embodiment, the present invention comprises performing a scaling operation on the pixel data representing the decoded version of the current frame to be displayed to derive the final pixel data for the output frame. The scaling operation preferably comprises scaling the pixel data representing the decoded version of the current frame to provide pixel data representing the decoded version of the current frame at the (desired) output frame resolution.
Using lower resolution pixel data for the reference frame when generating frames for display in this way is advantageous in that the display process needs only to read in the smaller reference frame and not the full resolution version in order to derive the pixel data for an output frame or frames to be displayed, thereby saving on the number of memory accesses, and memory bandwidth, for example.
As mentioned above, the reference frame generation stage can be configured to control an aspect of reference frame generation based on knowledge of the requirements of the data processing system. Thus, where the required output frame (e.g. display) resolution is known to the data processing system, the reference frame generation stage may be configured to generate pixel data for the reference frame based on the display output frame resolution. Equally, where the video processing system comprises more than one display unit, the reference frame generation stage can be (and preferably is) configured to generate pixel data representing the reference frame based on each different display output frame resolution. In these cases, the reference frame may be generated at the required output display frame resolution, and/or at a resolution that is closer to the required output display frame resolution (and then scaled to that resolution if required).
The Applicants believe that a video processing system configured to operate in a manner described above is new and advantageous in its own right.
Thus, according to a third aspect of the present invention there is provided a video processing system, in which data for one or more frames in a sequence of video frames is defined with respect to a reference frame, the video processing system comprising: a memory; an apparatus as claimed in claim 14; and a display controller configured to: read in from the memory the pixel data representing all or part of the reference frame at the at least one different resolution to the first resolution; use the pixel data when generating an output frame to be displayed; and output the generated output frame to a display.
As will be appreciated by those skilled in the art, this aspect of the present invention can and preferably does include any one or more of the preferred and optional features of the invention described herein, as appropriate. Thus, for example, the video processing system preferably further comprises a display. In addition to its use when generating an output frame for display, the Applicants have recognised that pixel data representing the reference frame at the at least one different resolution to the first resolution can also be used to facilitate performing processing operations on a video frame or frames, e.g. to be displayed.
Such a processing operation could comprise, for example, a transformation operation, such as scaling, rotation and/or skewing of a frame to be displayed. This operation may be performed by any suitable unit or stage of the overall video and data processing system, such as a graphics processor, a composition engine, a CPU, and/or a display controller that has the ability to perform such operations.
Thus, according to an embodiment of the present invention, processing at least one frame in the sequence of video frames using the pixel data representing the reference frame at at least one different (e.g. lower) resolution than the first resolution comprises performing a transformation operation on the at least one frame in the sequence of video frames using the pixel data representing the reference frame at at least one different (e.g. lower) resolution than the first resolution. Equally, according to an embodiment, the frame processing stage of the present invention comprises a processor (e.g. a graphics processor) that is capable of and configured to perform at least one transformation operation on the at least one frame in the sequence of video frames using the pixel data representing the reference frame at at least one different (e.g. lower) resolution to the first resolution.
In preferred embodiments, the transformation operation comprises at least one of a scaling, rotation, and skewing operation. The transformation operation could (and in some embodiments does) comprise interpolating between frames of the sequence of video frames.
The Applicants have recognised that generating and using pixel data representing the reference frame at at least one lower resolution than the first resolution may be beneficial when performing processing operations in that the video processing system will require fewer memory accesses to read the reference frame compared to when using a higher resolution version of the reference frame.
Also, the Applicants have recognised that having multiple resolution versions of the reference frame available (e.g. to a GPU) at the same time may facilitate performing processing operations, such as scaling, in a more efficient and/or sophisticated manner. For example, having multiple resolution versions of the reference frame that can be used simultaneously may facilitate performing operations for visually flipping through frames on a display (such as “coverflow”-type operations).
The Applicants again believe that a video processing system configured to operate in this manner is new and advantageous in its own right.
Thus, according to another aspect of the present invention there is provided a video processing system, in which data for one or more frames in a sequence of video frames is defined with respect to a reference frame, the video processing system comprising: a memory; an apparatus as claimed in claim 14 (i.e. a first processing unit (e.g. a video processor)); and a (second) processing unit (e.g. a graphics processor) configured to: read in from memory the pixel data representing all or part of the reference frame at the at least one different resolution to the first resolution; use the pixel data when performing a processing operation on a frame in the sequence of video frames to generate pixel data representing a processed frame; and store the pixel data representing the processed frame in memory. As will be appreciated by those skilled in the art, this aspect of the present invention can and preferably does include any one or more of the preferred and optional features of the invention described herein, as appropriate. Thus, for example, the video processing system preferably further comprises a display and/or the system preferably further comprises a display controller configured to: read in from memory the pixel data representing the processed frame; use the pixel data when generating an output frame to be displayed; and output the generated output frame to a display.
Similarly, the first processing unit is preferably a video processor, and the second processing unit is preferably one of a graphics processor, a composition engine, a central processing unit (CPU), or a display controller that is operable to perform processing operations on frames to be displayed. There may be more than one “second” processing unit that uses the pixel data representing all or part of the reference frame at the at least one different resolution to the first resolution when performing a processing operation on a frame in the sequence of video frames to generate pixel data representing a processed frame, if desired.
Correspondingly, the processing operation preferably comprises a transformation operation, such as scaling, rotation and/or skewing of a frame to be displayed.
It will be appreciated that although the above embodiments have been described primarily with reference to generating pixel data representing an output frame for display, the principles and techniques described above can be used wherever pixel data representing a frame is to be generated based on pixel data representing a reference frame. For example, in one arrangement, the principles and techniques described above can also be used, e.g. by a composition engine, to derive and generate pixel data representing a composited frame from two or more separate source frames, each having their own reference frames stored in memory. In that case, the lower resolution reference frame for one or more of the source frames can be used to generate a composited frame.
Furthermore, whilst the embodiments described above have been described with reference to entire frames, the techniques and principles of the present invention can equally be (and in one preferred embodiment are) applied to only a part of a frame. For example, pixel data representing the entire reference frame (or a part thereof) at at least one different resolution to the first resolution can be used to process only a part of a frame in a sequence of video frames.
The present invention can be implemented in any desired and suitable data processing system that is operable to generate frames of video data for display, e.g. on an electronic display.
The frame processing stage in this regard preferably comprises a suitable processor, such as a graphics processor, a video processor, a compositor or a display controller. In some embodiments, the frame processing stage comprises the same processor or processors as the reference frame generation stage. In a preferred embodiment the present invention is implemented in a data processing system that is a system for displaying windows, e.g. for a graphical user interface, on a display, and preferably a compositing window system.
The data (video) processing system that the present invention is implemented in can contain any desired and suitable elements and components. Thus it may, and preferably does, contain one or more of, and preferably all of: a CPU, a GPU, a video processor, a display controller, a display (e.g. an LCD or an OLED display), and appropriate memory for storing the various frames and other data that is required.
The generated frame(s) to be displayed and the output frame for the display (and any other source surface (frames)) can be stored in any suitable and desired manner in memory. They are preferably stored in appropriate buffers. For example, the output frame is preferably stored in an output frame buffer.
The output frame buffer may be an on-chip buffer or it may be an external buffer. Similarly, the output frame buffer may be dedicated memory for this purpose or it may be part of a memory that is used for other data as well. In some embodiments, the output frame buffer is a frame buffer for the video processing system that is generating the frame and/or for the display that the frames are to be displayed on.
Similarly, the buffers that the generated frames are first written to when they are generated (rendered) may comprise any suitable such buffers and may be configured in any suitable and desired manner in memory. For example, they may be an on-chip buffer or buffers or may be an external buffer or buffers. Similarly, they may be dedicated memory for this purpose or may be part of a memory that is used for other data as well. The input frame buffers can be, e.g., in any format that an application requires, and may, e.g., be stored in system memory (e.g. in a unified memory architecture), or in graphics memory (e.g. in a non-unified memory architecture).
The present invention can be implemented in any suitable system, such as a suitably configured micro-processor based system. In some embodiments, the present invention is implemented in a computer and/or micro-processor based system.
The various functions of the present invention can be carried out in any desired and suitable manner. For example, the functions of the present invention can be implemented in hardware or software, as desired. Thus, for example, the various functional elements and "means" of the present invention may comprise a suitable processor or processors, controller or controllers, functional units, circuitry, processing logic, microprocessor arrangements, etc., that are operable to perform the various functions, etc., such as appropriately dedicated hardware elements (processing circuitry) and/or programmable hardware elements (processing circuitry) that can be programmed to operate in the desired manner.
It should also be noted here that, as will be appreciated by those skilled in the art, the various functions, etc., of the present invention may be duplicated and/or carried out in parallel on a given processor. Equally, the various processing stages may share processing circuitry, etc., if desired.
The present invention is preferably implemented in a portable device, such as, and preferably, a mobile phone or tablet.
It will also be appreciated by those skilled in the art that all of the described embodiments of the present invention can include, as appropriate, any one or more or all of the preferred and optional features described herein.
The methods in accordance with the present invention may be implemented at least partially using software e.g. computer programs. It will thus be seen that when viewed from further embodiments the present invention provides computer software specifically adapted to carry out the methods herein described when installed on data processing means, a computer program element comprising computer software code portions for performing the methods herein described when the program element is run on data processing means, and a computer program comprising code means adapted to perform all the steps of a method or of the methods herein described when the program is run on a data processing system. The data processing system may be a microprocessor, a programmable FPGA (Field Programmable Gate Array), etc.
The present invention also extends to a computer software carrier comprising such software which when used to operate a graphics processor, renderer or other system comprising data processing means causes in conjunction with said data processing means said processor, renderer or system to carry out the steps of the methods of the present invention. Such a computer software carrier could be a physical storage medium such as a ROM chip, CD ROM, RAM, flash memory, or disk, or could be a signal such as an electronic signal over wires, an optical signal or a radio signal such as to a satellite or the like.
It will further be appreciated that not all steps of the methods of the present invention need be carried out by computer software and thus from a further broad embodiment the present invention provides computer software and such software installed on a computer software carrier for carrying out at least one of the steps of the methods set out herein.
The present invention may accordingly suitably be embodied as a computer program product for use with a computer system. Such an implementation may comprise a series of computer readable instructions fixed on a tangible, non-transitory medium, such as a computer readable medium, for example, diskette, CD ROM, ROM, RAM, flash memory, or hard disk. It could also comprise a series of computer readable instructions transmittable to a computer system, via a modem or other interface device, over either a tangible medium, including but not limited to optical or analogue communications lines, or intangibly using wireless techniques, including but not limited to microwave, infrared or other transmission techniques. The series of computer readable instructions embodies all or part of the functionality previously described herein.
Those skilled in the art will appreciate that such computer readable instructions can be written in a number of programming languages for use with many computer architectures or operating systems. Further, such instructions may be stored using any memory technology, present or future, including but not limited to, semiconductor, magnetic, or optical, or transmitted using any communications technology, present or future, including but not limited to optical, infrared, or microwave. It is contemplated that such a computer program product may be distributed as a removable medium with accompanying printed or electronic documentation, for example, shrink wrapped software, pre-loaded with a computer system, for example, on a system ROM or fixed disk, or distributed from a server or electronic bulletin board over a network, for example, the Internet or World Wide Web.
Preferred embodiments of the present invention will now be described by way of example only and with reference to the accompanying drawings, in which:
Figure 1 illustrates schematically an exemplary video processing system that is configured to process frames of video data in a manner according to the present invention;
Figure 2 illustrates schematically a reference frame represented as a mipmap set of progressively lower resolution versions of the (same) reference frame image;
Figure 3 shows schematically an embodiment of a video processing system that can perform the basic operation of the present invention;
Figure 4 illustrates schematically a video processing system having a video processor configured to process frames of still images in a manner not in accordance with the present invention; and
Figure 5 illustrates schematically a video processing system that is configured to encode at least one frame in a sequence of video frames in a manner not in accordance with the present invention.
Figure 1 shows schematically an embodiment of a video processing system 11 that can perform the basic operation of the present invention.
As shown in Figure 1, the video processing system 11 may comprise a system on-chip (SoC) 12 which includes a central processing unit (CPU) 13, a graphics processing unit (GPU) 14, a video processing unit (VPU) 15, a display processing unit 16 and an external interface 18, all having access to external (e.g. off-chip) memory 19. Separate to the SoC and external memory is the display itself (not shown).
The GPU 14 and VPU 15 may include suitable compressors (and corresponding de-compressors) for encoding (compressing) data (e.g. a frame) to be stored in memory in a compressed form. Accordingly, the display processing unit 16 may include a de-compressor for decompressing data (e.g. a frame to be displayed). In accordance with the present embodiments, a sequence of (compressed) video frames will be provided to the video engine from an external source for decoding. In other embodiments, a frame to be displayed is generated as desired by, for example, being appropriately rendered by the GPU 14 or video engine 15.
If the current frame being processed by the video engine is a so-called reference frame (and so is to be used when decoding other frames in the sequence of video frames), then pixel data representing the full resolution version of the reference frame will be generated and stored in a buffer within the external memory 19. However, in addition to generating pixel data representing the reference frame at full resolution (i.e. the original intended resolution), the video engine will, in accordance with the present embodiments, generate and store in the external memory 19 pixel data representing the reference frame at at least one different resolution to the first, full resolution.
Generating pixel data representing the reference frame at not only a first resolution, but also at at least one second, e.g. lower, resolution can be used to reduce the number of memory accesses required when generating an output frame for display.
For example, pixel data representing the reference frame at different resolutions is fetched from external memory 19 and used to generate an output frame for display.
For example, where the frame to be displayed is the reference frame, the pixel data representing the reference frame at the most appropriate resolution for the display may be fetched from the external memory 19 by the display processing unit 16 and output for display. It will be appreciated that in some cases the display processing unit 16 (or GPU 14, for example) may perform a scaling operation on the pixel data representing the reference frame to provide an output frame at the correct resolution for the display size.
Where the frame to be displayed is not flagged as a reference frame but is encoded relative to the reference frame, the video processing unit 15, for example, retrieves the pixel data representing the reference frame from memory 19 and uses it to decode the current frame and generate pixel data representing the current frame at a desired resolution (preferably the output frame resolution).
It will be understood that although the arrangement of Figure 1 shows only two frame generators (the GPU 14 and video engine 15), the video processing system of the present invention could include any number (and types) of frame generators, as appropriate.
Figure 2 illustrates a reference frame represented as a mipmap set of progressively lower resolution versions of the (same) reference frame image.
As mentioned above, in a particularly preferred embodiment, the pixel data that is generated and stored, and which represents the reference frame at a first and at least one different resolution is in the form of two or more or all levels of a mipmap set representing the reference frame. Each version of the frame in the mipmap set is referred to as a mipmap level, wherein the lowest level, L0 (20), is the most detailed version of the reference frame and the highest level, L4 (24), is the least detailed version.
As shown in Figure 2, in this embodiment the height and width of each level of the mipmap set is a factor of two smaller than that of the previous level in the set and each level has a resolution that is one fourth (in terms of the number of pixels) the resolution of the previous level. In the embodiment shown in Figure 2, the reference frame has an original, full resolution of 4096 by 2160 pixels at the first level L0, and the associated mipmap set contains a series of 5 levels (L0 through L4), wherein each level is one-fourth the total area of the previous one: 2048x1080 pixels, 1024x540, 512x270 and 256x135.
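By way of illustration only, the level dimensions of such a mipmap set can be sketched as follows (the function name and the clamping to a minimum of one pixel are illustrative choices, not details taken from the patent):

```python
# Minimal sketch of the halving scheme of Figure 2: each mipmap level is
# half the width and half the height of the previous one.

def mipmap_levels(width, height, num_levels):
    """Return [(w, h), ...] for num_levels levels, halving each dimension."""
    levels, w, h = [], width, height
    for _ in range(num_levels):
        levels.append((w, h))
        w, h = max(1, w // 2), max(1, h // 2)  # clamp at one pixel
    return levels

print(mipmap_levels(4096, 2160, 5))
# [(4096, 2160), (2048, 1080), (1024, 540), (512, 270), (256, 135)]
```

Each level thus holds one quarter of the pixels of the previous level, matching the sizes listed above for L0 through L4.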
It will be appreciated that the pixel data that is generated and stored according to the present invention may represent an entire mipmap set for the reference frame (i.e. starting from the highest resolution version and including respective lower resolution versions for each mipmap level down to a single pixel (or its equivalent)) or only a particular, e.g. desired or selected, subset of the levels of the mipmap for the reference frame.
It will also be appreciated that in accordance with some embodiments of the present invention, only a region or a part of the whole reference frame is represented at a given level of the mipmap set.
Figure 3 shows schematically an embodiment of a video processing system 31 that can perform the basic operation of the present invention.
As shown in Figure 3, the video processing system includes a VPU 32, a first display processing unit (DPU[0]) 34 and a second display processing unit (DPU[1]) 35, all having access to external memory 33. The first DPU (DPU[0]) corresponds to a first display screen (screen 0) of size L1, and the second DPU (DPU[1]) corresponds to a second display screen (screen 1) of size L2.
According to embodiments of the present invention, the VPU 32 receives as an input compressed video data representing a sequence of video frames, including a reference frame from which other frames in the sequence are defined. In the embodiment shown, the compressed video data represents each frame in the sequence of video frames at a first resolution, L0.
The VPU 32 is configured to de-compress the compressed video input (one frame at a time) to generate decompressed video data representing the frames. Accordingly, the VPU 32 includes a de-compressor (not shown) for decompressing the video data (e.g. a frame to be displayed). (It will be appreciated that de-compression of the compressed video data can be done in other ways, if desired. For example, instead of the VPU 32 including a de-compressor, a separate de-compression engine that receives the compressed video data and de-compresses it before sending it to the VPU 32 could be provided in the system.)
The VPU 32 is configured to generate and store in the external memory 33 pixel data representing the reference frame at the first, original resolution, L0. Thus, once the compressed video data representing the reference frame has been decompressed, the (decompressed) pixel data representing the reference frame is stored in memory 33. In accordance with the embodiments of the present invention, however, the VPU 32 is configured also to generate and store in memory 33 pixel data representing the reference frame at at least one different resolution to the first resolution. This may be achieved, for example, by performing appropriate scaling operations on the (de-compressed) reference frame. In the embodiment shown in Figure 3, the first frame (frame 1) in the sequence of video frames received by the VPU 32 is a reference frame as well as a frame to be output for display. Accordingly, the VPU 32 generates pixel data representing the reference frame at the first resolution L0 and stores it in memory 33, e.g., so that it can be used to define and decode other frames in the sequence of video frames. However, in addition to this the VPU 32 generates pixel data representing the reference frame at two different resolutions (L1 and L2) in addition to the first resolution (L0). This may be done by scaling the reference frame to the exact resolutions of the first and second display screens.
The second frame (frame 2) in the sequence of video frames is also flagged as a reference frame and an output frame, so pixel data representing that reference frame at the first, second and third resolutions L0, L1 and L2 is also generated.
The third frame (frame 3) is not flagged as a reference frame, but is instead flagged as an output frame only. In this case, pixel data representing the frame at full resolution is not required (as it will not be used to decode any other frames in the sequence), so the VPU 32 is configured to generate and store in memory 33 pixel data representing the frame at the one or more lower resolutions only. In the specific example shown, the VPU 32 generates pixel data representing the frame at the second and third resolutions, L1 and L2. (This can be done by first decoding the frame using pixel data representing the corresponding reference frame (e.g. frame 2) at full resolution, and performing scaling operations on the (decoded) frame to generate pixel data representing the frame at one or more lower resolutions than the first resolution.)
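The per-frame decision just described can be summarised in a short sketch (the flag names and level labels here are assumptions for illustration; the patent does not prescribe any particular API):

```python
# Illustrative sketch of the Figure 3 behaviour: a reference frame is stored
# at full resolution (L0) plus the display resolutions (L1, L2), while an
# output-only frame skips the full-resolution copy.

def resolutions_to_generate(is_reference, is_output,
                            display_levels=("L1", "L2")):
    levels = []
    if is_reference:
        levels.append("L0")            # needed to decode later frames
    if is_output:
        levels.extend(display_levels)  # one version per display screen
    return levels

print(resolutions_to_generate(True, True))   # frames 1 and 2: L0, L1, L2
print(resolutions_to_generate(False, True))  # frame 3: L1 and L2 only
```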
The frames stored in memory can then be retrieved by an appropriate DPU (or DPUs) for display. In this regard, it will be appreciated that having lower resolution pixel data for the frames is advantageous in that only the smaller version of the current frame (be it the first, second or third frame of Figure 3) needs to be read for display, thereby saving on memory bandwidth and accesses etc.
In the example shown in Figure 3, each one of the first and second display units, DPU[0] and DPU[1], is configured to output the frames, except that DPU[0] will read in only the data representing the frame at the lower resolution, L1, whilst DPU[1] will read in only the data representing the frame at the lower resolution, L2. (It will be appreciated that although Figure 3 shows pixel data representing the reference frame at only two different resolutions than the first resolution, any number of different resolutions can be generated in addition to the first resolution. For example, the VPU may be configured to generate a full mipmap set. Also, whilst Figure 3 shows two separate display processing units, any number of display processing units can be used. Further, the one or more lower resolutions, at which pixel data representing the frame is generated, need not correspond exactly to the resolution of the one or more display screens. Instead, the one or more lower resolutions can be chosen to be the closest level of a mipmap to the resolution of the one or more display screens, with any final scaling that is required being performed in the display processing unit.)
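The closest-level selection mentioned above can be sketched as follows. Preferring the smallest level that is still no smaller than the screen (so the residual scaling is a downscale) is an assumption of this sketch; the text only says the closest level is chosen:

```python
# Pick the mipmap level a display processing unit should read: the smallest
# level that still covers the screen, leaving any residual scaling to the DPU.

def closest_level(levels, screen_w, screen_h):
    """levels: (w, h) pairs from largest to smallest; returns the index of
    the smallest level that still covers the screen."""
    best = 0
    for i, (w, h) in enumerate(levels):
        if w >= screen_w and h >= screen_h:
            best = i   # this level still covers the screen; keep going down
        else:
            break
    return best

levels = [(4096, 2160), (2048, 1080), (1024, 540), (512, 270), (256, 135)]
print(closest_level(levels, 1920, 1080))  # 1, i.e. the 2048x1080 level
```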
Figure 4 illustrates schematically a video processing system 41 that is not in accordance with the present invention, where the video processor is configured to process frames of still images.
Similarly to the embodiment of Figure 3, in this example the video processing system 41 includes a VPU 42, external memory 43, GPU 44 and a graphical user interface 45 on a display, for example.
As can be seen in Figure 4, the VPU 42 is configured to receive a sequence of (compressed) still image frames from an image source (not shown) and generate and store in memory pixel data representing each frame at a first and at least one different resolution to the first resolution.
The VPU 42 is configured to generate and store in memory 43 pixel data representing each image in the sequence of still images as a complete mipmap. For example, the external memory 43 of Figure 4 has pixel data stored therein that represents the image at successively lower resolutions, L0 through to LN.
The pixel data representing a frame or frames is then retrieved from memory and used by the GPU 44 to perform processing operations on the one or more frames, e.g. for display.
For example, the GPU 44 may be configured to read in and use the pixel data representing a frame at at least one lower resolution than the first resolution to perform a transformation operation on the frame, such as a scaling, rotation or skewing operation. In other examples, the GPU 44 may be configured to read in and use pixel data representing more than one frame at at least one lower resolution than the first resolution to perform operations for visually flipping through frames on the graphical user interface 45 (such as “coverflow”-type operations).
Figure 5 illustrates schematically a video processing system that is not in accordance with the present invention, but is configured to encode at least one frame in a sequence of video frames. The video processing system comprises external memory 52 and a VPU 53 having a hierarchical motion estimation encoder 54 therein.
The VPU 53 is configured to perform hierarchical motion estimation encoding on a given frame (frame 1) in a sequence of video frames. As shown, the frame is encoded as a bidirectional frame, i.e. a frame that is defined from both a previous frame (frame 0) and a future frame (frame 2) in the sequence of video frames. In the embodiment of Figure 5, frame 0 and frame 2 are both reference frames, whilst frame 1 is to be encoded as an output frame only.
Accordingly, pixel data representing the reference frames (frames 0 and 2) at a first resolution and at least one different resolution is generated and stored in external memory 52. In the example shown in Figure 5, pixel data representing the reference frames (frames 0 and 2) at a first resolution and at least one different resolution has already been generated and stored in the external memory 52 of the video processing system. (In this specific example, each reference frame is represented as a (e.g. sparse) mipmap set comprising a first, second and third resolution version of the reference frame (versions L0, L1 and L2, respectively), although other configurations are equally possible.)
The VPU 53, more particularly the encoder 54 of the VPU 53, is configured to read in pixel data representing the reference frames from memory 52 and use it to perform hierarchical motion estimation encoding on the frame in question (frame 1). In the embodiment shown in Figure 5, the encoder 54 generates encoded pixel data representing the frame as a mipmap set, preferably a sparse mipmap set containing only the levels that are being used by the hierarchical search scheme (that is, levels L0, L1 and L2).
The Applicants have recognised that having multiple resolution versions of the reference frame already stored in memory is advantageous for performing motion estimation encoding in a more efficient manner. In particular, having multiple resolution versions of the frame facilitates finding longer motion vectors, whilst keeping memory bandwidth down (by searching through lower resolution versions of the reference frame). For example, when it is desired to search for motion vectors in a lower resolution version of the reference frame, the VPU 53 can simply read in pixel data representing the reference frame at a lower resolution (instead of first reading in the full resolution pixel data and then scaling it down), thereby saving memory accesses and bandwidth.
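The hierarchical search described above can be sketched as follows: a vector found at a coarse level is doubled and refined at the next finer level, so long motion vectors are found while reading mostly low-resolution reference data. The SAD cost function, the ±1 refinement window and the frame layout are assumptions of this sketch, not details taken from the patent:

```python
# Coarse-to-fine motion search over stored mipmap levels (illustrative).

def sad(block_a, block_b):
    """Sum of absolute differences between two equal-sized 2-D blocks."""
    return sum(abs(a - b) for ra, rb in zip(block_a, block_b)
               for a, b in zip(ra, rb))

def best_match(ref, block, cx, cy, radius):
    """Full search in ref within +/-radius of (cx, cy); returns (dx, dy)."""
    n = len(block)
    best, best_cost = (0, 0), None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            x, y = cx + dx, cy + dy
            if 0 <= x and 0 <= y and y + n <= len(ref) and x + n <= len(ref[0]):
                cost = sad(block, [row[x:x + n] for row in ref[y:y + n]])
                if best_cost is None or cost < best_cost:
                    best, best_cost = (dx, dy), cost
    return best

def hierarchical_mv(ref_levels, cur_levels, bx, by, n, radius=1):
    """Levels ordered coarsest -> finest; block of size n at (bx, by) in the
    finest level of the current frame. Returns the motion vector (dx, dy)."""
    mv = (0, 0)
    num = len(ref_levels)
    for lvl, (ref, cur) in enumerate(zip(ref_levels, cur_levels)):
        s = 2 ** (num - 1 - lvl)              # downscale factor of this level
        x, y, m = bx // s, by // s, max(1, n // s)
        block = [row[x:x + m] for row in cur[y:y + m]]
        dx, dy = best_match(ref, block, x + mv[0], y + mv[1], radius)
        mv = (mv[0] + dx, mv[1] + dy)
        if lvl < num - 1:
            mv = (mv[0] * 2, mv[1] * 2)       # propagate to the finer level
    return mv

# Tiny demo: a 2x2 pattern moves two pixels to the right between the
# reference frame and the current frame, so the match lies at (-2, 0).
ref = [[0] * 8 for _ in range(8)]
cur = [[0] * 8 for _ in range(8)]
for (dy, dx), v in {(0, 0): 9, (0, 1): 8, (1, 0): 7, (1, 1): 6}.items():
    ref[2 + dy][2 + dx] = v
    cur[2 + dy][4 + dx] = v
down = lambda f: [row[::2] for row in f[::2]]  # crude 2:1 downsample
print(hierarchical_mv([down(ref), ref], [down(cur), cur], 4, 2, 2))  # (-2, 0)
```

Because each coarse-level search touches only a quarter of the pixels of the level below it, most of the search range is covered with low-bandwidth reads, which is the efficiency gain the passage above describes.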
It will be appreciated that although the VPU 53 of Figure 5 is encoding a bidirectional frame, this is not required. Any compression scheme could be used as appropriate. For example, a frame may be encoded as a forward-predicted frame (e.g. from a single reference frame) or as a bidirectional predicted frame.
It can be seen from the above that the present invention, in its preferred embodiments at least, provides a way of providing a more efficient video processing system in which data for one or more frames in a sequence of video frames is defined with respect to a reference frame.
This is achieved, in the preferred embodiments at least, by, when all or part of a reference frame is required for decoding a sequence of video frames, generating and storing in memory pixel data representing all or part of the reference frame at a first, full resolution and at least one different (e.g. lower) resolution to the first resolution.

Claims (27)

1. A method of processing frames of video data in a video processing system, in which data for one or more frames in a sequence of video frames is defined with respect to a reference frame, the method comprising: when decoding a sequence of video frames in which data for one or more frames is defined with respect to a reference frame, in response to determining that a frame in the sequence of video frames is a reference frame for the sequence of video frames, both: generating and storing in memory pixel data representing all or part of the reference frame at a first resolution; and generating and storing in memory pixel data representing all or part of the reference frame at at least one different resolution to the first resolution, such that pixel data representing all or part of the reference frame at the first resolution and pixel data representing all or part of the reference frame at the at least one different resolution is present and available in memory at the same time when decoding the sequence of video frames.
2. The method of claim 1, comprising: generating the pixel data representing the reference frame at the at least one different resolution to the first resolution using the pixel data that has been generated for the reference frame at the first resolution.
3. The method of claim 1 or 2, wherein each one of the at least one different resolution to the first resolution is a lower resolution than the first resolution.
4. The method of claim 1, 2 or 3, wherein the pixel data that is generated and stored, and which represents the reference frame at a first resolution and at at least one different resolution is in the form of two or more or all levels of a mipmap set representing the reference frame.
5. The method of any one of the preceding claims, further comprising: selecting the one or more or all of the at least one different resolution to the first resolution that the reference frame is generated at based on the resolution of an electronic display or displays on which the video sequence is to be displayed.
6. The method of any one of the preceding claims, further comprising: selecting the number of different resolutions at which the reference frame is to be represented based on the number of different displays in the overall video processing system on which the video sequence is to be displayed.
7. The method of any one of the preceding claims, further comprising: processing at least one frame in the sequence of video frames using the pixel data representing the reference frame at at least one different resolution to the first resolution.
8. The method of claim 7, wherein the step of processing at least one frame in the sequence of video frames comprises: generating pixel data representing an output frame for display using the pixel data representing the reference frame at at least one lower resolution than the first resolution.
9. The method of any one of the preceding claims, wherein, if the reference frame is to be displayed: the pixel data for the lower resolution reference frame is used as the pixel data for the displayed frame.
10. The method of any one of claims 1 to 9, further comprising, when a current frame in the sequence of video frames is encoded with respect to the reference frame: decoding the current frame using the encoded pixel data for the current frame and the pixel data representing the reference frame at the first resolution; and generating and storing in memory pixel data representing the current frame at at least one different resolution to the first resolution.
11. The method of any one of claims 1 to 10, further comprising: reading in from the memory the pixel data representing all or part of the reference frame at the at least one different resolution to the first resolution; using the pixel data when generating an output frame to be displayed; and outputting the generated output frame to a display.
12. The method of any one of claims 1 to 11, further comprising: reading in from memory the pixel data representing all or part of the reference frame at the at least one different resolution to the first resolution; using the pixel data when performing a processing operation on a frame in the sequence of video frames to generate pixel data representing a processed frame; and storing the pixel data representing the processed frame in memory.
13. The method of any one of the preceding claims, wherein the pixel data representing all or part of the reference frame at the first resolution and the at least one different resolution to the first resolution is stored in memory in a compressed form.
14. An apparatus for processing frames of video data in a video processing system, in which data for one or more frames in a sequence of video frames is defined with respect to a reference frame, the apparatus comprising: a reference frame generation stage configured to, when decoding a sequence of video frames in which data for one or more frames is defined with respect to a reference frame, in response to determining that a frame in the sequence of video frames is a reference frame for the sequence of video frames, both: generate and store in memory pixel data representing all or part of the reference frame at a first resolution; and generate and store in memory pixel data representing all or part of the reference frame at at least one different resolution to the first resolution, such that pixel data representing all or part of the reference frame at the first resolution and pixel data representing all or part of the reference frame at the at least one different resolution is present and available in memory at the same time when decoding the sequence of video frames.
15. The apparatus of claim 14, wherein the reference frame generation stage is configured to generate the pixel data representing the reference frame at the at least one different resolution to the first resolution using the pixel data that has been generated for the reference frame at the first resolution.
16. The apparatus of claim 14 or 15, wherein each one of the at least one different resolution to the first resolution is a lower resolution than the first resolution.
17. The apparatus of claim 14, 15 or 16, wherein the pixel data that is generated and stored, and which represents the reference frame at a first resolution and at at least one different resolution is in the form of two or more or all levels of a mipmap set representing the reference frame.
18. The apparatus of any one of claims 14 to 17, wherein one or more or all of the at least one different resolution to the first resolution that the reference frame is generated at is selected based on the resolution of an electronic display or displays on which the video sequence is to be displayed.
19. The apparatus of any one of claims 14 to 18, wherein: the number of different resolutions at which the reference frame is to be represented is selected based on the number of different displays in the overall video processing system on which the video sequence is to be displayed.
20. The apparatus of any one of claims 14 to 19, further comprising: a frame processing stage configured to process at least one frame in the sequence of video frames using the pixel data representing the reference frame at at least one different resolution to the first resolution.
21. The apparatus of claim 20, wherein the frame processing stage is configured to: generate pixel data representing an output frame for display using the pixel data representing the reference frame at at least one lower resolution than the first resolution.
22. The apparatus of any one of claims 14 to 21, wherein, if the reference frame is to be displayed: the pixel data for the lower resolution reference frame is used as the pixel data for the displayed frame.
23. The apparatus of any one of claims 14 to 22, wherein the apparatus is configured to, when the current frame in the sequence of video frames is encoded with respect to the reference frame: decode the current frame using the encoded pixel data for the current frame and the pixel data representing the reference frame at the first resolution; and generate and store in memory pixel data representing the current frame at at least one different resolution to the first resolution.
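Claim 23's decode step — reconstructing a frame encoded with respect to the first-resolution reference, then storing the result at further resolutions — can be sketched as residual addition. Motion compensation is omitted for brevity, and all names are illustrative rather than taken from the patent.

```python
# Hedged sketch of the decode path in claim 23: the current frame is
# reconstructed from the first-resolution reference plus an encoded
# residual, clamped to the 8-bit pixel range. In the claimed apparatus
# lower-resolution copies of the result would then also be generated
# and stored.

def decode_predicted_frame(reference, residual):
    """Reconstruct each pixel as reference + residual, clamped to [0, 255]."""
    return [
        [min(255, max(0, reference[y][x] + residual[y][x]))
         for x in range(len(reference[0]))]
        for y in range(len(reference))
    ]

reference = [[100, 110], [120, 130]]
residual = [[5, -10], [0, 200]]
decoded = decode_predicted_frame(reference, residual)
```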
24. The apparatus of any one of claims 14 to 23, wherein the apparatus further comprises: a compression stage configured to store the pixel data representing all or part of the reference frame at the first resolution and the at least one different resolution to the first resolution in memory in a compressed form.
25. A video processing system, in which data for one or more frames in a sequence of video frames is defined with respect to a reference frame, the video processing system comprising: a memory; an apparatus as claimed in any one of claims 14 to 24; and a display controller configured to: read in from the memory the pixel data representing all or part of the reference frame at the at least one different resolution to the first resolution; use the pixel data when generating an output frame to be displayed; and output the generated output frame to a display.
26. A video processing system, in which data for one or more frames in a sequence of video frames is defined with respect to a reference frame, the video processing system comprising: a memory; an apparatus as claimed in any one of claims 14 to 24; and a processing unit configured to: read in from memory the pixel data representing all or part of the reference frame at the at least one different resolution to the first resolution; use the pixel data when performing a processing operation on a frame in the sequence of video frames to generate pixel data representing a processed frame; and store the pixel data representing the processed frame in memory.
27. A computer program comprising software code adapted to perform the method of any one of claims 1 to 13 when the program is run on a data processing system.
GB1510168.6A 2015-06-11 2015-06-11 Video processing system Active GB2539241B (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
GB1510168.6A GB2539241B (en) 2015-06-11 2015-06-11 Video processing system
KR1020160070722A KR20160146542A (en) 2015-06-11 2016-06-08 Video processing system
US15/177,685 US10440360B2 (en) 2015-06-11 2016-06-09 Video processing system
EP16174023.8A EP3104613A1 (en) 2015-06-11 2016-06-10 Video processing system
CN201610407439.5A CN106254877B (en) 2015-06-11 2016-06-12 Video processing system, method, device and storage medium for processing video data frame

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1510168.6A GB2539241B (en) 2015-06-11 2015-06-11 Video processing system

Publications (3)

Publication Number Publication Date
GB201510168D0 GB201510168D0 (en) 2015-07-29
GB2539241A GB2539241A (en) 2016-12-14
GB2539241B true GB2539241B (en) 2019-10-23

Family

ID=53784494

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1510168.6A Active GB2539241B (en) 2015-06-11 2015-06-11 Video processing system

Country Status (5)

Country Link
US (1) US10440360B2 (en)
EP (1) EP3104613A1 (en)
KR (1) KR20160146542A (en)
CN (1) CN106254877B (en)
GB (1) GB2539241B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10887600B2 (en) * 2017-03-17 2021-01-05 Samsung Electronics Co., Ltd. Method and apparatus for packaging and streaming of virtual reality (VR) media content
WO2019050067A1 (en) * 2017-09-08 2019-03-14 라인 가부시키가이샤 Video quality control
KR101987062B1 (en) * 2017-11-21 2019-06-10 (주)루먼텍 System for distributing and combining multi-camera videos through ip and a method thereof
GB2583061B (en) * 2019-02-12 2023-03-15 Advanced Risc Mach Ltd Data processing systems
CN114245219A (en) * 2019-02-27 2022-03-25 深圳市大疆创新科技有限公司 Video playback method of shooting device and shooting device
CN114342407A (en) * 2019-11-29 2022-04-12 阿里巴巴集团控股有限公司 Region-of-interest aware adaptive resolution video coding
CN111179403A (en) * 2020-01-21 2020-05-19 南京芯瞳半导体技术有限公司 Method and device for parallel generation of texture mapping Mipmap image and computer storage medium
US11570479B2 (en) 2020-04-24 2023-01-31 Samsung Electronics Co., Ltd. Camera module, image processing device and image compression method
WO2022082363A1 (en) * 2020-10-19 2022-04-28 Qualcomm Incorporated Processing image data by prioritizing layer property

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6108039A (en) * 1996-05-23 2000-08-22 C-Cube Microsystems, Inc. Low bandwidth, two-candidate motion estimation for interlaced video
US20100316127A1 (en) * 2009-06-12 2010-12-16 Masayuki Yokoyama Image processing device and image processing method
EP2534828A2 (en) * 2010-02-11 2012-12-19 Microsoft Corporation Generic platform video image stabilization
US8358701B2 (en) * 2005-04-15 2013-01-22 Apple Inc. Switching decode resolution during video decoding
GB2495301A (en) * 2011-09-30 2013-04-10 Advanced Risc Mach Ltd Encoding texture data using partitioning patterns

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10257502A (en) * 1997-03-17 1998-09-25 Matsushita Electric Ind Co Ltd Hierarchical image encoding method, hierarchical image multiplexing method, hierarchical image decoding method and device therefor
US6148033A (en) * 1997-11-20 2000-11-14 Hitachi America, Ltd. Methods and apparatus for improving picture quality in reduced resolution video decoders
US6370192B1 (en) * 1997-11-20 2002-04-09 Hitachi America, Ltd. Methods and apparatus for decoding different portions of a video image at different resolutions
US6343098B1 (en) * 1998-02-26 2002-01-29 Lucent Technologies Inc. Efficient rate control for multi-resolution video encoding
JP4502678B2 (en) * 2004-03-24 2010-07-14 株式会社バンダイナムコゲームス Program, information storage medium, and image generation system
JP2008502064A (en) * 2004-06-08 2008-01-24 スリー−ビィ・インターナショナル・リミテッド Display image texture
JP4462132B2 (en) * 2005-07-04 2010-05-12 ソニー株式会社 Image special effects device, graphics processor, program
JP4594201B2 (en) * 2005-09-28 2010-12-08 パナソニック株式会社 Image coding method, image coding apparatus, program, and integrated circuit
US7801223B2 (en) 2006-07-27 2010-09-21 Lsi Corporation Method for video decoder memory reduction
US8068700B2 (en) * 2007-05-28 2011-11-29 Sanyo Electric Co., Ltd. Image processing apparatus, image processing method, and electronic appliance
KR101446771B1 (en) * 2008-01-30 2014-10-06 삼성전자주식회사 Apparatus of encoding image and apparatus of decoding image
EP2104356A1 (en) 2008-03-18 2009-09-23 Deutsche Thomson OHG Method and device for generating an image data stream, method and device for reconstructing a current image from an image data stream, image data stream and storage medium carrying an image data stream
TWI362888B (en) * 2008-12-24 2012-04-21 Acer Inc Encoding and decoding method for video screen layout, encoding device, decoding device, and data structure
KR101444691B1 (en) * 2010-05-17 2014-09-30 에스케이텔레콤 주식회사 Reference Frame Composing and Indexing Apparatus and Method
JP5693716B2 (en) * 2010-07-08 2015-04-01 ドルビー ラボラトリーズ ライセンシング コーポレイション System and method for multi-layer image and video delivery using reference processing signals
EP2410746A1 (en) * 2010-07-20 2012-01-25 Siemens Aktiengesellschaft Video coding with reference frames of high resolution
CA2846425A1 (en) * 2011-08-30 2013-03-07 Nokia Corporation An apparatus, a method and a computer program for video coding and decoding
JP5722761B2 (en) 2011-12-27 2015-05-27 株式会社ソニー・コンピュータエンタテインメント Video compression apparatus, image processing apparatus, video compression method, image processing method, and data structure of video compression file
EP2803191B1 (en) * 2012-01-13 2021-05-05 InterDigital Madison Patent Holdings Method and device for coding an image block, corresponding method and decoding device
CN104641644A (en) * 2012-05-14 2015-05-20 卢卡·罗萨托 Encoding and decoding based on blending of sequences of samples along time
GB2514099B (en) * 2013-05-07 2020-09-09 Advanced Risc Mach Ltd A data processing apparatus and method for performing a transform between spatial and frequency domains when processing video data
US20160241884A1 (en) * 2015-02-13 2016-08-18 Tandent Vision Science, Inc. Selective perceptual masking via scale separation in the spatial and temporal domains for use in data compression with motion compensation

Also Published As

Publication number Publication date
US10440360B2 (en) 2019-10-08
GB201510168D0 (en) 2015-07-29
GB2539241A (en) 2016-12-14
CN106254877A (en) 2016-12-21
US20160366408A1 (en) 2016-12-15
EP3104613A1 (en) 2016-12-14
KR20160146542A (en) 2016-12-21
CN106254877B (en) 2023-05-23

Similar Documents

Publication Publication Date Title
GB2539241B (en) Video processing system
US10825128B2 (en) Data processing systems
US9116790B2 (en) Methods of and apparatus for storing data in memory in data processing systems
US8990518B2 (en) Methods of and apparatus for storing data in memory in data processing systems
CN106030652B (en) Method, system and composite display controller for providing output surface and computer medium
US10395394B2 (en) Encoding and decoding arrays of data elements
US20220014767A1 (en) Bit plane encoding of data arrays
GB2561152A (en) Data processing systems
CN112714357A (en) Video playing method, video playing device, electronic equipment and storage medium
US10824357B2 (en) Updating data stored in a memory
GB2552136B (en) Storing headers for encoded blocks of data in memory according to a tiled layout
US20140354641A1 (en) Methods of and apparatus for compressing depth data
KR101551915B1 (en) Device and method for video compression
US11263786B2 (en) Decoding data arrays
US10672367B2 (en) Providing data to a display in data processing systems
KR102267792B1 (en) Image block coding based on pixel-domain pre-processing operations on image block
US11600026B2 (en) Data processing systems
CN110545446B (en) Desktop image encoding and decoding methods, related devices and storage medium
US20040008213A1 (en) Tagging multicolor images for improved compression
KR101811774B1 (en) Apparatus and method for processing graphics
US8462168B2 (en) Decompression system and method for DCT-base compressed graphic data with transparent attribute
US10262632B2 (en) Providing output surfaces for display in data processing systems
JP2008219848A (en) Circuit and method for decoding and viewing of image file
KR20220149124A (en) A image signal processor, a method of operating a image signal processor and a image processing system including the image processing device