AU2021202099A1 - Systems and methods for encoder-guided adaptive-quality rendering - Google Patents
- Publication number
- AU2021202099A1
- Authority
- AU
- Australia
- Prior art keywords
- quality
- settings
- rendering
- renderer
- computer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/154—Measured or subjectively estimated visual quality after decoding, e.g. measurement of distortion
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/30—Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
- A63F13/35—Details of game servers
- A63F13/355—Performing operations on behalf of clients with restricted processing capabilities, e.g. servers transform changing game scene into an MPEG-stream for transmitting to a mobile phone or a thin client
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/28—Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/102—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
- H04N19/124—Quantisation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/134—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
- H04N19/156—Availability of hardware or computational resources, e.g. encoding based on power-saving criteria
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/169—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
- H04N19/17—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
- H04N19/172—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/189—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/10—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
- H04N19/189—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding
- H04N19/192—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the adaptation method, adaptation tool or adaptation type used for the adaptive coding the adaptation method, adaptation tool or adaptation type being iterative or recursive
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/85—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/50—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
- A63F2300/53—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing
- A63F2300/538—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers details of basic data processing for performing operations on behalf of the game client, e.g. rendering
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
Abstract
The present invention relates to a computer-implemented method for rendering. The method
comprises the steps of generating one or more reference images, and encoding the one or more
reference images for a partial range of encoder settings. Further, the method comprises
comparing, for each encoded reference image, one or more first perceived qualities to one or
more second perceived qualities. A match between the one or more first perceived qualities and
the one or more second perceived qualities results in an association of one or more encoder
settings with a matching rendering quality-settings profile. The method further comprises
generating a rendered image at a substantially identical perceived quality to an encoded frame.
Description
[0001] This application claims the benefit of the following U.S. Provisional Applications: No.
62/488,526, filed April 21, 2017, and No. 62/653,056, filed April 5, 2018.
[0002] Remote gaming applications, in which a server-side game is controlled by a client-side
player, have attempted to encode the video output from a three-dimensional (3D) graphics engine
in real-time using existing or customized codecs, also referred to as encoders. However, the
interactive nature of video games, particularly the player feedback loop between video output
and player input, makes game video streaming much more sensitive to latency than traditional
video streaming. Existing video coding methods can trade computational power, and little else,
for reductions in encoding time. New methods for integrating the encoding process into the video
rendering process can provide significant reductions in encoding time while also reducing
computational power, improving the quality of the encoded video, and retaining the original
bitstream data format to preserve interoperability with existing hardware devices.
[0003] When a video game instance is running on hardware local to the player, it is desirable to
have the game output each pixel at the highest quality. However, in a server-side game instance
where rendered output is encoded and transmitted to a remote client, the encoder may reduce
image quality to fit within a limited bandwidth. If rendered quality is dramatically higher than
the quality of the decoded output, there is a measurable amount of server-side rendering work
that is lost.
[0004] By adapting the server-side rendered quality to match the post-quantization quality based
on feedback from the encoder, the game can reduce wasted server-side computation without any
noticeable client-side quality loss. The reduction in server-side computational waste may also
result in additional benefits including reduced energy usage, reduced rendering times, and
reduced player-feedback latency. The server-side computational savings is compounded in
environments where multiple game instances are running on the same server.
[0005] In streaming environments for games that involve multiple players, particularly games
such as Massive Multiplayer Online Games ("MMOGs"), ensuring that server-side rendering
work is not wasted becomes increasingly important. Due to the limited bandwidth available to
players of MMOGs, an encoder that maximizes rendering quality while preventing a slowdown
in the game is particularly important. Current technologies, as discussed below, adopt various
methods to attempt to address this problem, but remain deficient.
[0006] U.S. Patent Publication No. US20170132830A1 ("the '830 Publication"), discloses
systems and methods for determining a select shading point in a 3D scene on which shading is to
be performed, performing the shading on the determined shading point, and determining shading
information of the 3D scene based on a result of the shading performed on the determined
shading point. The shading of the scene is adjusted based on temporal characteristics of the
scene. However, this technology does not address the fundamental problem of optimizing
encoding based on server-side rendering capabilities and available bandwidth.
[0007] U.S. Patent Publication No. US20170200253A1 ("the '253 Publication") discloses
systems and methods for improving rendering performance of graphics processors. At the
graphics processor, an upper threshold can be set so that when a frame greater than the set
threshold is encountered, the graphics processor takes appropriate action to reduce rendering
time. However, this technology is based solely on a set threshold and does not dynamically adjust to server-side rendering capabilities and available bandwidth.
[0008] U.S. Patent Publication No. US2017/0278296A1 ("the '296 Publication") discloses
systems and methods in which the initial rendering of a scene that determines texture at each
portion of the scene is generated, and a ray traced rendering of the scene is generated by tracing
an initial sample of rays. This reference discloses that an optimal number of samples for each
pixel is intelligently determined based on foreknowledge of scene textures and identifying noise
arising due to under-sampling during ray tracing. Once more, this technology is limited to
optimal ray sampling and does not dynamically adjust to server-side rendering capabilities and
available bandwidth.
[0009] As is apparent from the above discussion of the state of the art in this technology, there is
a need in the art for an improvement to the present computer technology related to the rendering
and encoding of games.
[0010] According to one aspect of the present invention, there is provided a computer
implemented method for rendering, comprising the steps of:
generating one or more reference images;
encoding the one or more reference images for a partial range of encoder settings;
comparing, for each encoded reference image, one or more first perceived qualities to one
or more second perceived qualities, wherein a match between the one or more first perceived
qualities and the one or more second perceived qualities results in an association of one or more
encoder settings with a matching rendering quality-settings profile; and
generating a rendered image at a substantially identical perceived quality to an encoded frame.
[0011] According to another aspect of the present invention, there is provided a computer
implemented method for rendering, comprising:
reading a reported quantization parameter;
comparing the reported quantization parameter to effective quantization settings
associated with a previously rendered frame;
changing the rendering quality to match the effective quantization settings,
depending on the results of said comparison; and
generating a rendered image based on the altered rendering quality.
[0012] According to another aspect of the present invention, there is provided a system for
image rendering comprising a renderer, wherein:
the renderer reads a reported quantization parameter;
compares the reported quantization parameter to effective quantization settings
associated with a previously rendered frame;
changes the rendering quality to match the effective quantization settings,
depending on the results of said comparison; and
generates a rendered image based on the altered rendering quality.
[0013] According to another aspect of the present invention, there is provided a computer
implemented method for rendering comprising:
monitoring quantization settings for changes at a renderer;
querying a lookup table for a rendering quality-settings profile;
receiving a quantization parameter from an encoder;
applying a rendering quality-settings profile associated with the quantization parameter; and
changing rendering quality settings to match the quantization settings if the quantization settings differ from a previously rendered frame, wherein matching the quantization settings results in raising or lowering frame quality to match a compression level at an encoder.
[0014] A more complete appreciation of the invention and many of the attendant advantages
thereof will be readily obtained as the same becomes better understood by reference to the
following detailed description when considered in connection with the accompanying drawings,
wherein:
[0015] FIG. 1 is a diagram of an exemplary environment in which a livestreaming codec can
communicate settings back to the renderer producing the video, in accordance with an
embodiment of the invention;
[0016] FIG. 2 is a flow diagram outlining the exemplary stages of encoder-guided adaptive
quality rendering, in accordance with an embodiment of the invention;
[0017] FIG. 3 is a flow diagram outlining the exemplary pre-generation of the lookup table that
assigns a rendering quality-settings profile to each partial range of the encoder settings, in
accordance with an embodiment of the invention;
[0018] FIG. 4 is a diagram of an exemplary lookup table generation for rendering quality-setting
profiles which are comprised of only one setting, in accordance with an embodiment of the
invention; and
[0019] FIG. 5 is a diagram of an exemplary lookup table generation for a rendering quality-settings
profile which contains multiple settings, in accordance with an embodiment of the invention.
[0020] In describing a preferred embodiment of the invention illustrated in the drawings, specific
terminology will be resorted to for the sake of clarity. However, the invention is not intended to
be limited to the specific terms so selected, and it is to be understood that each specific term
includes all technical equivalents that operate in a similar manner to accomplish a similar
purpose. Several preferred embodiments of the invention are described for illustrative purposes,
it being understood that the invention may be embodied in other forms not specifically shown in
the drawings.
[0021] Modern rendering engines, such as those used in video games, have the ability to adapt
certain quality settings during runtime based on factors such as a player's distance from an
object, the rendering time of the previous frame, or other runtime measurements. A rendering
engine may provide several methods to adjust quality, allowing for more granular control of the
overall rendered quality. Some examples include biasing texture sampling to use blurrier
mipmaps, using lower quality cascades or fewer samples on shadows, running a simplified path
on the shading model (e.g. DCT-transforms of specular to look like diffuse), and using fewer
samples for post processing (e.g. for Gaussian, volumetric fog, etc.). In live-streaming
applications, altering one or more rendering quality settings in response to changes in encoder
settings may provide the best rendering-cost savings without impacting the encoded output quality.
[0022] FIG. 1 is a diagram of an exemplary environment in which real-time rendered video is
livestreamed to a remote viewer. The server 100 may be any hardware capable of simultaneously
running a real-time rendering process 102 (also referred to as a renderer below) and a streaming
codec 104. The codec 104 must also have the ability to communicate its quantization settings
back to the rendering process 102 through direct reporting or some other monitoring process
known in the art. The encoded video stream is transmitted over a network to a client device 106.
The client 106 may be any hardware capable of decoding and displaying the video stream.
[0023] FIG. 2 is a flow diagram outlining the exemplary stages of encoder-guided adaptive
quality rendering. Livestream encoding using an H.264 standard-compliant encoder typically
employs a Constant Rate Factor ("CRF") mode which reports the effective quantization settings
for an encoded frame as a quantization parameter ("QP") at "REPORT QUANTIZATION SETTINGS
FOR EACH ENCODED FRAME," step 200. In certain embodiments, the H.264 standard compliant
library used is ffmpeg, which outputs the quantization parameter as the variable f_crf_avg. The
quantization parameter is an index, ranging from 0 to 51, which defines how lossy the
compression is during encoding. Lower values of QP represent lower compression while higher
values of QP represent higher compression. In order to remain at a constant bitrate, an encoder
operating in CRF mode will increase the QP for frames which can afford higher compression and
decrease the QP for frames that require higher quality. The encoder takes advantage of the fact
that the human eye is less able to distinguish detail on moving objects by increasing compression
in areas which have comparatively high motion and decreasing compression in areas which are
relatively still. This allows the encoder to maintain a target perceived quality while reducing the size of some encoded frames.
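For illustration, the reported quantization settings of step 200 can be approximated outside the encoder by scraping the average QP figures that libx264 prints in its encode summary. The minimal sketch below assumes standard ffmpeg/libx264 command-line options and a summary log format that may vary between builds; the patent itself reads the value directly from the encoder (the f_crf_avg-style variable noted above).

```python
# Illustrative approximation only: scrape the per-frame-type "Avg QP" values
# that common libx264 builds print in their encode summary. The log format
# parsed here is an assumption and may differ between builds.
import re
import subprocess

def average_qp_of_crf_encode(input_pattern: str, crf: int) -> float:
    """Encode an image sequence at the given CRF and return the mean reported QP."""
    proc = subprocess.run(
        ["ffmpeg", "-y", "-i", input_pattern, "-c:v", "libx264",
         "-crf", str(crf), "out.mp4"],
        capture_output=True, text=True,
    )
    qps = [float(m) for m in re.findall(r"Avg QP\s*:\s*([0-9.]+)", proc.stderr)]
    if not qps:
        raise RuntimeError("no QP summary found; the log format may differ")
    return sum(qps) / len(qps)

# Lower QP means lighter compression; in CRF mode the encoder raises QP where
# the content tolerates it (high motion) and lowers it where detail must be kept.
```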
[0024] The renderer reads the reported QP before rendering a frame at "MONITOR
QUANTIZATION SETTINGS FOR CHANGES," step 202. At "DIFFERENT?," step 203, if the effective
quantization settings have not changed since the previously rendered frame, the renderer takes no
action to adapt rendering quality and will check again on the next frame. If the renderer reads a
QP value which is different from the previously rendered frame, or if this is the first encoded
frame for which encoder-guided adaptive-quality rendering is being performed, the rendering
quality is altered at "CHANGE RENDERING QUALITY SETTINGS TO MATCH QUANTIZATION
SETTINGS," step 204. If the QP value has increased since the previously rendered frame, the
renderer will lower the quality to match the compression level at the encoder. Likewise, if the QP
value has decreased since the previously rendered frame, the encoder will increase the quality.
To change the rendering settings, the renderer will check a pre-generated lookup table that
provides a rendering quality-settings profile for the encoder-provided QP value. In general, there
should be only one entry per encoder quality setting. The renderer uses the encoder-provided QP,
finds the one entry, and uses the associated rendering quality-settings profile. In general, the
entire rendering quality-settings profile is applied. A rendering quality-settings profile is defined
as a list of values for each available rendering quality setting. The pre-generation of this lookup
table is described in more detail in reference to FIG. 3. The pre-defined lookup table may define
rendering settings for integer-values of QP, which requires the renderer to round the read QP
value to the nearest integer, or the lookup table may define rendering settings for each partial
range of QP values between 0 and 51. The examples in FIG. 4 and FIG. 5 assume the renderer
will round the QP to the nearest integer before using the lookup table, but the examples may be
modified to define a lookup table using partial ranges of QP instead. The renderer will alter the quality settings according to the rendering quality-settings profile fetched from the lookup table before rendering the next frame. Reducing rendering quality will reduce the amount of rendering work that is wasted when the encoder bottlenecks the quality.
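The per-frame behavior of steps 202 through 204 reduces to a small compare-and-lookup routine. The sketch below assumes a lookup table keyed by integer QP and a hypothetical apply_profile hook exposed by the renderer; neither name comes from the patent.

```python
def on_frame_start(reported_qp, previous_qp, lookup_table, apply_profile):
    """Run once before rendering each frame; returns the QP now in effect.

    reported_qp   -- QP the encoder reported for the last encoded frame
    previous_qp   -- QP used for the previously rendered frame (None for the first frame)
    lookup_table  -- {integer QP: rendering quality-settings profile}
    apply_profile -- hypothetical renderer hook that applies a whole profile
    """
    qp = round(reported_qp)                       # table is keyed by integer QP (step 204)
    if previous_qp is None or qp != previous_qp:  # "DIFFERENT?" check (step 203)
        apply_profile(lookup_table[qp])           # raise or lower quality to match compression
    return qp
```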
[0025] FIG. 3 is a flow diagram outlining the exemplary pre-generation of the lookup table that
assigns a rendering quality-settings profile to each partial range of the encoder settings. A
reference image will be used as a baseline to measure the effects on perceived quality as the
encoding settings or rendering settings are changed. The reference image should represent a
typical frame of video output and include rendered elements such as models, textures, or visual
effects that are typical to a chosen game context. The game context might include a specific area,
specific map, specific level, or some specific gameplay. The selected reference image will be
used to generate a lookup table that estimates the perceived quality of video rendered within the
same context as the reference image. For example, the lookup table generated from a reference
image that contains a representative set of elements from a game level may be used to estimate
the perceived quality of video rendered from similar scenes within the same level. Methods for
combining multiple lookup tables into a generalized lookup table are discussed further below.
After a game context is identified, a representative scene should be chosen and rendered at full
quality, as shown at "SELECT AND GENERATE REFERENCE IMAGE" step 300. The full-quality
rendered scene of the representative scene is referred to herein as the reference image.
[0026] A preferred embodiment of the runtime behavior of the renderer, discussed above in
connection with the description of FIG. 2, requires the renderer to round the received values of
QP to the nearest integer before reading the lookup table. As a result, the lookup table will be
generated using only integer-values of QP. At the encoder, the full-quality reference image is
encoded for each integer-valued quality setting in the encoder, quantization parameter (QP) integer values 0 through 51, as shown at "ENCODE REFERENCE IMAGE FOR EACH PARTIAL RANGE
OF ENCODER SETTINGS," step 302. In the preferred embodiment, there are 52 partial ranges
which are defined by the rounding operation performed by the renderer. The implementation can
be modified to create more partial ranges for the more-common QP values, values in the middle
of the range from 0 to 51, or fewer partial ranges for the more-rare QP values, values at the
extremes of the range from 0 to 51.
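Step 302 can be sketched as a loop that encodes the full-quality reference image once for each integer QP value, here using libx264's constant-quantizer mode as a stand-in for fixing the effective QP. The file names and single-frame MP4 outputs are illustrative assumptions.

```python
import os
import subprocess

def encode_reference_for_all_qps(reference_png: str, out_dir: str = "encoded") -> None:
    """Encode the reference image once per integer QP (0-51), per step 302."""
    os.makedirs(out_dir, exist_ok=True)
    for qp in range(52):
        subprocess.run(
            ["ffmpeg", "-y", "-i", reference_png, "-frames:v", "1",
             "-c:v", "libx264", "-qp", str(qp),
             os.path.join(out_dir, f"ref_qp{qp:02d}.mp4")],
            check=True, capture_output=True,
        )
```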
[0027] Perceived quality is an attempt to quantify how well the human eye can perceive quality
loss between a compressed image and the full-quality source image. There are several methods
used to estimate perceived quality, including mean squared error (MSE) and peak signal-to-noise
ratio (PSNR), which use only the luminance and contrast value differences between two images
to calculate the quality of a compression codec. As disclosed by Z. Wang, A. C. Bovik, H. R.
Sheikh and E. P. Simoncelli, "Image quality assessment: From error visibility to structural
similarity," IEEE Transactions on Image Processing, vol. 13, no. 4, pp. 600-612, Apr. 2004, the
structural similarity (SSIM) index is a method which adds the assumption that the human eye is
also adept at extracting structural information from a scene and defines a calculation to estimate
perceived quality. SSIM works by comparing pixel-data between two images: the uncompressed
full-quality reference image to the encoded image. The algorithm compares the luminance,
contrast, structure, and sometimes chrominance over "windows" of 8x8 pixels. Because SSIM
has a low computation cost and outperforms methods like MSE and PSNR, it is the preferred
tool for calculating perceived quality. To generate the perceived quality for each value of the
encoder settings, preferably at the renderer and/or the game engine, the SSIM index is calculated
between each encoded reference image and the reference image as shown at "CALCULATE
PERCEIVED QUALITY FOR EACH ENCODED REFERENCE IMAGE," step 304. In the preferred
embodiment, 52 SSIM values are calculated, one for each quantization parameter (QP) integer,
with a value of 0 through 51. The exemplary descriptions in reference to FIG. 3, FIG. 4, and FIG. 5
use a standard SSIM calculation to compare two still images, but there are SSIM method
variants which can compare two video segments and which may be used instead at an increased
computational cost. One such SSIM variant is the Spatio-Temporal SSIM as disclosed by Anush
K. Moorthy and Alan C. Bovik, "Efficient Motion Weighted Spatio-Temporal Video SSIM
Index," Human Vision and Electronic ImagingXV, vol. 7527, Mar. 2010 (availableat
http://live.ece.utexas.edu/publications/2010/moorthy spie jan10.pdf).
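A minimal sketch of the SSIM calculation in step 304, using scikit-image and assuming each encoded reference image has first been decoded back to a still image file. scikit-image requires an odd window size, so a 7x7 window stands in for the 8x8 windows described above, and the comparison here is luminance-only.

```python
from imageio.v3 import imread
from skimage.color import rgb2gray
from skimage.metrics import structural_similarity

def _luminance(path: str):
    img = imread(path)
    # Drop any alpha channel and convert to grayscale floats in [0, 1].
    return rgb2gray(img[..., :3]) if img.ndim == 3 else img / 255.0

def perceived_quality(reference_path: str, encoded_path: str) -> float:
    """SSIM between the full-quality reference image and one encoded variant."""
    ref, enc = _luminance(reference_path), _luminance(encoded_path)
    return structural_similarity(ref, enc, win_size=7, data_range=1.0)
```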
[0028] The renderer may have several settings available for per-pixel-quality control including
screen resolution, mipmap selection, level-of-detail (LOD) selection, shadow quality, post
processing quality, or other settings. A quality-settings profile is defined as a list of values for
each available quality setting. In certain embodiments, at the renderer, a list of all rendering
settings which can be adaptively altered, along with their possible values, are gathered. Then all
permutations of adaptive quality rendering settings and their values are generated to create a list
of rendering quality-settings profiles, as shown at "GENERATE LIST OF RENDERING QUALITY
SETTINGS PROFILES," step 306. Since a renderer may have many quality settings with many
possible values, the number of permutations of quality-settings profiles may be prohibitively
long. The example of FIG. 5 discusses an exemplary method for limiting and optimizing the
number of quality-settings profiles in the list.
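Step 306 amounts to taking the Cartesian product of every adaptively controllable setting and its possible values. The setting names and value lists below are illustrative assumptions; a real renderer exposes many more, which is why FIG. 5 later prunes this space.

```python
from itertools import product

SETTINGS = {
    "postprocess_quality": [1, 2, 3, 4],
    "shadow_quality": ["LOW", "MED", "HIGH"],
    "resolution_scale": [1.00, 0.75, 0.50, 0.25],
}

def all_profiles(settings=SETTINGS):
    """Every permutation of setting values, one quality-settings profile each."""
    names = list(settings)
    return [dict(zip(names, values)) for values in product(*settings.values())]

# 4 * 3 * 4 = 48 profiles for these example settings.
```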
[0029] For each rendering quality-settings profile in the list, the reference image should be re
rendered at the renderer using the specified rendering settings, as shown at "RE-RENDER
REFERENCE IMAGE FOR EACH RENDERING QUALITY-SETTINGS PROFILE," step 308. If the
rendering quality-settings profiles are comprised of more than one setting, the rendering times for each re-rendered reference image should also be recorded as a measure of computation cost, exemplarily measured in rendering time or clock-cycles. This measure of computational cost may be used in a later step as a tie-breaker if there are any SSIM value collisions.
[0030] Using the same measure of perceived quality as previously used in step 304, the
perceived quality is measured by comparing each of the re-rendered images to the original
reference image, as shown at "CALCULATE PERCEIVED QUALITY FOR EACH RE-RENDERED
REFERENCE IMAGE," step 310. In the preferred embodiment, the structural similarity index
(SSIM) is used to measure the perceived quality of the encoder results and will be used to
measure the perceived quality of the re-rendering results.
[0031] At the renderer, the two sets of perceived quality values, the SSIM values for the encoded
reference images calculated at step 304 and the SSIM values for the per-profile re-rendered
reference images calculated at step 310, are compared across both image sets to find matching
SSIM values between the two sets. Ideally, for each encoded reference image's SSIM value,
there is one exact matching SSIM value from the set of per-profile re-rendered images. If there
are no exact matches, the chosen per-profile re-rendered image's SSIM value should be both
greater than and as close as possible to the target encoded reference image's SSIM value. The
matching SSIM values across both sets of perceived quality values will identify a rendering
quality-settings profile for each value of QP, as shown at "FIND A QUALITY-SETTINGS PROFILE
FOR EACH PARTIAL RANGE OF ENCODER SETTINGS," step 312. In cases where there is a collision,
where there are two or more exact matches from the set of SSIM values for the per-profile re
rendered images, the computational costs recorded in step 308 may be used as a tie-breaker and
the less costly rendering quality-settings profile selected for the encoder setting. FIG. 5 shows an example collision.
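Steps 310 through 312 can be sketched as follows: for each encoded reference image's SSIM value, select the re-rendered profile whose SSIM matches exactly or, failing that, is the closest value still at or above the target, breaking ties with the recorded rendering cost. The data structures are illustrative assumptions.

```python
def match_profiles(encoded_ssim_by_qp, rendered_results):
    """Build {QP: profile}, per step 312.

    encoded_ssim_by_qp -- {integer QP: SSIM of the encoded reference image}
    rendered_results   -- list of (profile, ssim, cost) tuples from steps 308-310
    """
    lookup = {}
    for qp, target in encoded_ssim_by_qp.items():
        candidates = [r for r in rendered_results if r[1] >= target]
        if not candidates:                                 # nothing reaches the target quality
            candidates = [max(rendered_results, key=lambda r: r[1])]
        closest = min(r[1] for r in candidates)            # as close to the target as possible
        tied = [r for r in candidates if r[1] == closest]  # collisions resolved by cost
        lookup[qp] = min(tied, key=lambda r: r[2])[0]      # cheapest profile wins
    return lookup
```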
[0032] The encoder settings and their matching rendering quality-settings profiles should be
organized into a lookup table as shown at "CREATE LOOKUP TABLE ASSIGNING A RENDERING
QUALITY-SETTINGS PROFILE FOR EACH ENCODER SETTING," step 314. This lookup table may be
used during runtime at the renderer to change the rendering quality settings to match the
quantization settings as described by step 204 in FIG. 2. The lookup table provides a rendering
quality-settings profile that generates an image of the same perceived quality as the encoded
frame and provides the largest computational savings for the given reference frame. Example
lookup tables are shown in FIG. 4 and FIG. 5.
[0033] The lookup table generated by the method described in connection with FIG. 3 may be
used within similar game contexts, scenes, or environments as the reference image. The process
outlined in connection with FIG. 3 may be repeated for several reference images, each
representative of a particular environment, scene type, or other meaningful game context. For
example, a reference image may be selected from each map in a game to generate multiple map
specific lookup tables. Lookup tables may also be combined to create a lookup table that can be
more generally used in the game environment. For example, map-specific lookup tables may be
combined to generate one lookup table that may be used for all maps in a game. To combine
lookup tables, the rendering quality-settings profiles for each QP may be combined to find an
average value for each setting contained in the profile. For example, three lookup tables are
generated for three reference images. The rendering quality-settings profiles are comprised of
three settings values: a post-processing quality setting, a shadow quality setting, and a resolution
setting. To combine the rendering quality-settings profiles for a QP value of 4, the profiles are
read from each lookup table and are represented as P4_1 = {3, MED, 95%}, P4_2 = {4, LOW, 90%}, and P4_3 = {2, MED, 90%}. The average values are found for each setting to generate P_Avg = {3,
MED, 92%}. A profile-averaging process should round up so that the rendering process is never
generating images at a lower perceived quality level than the current encoding quality setting.
The profiles are averaged for each value of QP and organized into a new lookup table.
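A sketch of this averaging step, under the assumption that shadow quality is an ordered LOW/MED/HIGH setting and that averages are rounded up (toward higher quality); with those assumptions it reproduces the P_Avg = {3, MED, 92%} result from the example above.

```python
import math

SHADOW_ORDER = ["LOW", "MED", "HIGH"]  # assumed ordering, lowest to highest quality

def combine_profiles(profiles):
    """Average one QP's profiles from several lookup tables, rounding up."""
    combined = {}
    for key in profiles[0]:
        values = [p[key] for p in profiles]
        if isinstance(values[0], str):                    # ordinal setting, e.g. shadows
            mean_idx = sum(SHADOW_ORDER.index(v) for v in values) / len(values)
            combined[key] = SHADOW_ORDER[math.ceil(mean_idx)]
        elif all(isinstance(v, int) for v in values):     # integer setting, e.g. post quality
            combined[key] = math.ceil(sum(values) / len(values))
        else:                                             # fractional setting, e.g. resolution
            combined[key] = math.ceil(sum(values) / len(values) * 100) / 100
    return combined

def combine_lookup_tables(tables):
    """Merge map-specific tables into one general table, QP value by QP value."""
    return {qp: combine_profiles([t[qp] for t in tables]) for qp in tables[0]}

# combine_profiles([{"post": 3, "shadow": "MED", "res": 0.95},
#                   {"post": 4, "shadow": "LOW", "res": 0.90},
#                   {"post": 2, "shadow": "MED", "res": 0.90}])
# -> {"post": 3, "shadow": "MED", "res": 0.92}
```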
[0034] FIG. 4 is an example of lookup table generation for rendering quality-setting profiles
which are comprised of only one setting. In this example, a single rendering quality setting is
adapted in response to changes in encoder quality settings. The rendering of a first-person view
of a 3D scene is adapted at the renderer by altering the resolution of the 3D portions of the view,
shown at "3D VIEW" 400, while the resolution of user interface (UI) elements, shown as "UI"
402, is not altered to maintain readability of any player-facing text. This type of selective
resolution-scaling is referred to as dynamic resolution scaling and is an increasingly common
feature of rendering engines. The reference image, shown at "REFERENCE IMAGE" 404,
represents a single frame from a typical video output rendered in the highest possible resolution
and is chosen in accordance with the guidelines outlined at step 300 of FIG. 3. At the encoder,
the reference image, shown at "REFERENCE IMAGE" 404, is encoded for each integer-value of
QP, as described in connection with step 302 of FIG. 3, to generate a list of encoded reference
images at "ENCODED REFERENCE IMAGES" 406. As described in connection with step 304 of FIG.
3, at the renderer, the SSIM values, shown as "SSIM" 408, are calculated for each encoded
reference image 406. Since the rendering quality-profile is comprised of only one quality setting,
the number of quality-profile permutations is limited to the number of possible values available
for the resolution of the 3D view, shown as "3D VIEW" 400. The number of possible resolution
values is upper-bounded by the maximum possible resolution of the 3D view and lower-bounded
by the minimum viable resolution for the 3D view. The aspect ratio may define how many resolution values exist between the minimum and maximum resolutions. For example, a maximum resolution of 3840 x 2160 has an aspect ratio of 16:9, and the minimum viable resolution in this aspect ratio is chosen as 1280 x 720. There are 160 possible resolutions with an aspect ratio of 16:9 between these upper and lower bounds. Alternatively, some number of resolutions between the upper and lower bounds may be arbitrarily selected as resolution samples. For example, the resolution may be incrementally reduced in the x direction between
3840 and 1280 to select some number of sample resolution sizes.
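The enumeration of candidate 16:9 resolutions described above can be sketched by stepping the width in multiples of 16 between the two bounds, which yields the roughly 160 sizes mentioned; a coarser stride gives arbitrarily chosen sample resolutions instead.

```python
def sixteen_by_nine_resolutions(min_width=1280, max_width=3840, stride=16):
    """All 16:9 sizes whose widths step by `stride` between the two bounds."""
    return [(w, w * 9 // 16) for w in range(min_width, max_width + 1, stride)]

sizes = sixteen_by_nine_resolutions()
# 161 sizes including both endpoints (roughly the 160 noted above);
# sizes[0] == (1280, 720) and sizes[-1] == (3840, 2160).
```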
[0035] At the renderer, the reference image is re-rendered, as shown at "RE-RENDERED
REFERENCE SEQUENCE" 410, for each of the available resolution sizes or each of the selected
sample resolution sizes, as described in connection with step 308 of FIG. 3. The SSIM values
shown as "SSIM" 412 are calculated for each re-rendered image at the renderer, as described by
step 310 of FIG. 3. The two sets of SSIM values, the SSIM values for the encoded reference
images, as shown at "SSIM" 408, and the SSIM values for the per-profile re-rendered reference
images, as shown at "RE-RENDERED REFERENCE SEQUENCE" 410, are compared to find matches
across the image sets in order to provide a resolution setting for each integer-value of QP. The
results are organized into a lookup table, as shown at "LOOKUP TABLE" 414, which will be used
during runtime. By reducing the 3D view resolution to match the quantization settings, the
wasted rendering work can be significantly reduced, which may result in additional benefits
including reduced energy usage on the server, reduced rendering times, and reduced
player-feedback latency. These benefits are compounded in environments where multiple game
instances are running on a single server.
[0036] FIG. 5 is an example of lookup table generation for a rendering quality-settings profile
which contains multiple settings. The process as described in connection with FIG. 3 is
unchanged for selecting a reference image and measuring the perceived quality for each encoder
setting as described in connection with steps 300, 302, and 304. Since the renderer may scale one
or more rendering quality settings in relation to the value of QP, the list of generated rendering
quality-settings profiles, described in connection with step 306 in FIG. 3, may be prohibitively
long to facilitate re-rendering the reference image and calculating a perceived quality for each
rendering quality-settings profile. Since there may be a very large number of rendering settings
permutations, a decision tree may help to programmatically narrow down the possibility space.
For example, it may be undesirable to have a rendering quality-settings profile in which the post
processing quality is very low, but every other setting is very high. In certain embodiments, it
may be undesirable for high-quality shadows to be covered with low-quality post processes. In
other embodiments, it may be the opposite. Decisions of this kind are subjective, but based on
criteria including, but not limited to, computational cost associated with a particular rendering
setting, perceptual quality differences between two values of a setting, the comparative
obviousness of one rendering setting over another (such as close-up effects that consume large
portions of the screen in comparison to far-away details that are only a few pixels wide), or
relative gameplay importance (such as visual effects that are important for communicating
feedback to the player).
[0037] FIG. 5 is an exemplary decision tree, as shown at "DECISION TREE" 500, which is
comprised of a leaf for each permutation of four possible post-processing quality settings, three
possible shadow quality settings, and five possible 3D view resolutions. This example decision
tree is significantly smaller than a real-world example might be, as there might be many more
adaptive rendering settings or many more options per setting, which will be apparent to one of ordinary skill in the art. The decision tree is preferably traversed according to any limiting conditions, such as avoiding leaves where post-processing quality is very low, but all other settings are high. For each leaf that is not removed by a limiting condition, the reference frame may be re-rendered with the rendering quality-settings profile associated with the leaf as described by step 308 in FIG. 3. The computational cost, measured in rendering time or clock-cycles, may be recorded at this point to be used as a potential tie-breaker in case of perceived quality value collisions. Then, the perceived quality may be measured for each re-rendered image, as described in connection with step 310 of FIG. 3. For each calculated perceived quality value
(SSIM) in the set calculated for the encoder settings, a list of all rendering quality-settings
profiles with a matching SSIM value may be generated as described in connection with step 312
of FIG. 3. The example of FIG. 5 shows this list being generated for a QP value of 16.
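A sketch of the pruning idea behind the decision tree of FIG. 5: enumerate the 4 x 3 x 5 = 60 leaves and discard combinations ruled out by a limiting condition. The particular condition and the value lists below are illustrative assumptions; as the text notes, such choices are subjective.

```python
from itertools import product

POST = [1, 2, 3, 4]                          # four post-processing quality levels
SHADOW = ["LOW", "MED", "HIGH"]              # three shadow quality levels
RESOLUTION = [1.00, 0.85, 0.70, 0.55, 0.40]  # five 3D view resolution scales

def allowed(post, shadow, resolution):
    # Example limiting condition: avoid leaves that pair the lowest
    # post-processing quality with high-quality shadows at near-full resolution.
    return not (post == 1 and shadow == "HIGH" and resolution >= 0.85)

leaves = [leaf for leaf in product(POST, SHADOW, RESOLUTION) if allowed(*leaf)]
# Only the surviving leaves are re-rendered and SSIM-scored (steps 308 and 310).
```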
[0038] The SSIM value for the reference image encoded with QP value 16 is 0.997, for which
there are three rendering quality-settings profiles with matching SSIM values, shown with
calculated computational costs 16.004, 15.554, and 15.402. Since there are three collisions for
the perceived quality value, the computational costs recorded earlier serve as a tiebreaker and
may be used to determine which rendering quality-settings profile is the cheapest, in this case,
that which has a cost of 15.402. A lookup table, as shown at "LOOKUP TABLE" 502, should be
generated to assign the cheapest rendering quality-settings profile to each value of QP as
described by step 314 in FIG. 3. The rendering quality-settings profile selected for the QP value
16 is shown in FIG. 5 as "PROFILE 16."
EXAMPLE 1: Effects on Rendering Time as a Proxy for Computational Waste
[0039] In an example implementation, only the resolution is scaled linearly in response to
changes in encoder quality. For example, if the encoder quality drops by 50%, the resolution will
be reduced by 50% in response. Since rendering time savings directly correlate to computational
power savings, the rendering times were examined while the resolution was scaled.
Measurements were taken in a low-motion environment, with a view comprising a first-person
view of the player's hands, weapon, and a stationary wall. This low-motion view was selected to
limit the number of factors that may contaminate the measurements by impacting the measured
rendering times. These factors may include post processes such as motion blur, changes in the
number of rendered objects, changes in the on-screen textures, or other components of the view
that are likely to change in high-motion views. A stationary view of a stationary scene also
makes it possible to directly compare various measurements taken at scaled resolutions. The
rendering engine was forced to output video at progressively lower resolutions and the results
were measured as shown in Table 1 below.
Resolution Scale | Opaque Pass Time | Total Rendering Time
---|---|---
100% | 0.4 ms | 1.4 ms
50% | 0.3 ms | 1.0 ms
25% | 0.2 ms | 0.8 ms
TABLE 1: Effects of Resolution Scaling on Rendering Time
[0040] The opaque pass is the portion of the rendering pipeline which draws the opaque
geometry in the view. This is the portion of the rendering pipeline which is most sensitive to
changes in resolution. Any rendering time savings or computational cost savings gained by
scaling the resolution will come mostly from the opaque rendering pass.
[0041] As shown in Table 1, at a full resolution of 1280 x 720 at 60 frames per second, the rendering time
for the opaque pass is 0.4 ms, out of a total rendering time of 1.4 ms. When the resolution is
reduced to 50% of the full resolution, the rendering time for the opaque pass is 0.3 ms, out of a
total rendering time of 1.0 ms. Scaling the resolution by 50% thus results in a significant
rendering time savings of almost 30%. When the resolution is reduced to 25% of the full
resolution, the rendering time for the opaque pass is 0.2 ms, out of a total rendering time of 0.8
ms. Scaling the resolution by 75% thus results in a significant rendering time savings of over 40%.
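These percentages simply restate the arithmetic implied by the total rendering times in Table 1:

savings at 50% scale = (1.4 ms - 1.0 ms) / 1.4 ms ≈ 28.6%
savings at 25% scale = (1.4 ms - 0.8 ms) / 1.4 ms ≈ 42.9%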
[0042] The foregoing description and drawings should be considered as illustrative only of the
principles of the invention. The invention is not intended to be limited by the preferred
embodiment and may be implemented in a variety of ways that will be clear to one of ordinary
skill in the art. Numerous applications of the invention will readily occur to those skilled in the
art. Therefore, it is not desired to limit the invention to the specific examples disclosed or the
exact construction and operation shown and described. Rather, all suitable modifications and
equivalents may be resorted to, falling within the scope of the invention.
[0043] The term 'comprise' and variants of the term such as 'comprises' or 'comprising' are
used herein to denote the inclusion of a stated integer or stated integers but not to exclude any
other integer or any other integers, unless in the context or usage an exclusive interpretation of
the term is required.
[0044] Any reference to publications cited in this specification is not an admission that the
disclosures constitute common general knowledge.
Claims (29)
1. A computer-implemented method for rendering, comprising the steps of:
generating one or more reference images;
encoding the one or more reference images for a partial range of encoder settings;
comparing, for each encoded reference image, one or more first perceived qualities to one
or more second perceived qualities, wherein a match between the one or more first perceived
qualities and the one or more second perceived qualities results in an association of one or more
encoder settings with a matching rendering quality-settings profile; and
generating a rendered image at a substantially identical perceived quality to an encoded
frame.
2. The computer-implemented method of claim 1, wherein the steps of the method
are performed at a renderer or a codec.
3. The computer-implemented method of claim 2, wherein the renderer may have
several settings available for per-pixel-quality control including screen resolution, mipmap
selection, level-of-detail (LOD) selection, shadow quality, and post-processing quality.
4. The computer-implemented method of claim 1, wherein the quality-settings
profiles are defined as a list of values for each available quality setting.
5. The computer-implemented method of claim 1, further comprising the step of
optimizing the quality-settings profiles.
6. The computer-implemented method of claim 5, wherein the quality-settings
profiles are optimized using a decision tree to programmatically narrow down the possibility space.
7. The computer-implemented method of claim 1, wherein the quality-settings
profiles are stored in one or more lookup tables.
8. A computer-implemented method for rendering, comprising:
reading a reported quantization parameter;
comparing the reported quantization parameter to effective quantization settings
associated with a previously rendered frame;
changing the rendering quality to match the effective quantization settings, depending on
the results of said comparison; and
generating a rendered image based on the altered rendering quality.
9. The computer-implemented method of claim 8, wherein the reported quantization
parameter is read at a renderer.
10. The computer-implemented method of claim 9, wherein the renderer reads the
reported quantization parameter prior to rendering each frame of the rendered image.
11. The computer-implemented method of claim 8, wherein the rendering quality
changes if the quantization parameter is different from the previously rendered frame or if it is
the first frame to be encoded.
12. The computer-implemented method of claim 8, wherein the rendering quality is
lowered if the quantization parameter has increased since the previously rendered frame.
13. The computer-implemented method of claim 12, wherein the rendering quality
matches a compression level at an encoder.
14. The computer-implemented method of claim 8, wherein a renderer checks a pre
generated lookup table that includes a rendering quality-settings profile associated with the
quantization parameter to determine how to change the rendering quality.
15. The computer-implemented method of claim 14, wherein there is a unique
rendering quality-settings profile for each quantization parameter.
16. A system for image rendering comprising a renderer, wherein:
the renderer reads a reported quantization parameter;
compares the reported quantization parameter to effective quantization settings associated
with a previously rendered frame;
changes the rendering quality to match the effective quantization settings, depending on
the results of said comparison; and
generates a rendered image based on the altered rendering quality.
17. The system of claim 16, wherein the renderer reads the reported quantization
parameter prior to rendering each frame of the rendered image.
18. The system of claim 16, wherein the rendering quality changes if the quantization
parameter is different from the previously rendered frame or if it is the first frame to be encoded.
19. The system of claim 16, wherein the rendering quality is lowered if the
quantization parameter has increased since the previously rendered frame.
20. The system of claim 19, wherein the rendering quality matches a compression
level at an encoder.
21. The system of claim 16, wherein a renderer checks a pre-generated lookup table
that includes a rendering quality-settings profile associated with the quantization parameter to
determine how to change the rendering quality.
22. The system of claim 21, wherein there is a unique rendering quality-settings
profile for each quantization parameter.
23. The system of claim 21, wherein the rendering quality-settings profile is defined
as a list of values for each available rendering quality setting.
24. A computer-implemented method for rendering comprising:
monitoring quantization settings for changes at a renderer;
querying a lookup table for a rendering quality-settings profile;
receiving a quantization parameter from an encoder;
applying a rendering quality-settings profile associated with the quantization parameter; and
changing rendering quality settings to match the quantization settings if the quantization settings differ from a previously rendered frame, wherein matching the quantization settings results in raising or lowering frame quality to match a compression level at an encoder.
25. The method of claim 24, wherein the lookup table defines rendering settings for
each partial range of quantization parameters between 0 and 51.
26. The method of claim 24, wherein the renderer alters quality settings according to
the rendering quality-settings profile received from the lookup table before rendering a next
frame.
27. The method of claim 24, wherein the renderer takes no action if the quantization
settings have not changed.
28. The method of claim 24, wherein the renderer checks quantization settings for
every frame.
29. The method of claim 24, wherein the lookup table is comprised of a representative
set of elements from a game level.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2021202099A AU2021202099B2 (en) | 2017-04-21 | 2021-04-06 | Systems and methods for encoder-guided adaptive-quality rendering |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762488526P | 2017-04-21 | 2017-04-21 | |
US62/488,526 | 2017-04-21 | ||
US201862653056P | 2018-04-05 | 2018-04-05 | |
US62/653,056 | 2018-04-05 | ||
PCT/US2018/028645 WO2018195477A1 (en) | 2017-04-21 | 2018-04-20 | Systems and methods for encoder-guided adaptive-quality rendering |
AU2018254591A AU2018254591B2 (en) | 2017-04-21 | 2018-04-20 | Systems and methods for encoder-guided adaptive-quality rendering |
AU2021202099A AU2021202099B2 (en) | 2017-04-21 | 2021-04-06 | Systems and methods for encoder-guided adaptive-quality rendering |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2018254591A Division AU2018254591B2 (en) | 2017-04-21 | 2018-04-20 | Systems and methods for encoder-guided adaptive-quality rendering |
Publications (2)
Publication Number | Publication Date |
---|---|
AU2021202099A1 true AU2021202099A1 (en) | 2021-05-06 |
AU2021202099B2 AU2021202099B2 (en) | 2022-07-07 |
Family
ID=63854334
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2018254591A Active AU2018254591B2 (en) | 2017-04-21 | 2018-04-20 | Systems and methods for encoder-guided adaptive-quality rendering |
AU2021202099A Active AU2021202099B2 (en) | 2017-04-21 | 2021-04-06 | Systems and methods for encoder-guided adaptive-quality rendering |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2018254591A Active AU2018254591B2 (en) | 2017-04-21 | 2018-04-20 | Systems and methods for encoder-guided adaptive-quality rendering |
Country Status (14)
Country | Link |
---|---|
US (3) | US10313679B2 (en) |
EP (1) | EP3612978A4 (en) |
JP (1) | JP7145204B2 (en) |
KR (1) | KR102326456B1 (en) |
CN (1) | CN111033519B (en) |
AU (2) | AU2018254591B2 (en) |
BR (1) | BR112019021687A2 (en) |
CA (2) | CA3060578C (en) |
DE (1) | DE112018002109T5 (en) |
GB (3) | GB2587091B (en) |
RU (2) | RU2730435C1 (en) |
TW (2) | TWI755616B (en) |
WO (1) | WO2018195477A1 (en) |
ZA (2) | ZA201907682B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2554877B (en) * | 2016-10-10 | 2021-03-31 | Canon Kk | Methods, devices, and computer programs for improving rendering display during streaming of timed media data |
EP3550837A1 (en) * | 2018-04-06 | 2019-10-09 | Comcast Cable Communications LLC | Method for generating quantization matrices based on viewing conditions |
US11830225B2 (en) * | 2018-05-30 | 2023-11-28 | Ati Technologies Ulc | Graphics rendering with encoder feedback |
CN111836116A (en) * | 2020-08-06 | 2020-10-27 | 武汉大势智慧科技有限公司 | Network self-adaptive rendering video display method and system |
CN112206535B (en) * | 2020-11-05 | 2022-07-26 | 腾讯科技(深圳)有限公司 | Rendering display method and device of virtual object, terminal and storage medium |
KR102472971B1 (en) * | 2020-11-19 | 2022-12-02 | 네이버 주식회사 | Method, system, and computer program to optimize video encoding using artificial intelligence model |
CN116803087A (en) * | 2021-02-02 | 2023-09-22 | 索尼集团公司 | Information processing apparatus and information processing method |
CN116440501B (en) * | 2023-06-16 | 2023-08-29 | 瀚博半导体(上海)有限公司 | Self-adaptive cloud game video picture rendering method and system |
Family Cites Families (104)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4501980A (en) | 1982-06-04 | 1985-02-26 | Motornetics Corporation | High torque robot motor |
JPH06129865A (en) | 1992-10-20 | 1994-05-13 | Sumitomo Electric Ind Ltd | Single-mode fiber type depolarizer and manufacture thereof and optical fiber gyro |
US6064359A (en) * | 1997-07-09 | 2000-05-16 | Seiko Epson Corporation | Frame rate modulation for liquid crystal display (LCD) |
US7936818B2 (en) | 2002-07-01 | 2011-05-03 | Arris Group, Inc. | Efficient compression and transport of video over a network |
US6903662B2 (en) | 2002-09-19 | 2005-06-07 | Ergodex | Computer input device with individually positionable and programmable input members |
US8054880B2 (en) | 2004-12-10 | 2011-11-08 | Tut Systems, Inc. | Parallel rate control for digital video encoder with multi-processor architecture and picture-based look-ahead window |
US7408984B2 (en) | 2003-09-17 | 2008-08-05 | International Business Machines Corporation | Method and system for multiple pass video coding |
US7362804B2 (en) | 2003-11-24 | 2008-04-22 | Lsi Logic Corporation | Graphical symbols for H.264 bitstream syntax elements |
US20060230428A1 (en) | 2005-04-11 | 2006-10-12 | Rob Craig | Multi-player video game system |
JP4816874B2 (en) | 2005-05-31 | 2011-11-16 | NEC Corporation | Parameter learning apparatus, parameter learning method, and program |
US20070147497A1 (en) * | 2005-07-21 | 2007-06-28 | Nokia Corporation | System and method for progressive quantization for scalable image and video coding |
KR100790986B1 (en) * | 2006-03-25 | 2008-01-03 | Samsung Electronics Co., Ltd. | Apparatus and Method for controlling bit rate in variable bit rate video coding |
US20080012856A1 (en) * | 2006-07-14 | 2008-01-17 | Daphne Yu | Perception-based quality metrics for volume rendering |
US8711144B2 (en) * | 2006-08-01 | 2014-04-29 | Siemens Medical Solutions Usa, Inc. | Perception-based artifact quantification for volume rendering |
EP2177010B1 (en) | 2006-12-13 | 2015-10-28 | Quickplay Media Inc. | Mobile media platform |
JP4843482B2 (en) | 2006-12-27 | 2011-12-21 | Toshiba Corporation | Information processing apparatus and program |
US8069258B1 (en) | 2007-09-11 | 2011-11-29 | Electronic Arts Inc. | Local frame processing to apparently reduce network lag of multiplayer deterministic simulations |
US8954876B1 (en) * | 2007-10-09 | 2015-02-10 | Teradici Corporation | Method and apparatus for providing a session status indicator |
US8295624B2 (en) * | 2007-12-03 | 2012-10-23 | Ecole De Technologie Superieure | Method and system for generating a quality prediction table for quality-aware transcoding of digital images |
US9865043B2 (en) | 2008-03-26 | 2018-01-09 | Ricoh Company, Ltd. | Adaptive image acquisition and display using multi-focal display |
EP2364190B1 (en) | 2008-05-12 | 2018-11-21 | GameFly Israel Ltd. | Centralized streaming game server |
US8154553B2 (en) | 2008-05-22 | 2012-04-10 | Playcast Media Systems, Ltd. | Centralized streaming game server |
US8264493B2 (en) * | 2008-05-12 | 2012-09-11 | Playcast Media Systems, Ltd. | Method and system for optimized streaming game server |
US8678929B1 (en) | 2008-08-01 | 2014-03-25 | Electronic Arts Inc. | Client-side prediction of a local game object to reduce apparent network lag of multiplayer simulations |
US8654835B2 (en) * | 2008-09-16 | 2014-02-18 | Dolby Laboratories Licensing Corporation | Adaptive video encoder control |
FR2936926B1 (en) * | 2008-10-06 | 2010-11-26 | Thales Sa | SYSTEM AND METHOD FOR DETERMINING ENCODING PARAMETERS |
US9342901B2 (en) * | 2008-10-27 | 2016-05-17 | Autodesk, Inc. | Material data processing pipeline |
TW201018443A (en) * | 2008-11-07 | 2010-05-16 | Inst Information Industry | Digital filtering system, method and program |
KR101400935B1 (en) | 2008-11-11 | 2014-06-19 | Sony Computer Entertainment Inc. | Image processing device, information processing device, image processing method, and information processing method |
CN102308319A (en) * | 2009-03-29 | 2012-01-04 | 诺曼德3D有限公司 | System and format for encoding data and three-dimensional rendering |
EP2415269B1 (en) * | 2009-03-31 | 2020-03-11 | Citrix Systems, Inc. | A framework for quality-aware video optimization |
US10097946B2 (en) | 2011-12-22 | 2018-10-09 | Taiwan Semiconductor Manufacturing Co., Ltd. | Systems and methods for cooperative applications in communication systems |
JP2013507084A (en) * | 2009-10-05 | 2013-02-28 | I.C.V.T Ltd. | Method and system for image processing |
FR2954036B1 (en) * | 2009-12-11 | 2012-01-13 | Thales Sa | METHOD AND SYSTEM FOR DETERMINING ENCODING PARAMETERS ON VARIABLE RESOLUTION FLOWS |
US9338523B2 (en) | 2009-12-21 | 2016-05-10 | Echostar Technologies L.L.C. | Audio splitting with codec-enforced frame sizes |
US20110261885A1 (en) | 2010-04-27 | 2011-10-27 | De Rivaz Peter Francis Chevalley | Method and system for bandwidth reduction through integration of motion estimation and macroblock encoding |
WO2012078640A2 (en) * | 2010-12-06 | 2012-06-14 | The Regents Of The University Of California | Rendering and encoding adaptation to address computation and network bandwidth constraints |
WO2013023287A1 (en) * | 2011-08-16 | 2013-02-21 | Destiny Software Productions Inc. | Script-based video rendering |
JP5155462B2 (en) | 2011-08-17 | 2013-03-06 | Square Enix Holdings Co., Ltd. | VIDEO DISTRIBUTION SERVER, VIDEO REPRODUCTION DEVICE, CONTROL METHOD, PROGRAM, AND RECORDING MEDIUM |
US8542933B2 (en) | 2011-09-28 | 2013-09-24 | Pelican Imaging Corporation | Systems and methods for decoding light field image files |
US10091513B2 (en) * | 2011-09-29 | 2018-10-02 | Texas Instruments Incorporated | Perceptual three-dimensional (3D) video coding based on depth information |
US10085023B2 (en) * | 2011-10-05 | 2018-09-25 | Texas Instruments Incorporated | Systems and methods for quantization of video content |
JP5977023B2 (en) | 2011-11-07 | 2016-08-24 | Square Enix Holdings Co., Ltd. | Drawing system, program, and recording medium |
US9300980B2 (en) | 2011-11-10 | 2016-03-29 | Luca Rossato | Upsampling and downsampling of motion maps and other auxiliary maps in a tiered signal quality hierarchy |
FR2987086B1 (en) * | 2012-02-22 | 2014-03-21 | Snecma | LINEAR JOINT OF PLATFORM INTER-AUBES |
US8655030B2 (en) * | 2012-04-18 | 2014-02-18 | Vixs Systems, Inc. | Video processing system with face detection and methods for use therewith |
US20150169960A1 (en) * | 2012-04-18 | 2015-06-18 | Vixs Systems, Inc. | Video processing system with color-based recognition and methods for use therewith |
KR20150014496A (en) | 2012-05-14 | 2015-02-06 | Luca Rossato | Encoding and reconstruction of residual data based on support information |
US9313495B2 (en) | 2012-05-14 | 2016-04-12 | Luca Rossato | Encoding and decoding based on blending of sequences of samples along time |
US8897586B2 (en) * | 2012-06-15 | 2014-11-25 | Comcast Cable Communications, Llc | Dynamic generation of a quantization matrix for compression of a digital object |
BR112015006178B1 (en) | 2012-09-21 | 2022-11-16 | Nokia Technologies Oy | METHODS, APPARATUS AND COMPUTER READABLE NON-TRANSITORY MEDIA FOR VIDEO ENCODING AND DECODING |
US10616086B2 (en) * | 2012-12-27 | 2020-04-07 | Nvidia Corporation | Network adaptive latency reduction through frame rate control |
US9899007B2 (en) | 2012-12-28 | 2018-02-20 | Think Silicon Sa | Adaptive lossy framebuffer compression with controllable error rate |
US9924200B2 (en) | 2013-01-24 | 2018-03-20 | Microsoft Technology Licensing, Llc | Adaptive noise reduction engine for streaming video |
BR112015015575A2 (en) | 2013-01-30 | 2020-02-04 | Intel Corp | adaptive content partitioning for next-generation video prediction and coding |
US9106556B2 (en) * | 2013-02-11 | 2015-08-11 | Avaya Inc. | Method to achieve the use of an external metric as the primary tie-breaker in intermediate system to intermediate system (ISIS) route selections |
PL2959672T3 (en) | 2013-02-21 | 2020-06-01 | Koninklijke Philips N.V. | Improved hdr image encoding and decoding methods and devices |
US8842185B1 (en) * | 2013-03-14 | 2014-09-23 | Microsoft Corporation | HDMI image quality analysis |
US9661351B2 (en) | 2013-03-15 | 2017-05-23 | Sony Interactive Entertainment America Llc | Client side frame prediction for video streams with skipped frames |
US10040323B2 (en) | 2013-03-15 | 2018-08-07 | Bridgestone Americas Tire Operations, Llc | Pneumatic tire with bead reinforcing elements at least partially formed from carbon fibers |
GB2513345B (en) | 2013-04-23 | 2017-07-26 | Gurulogic Microsystems Oy | Data communication system and method |
US20140321561A1 (en) * | 2013-04-26 | 2014-10-30 | DDD IP Ventures, Ltd. | System and method for depth based adaptive streaming of video information |
US9079108B2 (en) | 2013-05-31 | 2015-07-14 | Empire Technology Development Llc | Cache-influenced video games |
WO2015027283A1 (en) * | 2013-08-29 | 2015-03-05 | Smart Services Crc Pty Ltd | Quality controller for video image |
GB2519336B (en) | 2013-10-17 | 2015-11-04 | Imagination Tech Ltd | Tone Mapping |
US9940904B2 (en) | 2013-10-23 | 2018-04-10 | Intel Corporation | Techniques for determining an adjustment for a visual output |
US9749642B2 (en) | 2014-01-08 | 2017-08-29 | Microsoft Technology Licensing, Llc | Selection of motion vector precision |
US20150228106A1 (en) | 2014-02-13 | 2015-08-13 | Vixs Systems Inc. | Low latency video texture mapping via tight integration of codec engine with 3d graphics engine |
US20150237351A1 (en) * | 2014-02-18 | 2015-08-20 | Penne Lee | Techniques for inclusion of region of interest indications in compressed video data |
US9928610B2 (en) | 2014-06-27 | 2018-03-27 | Samsung Electronics Co., Ltd. | Motion based adaptive rendering |
WO2016014852A1 (en) * | 2014-07-23 | 2016-01-28 | Sonic Ip, Inc. | Systems and methods for streaming video games using gpu command streams |
EP3188762A4 (en) | 2014-07-31 | 2018-01-10 | The Board of Regents of the University of Oklahoma | High isomerohydrolase activity mutants of mammalian rpe65 |
US9762919B2 (en) | 2014-08-28 | 2017-09-12 | Apple Inc. | Chroma cache architecture in block processing pipelines |
WO2016050991A1 (en) | 2014-10-03 | 2016-04-07 | RUIZ COLL, José Damián | Method for improving the quality of an image subjected to recoding |
US10063866B2 (en) | 2015-01-07 | 2018-08-28 | Texas Instruments Incorporated | Multi-pass video encoding |
US10098960B2 (en) | 2015-04-03 | 2018-10-16 | Ucl Business Plc | Polymer conjugate |
EP3286918A1 (en) | 2015-04-21 | 2018-02-28 | VID SCALE, Inc. | Artistic intent based video coding |
JP6574270B2 (en) | 2015-06-05 | 2019-09-11 | Apple Inc. | Rendering and display of high dynamic range content |
IN2015CH02866A (en) * | 2015-06-09 | 2015-07-17 | Wipro Ltd | |
CN106293047B (en) | 2015-06-26 | 2020-01-10 | Microsoft Technology Licensing, LLC | Reducing power consumption of mobile devices by dynamic resolution scaling |
US9704270B1 (en) | 2015-07-30 | 2017-07-11 | Teradici Corporation | Method and apparatus for rasterizing and encoding vector graphics |
KR20180039722A (en) | 2015-09-08 | 2018-04-18 | LG Electronics Inc. | Method and apparatus for encoding / decoding image |
US9716875B2 (en) * | 2015-09-18 | 2017-07-25 | Intel Corporation | Facilitating quantization and compression of three-dimensional graphics data using screen space metrics at computing devices |
US9807416B2 (en) | 2015-09-21 | 2017-10-31 | Google Inc. | Low-latency two-pass video coding |
US10257528B2 (en) * | 2015-10-08 | 2019-04-09 | Electronics And Telecommunications Research Institute | Method and apparatus for adaptive encoding and decoding based on image quality |
JP6910130B2 (en) | 2015-11-06 | 2021-07-28 | Samsung Electronics Co., Ltd. | 3D rendering method and 3D rendering device |
US10163183B2 (en) | 2016-01-13 | 2018-12-25 | Rockwell Collins, Inc. | Rendering performance using dynamically controlled samples |
US10216467B2 (en) | 2016-02-03 | 2019-02-26 | Google Llc | Systems and methods for automatic content verification |
US10499056B2 (en) * | 2016-03-09 | 2019-12-03 | Sony Corporation | System and method for video processing based on quantization parameter |
US9705526B1 (en) | 2016-03-17 | 2017-07-11 | Intel Corporation | Entropy encoding and decoding of media applications |
US10109100B2 (en) | 2016-03-25 | 2018-10-23 | Outward, Inc. | Adaptive sampling of pixels |
KR101713492B1 (en) * | 2016-06-27 | 2017-03-07 | Gachon University Industry-Academic Cooperation Foundation | Method for image decoding, method for image encoding, apparatus for image decoding, apparatus for image encoding |
CN106162195B (en) * | 2016-07-05 | 2018-04-17 | Ningbo University | 3D HEVC depth video information concealing method based on single-depth intra-frame mode |
WO2018037525A1 (en) * | 2016-08-25 | 2018-03-01 | NEC Display Solutions, Ltd. | Self-diagnostic imaging method, self-diagnostic imaging program, display device, and self-diagnostic imaging system |
EP3301921A1 (en) * | 2016-09-30 | 2018-04-04 | Thomson Licensing | Method and apparatus for calculating quantization parameters to encode and decode an immersive video |
US10070098B2 (en) * | 2016-10-06 | 2018-09-04 | Intel Corporation | Method and system of adjusting video quality based on viewer distance to a display |
GB2554877B (en) | 2016-10-10 | 2021-03-31 | Canon Kk | Methods, devices, and computer programs for improving rendering display during streaming of timed media data |
US10237293B2 (en) * | 2016-10-27 | 2019-03-19 | Bitdefender IPR Management Ltd. | Dynamic reputation indicator for optimizing computer security operations |
KR102651126B1 (en) | 2016-11-28 | 2024-03-26 | Samsung Electronics Co., Ltd. | Graphic processing apparatus and method for processing texture in graphics pipeline |
GB2558886B (en) | 2017-01-12 | 2019-12-25 | Imagination Tech Ltd | Graphics processing units and methods for controlling rendering complexity using cost indications for sets of tiles of a rendering space |
US10117185B1 (en) | 2017-02-02 | 2018-10-30 | Futurewei Technologies, Inc. | Content-aware energy savings for video streaming and playback on mobile devices |
US10979718B2 (en) * | 2017-09-01 | 2021-04-13 | Apple Inc. | Machine learning video processing systems and methods |
US10423587B2 (en) * | 2017-09-08 | 2019-09-24 | Avago Technologies International Sales Pte. Limited | Systems and methods for rendering graphical assets |
CN108334412A (en) | 2018-02-11 | 2018-07-27 | Shenyang Neusoft Medical Systems Co., Ltd. | Method and apparatus for displaying an image |
2018
- 2018-04-20 RU RU2019136805A patent/RU2730435C1/en active
- 2018-04-20 RU RU2020127411A patent/RU2759505C2/en active
- 2018-04-20 CA CA3060578A patent/CA3060578C/en active Active
- 2018-04-20 DE DE112018002109.2T patent/DE112018002109T5/en active Pending
- 2018-04-20 WO PCT/US2018/028645 patent/WO2018195477A1/en active Application Filing
- 2018-04-20 TW TW108125441A patent/TWI755616B/en active
- 2018-04-20 CN CN201880041154.1A patent/CN111033519B/en active Active
- 2018-04-20 TW TW107113586A patent/TWI669954B/en active
- 2018-04-20 US US15/959,069 patent/US10313679B2/en active Active
- 2018-04-20 GB GB2014205.5A patent/GB2587091B/en active Active
- 2018-04-20 KR KR1020197033915A patent/KR102326456B1/en active IP Right Grant
- 2018-04-20 CA CA3082771A patent/CA3082771C/en active Active
- 2018-04-20 GB GB1916965.5A patent/GB2576662B/en active Active
- 2018-04-20 BR BR112019021687-1A patent/BR112019021687A2/en unknown
- 2018-04-20 GB GB2112381.5A patent/GB2595197B/en active Active
- 2018-04-20 AU AU2018254591A patent/AU2018254591B2/en active Active
- 2018-04-20 EP EP18787539.8A patent/EP3612978A4/en active Pending
- 2018-04-20 JP JP2020507504A patent/JP7145204B2/en active Active
2019
- 2019-04-23 US US16/391,898 patent/US10554984B2/en active Active
- 2019-11-20 ZA ZA2019/07682A patent/ZA201907682B/en unknown
2020
- 2020-01-06 US US16/735,275 patent/US11330276B2/en active Active
- 2020-11-19 ZA ZA2020/07214A patent/ZA202007214B/en unknown
2021
- 2021-04-06 AU AU2021202099A patent/AU2021202099B2/en active Active
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2021202099B2 (en) | Systems and methods for encoder-guided adaptive-quality rendering | |
US10242462B2 (en) | Rate control bit allocation for video streaming based on an attention area of a gamer | |
AU2020289756B2 (en) | Systems and methods for rendering & pre-encoded load estimation based encoder hinting | |
RU2744982C2 (en) | Systems and methods for deferred post-processing operations when encoding video information | |
US11272185B2 (en) | Hierarchical measurement of spatial activity for text/edge detection | |
US20210092424A1 (en) | Adaptive framerate for an encoder |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FGA | Letters patent sealed or granted (standard patent) |