WO2024055286A1 - Systems and methods for efficient feature assessment for game visual quality - Google Patents

Systems and methods for efficient feature assessment for game visual quality

Info

Publication number
WO2024055286A1
Authority
WO
WIPO (PCT)
Prior art keywords
frame recording
frame
recording
frames
vrs
Prior art date
Application number
PCT/CN2022/119304
Other languages
French (fr)
Inventor
Junjie Liu
Bin LV
Xingli HE
Zhaoming CAI
Original Assignee
Qualcomm Incorporated
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Incorporated filed Critical Qualcomm Incorporated
Priority to PCT/CN2022/119304 priority Critical patent/WO2024055286A1/en
Publication of WO2024055286A1 publication Critical patent/WO2024055286A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/154Measured or subjectively estimated visual quality after decoding, e.g. measurement of distortion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/005General purpose rendering architectures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection

Definitions

  • VRS Variable rate shading
  • a computing device to adjust the shading performance of a video game image frame to balance tradeoffs between high speed, low latency, and low energy consumption.
  • the shading is calculated over several pixels at once.
  • VRS may accomplish pixel shading at a coarser, lower resolution to reduce post-processing time and the load on the computing device processor.
  • VRS allows the size of that combination of pixels to be adjusted on a per-frame basis, allowing the video game to take advantage of processing budget where it exists, or to scale back where performance is needed.
  • the various aspects include methods of assessing visual quality of a computer game output, which may include obtaining a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off, obtaining a second frame recording of the same segment of game replay generated with the VRS feature on, temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording, and using a visual quality assessment tool to compare the first frame recording to the second frame recording to output a comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
  • VRS variable rate shading
  • temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording may include sampling frames in each of the first frame recording and the second frame recording, computing peak signal-to-noise-ratio (PSNR) values of the sampled frames in each of the first frame recording and the second frame recording as an output statistic of PSNRs for describing a state at each index pair, storing all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions, using steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording, and outputting sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
  • PSNR peak signal-to-noise-ratio
  • the visual quality assessment tool may receive as input the sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices, and output key performance indicator (KPI) values of each of the first frame recording and the second frame recording for comparison.
  • KPI key performance indicator
  • obtaining the first frame recording of the segment of game replay generated with the VRS feature off may include extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off (in which K is a positive integer number)
  • obtaining the second frame recording of the segment of game replay generated with the VRS feature on may include extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on
  • computing PSNR values of the sampled frames in each of the first frame recording and the second frame recording may include computing the PSNR between the first K frames of game sequences started from index I with the VRS feature off and the first K frames of game sequences started from index J with the VRS feature on, in which I, J is a pair of candidate indices sampled from a fixed length frame sync window.
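The per-index-pair PSNR computation described above can be sketched in a few lines. This is a minimal illustration, not the application's implementation: it assumes frames are NumPy arrays with values in [0, 255], and the function names (`psnr`, `sequence_psnr`) are hypothetical.

```python
import numpy as np

def psnr(frame_a, frame_b, max_val=255.0):
    """Peak signal-to-noise ratio (PSNR) between two equally sized frames."""
    diff = frame_a.astype(np.float64) - frame_b.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10((max_val ** 2) / mse)

def sequence_psnr(seq_off, seq_on, i, j, k):
    """Mean PSNR over the first K frames of the VRS-off recording starting
    at candidate index I and the VRS-on recording starting at index J
    (one candidate index pair from the frame sync window)."""
    return float(np.mean([psnr(seq_off[i + n], seq_on[j + n]) for n in range(k)]))
```

Evaluating `sequence_psnr` over candidate pairs (I, J) yields the per-state statistic that the alignment search maximizes: the correct offset produces a much higher mean PSNR than a misaligned one.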
  • using the visual quality assessment tool to compare the first frame recording to the second frame recording may include using discrete Fourier transform (DFT) based operators to separate a high-frequency part from a low-frequency part of the sampled frames of each of the first frame recording and the second frame recording, using the high-frequency part of the sampled frames to identify inverse DFT shift frames, and using inverse DFT shift frames to output KPI sharpness scores for each of the first frame recording and the second frame recording.
  • DFT discrete Fourier transform
  • using the visual quality assessment tool to compare the first frame recording to the second frame recording may include one or more of: using a HaslerS03 method to output a colorfulness output of each of the first frame recording and the second frame recording, using a mean of a grayscale method to output a brightness KPI score of each of the first frame recording and the second frame recording, using a DFT-based algorithm to output a sharpness KPI score of each of the first frame recording and the second frame recording, using a BRISQUE algorithm to output a perception KPI score of each of the first frame recording and the second frame recording, or using a video multimethod assessment fusion (VMAF) method to output a fidelity KPI score of each of the first frame recording and the second frame recording.
  • a HaslerS03 method to output a colorfulness output of each of the first frame recording and the second frame recording
  • a mean of a grayscale method to output a brightness KPI score of each of the first frame recording and the second frame recording
  • using a DFT-based algorithm to output a sharpness KPI score of each of the first frame recording and the second frame recording
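Two of the KPIs above have well-known closed forms: the HaslerS03 colorfulness metric (Hasler & Süsstrunk, 2003) and brightness as the mean of a grayscale image. The sketch below assumes RGB frames as (H, W, 3) NumPy arrays; the function names and the ITU-R BT.601 luma weights for the grayscale conversion are assumptions, not taken from the application.

```python
import numpy as np

def colorfulness_hasler(rgb):
    """Hasler-Suesstrunk (2003) colorfulness metric for an RGB frame:
    sqrt(std_rg^2 + std_yb^2) + 0.3 * sqrt(mean_rg^2 + mean_yb^2)."""
    r, g, b = (rgb[..., c].astype(np.float64) for c in range(3))
    rg = r - g                    # red-green opponent channel
    yb = 0.5 * (r + g) - b        # yellow-blue opponent channel
    std_root = np.sqrt(rg.std() ** 2 + yb.std() ** 2)
    mean_root = np.sqrt(rg.mean() ** 2 + yb.mean() ** 2)
    return float(std_root + 0.3 * mean_root)

def brightness_kpi(rgb):
    """Brightness KPI: mean of the grayscale (luma) image, using BT.601 weights."""
    r, g, b = (rgb[..., c].astype(np.float64) for c in range(3))
    return float((0.299 * r + 0.587 * g + 0.114 * b).mean())
```

A neutral gray frame scores zero colorfulness (both opponent channels are identically zero), while a saturated frame scores high, which matches the intended use of the metric as a per-frame KPI.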
  • Further aspects include a computing device having a memory, and a processor coupled to the memory and configured to perform various operations corresponding to the method operations discussed above. Further aspects include a computing device having various means for performing functions corresponding to the method operations discussed above. Further aspects include a non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor to perform various operations corresponding to the method operations discussed above.
  • FIGs. 1-6 are process flow diagrams that illustrate methods of assessing visual quality of a computer game output in accordance with various embodiments.
  • FIG. 7 is a component block diagram illustrating components and operations in a computing system that may be configured to implement the various embodiments.
  • FIG. 8 is a component block diagram of an example computing system that includes system on chips (SOCs) suitable for implementing the various embodiments.
  • SOCs system on chips
  • FIG. 9 is a component block diagram of an example computing device, in the form of smartphone, that is suitable for implementing the various embodiments.
  • a computing device processor may be configured to obtain a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off, obtain a second frame recording of the same segment of game replay generated with the VRS feature on, temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording, and use a visual quality assessment tool to compare the first frame recording to the second frame recording, and output the comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
  • the output may be used to assess the quality loss of top games between VRS on/off on a sampled game frame sequence.
  • the various embodiments allow the computing device to better balance tradeoffs between performance, visual effects, memory usage, and the power consumption characteristics of the computing device. This may improve the performance and energy consumption characteristics of the computing device.
  • Game VEF Game Visual Effect Factors
  • computing device is used herein to refer to any one or all of internet-of-things (IOT) devices (e.g., smart televisions, smart speakers, smart locks, lighting systems, smart switches, smart plugs, smart doorbells, smart doorbell cameras, smart air pollution/quality monitors, smart smoke alarms, security systems, smart thermostats, etc.), server computing devices, personal computers, laptop computers, tablet computers, user equipment (UE), smartphones, personal or mobile multi-media players, personal data assistants (PDAs), palm-top computers, wireless electronic mail receivers, multimedia Internet enabled cellular telephones, gaming systems (e.g., PlayStation™, Xbox™, Nintendo Switch™, etc.
  • IOT internet-of-things
  • server computing devices personal computers, laptop computers, tablet computers, user equipment (UE) , smartphones, personal or mobile multi-media players, personal data assistants (PDAs) , palm-top computers, wireless electronic mail receivers, multimedia Internet enabled cellular telephones, gaming systems (e.g., PlayStation
  • wearable devices e.g., smartwatch, head-mounted display, fitness tracker, etc.
  • media players (e.g., digital video disc (DVD) players, ROKU™, AppleTV™, etc.)
  • DVRs digital video recorders
  • Internet access gateways modems, routers, network switches, residential gateways, access points, integrated access devices (IAD) , mobile convergence products, networking adapters, multiplexers, and other similar devices that include a programmable processor and communications circuitry for providing the functionality described herein.
  • IAD integrated access devices
  • SOC system on chip
  • a single SOC may contain circuitry for digital, analog, mixed-signal, and radio-frequency functions.
  • a single SOC may also include any number of general purpose and/or specialized processors (digital signal processors, modem processors, video processors, etc. ) , memory blocks (e.g., ROM, RAM, Flash, etc. ) , and resources (e.g., timers, voltage regulators, oscillators, etc. ) .
  • SOCs may also include software for controlling the integrated resources and processors, as well as for controlling peripheral devices.
  • SIP system in a package
  • a SIP may include a single substrate on which multiple IC chips or semiconductor dies are stacked in a vertical configuration.
  • the SIP may include one or more multi-chip modules (MCMs) on which multiple ICs or semiconductor dies are packaged into a unifying substrate.
  • MCMs multi-chip modules
  • a SIP may also include multiple independent SOCs coupled together via high speed communication circuitry and packaged in close proximity, such as on a single motherboard or in a single wireless device. The proximity of the SOCs facilitates high speed communications and the sharing of memory and resources.
  • multicore processor is used herein to refer to a single integrated circuit (IC) chip or chip package that contains two or more independent processing cores (e.g., CPU core, Internet protocol (IP) core, graphics processor unit (GPU) core, etc. ) configured to read and execute program instructions.
  • a SOC may include multiple multicore processors, and each processor in an SOC may be referred to as a core.
  • multiprocessor is used herein to refer to a system or device that includes two or more processing units configured to read and execute program instructions.
  • Enabling auto variable rate shading (VRS) on a computing device used for gaming may reduce power consumption by at least 2-6% on the computing device.
  • conventional solutions do not include or provide a tool for determining game visual effect sensitive KPI metrics or assessing the quality loss of images between VRS on versus VRS off on a sampled game frame sequence.
  • conventional solutions do not provide users or software developers with adequate tools for balancing tradeoffs between reduced power consumption, improved performance, and improved game visual effects on the computing device.
  • the various embodiments generate a comparison of the visual quality of computer game output with the VRS feature on and off.
  • some embodiments may temporally align an image frame captured with the VRS feature on with an image frame captured with the VRS feature off.
  • the computing device may capture sample frames with the VRS feature on and off, compute peak signal-to-noise-ratio (PSNR) values of the sampled frames, and use steepest ascent/descent analysis (or gradient ascent/descent analysis) of the PSNR values to match up the sampled frames.
  • PSNR peak signal-to-noise-ratio
  • the steepest ascent analysis may include well-known techniques that include performing a first-order iterative optimization algorithm for finding a local maximum (or minimum) of a differentiable function.
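The steepest-ascent search over the memoized state matrix M described above can be sketched as a hill climb over candidate index pairs (I, J). This is a hypothetical sketch, not the application's implementation: the window size, starting point, 8-neighborhood move set, and function names are all assumptions.

```python
import numpy as np

def steepest_ascent_align(score, window, start=(0, 0)):
    """Hill-climb over candidate index pairs (I, J) inside a fixed-length
    frame-sync window. Scores are memoized in matrix M so that states in
    neighboring regions are never recomputed; the climb stops at a local
    maximum of the PSNR-based state function."""
    M = np.full((window, window), np.nan)  # memoization matrix M

    def state(i, j):
        if np.isnan(M[i, j]):
            M[i, j] = score(i, j)  # compute each index-pair state only once
        return M[i, j]

    i, j = start
    while True:
        best = (state(i, j), i, j)
        # examine the 8-neighborhood of the current index pair
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if 0 <= ni < window and 0 <= nj < window:
                    best = max(best, (state(ni, nj), ni, nj))
        if (best[1], best[2]) == (i, j):
            return i, j, float(best[0])  # local maximum reached
        i, j = best[1], best[2]
```

With a unimodal score surface (as one would hope for PSNR around the true alignment), the climb converges to the peak while evaluating far fewer states than an exhaustive scan of the window.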
  • FIG. 1 illustrates a method 100 of assessing the visual quality of a computer game output in accordance with some embodiments.
  • Method 100 may be performed by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing device.
  • a processor e.g., a graphics processor, applications processor, etc.
  • the computing device may obtain a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off.
  • the computing device may extract a game replay/observe recording sample that includes the first K frames of game sequences with the VRS feature off.
  • K may be a positive integer number.
  • the computing device may obtain a second frame recording of the same segment of game replay (e.g., first K frames, etc. ) generated with the VRS feature on.
  • a second frame recording of the same segment of game replay e.g., first K frames, etc.
  • the computing device may temporally align the first and second frame recordings and identify an alignment index of the first and second frame recordings.
  • the computing device may sample frames in the first and second frame recordings, compute peak signal-to-noise-ratio (PSNR) values of the sampled frames, and output statistics of PSNRs describing a state at each index pair.
  • PSNR peak signal-to-noise-ratio
  • the computing device may store all index pair states with a PSNR based value in a memory matrix M.
  • the computing device may use steepest ascent analysis of the PSNR values to match up the sampled frames and identify a maximum estimation of a state matrix and the start frame indices of the first and second frame recordings.
  • the computing device may output sampled frames of the first and second frame recordings corresponding to the identified start frame indices for comparison by a visual quality assessment tool of the computing device.
  • the computing device may use the visual quality assessment tool to compare the first frame recording to the second frame recording.
  • the computing device may output the comparison of the visual quality of the computer game output with the VRS feature on and off.
  • the computing device may use discrete Fourier transform (DFT) based operators to separate a high-frequency part from a low-frequency part of the sampled frames of each of the first and second frame recordings, use the high-frequency part of the sampled frames to identify inverse DFT shift frames, and use inverse DFT shift frames to output KPI sharpness scores for each of the first and second frame recordings.
  • DFT discrete Fourier transform
  • the computing device may use a HaslerS03 method to output a colorfulness output of each of the first and second frame recordings, a mean of a grayscale method to output a brightness KPI score of each of the first and second frame recordings, a DFT-based algorithm to output a sharpness KPI score of each of the first and second frame recordings, a BRISQUE algorithm to output a perception KPI score of each of the first and second frame recordings, or a video multimethod assessment fusion (VMAF) method to output a fidelity KPI score of each of the first and second frame recordings.
  • VMAF video multimethod assessment fusion
  • FIG. 2 illustrates another method 200 of assessing the visual quality of a computer game output in accordance with some embodiments.
  • Method 200 may be performed by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing device.
  • a processor e.g., a graphics processor, applications processor, etc.
  • the computing device may perform the operations discussed above with reference to FIG. 1. For example, the computing device may obtain a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off in block 102 and obtain a second frame recording of the same segment of game replay (e.g., first K frames, etc. ) generated with the VRS feature on in block 104.
  • VRS variable rate shading
  • obtaining the first frame recording in block 102 may include extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off (in which K is a positive integer number)
  • obtaining the second frame recording in block 104 may include extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on.
  • the computing device may sample frames in each of the first and second frame recordings.
  • the computing device may compute peak signal-to-noise-ratio (PSNR) values of the sampled frames as an output statistic of PSNRs for describing a state at each index pair.
  • computing PSNR values of the sampled frames in each of the first frame recording and the second frame recording may include computing the PSNR between the first K frames of game sequences starting from index I with the VRS feature off and the first K frames of game sequences starting from index J with the VRS feature on, in which I, J is a pair of candidate indices sampled from a fixed length frame sync window.
  • the computing device may store all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions.
  • the computing device may use steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording.
  • the computing device may output sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
  • the computing device may use the visual quality assessment tool to compare the first frame recording to the second frame recording, and output a comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
  • FIG. 3 illustrates a method 300 of assessing the visual quality of a computer game output in accordance with some embodiments.
  • Method 300 may be performed by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing device.
  • a processor e.g., a graphics processor, applications processor, etc.
  • the computing device may perform the operations discussed above with reference to FIGs. 1 and 2 to generate and output sampled frames corresponding to the identified start frame indices.
  • the computing device may receive as input, in its visual quality assessment tool, the sampled frames corresponding to the identified start frame indices.
  • the computing device may compare the received inputs in the visual quality assessment tool.
  • the computing device may output key performance indicator (KPI) values for each of the first and second frame recordings. The computing device may allow a user to compare the KPI values and adjust the operating parameters of the computing device to balance tradeoffs between performance, power consumption, and visual display quality.
  • KPI key performance indicator
  • FIG. 4 illustrates a method 400 of assessing the visual quality of a computer game output in accordance with some embodiments.
  • Method 400 may be performed by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing device.
  • the computing device may obtain a first frame recording by extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off (where K is a positive integer number) .
  • the computing device may obtain a second frame recording by extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on.
  • the computing device may sample frames in each of the first and second frame recordings.
  • the computing device may compute peak signal-to-noise-ratio (PSNR) values between the first K frames of game sequences with the VRS feature off and the first K frames of game sequences with the VRS feature on.
  • the first K frames may start from indices I, J in the frame sync window, in which I is the start index from original sequence S1 in the range of a fixed length frame sync window (e.g., first 5 seconds, etc.) and J is the start index from original sequence S2 in the range of a fixed length frame sync window of the same length.
  • the computing device may choose a pair of candidate indices I, J in a fixed length frame sync window, then compute peak signal-to-noise-ratio (PSNR) values between the first K frames of game sequences starting from index I with the VRS feature off and the first K frames of game sequences starting from index J with the VRS feature on.
  • PSNR peak signal-to-noise-ratio
  • the computing device may perform the operations discussed above with reference to FIGs. 1 and 2.
  • the computing device may store all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions.
  • the computing device may use steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording.
  • the computing device may output sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
  • the computing device may use the visual quality assessment tool to compare the first frame recording to the second frame recording and output a comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
  • FIG. 5 illustrates a method 500 of assessing the visual quality of a computer game output in accordance with some embodiments.
  • Method 500 may be performed by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing device.
  • a processor e.g., a graphics processor, applications processor, etc.
  • the computing device may perform the operations discussed above with reference to FIGs. 2 and 4.
  • the computing device may generate a first sample frame sequence S1 based on the game replay/observe recording sample including the first K frames of game sequences with the VRS feature off.
  • the computing device may generate a second sample frame sequence S2 based on the game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on.
  • the computing device may generate a pair comparison sequence.
  • the computing device may generate an output array based on the pair comparison sequence.
  • the computing device may perform the operations discussed above with reference to FIGs. 1 and 2.
  • the computing device may store all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions.
  • the computing device may use steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording.
  • the computing device may output sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
  • the computing device may use the visual quality assessment tool to compare the first frame recording to the second frame recording and output a comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
  • FIG. 6 illustrates a method 600 of assessing the visual quality of a computer game output in accordance with some embodiments.
  • Method 600 may be performed by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing device.
  • a processor e.g., a graphics processor, applications processor, etc.
  • the computing device may perform the operations discussed above with reference to FIGs. 1 and 2.
  • the computing device may use discrete Fourier transform (DFT) based operators to separate a high-frequency part from a low-frequency part of the sampled frames of each of the first frame recording and the second frame recording.
  • DFT discrete Fourier transform
  • the computing device may use the high-frequency part of the sampled frames to identify inverse DFT shift frames.
  • the computing device may use inverse DFT shift frames to output KPI sharpness scores for each of the first frame recording and the second frame recording.
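One plausible reading of the DFT-based sharpness steps above is: transform each frame, shift the spectrum, suppress a low-frequency block around the center, inverse-shift and inverse-transform, then score the high-frequency residual. The sketch below follows that reading; the cutoff size and the mean-magnitude scoring rule are assumptions, not taken from the application.

```python
import numpy as np

def sharpness_kpi(gray, cutoff=8):
    """DFT-based sharpness KPI sketch for a single grayscale frame.

    Separates the high-frequency part from the low-frequency part by zeroing
    a (2*cutoff)^2 block around the shifted spectrum's center, reconstructs
    the inverse-DFT-shift frame, and scores its mean magnitude."""
    f = np.fft.fftshift(np.fft.fft2(gray.astype(np.float64)))
    h, w = gray.shape
    cy, cx = h // 2, w // 2
    f[cy - cutoff:cy + cutoff, cx - cutoff:cx + cutoff] = 0  # drop low frequencies
    high = np.fft.ifft2(np.fft.ifftshift(f))  # inverse DFT shift frame
    return float(np.mean(np.abs(high)))
```

A flat frame has all its energy at DC and scores near zero, while a high-contrast checkerboard keeps its energy outside the suppressed block and scores high, which is the ordering a sharpness KPI should produce.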
  • FIG. 7 illustrates components and operations in a computing system 700 that may be configured to assess the visual quality of a computer game output in accordance with some embodiments.
  • the computing system 700 may include an input 702 component, a temporal processing 704 component, a calculate KPIs 706 component, and an output 708 component.
  • the input 702 component may collect a set of target game config frame recordings (full reference frame with timestamp sync) , and a corresponding set of VRS enable frame recordings. For example, in block 710, the input 702 component may obtain a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off (or disabled) . In block 712, the input 702 component may obtain a second frame recording of the same segment of game replay with the VRS feature on (or enabled) .
  • VRS variable rate shading
  • the temporal processing 704 component may go through sequence processing steps to transform the input into pair comparison sequences {P(S′_1i, S′_2i), i ∈ [1, N]}, N ≤ min(len(S1), len(S2)).
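Once the start indices are aligned, forming the pair comparison sequence amounts to zipping the two recordings from their aligned offsets. A minimal sketch, with a hypothetical function name:

```python
def pair_comparison_sequence(s1, s2, i0, j0):
    """Build the pair comparison sequence {P(S'_1i, S'_2i), i in [1, N]},
    N <= min(len(S1), len(S2)), from aligned start indices i0 (VRS off)
    and j0 (VRS on)."""
    n = min(len(s1) - i0, len(s2) - j0)  # N, bounded by the shorter tail
    return [(s1[i0 + k], s2[j0 + k]) for k in range(n)]
```

Each pair in the resulting array is then fed to the KPI calculators, so both recordings are always compared frame against temporally matching frame.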
  • an integrated libvmaf component may calculate full reference fidelity to show the difference between VRS enabled and disabled.
  • Sr sample rate
  • Wr FrameSyncWindow
  • Ds SyncDuration
  • the temporal processing 704 component may solve the problem of mismatch between the starting points of the two frame sequences.
  • Older methods use human-eye selection to calculate fidelity (e.g., SSIM) for a single matched pair (only a game screenshot). This is not efficient for frame sequences.
  • the PSNR-based approach discussed in this application computes the PSNR between the VRS-enabled and target ( “VQ” ) frames iteratively until a match is found.
  • a matrix may be used for memoization to avoid repeated state calculations.
  • a Steepest Ascent calculation may be used to speed up estimation of the best-match PSNRs.
  • the calculate KPIs 706 component may receive and use the output array (e.g., P(S′_1i, S′_2i)) to determine key performance indicator (KPI) values and/or visual effect factors (VEF) for each of the first and second frame recordings.
  • KPI key performance indicator
  • VEF visual effect factors
  • the output 708 component may store the KPIs and/or VEFs in a logfile and/or trend chart that allows users to compare the KPI or VEF values and adjust the operating parameters of the computing device to balance tradeoffs between visual display quality and the performance and power consumption characteristics of the computing device.
  • the logfile or stored values may also be used to generate a trend curve output.
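The output step that lets users compare KPI or VEF values between VRS on and off can be as simple as computing per-KPI deltas for the logfile or trend chart. A hypothetical sketch, assuming KPIs are stored as name-to-score dictionaries and that a negative delta indicates quality loss with VRS enabled:

```python
def kpi_deltas(kpis_off, kpis_on):
    """Per-KPI difference (VRS on minus VRS off) for logging/trend charts.

    A negative value means the KPI dropped when VRS was enabled; users can
    weigh these losses against the power savings VRS provides."""
    return {name: kpis_on[name] - kpis_off[name] for name in kpis_off}
```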
  • FIG. 8 illustrates an example computing system or SIP 800 architecture that may be used in wireless devices implementing the various embodiments.
  • the illustrated example SIP 800 includes two SOCs 802, 804, a clock 806, and a voltage regulator 808.
  • the first SOC 802 may operate as the central processing unit (CPU) of the wireless device that carries out the instructions of software application programs by performing the arithmetic, logical, control and input/output (I/O) operations specified by the instructions.
  • the second SOC 804 may operate as a specialized processing unit.
  • the second SOC 804 may operate as a specialized 5G processing unit responsible for managing high volume, high speed (e.g., 5 Gbps, etc. ) , and/or very high frequency short wavelength (e.g., 28 GHz mmWave spectrum, etc. ) communications.
  • the first SOC 802 may include a digital signal processor (DSP) 810, a modem processor 812, a graphics processor 814, an application processor 816, one or more coprocessors 818 (e.g., vector co-processor) connected to one or more of the processors, memory 820, custom circuitry 822, system components and resources 824, an interconnection/bus module 826, one or more temperature sensors 830, a thermal management unit 832, and a thermal power envelope (TPE) component 834.
  • the second SOC 804 may include a 5G modem processor 852, a power management unit 854, an interconnection/bus module 864, a plurality of mmWave transceivers 856, memory 858, and various additional processors 860, such as an applications processor, packet processor, etc.
  • Each processor 810, 812, 814, 816, 818, 852, 860 may include one or more cores, and each processor/core may perform operations independent of the other processors/cores.
  • the first SOC 802 may include a processor that executes a first type of operating system (e.g., FreeBSD, LINUX, OS X, ANDROID etc. ) and a processor that executes a second type of operating system (e.g., MICROSOFT WINDOWS 10) .
  • processors 810, 812, 814, 816, 818, 852, 860 may be included as part of a processor cluster architecture (e.g., a synchronous processor cluster architecture, an asynchronous or heterogeneous processor cluster architecture, etc. ) .
  • the first and second SOC 802, 804 may include various system components, resources and custom circuitry for managing sensor data, analog-to-digital conversions, wireless data transmissions, and for performing other specialized operations, such as decoding data packets and processing encoded audio and video signals for rendering in a web browser.
  • the system components and resources 824 of the first SOC 802 may include power amplifiers, voltage regulators, oscillators, phase-locked loops, peripheral bridges, data controllers, memory controllers, system controllers, access ports, timers, and other similar components used to support the processors and software clients running on a wireless device.
  • the system components and resources 824 and/or custom circuitry 822 may also include circuitry to interface with peripheral devices, such as cameras, electronic displays, wireless communication devices, external memory chips, etc.
  • the first and second SOC 802, 804 may communicate via interconnection/bus module 850.
  • the various processors 810, 812, 814, 816, 818, may be interconnected to one or more memory elements 820, system components and resources 824, and custom circuitry 822, and a thermal management unit 832 via an interconnection/bus module 826.
  • the processor 852 may be interconnected to the power management unit 854, the mmWave transceivers 856, memory 858, and various additional processors 860 via the interconnection/bus module 864.
  • the interconnection/bus module 826, 850, 864 may include an array of reconfigurable logic gates and/or implement a bus architecture (e.g., CoreConnect, AMBA, etc. ) . Communications may be provided by advanced interconnects, such as high-performance networks-on-chip (NoCs) .
  • the first and/or second SOCs 802, 804 may further include an input/output module (not illustrated) for communicating with resources external to the SOC, such as a clock 806 and a voltage regulator 808.
  • various embodiments may be implemented in a wide variety of computing systems, which may include a single processor, multiple processors, multicore processors, or any combination thereof.
  • the smartphone 900 may include a first SOC 802 (e.g., a SOC-CPU) coupled to a second SOC 804 (e.g., a 5G capable SOC) .
  • the first and second SOCs 802, 804 may be coupled to internal memory 906, 916, a display 912, and to a speaker 914.
  • the smartphone 900 may include an antenna 904 for sending and receiving electromagnetic radiation that may be connected to a wireless data link and/or cellular telephone transceiver 908 coupled to one or more processors in the first and/or second SOCs 802, 804.
  • Smartphones 900 typically also include menu selection buttons or rocker switches 920 for receiving user inputs.
  • a typical smartphone 900 also includes a sound encoding/decoding (CODEC) circuit 910, which digitizes sound received from a microphone into data packets suitable for wireless transmission and decodes received sound data packets to generate analog signals that are provided to the speaker to generate sound.
  • one or more of the processors in the first and second SOCs 802, 804, wireless transceiver 908 and CODEC 910 may include a digital signal processor (DSP) circuit (not shown separately) .
  • the processors of the smart phone 900 may be any programmable microprocessor, microcomputer or multiple processor chip or chips that can be configured by software instructions (applications) to perform a variety of functions, including the functions of the various embodiments described below.
  • multiple processors may be provided, such as one processor within an SOC 804 dedicated to wireless communication functions and one processor within an SOC 802 dedicated to running other applications.
  • software applications may be stored in the memory 906, 916 before they are accessed and loaded into the processor.
  • the processors may include internal memory sufficient to store the application software instructions.
  • a component may be, but is not limited to, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer.
  • an application running on a wireless device and the wireless device may be referred to as a component.
  • One or more components may reside within a process and/or thread of execution and a component may be localized on one processor or core and/or distributed between two or more processors or cores. In addition, these components may execute from various non-transitory computer readable media having various instructions and/or data structures stored thereon. Components may communicate by way of local and/or remote processes, function or procedure calls, electronic signals, data packets, memory read/writes, and other known network, computer, processor, and/or process related communication methodologies.
  • Such services and standards include, e.g., third generation partnership project (3GPP) , long term evolution (LTE) systems, third generation wireless mobile communication technology (3G) , fourth generation wireless mobile communication technology (4G) , fifth generation wireless mobile communication technology (5G) , global system for mobile communications (GSM) , universal mobile telecommunications system (UMTS) , 3GSM, general packet radio service (GPRS) , code division multiple access (CDMA) systems (e.g., cdmaOne, CDMA1020TM) , enhanced data rates for GSM evolution (EDGE) , advanced mobile phone system (AMPS) , digital AMPS (IS-136/TDMA) , evolution-data optimized (EV-DO) , digital enhanced cordless telecommunications (DECT) , Worldwide Interoperability for Microwave Access (WiMAX) , wireless local area network (WLAN)
  • the processors discussed in this application may be any programmable microprocessor, microcomputer or multiple processor chip or chips that can be configured by software instructions (applications) to perform a variety of functions, including the functions of the various embodiments described above.
  • multiple processors may be provided, such as one processor dedicated to wireless communication functions and one processor dedicated to running other applications.
  • software applications may be stored in the internal memory before they are accessed and loaded into the processors.
  • the processors may include internal memory sufficient to store the application software instructions.
  • the internal memory may be a volatile or nonvolatile memory, such as flash memory, or a mixture of both.
  • a general reference to memory refers to memory accessible by the processors including internal memory or removable memory plugged into the device and memory within the processors themselves. Additionally, as used herein, any reference to a memory may be a reference to a memory storage and the terms may be used interchangeably.
  • Implementation examples are described in the following paragraphs. While some of the following implementation examples are described in terms of example methods, further example implementations may include: the example methods discussed in the following paragraphs implemented by a computing device including a processor configured with processor-executable instructions to perform operations of the methods of the following implementation examples; the example methods discussed in the following paragraphs implemented by a computing device including means for performing functions of the methods of the following implementation examples; and the example methods discussed in the following paragraphs may be implemented as a non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of a computing device to perform the operations of the methods of the following implementation examples.
  • Example 1 A method of assessing visual quality of a computer game output, including: obtaining a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off; obtaining a second frame recording of the same segment of game replay generated with the VRS feature on; temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording; and using a visual quality assessment tool to compare the first frame recording to the second frame recording to output a comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
  • Example 2 The method of example 1, in which temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording includes: sampling frames in each of the first frame recording and the second frame recording; computing peak signal-to-noise-ratio (PSNR) values of the sampled frames in each of the first frame recording and the second frame recording as an output statistic of PSNRs for describing a state at each index pair; storing all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions; using steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording; and outputting sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
  • Example 3 The method of example 2, in which the visual quality assessment tool receives as input the sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices and outputs key performance indicator (KPI) values of each of the first frame recording and the second frame recording for comparison.
  • Example 4 The method of either of examples 2 or 3, in which: obtaining the first frame recording of the segment of game replay generated with the VRS feature off includes extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off, in which K is a positive integer number; obtaining the second frame recording of the segment of game replay generated with the VRS feature on includes extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on; and computing PSNR values of the sampled frames in each of the first frame recording and the second frame recording includes computing the PSNR between the first K frames of game sequences starting from index I with the VRS feature off and the first K frames of game sequences starting from index J with the VRS feature on, in which I, J is a pair of candidate indices sampled from a fixed length frame sync window.
  • Example 6 The method of any of examples 2-5, in which using the visual quality assessment tool to compare the first frame recording to the second frame recording includes: using discrete Fourier transform (DFT) based operators to separate a high-frequency part from a low-frequency part of the sampled frames of each of the first frame recording and the second frame recording; using the high-frequency part of the sampled frames to identify inverse DFT shift frames; and using inverse DFT shift frames to output KPI sharpness scores for each of the first frame recording and the second frame recording.
  • Example 7 The method of any of examples 2-5, in which using the visual quality assessment tool to compare the first frame recording to the second frame recording includes one or more of: a HaslerS03 method to output a colorfulness output of each of the first frame recording and the second frame recording; a mean of a grayscale method to output a brightness KPI score of each of the first frame recording and the second frame recording; using a DFT-based algorithm to output a sharpness KPI score of each of the first frame recording and the second frame recording; using a BRISQUE algorithm to output a perception KPI score of each of the first frame recording and the second frame recording; or using a video multimethod assessment fusion (VMAF) method to output a fidelity KPI score of each of the first frame recording and the second frame recording.
  • a general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some steps or methods may be performed by circuitry that is specific to a given function.
  • the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a non-transitory computer-readable medium or non-transitory processor-readable medium.
  • the steps of a method or algorithm disclosed herein may be embodied in a processor-executable software module and/or processor-executable instructions, which may reside on a non-transitory computer-readable or non-transitory processor-readable storage medium.
  • Non-transitory server-readable, computer-readable or processor-readable storage media may be any storage media that may be accessed by a computer or a processor.
  • non-transitory server-readable, computer-readable or processor-readable media may include RAM, ROM, EEPROM, FLASH memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer.
  • Disk and disc includes compact disc (CD) , laser disc, optical disc, DVD, floppy disk, and Blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of non-transitory server-readable, computer-readable and processor-readable media.
  • the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a non-transitory server-readable, processor-readable medium and/or computer-readable medium, which may be incorporated into a computer program product.
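Assembled end to end, the input 702 → temporal processing 704 → calculate KPIs 706 → output 708 flow described in the bullets above could be sketched as follows. This is a minimal illustration, not the claimed implementation: the `record_fn`, `align_fn`, and `kpi_fns` callables and the logfile format are hypothetical stand-ins.

```python
def assess_vrs_quality(record_fn, align_fn, kpi_fns, log_path="vef_kpis.log"):
    """End-to-end sketch: record VRS-off/on replays, align, compute KPIs, log."""
    s1 = record_fn(vrs=False)          # input 702: target config frame recording
    s2 = record_fn(vrs=True)           # input 702: VRS-enabled frame recording
    i1, i2 = align_fn(s1, s2)          # temporal processing 704: start indices
    pairs = list(zip(s1[i1:], s2[i2:]))
    results = {}
    for name, fn in kpi_fns.items():   # calculate KPIs 706: per-recording averages
        results[name] = (
            sum(fn(a) for a, _ in pairs) / len(pairs),
            sum(fn(b) for _, b in pairs) / len(pairs),
        )
    with open(log_path, "w") as log:   # output 708: logfile usable for trend charts
        for name, (off_val, on_val) in results.items():
            log.write(f"{name}: VRS off={off_val:.3f}, VRS on={on_val:.3f}\n")
    return results
```

A caller would supply real frame grabbers and KPI functions; here the structure simply mirrors the four components of FIG. 7.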


Abstract

Systems and methods for assessing the visual quality of a computer game output include obtaining a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off, obtaining a second frame recording of the same segment of game replay generated with the VRS feature on, temporally aligning the first and second frame recordings, and using a visual quality assessment tool to compare the first frame recording to the second frame recording and output a comparison of the visual quality of the computer game with the VRS feature on and off.

Description

Systems And Methods For Efficient Feature Assessment For Game Visual Quality BACKGROUND
Variable rate shading (VRS) is a gaming graphics rendering technique that allows a computing device to adjust the shading performance of a video game image frame to balance tradeoffs between high speed, low latency, and low energy consumption. Rather than shading on a per-pixel basis (meaning that each pixel has a full calculation, and that data is transferred to the final image) , the shading is calculated over several pixels at once. By using averaged data for a combination of pixels, VRS may accomplish pixel shading at a coarser, lower resolution to reduce post-processing time and the load on the computing device processor. In addition, VRS allows for adjusting the size of that combination of pixels on a per-frame basis, allowing the video game to take advantage of processing budget where it exists, or pull back when more performance is needed.
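As an illustration of the coarse-shading idea only (not taken from the application: the 0.9 multiplier stands in for a real per-pixel lighting calculation, and the 2×2 block size is an arbitrary choice), shading one value per pixel block and broadcasting it back to full resolution could be sketched as:

```python
import numpy as np

def shade_coarse(frame, block=2):
    """Run the 'shading' computation once per block x block pixel group.

    With 2x2 blocks the stand-in shading multiply runs 4x fewer times
    than per-pixel shading would.
    """
    h, w = frame.shape[:2]
    trimmed = frame[:h - h % block, :w - w % block]
    th, tw = trimmed.shape[:2]
    # average each block into a single sample ...
    blocks = trimmed.reshape(th // block, block, tw // block, block, -1).mean(axis=(1, 3))
    shaded = blocks * 0.9                        # one shading computation per block
    # ... then broadcast the shaded value back to full resolution
    return np.repeat(np.repeat(shaded, block, axis=0), block, axis=1)
```

Every pixel inside a block receives the same shaded value, which is the source of both the performance gain and the visual quality loss the embodiments measure.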
SUMMARY
The various aspects include methods of assessing visual quality of a computer game output, which may include obtaining a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off, obtaining a second frame recording of the same segment of game replay generated with the VRS feature on, temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording, and using a visual quality assessment tool to compare the first frame recording to the second frame recording to output a comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
In some aspects, temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording may include sampling frames in each of the first frame  recording and the second frame recording, computing peak signal-to-noise-ratio (PSNR) values of the sampled frames in each of the first frame recording and the second frame recording as an output statistic of PSNRs for describing a state at each index pair, storing all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions, using steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording, and outputting sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
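A minimal Python sketch of this alignment step, under illustrative assumptions: `window_score` memoizes the mean PSNR of K frame pairs for each candidate index pair in a matrix M, and `align` performs a simple steepest-ascent walk over neighboring index pairs. The helper names, the three-neighbor move set, and the window/K defaults are assumptions for illustration, not the claimed implementation:

```python
import numpy as np

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio between two frames."""
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def window_score(seq1, seq2, i, j, k, memo):
    """Mean PSNR of K frame pairs starting at index pair (i, j), memoized in M."""
    if memo[i, j] == 0.0:                        # state not computed yet
        memo[i, j] = np.mean([psnr(seq1[i + n], seq2[j + n]) for n in range(k)])
    return memo[i, j]

def align(seq1, seq2, window=8, k=3):
    """Steepest-ascent walk over the index-pair state matrix.

    Returns the (i, j) start indices whose K-frame windows match best;
    assumes both sequences are at least window + k - 1 frames long.
    """
    memo = np.zeros((window, window))            # memory matrix M of PSNR states
    i = j = 0
    while True:
        cur = window_score(seq1, seq2, i, j, k, memo)
        step = None
        for di, dj in ((1, 0), (0, 1), (1, 1)):  # candidate moves (assumed set)
            ni, nj = i + di, j + dj
            if ni < window and nj < window:
                s = window_score(seq1, seq2, ni, nj, k, memo)
                if s > cur:
                    cur, step = s, (ni, nj)
        if step is None:                         # local maximum: best match found
            return i, j
        i, j = step
```

The memoization matrix avoids recomputing PSNR for index-pair states revisited from neighboring regions, which is the efficiency gain over exhaustively scoring every pair.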
In some aspects, the visual quality assessment tool may receive as input the sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices, and output key performance indicator (KPI) values of each of the first frame recording and the second frame recording for comparison.
In some aspects, obtaining the first frame recording of the segment of game replay generated with the VRS feature off may include extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off (in which K is a positive integer number) , obtaining the second frame recording of the segment of game replay generated with the VRS feature on may include extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on, and computing PSNR values of the sampled frames in each of the first frame recording and the second frame recording may include computing the PSNR between the first K frames of game sequences starting from index I with the VRS feature off and the first K frames of game sequences starting from index J with the VRS feature on, in which I, J is a pair of candidate indices sampled from a fixed length frame sync window.
In some aspects, computing the PSNR between the first K frames of game sequences with the VRS feature off and the first K frames of game sequences with the VRS feature on may include generating a first sample frame sequence S 1 based on the game replay/observe recording sample including the first K frames of game sequences with the VRS feature off, generating a second sample frame sequence S 2 based on the game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on, generating a pair comparison sequence P= {P (S′ 1i, S′ 2i) , i∈ [1, N] } (in which N is a length of frame pairs and N is less than min(len (S 1) , len (S 2) ) ) , and generating an output array (P (S′ 1i, S′ 2i) ) based on the pair comparison sequence.
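Once the start indices are known, building the pair comparison sequence amounts to zipping the two aligned sample sequences; this sketch (function name assumed) caps N at the shorter remaining length, consistent with N being less than or equal to min(len(S 1), len(S 2)):

```python
def pair_comparison_sequence(s1, s2, i1, i2):
    """Zip the two aligned sample sequences into N comparison pairs.

    i1 and i2 are the start (alignment) indices found during temporal
    alignment; each returned pair holds one VRS-off and one VRS-on frame.
    """
    a, b = s1[i1:], s2[i2:]
    n = min(len(a), len(b))        # N bounded by the shorter aligned tail
    return [(a[i], b[i]) for i in range(n)]
```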
In some aspects, using the visual quality assessment tool to compare the first frame recording to the second frame recording may include using discrete Fourier transform (DFT) based operators to separate a high-frequency part from a low-frequency part of the sampled frames of each of the first frame recording and the second frame recording, using the high-frequency part of the sampled frames to identify inverse DFT shift frames, and using inverse DFT shift frames to output KPI sharpness scores for each of the first frame recording and the second frame recording.
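One possible reading of this DFT-based sharpness KPI in Python; the cutoff radius and the log-magnitude summary statistic are illustrative choices, as the application does not specify them:

```python
import numpy as np

def dft_sharpness(gray, cutoff=10):
    """Sharpness score from the high-frequency part of a frame's 2-D DFT."""
    spectrum = np.fft.fftshift(np.fft.fft2(gray.astype(np.float64)))
    h, w = gray.shape
    cy, cx = h // 2, w // 2
    # zero out the low-frequency block around the (shifted) DC component
    spectrum[cy - cutoff:cy + cutoff, cx - cutoff:cx + cutoff] = 0
    # inverse-shift and invert to recover the high-pass "inverse DFT shift" frame
    high = np.fft.ifft2(np.fft.ifftshift(spectrum))
    # summarize the high-frequency magnitude as a decibel-style score
    return float(np.mean(20 * np.log10(np.abs(high) + 1e-8)))
```

A noisy or detailed frame retains energy above the cutoff and scores high; a flat or heavily blurred frame (as aggressive VRS shading can produce) scores low.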
In some aspects, using the visual quality assessment tool to compare the first frame recording to the second frame recording may include one or more of a HaslerS03 method to output a colorfulness output of each of the first frame recording and the second frame recording, a mean of a grayscale method to output a brightness KPI score of each of the first frame recording and the second frame recording, using a DFT-based algorithm to output a sharpness KPI score of each of the first frame recording and the second frame recording, using a BRISQUE algorithm to output a perception KPI score of each of the first frame recording and the second frame recording, or using a video multimethod assessment fusion (VMAF) method to output a fidelity KPI score of each of the first frame recording and the second frame recording.
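The two simplest KPIs named here, HaslerS03 colorfulness and grayscale-mean brightness, could be sketched as follows. The BT.601 luma weights are one common grayscale convention; the application does not specify which is used:

```python
import numpy as np

def colorfulness_hasler(rgb):
    """HaslerS03 colorfulness: opponent-channel statistics of an RGB frame."""
    r, g, b = (rgb[..., c].astype(np.float64) for c in range(3))
    rg = r - g                     # red-green opponent channel
    yb = 0.5 * (r + g) - b         # yellow-blue opponent channel
    std = np.hypot(rg.std(), yb.std())
    mean = np.hypot(rg.mean(), yb.mean())
    return float(std + 0.3 * mean)

def brightness_kpi(rgb):
    """Brightness KPI: mean of the grayscale (luma) version of the frame."""
    weights = np.array([0.299, 0.587, 0.114])   # ITU-R BT.601 luma weights (assumed)
    return float((rgb.astype(np.float64) @ weights).mean())
```

A purely gray frame scores zero colorfulness, while saturated game content scores higher, giving a per-frame value that can be averaged over each recording for comparison.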
Further aspects include a computing device having a memory, and a processor coupled to the memory and configured to perform various operations corresponding to the method operations discussed above. Further aspects include a computing device having various means for performing functions corresponding to the method operations discussed above. Further aspects include a non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor to perform various operations corresponding to the method operations discussed above.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate exemplary embodiments, and together with the general description given above and the detailed description given below, serve to explain the features of various embodiments.
FIGs. 1-6 are process flow diagrams that illustrate methods of assessing visual quality of a computer game output in accordance with various embodiments.
FIG. 7 is a component block diagram illustrating components and operations in a computing system that may be configured to implement the various embodiments.
FIG. 8 is a component block diagram of an example computing system that includes system on chips (SOCs) suitable for implementing the various embodiments.
FIG. 9 is a component block diagram of an example computing device, in the form of a smartphone, that is suitable for implementing the various embodiments.
DETAILED DESCRIPTION
The various embodiments will be described in detail with reference to the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts. References made to particular examples and implementations are for illustrative purposes, and are not intended to limit the scope of the invention or the claims.
In overview, the various embodiments include methods, and computing devices configured to implement the methods, for assessing visual quality of a computer video game output. A computing device processor may be configured to obtain a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off, obtain a second frame recording of the same segment of game replay generated with the VRS feature on, temporally align the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording, and use a visual quality assessment tool to compare the first frame recording to the second frame recording, and output the comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off. The output may be used to assess the quality loss of top games between VRS on and off on a sampled game frame sequence.
By intelligently and efficiently generating a comparison of the visual quality of the computer game output with the VRS feature on and off, the various embodiments allow the computing device to better balance tradeoffs between the performance, visual effects, memory usage, and the power consumption characteristics of the computing device. This may improve the performance and energy consumption characteristics of the computing device.
In addition, by quantifying the game visual effects of OEMs and competitors using the assessment tool, a Game Visual Effect Factors (Game VEF) database may be created and maintained for use in designing and configuring future computing devices. These KPIs may be used to control visual loss on VRS, providing clues that assist in balancing tradeoffs between the performance, visual effects, memory usage, and the power consumption characteristics of the computing device. Further, since visual effects may account for up to 5% of power consumption, effective analysis of Game VEF may improve the power efficiency ratio of the computing device. For all these reasons, the various embodiments improve the performance and functioning of the computing device.
The term “computing device” is used herein to refer to any one or all of internet-of-things (IOT) devices (e.g., smart televisions, smart speakers, smart locks, lighting systems, smart switches, smart plugs, smart doorbells, smart doorbell cameras, smart air pollution/quality monitors, smart smoke alarms, security systems, smart thermostats, etc. ) , server computing devices, personal computers, laptop computers, tablet computers, user equipment (UE) , smartphones, personal or mobile multi-media players, personal data assistants (PDAs) , palm-top computers, wireless electronic mail receivers, multimedia Internet enabled cellular telephones, gaming systems (e.g., PlayStation TM, Xbox TM, Nintendo Switch TM, etc. ) , wearable devices (e.g., smartwatch, head-mounted display, fitness tracker, etc. ) , media players (e.g., digital video disc (DVD) players, ROKU TM, AppleTV TM, etc. ) , digital video recorders (DVRs) , Internet access gateways, modems, routers, network switches, residential gateways, access points, integrated access devices (IAD) , mobile convergence products, networking adapters, multiplexers, and other similar devices that include a programmable processor and communications circuitry for providing the functionality described herein.
The term “system on chip” (SOC) is used herein to refer to a single integrated circuit (IC) chip that contains multiple resources and/or processors integrated on a single substrate. A single SOC may contain circuitry for digital, analog, mixed-signal, and radio-frequency functions. A single SOC may also include any number of general purpose and/or specialized processors (digital signal processors, modem processors, video processors, etc. ) , memory blocks (e.g., ROM, RAM, Flash, etc. ) , and resources (e.g., timers, voltage regulators, oscillators, etc. ) . SOCs may also include software for controlling the integrated resources and processors, as well as for controlling peripheral devices.
The term “system in a package” (SIP) is used herein to refer to a single module or package that contains multiple resources, computational units, cores and/or processors on two or more IC chips, substrates, or SOCs. For example, a SIP may include a single substrate on which multiple IC chips or semiconductor dies are  stacked in a vertical configuration. Similarly, the SIP may include one or more multi-chip modules (MCMs) on which multiple ICs or semiconductor dies are packaged into a unifying substrate. A SIP may also include multiple independent SOCs coupled together via high speed communication circuitry and packaged in close proximity, such as on a single motherboard or in a single wireless device. The proximity of the SOCs facilitates high speed communications and the sharing of memory and resources.
The term “multicore processor” is used herein to refer to a single integrated circuit (IC) chip or chip package that contains two or more independent processing cores (e.g., CPU core, Internet protocol (IP) core, graphics processor unit (GPU) core, etc. ) configured to read and execute program instructions. A SOC may include multiple multicore processors, and each processor in an SOC may be referred to as a core. The term “multiprocessor” is used herein to refer to a system or device that includes two or more processing units configured to read and execute program instructions.
Enabling auto variable rate shading (VRS) on a computing device used for gaming may reduce power consumption on the computing device by at least 2-6%. However, conventional solutions do not provide a tool for determining game visual effect sensitive KPI metrics or for assessing the quality loss of images between VRS on versus VRS off on a sampled game frame sequence. As such, conventional solutions do not provide users or software developers with adequate tools for balancing tradeoffs between reduced power consumption, improved performance, and improved game visual effects on the computing device.
The various embodiments generate a comparison of the visual quality of computer game output with the VRS feature on and off. To generate an accurate comparison, some embodiments may temporally align an image frame captured with the VRS feature on with an image frame captured with the VRS feature off. The computing device may capture sample frames with the VRS feature on and off, compute peak signal-to-noise-ratio (PSNR) values of the sampled frames, and use steepest ascent/descent analysis (or gradient ascent/descent analysis) of the PSNR values to match up the sampled frames. The steepest ascent analysis may include well-known techniques that perform a first-order iterative optimization algorithm for finding a local maximum (or minimum) of a differentiable function.
FIG. 1 illustrates a method 100 of assessing the visual quality of a computer game output in accordance with some embodiments. Method 100 may be performed by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing device.
In block 102, the computing device may obtain a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off. For example, the computing device may extract a game replay/observe recording sample that includes the first K frames of game sequences with the VRS feature off. In some embodiments, K may be a positive integer number.
In block 104, the computing device may obtain a second frame recording of the same segment of game replay (e.g., first K frames, etc. ) generated with the VRS feature on.
In block 106, the computing device may temporally align the first and second frame recordings and identify an alignment index of the first and second frame recordings. In some embodiments, in block 106, the computing device may sample frames in the first and second frame recordings, compute peak signal-to-noise-ratio (PSNR) values of the sampled frames, and output statistics of the PSNRs describing a state at each index pair. In some embodiments, this may be accomplished by generating a first sample frame sequence S1 based on the recording sample including the first K frames of game sequences with the VRS feature off, generating a second sample frame sequence S2 based on the recording sample including the first K frames of the same game sequences with the VRS feature on, generating a pair comparison sequence (e.g., P = {P(S′1i, S′2i), i∈ [1, N]}, where N is the number of frame pairs and N is less than min(len(S1), len(S2))), and generating an output array (e.g., P(S′1i, S′2i)) based on the pair comparison sequence.
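As a minimal non-limiting sketch of the pair comparison sequence of block 106, the per-pair PSNR statistic could be computed as follows; the function names are hypothetical, and the frames are assumed to be 8-bit grayscale NumPy arrays of equal size:

```python
import numpy as np

def psnr(frame_a: np.ndarray, frame_b: np.ndarray, max_val: float = 255.0) -> float:
    # Peak signal-to-noise ratio (in dB) between two equally sized frames.
    mse = np.mean((frame_a.astype(np.float64) - frame_b.astype(np.float64)) ** 2)
    if mse == 0.0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(max_val ** 2 / mse)

def pair_comparison_sequence(s1, s2):
    # P = {P(S'_1i, S'_2i), i in [1, N]}, with N bounded by the shorter sequence.
    n = min(len(s1), len(s2))
    return [psnr(s1[i], s2[i]) for i in range(n)]
```

In this sketch, larger PSNR values indicate better-matched (or less degraded) frame pairs, with infinity reserved for identical frames.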
In some embodiments, computing the PSNR between the first K frames of game sequences with the VRS feature off and the first K frames of game sequences with the VRS feature on may include choosing pairs of candidate indices (I, J) from the original sequences S1 and S2 within a fixed-length frame sync window (e.g., the first 5 seconds), generating a first sample frame sequence S′1i based on the game replay/observe recording sample including the first K frames of game sequences starting from index I (0 < I < Length(frameSyncWindow)) with the VRS feature off, generating a second sample frame sequence S′2i based on the game replay/observe recording sample including the first K frames of the same game sequences starting from index J (0 < J < Length(frameSyncWindow)) with the VRS feature on, generating a pair comparison sequence candidate P(I, J) = {P(S′1i, S′2i), i∈ [1, N]}, in which N is the number of frame pairs and N is less than min(len(S1), len(S2)), and finally generating an output array (P(S′1i, S′2i)) with the maximum M(I, J).
To reduce repeated computations for index pair states in neighbor regions, in some embodiments the computing device may store all index pair states with a PSNR based value in a memory matrix M. The computing device may use steepest ascent analysis of the PSNR values to match up the sampled frames and identify a maximum estimation of a state matrix and the start frame indices of the first and second frame recordings. The computing device may output sampled frames of the first and second frame recordings corresponding to the identified start frame indices for comparison by a visual quality assessment tool of the computing device.
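The memoized steepest ascent search described above could be sketched as a hill climb over candidate index pairs (I, J), with a dictionary standing in for the memory matrix M. This is an illustrative sketch only: the `score` callable, window bounds, and 4-neighborhood step rule are assumptions, not details recited by the embodiments.

```python
def steepest_ascent(score, w1, w2, start=(0, 0)):
    # Hill-climb over candidate index pairs (I, J) inside the sync windows,
    # memoizing each evaluated state to avoid repeated computations.
    memo = {}  # stands in for the "memory matrix M" of index-pair states

    def m(i, j):
        if (i, j) not in memo:
            memo[(i, j)] = score(i, j)
        return memo[(i, j)]

    i, j = start
    while True:
        # Evaluate the 4-neighborhood and move to the best-scoring neighbor.
        neighbors = [(i + di, j + dj)
                     for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                     if 0 <= i + di < w1 and 0 <= j + dj < w2]
        best = max(neighbors, key=lambda p: m(*p), default=None)
        if best is None or m(*best) <= m(i, j):
            return (i, j), m(i, j)  # local maximum: estimated start indices
        i, j = best
```

With `score(i, j)` returning a mean PSNR between sequences offset by (i, j), the returned index pair is the estimated alignment of the two recordings.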
In block 108, the computing device may use the visual quality assessment tool to compare the first frame recording to the second frame recording. The computing device may output the comparison of the visual quality of the computer game output with the VRS feature on and off. To accomplish this, in some embodiments, the computing device may use discrete Fourier transform (DFT) based operators to  separate a high-frequency part from a low-frequency part of the sampled frames of each of the first and second frame recordings, use the high-frequency part of the sampled frames to identify inverse DFT shift frames, and use inverse DFT shift frames to output KPI sharpness scores for each of the first and second frame recordings.
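The DFT-based sharpness scoring of block 108 could be sketched as below, under the assumption that a square low-frequency region around the DC component is zeroed before the inverse DFT shift; the `cutoff` parameter and the log-magnitude scoring are hypothetical choices, not recited values.

```python
import numpy as np

def dft_sharpness(frame: np.ndarray, cutoff: int = 8) -> float:
    # Separate the high-frequency part of a grayscale frame via a 2-D DFT,
    # reconstruct it with an inverse DFT shift, and score its log magnitude.
    f = np.fft.fftshift(np.fft.fft2(frame.astype(np.float64)))
    rows, cols = frame.shape
    cy, cx = rows // 2, cols // 2
    f[cy - cutoff:cy + cutoff, cx - cutoff:cx + cutoff] = 0  # drop low frequencies
    high = np.fft.ifft2(np.fft.ifftshift(f))  # inverse-DFT-shift frame
    magnitude = np.abs(high)
    return float(np.mean(20.0 * np.log10(magnitude + 1e-8)))
```

A frame with strong edges retains more high-frequency energy and therefore scores higher than a flat frame.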
In various embodiments, in block 108 the computing device may use a HaslerS03 method to output a colorfulness KPI score of each of the first and second frame recordings, a mean-of-grayscale method to output a brightness KPI score of each of the first and second frame recordings, a DFT-based algorithm to output a sharpness KPI score of each of the first and second frame recordings, a BRISQUE algorithm to output a perception KPI score of each of the first and second frame recordings, or a video multimethod assessment fusion (VMAF) method to output a fidelity KPI score of each of the first and second frame recordings.
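Two of the simpler KPIs above can be sketched directly: the Hasler-Süsstrunk (2003) colorfulness metric on opponent color channels, and brightness as the mean of a grayscale (luma) conversion. The function names and the Rec. 601 luma weights are illustrative assumptions.

```python
import numpy as np

def colorfulness_hasler(frame_rgb: np.ndarray) -> float:
    # Hasler & Suesstrunk (2003) colorfulness metric on an RGB frame.
    r, g, b = (frame_rgb[..., i].astype(np.float64) for i in range(3))
    rg = r - g                # red-green opponent channel
    yb = 0.5 * (r + g) - b    # yellow-blue opponent channel
    std_root = np.sqrt(np.std(rg) ** 2 + np.std(yb) ** 2)
    mean_root = np.sqrt(np.mean(rg) ** 2 + np.mean(yb) ** 2)
    return float(std_root + 0.3 * mean_root)

def brightness_kpi(frame_rgb: np.ndarray) -> float:
    # Brightness KPI as the mean of the grayscale (Rec. 601 luma) image.
    r, g, b = (frame_rgb[..., i].astype(np.float64) for i in range(3))
    return float(np.mean(0.299 * r + 0.587 * g + 0.114 * b))
```

A neutral gray frame scores zero colorfulness, while a saturated frame scores higher; brightness ranges from 0 (black) to 255 (white) for 8-bit input.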
FIG. 2 illustrates another method 200 of assessing the visual quality of a computer game output in accordance with some embodiments. Method 200 may be performed by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing device. With reference to FIGs. 1 and 2, in  blocks  102 and 104 the computing device may perform the operations discussed above with reference to FIG. 1. For example, the computing device may obtain a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off in block 102 and obtain a second frame recording of the same segment of game replay (e.g., first K frames, etc. ) generated with the VRS feature on in block 104. In some embodiments, obtaining the first frame recording in block 102 may include extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off (in which K is a positive integer number) , and obtaining the second frame recording in block 104 may include extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on.
In block 202, the computing device may sample frames in each of the first and second frame recordings. In block 204, the computing device may compute peak signal-to-noise-ratio (PSNR) values of the sampled frames as an output statistic of PSNRs for describing a state at each index pair. In some embodiments, computing the PSNR values of the sampled frames in each of the first frame recording and the second frame recording may include computing the PSNR between the first K frames of game sequences starting from index I with the VRS feature off and the first K frames of game sequences starting from index J with the VRS feature on, in which I, J is a pair of candidate indices sampled from a fixed-length frame sync window. In block 206, the computing device may store all index pair states with a PSNR-based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions.
In block 208, the computing device may use steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording.
In block 210, the computing device may output sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
In block 108, the computing device may use the visual quality assessment tool to compare the first frame recording to the second frame recording, and output a comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
FIG. 3 illustrates a method 300 of assessing the visual quality of a computer game output in accordance with some embodiments. Method 300 may be performed by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing device. With reference to FIGs. 1-3, in  blocks  102, 104, and 202-210, the computing  device may perform the operations discussed above with reference to FIGs. 1 and 2 to generate and output sampled frames corresponding to the identified start frame indices.
In block 302, the computing device may receive as input, in its visual quality assessment tool, the sampled frames corresponding to the identified start frame indices. In block 304, the computing device may compare the received inputs in the visual quality assessment tool. In block 306, the computing device may output key performance indicator (KPI) values for each of the first and second frame recordings. The computing device may allow a user to compare the KPI values and adjust the operating parameters of the computing device to balance tradeoffs between performance, power consumption, and visual display quality.
FIG. 4 illustrates a method 400 of assessing the visual quality of a computer game output in accordance with some embodiments. Method 400 may be performed by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing device. With reference to FIGs. 1-4, in block 402 the computing device may obtain a first frame recording by extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off (where K is a positive integer number) . In block 404, the computing device may obtain a second frame recording by extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on.
In block 202, the computing device may sample frames in each of the first and second frame recordings.
In block 406, the computing device may compute peak signal-to-noise-ratio (PSNR) values between the first K frames of game sequences with the VRS feature off and the first K frames of game sequences with the VRS feature on. The first K frames may start from indices I, J in the frame sync window, in which I is the start index from original sequence S1 in the range of the fixed-length frame sync window (e.g., the first 5 seconds, etc.) and J is the start index from original sequence S2 in the range of the fixed-length frame sync window with the same length. PSNR may be calculated through the first K frames (e.g., K=50), and the average of these values may be used to compute a single PSNR mean value for the index pair state (which may be stored in matrix M).
Said another way, in block 406, the computing device may choose a pair of candidate indices I, J in a fixed length frame sync window, then compute peak signal-to-noise-ratio (PSNR) values between the first K frames of game sequences starting from index I with the VRS feature off and the first K frames of game sequences starting from index J with the VRS feature on.
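The per-index-pair state value of block 406 could be sketched as the mean PSNR over K frame pairs starting at candidate indices (I, J); this is the single value that would be stored in the memory matrix M. The function names are hypothetical, and the frames are assumed to be equally sized NumPy arrays.

```python
import numpy as np

def psnr(a: np.ndarray, b: np.ndarray, max_val: float = 255.0) -> float:
    # Peak signal-to-noise ratio (in dB) between two equally sized frames.
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0.0 else 10.0 * np.log10(max_val ** 2 / mse)

def index_pair_state(s1, s2, i, j, k=50):
    # Mean PSNR over the first K frame pairs starting at candidate indices
    # (I, J); one such value per (I, J) would populate the memory matrix M.
    k = min(k, len(s1) - i, len(s2) - j)
    return float(np.mean([psnr(s1[i + t], s2[j + t]) for t in range(k)]))
```

For sequences that are identical up to a temporal offset, the state value peaks at the offset that aligns them, which is what the steepest ascent search exploits.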
In blocks 206-210 and 108, the computing device may perform the operations discussed above with reference to FIGs. 1 and 2. In block 206, the computing device may store all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions.
In block 208, the computing device may use steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording.
In block 210, the computing device may output sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
In block 108, the computing device may use the visual quality assessment tool to compare the first frame recording to the second frame recording and output a comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
FIG. 5 illustrates a method 500 of assessing the visual quality of a computer game output in accordance with some embodiments. Method 500 may be performed by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing  device. With reference to FIGs. 1-5, in  blocks  402, 404, and 202 the computing device may perform the operations discussed above with reference to FIGs. 2 and 4.
In block 502, the computing device may generate a first sample frame sequence S1 based on the game replay/observe recording sample including the first K frames of game sequences with the VRS feature off.
In block 504, the computing device may generate a second sample frame sequence S2 based on the game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on.
In block 506, the computing device may generate a pair comparison sequence.
In block 508, the computing device may generate an output array based on the pair comparison sequence.
In blocks 206-210 and 108, the computing device may perform the operations discussed above with reference to FIGs. 1 and 2. In block 206, the computing device may store all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions. In block 208, the computing device may use steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording. In block 210, the computing device may output sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool. In block 108, the computing device may use the visual quality assessment tool to compare the first frame recording to the second frame recording and output a comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
FIG. 6 illustrates a method 600 of assessing the visual quality of a computer game output in accordance with some embodiments. Method 600 may be performed  by a processor (e.g., a graphics processor, applications processor, etc. ) in a computing device. With reference to FIGs. 1-6, in  blocks  102, 104 and 202-210 the computing device may perform the operations discussed above with reference to FIGs. 1 and 2.
In block 602, the computing device may use discrete Fourier transform (DFT) based operators to separate a high-frequency part from a low-frequency part of the sampled frames of each of the first frame recording and the second frame recording.
In block 604, the computing device may use the high-frequency part of the sampled frames to identify inverse DFT shift frames.
In block 606, the computing device may use inverse DFT shift frames to output KPI sharpness scores for each of the first frame recording and the second frame recording.
FIG. 7 illustrates components and operations in a computing system 700 that could be configured to assess the visual quality of a computer game output in accordance with some embodiments. With reference to FIGs. 1-7, the computing system 700 may include an input 702 component, a temporal processing 704 component, a calculate KPIs 706 component, and an output 708 component.
The input 702 component may collect a set of target game config frame recordings (full reference frame with timestamp sync) , and a corresponding set of VRS enable frame recordings. For example, in block 710, the input 702 component may obtain a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off (or disabled) . In block 712, the input 702 component may obtain a second frame recording of the same segment of game replay with the VRS feature on (or enabled) .
The temporal processing 704 component may go through sequence processing steps to transform the input into pair comparison sequences {P(S′1i, S′2i), i∈ [1, N]}, N < min(len(S1), len(S2)). For VRS, an integrated libvmaf component (not illustrated separately) may calculate full reference fidelity to show the difference between VRS enabled and disabled. As an example, in blocks 714-718, the temporal processing 704 component may run initialization functions for the sample rate (Sr), FrameSyncWindow (Wr), and/or SyncDuration (Ds) to extract samples including the first K frames of the first and second frame recordings, generate a first sample frame sequence S1, generate a second sample frame sequence S2, synchronize the frames, generate a pair comparison sequence (e.g., P = {P(S′1i, S′2i), i∈ [1, N]}, where N is the number of frame pairs and N is less than min(len(S1), len(S2))), and generate an output array (e.g., P(S′1i, S′2i)) based on the pair comparison sequence.
The temporal processing 704 component may solve the problem of a mismatch between the starting points of the two frame sequences. Older methods rely on human-eye selection to calculate fidelity (e.g., SSIM) for a single matched pair (only a game screenshot), which is not efficient for frame sequences. The PSNR-based approach discussed in this application computes the PSNR between the VRS-enabled and target ("VQ") frames iteratively until a match is found. A matrix may be used for memoization to avoid repeated state calculations, and a steepest ascent calculation may be used to speed up estimating the best-match PSNRs.
In block 720, the calculate KPIs 706 component may receive and use the output array (e.g., P (S′ 1i, S′ 2i) ) to determine key performance indicator (KPI) values and/or visual effect factors (VEF) for each of the first and second frame recordings.
In block 722, the output 708 component may store the KPIs and/or VEFs in a logfile and/or trend chart that allows users to compare the KPI or VEF values and adjust the operating parameters of the computing device to balance tradeoffs between visual display quality and the performance and power consumption characteristics of the computing device. The logfile or stored values may also be used to generate a trend curve output.
FIG. 8 illustrates an example computing system or SIP 800 architecture that may be used in wireless devices implementing the various embodiments. With reference to FIG. 8, the illustrated example SIP 800 includes two  SOCs  802, 804, a clock 806, and a voltage regulator 808. In some embodiments, the first SOC 802 may  operate as the central processing unit (CPU) of the wireless device that carries out the instructions of software application programs by performing the arithmetic, logical, control and input/output (I/O) operations specified by the instructions. In some embodiments, the second SOC 804 may operate as a specialized processing unit. For example, the second SOC 804 may operate as a specialized 5G processing unit responsible for managing high volume, high speed (e.g., 5 Gbps, etc. ) , and/or very high frequency short wave length (e.g., 28 GHz mmWave spectrum, etc. ) communications.
The first SOC 802 may include a digital signal processor (DSP) 810, a modem processor 812, a graphics processor 814, an application processor 816, one or more coprocessors 818 (e.g., vector co-processor) connected to one or more of the processors, memory 820, custom circuitry 822, system components and resources 824, an interconnection/bus module 826, one or more temperature sensors 830, a thermal management unit 832, and a thermal power envelope (TPE) component 834. The second SOC 804 may include a 5G modem processor 852, a power management unit 854, an interconnection/bus module 864, a plurality of mmWave transceivers 856, memory 858, and various additional processors 860, such as an applications processor, packet processor, etc.
Each  processor  810, 812, 814, 816, 818, 852, 860 may include one or more cores, and each processor/core may perform operations independent of the other processors/cores. For example, the first SOC 802 may include a processor that executes a first type of operating system (e.g., FreeBSD, LINUX, OS X, ANDROID etc. ) and a processor that executes a second type of operating system (e.g., MICROSOFT WINDOWS 10) . In addition, any or all of the  processors  810, 812, 814, 816, 818, 852, 860 may be included as part of a processor cluster architecture (e.g., a synchronous processor cluster architecture, an asynchronous or heterogeneous processor cluster architecture, etc. ) .
The first and  second SOC  802, 804 may include various system components, resources and custom circuitry for managing sensor data, analog-to-digital  conversions, wireless data transmissions, and for performing other specialized operations, such as decoding data packets and processing encoded audio and video signals for rendering in a web browser. For example, the system components and resources 824 of the first SOC 802 may include power amplifiers, voltage regulators, oscillators, phase-locked loops, peripheral bridges, data controllers, memory controllers, system controllers, access ports, timers, and other similar components used to support the processors and software clients running on a wireless device. The system components and resources 824 and/or custom circuitry 822 may also include circuitry to interface with peripheral devices, such as cameras, electronic displays, wireless communication devices, external memory chips, etc.
The first and  second SOC  802, 804 may communicate via interconnection/bus module 850. The  various processors  810, 812, 814, 816, 818, may be interconnected to one or more memory elements 820, system components and resources 824, and custom circuitry 822, and a thermal management unit 832 via an interconnection/bus module 826. Similarly, the processor 852 may be interconnected to the power management unit 854, the mmWave transceivers 856, memory 858, and various additional processors 860 via the interconnection/bus module 864. The interconnection/ bus module  826, 850, 864 may include an array of reconfigurable logic gates and/or implement a bus architecture (e.g., CoreConnect, AMBA, etc. ) . Communications may be provided by advanced interconnects, such as high-performance networks-on chip (NoCs) .
The first and/or  second SOCs  802, 804 may further include an input/output module (not illustrated) for communicating with resources external to the SOC, such as a clock 806 and a voltage regulator 808. Resources external to the SOC (e.g., clock 806, voltage regulator 808) may be shared by two or more of the internal SOC processors/cores.
In addition to the example SIP 800 discussed above, various embodiments may be implemented in a wide variety of computing systems, which may include a single processor, multiple processors, multicore processors, or any combination thereof.
Various embodiments may be implemented on a variety of computing devices, an example of which is illustrated in FIG. 9 in the form of a smartphone 900. The smartphone 900 may include a first SOC 802 (e.g., a SOC-CPU) coupled to a second SOC 804 (e.g., a 5G capable SOC) . The first and  second SOCs  802, 804 may be coupled to  internal memory  906, 916, a display 912, and to a speaker 914. Additionally, the smartphone 900 may include an antenna 904 for sending and receiving electromagnetic radiation that may be connected to a wireless data link and/or cellular telephone transceiver 908 coupled to one or more processors in the first and/or  second SOCs  802, 804. Smartphones 900 typically also include menu selection buttons or rocker switches 920 for receiving user inputs.
A typical smartphone 900 also includes a sound encoding/decoding (CODEC) circuit 910, which digitizes sound received from a microphone into data packets suitable for wireless transmission and decodes received sound data packets to generate analog signals that are provided to the speaker to generate sound. Also, one or more of the processors in the first and  second SOCs  802, 804, wireless transceiver 908 and CODEC 910 may include a digital signal processor (DSP) circuit (not shown separately) .
The processors of the smart phone 900 may be any programmable microprocessor, microcomputer or multiple processor chip or chips that can be configured by software instructions (applications) to perform a variety of functions, including the functions of the various embodiments described below. In some mobile devices, multiple processors may be provided, such as one processor within an SOC 804 dedicated to wireless communication functions and one processor within an SOC 802 dedicated to running other applications. Typically, software applications may be stored in the  memory  906, 916 before they are accessed and loaded into the processor. The processors may include internal memory sufficient to store the application software instructions.
As used in this application, the terms “component, ” “module, ” “system, ” and the like are intended to include a computer-related entity, such as, but not limited to,  hardware, firmware, a combination of hardware and software, software, or software in execution, which are configured to perform particular operations or functions. For example, a component may be, but is not limited to, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a wireless device and the wireless device may be referred to as a component. One or more components may reside within a process and/or thread of execution and a component may be localized on one processor or core and/or distributed between two or more processors or cores. In addition, these components may execute from various non-transitory computer readable media having various instructions and/or data structures stored thereon. Components may communicate by way of local and/or remote processes, function or procedure calls, electronic signals, data packets, memory read/writes, and other known network, computer, processor, and/or process related communication methodologies.
A number of different cellular and mobile communication services and standards are available or contemplated in the future, all of which may implement and benefit from the various embodiments. Such services and standards include, e.g., third generation partnership project (3GPP) , long term evolution (LTE) systems, third generation wireless mobile communication technology (3G) , fourth generation wireless mobile communication technology (4G) , fifth generation wireless mobile communication technology (5G) , global system for mobile communications (GSM) , universal mobile telecommunications system (UMTS) , 3GSM, general packet radio service (GPRS) , code division multiple access (CDMA) systems (e.g., cdmaOne, CDMA1020TM) , enhanced data rates for GSM evolution (EDGE) , advanced mobile phone system (AMPS) , digital AMPS (IS-136/TDMA) , evolution-data optimized (EV-DO) , digital enhanced cordless telecommunications (DECT) , Worldwide Interoperability for Microwave Access (WiMAX) , wireless local area network (WLAN) , Wi-Fi Protected Access I &II (WPA, WPA2) , and integrated digital enhanced network (iDEN) . Each of these technologies involves, for example, the  transmission and reception of voice, data, signaling, and/or content messages. It should be understood that any references to terminology and/or technical details related to an individual telecommunication standard or technology are for illustrative purposes only, and are not intended to limit the scope of the claims to a particular communication system or technology unless specifically recited in the claim language.
Various embodiments illustrated and described are provided merely as examples to illustrate various features of the claims. However, features shown and described with respect to any given embodiment are not necessarily limited to the associated embodiment and may be used or combined with other embodiments that are shown and described. Further, the claims are not intended to be limited by any one example embodiment. For example, one or more of the operations of the methods 400–600 may be substituted for or combined with one or more operations of the methods 400–600.
The processors discussed in this application may be any programmable microprocessor, microcomputer or multiple processor chip or chips that can be configured by software instructions (applications) to perform a variety of functions, including the functions of the various embodiments described above. In some devices, multiple processors may be provided, such as one processor dedicated to wireless communication functions and one processor dedicated to running other applications. Typically, software applications may be stored in the internal memory before they are accessed and loaded into the processors. The processors may include internal memory sufficient to store the application software instructions. In many devices, the internal memory may be a volatile or nonvolatile memory, such as flash memory, or a mixture of both. For the purposes of this description, a general reference to memory refers to memory accessible by the processors including internal memory or removable memory plugged into the device and memory within the processors themselves. Additionally, as used herein, any reference to a memory may be a reference to a memory storage and the terms may be used interchangeably.
Implementation examples are described in the following paragraphs. While some of the following implementation examples are described in terms of example methods, further example implementations may include: the example methods discussed in the following paragraphs implemented by a computing device including a processor configured with processor-executable instructions to perform operations of the methods of the following implementation examples; the example methods discussed in the following paragraphs implemented by a computing device including means for performing functions of the methods of the following implementation examples; and the example methods discussed in the following paragraphs may be implemented as a non-transitory processor-readable storage medium having stored thereon processor-executable instructions configured to cause a processor of a computing device to perform the operations of the methods of the following implementation examples.
Example 1. A method of assessing visual quality of a computer game output, including: obtaining a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off; obtaining a second frame recording of the same segment of game replay generated with the VRS feature on; temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording; and using a visual quality assessment tool to compare the first frame recording to the second frame recording to output a comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
Example 2. The method of example 1, in which temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording includes: sampling frames in each of the first frame recording and the second frame recording; computing peak signal-to-noise ratio (PSNR) values of the sampled frames in each of the first frame recording and the second frame recording as an output statistic of PSNRs for describing a state at each index pair; storing all index pair states with PSNR-based values in a memory matrix M to reduce repeated computations for index pair states in neighbor regions; using steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording; and outputting sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
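As a concrete illustration of the alignment in example 2, the following Python/NumPy sketch hill-climbs (steepest ascent) over candidate start-index pairs, caching each visited state's mean PSNR in a memo matrix M so neighboring states are not recomputed. The window size, frame count K, and function names are illustrative assumptions, not part of the disclosure.

```python
import numpy as np

def psnr(a, b, peak=255.0):
    # Peak signal-to-noise ratio between two equal-sized frames.
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def align(frames_off, frames_on, window=8, k=4):
    """Find start indices (i, j) maximizing the mean PSNR of K frame pairs.

    A memo matrix M caches the state value for each visited index pair so
    the neighboring states examined by the hill climb are not recomputed.
    """
    M = np.full((window, window), np.nan)

    def state(i, j):
        if np.isnan(M[i, j]):
            M[i, j] = np.mean([psnr(frames_off[i + t], frames_on[j + t])
                               for t in range(k)])
        return M[i, j]

    i = j = window // 2                      # start in the middle of the sync window
    while True:                              # steepest-ascent climb over (i, j)
        best = (state(i, j), i, j)
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < window and 0 <= nj < window:
                best = max(best, (state(ni, nj), ni, nj))
        if (best[1], best[2]) == (i, j):
            return i, j                      # local maximum: aligned start indices
        i, j = best[1], best[2]
```

With a VRS-on recording that lags the VRS-off recording by one frame (plus capture noise), the climb lands on an index pair whose offset recovers that lag.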
Example 3. The method of example 2, in which the visual quality assessment tool receives as input the sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices and outputs key performance indicator (KPI) values of each of the first frame recording and the second frame recording for comparison.
Example 4. The method of either of examples 2 or 3, in which: obtaining the first frame recording of the segment of game replay generated with the VRS feature off includes extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off, in which K is a positive integer number; obtaining the second frame recording of the segment of game replay generated with the VRS feature on includes extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on; and computing PSNR values of the sampled frames in each of the first frame recording and the second frame recording includes computing the PSNR between the first K frames of game sequences starting from index I with the VRS feature off and the first K frames of game sequences starting from index J with the VRS feature on, in which I, J is a pair of candidate indices sampled from a fixed length frame sync window.
Example 5. The method of example 4, in which computing the PSNR between the first K frames of game sequences with the VRS feature off and the first K frames of game sequences with the VRS feature on includes: generating a first sample frame sequence S1 based on the game replay/observe recording sample including the first K frames of game sequences with the VRS feature off; generating a second sample frame sequence S2 based on the game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on; generating a pair comparison sequence P = {P(S′1i, S′2i), i ∈ [1, N]}, in which N is a length of frame pairs and N is less than min(len(S1), len(S2)); and generating an output array (P(S′1i, S′2i)) based on the pair comparison sequence.
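A minimal sketch of the pair comparison sequence of example 5, assuming NumPy arrays as frames; the psnr helper and the function names are illustrative assumptions.

```python
import numpy as np

def psnr(a, b, peak=255.0):
    # Peak signal-to-noise ratio between two equal-sized frames.
    mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def pair_comparison(s1, s2, n):
    """Pair comparison sequence P = {P(S'1i, S'2i), i in [1, N]}.

    s1 holds frames from the VRS-off recording, s2 frames from the VRS-on
    recording; n must be less than min(len(s1), len(s2)).
    """
    assert n < min(len(s1), len(s2))
    return np.array([psnr(s1[t], s2[t]) for t in range(n)])
```

For two frames differing by a constant value of 1 at every pixel, the MSE is 1 and each entry equals 10·log10(255²) ≈ 48.13 dB.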
Example 6. The method of any of examples 2-5, in which using the visual quality assessment tool to compare the first frame recording to the second frame recording includes: using discrete Fourier transform (DFT) based operators to separate a high-frequency part from a low-frequency part of the sampled frames of each of the first frame recording and the second frame recording; using the high-frequency part of the sampled frames to identify inverse DFT shift frames; and using the inverse DFT shift frames to output KPI sharpness scores for each of the first frame recording and the second frame recording.
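The DFT-based sharpness KPI of example 6 might be sketched as follows; the square low-frequency cutoff and the log-magnitude scoring are assumptions for illustration, not the disclosed parameterization.

```python
import numpy as np

def dft_sharpness(frame, cutoff=8):
    """KPI sharpness score from the high-frequency part of one frame.

    The spectrum is shifted so DC is centered, a low-frequency square of
    side 2*cutoff around DC is zeroed, and the inverse shifted DFT of the
    remainder (the "inverse DFT shift frame") is scored by its mean log
    magnitude.
    """
    f = np.fft.fftshift(np.fft.fft2(frame.astype(np.float64)))
    r, c = frame.shape[0] // 2, frame.shape[1] // 2
    f[r - cutoff:r + cutoff, c - cutoff:c + cutoff] = 0   # drop low frequencies
    high = np.fft.ifft2(np.fft.ifftshift(f))              # inverse DFT shift frame
    mag = np.abs(high)
    return float(np.mean(20.0 * np.log10(mag + 1e-8)))
```

Blurring a frame attenuates its high-frequency content, so a blurred copy scores lower than the original, which is the behavior the KPI relies on when comparing VRS-on to VRS-off frames.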
Example 7. The method of any of examples 2-5, in which using the visual quality assessment tool to compare the first frame recording to the second frame recording includes one or more of: using a HaslerS03 method to output a colorfulness output of each of the first frame recording and the second frame recording; using a mean of a grayscale method to output a brightness KPI score of each of the first frame recording and the second frame recording; using a DFT-based algorithm to output a sharpness KPI score of each of the first frame recording and the second frame recording; using a BRISQUE algorithm to output a perception KPI score of each of the first frame recording and the second frame recording; or using a video multimethod assessment fusion (VMAF) method to output a fidelity KPI score of each of the first frame recording and the second frame recording.
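The first two KPIs of example 7 can be sketched directly. The colorfulness formula below follows the commonly cited Hasler-Suesstrunk (2003) metric and the brightness uses a luma-weighted grayscale mean; both are assumptions as to the exact form used, since the disclosure only names the methods.

```python
import numpy as np

def colorfulness_hasler_s03(rgb):
    # Hasler-Suesstrunk (2003) colorfulness metric on an H x W x 3 RGB frame.
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    rg = r - g                      # red-green opponent channel
    yb = 0.5 * (r + g) - b          # yellow-blue opponent channel
    std_root = np.sqrt(rg.std() ** 2 + yb.std() ** 2)
    mean_root = np.sqrt(rg.mean() ** 2 + yb.mean() ** 2)
    return float(std_root + 0.3 * mean_root)

def brightness_kpi(rgb):
    # Brightness KPI: mean of the luma-weighted grayscale conversion.
    weights = np.array([0.299, 0.587, 0.114])
    return float((rgb.astype(np.float64) @ weights).mean())
```

A pure gray frame has zero colorfulness (both opponent channels vanish), while its brightness equals the common gray level.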
The foregoing method descriptions and the process flow diagrams are provided merely as illustrative examples and are not intended to require or imply that the steps of the various embodiments must be performed in the order presented. As will be appreciated by one of skill in the art, the steps in the foregoing embodiments may be performed in any order. Words such as “thereafter,” “then,” “next,” etc. are not intended to limit the order of the steps; these words are simply used to guide the reader through the description of the methods. Further, any reference to claim elements in the singular, for example, using the articles “a,” “an,” or “the” is not to be construed as limiting the element to the singular.
The various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The hardware used to implement the various illustrative logics, logical blocks, modules, components, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor (DSP) , an application specific integrated circuit (ASIC) , a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but, in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be  implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration. Alternatively, some steps or methods may be performed by circuitry that is specific to a given function.
In one or more exemplary embodiments, the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a non-transitory computer-readable medium or non-transitory processor-readable medium. The steps of a method or algorithm disclosed herein may be embodied in a processor-executable software module and/or processor-executable instructions, which may reside on a non-transitory computer-readable or non-transitory processor-readable storage medium. Non-transitory server-readable, computer-readable or processor-readable storage media may be any storage media that may be accessed by a computer or a processor. By way of example but not limitation, such non-transitory server-readable, computer-readable or processor-readable media may include RAM, ROM, EEPROM, FLASH memory, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that may be used to store desired program code in the form of instructions or data structures and that may be accessed by a computer. Disk and disc, as used herein, includes compact disc (CD) , laser disc, optical disc, DVD, floppy disk, and Blu-ray disc where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above are also included within the scope of non-transitory server-readable, computer-readable and processor-readable media. Additionally, the operations of a method or algorithm may reside as one or any combination or set of codes and/or instructions on a non-transitory server-readable, processor-readable medium and/or computer-readable medium, which may be incorporated into a computer program product.
The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various  modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the following claims and the principles and novel features disclosed herein.

Claims (28)

  1. A method of assessing visual quality of a computer game output, comprising:
    obtaining a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off;
    obtaining a second frame recording of the same segment of game replay generated with the VRS feature on;
    temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording; and
    using a visual quality assessment tool to compare the first frame recording to the second frame recording to output a comparison of a visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
  2. The method of claim 1, wherein temporally aligning the first frame recording and the second frame recording to identify the alignment index of each of the first frame recording and the second frame recording comprises:
    sampling frames in each of the first frame recording and the second frame recording;
    computing peak signal-to-noise-ratio (PSNR) values of the sampled frames in each of the first frame recording and the second frame recording as an output statistic of PSNRs for describing a state at each index pair;
    storing all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions;
    using steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and  identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording; and
    outputting sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
  3. The method of claim 2, wherein the visual quality assessment tool receives as input the sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices and outputs key performance indicator (KPI) values of each of the first frame recording and the second frame recording for comparison.
  4. The method of claim 2, wherein:
    obtaining the first frame recording of the segment of game replay generated with the VRS feature off comprises extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off, wherein K is a positive integer number;
    obtaining the second frame recording of the segment of game replay generated with the VRS feature on comprises extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on; and
    computing PSNR values of the sampled frames in each of the first frame recording and the second frame recording comprises computing the PSNR between the first K frames of game sequences starting from index I with the VRS feature off and the first K frames of game sequences starting from index J with the VRS feature on, wherein I, J is a pair of candidate indices sampled from a fixed length frame sync window.
  5. The method of claim 4, wherein computing the PSNR between the first K frames of game sequences with the VRS feature off and the first K frames of game sequences with the VRS feature on comprises:
    generating a first sample frame sequence S1 based on the game replay/observe recording sample including the first K frames of game sequences with the VRS feature off;
    generating a second sample frame sequence S2 based on the game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on;
    generating a pair comparison sequence P = {P(S′1i, S′2i), i ∈ [1, N]}, wherein N is a length of frame pairs and N is less than min(len(S1), len(S2)); and
    generating an output array (P(S′1i, S′2i)) based on the pair comparison sequence.
  6. The method of claim 2, wherein using the visual quality assessment tool to compare the first frame recording to the second frame recording comprises:
    using discrete Fourier transform (DFT) based operators to separate a high-frequency part from a low-frequency part of the sampled frames of each of the first frame recording and the second frame recording;
    using the high-frequency part of the sampled frames to identify inverse DFT shift frames; and
    using the inverse DFT shift frames to output KPI sharpness scores for each of the first frame recording and the second frame recording.
  7. The method of claim 2, wherein using the visual quality assessment tool to compare the first frame recording to the second frame recording comprises one or more of:
    using a HaslerS03 method to output a colorfulness output of each of the first frame recording and the second frame recording;
    using a mean of a grayscale method to output a brightness KPI score of each of the first frame recording and the second frame recording;
    using a DFT-based algorithm to output a sharpness KPI score of each of the first frame recording and the second frame recording;
    using a BRISQUE algorithm to output a perception KPI score of each of the first frame recording and the second frame recording; or
    using a video multimethod assessment fusion (VMAF) method to output a fidelity KPI score of each of the first frame recording and the second frame recording.
  8. A computing device, comprising:
    a memory; and
    a processor coupled to the memory and configured to:
    obtain a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off;
    obtain a second frame recording of the same segment of game replay generated with the VRS feature on;
    temporally align the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording; and
    use a visual quality assessment tool to compare the first frame recording to the second frame recording to output a comparison of a visual quality of a computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
  9. The computing device of claim 8, wherein the processor is further configured to temporally align the first frame recording and the second frame recording to identify the alignment index of each of the first frame recording and the second frame recording by:
    sampling frames in each of the first frame recording and the second frame recording;
    computing peak signal-to-noise-ratio (PSNR) values of the sampled frames in each of the first frame recording and the second frame recording as an output statistic of PSNRs for describing a state at each index pair;
    storing all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions;
    using steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording; and
    outputting sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
  10. The computing device of claim 9, wherein the visual quality assessment tool receives as input the sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices and outputs key performance indicator (KPI) values of each of the first frame recording and the second frame recording for comparison.
  11. The computing device of claim 9, wherein the processor is further configured to:
    obtain the first frame recording of the segment of game replay generated with the VRS feature off by extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off, wherein K is a positive integer number;
    obtain the second frame recording of the segment of game replay generated with the VRS feature on by extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on; and
    compute PSNR values of the sampled frames in each of the first frame recording and the second frame recording by computing the PSNR between the first K frames of game sequences starting from index I with the VRS feature off and the first K frames of game sequences starting from index J with the VRS feature on, wherein I, J is a pair of candidate indices sampled from a fixed length frame sync window.
  12. The computing device of claim 11, wherein the processor is further configured to compute the PSNR between the first K frames of game sequences with the VRS feature off and the first K frames of game sequences with the VRS feature on by:
    generating a first sample frame sequence S1 based on the game replay/observe recording sample including the first K frames of game sequences with the VRS feature off;
    generating a second sample frame sequence S2 based on the game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on;
    generating a pair comparison sequence P = {P(S′1i, S′2i), i ∈ [1, N]}, wherein N is a length of frame pairs and N is less than min(len(S1), len(S2)); and
    generating an output array (P(S′1i, S′2i)) based on the pair comparison sequence.
  13. The computing device of claim 9, wherein the processor is further configured to use the visual quality assessment tool to compare the first frame recording to the second frame recording by:
    using discrete Fourier transform (DFT) based operators to separate a high-frequency part from a low-frequency part of the sampled frames of each of the first frame recording and the second frame recording;
    using the high-frequency part of the sampled frames to identify inverse DFT shift frames; and
    using the inverse DFT shift frames to output KPI sharpness scores for each of the first frame recording and the second frame recording.
  14. The computing device of claim 9, wherein the processor is further configured to use the visual quality assessment tool to compare the first frame recording to the second frame recording using one or more of:
    a HaslerS03 method to output a colorfulness output of each of the first frame recording and the second frame recording;
    a mean of a grayscale method to output a brightness KPI score of each of the first frame recording and the second frame recording;
    a DFT-based algorithm to output a sharpness KPI score of each of the first frame recording and the second frame recording;
    a BRISQUE algorithm to output a perception KPI score of each of the first frame recording and the second frame recording; or
    a video multimethod assessment fusion (VMAF) method to output a fidelity KPI score of each of the first frame recording and the second frame recording.
  15. A computing device, comprising:
    means for obtaining a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off;
    means for obtaining a second frame recording of the same segment of game replay generated with the VRS feature on;
    means for temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording; and
    means for using a visual quality assessment tool to compare the first frame recording to the second frame recording to output a comparison of a visual quality of a computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
  16. The computing device of claim 15, wherein means for temporally aligning the first frame recording and the second frame recording to identify the alignment index of each of the first frame recording and the second frame recording comprises:
    means for sampling frames in each of the first frame recording and the second frame recording;
    means for computing peak signal-to-noise-ratio (PSNR) values of the sampled frames in each of the first frame recording and the second frame recording as an output statistic of PSNRs for describing a state at each index pair;
    means for storing all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions;
    means for using steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording; and
    means for outputting sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
  17. The computing device of claim 16, wherein the visual quality assessment tool receives as input the sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices and outputs key performance indicator (KPI) values of each of the first frame recording and the second frame recording for comparison.
  18. The computing device of claim 16, wherein:
    means for obtaining the first frame recording of the segment of game replay generated with the VRS feature off comprises means for extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off, wherein K is a positive integer number;
    means for obtaining the second frame recording of the segment of game replay generated with the VRS feature on comprises means for extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on; and
    means for computing PSNR values of the sampled frames in each of the first frame recording and the second frame recording comprises means for computing the PSNR between the first K frames of game sequences starting from index I with the VRS feature off and the first K frames of game sequences starting from index J with the VRS feature on, wherein I, J is a pair of candidate indices sampled from a fixed length frame sync window.
  19. The computing device of claim 18, wherein means for computing the PSNR between the first K frames of game sequences with the VRS feature off and the first K frames of game sequences with the VRS feature on comprises:
    means for generating a first sample frame sequence S1 based on the game replay/observe recording sample including the first K frames of game sequences with the VRS feature off;
    means for generating a second sample frame sequence S2 based on the game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on;
    means for generating a pair comparison sequence P = {P(S′1i, S′2i), i ∈ [1, N]}, wherein N is a length of frame pairs and N is less than min(len(S1), len(S2)); and
    means for generating an output array (P(S′1i, S′2i)) based on the pair comparison sequence.
  20. The computing device of claim 16, wherein means for using the visual quality assessment tool to compare the first frame recording to the second frame recording comprises:
    means for using discrete Fourier transform (DFT) based operators to separate a high-frequency part from a low-frequency part of the sampled frames of each of the first frame recording and the second frame recording;
    means for using the high-frequency part of the sampled frames to identify inverse DFT shift frames; and
    means for using the inverse DFT shift frames to output KPI sharpness scores for each of the first frame recording and the second frame recording.
  21. The computing device of claim 16, wherein means for using the visual quality assessment tool to compare the first frame recording to the second frame recording comprises one or more of:
    means for performing a HaslerS03 method to output a colorfulness output of each of the first frame recording and the second frame recording;
    means for performing a mean of a grayscale method to output a brightness KPI score of each of the first frame recording and the second frame recording;
    means for using a DFT-based algorithm to output a sharpness KPI score of each of the first frame recording and the second frame recording;
    means for using a BRISQUE algorithm to output a perception KPI score of each of the first frame recording and the second frame recording; or
    means for using a video multimethod assessment fusion (VMAF) method to output a fidelity KPI score of each of the first frame recording and the second frame recording.
  22. A non-transitory processor readable medium having stored thereon processor-executable instructions configured to cause a processor of a computing device to perform operations for assessing visual quality of a computer game output comprising:
    obtaining a first frame recording of a segment of game replay generated with a variable rate shading (VRS) feature off;
    obtaining a second frame recording of the same segment of game replay generated with the VRS feature on;
    temporally aligning the first frame recording and the second frame recording to identify an alignment index of each of the first frame recording and the second frame recording; and
    using a visual quality assessment tool to compare the first frame recording to the second frame recording to output a comparison of the visual quality of the computer game output with the VRS feature on to the visual quality of the computer game output with the VRS feature off.
  23. The non-transitory processor readable medium of claim 22, wherein the stored processor-executable instructions are further configured to cause a processor of a computing device to perform operations such that temporally aligning the first frame recording and the second frame recording to identify the alignment index of each of the first frame recording and the second frame recording comprises:
    sampling frames in each of the first frame recording and the second frame recording;
    computing peak signal-to-noise-ratio (PSNR) values of the sampled frames in each of the first frame recording and the second frame recording as an output statistic of PSNRs for describing a state at each index pair;
    storing all index pair states with PSNR based value in a memory matrix M to reduce repeated computations for index pair states in neighbor regions;
    using steepest ascent analysis of the PSNR values to match up the sampled frames in each of the first frame recording and the second frame recording and identify a maximum estimation of a state matrix and start frame indices of each of the first frame recording and the second frame recording; and
    outputting sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices for comparison by the visual quality assessment tool.
  24. The non-transitory processor readable medium of claim 23, wherein the stored processor-executable instructions are further configured to cause a processor of a computing device to perform operations such that the visual quality assessment tool receives as input the sampled frames of each of the first frame recording and the second frame recording corresponding to the identified start frame indices and outputs key performance indicator (KPI) values of each of the first frame recording and the second frame recording for comparison.
  25. The non-transitory processor readable medium of claim 23, wherein the stored processor-executable instructions are further configured to cause a processor of a computing device to perform operations such that:
    obtaining the first frame recording of the segment of game replay generated with the VRS feature off comprises extracting a game replay/observe recording sample including the first K frames of game sequences with the VRS feature off, wherein K is a positive integer;
    obtaining the second frame recording of the segment of game replay generated with the VRS feature on comprises extracting a game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on; and
    computing PSNR values of the sampled frames in each of the first frame recording and the second frame recording comprises computing the PSNR between the K frames of game sequences starting from index I with the VRS feature off and the K frames of game sequences starting from index J with the VRS feature on, wherein (I, J) is a pair of candidate indices sampled from a fixed-length frame sync window.
  26. The non-transitory processor readable medium of claim 25, wherein the stored processor-executable instructions are further configured to cause a processor of a computing device to perform operations such that computing the PSNR between the first K frames of game sequences with the VRS feature off and the first K frames of game sequences with the VRS feature on comprises:
    generating a first sample frame sequence S1 based on the game replay/observe recording sample including the first K frames of game sequences with the VRS feature off;
    generating a second sample frame sequence S2 based on the game replay/observe recording sample including the first K frames of the same game sequences with the VRS feature on;
    generating a pair comparison sequence P = {P(S′1i, S′2i), i ∈ [1, N]}, wherein N is a length of frame pairs and N is less than min(len(S1), len(S2)); and
    generating an output array (P(S′1i, S′2i)) based on the pair comparison sequence.
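The pair comparison sequence of claim 26 can be illustrated as follows. This is a sketch under assumptions: the claims do not specify the comparison function P, so per-pair PSNR is used here, and the `pair_comparison` name and `n` parameter are illustrative.

```python
import numpy as np

def psnr(a, b, peak=255.0):
    """Per-pair PSNR, assumed here as the comparison function P."""
    mse = np.mean((np.asarray(a, float) - np.asarray(b, float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

def pair_comparison(s1, s2, n=None):
    """Build the output array for P = {P(S'1i, S'2i), i in [1, N]}.

    N must not exceed min(len(S1), len(S2)), matching the claim's bound.
    """
    n = n or min(len(s1), len(s2))
    assert n <= min(len(s1), len(s2))
    return np.array([psnr(s1[i], s2[i]) for i in range(n)])
```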
  27. The non-transitory processor readable medium of claim 23, wherein the stored processor-executable instructions are further configured to cause a processor of a computing device to perform operations such that using the visual quality assessment tool to compare the first frame recording to the second frame recording comprises:
    using discrete Fourier transform (DFT) based operators to separate a high-frequency part from a low-frequency part of the sampled frames of each of the first frame recording and the second frame recording;
    using the high-frequency part of the sampled frames to identify inverse DFT shift frames; and
    using the inverse DFT shift frames to output KPI sharpness scores for each of the first frame recording and the second frame recording.
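A DFT-based sharpness score in the spirit of claim 27 can be sketched as follows: the spectrum is shifted, a low-frequency block at the center is zeroed to isolate the high-frequency part, and the inverse-shifted, inverse-transformed frame is reduced to a score. The `radius` cutoff and the log-magnitude reduction are illustrative assumptions; the claims do not fix them.

```python
import numpy as np

def dft_sharpness(frame: np.ndarray, radius: int = 2) -> float:
    """Sharpness KPI from the high-frequency part of a grayscale frame."""
    f = np.fft.fftshift(np.fft.fft2(frame.astype(np.float64)))
    h, w = frame.shape
    cy, cx = h // 2, w // 2
    # Zero a low-frequency block at the spectrum center, keeping only
    # the high-frequency part of the frame.
    f[cy - radius:cy + radius + 1, cx - radius:cx + radius + 1] = 0
    # Inverse DFT shift and inverse transform back to the spatial domain.
    recon = np.fft.ifft2(np.fft.ifftshift(f))
    mag = np.abs(recon)
    return float(np.mean(20 * np.log10(mag + 1e-8)))
```

Under this sketch, a frame with strong high-frequency content (e.g. a checkerboard) scores higher than a flat frame whose energy is entirely in the zeroed low frequencies.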
  28. The non-transitory processor readable medium of claim 23, wherein the stored processor-executable instructions are further configured to cause a processor of a computing device to perform operations such that using the visual quality assessment tool to compare the first frame recording to the second frame recording comprises one or more of:
    using a HaslerS03 method to output a colorfulness KPI score of each of the first frame recording and the second frame recording;
    using a mean-of-grayscale method to output a brightness KPI score of each of the first frame recording and the second frame recording;
    using a DFT-based algorithm to output a sharpness KPI score of each of the first frame recording and the second frame recording;
    using a BRISQUE algorithm to output a perception KPI score of each of the first frame recording and the second frame recording; or
    using a video multimethod assessment fusion (VMAF) method to output a fidelity KPI score of each of the first frame recording and the second frame recording.
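Two of the listed KPI methods have closed-form definitions and can be sketched directly: the Hasler–Süsstrunk (2003) colorfulness metric and the mean-of-grayscale brightness score. BRISQUE and VMAF are omitted here since they rely on trained models rather than simple formulas. The BT.601 grayscale weights are an assumption; the claims do not specify the grayscale conversion.

```python
import numpy as np

def hasler_s03_colorfulness(rgb: np.ndarray) -> float:
    """Hasler-Suesstrunk colorfulness:
    C = sqrt(std_rg^2 + std_yb^2) + 0.3 * sqrt(mean_rg^2 + mean_yb^2).
    """
    r, g, b = (rgb[..., c].astype(np.float64) for c in range(3))
    rg = r - g                      # red-green opponent channel
    yb = 0.5 * (r + g) - b          # yellow-blue opponent channel
    std = np.hypot(rg.std(), yb.std())
    mean = np.hypot(rg.mean(), yb.mean())
    return float(std + 0.3 * mean)

def brightness(rgb: np.ndarray) -> float:
    """Brightness KPI as the mean of a BT.601-weighted grayscale frame."""
    gray = rgb.astype(np.float64) @ np.array([0.299, 0.587, 0.114])
    return float(gray.mean())
```

As a sanity check, a neutral gray frame has zero colorfulness (both opponent channels vanish), while a saturated red frame scores well above it.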
PCT/CN2022/119304 2022-09-16 2022-09-16 Systems and methods for efficient feature assessment for game visual quality WO2024055286A1 (en)


Publications (1)

Publication Number Publication Date
WO2024055286A1 true WO2024055286A1 (en) 2024-03-21

Family

ID=90273972


Country Status (1)

Country Link
WO (1) WO2024055286A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080012856A1 (en) * 2006-07-14 2008-01-17 Daphne Yu Perception-based quality metrics for volume rendering
US20150049800A1 (en) * 2013-08-16 2015-02-19 Nvidia Corporation Estimation of entropy encoding bits in video compression
CN113940066A (en) * 2019-06-10 2022-01-14 Microsoft Technology Licensing, LLC Selectively enhancing compressed digital content
CN114549683A (en) * 2022-02-25 2022-05-27 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Image rendering method and device and electronic equipment



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22958477

Country of ref document: EP

Kind code of ref document: A1