US20080012870A1 - Color correction of digital video images using a programmable graphics processing unit - Google Patents


Info

Publication number
US20080012870A1
US20080012870A1 (US Application No. 11/775,923)
Authority
US
United States
Prior art keywords
color space
buffer
video information
processing unit
buffer containing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/775,923
Inventor
Sean Gies
James Batson
Tim Cherna
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Apple Inc
Original Assignee
Apple Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Apple Inc filed Critical Apple Inc
Priority to US 11/775,923
Publication of US 2008/0012870 A1
Legal status: Abandoned

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 - Details of colour television systems
    • H04N9/64 - Circuits for processing colour signals
    • H04N9/67 - Circuits for processing colour signals for matrixing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/001 - Texturing; Colouring; Generation of texture or colour
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02 - Control arrangements or circuits for visual indicators characterised by the way in which colour is displayed
    • G09G5/36 - Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/363 - Graphics controllers
    • G09G2360/00 - Aspects of the architecture of display systems
    • G09G2360/18 - Use of a frame buffer in a display terminal, inclusive of the display panel


Abstract

A system which utilizes the processing capabilities of the graphics processing unit (GPU) in the graphics controller. Each frame of each video stream or track is decoded into a buffer and a color profile indicating parameters of the color space of the video source is associated with the buffer. The compositor uses the color profile to convert each buffer to a defined working color space from the source color space. This conversion and rendering of the buffer is performed using the fragment processing capabilities of the GPU. The compositor then instructs the GPU to convert the buffer to the final color space of the display device and the frame is rendered to the frame buffer for final display. Each of these operations is done in real time for each frame of the video.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This continuation application claims priority to U.S. patent application Ser. No. 11/113,817 entitled “Color Correction of Digital Video Images Using a Programmable Graphics Processing Unit,” filed Apr. 25, 2005 and which is hereby incorporated by reference in its entirety. The subject matter of the invention is generally related to the following jointly owned and co-pending patent application: “Display-Wide Visual Effects for a Windowing System Using a Programmable Graphics Processing Unit” by Ralph Brunner and John Harper, Ser. No. 10/877,358, filed Jun. 25, 2004, which is also incorporated herein by reference in its entirety.
  • BACKGROUND
  • The invention relates generally to computer display technology and, more particularly, to the application of visual effects using a programmable graphics processing unit during frame-buffer composition in a computer system.
  • Presentation of video on digital devices is becoming more common with the increases in processing power, storage capability and telecommunications speed. Programs such as QuickTime by Apple Computer, Inc., allow the display of various video formats on a computer. In operation, QuickTime must decode each frame of the video from its encoded format and then provide the decoded image to a compositor in the operating system for display.
  • Because of the limited power of the CPU, it has not been possible to provide real time color compensation for a single video stream, much less multiple video streams. As a result, when real time video is displayed using a computer, the colors are generally incorrect because no compensation can be performed. Instead, generic color spaces and conversions are used. Thus a displayed image's appearance will change for each video source and video output. Since compensation is impractical for even one video stream, the problem is compounded when multiple video streams are involved.
  • Thus, it would be beneficial to provide a mechanism by which real time video can be color compensated, both for video source and for the ultimate display device. Further, it would be beneficial to do this for multiple, simultaneous video streams.
  • SUMMARY
  • A system according to the present invention utilizes the processing capabilities of the graphics processing unit (GPU) in the graphics controller. Each frame of each video stream or track is decoded into a buffer and a color profile indicating parameters of the color space of the video source is associated with the buffer. After all of the streams have been decoded, the compositor uses the color profile to convert each buffer to a defined working color space from the source color space. This conversion and rendering of the buffer is performed using the fragment processing capabilities of the GPU. After any other desired operations, the compositor instructs the GPU to convert the buffer to the final color space of the display device and the frame is rendered to the frame buffer for final display. Each of these operations is done in real time for each frame of the video. Because each stream or frame is properly color converted, the final displayed image will be uniformly colored for each video source and each display.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an illustration of a computer system with various video sources and displays.
  • FIG. 2 shows an exemplary block diagram of the computer of FIG. 1.
  • FIG. 3 shows an exemplary software environment of the computer of FIG. 1.
  • FIG. 4 shows a flowchart of operation of video software according to the present invention.
  • FIG. 5 shows a flowchart of operation of a compositor according to the present invention.
  • FIG. 6 shows operations and data of a graphics processing unit according to the present invention.
  • DETAILED DESCRIPTION
  • Methods and devices to provide real time video color compensation using fragment programs executing on a programmable graphics processing unit are described. The compensation can be done for multiple video streams and compensates for the video source, conversion errors and display device. The following embodiments of the invention, described in terms of the Mac OS X window server and compositing application and the QuickTime video application, are illustrative only and are not to be considered limiting in any respect. (The Mac OS X operating system and QuickTime are developed, distributed and supported by Apple Computer, Inc. of Cupertino, Calif.)
  • Referring now to FIG. 1, a computer system is shown. A computer 100, such as a PowerMac G5 from Apple Computer, Inc., has connected a monitor or graphics display 102 and a keyboard 104. A mouse or pointing device 108 is connected to the keyboard 104. A video display 106 is also connected for video display purposes. The display 102 can also be used for video display, but in that case it is usually done in a window in the graphics display.
  • A video camera 110 is shown connected to the computer 100 to provide a first video source. A cable television device 112 is shown as a second video source for the computer 100.
  • It is understood that this is an exemplary computer system and numerous other configurations and devices can be used.
  • Referring to FIG. 2, an exemplary block diagram of the computer 100 is shown. A CPU 200 is connected to a bridge 202. DRAM 204 is connected to the bridge 202 to form the working memory for the CPU 200. A graphics controller 206, which preferably includes a graphics processing unit (GPU) 207, is connected to the bridge 202. The graphics controller 206 is shown including a cable input 208, for connection to the cable device 112; a monitor output 210, for connection to the graphics display 102; and a video output 212, for connection to the video display 106.
  • An I/O chip 214 is connected to the bridge 202 and includes a 1394 or FireWire™ block 216, a USB (Universal Serial Bus) block 218 and a SATA (Serial ATA) block 220. A 1394 port 222 is connected to the 1394 block 216 to receive devices such as the video camera 110. A USB port 224 is connected to the USB block 218 to receive devices such as the keyboard 104 or various other USB devices such as hard drives or video converters. Hard drives 226 are connected to the SATA block 220 to provide bulk storage for the computer 100.
  • It is understood that this is an exemplary block diagram and numerous other arrangements and components could be used.
  • Referring then to FIG. 3, a drawing of exemplary software present on the computer 100 is shown. An operating system 300, such as Mac OS X by Apple Computer, Inc., forms the core piece of software. Various device drivers 302 sit below the operating system 300 and provide interfaces to the various physical devices. Application software 304 runs on the operating system 300.
  • Exemplary drivers are a graphics driver 306 used with the graphics controller 206, a digital video (DV) driver 308 used with the video camera 110 to decode digital video, and a TV tuner driver 310 to work with the graphics controller 206 to control the tuner functions.
  • Particularly relevant to the present invention are two modules in the operating system 300, specifically the compositor 312 and buffer space 314. The compositor 312 has the responsibility of receiving the content from each application for that application's window and combining the content into the final displayed image. The buffer space 314 is used by the applications 304 and the compositor 312 to provide the content and develop the final image.
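The compositor's combining step can be sketched as a simple back-to-front "over" blend of each window's content into the final image. This is only an illustrative sketch; the names and the single-value "pixel" simplification are assumptions, not Mac OS X APIs.

```python
# Hypothetical sketch of the compositor's job: blend each window's content,
# back to front, into the final image using the Porter-Duff "over" operator
# on premultiplied-alpha values. A single gray value stands in for a buffer.
def composite(windows, background):
    # windows: list of (pixel, alpha) layers, ordered back to front
    out = background
    for pixel, alpha in windows:
        out = pixel + (1.0 - alpha) * out   # "over": source plus attenuated dest
    return out

# Two overlapping window layers composited over a black background
final = composite([(0.2, 0.5), (0.3, 0.75)], background=0.0)
```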
  • The exemplary application is QuickTime 316, a video player program in its simplest form. QuickTime can play video from numerous sources, including the cable, video camera and stored video files. The video may include multiple video tracks, not just a single video track.
  • Having set this background, and referring then to FIG. 4, the operations of the QuickTime application 316 are illustrated. In step 400 the QuickTime application 316 decodes track 1. In the illustrated embodiment two tracks are used to develop the actual video image being displayed. It is understood that often a single track or further tracks can be utilized, but the two track example is considered most informative. Further, the tracks can come from real time sources or from a stored or streaming video file. After the QuickTime application 316 decodes track 1 in step 400, it attaches a Composite NTSC color profile in step 402. As known to one skilled in the art, each video source and display operates in a particular color space. A color space is a method of describing the characteristics of color values for the relevant device. There are different color spaces for different devices, some of which are linear and some non-linear. There are numerous other characteristics of particular color spaces. In reference to operation according to the present invention, generally each video source has a color space in which it is operating. In the instance illustrated in track 1, a normal digital camera is utilized to encode and record track 1, thus indicating that it was recorded with the Composite NTSC color profile. Color profiles generally include information such as the device color space, a desired working color space and parameters to convert between the color spaces. See the International Color Consortium specification ICC.1:2004-10 (Profile Version 4.2.0.0), which is hereby incorporated by reference, for more information on color profiles. After the profile is attached to the decoded track, the buffer and attached profile are sent to the compositor 312 in step 404. The QuickTime application 316 then decodes track 2 in step 406. In the illustrated embodiment track 2 is an HDTV image which was recorded by an HDTV camera. Therefore in step 408 the HDTV profile is attached to the decoded information and the combination is provided to the compositor in step 410. It is understood that the color spaces for NTSC, PAL and HDTV all use Y′CbCr encoding, but because the actual encodings differ slightly, NTSC/PAL and HDTV have slightly different conversion parameters or equations; for clarity, this description will generally specify the source color space rather than the encoding scheme. It is also understood that these steps are performed for each frame in the video. It is noted that because these steps are performed for each frame, the color spaces can also be changed with each frame, if desired.
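The per-frame flow of FIG. 4 (decode each track, attach its source color profile, hand the buffer to the compositor) can be sketched as follows. All names here (Buffer, Compositor, decode) are illustrative stand-ins, not actual QuickTime or Mac OS X interfaces.

```python
# Hypothetical sketch of steps 400-410: decode each track into a buffer,
# tag the buffer with the source color profile, and submit it.
from dataclasses import dataclass

@dataclass
class Buffer:
    pixels: list          # decoded frame data (stand-in for real pixel storage)
    profile: str = None   # attached color profile, e.g. "Composite NTSC"

class Compositor:
    def __init__(self):
        self.received = []
    def submit(self, buf):
        self.received.append(buf)   # compositor later converts and composites

def decode(track):
    # stand-in for the codec: returns a buffer of decoded pixel data
    return Buffer(pixels=track["frames"])

def present_frame(tracks, compositor):
    for track in tracks:                 # steps 400/406: decode each track
        buf = decode(track)
        buf.profile = track["profile"]   # steps 402/408: attach color profile
        compositor.submit(buf)           # steps 404/410: send to compositor

tracks = [
    {"frames": [0, 1, 2], "profile": "Composite NTSC"},  # track 1
    {"frames": [3, 4, 5], "profile": "HDTV"},            # track 2
]
comp = Compositor()
present_frame(tracks, comp)
```

As the text notes, this runs once per frame, so a track's profile may change from frame to frame.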
  • Referring then to FIG. 5, the operations of the compositor 312 are illustrated. The compositor 312 converts track 1, which is received from step 404, in step 500. The exemplary conversion for track 1 is from the Composite NTSC color space to an arbitrary or working color space, in this case indicated as being the Linear RGB color space. In the preferred embodiments, two conversions are actually performed, one from Composite NTSC, the source color space in the example, to the XYZ color space, an intermediate color space usually used for such conversions, and then from the XYZ color space to the Linear RGB color space, the working color space in the example embodiment. Only the Composite NTSC profile is provided to the compositor as it has the information needed to perform the XYZ to Linear RGB conversion and knows the results are to be provided in the Linear RGB color space. During this conversion the frame from decoded track 1 is rendered to an ASM or assembly buffer in the buffer space 314. In step 502 the compositor 312 converts track 2 from the HDTV color space to the Linear RGB color space and renders it into the ASM buffer. In step 504 the Linear RGB color space profile is attached to the ASM buffer and then in step 506 the compositor converts the ASM buffer to the proper LCD color space for display by the graphics display 102 and then this is rendered to the frame buffer for ultimate display on the LCD graphics display in the illustrated embodiment. Again in the preferred embodiment this is done by converting through the XYZ color space. Also, only the Linear RGB color profile needs to be attached because the compositor knows the conversion from the intermediate XYZ color space to the display color space and knows to use that conversion because the destination is the frame buffer. It is, of course, understood that should a different display device be used, in step 506 a different color profile or conversion profile would be used to convert from the working or Linear RGB color space to the display device color space. Similarly, a different display source would utilize different source color profiles. As above, these operations are performed for each frame of the video. It is also understood that multiple color profiles could be provided if needed.
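The two-step conversion of step 500 (source color space to XYZ, then XYZ to working Linear RGB) is, at bottom, a pair of 3x3 matrix multiplies per pixel. The matrices below are the well-known linear-sRGB/D65 pair, used purely to illustrate the mechanics; a Composite NTSC source would use matrices derived from its own primaries and white point, which the patent does not enumerate.

```python
# Illustrative sketch of step 500: source RGB -> CIE XYZ -> working Linear RGB,
# each hop a 3x3 matrix multiply (the kind of work the GPU fragment program does
# per pixel). Matrices shown are the standard sRGB/D65 pair, for illustration.

def mat_vec(m, v):
    # 3x3 matrix times 3-vector
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

SRC_TO_XYZ = [  # linear RGB -> CIE XYZ (sRGB primaries, D65 white point)
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
XYZ_TO_WORK = [  # CIE XYZ -> working linear RGB (inverse of the matrix above)
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def to_working(rgb):
    # two-step path through the XYZ intermediate color space
    return mat_vec(XYZ_TO_WORK, mat_vec(SRC_TO_XYZ, rgb))

pixel = [0.5, 0.25, 0.75]
converted = to_working(pixel)
# Because source and working spaces coincide in this toy example, the
# round trip through XYZ returns approximately the input pixel.
```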
  • Referring then to FIG. 6, an illustration of the various data sources and operations of the GPU 207 is shown. A track 1 buffer 600 and the associated Composite NTSC profile 602 are provided to the GPU 207 in operation {circle around (1)}. Then in operation {circle around (2)} the GPU 207 converts the track 1 buffer using the Composite NTSC color profile from the indicated Composite NTSC color space to the desired color space and renders the track 1 buffer into the Linear RGB color space in the ASM buffer 604. The two step conversion process through the XYZ color space and any use of temporary buffers for that process are omitted in FIG. 6 for clarity. The track 2 buffer 606 and its attached HDTV color profile 608 are provided in operation {circle around (3)} to the GPU 207. In operation {circle around (4)} the GPU 207 converts the HDTV color space information from the track 2 buffer into an intermediate color space using its built-in hardware conversion equations for Y′CbCr to RGB color spaces and renders it into a temp buffer 610. In the illustrated embodiment a temp buffer 610 is utilized because the proper HDTV color space or profile utilized on the HDTV video source is slightly different from the Y′CbCr color profile conversion equations utilized in the hardware of the preferred GPU, which are SDTV or NTSC/PAL equations. Therefore, operation {circle around (4)} provides an incorrect result and a correction from the actual color space utilized by the GPU 207 is required. Thus, in operation {circle around (5)} the temp buffer 610 is provided to the GPU 207 and then operation {circle around (6)} performs the correction and the final conversion and renders the temporary buffer contents, i.e., the incorrectly color-space-encoded track 2 values, in the proper Linear RGB color space into the ASM buffer 604. Of course, other corrections can be performed if desired. This ASM buffer 604 and its attached Linear RGB or related color profile are then provided again to the GPU 207 in operation {circle around (7)}, which then in operation {circle around (8)} provides a final conversion to the proper color space of the LCD display device, for example, and provides this information to the frame buffer 616.
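The correction in operations 4 through 6 can be sketched as follows: the hardware decodes HDTV Y′CbCr with its built-in SDTV (BT.601) equations, giving wrong RGB values, and a later pass multiplies by a correction matrix that undoes the 601 decode and applies the HDTV (BT.709) one. The matrices are the standard full-range BT.601 and BT.709 decode equations; the patent does not disclose the actual hardware path or fragment-program code, so the structure here is an assumption.

```python
# Sketch of operations 4-6: hardware applies the wrong (SDTV) Y'CbCr decode,
# then a correction matrix M709 * inv(M601) fixes the result in a later pass.

def mat_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def mat_mul(a, b):
    return [[sum(a[r][k] * b[k][c] for k in range(3)) for c in range(3)]
            for r in range(3)]

def inv3(m):
    # 3x3 inverse via the adjugate (transposed cofactor) matrix
    (a, b, c), (d, e, f), (g, h, i) = m
    cof = [[e*i - f*h, -(d*i - f*g), d*h - e*g],
           [-(b*i - c*h), a*i - c*g, -(a*h - b*g)],
           [b*f - c*e, -(a*f - c*d), a*e - b*d]]
    det = a*cof[0][0] + b*cof[0][1] + c*cof[0][2]
    return [[cof[c][r] / det for c in range(3)] for r in range(3)]

# Full-range Y'CbCr -> RGB decode matrices; input is [Y', Cb, Cr] with
# chroma centered at zero.
M601 = [[1.0, 0.0,       1.402],     # SDTV (BT.601): what the hardware applies
        [1.0, -0.344136, -0.714136],
        [1.0, 1.772,     0.0]]
M709 = [[1.0, 0.0,       1.5748],    # HDTV (BT.709): what track 2 actually needs
        [1.0, -0.187324, -0.468124],
        [1.0, 1.8556,    0.0]]

# Correction applied in operations 5-6: undo the 601 decode, apply the 709 one
CORRECTION = mat_mul(M709, inv3(M601))

ycbcr = [0.5, 0.1, -0.2]                 # a sample HDTV-encoded pixel
wrong = mat_vec(M601, ycbcr)             # operation 4: incorrect SDTV decode
corrected = mat_vec(CORRECTION, wrong)   # operations 5-6: correction pass
# "corrected" now matches a direct BT.709 decode of the original Y'CbCr.
```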
  • The various buffers can be located in either the DRAM 204 or in memory contained on the graphics controller 206, though the frame buffer is almost always contained on the graphics controller for performance reasons.
  • Thus, an efficient method of performing real-time color space conversion from a video source to a final display device has been described. The GPU and its fragment programs provide sufficient computational power to perform the operations in real time, whereas the CPU cannot. Because of these color conversions, the video is displayed with accurate colors.
  • Various changes in the components as well as in the details of the illustrated operational methods are possible without departing from the scope of the following claims. For instance, in the illustrative system of FIGS. 1, 2 and 3 there may be additional assembly buffers, temporary buffers, frame buffers and/or GPUs. In addition, acts in accordance with FIGS. 4, 5, and 6 may be performed by two or more cooperatively coupled GPUs and may, further, receive input from one or more system processing units (e.g., CPUs). It will further be understood that fragment programs may be organized into one or more modules and, as such, may be tangibly embodied as program code stored in any suitable storage device. Storage devices suitable for use in this manner include, but are not limited to: magnetic disks (fixed, floppy, and removable) and tape; optical media such as CD-ROMs and digital video disks (“DVDs”); and semiconductor memory devices such as Electrically Programmable Read-Only Memory (“EPROM”), Electrically Erasable Programmable Read-Only Memory (“EEPROM”), Programmable Gate Arrays, and flash devices. It is further understood that the video source can be any video source, be it live or stored, and in any video format.
  • The preceding description was presented to enable any person skilled in the art to make and use the invention as claimed and is provided in the context of the particular examples discussed above, variations of which will be readily apparent to those skilled in the art. Accordingly, the claims appended hereto are not intended to be limited by the disclosed embodiments, but are to be accorded their widest scope consistent with the principles and features disclosed herein.

Claims (15)

1. A method for displaying digital video, comprising:
converting a buffer containing decoded video information in a source color space to a buffer in a working color space using a graphics processing unit to perform the conversion; and
converting the buffer containing video information in the working color space to a buffer in a display color space using a graphics processing unit to perform the conversion.
2. The method of claim 1, wherein the buffer containing the decoded video information in the source color space has an associated color profile of the video source device and wherein the buffer containing video information in the working color space has an associated color profile of the display device.
3. The method of claim 1, further comprising:
converting a second buffer containing decoded video information in a second source color space to the buffer in the working color space using a graphics processing unit to perform the conversion.
4. The method of claim 1, further comprising:
decoding video information into the buffer containing decoded video information in the source color space.
5. The method of claim 1, wherein the digital video is a series of frames and the source color space may change between two frames.
6. A computer readable medium or media having computer-executable instructions stored therein for performing the following method for displaying digital video, the method comprising:
converting a buffer containing decoded video information in a source color space to a buffer in a working color space using a graphics processing unit to perform the conversion; and
converting the buffer containing video information in the working color space to a buffer in a display color space using a graphics processing unit to perform the conversion.
7. The computer readable medium or media of claim 6, wherein the buffer containing the decoded video information in the source color space has an associated color profile of the video source device and wherein the buffer containing video information in the working color space has an associated color profile of the display device.
8. The computer readable medium or media of claim 6, the method further comprising:
converting a second buffer containing decoded video information in a second source color space to the buffer in the working color space using a graphics processing unit to perform the conversion.
9. The computer readable medium or media of claim 6, the method further comprising:
decoding video information into the buffer containing decoded video information in the source color space.
10. The computer readable medium or media of claim 6, wherein the digital video is a series of frames and the source color space may change between two frames.
11. A computer system comprising:
a central processing unit;
memory, operatively coupled to the central processing unit, said memory adapted to provide a plurality of buffers, including a frame buffer;
a display port operatively coupled to the frame buffer and adapted to couple to a display device;
a graphics processing unit, operatively coupled to the memory; and
one or more programs for causing the graphics processing unit to perform the following method, the method including:
converting a buffer containing decoded video information in a source color space to a buffer in a working color space using a graphics processing unit to perform the conversion; and
converting the buffer containing video information in the working color space to a buffer in a display color space using a graphics processing unit to perform the conversion.
12. The computer system of claim 11, wherein the buffer containing the decoded video information in the source color space has an associated color profile of the video source device and wherein the buffer containing video information in the working color space has an associated color profile of the display device.
13. The computer system of claim 11, the method further comprising:
converting a second buffer containing decoded video information in a second source color space to the buffer in the working color space using a graphics processing unit to perform the conversion.
14. The computer system of claim 11, the method further comprising:
decoding video information into the buffer containing decoded video information in the source color space.
15. The computer system of claim 11, wherein the digital video is a series of frames and the source color space may change between two frames.
US11/775,923 2005-04-25 2007-07-11 Color correction of digital video images using a programmable graphics processing unit Abandoned US20080012870A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/775,923 US20080012870A1 (en) 2005-04-25 2007-07-11 Color correction of digital video images using a programmable graphics processing unit

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/113,817 US7312800B1 (en) 2005-04-25 2005-04-25 Color correction of digital video images using a programmable graphics processing unit
US11/775,923 US20080012870A1 (en) 2005-04-25 2007-07-11 Color correction of digital video images using a programmable graphics processing unit

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/113,817 Continuation US7312800B1 (en) 2005-04-25 2005-04-25 Color correction of digital video images using a programmable graphics processing unit

Publications (1)

Publication Number Publication Date
US20080012870A1 true US20080012870A1 (en) 2008-01-17

Family

ID=38863307

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/113,817 Active 2025-08-28 US7312800B1 (en) 2005-04-25 2005-04-25 Color correction of digital video images using a programmable graphics processing unit
US11/775,923 Abandoned US20080012870A1 (en) 2005-04-25 2007-07-11 Color correction of digital video images using a programmable graphics processing unit

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/113,817 Active 2025-08-28 US7312800B1 (en) 2005-04-25 2005-04-25 Color correction of digital video images using a programmable graphics processing unit

Country Status (1)

Country Link
US (2) US7312800B1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008117442A1 (en) * 2007-03-27 2008-10-02 Pioneer Corporation Content setting value information acquiring device, content output system, content setting value information acquiring method, its program, and recording medium on which its program is recorded
US8749574B2 (en) * 2008-06-06 2014-06-10 Apple Inc. Method and apparatus for improved color management
US8483286B2 (en) 2010-10-27 2013-07-09 Cyberlink Corp. Batch processing of media content
KR20130141920A (en) * 2012-06-18 2013-12-27 삼성디스플레이 주식회사 System and method for converting color gamut
US20140289368A1 (en) * 2013-03-22 2014-09-25 Thomson Licensing Device and method for generating a media package

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010008428A1 (en) * 2000-01-17 2001-07-19 Lg Electronics, Inc. Device and method for decoding televison video signal
US6310659B1 (en) * 2000-04-20 2001-10-30 Ati International Srl Graphics processing device and method with graphics versus video color space conversion discrimination
US20030001861A1 (en) * 2000-01-10 2003-01-02 Watson David W. Method and apparatus for pixel filtering using shared filter resource between overlay and texture mapping engines
US6518970B1 (en) * 2000-04-20 2003-02-11 Ati International Srl Graphics processing device with integrated programmable synchronization signal generation
US20040190617A1 (en) * 2003-03-28 2004-09-30 Microsoft Corporation Accelerating video decoding using a graphics processing unit
US20050141778A1 (en) * 2003-12-26 2005-06-30 Konica Minolta Photo Imaging, Inc. Image processing method, image processing apparatus and image processing program
US7038690B2 (en) * 2001-03-23 2006-05-02 Microsoft Corporation Methods and systems for displaying animated graphics on a computing device
US20060209080A1 (en) * 2005-03-18 2006-09-21 Telefonaktiebolaget L M Ericsson (Publ) Memory management for mobile terminals with limited amounts of graphics memory
US20070247468A1 (en) * 2004-04-16 2007-10-25 Mark Zimmer System and method for processing graphics operations with graphics processing unit
US7593021B1 (en) * 2004-09-13 2009-09-22 Nvidia Corp. Optional color space conversion

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009092020A1 (en) * 2008-01-18 2009-07-23 Qualcomm Incorporated Multi-format support for surface creation in a graphics processing system
US20090184977A1 (en) * 2008-01-18 2009-07-23 Qualcomm Incorporated Multi-format support for surface creation in a graphics processing system
US20100027907A1 (en) * 2008-07-29 2010-02-04 Apple Inc. Differential image enhancement
US8229211B2 (en) 2008-07-29 2012-07-24 Apple Inc. Differential image enhancement
US8553976B2 (en) 2008-07-29 2013-10-08 Apple Inc. Differential image enhancement
US20100225657A1 (en) * 2009-03-06 2010-09-09 Sakariya Kapil V Systems and methods for operating a display
US8508542B2 (en) 2009-03-06 2013-08-13 Apple Inc. Systems and methods for operating a display
US20120133790A1 (en) * 2010-11-29 2012-05-31 Google Inc. Mobile device image feedback
US20130236161A1 (en) * 2012-03-09 2013-09-12 Panasonic Corporation Image processing device and image processing method
WO2018203115A1 (en) 2017-05-04 2018-11-08 Inspired Gaming (Uk) Limited Generation of variations in computer graphics from intermediate file formats of limited variability, including generation of different game appearances or game outcomes
US10210700B2 (en) 2017-05-04 2019-02-19 Inspired Gaming (Uk) Limited Generation of variations in computer graphics from intermediate file formats of limited variability, including generation of different game outcomes
US10322339B2 (en) 2017-05-04 2019-06-18 Inspired Gaming (Uk) Limited Generation of variations in computer graphics from intermediate formats of limited variability, including generation of different game appearances

Also Published As

Publication number Publication date
US7312800B1 (en) 2007-12-25

Similar Documents

Publication Publication Date Title
US7312800B1 (en) Color correction of digital video images using a programmable graphics processing unit
US8625666B2 (en) 4:4:4 color space video with 4:2:0 color space video encoders and decoders systems and methods
US7643675B2 (en) Strategies for processing image information using a color information data structure
US7564470B2 (en) Compositing images from multiple sources
US7453522B2 (en) Video data processing apparatus
US20080101455A1 (en) Apparatus and method for multiple format encoding
US8379039B2 (en) Reformatting content with proper color-region conversion
US8723891B2 (en) System and method for efficiently processing digital video
US7710434B2 (en) Rotation and scaling optimization for mobile devices
US10715847B2 (en) Custom data indicating nominal range of samples of media content
US7483037B2 (en) Resampling chroma video using a programmable graphics processing unit to provide improved color rendering
US7414632B1 (en) Multi-pass 4:2:0 subpicture blending
US20080253449A1 (en) Information apparatus and method
US11323701B2 (en) Systems and methods for group of pictures encoding
US8189681B1 (en) Displaying multiple compressed video streams on display devices
JP2007274229A (en) Information processing apparatus and method, and program
US20070097146A1 (en) Resampling selected colors of video information using a programmable graphics processing unit to provide improved color rendering on LCD displays
US20080226166A1 (en) Image signal processing device, image signal processing method and image signal processing program product
JPWO2005101819A1 (en) Display device
US20070097144A1 (en) Resampling individual fields of video information using a programmable graphics processing unit to provide improved full rate displays
JP5394447B2 (en) Strategies for processing image information using color information data structures
WO2010047706A1 (en) Decompressing a video stream on a first computer system, and scaling and displaying the video stream on a second computer system

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION