US20140267784A1 - Methods and systems for measuring and correcting electronic visual displays - Google Patents

Info

Publication number
US20140267784A1
US20140267784A1 (US 2014/0267784 A1; Application US 13/830,678)
Authority
US
United States
Prior art keywords
display
subpixels
pixels
illuminated
pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US13/830,678
Other versions
US8836797B1
Inventor
Ronald F. Rykowski
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Radiant Vision Systems LLC
Original Assignee
Radiant Vision Systems LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US 13/830,678 (granted as US 8,836,797 B1)
Application filed by Radiant Vision Systems LLC
Assigned to RADIANT ZEMAX, LLC (assignment of assignors interest; see document for details). Assignors: RYKOWSKI, RONALD F.
Assigned to FIFTH THIRD BANK (security agreement). Assignors: RADIANT ZEMAX, LLC
Priority to PCT/US2014/022134 (WO 2014/159134 A1)
Priority to CN 201480027705.0 (CN 105247607 A)
Priority to US 14/458,695 (US 9,135,851 B2)
Publication of US 8,836,797 B1
Application granted
Publication of US 2014/0267784 A1
Assigned to RADIANT VISION SYSTEMS, LLC (f/k/a RADIANT ZEMAX, LLC). Release by secured party (see document for details): FIFTH THIRD BANK
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G3/22Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources
    • G09G3/30Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/006Electronic inspection or testing of displays and display drivers, e.g. of LED or LCD displays
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/004Diagnosis, testing or measuring for television systems or their details for digital television systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/04Diagnosis, testing or measuring for television systems or their details for receivers
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • G09G2320/0233Improving the luminance or brightness uniformity across the screen
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/06Adjustment of display parameters
    • G09G2320/0693Calibration of display systems
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/14Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2360/145Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light originating from the display screen
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10TTECHNICAL SUBJECTS COVERED BY FORMER US CLASSIFICATION
    • Y10T29/00Metal working
    • Y10T29/49Method of mechanical manufacture
    • Y10T29/49002Electrical device making
    • Y10T29/49004Electrical device making including measuring or testing of device or component part

Definitions

  • the present disclosure relates generally to electronic visual displays, and more particularly, to methods and systems for measuring and calibrating the output from such displays.
  • Displays of increasingly high resolution are used in a wide variety of contexts, from personal electronics with screens a few inches or smaller in size to computer screens and televisions several feet across to scoreboards and billboards covering hundreds of square feet.
  • Some displays are assembled from a series of smaller panels, each of which may further consist of a series of internally connected modules.
  • Virtually all displays are made up of arrays of individual light-emitting elements called “pixels.”
  • each pixel is made up of a plurality of light-emitting points (e.g., one red, one green, and one blue). The light-emitting points are termed “subpixels.”
  • it is often desirable for a display to be calibrated. For example, calibration may improve the uniformity of the display and improve consistency between displays.
  • the color and brightness of each pixel or subpixel is measured. Adjustments are determined so the pixels can display particular colors at desired brightness levels. The adjustments are then stored (e.g., in software or firmware that controls the display or module), so that those adjustments or correction factors can be applied.
  • FIG. 1 is a schematic view of an electronic visual display calibration system configured in accordance with an embodiment of the disclosure.
  • FIG. 2 is an isometric front view of an electronic visual display calibration system configured in accordance with an embodiment of the disclosure.
  • FIG. 3 is a schematic block diagram of the electronic visual display calibration system of FIG. 1 .
  • FIGS. 4A and 4B are enlarged partial front views of a portion of an electronic visual display configured to be used with embodiments of the disclosure.
  • FIG. 5 is a diagram of a color gamut triangle.
  • FIG. 6 is a flow diagram of a method or process configured in accordance with an embodiment of the disclosure.
  • a display measurement method and/or system configured in accordance with one aspect of the disclosure is configured to measure the luminance and the color of the individual pixels or subpixels of an electronic visual display, such as a high-resolution liquid crystal display (“LCD”) or an organic light-emitting diode (“OLED”) display.
  • LCD liquid crystal display
  • OLED organic light-emitting diode
  • a pattern generator e.g., standalone hardware test equipment, a logic analyzer add-on module, a computer peripheral, software in a computing device or controller connected to the display, output from a serial digital interface (“SDI”), digital video interface (“DVI”) or high-definition multimedia interface (“HDMI”) port, etc.
  • SDI serial digital interface
  • DVI digital video interface
  • HDMI high-definition multimedia interface
  • the pattern generator illuminates only every third or every fourth pixel of the display, such that the pixels between them remain off.
  • the technology uses an imaging device (which typically has a considerably higher resolution than the display itself) to measure only the illuminated pixels (and/or subpixels). Because only a subset of the pixels are illuminated and measured at once, the display under test effectively has a much lower pixel resolution. After measuring the illuminated pixels, the pattern can then be shifted (e.g., by one pixel) and the measurements repeated until all of the pixels of the display have been measured.
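  • The measure-and-shift loop described above can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation; `measure_one_pixel` is a hypothetical stand-in for the imaging device, here simulated by a simple function.

```python
# Sketch of the measure-and-shift loop: light every nth pixel, measure only
# the lit pixels, shift the pattern, and repeat until every pixel is covered.

def measure_display(width, height, n, measure_one_pixel):
    """Measure every pixel by showing n*n shifted sparse patterns."""
    results = {}
    for dy in range(n):            # vertical shift of the pattern
        for dx in range(n):        # horizontal shift of the pattern
            # Pixels lit in this pattern: every nth pixel, offset by (dx, dy).
            lit = [(x, y)
                   for y in range(dy, height, n)
                   for x in range(dx, width, n)]
            for (x, y) in lit:     # the imaging device measures only lit pixels
                results[(x, y)] = measure_one_pixel(x, y)
    return results

# Toy example: an 8x4 "display" whose true luminance at (x, y) is x + 10 * y.
data = measure_display(8, 4, 4, lambda x, y: x + 10 * y)
assert len(data) == 8 * 4          # every pixel measured exactly once
```

With n = 4 the loop shows sixteen patterns, each lighting one sixteenth of the pixels, matching the pattern counts discussed later in the disclosure.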
  • the effective resolution is 384 × 216 pixels.
  • a camera with a resolution of approximately 2,300 × 1,300 (i.e., a camera readily available for a reasonable price) could potentially be used.
  • many conventional approaches for analyzing the 1,920 × 1,080 pixel HDTV display would require a camera having a resolution of approximately 12,000 × 6,000, or 72,000,000 pixels. Such a camera (with resolution high enough for the display to be measured) is expected to be prohibitively expensive and/or unavailable. As a result, measuring and calibrating such displays using conventional techniques is often impractical and/or too expensive.
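  • The resolution arithmetic behind these figures can be made concrete. The sketch below assumes roughly six camera sensor pixels per measured display pixel per axis, a hypothetical figure chosen to be consistent with the numbers quoted above:

```python
def required_camera_resolution(display_w, display_h, n, px_per_pixel=6):
    """Camera resolution needed to resolve every nth display pixel.

    `px_per_pixel` is an assumed number of camera pixels per measured
    display pixel per axis (a hypothetical heuristic, ~6 here).
    """
    return (display_w // n * px_per_pixel, display_h // n * px_per_pixel)

# Conventional approach: every pixel lit at once (n = 1).
assert required_camera_resolution(1920, 1080, 1) == (11520, 6480)  # ~12,000 x 6,000
# Sparse pattern lighting every fifth pixel: effective resolution 384 x 216.
assert required_camera_resolution(1920, 1080, 5) == (2304, 1296)   # ~2,300 x 1,300
```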
  • Another conventional approach for measuring such large or high-resolution displays is to divide the display (or its constituent panels or modules) into sections small enough that the imaging system has sufficient resolving power to enable an accurate measurement of the pixels or subpixels of each section.
  • the imaging device or the display being measured
  • the imaging device is generally mounted on an x-y stage for horizontal and vertical positioning, or rotated to align to each section being measured.
  • Moving or rotating either the camera or the display requires additional, potentially expensive equipment, as well as time to perform the movement or rotation and to align the imaging device to the display.
  • this technique can lead to slight mismatches or discontinuities of measurement between the individual sections. If the measurements are used for uniformity correction, such mismatches must be addressed, typically with further measurements and/or post-processing the display measurement data.
  • embodiments of the present technology are expected to enable precise measurement of individual pixel or subpixel output for any display (e.g., an OLED display) without requiring expensive, high resolution imaging devices, and without additional equipment for moving the relationship between the imaging device and the display, time for moving and aligning them, or mismatches between sections of the display.
  • any display e.g., an OLED display
  • FIGS. 1-6 Certain details are set forth in the following description and in FIGS. 1-6 to provide a thorough understanding of various embodiments of the disclosure. However, other details describing well-known structures and systems often associated with visual displays and related optical equipment and/or other aspects of visual display calibration systems are not set forth below to avoid unnecessarily obscuring the description of various embodiments of the disclosure.
  • FIG. 1 is a schematic view of an electronic visual display calibration system (“the system”) 100 configured in accordance with an embodiment of the disclosure.
  • the system 100 is configured to collect, manage, and/or analyze display data for the purpose of processing image patterns (e.g., static image patterns, video streams comprised of a series of image patterns, etc.) that are shown on an electronic visual display 150 .
  • the pattern 160 shown on the display 150 is generated by a pattern generator 110 .
  • the display 150 can be, for example, a large electronic display or sign composed of smaller panels or modules.
  • the pattern 160 generated by the pattern generator 110 and displayed on the display 150 illustrated in FIG. 1 is described in further detail below in connection with FIGS. 4A-4B .
  • the system 100 includes a computing device 130 operably coupled to an imaging device 120 (e.g., an imaging colorimeter or other photometer).
  • the imaging device 120 is spaced apart from the display 150 (e.g., so that the entire display 150 is within the field of view of the imaging device 120 , and, in the case of a large elevated sign, for improving the convenience of measurement) and configured to sense or capture display information (e.g., color data, luminance data, etc.) from selectively illuminated pixels or subpixels 160 of the display 150 .
  • the pattern generator 110 can illuminate every nth pixel of the display 150 .
  • the captured display information is transferred from the imaging device 120 to the computing device 130 .
  • the pattern generator 110 can generate additional patterns 160 on the display 150 .
  • the pattern generator 110 can illuminate every next nth pixel of the display 150 . This process can be repeated (e.g., n times) until the computing device 130 obtains display information for all the pixels or subpixels of the entire display 150 .
  • the computing device 130 is configured to store, manage, and/or analyze the display information from each pattern 160 to determine one or more correction factors for the display 150 or for its pixels or subpixels.
  • the correction factors for the display 150 are applied to the firmware and/or software controlling the display 150 to calibrate the display 150 .
  • the corrections are applied in real time to a video stream to be shown on the display 150 .
  • the technology includes comparing the actual display value with a desired display value for the one or more portions of the display 150 , and determining a correction factor for the pixels or subpixels of the display 150 as determined from the measurements of the patterns 160 described above.
  • the technology processes or adjusts the image with the correction factors for the corresponding pixels of the display 150 .
  • the technology can further include transmitting the image to the display 150 and showing the image on the display 150 . Accordingly, in some embodiments, the image on the display 150 can be presented according to the desired display values without modifying or calibrating the actual display 150 .
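  • The comparison-and-correction step described above can be sketched as follows (hypothetical names and toy per-pixel data; real systems apply such gains in firmware or to a live video stream):

```python
# Sketch of per-pixel correction: compare actual and desired display values,
# derive a gain per pixel, and apply it to an image before display.

def correction_factors(measured, desired):
    """Per-pixel gain so each pixel's output matches the desired value."""
    return {p: desired[p] / measured[p] for p in measured}

def apply_correction(image, factors):
    """Scale each pixel's drive level by its correction factor."""
    return {p: image[p] * factors[p] for p in image}

measured = {(0, 0): 100.0, (1, 0): 80.0}   # actual luminance per pixel
desired  = {(0, 0): 90.0,  (1, 0): 90.0}   # target uniform luminance
gains = correction_factors(measured, desired)
out = apply_correction({(0, 0): 200, (1, 0): 200}, gains)
# The dimmer pixel is driven harder; the brighter pixel is attenuated.
assert out[(1, 0)] > out[(0, 0)]
```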
  • Although the system 100 illustrated in FIG. 1 includes separate components (e.g., the pattern generator 110 , the imaging device 120 , and the computing device 130 ), in other embodiments the system 100 can incorporate more or fewer than three components. Moreover, the various components can be further divided into subcomponents, or the various components and functions may be combined and integrated. In addition, these components can communicate via wired or wireless communication, as well as by information contained in storage media. The various components and features of the electronic visual display calibration system 100 are described in greater detail below in connection with FIG. 3 .
  • FIG. 2 is an isometric front view of an electronic visual display calibration system 200 configured in accordance with an embodiment of the disclosure.
  • the system 200 is configured to perform correction of the brightness and color of light-emitting elements that are used in electronic visual displays.
  • the calibration system 200 can include a test pattern generator 210 , a test station 240 , an interface 230 , and an electronic visual display 250 .
  • the calibration system 200 is designed to calibrate a display 250 that is placed within the test station 240 . In alternate embodiments, it is possible to calibrate multiple displays or multiple panels of a larger display within the test station 240 .
  • the test pattern generator 210 is configured to generate a series of test patterns 260 , each of which illuminates a proper subset of the pixels or subpixels of the display 250 .
  • the test station 240 is configured to capture a series of images from an imaging area covering all of the display 250 .
  • the captured image data is transferred from the test station 240 to the interface 230 .
  • the interface 230 compiles and manages the image data, performs a series of calculations to determine the appropriate correction factors that should be made to the image data, and then stores the data. This process is repeated until images of each of the pixels or subpixels of display 250 have been obtained.
  • the processed correction data is then uploaded from the interface 230 to the firmware and/or software controlling the display 250 and used to recalibrate the display 250 .
  • the test station 240 can include a lightproof chamber for calibrating a display 250 in a fully-illuminated room or factory.
  • the test station 240 can include a digital camera 220 mounted on the top portion 244 of the test station 240 .
  • the test station 240 can further include light baffles to eliminate any stray light that might be reflected off the walls of the test station chamber 242 back into the camera 220 .
  • the display 250 is positioned beneath the test station 240 .
  • the test station 240 includes mechanical and electrical fixtures for receiving the display 250 and placing it in position within the test station 240 for calibration.
  • test station 240 may be in other orientations, e.g., facing upward at a display positioned above the test station or facing horizontally. Further, in some embodiments the test station 240 may have a different arrangement and/or include different features.
  • the test station 240 also incorporates a ground glass diffuser 246 positioned just above the display 250 .
  • the diffuser 246 scatters the light emitted from each subpixel in the display 250 , which effectively partially integrates the emitted light angularly. Accordingly, the camera 220 is actually measuring the average light emitted into a cone rather than only the light traveling directly from each subpixel on the display 250 toward the camera 220 .
  • One advantage of this arrangement is that the display 250 will be corrected to optimize viewing over a wider angular range.
  • the diffuser 246 is an optional component that may not be included in some embodiments.
  • the interface 230 that is operably coupled to the test station 240 is configured to manage the data that is collected, stored, and used for calculation of new correction factors that will be used to recalibrate the display 250 .
  • the interface 230 automates the operation of the pattern generator 210 and the test station 240 and writes all the data into a database.
  • the interface 230 can be a personal computer with software for pattern selection, camera control, image data acquisition, and image data analysis.
  • various devices capable of operating the software can be used, such as handheld computers.
  • FIG. 3 is a schematic block diagram of the electronic visual display calibration system 100 of FIG. 1 .
  • the imaging device 120 can include a camera 320 , such as a digital camera suitable for high-resolution imaging.
  • the camera 320 can include optics capable of measuring subpixels of the display 150 (which can be a few millimeters in size) from a distance of 25 meters or more. If the displayed pattern 160 does not illuminate adjacent subpixels or pixels, imaging resolution requirements for the camera 320 may be less stringent, allowing the use of a less expensive imaging device 120 .
  • the camera 320 can be a CCD camera.
  • Suitable CCD digital color cameras include ProMetric® imaging colorimeters and photometers, which are commercially available from the assignee of the present disclosure, Radiant Zemax, LLC, of Redmond, Wash.
  • the camera 320 can be a complementary metal oxide semiconductor (“CMOS”) camera, or another type of suitable camera for imaging with sufficient resolution at a certain distance from the display.
  • CMOS complementary metal oxide semiconductor
  • the imaging device 120 can also include a lens 322 .
  • the lens 322 can be a reflecting telescope that is operably coupled to the camera 320 to provide sufficiently high resolution for long distance imaging of the display 150 .
  • the lens 322 can include other suitable configurations for viewing and/or capturing display information from the display 150 .
  • Suitable imaging devices 320 and lenses 322 are disclosed in U.S. Pat. Nos. 7,907,154 and 7,911,485, both of which are incorporated herein by reference in their entireties.
  • the imaging device 120 can accordingly be positioned at a distance L from the display 150 .
  • the distance L can vary depending on the size of the display 150 , and can include relatively large distances.
  • the imaging device 120 can be positioned at a distance L that is generally similar to a typical viewing distance of the display 150 .
  • the imaging device 120 can be positioned in a seating area facing toward the display 150 .
  • the distance L can be less than a typical viewing distance, and the imaging system 120 can be configured to account for any viewing distance and/or direction differences.
  • the imaging device 120 has a wide field of view and the distance L can be less than the width of the display 150 (e.g., approximately one meter for a typical HDTV display). In other embodiments, the imaging device 120 has a long-focus lens 322 (e.g., a telephoto lens) and the distance L can be significantly greater than the width of the display 150 (e.g., between approximately 100 and 300 meters for an outdoor billboard or video screen). In yet other embodiments, the distance L can have other values.
  • the computing device 130 is configured to cause the pattern generator 110 to send images 160 (e.g., pixel or subpixel patterns) to the display 150 .
  • the pattern generator 110 is standalone hardware test equipment, a logic analyzer add-on module, a computer peripheral operably coupled to the computing device 130 , or software in the computing device 130 or in a controller connected to the display 150 .
  • the pattern generator 110 operates independently of the computing device 130 .
  • the patterns 160 are provided to the display 150 via standard video signal input, e.g., using a DVI, HDMI, or SDI input to the display.
  • the patterns 160 generated by the pattern generator 110 for displaying on the electronic visual display 150 are discussed in greater detail in connection with FIGS. 4A and 4B below.
  • the computing device 130 is configured to receive, manage, store, and/or process the display data collected by the imaging device 120 (e.g., for the purpose of adjusting the appearance of images 160 that will be displayed on the display 150 ).
  • display data associated with the display 150 can be processed by a computer that is separate from the imaging device 120 .
  • a typical display 150 , such as a quad extended graphics array (“QXGA”)-resolution (2048 × 1536) visual display for example, can have over nine million subpixels that provide display data for the computing device 130 to manage and process.
  • QXGA quad extended graphics array
  • the pattern generator 110 may illuminate only a fraction of those subpixels at any one time, but by sending a series of patterns 160 to the display 150 , information about all the subpixels will be delivered to the computing device 130 .
  • the computing device 130 includes the necessary hardware and corresponding software components for managing and processing the display data.
  • the computing device 130 configured in accordance with an embodiment of the disclosure can include a processor 330 , a memory 332 , input/output devices 334 , one or more sensors 336 in addition to sensors of the imaging device 120 , and/or any other suitable subsystems and/or components 338 (displays, speakers, communication modules, etc.).
  • the memory 332 can be configured to store the display data from the patterns 160 shown on the display 150 .
  • the computing device 130 includes computer readable media (e.g., memory 332 , disk drives, or other storage media, excluding only a transitory, propagating signal per se) including instructions or software stored thereon that, when executed by the processor 330 or computing device 130 , cause the processor 330 or computing device 130 to process an image as described herein.
  • the processor 330 can be configured for performing or otherwise controlling calculations, analysis, and any other functions associated with the methods described herein.
  • the memory 332 includes software to control the imaging device 120 as well as measurement software to identify portions of the display 150 (e.g., subpixels of the display 150 ) and to image or otherwise extract the display data (e.g., subpixel brightness data, pixel color data, etc.).
  • suitable software for controlling the imaging device 120 and/or acquiring the display data is VisionCALTM screen correction software, which is commercially available from the assignee of the present disclosure, Radiant Zemax, LLC, of Redmond, Wash.
  • other suitable software can be implemented with the system 100 .
  • the memory 332 can also store one or more databases used to store the display data from the patterns 160 shown on display 150 , as well as calculated correction factors for the display data.
  • the database is a Microsoft Access® database designed by the assignee of the present disclosure.
  • the display data is stored in other types of databases or data files.
  • FIG. 4A is an enlarged partial front view of a portion of an electronic visual display 450 configured to be used with embodiments of the disclosure.
  • the illustrated view is representative of a portion of a display 450 (e.g., display 150 ( FIG. 1 ) or display 250 ( FIG. 2 )) displaying a pattern 460 a .
  • the display 450 is made up of a large number (e.g., millions) of individual light sources or light-emitting elements or pixels 430 .
  • Each pixel 430 comprises multiple light-emitting points or subpixels 432 (identified as first, second, and third subpixels 432 a - 432 c , respectively).
  • the subpixels 432 are LEDs or OLEDs.
  • the subpixels 432 a - 432 c can correspond to red, green, and blue LEDs, respectively.
  • each pixel 430 can include more or less than three subpixels 432 .
  • some pixels 430 may have four subpixels 432 (e.g., two green subpixels, one blue subpixel, and one red subpixel, or other combinations). Pixels and subpixels may be laid out in various geometric arrangements (e.g., triangular or hexagonal arrays in various color orders, vertical or oblique stripes, etc.).
  • the red, green, and blue (“RGB”) color space may not be used. Rather, a different color space can serve as the basis for processing and display of color images on the display 450 .
  • the subpixels 432 may be cyan, magenta, and yellow, respectively.
  • the luminance level of each subpixel 432 can vary. Accordingly, the additive primary colors represented by a red subpixel, a green subpixel, and a blue subpixel can be selectively combined to produce the colors within the color gamut defined by a color gamut triangle, as shown in FIG. 5 . For example, when only “pure” red is displayed, the green and blue subpixels may be turned on only slightly to achieve a specific chromaticity for the red color.
  • each pixel 430 or subpixel 432 is measured at input levels (using values from 0 to 255) of 255 (full brightness), 128 (one half brightness), 64 (one quarter brightness), and 32 (one eighth brightness). Data from such measurements can be used in calibration to achieve the same chromaticity for a particular color at various input brightness levels, or, e.g., to improve the uniformity of color and luminance response curves for each pixel or subpixel.
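  • A simplified sketch of deriving per-level gains from such multi-level measurements follows. This is an illustrative linearity correction under assumed names, not the patent's exact calibration procedure; the simulated subpixel response is hypothetical.

```python
# Measure a subpixel at the four input levels mentioned above and derive a
# per-level gain so measured luminance tracks the input level proportionally.

LEVELS = [255, 128, 64, 32]   # full, one half, one quarter, one eighth

def per_level_gains(measure):
    """Gain that would make measured luminance proportional to input level."""
    full = measure(255)       # full brightness is the reference
    return {lvl: (full * lvl / 255) / measure(lvl) for lvl in LEVELS}

# A simulated subpixel with a slightly non-linear response:
gains = per_level_gains(lambda lvl: (lvl / 255) ** 1.1 * 400.0)
assert abs(gains[255] - 1.0) < 1e-9   # reference level needs no adjustment
assert gains[32] > 1.0                # dim levels need a boost
```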
  • an illustrative pattern 460 a illuminates a proper subset of the pixels of the display 450 .
  • every fourth pixel 430 vertically and every fourth pixel 430 horizontally is illuminated, and the pixels between are switched off.
  • the effective pixel density of the display 450 is one sixteenth of the actual pixel density.
  • when the pattern 460 a is displayed on a “4K Ultra HD” television display 450 having a screen resolution of 3,840 × 2,160 pixels (a total of approximately 8.3 million pixels (“megapixels”)), only (3,840/4) × (2,160/4) pixels, i.e., 960 × 540 pixels (a total of approximately five hundred thousand pixels (half a megapixel)), are lit at once.
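  • The pixel counts in this 4K example follow directly; a quick arithmetic check:

```python
# Arithmetic from the 4K example above: every fourth pixel lit at once.
w, h, n = 3840, 2160, 4
total = w * h                       # ~8.3 million pixels
lit = (w // n) * (h // n)           # 960 x 540 pixels lit per pattern
assert total == 8_294_400
assert lit == 960 * 540 == 518_400  # about half a megapixel
assert lit * n * n == total         # 16 shifted patterns cover every pixel
```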
  • imaging equipment e.g., a camera sensor and lens
  • FIG. 4B illustrates another pattern 460b on the same enlarged partial front view of a portion of the electronic visual display 450. In the pattern 460b, each pixel 430 that was illuminated in the pattern 460a of FIG. 4A is switched off, and the next pixel to the right is illuminated. With such patterns, in which one sixteenth of the pixels are lit at a time, measuring the output of each pixel 430 of the display 450 requires displaying and measuring a total of sixteen patterns (multiplied by the number of different brightness levels for each pattern).
  • Different patterns 460 of pixels 430 and/or subpixels 432 could require a smaller or larger number of patterns 460 to ensure full coverage of the display 450. For example, a pattern that illuminates every third pixel 430 horizontally and vertically requires nine patterns to cover every pixel 430 in the display 450.
  • In some embodiments, the patterns 460 illuminate individual subpixels 432 (e.g., one or more at a time of subpixels 432a-432c) rather than whole pixels 430. In some embodiments, the patterns 460 are displayed and measured at more than one brightness level. Separately illuminating each subpixel 432 and measuring individual pixels or subpixels at different brightness levels correspondingly multiplies the number of required measurements.
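The shifted-grid patterns and the pattern counts above can be sketched in a few lines. The following is an illustrative sketch only, not an implementation from the patent; the function name and the frame-buffer representation are assumptions:

```python
import numpy as np

def grid_patterns(width, height, n=4, level=255):
    """Yield the n*n shifted grid patterns that together light every pixel.

    Each pattern illuminates every nth pixel horizontally and vertically
    (as in FIG. 4A); successive patterns shift the grid by one pixel
    (as in FIG. 4B) until the whole display has been covered.
    """
    for dy in range(n):
        for dx in range(n):
            frame = np.zeros((height, width), dtype=np.uint8)
            frame[dy::n, dx::n] = level  # light every nth pixel at offset (dx, dy)
            yield frame

# A 3,840 x 2,160 ("4K Ultra HD") display with n = 4: each of the 16
# patterns lights 960 x 540 = 518,400 pixels, about half a megapixel.
first = next(grid_patterns(3840, 2160))
assert int(np.count_nonzero(first)) == 960 * 540

# The 16 shifted patterns cover every pixel of the display exactly once.
lit = np.zeros((2160, 3840), dtype=np.int32)
for frame in grid_patterns(3840, 2160):
    lit += (frame > 0)
assert lit.min() == 1 and lit.max() == 1
```

An every-third-pixel grid (n = 3) would yield nine patterns by the same construction, matching the count given above.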
  • In some embodiments, patterns are tailored to a particular display or a particular measurement. The patterns are not necessarily as regular or evenly distributed as the examples illustrated in FIGS. 4A-4B, and different patterns may illuminate different numbers of pixels or subpixels. In addition to color and brightness, the subpixels 432 may have other visual properties that can be measured and analyzed in accordance with embodiments of the present disclosure. Moreover, although the displayed patterns 460 are described above with reference to pixels 430 and subpixels 432, other embodiments of the disclosure can be used with displays having different types of light emitting elements or components.
  • FIG. 6 is a flow diagram of a method or process 600 configured in accordance with an embodiment of the disclosure. At block 610, the method includes identifying a fraction 1/n of the pixels or subpixels of the display to be illuminated for measurement. In some embodiments, the technology receives the number, e.g., from user input or from a configuration file.
  • In other embodiments, the technology determines the number based on a heuristic and the characteristics of the display to be measured and the measuring equipment. Such characteristics may include, e.g., the size of the display, the pixel resolution of the display, the pixel density or dot pitch (i.e., the distance between pixels) of the display, the distance from the display to the imaging device, the optical resolving power or angular resolution of the imaging device, and the pixel resolution of the imaging device. An example heuristic is that the pixel resolution of the imaging device should be such that 50 pixels on the imaging device correspond to one illuminated subpixel on the display.
  • Under that heuristic, an imaging device with 6,291,456 camera pixels can capture data from 125,829 subpixels on the display (6,291,456 camera pixels / 50 camera pixels per display subpixel) in a single captured image. In other embodiments, the correlation between the resolution of the imaging device and the display can vary from, e.g., 6 to 200 pixels on the imaging device corresponding to one subpixel on the display. The fraction 1/n is chosen so that the total number of illuminated subpixels will be below the threshold of 125,829 subpixels that can be captured in a single image by the selected imaging device in accordance with the applicable heuristic.
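The heuristic above reduces to a short calculation. The sketch below is illustrative only; the function names are hypothetical, and the 1,920×1,080 RGB display in the usage example is an assumed input rather than one named in the text:

```python
import math

def subpixels_per_image(camera_pixels, camera_px_per_subpixel=50):
    """Display subpixels measurable in one image under the 50:1 heuristic."""
    return camera_pixels // camera_px_per_subpixel

def choose_n(display_subpixels, camera_pixels, camera_px_per_subpixel=50):
    """Smallest n such that lighting 1/n of the subpixels per pattern
    stays at or below the imaging device's single-image threshold."""
    threshold = subpixels_per_image(camera_pixels, camera_px_per_subpixel)
    return math.ceil(display_subpixels / threshold)

# The 6,291,456-pixel imaging device from the text:
assert subpixels_per_image(6_291_456) == 125_829

# For an assumed 1,920 x 1,080 RGB display (6,220,800 subpixels), that
# device could measure 1 of every 50 subpixels per pattern:
assert choose_n(1920 * 1080 * 3, 6_291_456) == 50
```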
  • At block 620, the technology displays a pattern selectively illuminating 1/n of the pixels or subpixels of the display (e.g., in the example above, 1 of every 8 subpixels of the display). For example, every nth pixel (or subpixel of a particular color) may be illuminated. An example of such a pattern is described above in connection with FIG. 4A. As described above, the technology may illuminate each pixel or subpixel at various brightness levels. At block 630, the imaging device captures at least one image of the pattern of pixels or subpixels illuminated on the display. Each subpixel captured by the imaging device can be characterized, e.g., by its color value, typically expressed as chromaticity (Cx, Cy), and its brightness, typically expressed as luminance Lv.
  • At block 640, the captured image data is analyzed by a computing device, e.g., the computing device described above in connection with FIGS. 1 and 3. In some embodiments, the computing device compares the color and brightness of each captured pixel with target color and brightness values, e.g., points within the color gamut defined by a color gamut triangle, such as shown in FIG. 5.
  • In practice, the actual pixel or subpixel color or brightness values may differ from desired or target display values for the display. For example, there is typically significant variation in the color or luminance of each subpixel of the display, especially if the subpixels are LEDs or OLEDs. Moreover, over time the visual properties of the display may degrade or otherwise vary from desired or target display values. Accordingly, at block 650, the technology compares the actual captured and analyzed values with target or desired display values for the pixels or subpixels illuminated according to the displayed pattern, and determines a correction value applicable to each analyzed pixel or subpixel. Determining the correction values can include creating a correction data set or map.
  • In some embodiments, the computing device calculates a three-by-three matrix of values for each pixel that indicates the fractional amount of power at which to turn on each subpixel to obtain each of the three primary colors (red, green, and blue) at target color and brightness levels. A sample matrix is displayed below:

    Fractional values for each subpixel of a pixel

    Primary color    Red     Green   Blue
    Red              0.60    0.10    0.05
    Green            0.15    0.70    0.08
    Blue             0.03    0.08    0.75
  • According to the sample matrix, to display the primary color red, the technology has calculated that the display should turn on its red subpixel at 60% power, its green subpixel at 10% power, and its blue subpixel at 5% power. More generally, each correction factor can compensate for the difference between the captured and analyzed values and the corresponding target display value. For example, if the captured and analyzed value is less bright than the corresponding target display value, the correction factor can include the amount of brightness that would be required for the captured and analyzed value of the pixel or subpixel to be generally equal to the target display value. Moreover, the correction factor can correlate to the corresponding type of display value. For example, the correction value can be expressed in terms of color or brightness correction values, or in terms of other visual display property correction values. Suitable methods and systems for determining correction values or correction factors are disclosed in U.S. Pat. Nos. 7,907,154 and 7,911,485, referenced above.
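As a concrete illustration, the sample matrix can be applied to an input color as a linear combination of its rows. How the matrix is applied is not spelled out in the text, so the mixing scheme below is an assumption; only the matrix values themselves come from the example:

```python
import numpy as np

# Sample per-pixel correction matrix from the example above: each row
# gives the fractional subpixel power levels for one target primary.
#              R-sub  G-sub  B-sub
M = np.array([[0.60, 0.10, 0.05],   # target primary: red
              [0.15, 0.70, 0.08],   # target primary: green
              [0.03, 0.08, 0.75]])  # target primary: blue

def corrected_drive(rgb, matrix=M):
    """Mix the per-primary rows in proportion to the requested color."""
    return np.asarray(rgb, dtype=float) @ matrix

# Displaying "pure" red turns on the red subpixel at 60% power, the
# green subpixel at 10%, and the blue subpixel at 5%.
assert np.allclose(corrected_drive([1.0, 0.0, 0.0]), [0.60, 0.10, 0.05])
```

Note how the off-diagonal entries realize the behavior described in connection with FIG. 5, where the green and blue subpixels are turned on slightly to achieve a specific red chromaticity.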
  • At block 660, the process branches depending on whether all the pixels or subpixels of the display have been illuminated, captured, analyzed, and corrected as described above in blocks 620-650. If the technology illuminates a fraction 1/n of the pixels or subpixels of the display in each pattern, then at least n iterations are required to measure and calibrate the entire display. For example, after displaying a first pattern such as the pattern described above in connection with FIG. 4A, in which every nth pixel (or subpixel of a particular color) is illuminated, the technology returns to block 620. On the next iteration, the technology illuminates a different pattern of pixels or subpixels, e.g., a distinct subset of the pixels or subpixels of the display. For example, the technology might next display the pattern described above in connection with FIG. 4B, in which the next neighbor of every nth pixel (or subpixel of a particular color) is illuminated. The process then continues as described above.
  • After all the pixels or subpixels have been measured, the method 600 can further include sending the calibration correction values to the display. In some embodiments, the correction factors are stored in firmware within the display or a controller of the display. Alternatively, the correction factor data set or map can be saved and, e.g., provided to a third party such as the owner of the display, or used to process video images outside the display such that the display can show the processed image according to desired or target display properties without calibrating or adjusting the display itself. Suitable methods and systems for correcting images to calibrate their appearance on a particular display are disclosed in U.S. patent application Ser. No.
  • In some embodiments, the technology verifies or improves the calibration by measuring the calibrated output of each pixel or subpixel as described in blocks 610-660 above, and optionally modifying the correction factors applied to the display.
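Putting the steps of FIG. 6 together, the measure-and-correct loop might be structured as follows. This is a sketch only: the block numbers in the comments follow the text, but the helper callables (`display_pattern`, `capture`, `analyze`, `correct`) are hypothetical stand-ins for the hardware-facing steps, which depend on the actual display and imaging device:

```python
def calibrate_display(n, display_pattern, capture, analyze, target, correct):
    """Sketch of method 600 with hardware-facing steps injected as callables."""
    corrections = {}
    for k in range(n):                # block 660: loop until all subpixels done
        display_pattern(k, n)         # block 620: light the kth 1/n pattern
        image = capture()             # block 630: capture at least one image
        measured = analyze(image)     # block 640: per-subpixel (Cx, Cy, Lv)
        for subpixel, value in measured.items():
            # block 650: compare with target values and derive a correction
            corrections[subpixel] = correct(value, target[subpixel])
    return corrections                # block 670: send/store the corrections

# Usage with trivial stand-ins: a 4-subpixel "display", n = 2 patterns.
shown = []
result = calibrate_display(
    2,
    display_pattern=lambda k, n: shown.append(k),
    capture=lambda: shown[-1],
    analyze=lambda k: {i: 0.8 for i in range(k, 4, 2)},  # half the subpixels per pattern
    target={i: 1.0 for i in range(4)},
    correct=lambda measured, tgt: tgt / measured,
)
assert sorted(result) == [0, 1, 2, 3]  # every subpixel measured across the patterns
```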

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Electroluminescent Light Sources (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Control Of El Displays (AREA)

Abstract

The present disclosure relates to methods and systems for measuring and correcting electronic visual displays. A method in accordance with one embodiment of the present technology includes generating a series of patterns for illuminating proper subsets of the light emitting elements of the display, such as regular grids of nonadjacent activated light emitting elements with the elements in between deactivated. For each generated pattern, an imaging device captures information about the activated light emitting elements. A computing device analyzes the captured information, comparing the output of the activated light emitting elements to target output values, and determines correction factors to calibrate the display to better achieve the target output values. In some embodiments, the correction factors may be uploaded to firmware controlling the display or used to process images to be shown on the display.

Description

    TECHNICAL FIELD
  • The present disclosure relates generally to electronic visual displays, and more particularly, to methods and systems for measuring and calibrating the output from such displays.
  • BACKGROUND
  • Electronic visual displays (“displays”) have become commonplace. Displays of increasingly high resolution are used in a wide variety of contexts, from personal electronics with screens a few inches or smaller in size to computer screens and televisions several feet across to scoreboards and billboards covering hundreds of square feet. Some displays are assembled from a series of smaller panels, each of which may further consist of a series of internally connected modules. Virtually all displays are made up of arrays of individual light-emitting elements called “pixels.” In turn, each pixel is made up of a plurality of light-emitting points (e.g., one red, one green, and one blue). The light-emitting points are termed “subpixels.”
  • It is often desirable for a display to be calibrated. For example, calibration may improve the uniformity of the display and improve consistency between displays. During calibration of a display (or, e.g., of each module of a display), the color and brightness of each pixel or subpixel is measured. Adjustments are determined so the pixels can display particular colors at desired brightness levels. The adjustments are then stored (e.g., in software or firmware that controls the display or module), so that those adjustments or correction factors can be applied.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a schematic view of an electronic visual display calibration system configured in accordance with an embodiment of the disclosure.
  • FIG. 2 is an isometric front view of an electronic visual display calibration system configured in accordance with an embodiment of the disclosure.
  • FIG. 3 is a schematic block diagram of the electronic visual display calibration system of FIG. 1.
  • FIGS. 4A and 4B are enlarged partial front views of a portion of an electronic visual display configured to be used with embodiments of the disclosure.
  • FIG. 5 is a diagram of a color gamut triangle.
  • FIG. 6 is a flow diagram of a method or process configured in accordance with an embodiment of the disclosure.
  • DETAILED DESCRIPTION A. Overview
  • The following disclosure describes electronic visual display calibration systems and associated methods for measuring and calibrating electronic visual displays. As described in greater detail below, a display measurement method and/or system configured in accordance with one aspect of the disclosure is configured to measure the luminance and the color of the individual pixels or subpixels of an electronic visual display, such as a high-resolution liquid crystal display (“LCD”) or an organic light-emitting diode (“OLED”) display.
  • The inventors have recognized that when pixels are very closely spaced, such as is typical in many LCDs, OLED displays, and high resolution light-emitting diode (“LED”) displays, measuring individual pixel or subpixel attributes becomes more difficult. Accordingly, embodiments of the present technology use a pattern generator (e.g., standalone hardware test equipment, a logic analyzer add-on module, a computer peripheral, software in a computing device or controller connected to the display, output from a serial digital interface (“SDI”), digital video interface (“DVI”), or high-definition multimedia interface (“HDMI”) port, etc.) to display only a desired subset of pixels or subpixels to be measured. In some embodiments, for example, the pattern generator illuminates only every third or every fourth pixel of the display, such that the pixels between them remain off. The technology uses an imaging device (which typically has a considerably higher resolution than the display itself) to measure only the illuminated pixels (and/or subpixels). Because only a subset of the pixels are illuminated and measured at once, the display under test effectively has a much lower pixel resolution. After measuring the illuminated pixels, the pattern can be shifted (e.g., by one pixel) and the measurements repeated until all of the pixels of the display have been measured.
  • In one particular embodiment, for example, if every fifth pixel of a 1,920×1,080 pixel high definition television (“HDTV”) display is illuminated at a time, then the effective resolution is 384×216 pixels. To measure the illuminated pixels with an imaging device having a resolution about six times greater than the display's pixel resolution, a camera with a resolution of approximately 2,300×1,300—i.e., a camera readily available for a reasonable price—could potentially be used. In contrast with the present technology, however, many conventional approaches for analyzing the 1,920×1,080 pixel HDTV display would require a camera having a resolution of approximately 12,000×6,000, or 72,000,000 pixels. Such a camera (with resolution high enough for the display to be measured) is expected to be prohibitively expensive and/or unavailable. As a result, measuring and calibrating such displays using conventional techniques is often impractical and/or too expensive.
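The resolution arithmetic in this example can be made explicit. The sketch below is illustrative; the function names and the six-times oversampling factor (parameterized here) are taken from the example rather than from any stated implementation:

```python
def effective_resolution(width, height, n):
    """Effective display resolution when only every nth pixel is lit."""
    return width // n, height // n

def camera_resolution_needed(width, height, oversample=6):
    """Camera resolution at roughly `oversample` camera pixels per
    measured display pixel in each direction."""
    return width * oversample, height * oversample

# Every fifth pixel of a 1,920 x 1,080 HDTV display:
assert effective_resolution(1920, 1080, 5) == (384, 216)
# A ~2,300 x 1,300 camera suffices for the 384 x 216 effective image:
assert camera_resolution_needed(384, 216) == (2304, 1296)
# Measuring every pixel at once would need roughly 12,000 x 6,000:
assert camera_resolution_needed(1920, 1080) == (11520, 6480)
```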
  • Another conventional approach for measuring such large or high-resolution displays is to divide the display (or its constituent panels or modules) into sections small enough that the imaging system has sufficient resolving power to enable an accurate measurement of the pixels or subpixels of each section. Using this approach, the imaging device (or the display being measured) is generally mounted on an x-y stage for horizontal and vertical positioning, or rotated to align to each section being measured. Moving or rotating either the camera or the display, however, requires additional, potentially expensive equipment, as well as time to perform the movement or rotation and to align the imaging device to the display. Furthermore, this technique can lead to slight mismatches or discontinuities of measurement between the individual sections. If the measurements are used for uniformity correction, such mismatches must be addressed, typically with further measurements and/or post-processing of the display measurement data.
  • In contrast with conventional techniques, embodiments of the present technology are expected to enable precise measurement of individual pixel or subpixel output for any display (e.g., an OLED display) without requiring expensive, high resolution imaging devices, and without additional equipment for moving the relationship between the imaging device and the display, time for moving and aligning them, or mismatches between sections of the display.
  • Certain details are set forth in the following description and in FIGS. 1-6 to provide a thorough understanding of various embodiments of the disclosure. However, other details describing well-known structures and systems often associated with visual displays and related optical equipment and/or other aspects of visual display calibration systems are not set forth below to avoid unnecessarily obscuring the description of various embodiments of the disclosure.
  • Many of the details, dimensions, angles, and other features shown in the Figures are merely illustrative of particular embodiments of the disclosure. Accordingly, other embodiments can have other details, dimensions, angles, and features without departing from the spirit or scope of the present disclosure. In addition, those of ordinary skill in the art will appreciate that further embodiments of the disclosure can be practiced without several of the details described below.
  • B. Embodiments of Electronic Visual Display Calibration Systems and Associated Methods for Calibrating Electronic Visual Displays
  • FIG. 1 is a schematic view of an electronic visual display calibration system (“the system”) 100 configured in accordance with an embodiment of the disclosure. The system 100 is configured to collect, manage, and/or analyze display data for the purpose of processing image patterns (e.g., static image patterns, video streams comprised of a series of image patterns, etc.) that are shown on an electronic visual display 150. The pattern 160 shown on the display 150 is generated by a pattern generator 110. The display 150 can be, for example, a large electronic display or sign composed of smaller panels or modules. The pattern 160 generated by the pattern generator 110 and displayed on the display 150 illustrated in FIG. 1 is described in further detail below in connection with FIGS. 4A-4B.
  • In the embodiment illustrated in FIG. 1, the system 100 includes a computing device 130 operably coupled to an imaging device 120 (e.g., an imaging colorimeter or other photometer). In the illustrated embodiment, the imaging device 120 is spaced apart from the display 150 (e.g., so that the entire display 150 is within the field of view of the imaging device 120, and, in the case of a large elevated sign, for improving the convenience of measurement) and configured to sense or capture display information (e.g., color data, luminance data, etc.) from selectively illuminated pixels or subpixels 160 of the display 150. For example, the pattern generator 110 can illuminate every nth pixel of the display 150. The captured display information is transferred from the imaging device 120 to the computing device 130. After capturing or otherwise sensing the display information for one pattern 160, the pattern generator 110 can generate additional patterns 160 on the display 150. For example, the pattern generator 110 can illuminate every next nth pixel of the display 150. This process can be repeated (e.g., n times) until the computing device 130 obtains display information for all the pixels or subpixels of the entire display 150. The computing device 130 is configured to store, manage, and/or analyze the display information from each pattern 160 to determine one or more correction factors for the display 150 or for its pixels or subpixels.
  • In some embodiments, the correction factors for the display 150 are applied to the firmware and/or software controlling the display 150 to calibrate the display 150. In alternate embodiments, the corrections are applied in real time to a video stream to be shown on the display 150. In such embodiments, the technology includes comparing the actual display value with a desired display value for the one or more portions of the display 150, and determining a correction factor for the pixels or subpixels of the display 150 as determined from the measurements of the patterns 160 described above. The technology processes or adjusts the image with the correction factors for the corresponding pixels of the display 150. After processing the image to account for variations in the display 150, the technology can further include transmitting the image to the display 150 and showing the image on the display 150. Accordingly, in some embodiments, the image on the display 150 can be presented according to the desired display values without modifying or calibrating the actual display 150.
  • One of ordinary skill in the art will understand that although the system 100 illustrated in FIG. 1 includes separate components (e.g., the pattern generator 110, the imaging device 120, and the computing device 130), in other embodiments the system 100 can incorporate more or less than three components. Moreover, the various components can be further divided into subcomponents, or the various components and functions may be combined and integrated. In addition, these components can communicate via wired or wireless communication, as well as by information contained in storage media. The various components and features of the electronic visual display calibration system 100 are described in greater detail below in connection with FIG. 3.
  • FIG. 2 is an isometric front view of an electronic visual display calibration system 200 configured in accordance with an embodiment of the disclosure. The system 200 is configured to perform correction of the brightness and color of light-emitting elements that are used in electronic visual displays. In one embodiment, the calibration system 200 can include a test pattern generator 210, a test station 240, an interface 230, and an electronic visual display 250. In the embodiment illustrated in FIG. 2, the calibration system 200 is designed to calibrate a display 250 that is placed within the test station 240. In alternate embodiments, it is possible to calibrate multiple displays or multiple panels of a larger display within the test station 240.
  • The test pattern generator 210 is configured to generate a series of test patterns 260, each of which illuminates a proper subset of the pixels or subpixels of the display 250. The test station 240 is configured to capture a series of images from an imaging area covering all of the display 250. The captured image data is transferred from the test station 240 to the interface 230. The interface 230 compiles and manages the image data, performs a series of calculations to determine the appropriate correction factors that should be made to the image data, and then stores the data. This process is repeated until images of each of the pixels or subpixels of display 250 have been obtained. After collection of all the necessary data, the processed correction data is then uploaded from the interface 230 to the firmware and/or software controlling the display 250 and used to recalibrate the display 250.
  • In the embodiment illustrated in FIG. 2, the test station 240 can include a lightproof chamber for calibrating a display 250 in a fully-illuminated room or factory. The test station 240 can include a digital camera 220 mounted on the top portion 244 of the test station 240. The test station 240 can further include light baffles to eliminate any stray light that might be reflected off the walls of the test station chamber 242 back into the camera 220. In the illustrated embodiment, the display 250 is positioned beneath the test station 240. The test station 240 includes mechanical and electrical fixtures for receiving the display 250 and placing it in position within the test station 240 for calibration. In other embodiments, the test station 240 may be in other orientations, e.g., facing upward at a display positioned above the test station or facing horizontally. Further, in some embodiments the test station 240 may have a different arrangement and/or include different features.
  • In the illustrated embodiment, the test station 240 also incorporates a ground glass diffuser 246 positioned just above the display 250. The diffuser 246 scatters the light emitted from each subpixel in the display 250, which effectively partially integrates the emitted light angularly. Accordingly, the camera 220 is actually measuring the average light emitted into a cone rather than only the light traveling directly from each subpixel on the display 250 toward the camera 220. One advantage of this arrangement is that the display 250 will be corrected to optimize viewing over a wider angular range. The diffuser 246 is an optional component that may not be included in some embodiments.
  • The interface 230 that is operably coupled to the test station 240 is configured to manage the data that is collected, stored, and used for calculation of new correction factors that will be used to recalibrate the display 250. The interface 230 automates the operation of the pattern generator 210 and the test station 240 and writes all the data into a database. In one embodiment, the interface 230 can be a personal computer with software for pattern selection, camera control, image data acquisition, and image data analysis. Optionally, in other embodiments various devices capable of operating the software can be used, such as handheld computers.
  • FIG. 3 is a schematic block diagram of the electronic visual display calibration system 100 of FIG. 1. In the illustrated embodiment, the imaging device 120 can include a camera 320, such as a digital camera suitable for high-resolution imaging. For example, the camera 320 can include optics capable of measuring subpixels of the display 150 (which can be a few millimeters in size) from a distance of 25 meters or more. If the displayed pattern 160 does not illuminate adjacent subpixels or pixels, imaging resolution requirements for the camera 320 may be less stringent, allowing the use of a less expensive imaging device 120. In some embodiments, the camera 320 can be a CCD camera. Suitable CCD digital color cameras include ProMetric® imaging colorimeters and photometers, which are commercially available from the assignee of the present disclosure, Radiant Zemax, LLC, of Redmond, Wash. In other embodiments, the camera 320 can be a complementary metal oxide semiconductor (“CMOS”) camera, or another type of suitable camera for imaging with sufficient resolution at a certain distance from the display.
  • According to another aspect of the illustrated embodiment, the imaging device 120 can also include a lens 322. In one embodiment, for example, the lens 322 can be a reflecting telescope that is operably coupled to the camera 320 to provide sufficiently high resolution for long distance imaging of the display 150. In other embodiments, however, the lens 322 can include other suitable configurations for viewing and/or capturing display information from the display 150. Suitable imaging devices 320 and lenses 322 are disclosed in U.S. Pat. Nos. 7,907,154 and 7,911,485, both of which are incorporated herein by reference in their entireties.
  • The imaging device 120 can accordingly be positioned at a distance L from the display 150. The distance L can vary depending on the size of the display 150, and can include relatively large distances. In one embodiment, for example, the imaging device 120 can be positioned at a distance L that is generally similar to a typical viewing distance of the display 150. In a sports stadium, for example, the imaging device 120 can be positioned in a seating area facing toward the display 150. In other embodiments, however, the distance L can differ from a typical viewing distance and direction, and the imaging system 120 can be configured to account for any viewing distance and/or direction differences. In some embodiments, the imaging device 120 has a wide field of view and the distance L can be less than the width of the display 150 (e.g., approximately one meter for a typical HDTV display). In other embodiments, the imaging device 120 has a long-focus lens 322 (e.g., a telephoto lens) and the distance L can be significantly greater than the width of the display 150 (e.g., between approximately 100 and 300 meters for an outdoor billboard or video screen). In yet other embodiments, the distance L can have other values.
  • The computing device 130 is configured to cause the pattern generator 110 to send images 160 (e.g., pixel or subpixel patterns) to the display 150. In various embodiments, the pattern generator 110 is standalone hardware test equipment, a logic analyzer add-on module, a computer peripheral operably coupled to the computing device 130, or software in the computing device 130 or in a controller connected to the display 150. In other embodiments, the pattern generator 110 operates independently of the computing device 130. In alternative embodiments, the patterns 160 are provided to the display 150 via standard video signal input, e.g., using a DVI, HDMI, or SDI input to the display. The patterns 160 generated by the pattern generator 110 for displaying on the electronic visual display 150 are discussed in greater detail in connection with FIGS. 4A and 4B below.
  • Continuing with respect to FIG. 3, the computing device 130 is configured to receive, manage, store, and/or process the display data collected by the imaging device 120 (e.g., for the purpose of adjusting the appearance of images 160 that will be displayed on the display 150). In other embodiments, display data associated with the display 150, including correction factors and related data, can be processed by a computer that is separate from the imaging device 120. A typical display 150, such as a quad extended graphics array (“QXGA”)-resolution (2048×1536) visual display for example, can have over nine million subpixels that provide display data for the computing device 130 to manage and process. The pattern generator 110 may illuminate only a fraction of those subpixels at any one time, but by sending a series of patterns 160 to the display 150, information about all the subpixels will be delivered to the computing device 130. As such, the computing device 130 includes the necessary hardware and corresponding software components for managing and processing the display data. More specifically, the computing device 130 configured in accordance with an embodiment of the disclosure can include a processor 330, a memory 332, input/output devices 334, one or more sensors 336 in addition to sensors of the imaging device 120, and/or any other suitable subsystems and/or components 338 (displays, speakers, communication modules, etc.). The memory 332 can be configured to store the display data from the patterns 160 shown on the display 150. The computing device 130 includes computer readable media (e.g., memory 332, disk drives, or other storage media, excluding only a transitory, propagating signal per se) including instructions or software stored thereon that, when executed by the processor 330 or computing device 130, cause the processor 330 or computing device 130 to process an image as described herein. 
Moreover, the processor 330 can be configured for performing or otherwise controlling calculations, analysis, and any other functions associated with the methods described herein.
  • In some embodiments, the memory 332 includes software to control the imaging device 120 as well as measurement software to identify portions of the display 150 (e.g., subpixels of the display 150) and to image or otherwise extract the display data (e.g., subpixel brightness data, pixel color data, etc.). One example of suitable software for controlling the imaging device 120 and/or acquiring the display data is VisionCAL™ screen correction software, which is commercially available from the assignee of the present disclosure, Radiant Zemax, LLC, of Redmond, Wash. In other embodiments, other suitable software can be implemented with the system 100. Moreover, the memory 332 can also store one or more databases used to store the display data from the patterns 160 shown on display 150, as well as calculated correction factors for the display data. In one embodiment, for example, the database is a Microsoft Access® database designed by the assignee of the present disclosure. In other embodiments, however, the display data is stored in other types of databases or data files.
  • FIG. 4A is an enlarged partial front view of a portion of an electronic visual display 450 configured to be used with embodiments of the disclosure. The illustrated view is representative of a portion of a display 450 (e.g., display 150 (FIG. 1) or display 250 (FIG. 2)) displaying a pattern 460 a. The display 450 is made up of a large number (e.g., millions) of individual light sources or light-emitting elements or pixels 430. Each pixel 430 comprises multiple light-emitting points or subpixels 432 (identified as first, second, and third subpixels 432 a-432 c, respectively). In certain embodiments, the subpixels 432 are LEDs or OLEDs. For example, the subpixels 432 a-432 c can correspond to red, green, and blue LEDs, respectively. In other embodiments, each pixel 430 can include more or fewer than three subpixels 432. For example, some pixels 430 may have four subpixels 432 (e.g., two green subpixels, one blue subpixel, and one red subpixel, or other combinations). Pixels and subpixels may be laid out in various geometric arrangements (e.g., triangular or hexagonal arrays in various color orders, vertical or oblique stripes, etc.). Furthermore, in certain embodiments, the red, green, and blue (“RGB”) color space may not be used. Rather, a different color space can serve as the basis for processing and display of color images on the display 450. For example, the subpixels 432 may be cyan, magenta, and yellow, respectively.
  • In addition to the color level of each subpixel 432, the luminance level of each subpixel 432 can vary. Accordingly, the additive primary colors represented by a red subpixel, a green subpixel, and a blue subpixel can be selectively combined to produce the colors within the color gamut defined by a color gamut triangle, as shown in FIG. 5. For example, when only “pure” red is displayed, the green and blue subpixels may be turned on only slightly to achieve a specific chromaticity for the red color.
  • In addition, the measurement process described herein may be performed at various brightness levels. For example, in some embodiments, each pixel 430 or subpixel 432 is measured at input levels (using values from 0 to 255) of 255 (full brightness), 128 (one half brightness), 64 (one quarter brightness), and 32 (one eighth brightness). Data from such measurements can be used in calibration to achieve the same chromaticity for a particular color at various input brightness levels, or, e.g., to improve the uniformity of color and luminance response curves for each pixel or subpixel.
  • Returning to FIG. 4A, an illustrative pattern 460 a illuminates a proper subset of the pixels of the display 450. In the illustrated embodiment, every fourth pixel 430 vertically and every fourth pixel 430 horizontally is illuminated, and the pixels between are switched off. Thus, for the illustrated pattern 460 a, only one of every sixteen pixels 430 is illuminated, and the spaces between illuminated pixels are four times larger in each direction than they would be if every pixel 430 were illuminated. As a result, the effective pixel density of the display 450 is one sixteenth of the actual pixel density. For example, if pattern 460 a is displayed on a “4K Ultra HD” television display 450 having a screen resolution of 3,840×2,160 pixels (a total of approximately 8.3 million pixels (“megapixels”)), only (3,840/4)×(2,160/4) pixels, i.e., 960×540 pixels (a total of approximately five hundred thousand pixels (half a megapixel)) are lit at once. Such a reduction in the effective pixel resolution of the display 450 can permit use of imaging equipment (e.g., a camera sensor and lens) that is less sophisticated and expensive than would otherwise be required to measure the display 450.
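The effective-resolution arithmetic above can be sketched in a few lines of Python (the function name and example values are illustrative; this is not part of the patent's disclosure):

```python
def effective_resolution(width, height, step=4):
    """Resolution of the grid of illuminated pixels when every `step`-th
    pixel is lit horizontally and vertically, as in the pattern of
    FIG. 4A (illustrative sketch; names are not from the patent)."""
    return width // step, height // step

# "4K Ultra HD" display, 3,840 x 2,160 pixels, every fourth pixel lit:
w, h = effective_resolution(3840, 2160, step=4)
# w, h == 960, 540 -- roughly half a megapixel illuminated at once
```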
  • The technology displays a series of patterns to illuminate and measure each pixel or subpixel of the display at least once (and potentially multiple times, e.g., at different brightness input levels). FIG. 4B illustrates another pattern 460 b on the same enlarged partial front view of a portion of the electronic visual display 450. In the pattern 460 b of FIG. 4B, each pixel 430 that was illuminated in the pattern 460 a of FIG. 4A is switched off, and the next pixel to the right is illuminated. In the illustrated embodiment, measuring the output of each pixel 430 of the display 450 requires displaying and measuring a total of sixteen patterns (multiplied by the number of different brightness levels for each pattern). Different patterns 460 of pixels 430 and/or subpixels 432 could require a smaller or larger number of patterns 460 to ensure full coverage of the display 450. For example, a pattern that illuminates every third pixel 430 horizontally and vertically requires nine patterns to cover every pixel 430 in the display 450.
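The series of offset patterns can be sketched as follows; shifting the illuminated grid through every offset covers each pixel exactly once. This is a minimal illustration with toy display dimensions, not the patent's implementation:

```python
def pattern_offsets(step):
    """Offsets (dx, dy) for a series of patterns, each lighting every
    `step`-th pixel: step=4 gives 16 patterns, step=3 gives 9."""
    return [(dx, dy) for dy in range(step) for dx in range(step)]

def lit_pixels(width, height, step, dx, dy):
    """Pixel coordinates illuminated by the pattern at offset (dx, dy)."""
    return {(x, y)
            for y in range(dy, height, step)
            for x in range(dx, width, step)}

# On a toy 12 x 8 display, the 16 offset patterns together light every
# pixel exactly once (full, non-overlapping coverage):
seen = []
for dx, dy in pattern_offsets(4):
    seen.extend(lit_pixels(12, 8, 4, dx, dy))
```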
  • In alternative embodiments, the patterns 460 illuminate individual subpixels 432 (e.g., one or more at a time of subpixels 432 a-432 c) rather than whole pixels 430. In various embodiments, the patterns 460 are displayed and measured at more than one brightness level. Separately illuminating each subpixel 432 and measuring individual pixels or subpixels at different brightness levels correspondingly multiplies the number of required measurements. In some embodiments, patterns are tailored to a particular display or a particular measurement. The patterns are not necessarily as regular or evenly distributed as the examples illustrated in FIGS. 4A-4B, and different patterns may illuminate different numbers of pixels or subpixels.
  • In addition to color and/or luminance, the subpixels 432 may have other visual properties that can be measured and analyzed in accordance with embodiments of the present disclosure. Moreover, although the displayed patterns 460 are described above with reference to pixels 430 and subpixels 432, other embodiments of the disclosure can be used with displays having different types of light emitting elements or components.
  • FIG. 6 is a flow diagram of a method or process 600 configured in accordance with an embodiment of the disclosure. At block 610, the method includes identifying a fraction 1/n of the pixels or subpixels of the display to be illuminated for measurement. In some embodiments, the technology receives the fraction, e.g., from user input or from a configuration file. In some embodiments, the technology determines the fraction based on a heuristic and on the characteristics of the display to be measured and the measuring equipment. Such characteristics may include, e.g., the size of the display, the pixel resolution of the display, the pixel density or dot pitch (i.e., distance between pixels) of the display, the distance from the display to the imaging device, the optical resolving power or angular resolution of the imaging device, and the pixel resolution of the imaging device. An example heuristic requires that 50 pixels on the imaging device correspond to one illuminated subpixel on the display.
  • By way of example, in one embodiment the imaging device has a pixel resolution of 3,072×2,048=6,291,456 pixels. According to the heuristic that fifty pixels of resolution from the imaging device correspond to one subpixel on the display, the imaging device can capture data from 125,829 subpixels on the display (6,291,456 camera pixels/50 camera pixels per display subpixel) in a single captured image. In other embodiments, the correlation between the resolution of the imaging device and the display can vary from, e.g., 6 to 200 pixels on the imaging device corresponding to one subpixel on the display. Assuming, for example, that no other characteristic of the imaging device or its relationship to the display restricts its ability to measure the display, the technology can determine the appropriate fraction 1/n in this case by dividing 125,829 (the number of subpixels to be illuminated in each captured image) by the total number of subpixels in the display. For example, to measure a display having a pixel resolution of 1,280×720=921,600 pixels, the fraction 1/n would be 125,829/921,600=1/7.324 or (rounding the denominator up) ⅛. In other words, if ⅛ of the display's subpixels are illuminated, the total number of illuminated subpixels will be below the threshold of 125,829 subpixels that can be captured in a single image by the selected imaging device in accordance with the applicable heuristic.
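The worked example above reduces to a short calculation. The following sketch assumes the "50 camera pixels per display subpixel" heuristic from the text; the function name is illustrative:

```python
import math

def pattern_fraction_denominator(cam_w, cam_h, display_subpixels,
                                 cam_px_per_subpixel=50):
    """Smallest whole n such that 1/n of the display's subpixels fits
    in one captured image under the heuristic that each illuminated
    display subpixel needs `cam_px_per_subpixel` camera pixels."""
    per_image = (cam_w * cam_h) // cam_px_per_subpixel  # subpixels per image
    return math.ceil(display_subpixels / per_image)

# Worked example from the text: a 3,072 x 2,048 camera can capture
# 125,829 subpixels per image; a 1,280 x 720 display then needs n = 8.
n = pattern_fraction_denominator(3072, 2048, 1280 * 720)
```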
  • At block 620, the technology displays a pattern selectively illuminating 1/n of the pixels or subpixels of the display (e.g., in the example above, 1 of every 8 subpixels of the display). For example, every nth pixel (or subpixel of a particular color) may be illuminated. An example of such a pattern is described above in connection with FIG. 4A. As described above, the technology may illuminate each pixel or subpixel at various brightness levels. At block 630, the imaging device captures at least one image of the pattern of pixels or subpixels illuminated on the display. Each subpixel captured by the imaging device can be characterized, e.g., by its color value, typically expressed as chromaticity (Cx, Cy), and its brightness, typically expressed as luminance Lv. At block 640, the captured image data is analyzed by a computing device, e.g., the computing device described above in connection with FIGS. 1 and 3.
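The (Cx, Cy, Lv) characterization of each captured subpixel can be converted to tristimulus values by the standard CIE xyY relations (a general colorimetric conversion, not code from the patent):

```python
def xyY_to_XYZ(cx, cy, lv):
    """Standard CIE conversion from chromaticity (Cx, Cy) and luminance
    Lv (the Y tristimulus value) to tristimulus values (X, Y, Z)."""
    if cy == 0:
        return 0.0, 0.0, 0.0  # degenerate chromaticity; no light
    return cx * lv / cy, lv, (1.0 - cx - cy) * lv / cy

# D65 white point (Cx ~ 0.3127, Cy ~ 0.3290) at 100 cd/m^2:
X, Y, Z = xyY_to_XYZ(0.3127, 0.3290, 100.0)
# X ~ 95.0, Y == 100.0, Z ~ 108.9
```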
  • In some embodiments, the computing device compares the color and brightness of each captured pixel with target color and brightness values, e.g., points within the color gamut defined by a color gamut triangle, such as shown in FIG. 5. The actual pixel or subpixel color or brightness values may differ from desired or target display values for the display. For example, the color and luminance of individual subpixels typically vary significantly across the display, especially if the subpixels are LEDs or OLEDs. Moreover, over time the visual properties of the display may degrade or otherwise vary from desired or target display values. Accordingly, at block 650, the technology compares actual captured and analyzed values with target or desired display values for the pixels or subpixels illuminated according to the displayed pattern, and determines a correction value applicable to each analyzed pixel or subpixel.
  • Determining the correction values can include creating a correction data set or map. In some embodiments, the computing device calculates a three-by-three matrix of values for each pixel that indicate some fractional amount of power to turn on each subpixel to obtain each of the three primary colors (red, green, and blue) at target color and brightness levels. A sample matrix is displayed below:
  • Fractional values for each subpixel of a pixel
    Primary color Red Green Blue
    Red 0.60 0.10 0.05
    Green 0.15 0.70 0.08
    Blue 0.03 0.08 0.75

    For example, according to the above matrix for a particular brightness level, when a pixel of the display should be red, the technology has calculated that the display should turn on its red subpixel at 60% power, its green subpixel at 10% power, and its blue subpixel at 5% power.
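The sample matrix above can be applied as follows. The patent states only the per-primary rows; mixing the rows linearly for non-primary inputs is an assumption of this sketch, and all names are illustrative:

```python
# The sample matrix above: one row per requested primary color, giving
# the fractional drive power for the (red, green, blue) subpixels.
CORRECTION = {
    "red":   (0.60, 0.10, 0.05),
    "green": (0.15, 0.70, 0.08),
    "blue":  (0.03, 0.08, 0.75),
}

def drive_levels(matrix, r, g, b):
    """Subpixel drive fractions for a desired (r, g, b) input, mixing
    the three rows linearly (an assumption for non-primary colors)."""
    rows = (matrix["red"], matrix["green"], matrix["blue"])
    return tuple(r * rows[0][i] + g * rows[1][i] + b * rows[2][i]
                 for i in range(3))

# A "pure" red input reproduces the first row of the matrix:
# red subpixel at 60% power, green at 10%, blue at 5%.
```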
  • The determination of the correction values is based, at least in part, on the comparison between the captured and analyzed values and the target values for the display. More specifically, each correction factor can compensate for the difference between the captured and analyzed values and the corresponding target display value. For example, if the captured and analyzed value is less bright than the corresponding target display value, the correction factor can include the amount of brightness that would be required for the captured and analyzed value of the pixel or subpixel to be generally equal to the target display value. Moreover, the correction factor can correlate to the corresponding type of display value. For example, the correction value can be expressed in terms of color or brightness correction values, or in terms of other visual display property correction values. Suitable methods and systems for determining correction values or correction factors are disclosed in U.S. Pat. Nos. 7,907,154 and 7,911,485 referenced above.
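The brightness-compensation example in the paragraph above can be sketched as a simple gain. This is a deliberately simplified illustration of block 650; the full methods act on color as well (see U.S. Pat. Nos. 7,907,154 and 7,911,485, referenced in the text):

```python
def luminance_gain(measured_lv, target_lv):
    """Correction factor bringing a measured subpixel luminance up (or
    down) to the target value (simplified sketch; luminance only)."""
    return target_lv / measured_lv

# A subpixel measured at 80 cd/m^2 against a 100 cd/m^2 target:
gain = luminance_gain(80.0, 100.0)  # drive this subpixel 1.25x harder
```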
  • At block 660, the process branches depending on whether or not all the pixels or subpixels of the display have been illuminated, captured, analyzed, and corrected as described above in blocks 620-650. If the technology illuminates a fraction 1/n of the pixels or subpixels of the display in each pattern, then at least n iterations are required to measure and calibrate the entire display. For example, after displaying a first pattern such as the pattern described above in connection with FIG. 4A, in which every nth pixel (or subpixel of a particular color) is illuminated, the technology returns to block 620. At block 620, the technology illuminates a different pattern of pixels or subpixels, e.g., a distinct subset of the pixels or subpixels of the display. For example, the technology might next display the pattern described above in connection with FIG. 4B, in which the next neighbor of every nth pixel (or subpixel of a particular color) is illuminated. The process then continues as described above.
  • After n iterations have been completed, at block 670, the method 600 can further include sending the calibration correction values to the display. In some embodiments, the correction factors are stored in firmware within the display or a controller of the display. In some embodiments, the correction factor data set or map can be saved and, e.g., provided to a third party such as the owner of the display, or used to process video images outside the display such that the display can show the processed image according to desired or target display properties without calibrating or adjusting the display itself. Suitable methods and systems for correcting images to calibrate their appearance on a particular display are disclosed in U.S. patent application Ser. No. 12/772,916, filed May 3, 2010, entitled “Methods and systems for correcting the appearance of images displayed on an electronic visual display,” which is incorporated herein in its entirety by reference. In some embodiments, the technology verifies or improves the calibration by measuring the calibrated output of each pixel or subpixel as described in blocks 610-660 above, and optionally modifying the correction factors applied to the display.
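The overall loop of blocks 620 through 660 can be sketched as below. The `measure` and `correct` callables are hypothetical stand-ins for the imaging and analysis steps, and the pattern scheme assumes the square offset grids of FIGS. 4A-4B:

```python
def calibrate(width, height, step, measure, correct):
    """Sketch of the loop in blocks 620-660: show step*step offset
    patterns so every pixel is illuminated once, capture and analyze
    each pattern, and build a per-pixel correction map."""
    corrections = {}
    for dy in range(step):
        for dx in range(step):
            lit = [(x, y)                          # block 620: pattern
                   for y in range(dy, height, step)
                   for x in range(dx, width, step)]
            captured = measure(lit)                # block 630: capture
            for pixel, value in captured.items():  # blocks 640-650
                corrections[pixel] = correct(value)
    return corrections  # ready to send to the display (block 670)

# Toy run: an 8 x 8 display, dummy luminance of 80 against a 100 target.
cmap = calibrate(8, 8, 4,
                 measure=lambda lit: {p: 80.0 for p in lit},
                 correct=lambda lv: 100.0 / lv)
```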
  • From the foregoing, it will be appreciated that specific embodiments of the disclosure have been described herein for purposes of illustration, but that various modifications may be made without deviating from the spirit and scope of the various embodiments of the disclosure. Further, while various advantages associated with certain embodiments of the disclosure have been described above in the context of those embodiments, other embodiments may also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the disclosure. Accordingly, the disclosure is not limited, except as by the appended claims.

Claims (25)

1. A method in a computing system having a pattern generator and an image capture device for calibrating a visual display comprising an array of a number of pixels and corresponding subpixels, the method comprising:
identifying a fraction of the number of pixels and corresponding subpixels of the display;
generating, by the pattern generator, patterns for illuminating proper subsets of the pixels and corresponding subpixels of the display, such that—
each pattern illuminates the identified fraction of the number of pixels and corresponding subpixels of the display, and
each of the pixels and corresponding subpixels of the display is illuminated in at least one pattern; and
for each generated pattern—
illuminating subpixels of the display according to the generated pattern;
capturing, by the image capture device, information about the illuminated subpixels;
analyzing, by the computing system, the captured information about the illuminated subpixels; and
calculating correction factors for illuminated pixels and corresponding subpixels.
2. The method of claim 1, further comprising using the correction factors to calibrate the visual display.
3. The method of claim 2 wherein using the correction factors to calibrate the visual display comprises uploading the correction factors to firmware or software controlling the display.
4. The method of claim 2 wherein using the correction factors to calibrate the visual display comprises applying the correction factors to process an image to be shown on the display.
5. The method of claim 4, further comprising applying the correction factors to process substantially every image to be shown on the display.
6. The method of claim 1 wherein the subpixels are light-emitting diodes.
7. The method of claim 1 wherein the subpixels are organic light-emitting diodes.
8. The method of claim 1 wherein identifying a fraction of the number of pixels and corresponding subpixels of the display comprises receiving input specifying the fraction.
9. The method of claim 1 wherein identifying a fraction of the number of pixels and corresponding subpixels of the display comprises:
determining characteristics of the display and of measurement equipment for capturing information about the illuminated subpixels; and
calculating the fraction based on the determined characteristics.
10. The method of claim 1 wherein the fraction is ¼ or smaller.
11. The method of claim 1 wherein a pattern comprises a regular grid of nonadjacent illuminated pixels.
12. The method of claim 1 wherein each pattern comprises a distinct set of nonadjacent illuminated pixels.
13. The method of claim 1 wherein a pattern comprises illuminated subpixels that are substantially evenly distributed across the display.
14. The method of claim 1, further comprising, for each generated pattern, illuminating subpixels of the display according to the generated pattern at more than one brightness level.
15. The method of claim 14 wherein the brightness levels comprise full brightness, one-half brightness, one-quarter brightness, and one-eighth brightness.
16. The method of claim 1 wherein capturing information about the illuminated subpixels comprises measuring the illuminated subpixels using an imaging colorimeter.
17. The method of claim 1 wherein analyzing the captured information about the illuminated subpixels comprises:
locating and registering illuminated subpixels of the display; and
determining a chromaticity value and a luminance value for each registered subpixel.
18. The method of claim 17 wherein calculating correction factors for illuminated pixels and corresponding subpixels comprises:
converting the chromaticity and luminance value for each registered subpixel value to measured tristimulus values;
converting a target chromaticity value and a target luminance value for a given color to target tristimulus values; and
calculating correction factors for each registered subpixel based on a difference between the measured tristimulus values and the target tristimulus values.
19. The method of claim 18 wherein correction factors for each registered subpixel comprise a three by three matrix of values that indicate fractional amounts of power to turn on each registered subpixel for a given color and brightness level.
20. The method of claim 1 wherein the illuminating and capturing are performed in a testing station configured to block out or inhibit ambient light.
21. An apparatus for measuring and calibrating a visual display having pixels and corresponding subpixels, the apparatus comprising:
a pattern generator operably coupled to the display, wherein the pattern generator is configured to illuminate a proper subset of the pixels and corresponding subpixels of the display;
an imaging device configured to capture information about pixels and corresponding subpixels of the display illuminated by the pattern generator; and
a computing device operably coupled to the pattern generator and to the imaging device, wherein the computing device comprises a processor and a computer-readable medium having instructions stored thereon that, when executed by the processor—
cause the pattern generator to illuminate a proper subset of the pixels and corresponding subpixels of the display;
cause the imaging device to capture information about the illuminated proper subset of the pixels and corresponding subpixels of the display;
analyze the captured information about the illuminated subpixels; and
calculate correction factors for the illuminated subpixels.
22. The apparatus of claim 21 wherein the pattern generator comprises standalone test equipment.
23. The apparatus of claim 21 wherein the pattern generator comprises software in the computing device, such that the computing device is operably coupled to the display and configured to transmit patterns to the display.
24. The apparatus of claim 21, further comprising a testing station configured to receive at least a portion of the display being measured and calibrated and block ambient light to the display during processing.
25-31. (canceled)


KR20200029571A (en) * 2017-08-24 2020-03-18 래디언트 비전 시스템즈, 엘엘씨 Methods and systems for measuring electronic visual displays using fractional pixels
CN111066062A (en) * 2017-08-24 2020-04-24 雷迪安特视觉系统有限公司 Method and system for measuring electronic visual displays using fractional pixels
WO2019040310A1 (en) * 2017-08-24 2019-02-28 Radiant Vision Systems, LLC Methods and systems for measuring electronic visual displays using fractional pixels
US10971044B2 (en) * 2017-08-24 2021-04-06 Radiant Vision Systems, LLC Methods and systems for measuring electronic visual displays using fractional pixels
KR102377250B1 (en) * 2017-08-24 2022-03-22 래디언트 비전 시스템즈, 엘엘씨 Methods and systems for measuring electronic visual displays using fractional pixels
US10262605B2 (en) 2017-09-08 2019-04-16 Apple Inc. Electronic display color accuracy compensation

Also Published As

Publication number Publication date
US20140347408A1 (en) 2014-11-27
US9135851B2 (en) 2015-09-15
WO2014159134A1 (en) 2014-10-02
CN105247607A (en) 2016-01-13
US8836797B1 (en) 2014-09-16

Similar Documents

Publication Publication Date Title
US8836797B1 (en) Methods and systems for measuring and correcting electronic visual displays
US7911485B2 (en) Method and apparatus for visual display calibration system
US8465335B2 (en) Color calibration system for a video display
WO2018016572A1 (en) Display correction apparatus, program, and display correction system
US7907154B2 (en) Method and apparatus for on-site calibration of visual displays
US9210418B2 (en) Method and apparatus for calibrating multi-spectral sampling system
US8531381B2 (en) Methods and systems for LED backlight white balance
CN105185302B (en) Lamp position deviation modification method and its application between monochrome image
US20100207865A1 (en) Systems and methods for display device backlight compensation
US9508281B2 (en) Apparatus and method for image analysis and image display
CN105551431A (en) LED display module uniformity correction method
TW201903743A (en) Optical compensation apparatus applied to panel and operating method thereof
US20110267365A1 (en) Methods and systems for correcting the appearance of images displayed on an electronic visual display
KR102377250B1 (en) Methods and systems for measuring electronic visual displays using fractional pixels
US20100315429A1 (en) Visual display measurement and calibration systems and associated methods
US20100079365A1 (en) Methods and systems for LED backlight white balance
RU2008141365A (en) METHOD AND DEVICE FOR MEASURING SURFACE QUALITY OF SUBSTRATE
Zhao et al. Perceptual spatial uniformity assessment of projection displays with a calibrated camera
CN116777910B (en) Display screen sub-pixel brightness extraction precision evaluation method and system and electronic equipment
US20240331185A1 (en) Method of obtaining brightness information of display panel and related panel detection system
CN117672161A (en) Brightness and chrominance compensation device and method for display
CN116413008A (en) Display screen full gray-scale optical information acquisition method and device and display control equipment
Saha et al. Characterization of mobile display systems for use in medical imaging

Legal Events

Date Code Title Description
AS Assignment
Owner name: RADIANT ZEMAX, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RYKOWSKI, RONALD F.;REEL/FRAME:030388/0853
Effective date: 20130321

AS Assignment
Owner name: FIFTH THIRD BANK, COLORADO
Free format text: SECURITY AGREEMENT;ASSIGNOR:RADIANT ZEMAX, LLC;REEL/FRAME:032264/0632
Effective date: 20140122

STCF Information on status: patent grant
Free format text: PATENTED CASE

AS Assignment
Owner name: RADIANT VISION SYSTEMS, LLC, F/K/A RADIANT ZEMAX,
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:FIFTH THIRD BANK;REEL/FRAME:036242/0107
Effective date: 20150803

FEPP Fee payment procedure
Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

MAFP Maintenance fee payment
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551)
Year of fee payment: 4

MAFP Maintenance fee payment
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Year of fee payment: 8