US20140267784A1 - Methods and systems for measuring and correcting electronic visual displays - Google Patents
- Publication number
- US20140267784A1 (application Ser. No. 13/830,678)
- Authority
- US
- United States
- Prior art keywords
- display
- subpixels
- pixels
- illuminated
- pattern
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/22—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources
- G09G3/30—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/006—Electronic inspection or testing of displays and display drivers, e.g. of LED or LCD displays
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/004—Diagnosis, testing or measuring for television systems or their details for digital television systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N17/00—Diagnosis, testing or measuring for television systems or their details
- H04N17/04—Diagnosis, testing or measuring for television systems or their details for receivers
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0233—Improving the luminance or brightness uniformity across the screen
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0693—Calibration of display systems
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/145—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light originating from the display screen
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10T—TECHNICAL SUBJECTS COVERED BY FORMER US CLASSIFICATION
- Y10T29/00—Metal working
- Y10T29/49—Method of mechanical manufacture
- Y10T29/49002—Electrical device making
- Y10T29/49004—Electrical device making including measuring or testing of device or component part
Definitions
- the present disclosure relates generally to electronic visual displays, and more particularly, to methods and systems for measuring and calibrating the output from such displays.
- Displays of increasingly high resolution are used in a wide variety of contexts, from personal electronics with screens a few inches or smaller in size to computer screens and televisions several feet across to scoreboards and billboards covering hundreds of square feet.
- Some displays are assembled from a series of smaller panels, each of which may further consist of a series of internally connected modules.
- Virtually all displays are made up of arrays of individual light-emitting elements called “pixels.”
- each pixel is made up of a plurality of light-emitting points (e.g., one red, one green, and one blue). The light-emitting points are termed “subpixels.”
- it is often desirable for a display to be calibrated. For example, calibration may improve the uniformity of the display and improve consistency between displays.
- the color and brightness of each pixel or subpixel is measured. Adjustments are determined so the pixels can display particular colors at desired brightness levels. The adjustments are then stored (e.g., in software or firmware that controls the display or module), so that those adjustments or correction factors can be applied.
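- As a sketch of how stored correction factors might be applied to image data, the following Python fragment scales each subpixel value by a per-subpixel gain; the nested-list layout, the simple multiplicative gain model, and the function name are illustrative assumptions, not the specific method of the disclosure.

```python
def apply_correction(frame, gains):
    """Scale each subpixel value by its correction gain, clamping to 0-255.

    frame and gains are nested lists of the same shape:
    frame[row][col][channel] -> input level, gains[...] -> per-subpixel gain.
    """
    return [[[min(255, round(v * g)) for v, g in zip(px, gpx)]
             for px, gpx in zip(frow, grow)]
            for frow, grow in zip(frame, gains)]

frame = [[[255, 128, 64]]]       # one pixel, RGB input levels
gains = [[[0.9, 1.0, 1.1]]]      # measured per-subpixel correction factors
print(apply_correction(frame, gains))  # -> [[[230, 128, 70]]]
```

In practice such gains would live in the display's firmware or in a real-time video pipeline, as the surrounding text describes.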
- FIG. 1 is a schematic view of an electronic visual display calibration system configured in accordance with an embodiment of the disclosure.
- FIG. 2 is an isometric front view of an electronic visual display calibration system configured in accordance with an embodiment of the disclosure.
- FIG. 3 is a schematic block diagram of the electronic visual display calibration system of FIG. 1 .
- FIGS. 4A and 4B are enlarged partial front views of a portion of an electronic visual display configured to be used with embodiments of the disclosure.
- FIG. 5 is a diagram of a color gamut triangle.
- FIG. 6 is a flow diagram of a method or process configured in accordance with an embodiment of the disclosure.
- a display measurement method and/or system configured in accordance with one aspect of the disclosure is configured to measure the luminance and the color of the individual pixels or subpixels of an electronic visual display, such as a high-resolution liquid crystal display (“LCD”) or an organic light-emitting diode (“OLED”) display.
- a pattern generator e.g., standalone hardware test equipment, a logic analyzer add-on module, a computer peripheral, software in a computing device or controller connected to the display, output from a serial digital interface (“SDI”), digital video interface (“DVI”) or high-definition multimedia interface (“HDMI”) port, etc.
- the pattern generator illuminates only every third or every fourth pixel of the display, such that the pixels between them remain off.
- the technology uses an imaging device (which typically has a considerably higher resolution than the display itself) to measure only the illuminated pixels (and/or subpixels). Because only a subset of the pixels are illuminated and measured at once, the display under test effectively has a much lower pixel resolution. After measuring the illuminated pixels, the pattern can then be shifted (e.g., by one pixel) and the measurements repeated until all of the pixels of the display have been measured.
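- The shift-and-repeat scheme described above can be sketched as follows; the function names and the set-of-coordinates representation are illustrative assumptions.

```python
def make_pattern(width, height, n, dx, dy):
    """Return the set of (x, y) pixels lit for one pattern: every n-th
    pixel horizontally and vertically, starting at offset (dx, dy)."""
    return {(x, y)
            for y in range(dy, height, n)
            for x in range(dx, width, n)}

def all_patterns(width, height, n):
    """Yield the n*n shifted patterns that together cover every pixel."""
    for dy in range(n):
        for dx in range(n):
            yield make_pattern(width, height, n, dx, dy)

# Over all n*n shifted patterns, every pixel of a small test display
# is illuminated exactly once.
lit = [p for pattern in all_patterns(8, 8, 4) for p in pattern]
assert len(lit) == 8 * 8 and len(set(lit)) == 8 * 8
```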
- e.g., when every fifth pixel of a 1,920 × 1,080 display is illuminated in each direction, the effective resolution is 384 × 216 pixels.
- a camera with a resolution of approximately 2,300 × 1,300 (i.e., a camera readily available for a reasonable price) could potentially be used.
- many conventional approaches for analyzing the 1,920 × 1,080 pixel HDTV display would require a camera having a resolution of approximately 12,000 × 6,000, or 72,000,000 pixels. Such a camera (with resolution high enough for the display to be measured) is expected to be prohibitively expensive and/or unavailable. As a result, measuring and calibrating such displays using conventional techniques is often impractical and/or too expensive.
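- The resolution arithmetic behind these figures can be reproduced as follows, assuming (as the quoted numbers imply) roughly six camera pixels per illuminated display pixel along each axis; the constant and function name are illustrative.

```python
CAMERA_PX_PER_LIT_PX = 6  # assumed per-axis sampling factor (illustrative)

def required_camera_resolution(display_w, display_h, n=1):
    """Camera resolution needed when only every n-th pixel is lit."""
    return (display_w // n * CAMERA_PX_PER_LIT_PX,
            display_h // n * CAMERA_PX_PER_LIT_PX)

# Conventional approach (all pixels lit, n=1): ~12,000 x 6,000 camera pixels.
print(required_camera_resolution(1920, 1080, n=1))  # -> (11520, 6480)
# Every fifth pixel lit (n=5): effective 384 x 216 display resolution,
# so a ~2,300 x 1,300 camera suffices.
print(required_camera_resolution(1920, 1080, n=5))  # -> (2304, 1296)
```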
- Another conventional approach for measuring such large or high-resolution displays is to divide the display (or its constituent panels or modules) into sections small enough that the imaging system has sufficient resolving power to enable an accurate measurement of the pixels or subpixels of each section.
- either the imaging device or the display being measured must be repositioned for each section. The imaging device is generally mounted on an x-y stage for horizontal and vertical positioning, or rotated to align to each section being measured.
- Moving or rotating either the camera or the display requires additional, potentially expensive equipment, as well as time to perform the movement or rotation and to align the imaging device to the display.
- this technique can lead to slight mismatches or discontinuities of measurement between the individual sections. If the measurements are used for uniformity correction, such mismatches must be addressed, typically with further measurements and/or post-processing the display measurement data.
- embodiments of the present technology are expected to enable precise measurement of individual pixel or subpixel output for any display (e.g., an OLED display) without requiring expensive, high resolution imaging devices, and without additional equipment for moving the relationship between the imaging device and the display, time for moving and aligning them, or mismatches between sections of the display.
- FIGS. 1-6 Certain details are set forth in the following description and in FIGS. 1-6 to provide a thorough understanding of various embodiments of the disclosure. However, other details describing well-known structures and systems often associated with visual displays and related optical equipment and/or other aspects of visual display calibration systems are not set forth below to avoid unnecessarily obscuring the description of various embodiments of the disclosure.
- FIG. 1 is a schematic view of an electronic visual display calibration system (“the system”) 100 configured in accordance with an embodiment of the disclosure.
- the system 100 is configured to collect, manage, and/or analyze display data for the purpose of processing image patterns (e.g., static image patterns, video streams comprised of a series of image patterns, etc.) that are shown on an electronic visual display 150 .
- the pattern 160 shown on the display 150 is generated by a pattern generator 110 .
- the display 150 can be, for example, a large electronic display or sign composed of smaller panels or modules.
- the pattern 160 generated by the pattern generator 110 and displayed on the display 150 illustrated in FIG. 1 is described in further detail below in connection with FIGS. 4A-4B .
- the system 100 includes a computing device 130 operably coupled to an imaging device 120 (e.g., an imaging colorimeter or other photometer).
- the imaging device 120 is spaced apart from the display 150 (e.g., so that the entire display 150 is within the field of view of the imaging device 120 , and, in the case of a large elevated sign, for improving the convenience of measurement) and configured to sense or capture display information (e.g., color data, luminance data, etc.) from selectively illuminated pixels or subpixels 160 of the display 150 .
- the pattern generator 110 can illuminate every nth pixel of the display 150 .
- the captured display information is transferred from the imaging device 120 to the computing device 130 .
- the pattern generator 110 can generate additional patterns 160 on the display 150 .
- the pattern generator 110 can then shift the pattern (e.g., by one pixel) and illuminate the next set of every nth pixel of the display 150 . This process can be repeated (e.g., n times) until the computing device 130 obtains display information for all the pixels or subpixels of the entire display 150 .
- the computing device 130 is configured to store, manage, and/or analyze the display information from each pattern 160 to determine one or more correction factors for the display 150 or for its pixels or subpixels.
- the correction factors for the display 150 are applied to the firmware and/or software controlling the display 150 to calibrate the display 150 .
- the corrections are applied in real time to a video stream to be shown on the display 150 .
- the technology includes comparing the actual display value with a desired display value for the one or more portions of the display 150 , and determining a correction factor for the pixels or subpixels of the display 150 as determined from the measurements of the patterns 160 described above.
- the technology processes or adjusts the image with the correction factors for the corresponding pixels of the display 150 .
- the technology can further include transmitting the image to the display 150 and showing the image on the display 150 . Accordingly, in some embodiments, the image on the display 150 can be presented according to the desired display values without modifying or calibrating the actual display 150 .
- although the system 100 illustrated in FIG. 1 includes separate components (e.g., the pattern generator 110 , the imaging device 120 , and the computing device 130 ), in other embodiments the system 100 can incorporate more or fewer than three components. Moreover, the various components can be further divided into subcomponents, or the various components and functions may be combined and integrated. In addition, these components can communicate via wired or wireless communication, as well as by information contained in storage media. The various components and features of the electronic visual display calibration system 100 are described in greater detail below in connection with FIG. 3 .
- FIG. 2 is an isometric front view of an electronic visual display calibration system 200 configured in accordance with an embodiment of the disclosure.
- the system 200 is configured to perform correction of the brightness and color of light-emitting elements that are used in electronic visual displays.
- the calibration system 200 can include a test pattern generator 210 , a test station 240 , an interface 230 , and an electronic visual display 250 .
- the calibration system 200 is designed to calibrate a display 250 that is placed within the test station 240 . In alternate embodiments, it is possible to calibrate multiple displays or multiple panels of a larger display within the test station 240 .
- the test pattern generator 210 is configured to generate a series of test patterns 260 , each of which illuminates a proper subset of the pixels or subpixels of the display 250 .
- the test station 240 is configured to capture a series of images from an imaging area covering all of the display 250 .
- the captured image data is transferred from the test station 240 to the interface 230 .
- the interface 230 compiles and manages the image data, performs a series of calculations to determine the appropriate correction factors that should be made to the image data, and then stores the data. This process is repeated until images of each of the pixels or subpixels of display 250 have been obtained.
- the processed correction data is then uploaded from the interface 230 to the firmware and/or software controlling the display 250 and used to recalibrate the display 250 .
- the test station 240 can include a lightproof chamber for calibrating a display 250 in a fully-illuminated room or factory.
- the test station 240 can include a digital camera 220 mounted on the top portion 244 of the test station 240 .
- the test station 240 can further include light baffles to eliminate any stray light that might be reflected off the walls of the test station chamber 242 back into the camera 220 .
- the display 250 is positioned beneath the test station 240 .
- the test station 240 includes mechanical and electrical fixtures for receiving the display 250 and placing it in position within the test station 240 for calibration.
- the test station 240 may be in other orientations, e.g., facing upward at a display positioned above the test station or facing horizontally. Further, in some embodiments the test station 240 may have a different arrangement and/or include different features.
- the test station 240 also incorporates a ground glass diffuser 246 positioned just above the display 250 .
- the diffuser 246 scatters the light emitted from each subpixel in the display 250 , which effectively partially integrates the emitted light angularly. Accordingly, the camera 220 is actually measuring the average light emitted into a cone rather than only the light traveling directly from each subpixel on the display 250 toward the camera 220 .
- One advantage of this arrangement is that the display 250 will be corrected to optimize viewing over a wider angular range.
- the diffuser 246 is an optional component that may not be included in some embodiments.
- the interface 230 that is operably coupled to the test station 240 is configured to manage the data that is collected, stored, and used for calculation of new correction factors that will be used to recalibrate the display 250 .
- the interface 230 automates the operation of the pattern generator 210 and the test station 240 and writes all the data into a database.
- the interface 230 can be a personal computer with software for pattern selection, camera control, image data acquisition, and image data analysis.
- various devices capable of operating the software can be used, such as handheld computers.
- FIG. 3 is a schematic block diagram of the electronic visual display calibration system 100 of FIG. 1 .
- the imaging device 120 can include a camera 320 , such as a digital camera suitable for high-resolution imaging.
- the camera 320 can include optics capable of measuring subpixels of the display 150 (which can be a few millimeters in size) from a distance of 25 meters or more. If the displayed pattern 160 does not illuminate adjacent subpixels or pixels, imaging resolution requirements for the camera 320 may be less stringent, allowing the use of a less expensive imaging device 120 .
- the camera 320 can be a CCD camera.
- Suitable CCD digital color cameras include ProMetric® imaging colorimeters and photometers, which are commercially available from the assignee of the present disclosure, Radiant Zemax, LLC, of Redmond, Wash.
- the camera 320 can be a complementary metal oxide semiconductor (“CMOS”) camera, or another type of suitable camera for imaging with sufficient resolution at a certain distance from the display.
- the imaging device 120 can also include a lens 322 .
- the lens 322 can be a reflecting telescope that is operably coupled to the camera 320 to provide sufficiently high resolution for long distance imaging of the display 150 .
- the lens 322 can include other suitable configurations for viewing and/or capturing display information from the display 150 .
- Suitable imaging devices 320 and lenses 322 are disclosed in U.S. Pat. Nos. 7,907,154 and 7,911,485, both of which are incorporated herein by reference in their entireties.
- the imaging device 120 can accordingly be positioned at a distance L from the display 150 .
- the distance L can vary depending on the size of the display 150 , and can include relatively large distances.
- the imaging device 120 can be positioned at a distance L that is generally similar to a typical viewing distance of the display 150 .
- the imaging device 120 can be positioned in a seating area facing toward the display 150 .
- the distance L can be less than a typical viewing distance and direction, and the imaging system 120 can be configured to account for any viewing distance and/or direction differences.
- the imaging device 120 has a wide field of view and the distance L can be less than the width of the display 150 (e.g., approximately one meter for a typical HDTV display). In other embodiments, the imaging device 120 has a long-focus lens 322 (e.g., a telephoto lens) and the distance L can be significantly greater than the width of the display 150 (e.g., between approximately 100 and 300 meters for an outdoor billboard or video screen). In yet other embodiments, the distance L can have other values.
- the computing device 130 is configured to cause the pattern generator 110 to send images 160 (e.g., pixel or subpixel patterns) to the display 150 .
- the pattern generator 110 is standalone hardware test equipment, a logic analyzer add-on module, a computer peripheral operably coupled to the computing device 130 , or software in the computing device 130 or in a controller connected to the display 150 .
- the pattern generator 110 operates independently of the computing device 130 .
- the patterns 160 are provided to the display 150 via standard video signal input, e.g., using a DVI, HDMI, or SDI input to the display.
- the patterns 160 generated by the pattern generator 110 for displaying on the electronic visual display 150 are discussed in greater detail in connection with FIGS. 4A and 4B below.
- the computing device 130 is configured to receive, manage, store, and/or process the display data collected by the imaging device 120 (e.g., for the purpose of adjusting the appearance of images 160 that will be displayed on the display 150 ).
- display data associated with the display 150 can be processed by a computer that is separate from the imaging device 120 .
- a typical display 150 , such as a quad extended graphics array (“QXGA”)-resolution (2048 × 1536) visual display, for example, can have over nine million subpixels that provide display data for the computing device 130 to manage and process.
- the pattern generator 110 may illuminate only a fraction of those subpixels at any one time, but by sending a series of patterns 160 to the display 150 , information about all the subpixels will be delivered to the computing device 130 .
- the computing device 130 includes the necessary hardware and corresponding software components for managing and processing the display data.
- the computing device 130 configured in accordance with an embodiment of the disclosure can include a processor 330 , a memory 332 , input/output devices 334 , one or more sensors 336 in addition to sensors of the imaging device 120 , and/or any other suitable subsystems and/or components 338 (displays, speakers, communication modules, etc.).
- the memory 332 can be configured to store the display data from the patterns 160 shown on the display 150 .
- the computing device 130 includes computer readable media (e.g., memory 332 , disk drives, or other storage media, excluding only a transitory, propagating signal per se) including instructions or software stored thereon that, when executed by the processor 330 or computing device 130 , cause the processor 330 or computing device 130 to process an image as described herein.
- the processor 330 can be configured for performing or otherwise controlling calculations, analysis, and any other functions associated with the methods described herein.
- the memory 332 includes software to control the imaging device 120 as well as measurement software to identify portions of the display 150 (e.g., subpixels of the display 150 ) and to image or otherwise extract the display data (e.g., subpixel brightness data, pixel color data, etc.).
- suitable software for controlling the imaging device 120 and/or acquiring the display data is VisionCALTM screen correction software, which is commercially available from the assignee of the present disclosure, Radiant Zemax, LLC, of Redmond, Wash.
- other suitable software can be implemented with the system 100 .
- the memory 332 can also store one or more databases used to store the display data from the patterns 160 shown on display 150 , as well as calculated correction factors for the display data.
- the database is a Microsoft Access® database designed by the assignee of the present disclosure.
- the display data is stored in other types of databases or data files.
- FIG. 4A is an enlarged partial front view of a portion of an electronic visual display 450 configured to be used with embodiments of the disclosure.
- the illustrated view is representative of a portion of a display 450 (e.g., display 150 ( FIG. 1 ) or display 250 ( FIG. 2 )) displaying a pattern 460 a .
- the display 450 is made up of a large number (e.g., millions) of individual light sources or light-emitting elements or pixels 430 .
- Each pixel 430 comprises multiple light-emitting points or subpixels 432 (identified as first, second, and third subpixels 432 a - 432 c , respectively).
- the subpixels 432 are LEDs or OLEDs.
- the subpixels 432 a - 432 c can correspond to red, green, and blue LEDs, respectively.
- each pixel 430 can include more or fewer than three subpixels 432 .
- some pixels 430 may have four subpixels 432 (e.g., two green subpixels, one blue subpixel, and one red subpixel, or other combinations). Pixels and subpixels may be laid out in various geometric arrangements (e.g., triangular or hexagonal arrays in various color orders, vertical or oblique stripes, etc.).
- the red, green, and blue (“RGB”) color space may not be used. Rather, a different color space can serve as the basis for processing and display of color images on the display 450 .
- the subpixels 432 may be cyan, magenta, and yellow, respectively.
- the luminance level of each subpixel 432 can vary. Accordingly, the additive primary colors represented by a red subpixel, a green subpixel, and a blue subpixel can be selectively combined to produce the colors within the color gamut defined by a color gamut triangle, as shown in FIG. 5 . For example, when only “pure” red is displayed, the green and blue subpixels may be turned on only slightly to achieve a specific chromaticity for the red color.
- each pixel 430 or subpixel 432 is measured at input levels (using values from 0 to 255) of 255 (full brightness), 128 (one half brightness), 64 (one quarter brightness), and 32 (one eighth brightness). Data from such measurements can be used in calibration to achieve the same chromaticity for a particular color at various input brightness levels, or, e.g., to improve the uniformity of color and luminance response curves for each pixel or subpixel.
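- As an illustration of the listed input levels, the following sketch computes the relative luminance an idealized gamma-2.2 display would emit at each level; the gamma model is an assumption for illustration only, not the disclosure's calibration method.

```python
# Input levels named in the passage: full, 1/2, 1/4, and 1/8 brightness.
INPUT_LEVELS = [255, 128, 64, 32]

def expected_relative_luminance(level, gamma=2.2):
    """Relative luminance an idealized gamma-2.2 display emits at `level`.

    Note that on such a display, input level 128 ("one half brightness"
    in input terms) yields far less than half the peak luminance, which
    is one reason measurements at several levels are useful.
    """
    return (level / 255) ** gamma

for level in INPUT_LEVELS:
    print(level, round(expected_relative_luminance(level), 3))
# 255 -> 1.0, 128 -> about 0.22, 64 -> about 0.048, 32 -> about 0.01
```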
- an illustrative pattern 460 a illuminates a proper subset of the pixels of the display 450 .
- every fourth pixel 430 vertically and every fourth pixel 430 horizontally is illuminated, and the pixels between are switched off.
- the effective pixel density of the display 450 is one sixteenth of the actual pixel density.
- when the pattern 460 a is displayed on a “4K Ultra HD” television display 450 having a screen resolution of 3,840 × 2,160 pixels (a total of approximately 8.3 million pixels (“megapixels”)), only (3,840/4) × (2,160/4) pixels, i.e., 960 × 540 pixels (a total of approximately five hundred thousand pixels (half a megapixel)) are lit at once.
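- The pixel counts in this example can be checked directly:

```python
total = 3840 * 2160                # full 4K Ultra HD pixel count
lit   = (3840 // 4) * (2160 // 4)  # every 4th pixel each way, per pattern
print(total, lit)  # -> 8294400 518400  (~8.3 Mpx total, ~0.5 Mpx lit)
assert lit * 16 == total           # sixteen shifted patterns cover all pixels
```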
- FIG. 4B illustrates another pattern 460 b on the same enlarged partial front view of a portion of the electronic visual display 450 .
- each pixel 430 that was illuminated in the pattern 460 a of FIG. 4A is switched off, and the next pixel to the right is illuminated.
- measuring the output of each pixel 430 of the display 450 requires displaying and measuring a total of sixteen patterns (multiplied by the number of different brightness levels for each pattern).
- Different patterns 460 of pixels 430 and/or subpixels 432 could require a smaller or larger number of patterns 460 to ensure full coverage of the display 450 .
- a pattern that illuminates every third pixel 430 horizontally and vertically requires nine patterns to cover every pixel 430 in the display 450 .
- the patterns 460 illuminate individual subpixels 432 (e.g., one or more at a time of subpixels 432 a - 432 c ) rather than whole pixels 430 .
- the patterns 460 are displayed and measured at more than one brightness level. Separately illuminating each subpixel 432 and measuring individual pixels or subpixels at different brightness levels correspondingly multiplies the number of required measurements.
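- The way these factors multiply the required number of measurements can be sketched as a simple count; the function and parameter names are illustrative.

```python
def num_captures(n, subpixel_colors=1, brightness_levels=1):
    """Total captures: n*n shifted patterns, times the number of
    separately illuminated subpixel colors, times brightness levels."""
    return n * n * subpixel_colors * brightness_levels

print(num_captures(4))  # -> 16  (every 4th pixel, whole pixels, one level)
print(num_captures(3))  # -> 9   (every 3rd pixel)
# Separate R/G/B subpixel patterns at four brightness levels:
print(num_captures(4, subpixel_colors=3, brightness_levels=4))  # -> 192
```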
- patterns are tailored to a particular display or a particular measurement. The patterns are not necessarily as regular or evenly distributed as the examples illustrated in FIGS. 4A-4B , and different patterns may illuminate different numbers of pixels or subpixels.
- the subpixels 432 may have other visual properties that can be measured and analyzed in accordance with embodiments of the present disclosure.
- the displayed patterns 460 are described above with reference to pixels 430 and subpixels 432 , other embodiments of the disclosure can be used with displays having different types of light emitting elements or components.
- FIG. 6 is a flow diagram of a method or process 600 configured in accordance with an embodiment of the disclosure.
- the method includes identifying a fraction 1/n of the pixels or subpixels of the display to be illuminated for measurement.
- the technology receives the number, e.g., from user input or from a configuration file.
- the technology determines a number based on a heuristic and the characteristics of the display to be measured and the measuring equipment.
- Such characteristics may include, e.g., the size of the display, the pixel resolution of the display, the pixel density or dot pitch (i.e., distance between pixels) of the display, the distance from the display to the imaging device, the optical resolving power or angular resolution of the imaging device, and the pixel resolution of the imaging device.
- An example heuristic is that the pixel resolution of the imaging device is such that 50 pixels on the imaging device correspond to one illuminated subpixel on the display.
- the imaging device can capture data from 125,829 subpixels on the display (6,291,456 camera pixels/50 camera pixels per display subpixel) in a single captured image.
- the correlation between the resolution of the imaging device and the display can vary between, e.g., 6 to 200 pixels on the imaging device corresponding to one subpixel on the display.
- At block 620, the technology displays a pattern selectively illuminating 1/n of the pixels or subpixels of the display (e.g., in the example above, 1 of every 8 subpixels of the display). For example, every nth pixel (or subpixel of a particular color) may be illuminated. An example of such a pattern is described above in connection with FIG. 4A. As described above, the technology may illuminate each pixel or subpixel at various brightness levels. At block 630, the imaging device captures at least one image of the pattern of pixels or subpixels illuminated on the display.
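The selective-illumination step can be sketched as a simple mask computation. The following is a minimal sketch in Python, assuming a rectangular grid addressed by (x, y); the function name and the boolean-mask representation are illustrative assumptions, not part of the patent:

```python
def pattern_mask(width, height, step=4, offset_x=0, offset_y=0):
    """Boolean mask marking the pixels lit by one sparse test pattern.

    With step=4 and a (0, 0) offset, every fourth pixel horizontally and
    vertically is lit (the FIG. 4A-style pattern), i.e., 1 of every 16.
    """
    return [[x % step == offset_x and y % step == offset_y
             for x in range(width)]
            for y in range(height)]

# An 8x8 region of the display with step 4 lights 4 pixels (1 in 16):
mask = pattern_mask(8, 8, step=4)
lit = sum(cell for row in mask for cell in row)
```

Shifting the offsets (e.g., `offset_x=1`) yields the next pattern in the sequence, as in FIG. 4B.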
- Each subpixel captured by the imaging device can be characterized, e.g., by its color value, typically expressed as chromaticity (Cx, Cy), and its brightness, typically expressed as luminance Lv.
- At block 640, the captured image data is analyzed by a computing device, e.g., the computing device described above in connection with FIGS. 1 and 3.
- The computing device compares the color and brightness of each captured pixel with target color and brightness values, e.g., points within the color gamut defined by a color gamut triangle, such as shown in FIG. 5.
- The actual pixel or subpixel color or brightness values may differ from desired or target display values for the display. For example, there is typically significant variation in color or luminance of each subpixel of the display, especially if the subpixels are LEDs or OLEDs. Moreover, over time the visual properties of the display may degrade or otherwise vary from desired or target display values. Accordingly, at block 650, the technology compares actual captured and analyzed values with target or desired display values for the pixels or subpixels illuminated according to the displayed pattern, and determines a correction value applicable to each analyzed pixel or subpixel.
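One minimal form of the comparison at block 650 is a per-subpixel luminance gain. The target/measured ratio below is an illustrative choice of correction factor, not the patent's prescribed formula:

```python
def luminance_correction(measured_lv, target_lv):
    """Multiplicative correction factor for one pixel or subpixel.

    A subpixel that measured dimmer than its target gets a factor above
    1.0 (drive it harder); a brighter one gets a factor below 1.0.
    """
    if measured_lv <= 0:
        raise ValueError("measured luminance must be positive")
    return target_lv / measured_lv

# A subpixel measured at 80 cd/m^2 against a 100 cd/m^2 target:
factor = luminance_correction(80.0, 100.0)
```

The same ratio can be computed per color channel; a fuller treatment uses the matrix form described below.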
- Determining the correction values can include creating a correction data set or map.
- In some embodiments, the computing device calculates a three-by-three matrix of values for each pixel that indicates the fractional amount of power at which to turn on each subpixel to obtain each of the three primary colors (red, green, and blue) at target color and brightness levels.
- A sample matrix is displayed below:
- Fractional values for each subpixel of a pixel:

      Primary color   Red subpixel   Green subpixel   Blue subpixel
      Red             0.60           0.10             0.05
      Green           0.15           0.70             0.08
      Blue            0.03           0.08             0.75
- For example, to produce the primary color red at the target color and brightness, the technology has calculated that the display should turn on its red subpixel at 60% power, its green subpixel at 10% power, and its blue subpixel at 5% power.
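Using the sample matrix above, the sketch below blends the three per-primary rows for an arbitrary linear RGB input. Treating non-primary colors as linear combinations of the three calibrated primaries is an assumption of this sketch, not a requirement stated in the patent:

```python
# Rows of the sample correction matrix: the fractional drive levels for
# the (red, green, blue) subpixels needed to produce each primary color.
CORRECTION = {
    "red":   (0.60, 0.10, 0.05),
    "green": (0.15, 0.70, 0.08),
    "blue":  (0.03, 0.08, 0.75),
}

def subpixel_drive(r, g, b):
    """Weight each primary's row by the requested linear RGB input."""
    rows = (CORRECTION["red"], CORRECTION["green"], CORRECTION["blue"])
    return tuple(sum(w * row[i] for w, row in zip((r, g, b), rows))
                 for i in range(3))

# Pure red at full input reproduces the red row of the matrix:
drive = subpixel_drive(1.0, 0.0, 0.0)
```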
- Each correction factor can compensate for the difference between the captured and analyzed values and the corresponding target display value. For example, if the captured and analyzed value is less bright than the corresponding target display value, the correction factor can include the amount of brightness that would be required for the captured and analyzed value of the pixel or subpixel to be generally equal to the target display value. Moreover, the correction factor can correlate to the corresponding type of display value. For example, the correction value can be expressed in terms of color or brightness correction values, or in terms of other visual display property correction values. Suitable methods and systems for determining correction values or correction factors are disclosed in U.S. Pat. Nos. 7,907,154 and 7,911,485, referenced above.
- The process then branches depending on whether all the pixels or subpixels of the display have been illuminated, captured, analyzed, and corrected as described above in blocks 620-650. If the technology illuminates a fraction 1/n of the pixels or subpixels of the display in each pattern, then at least n iterations are required to measure and calibrate the entire display. For example, after displaying a first pattern such as the pattern described above in connection with FIG. 4A, in which every nth pixel (or subpixel of a particular color) is illuminated, the technology returns to block 620.
- On the next iteration, the technology illuminates a different pattern of pixels or subpixels, e.g., a distinct subset of the pixels or subpixels of the display. For example, the technology might next display the pattern described above in connection with FIG. 4B, in which the next neighbor of every nth pixel (or subpixel of a particular color) is illuminated. The process then continues as described above.
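The measure-shift-repeat cycle described above can be sketched as a loop over pattern offsets. Here `show_pattern`, `capture`, and `analyze` are hypothetical caller-supplied stand-ins for the pattern generator, imaging device, and computing device, respectively:

```python
def measure_display(step, show_pattern, capture, analyze):
    """Cycle a sparse pattern over every offset so each pixel is measured.

    With 1/(step*step) of the pixels lit per pattern, step*step
    iterations cover the whole display. `analyze` returns a dict mapping
    (x, y) pixel coordinates to measured values for one captured image.
    """
    measurements = {}
    for offset_y in range(step):
        for offset_x in range(step):
            show_pattern(offset_x, offset_y)   # display one sparse pattern
            image = capture()                  # one exposure of that pattern
            measurements.update(analyze(image, offset_x, offset_y))
    return measurements
```

With `step=4`, this performs the sixteen-pattern sequence described in connection with FIGS. 4A-4B.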
- The method 600 can further include sending the calibration correction values to the display.
- In some embodiments, the correction factors are stored in firmware within the display or a controller of the display.
- Alternatively, the correction factor data set or map can be saved and, e.g., provided to a third party such as the owner of the display, or used to process video images outside the display such that the display can show the processed image according to desired or target display properties without calibrating or adjusting the display itself. Suitable methods and systems for correcting images to calibrate their appearance on a particular display are disclosed in U.S. patent application Ser. No.
- Finally, the technology can verify or improve the calibration by measuring the calibrated output of each pixel or subpixel as described in blocks 610-660 above, and optionally modifying the correction factors applied to the display.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Biomedical Technology (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Electroluminescent Light Sources (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
- Control Of El Displays (AREA)
Abstract
Description
- The present disclosure relates generally to electronic visual displays, and more particularly, to methods and systems for measuring and calibrating the output from such displays.
- Electronic visual displays (“displays”) have become commonplace. Displays of increasingly high resolution are used in a wide variety of contexts, from personal electronics with screens a few inches or smaller in size to computer screens and televisions several feet across to scoreboards and billboards covering hundreds of square feet. Some displays are assembled from a series of smaller panels, each of which may further consist of a series of internally connected modules. Virtually all displays are made up of arrays of individual light-emitting elements called “pixels.” In turn, each pixel is made up of a plurality of light-emitting points (e.g., one red, one green, and one blue). The light-emitting points are termed “subpixels.”
- It is often desirable for a display to be calibrated. For example, calibration may improve the uniformity of the display and improve consistency between displays. During calibration of a display (or, e.g., of each module of a display), the color and brightness of each pixel or subpixel is measured. Adjustments are determined so the pixels can display particular colors at desired brightness levels. The adjustments are then stored (e.g., in software or firmware that controls the display or module), so that those adjustments or correction factors can be applied.
- FIG. 1 is a schematic view of an electronic visual display calibration system configured in accordance with an embodiment of the disclosure.
- FIG. 2 is an isometric front view of an electronic visual display calibration system configured in accordance with an embodiment of the disclosure.
- FIG. 3 is a schematic block diagram of the electronic visual display calibration system of FIG. 1.
- FIGS. 4A and 4B are enlarged partial front views of a portion of an electronic visual display configured to be used with embodiments of the disclosure.
- FIG. 5 is a diagram of a color gamut triangle.
- FIG. 6 is a flow diagram of a method or process configured in accordance with an embodiment of the disclosure.
- The following disclosure describes electronic visual display calibration systems and associated methods for measuring and calibrating electronic visual displays. As described in greater detail below, a display measurement method and/or system configured in accordance with one aspect of the disclosure is configured to measure the luminance and the color of the individual pixels or subpixels of an electronic visual display, such as a high-resolution liquid crystal display (“LCD”) or an organic light-emitting diode (“OLED”) display.
- The inventors have recognized that when pixels are very closely spaced, such as is typical in many LCDs, OLED displays, and high-resolution light-emitting diode (“LED”) displays, measuring individual pixel or subpixel attributes becomes more difficult. Accordingly, embodiments of the present technology use a pattern generator (e.g., standalone hardware test equipment, a logic analyzer add-on module, a computer peripheral, software in a computing device or controller connected to the display, output from a serial digital interface (“SDI”), digital video interface (“DVI”), or high-definition multimedia interface (“HDMI”) port, etc.) to illuminate only a desired subset of the pixels or subpixels to be measured. In some embodiments, for example, the pattern generator illuminates only every third or every fourth pixel of the display, such that the pixels between them remain off. The technology uses an imaging device (which typically has a considerably higher resolution than the display itself) to measure only the illuminated pixels (and/or subpixels). Because only a subset of the pixels are illuminated and measured at once, the display under test effectively has a much lower pixel resolution. After measuring the illuminated pixels, the pattern can then be shifted (e.g., by one pixel) and the measurements repeated until all of the pixels of the display have been measured.
- In one particular embodiment, for example, if every fifth pixel of a 1,920×1,080 pixel high definition television (“HDTV”) display is illuminated at a time, then the effective resolution is 384×216 pixels. To measure the illuminated pixels with an imaging device having a resolution about six times greater than the display's pixel resolution, a camera with a resolution of approximately 2,300×1,300—i.e., a camera readily available for a reasonable price—could potentially be used. In contrast with the present technology, however, many conventional approaches for analyzing the 1,920×1,080 pixel HDTV display would require a camera having a resolution of approximately 12,000×6,000, or 72,000,000 pixels. Such a camera (with resolution high enough for the display to be measured) is expected to be prohibitively expensive and/or unavailable. As a result, measuring and calibrating such displays using conventional techniques is often impractical and/or too expensive.
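The arithmetic in this example can be checked directly. The factor of six camera pixels per illuminated display pixel per axis mirrors the "about six times greater" resolution figure above; the function name is illustrative:

```python
def effective_resolution(width, height, step):
    """Effective display resolution when only every `step`-th pixel
    (horizontally and vertically) is illuminated at a time."""
    return width // step, height // step

# 1,920 x 1,080 HDTV display with every fifth pixel lit:
eff_w, eff_h = effective_resolution(1920, 1080, 5)

# A camera resolving ~6 camera pixels per lit display pixel per axis:
cam_w, cam_h = 6 * eff_w, 6 * eff_h
```

This yields an effective resolution of 384×216 and a required camera resolution of roughly 2,304×1,296, matching the approximately 2,300×1,300 figure above.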
- Another conventional approach for measuring such large or high-resolution displays is to divide the display (or its constituent panels or modules) into sections small enough that the imaging system has sufficient resolving power to enable an accurate measurement of the pixels or subpixels of each section. Using this approach, the imaging device (or the display being measured) is generally mounted on an x-y stage for horizontal and vertical positioning, or rotated to align to each section being measured. Moving or rotating either the camera or the display, however, requires potentially expensive additional equipment, as well as time to perform the movement or rotation and to align the imaging device to the display. Furthermore, this technique can lead to slight mismatches or discontinuities of measurement between the individual sections. If the measurements are used for uniformity correction, such mismatches must be addressed, typically with further measurements and/or post-processing of the display measurement data.
- In contrast with conventional techniques, embodiments of the present technology are expected to enable precise measurement of individual pixel or subpixel output for any display (e.g., an OLED display) without requiring expensive, high-resolution imaging devices, without additional equipment for repositioning the imaging device relative to the display, without time spent moving and aligning them, and without mismatches between sections of the display.
- Certain details are set forth in the following description and in FIGS. 1-6 to provide a thorough understanding of various embodiments of the disclosure. However, other details describing well-known structures and systems often associated with visual displays and related optical equipment and/or other aspects of visual display calibration systems are not set forth below to avoid unnecessarily obscuring the description of various embodiments of the disclosure.
- Many of the details, dimensions, angles, and other features shown in the Figures are merely illustrative of particular embodiments of the disclosure. Accordingly, other embodiments can have other details, dimensions, angles, and features without departing from the spirit or scope of the present disclosure. In addition, those of ordinary skill in the art will appreciate that further embodiments of the disclosure can be practiced without several of the details described below.
- FIG. 1 is a schematic view of an electronic visual display calibration system (“the system”) 100 configured in accordance with an embodiment of the disclosure. The system 100 is configured to collect, manage, and/or analyze display data for the purpose of processing image patterns (e.g., static image patterns, video streams comprised of a series of image patterns, etc.) that are shown on an electronic visual display 150. The pattern 160 shown on the display 150 is generated by a pattern generator 110. The display 150 can be, for example, a large electronic display or sign composed of smaller panels or modules. The pattern 160 generated by the pattern generator 110 and displayed on the display 150 illustrated in FIG. 1 is described in further detail below in connection with FIGS. 4A-4B. - In the embodiment illustrated in
FIG. 1, the system 100 includes a computing device 130 operably coupled to an imaging device 120 (e.g., an imaging colorimeter or other photometer). In the illustrated embodiment, the imaging device 120 is spaced apart from the display 150 (e.g., so that the entire display 150 is within the field of view of the imaging device 120, and, in the case of a large elevated sign, for improving the convenience of measurement) and configured to sense or capture display information (e.g., color data, luminance data, etc.) from selectively illuminated pixels or subpixels 160 of the display 150. For example, the pattern generator 110 can illuminate every nth pixel of the display 150. The captured display information is transferred from the imaging device 120 to the computing device 130. After capturing or otherwise sensing the display information for one pattern 160, the pattern generator 110 can generate additional patterns 160 on the display 150. For example, the pattern generator 110 can illuminate every next nth pixel of the display 150. This process can be repeated (e.g., n times) until the computing device 130 obtains display information for all the pixels or subpixels of the entire display 150. The computing device 130 is configured to store, manage, and/or analyze the display information from each pattern 160 to determine one or more correction factors for the display 150 or for its pixels or subpixels. - In some embodiments, the correction factors for the
display 150 are applied to the firmware and/or software controlling the display 150 to calibrate the display 150. In alternate embodiments, the corrections are applied in real time to a video stream to be shown on the display 150. In such embodiments, the technology includes comparing the actual display value with a desired display value for the one or more portions of the display 150, and determining a correction factor for the pixels or subpixels of the display 150 as determined from the measurements of the patterns 160 described above. The technology processes or adjusts the image with the correction factors for the corresponding pixels of the display 150. After processing the image to account for variations in the display 150, the technology can further include transmitting the image to the display 150 and showing the image on the display 150. Accordingly, in some embodiments, the image on the display 150 can be presented according to the desired display values without modifying or calibrating the actual display 150. - One of ordinary skill in the art will understand that although the
system 100 illustrated in FIG. 1 includes separate components (e.g., the pattern generator 110, the imaging device 120, and the computing device 130), in other embodiments the system 100 can incorporate more or fewer than three components. Moreover, the various components can be further divided into subcomponents, or the various components and functions may be combined and integrated. In addition, these components can communicate via wired or wireless communication, as well as by information contained in storage media. The various components and features of the electronic visual display calibration system 100 are described in greater detail below in connection with FIG. 3. -
FIG. 2 is an isometric front view of an electronic visual display calibration system 200 configured in accordance with an embodiment of the disclosure. The system 200 is configured to perform correction of the brightness and color of light-emitting elements that are used in electronic visual displays. In one embodiment, the calibration system 200 can include a test pattern generator 210, a test station 240, an interface 230, and an electronic visual display 250. In the embodiment illustrated in FIG. 2, the calibration system 200 is designed to calibrate a display 250 that is placed within the test station 240. In alternate embodiments, it is possible to calibrate multiple displays or multiple panels of a larger display within the test station 240. - The
test pattern generator 210 is configured to generate a series of test patterns 260, each of which illuminates a proper subset of the pixels or subpixels of the display 250. The test station 240 is configured to capture a series of images from an imaging area covering all of the display 250. The captured image data is transferred from the test station 240 to the interface 230. The interface 230 compiles and manages the image data, performs a series of calculations to determine the appropriate corrections that should be made to the image data, and then stores the data. This process is repeated until images of each of the pixels or subpixels of display 250 have been obtained. After collection of all the necessary data, the processed correction data is then uploaded from the interface 230 to the firmware and/or software controlling the display 250 and used to recalibrate the display 250. - In the embodiment illustrated in
FIG. 2, the test station 240 can include a lightproof chamber for calibrating a display 250 in a fully-illuminated room or factory. The test station 240 can include a digital camera 220 mounted on the top portion 244 of the test station 240. The test station 240 can further include light baffles to eliminate any stray light that might be reflected off the walls of the test station chamber 242 back into the camera 220. In the illustrated embodiment, the display 250 is positioned beneath the test station 240. The test station 240 includes mechanical and electrical fixtures for receiving the display 250 and placing it in position within the test station 240 for calibration. In other embodiments, the test station 240 may be in other orientations, e.g., facing upward at a display positioned above the test station or facing horizontally. Further, in some embodiments the test station 240 may have a different arrangement and/or include different features. - In the illustrated embodiment, the
test station 240 also incorporates a ground glass diffuser 246 positioned just above the display 250. The diffuser 246 scatters the light emitted from each subpixel in the display 250, which effectively partially integrates the emitted light angularly. Accordingly, the camera 220 is actually measuring the average light emitted into a cone rather than only the light traveling directly from each subpixel on the display 250 toward the camera 220. One advantage of this arrangement is that the display 250 will be corrected to optimize viewing over a wider angular range. The diffuser 246 is an optional component that may not be included in some embodiments. - The
interface 230 that is operably coupled to the test station 240 is configured to manage the data that is collected, stored, and used for calculation of new correction factors that will be used to recalibrate the display 250. The interface 230 automates the operation of the pattern generator 210 and the test station 240 and writes all the data into a database. In one embodiment, the interface 230 can be a personal computer with software for pattern selection, camera control, image data acquisition, and image data analysis. Optionally, in other embodiments various devices capable of operating the software can be used, such as handheld computers. -
FIG. 3 is a schematic block diagram of the electronic visual display calibration system 100 of FIG. 1. In the illustrated embodiment, the imaging device 120 can include a camera 320, such as a digital camera suitable for high-resolution imaging. For example, the camera 320 can include optics capable of measuring subpixels of the display 150 (which can be a few millimeters in size) from a distance of 25 meters or more. If the displayed pattern 160 does not illuminate adjacent subpixels or pixels, imaging resolution requirements for the camera 320 may be less stringent, allowing the use of a less expensive imaging device 120. In some embodiments, the camera 320 can be a CCD camera. Suitable CCD digital color cameras include ProMetric® imaging colorimeters and photometers, which are commercially available from the assignee of the present disclosure, Radiant Zemax, LLC, of Redmond, Wash. In other embodiments, the camera 320 can be a complementary metal oxide semiconductor (“CMOS”) camera, or another type of suitable camera for imaging with sufficient resolution at a certain distance from the display. - According to another aspect of the illustrated embodiment, the
imaging device 120 can also include a lens 322. In one embodiment, for example, the lens 322 can be a reflecting telescope that is operably coupled to the camera 320 to provide sufficiently high resolution for long distance imaging of the display 150. In other embodiments, however, the lens 322 can include other suitable configurations for viewing and/or capturing display information from the display 150. Suitable imaging devices 320 and lenses 322 are disclosed in U.S. Pat. Nos. 7,907,154 and 7,911,485, both of which are incorporated herein by reference in their entireties. - The
imaging device 120 can accordingly be positioned at a distance L from the display 150. The distance L can vary depending on the size of the display 150, and can include relatively large distances. In one embodiment, for example, the imaging device 120 can be positioned at a distance L that is generally similar to a typical viewing distance of the display 150. In a sports stadium, for example, the imaging device 120 can be positioned in a seating area facing toward the display 150. In other embodiments, however, the distance L can be less than a typical viewing distance and direction, and the imaging system 120 can be configured to account for any viewing distance and/or direction differences. In some embodiments, the imaging device 120 has a wide field of view and the distance L can be less than the width of the display 150 (e.g., approximately one meter for a typical HDTV display). In other embodiments, the imaging device 120 has a long-focus lens 322 (e.g., a telephoto lens) and the distance L can be significantly greater than the width of the display 150 (e.g., between approximately 100 and 300 meters for an outdoor billboard or video screen). In yet other embodiments, the distance L can have other values. - The
computing device 130 is configured to cause the pattern generator 110 to send images 160 (e.g., pixel or subpixel patterns) to the display 150. In various embodiments, the pattern generator 110 is standalone hardware test equipment, a logic analyzer add-on module, a computer peripheral operably coupled to the computing device 130, or software in the computing device 130 or in a controller connected to the display 150. In other embodiments, the pattern generator 110 operates independently of the computing device 130. In alternative embodiments, the patterns 160 are provided to the display 150 via standard video signal input, e.g., using a DVI, HDMI, or SDI input to the display. The patterns 160 generated by the pattern generator 110 for displaying on the electronic visual display 150 are discussed in greater detail in connection with FIGS. 4A and 4B below. - Continuing with respect to
FIG. 3, the computing device 130 is configured to receive, manage, store, and/or process the display data collected by the imaging device 120 (e.g., for the purpose of adjusting the appearance of images 160 that will be displayed on the display 150). In other embodiments, display data associated with the display 150, including correction factors and related data, can be processed by a computer that is separate from the imaging device 120. A typical display 150, such as a quad extended graphics array (“QXGA”)-resolution (2048×1536) visual display for example, can have over nine million subpixels that provide display data for the computing device 130 to manage and process. The pattern generator 110 may illuminate only a fraction of those subpixels at any one time, but by sending a series of patterns 160 to the display 150, information about all the subpixels will be delivered to the computing device 130. As such, the computing device 130 includes the necessary hardware and corresponding software components for managing and processing the display data. More specifically, the computing device 130 configured in accordance with an embodiment of the disclosure can include a processor 330, a memory 332, input/output devices 334, one or more sensors 336 in addition to sensors of the imaging device 120, and/or any other suitable subsystems and/or components 338 (displays, speakers, communication modules, etc.). The memory 332 can be configured to store the display data from the patterns 160 shown on the display 150. The computing device 130 includes computer readable media (e.g., memory 332, disk drives, or other storage media, excluding only a transitory, propagating signal per se) including instructions or software stored thereon that, when executed by the processor 330 or computing device 130, cause the processor 330 or computing device 130 to process an image as described herein.
Moreover, the processor 330 can be configured for performing or otherwise controlling calculations, analysis, and any other functions associated with the methods described herein. - In some embodiments, the
memory 332 includes software to control the imaging device 120 as well as measurement software to identify portions of the display 150 (e.g., subpixels of the display 150) and to image or otherwise extract the display data (e.g., subpixel brightness data, pixel color data, etc.). One example of suitable software for controlling the imaging device 120 and/or acquiring the display data is VisionCAL™ screen correction software, which is commercially available from the assignee of the present disclosure, Radiant Zemax, LLC, of Redmond, Wash. In other embodiments, other suitable software can be implemented with the system 100. Moreover, the memory 332 can also store one or more databases used to store the display data from the patterns 160 shown on display 150, as well as calculated correction factors for the display data. In one embodiment, for example, the database is a Microsoft Access® database designed by the assignee of the present disclosure. In other embodiments, however, the display data is stored in other types of databases or data files. -
FIG. 4A is an enlarged partial front view of a portion of an electronic visual display 450 configured to be used with embodiments of the disclosure. The illustrated view is representative of a portion of a display 450 (e.g., display 150 (FIG. 1) or display 250 (FIG. 2)) displaying a pattern 460a. The display 450 is made up of a large number (e.g., millions) of individual light sources or light-emitting elements or pixels 430. Each pixel 430 comprises multiple light-emitting points or subpixels 432 (identified as first, second, and third subpixels 432a-432c, respectively). In certain embodiments, the subpixels 432 are LEDs or OLEDs. For example, the subpixels 432a-432c can correspond to red, green, and blue LEDs, respectively. In other embodiments, each pixel 430 can include more or fewer than three subpixels 432. For example, some pixels 430 may have four subpixels 432 (e.g., two green subpixels, one blue subpixel, and one red subpixel, or other combinations). Pixels and subpixels may be laid out in various geometric arrangements (e.g., triangular or hexagonal arrays in various color orders, vertical or oblique stripes, etc.). Furthermore, in certain embodiments, the red, green, and blue (“RGB”) color space may not be used. Rather, a different color space can serve as the basis for processing and display of color images on the display 450. For example, the subpixels 432 may be cyan, magenta, and yellow, respectively. - In addition to the color level of each subpixel 432, the luminance level of each subpixel 432 can vary. Accordingly, the additive primary colors represented by a red subpixel, a green subpixel, and a blue subpixel can be selectively combined to produce the colors within the color gamut defined by a color gamut triangle, as shown in
FIG. 5. For example, when only “pure” red is displayed, the green and blue subpixels may be turned on only slightly to achieve a specific chromaticity for the red color. - In addition, the measurement process described herein may be performed at various brightness levels. For example, in some embodiments, each
pixel 430 or subpixel 432 is measured at input levels (using values from 0 to 255) of 255 (full brightness), 128 (one half brightness), 64 (one quarter brightness), and 32 (one eighth brightness). Data from such measurements can be used in calibration to achieve the same chromaticity for a particular color at various input brightness levels, or, e.g., to improve the uniformity of color and luminance response curves for each pixel or subpixel. - Returning to
FIG. 4A, an illustrative pattern 460a illuminates a proper subset of the pixels of the display 450. In the illustrated embodiment, every fourth pixel 430 vertically and every fourth pixel 430 horizontally is illuminated, and the pixels between are switched off. Thus, for the illustrated pattern 460a, only one of every sixteen pixels 430 is illuminated, and the spaces between illuminated pixels are four times larger in each direction than they would be if every pixel 430 were illuminated. As a result, the effective pixel density of the display 450 is one sixteenth of the actual pixel density. For example, if pattern 460a is displayed on a “4K Ultra HD” television display 450 having a screen resolution of 3,840×2,160 pixels (a total of approximately 8.3 million pixels (“megapixels”)), only (3,840/4)×(2,160/4) pixels, i.e., 960×540 pixels (a total of approximately five hundred thousand pixels (half a megapixel)), are lit at once. Such a reduction in the effective pixel resolution of the display 450 can permit use of imaging equipment (e.g., a camera sensor and lens) that is less sophisticated and expensive than would otherwise be required to measure the display 450. - The technology displays a series of patterns to illuminate and measure each pixel or subpixel of the display at least once (and potentially multiple times, e.g., at different brightness input levels).
FIG. 4B illustrates another pattern 460b on the same enlarged partial front view of a portion of the electronic visual display 450. In the pattern 460b of FIG. 4B, each pixel 430 that was illuminated in the pattern 460a of FIG. 4A is switched off, and the next pixel to the right is illuminated. In the illustrated embodiment, measuring the output of each pixel 430 of the display 450 requires displaying and measuring a total of sixteen patterns (multiplied by the number of different brightness levels for each pattern). Different patterns 460 of pixels 430 and/or subpixels 432 could require a smaller or larger number of patterns 460 to ensure full coverage of the display 450. For example, a pattern that illuminates every third pixel 430 horizontally and vertically requires nine patterns to cover every pixel 430 in the display 450. - In alternative embodiments, the patterns 460 illuminate individual subpixels 432 (e.g., one or more at a time of subpixels 432a-432c) rather than
whole pixels 430. In various embodiments, the patterns 460 are displayed and measured at more than one brightness level. Separately illuminating each subpixel 432 and measuring individual pixels or subpixels at different brightness levels correspondingly multiplies the number of required measurements. In some embodiments, patterns are tailored to a particular display or a particular measurement. The patterns are not necessarily as regular or evenly distributed as the examples illustrated in FIGS. 4A-4B, and different patterns may illuminate different numbers of pixels or subpixels. - In addition to color and/or luminance, the subpixels 432 may have other visual properties that can be measured and analyzed in accordance with embodiments of the present disclosure. Moreover, although the displayed patterns 460 are described above with reference to
pixels 430 and subpixels 432, other embodiments of the disclosure can be used with displays having different types of light emitting elements or components. -
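The pattern-coverage arithmetic above (sixteen patterns for an every-fourth-pixel stride, nine for an every-third-pixel stride) can be checked with a short sketch, assuming, as in FIGS. 4A-4B, that each successive pattern shifts the illuminated subset by one pixel:

```python
import numpy as np

# Sketch (function names are mine): a stride-s pattern needs s*s shifted
# copies to illuminate every pixel at least once.
def pattern_offsets(stride):
    """All (x, y) shifts of the base pattern needed for full coverage."""
    return [(ox, oy) for oy in range(stride) for ox in range(stride)]

def covers_display(width, height, stride):
    covered = np.zeros((height, width), dtype=bool)
    for ox, oy in pattern_offsets(stride):
        covered[oy::stride, ox::stride] = True   # one displayed pattern per offset
    return bool(covered.all())

print(len(pattern_offsets(4)), covers_display(64, 48, 4))  # 16 True
print(len(pattern_offsets(3)), covers_display(64, 48, 3))  # 9 True
```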
FIG. 6 is a flow diagram of a method or process 600 configured in accordance with an embodiment of the disclosure. At block 610, the method includes identifying a fraction 1/n of the pixels or subpixels of the display to be illuminated for measurement. In some embodiments, the technology receives the number, e.g., from user input or from a configuration file. In some embodiments, the technology determines a number based on a heuristic and the characteristics of the display to be measured and the measuring equipment. Such characteristics may include, e.g., the size of the display, the pixel resolution of the display, the pixel density or dot pitch (i.e., the distance between pixels) of the display, the distance from the display to the imaging device, the optical resolving power or angular resolution of the imaging device, and the pixel resolution of the imaging device. An example heuristic is that the pixel resolution of the imaging device is such that 50 pixels on the imaging device correspond to one illuminated subpixel on the display. - By way of example, in one embodiment the imaging device has a pixel resolution of 3,072×2,048=6,291,456 pixels. According to the heuristic that fifty pixels of resolution from the imaging device correspond to one subpixel on the display, the imaging device can capture data from 125,829 subpixels on the display (6,291,456 camera pixels/50 camera pixels per display subpixel) in a single captured image. In other embodiments, the correlation between the resolution of the imaging device and the display can vary between, e.g., 6 and 200 pixels on the imaging device corresponding to one subpixel on the display. Assuming, for example, that no other characteristic of the imaging device or its relationship to the display restricts its ability to measure the display, then the technology can determine the
appropriate fraction 1/n in this case by dividing 125,829 (the number of subpixels to be illuminated in each captured image) by the total number of subpixels in the display. For example, to measure a display having a pixel resolution of 1,280×720=921,600 pixels, the fraction 1/n would be 125,829/921,600=1/7.324 or (rounding the denominator up) 1/8. In other words, if 1/8 of the display's subpixels are illuminated, the total number of illuminated subpixels will be below the threshold of 125,829 subpixels that can be captured in a single image by the selected imaging device in accordance with the applicable heuristic. - At
block 620, the technology displays a pattern selectively illuminating 1/n of the pixels or subpixels of the display (e.g., in the example above, 1 of every 8 subpixels of the display). For example, every nth pixel (or subpixel of a particular color) may be illuminated. An example of such a pattern is described above in connection with FIG. 4A. As described above, the technology may illuminate each pixel or subpixel at various brightness levels. At block 630, the imaging device captures at least one image of the pattern of pixels or subpixels illuminated on the display. Each subpixel captured by the imaging device can be characterized, e.g., by its color value, typically expressed as chromaticity (Cx, Cy), and its brightness, typically expressed as luminance Lv. At block 640, the captured image data is analyzed by a computing device, e.g., the computing device described above in connection with FIGS. 1 and 3. - In some embodiments, the computing device compares the color and brightness of each captured pixel with target color and brightness values, e.g., points within the color gamut defined by a color gamut triangle, such as shown in
FIG. 5. The actual pixel or subpixel color or brightness values may differ from desired or target display values for the display. For example, there is typically significant variation in the color or luminance of each subpixel of the display, especially if the subpixels are LEDs or OLEDs. Moreover, over time the visual properties of the display may degrade or otherwise vary from desired or target display values. Accordingly, at block 650, the technology compares actual captured and analyzed values with target or desired display values for the pixels or subpixels illuminated according to the displayed pattern, and determines a correction value applicable to each analyzed pixel or subpixel. - Determining the correction values can include creating a correction data set or map. In some embodiments, the computing device calculates a three-by-three matrix of values for each pixel that indicate some fractional amount of power to turn on each subpixel to obtain each of the three primary colors (red, green, and blue) at target color and brightness levels. A sample matrix is displayed below:
-
Fractional values for each subpixel of a pixel

Primary color | Red | Green | Blue |
---|---|---|---|
Red | 0.60 | 0.10 | 0.05 |
Green | 0.15 | 0.70 | 0.08 |
Blue | 0.03 | 0.08 | 0.75 |
For example, according to the above matrix for a particular brightness level, when a pixel of the display should be red, the technology has calculated that the display should turn on its red subpixel at 60% power, its green subpixel at 10% power, and its blue subpixel at 5% power. - The determination of the correction values is based, at least in part, on the comparison between the captured and analyzed values and the target values for the display. More specifically, each correction factor can compensate for the difference between the captured and analyzed values and the corresponding target display value. For example, if the captured and analyzed value is less bright than the corresponding target display value, the correction factor can include the amount of brightness that would be required for the captured and analyzed value of the pixel or subpixel to be generally equal to the target display value. Moreover, the correction factor can correlate to the corresponding type of display value. For example, the correction value can be expressed in terms of color or brightness correction values, or in terms of other visual display property correction values. Suitable methods and systems for determining correction values or correction factors are disclosed in U.S. Pat. Nos. 7,907,154 and 7,911,485 referenced above.
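How such a matrix might be applied can be sketched as follows. The patent only states the per-primary rows directly (e.g., the red row above); treating an arbitrary RGB input as a linear combination of the rows is my assumption, as are the names:

```python
import numpy as np

# The sample 3x3 matrix from the table above: rows are the requested
# primaries, columns are subpixel drive fractions (red, green, blue).
CORRECTION = np.array([
    [0.60, 0.10, 0.05],   # requested red
    [0.15, 0.70, 0.08],   # requested green
    [0.03, 0.08, 0.75],   # requested blue
])

def subpixel_drive(rgb_input):
    """Map a requested RGB value (0..1) to per-subpixel power fractions."""
    return np.asarray(rgb_input, dtype=float) @ CORRECTION

# Pure red: red subpixel at 60% power, green at 10%, blue at 5%.
print(subpixel_drive([1.0, 0.0, 0.0]))
```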
- At
block 660, the process branches depending on whether or not all the pixels or subpixels of the display have been illuminated, captured, analyzed, and corrected as described above in blocks 620-650. If the technology illuminates a fraction 1/n of the pixels or subpixels of the display in each pattern, then at least n iterations are required to measure and calibrate the entire display. For example, after displaying a first pattern such as the pattern described above in connection with FIG. 4A, in which every nth pixel (or subpixel of a particular color) is illuminated, the technology returns to block 620. At block 620, the technology illuminates a different pattern of pixels or subpixels, e.g., a distinct subset of the pixels or subpixels of the display. For example, the technology might next display the pattern described above in connection with FIG. 4B, in which the next neighbor of every nth pixel (or subpixel of a particular color) is illuminated. The process then continues as described above. - After n iterations have been completed, at
block 670, the method 600 can further include sending the calibration correction values to the display. In some embodiments, the correction factors are stored in firmware within the display or a controller of the display. In some embodiments, the correction factor data set or map can be saved and, e.g., provided to a third party such as the owner of the display, or used to process video images outside the display such that the display can show the processed image according to desired or target display properties without calibrating or adjusting the display itself. Suitable methods and systems for correcting images to calibrate their appearance on a particular display are disclosed in U.S. patent application Ser. No. 12/772,916, filed May 3, 2010, entitled “Methods and systems for correcting the appearance of images displayed on an electronic visual display,” which is incorporated herein in its entirety by reference. In some embodiments, the technology verifies or improves the calibration by measuring the calibrated output of each pixel or subpixel as described in blocks 610-660 above, and optionally modifying the correction factors applied to the display. - From the foregoing, it will be appreciated that specific embodiments of the disclosure have been described herein for purposes of illustration, but that various modifications may be made without deviating from the spirit and scope of the various embodiments of the disclosure. Further, while various advantages associated with certain embodiments of the disclosure have been described above in the context of those embodiments, other embodiments may also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the disclosure. Accordingly, the disclosure is not limited, except as by the appended claims.
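The arithmetic of blocks 610 and 660 can be pulled together in a short sketch: the block 610 heuristic fixes how many subpixels one captured image can resolve, and rounding the denominator up gives both the fraction 1/n and the number of pattern iterations. Function names are mine, not the patent's:

```python
import math

def plan_measurement(cam_pixels, display_subpixels, cam_px_per_subpixel=50):
    """Block 610 heuristic: return (subpixels measurable per captured image,
    denominator n of the illumination fraction 1/n, which is also the
    minimum number of pattern iterations at block 660)."""
    capacity = cam_pixels // cam_px_per_subpixel
    n = math.ceil(display_subpixels / capacity)   # round the denominator up
    return capacity, n

# The worked example from the text: a 3,072 x 2,048 camera measuring a
# display with 1,280 x 720 = 921,600 elements.
capacity, n = plan_measurement(3072 * 2048, 1280 * 720)
print(capacity, n)  # 125829 8
```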
Claims (25)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/830,678 US8836797B1 (en) | 2013-03-14 | 2013-03-14 | Methods and systems for measuring and correcting electronic visual displays |
PCT/US2014/022134 WO2014159134A1 (en) | 2013-03-14 | 2014-03-07 | Methods and systems for measuring and correcting electronic visual displays |
CN201480027705.0A CN105247607A (en) | 2013-03-14 | 2014-03-07 | Methods and systems for measuring and correcting electronic visual displays |
US14/458,695 US9135851B2 (en) | 2013-03-14 | 2014-08-13 | Methods and systems for measuring and correcting electronic visual displays |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/830,678 US8836797B1 (en) | 2013-03-14 | 2013-03-14 | Methods and systems for measuring and correcting electronic visual displays |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/458,695 Division US9135851B2 (en) | 2013-03-14 | 2014-08-13 | Methods and systems for measuring and correcting electronic visual displays |
Publications (2)
Publication Number | Publication Date |
---|---|
US8836797B1 US8836797B1 (en) | 2014-09-16 |
US20140267784A1 true US20140267784A1 (en) | 2014-09-18 |
Family
ID=51493402
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/830,678 Active US8836797B1 (en) | 2013-03-14 | 2013-03-14 | Methods and systems for measuring and correcting electronic visual displays |
US14/458,695 Active US9135851B2 (en) | 2013-03-14 | 2014-08-13 | Methods and systems for measuring and correcting electronic visual displays |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/458,695 Active US9135851B2 (en) | 2013-03-14 | 2014-08-13 | Methods and systems for measuring and correcting electronic visual displays |
Country Status (3)
Country | Link |
---|---|
US (2) | US8836797B1 (en) |
CN (1) | CN105247607A (en) |
WO (1) | WO2014159134A1 (en) |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4987177B1 (en) * | 2011-08-31 | 2012-07-25 | パイオニア株式会社 | Illumination device and light emission control method |
KR102170101B1 (en) * | 2014-02-24 | 2020-10-26 | 삼성전자주식회사 | Display apparatus, mobile apparaus, system and image quality matching method thereof |
US9658816B2 (en) * | 2014-07-29 | 2017-05-23 | Samsung Display Co., Ltd. | System and apparatus in managing color-consistency for multiple panel simultaneous display |
US9952242B2 (en) | 2014-09-12 | 2018-04-24 | Roche Diagnostics Operations, Inc. | Laboratory sample distribution system and laboratory automation system |
KR102468270B1 (en) * | 2015-09-23 | 2022-11-18 | 삼성전자주식회사 | Electronic apparatus, display panel apparatus calibration method thereof and calibration system |
CN105206217B (en) * | 2015-10-27 | 2018-02-06 | 京东方科技集团股份有限公司 | display processing method, device and display device |
KR102702307B1 (en) * | 2016-08-03 | 2024-09-04 | 삼성전자주식회사 | Display apparatus and control method of Electronic apparatus |
KR102555953B1 (en) * | 2016-11-04 | 2023-07-17 | 삼성전자주식회사 | Electronic apparatus, display apparatus and control method thereof |
KR102686185B1 (en) | 2016-11-23 | 2024-07-19 | 삼성전자주식회사 | Display apparatus, Calibration apparatus and Calibration method thereof |
US10366674B1 (en) | 2016-12-27 | 2019-07-30 | Facebook Technologies, Llc | Display calibration in electronic displays |
US10672320B2 (en) | 2018-01-30 | 2020-06-02 | Apple Inc. | Applying gain and offset correction factors for pixel uniformity compensation in display panels |
WO2021087256A1 (en) * | 2019-10-30 | 2021-05-06 | Radiant Vision Systems, LLC | Non-spatial measurement calibration methods and associated systems and devices |
TWI759669B (en) * | 2019-12-23 | 2022-04-01 | 中強光電股份有限公司 | Method and system for inspecting display image |
CN112071263B (en) * | 2020-09-04 | 2022-03-18 | 京东方科技集团股份有限公司 | Display method and display device of display panel |
TWI746292B (en) * | 2020-11-27 | 2021-11-11 | 茂達電子股份有限公司 | Circuit measuring device and method |
DE102021205703A1 (en) | 2021-06-07 | 2022-12-08 | TechnoTeam Holding GmbH | Method and device for photometric measurement of an electronic display and method for controlling an electronic display |
KR20240122745A (en) * | 2023-01-31 | 2024-08-13 | 제이드 버드 디스플레이(상하이) 리미티드 | Method for detecting defects in near-field displays |
Family Cites Families (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6285349B1 (en) * | 1999-02-26 | 2001-09-04 | Intel Corporation | Correcting non-uniformity in displays |
US6954216B1 (en) * | 1999-08-19 | 2005-10-11 | Adobe Systems Incorporated | Device-specific color intensity settings and sub-pixel geometry |
US6618076B1 (en) * | 1999-12-23 | 2003-09-09 | Justsystem Corporation | Method and apparatus for calibrating projector-camera system |
US7063449B2 (en) * | 2002-11-21 | 2006-06-20 | Element Labs, Inc. | Light emitting diode (LED) picture element |
US7308157B2 (en) * | 2003-02-03 | 2007-12-11 | Photon Dynamics, Inc. | Method and apparatus for optical inspection of a display |
BRPI0409513A (en) * | 2003-04-25 | 2006-04-18 | Visioneered Image Systems Inc | led area light source for emitting light of a desired color, color video monitor and methods of determining the degradation of the representative led (s) of each color and of operating and calibrating the monitor |
US7227592B2 (en) * | 2003-09-26 | 2007-06-05 | Mitsubishi Electric Research Laboratories, Inc. | Self-correcting rear projection television |
US7508387B2 (en) * | 2003-09-30 | 2009-03-24 | International Business Machines Corporation | On demand calibration of imaging displays |
US7218358B2 (en) * | 2004-06-15 | 2007-05-15 | Coretronic Corporation | Method and apparatus for calibrating color temperature of color display devices |
JP4996065B2 (en) * | 2005-06-15 | 2012-08-08 | グローバル・オーエルイーディー・テクノロジー・リミテッド・ライアビリティ・カンパニー | Method for manufacturing organic EL display device and organic EL display device |
US8207914B2 (en) * | 2005-11-07 | 2012-06-26 | Global Oled Technology Llc | OLED display with aging compensation |
US20070291121A1 (en) * | 2006-06-19 | 2007-12-20 | Inventec Corporation | Computer executable image quality detection system |
US8406562B2 (en) * | 2006-08-11 | 2013-03-26 | Geo Semiconductor Inc. | System and method for automated calibration and correction of display geometry and color |
US7524226B2 (en) * | 2006-10-10 | 2009-04-28 | Eastman Kodak Company | OLED display device with adjusted filter array |
JP2008252515A (en) * | 2007-03-30 | 2008-10-16 | Olympus Corp | Video signal processor, video display system, and video signal processing method |
US20080284690A1 (en) * | 2007-05-18 | 2008-11-20 | Sam Min Ko | Organic light emitting display device |
US20090322800A1 (en) * | 2008-06-25 | 2009-12-31 | Dolby Laboratories Licensing Corporation | Method and apparatus in various embodiments for hdr implementation in display devices |
US8521035B2 (en) * | 2008-09-05 | 2013-08-27 | Ketra, Inc. | Systems and methods for visible light communication |
US9378685B2 (en) * | 2009-03-13 | 2016-06-28 | Dolby Laboratories Licensing Corporation | Artifact mitigation method and apparatus for images generated using three dimensional color synthesis |
US7901095B2 (en) * | 2009-03-27 | 2011-03-08 | Seiko Epson Corporation | Resolution scalable view projection |
US8537098B2 (en) * | 2009-08-05 | 2013-09-17 | Dolby Laboratories Licensing Corporation | Retention and other mechanisms or processes for display calibration |
JP5471306B2 (en) * | 2009-10-28 | 2014-04-16 | ソニー株式会社 | Color unevenness inspection apparatus and color unevenness inspection method |
US8344659B2 (en) * | 2009-11-06 | 2013-01-01 | Neofocal Systems, Inc. | System and method for lighting power and control system |
KR20110056167A (en) * | 2009-11-20 | 2011-05-26 | 삼성전자주식회사 | Display apparatus and calibration method therefor |
CA2696778A1 (en) * | 2010-03-17 | 2011-09-17 | Ignis Innovation Inc. | Lifetime, uniformity, parameter extraction methods |
US8564879B1 (en) * | 2010-03-26 | 2013-10-22 | The United States Of America As Represented By The Secretary Of The Navy | Multispectral infrared simulation target array |
US20110267365A1 (en) * | 2010-05-03 | 2011-11-03 | Radiant Imaging, Inc. | Methods and systems for correcting the appearance of images displayed on an electronic visual display |
EP2715669A4 (en) * | 2011-05-25 | 2015-03-18 | Third Dimension Ip Llc | Systems and methods for alignment, calibration and rendering for an angular slice true-3d display |
US8704895B2 (en) * | 2011-08-29 | 2014-04-22 | Qualcomm Incorporated | Fast calibration of displays using spectral-based colorimetrically calibrated multicolor camera |
KR20130051576A (en) * | 2011-11-10 | 2013-05-21 | 삼성전자주식회사 | Optimizing method of visibility and system thereof, portable device using the same |
US9385169B2 (en) * | 2011-11-29 | 2016-07-05 | Ignis Innovation Inc. | Multi-functional active matrix organic light-emitting diode display |
-
2013
- 2013-03-14 US US13/830,678 patent/US8836797B1/en active Active
-
2014
- 2014-03-07 WO PCT/US2014/022134 patent/WO2014159134A1/en active Application Filing
- 2014-03-07 CN CN201480027705.0A patent/CN105247607A/en active Pending
- 2014-08-13 US US14/458,695 patent/US9135851B2/en active Active
Cited By (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9564074B2 (en) * | 2013-11-04 | 2017-02-07 | Samsung Display Co., Ltd. | System and method for luminance correction |
US20150123957A1 (en) * | 2013-11-04 | 2015-05-07 | Samsung Display Co., Ltd. | System and method for luminance correction |
US9978294B1 (en) | 2013-12-31 | 2018-05-22 | Ultravision Technologies, Llc | Modular display panel |
US9528283B2 (en) | 2013-12-31 | 2016-12-27 | Ultravision Technologies, Llc | Method of performing an installation of a display unit |
US9990869B1 (en) | 2013-12-31 | 2018-06-05 | Ultravision Technologies, Llc | Modular display panel |
US9513863B2 (en) | 2013-12-31 | 2016-12-06 | Ultravision Technologies, Llc | Modular display panel |
US10061553B2 (en) | 2013-12-31 | 2018-08-28 | Ultravision Technologies, Llc | Power and data communication arrangement between panels |
US9535650B2 (en) | 2013-12-31 | 2017-01-03 | Ultravision Technologies, Llc | System for modular multi-panel display wherein each display is sealed to be waterproof and includes array of display elements arranged to form display panel surface |
US9349306B2 (en) | 2013-12-31 | 2016-05-24 | Ultravision Technologies, Llc | Modular display panel |
US9582237B2 (en) | 2013-12-31 | 2017-02-28 | Ultravision Technologies, Llc | Modular display panels with different pitches |
US9642272B1 (en) | 2013-12-31 | 2017-05-02 | Ultravision Technologies, Llc | Method for modular multi-panel display wherein each display is sealed to be waterproof and includes array of display elements arranged to form display panel surface |
US9832897B2 (en) | 2013-12-31 | 2017-11-28 | Ultravision Technologies, Llc | Method of assembling a modular multi-panel display system |
US9916782B2 (en) | 2013-12-31 | 2018-03-13 | Ultravision Technologies, Llc | Modular display panel |
US10871932B2 (en) | 2013-12-31 | 2020-12-22 | Ultravision Technologies, Llc | Modular display panels |
US10540917B2 (en) | 2013-12-31 | 2020-01-21 | Ultravision Technologies, Llc | Modular display panel |
US9984603B1 (en) | 2013-12-31 | 2018-05-29 | Ultravision Technologies, Llc | Modular display panel |
US9416551B2 (en) | 2013-12-31 | 2016-08-16 | Ultravision Technologies, Llc | Preassembled display systems and methods of installation thereof |
US9372659B2 (en) | 2013-12-31 | 2016-06-21 | Ultravision Technologies, Llc | Modular multi-panel display system using integrated data and power cables |
US9940856B2 (en) | 2013-12-31 | 2018-04-10 | Ultravision Technologies, Llc | Preassembled display systems and methods of installation thereof |
US10248372B2 (en) | 2013-12-31 | 2019-04-02 | Ultravision Technologies, Llc | Modular display panels |
US10410552B2 (en) | 2013-12-31 | 2019-09-10 | Ultravision Technologies, Llc | Modular display panel |
US10373535B2 (en) | 2013-12-31 | 2019-08-06 | Ultravision Technologies, Llc | Modular display panel |
US10380925B2 (en) | 2013-12-31 | 2019-08-13 | Ultravision Technologies, Llc | Modular display panel |
US9311847B2 (en) * | 2014-07-16 | 2016-04-12 | Ultravision Technologies, Llc | Display system having monitoring circuit and methods thereof |
US10706770B2 (en) | 2014-07-16 | 2020-07-07 | Ultravision Technologies, Llc | Display system having module display panel with circuitry for bidirectional communication |
KR20200029571A (en) * | 2017-08-24 | 2020-03-18 | 래디언트 비전 시스템즈, 엘엘씨 | Methods and systems for measuring electronic visual displays using fractional pixels |
CN111066062A (en) * | 2017-08-24 | 2020-04-24 | 雷迪安特视觉系统有限公司 | Method and system for measuring electronic visual displays using fractional pixels |
WO2019040310A1 (en) * | 2017-08-24 | 2019-02-28 | Radiant Vision Systems, LLC | Methods and systems for measuring electronic visual displays using fractional pixels |
US10971044B2 (en) * | 2017-08-24 | 2021-04-06 | Radiant Vision Systems, LLC | Methods and systems for measuring electronic visual displays using fractional pixels |
KR102377250B1 (en) * | 2017-08-24 | 2022-03-22 | 래디언트 비전 시스템즈, 엘엘씨 | Methods and systems for measuring electronic visual displays using fractional pixels |
US10262605B2 (en) | 2017-09-08 | 2019-04-16 | Apple Inc. | Electronic display color accuracy compensation |
Also Published As
Publication number | Publication date |
---|---|
US20140347408A1 (en) | 2014-11-27 |
US9135851B2 (en) | 2015-09-15 |
WO2014159134A1 (en) | 2014-10-02 |
CN105247607A (en) | 2016-01-13 |
US8836797B1 (en) | 2014-09-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8836797B1 (en) | Methods and systems for measuring and correcting electronic visual displays | |
US7911485B2 (en) | Method and apparatus for visual display calibration system | |
US8465335B2 (en) | Color calibration system for a video display | |
WO2018016572A1 (en) | Display correction apparatus, program, and display correction system | |
US7907154B2 (en) | Method and apparatus for on-site calibration of visual displays | |
US9210418B2 (en) | Method and apparatus for calibrating multi-spectral sampling system | |
US8531381B2 (en) | Methods and systems for LED backlight white balance | |
CN105185302B (en) | Lamp position deviation modification method and its application between monochrome image | |
US20100207865A1 (en) | Systems and methods for display device backlight compensation | |
US9508281B2 (en) | Apparatus and method for image analysis and image display | |
CN105551431A (en) | LED display module uniformity correction method | |
TW201903743A (en) | Optical compensation apparatus applied to panel and operating method thereof | |
US20110267365A1 (en) | Methods and systems for correcting the appearance of images displayed on an electronic visual display | |
KR102377250B1 (en) | Methods and systems for measuring electronic visual displays using fractional pixels | |
US20100315429A1 (en) | Visual display measurement and calibration systems and associated methods | |
US20100079365A1 (en) | Methods and systems for LED backlight white balance | |
RU2008141365A (en) | METHOD AND DEVICE FOR MEASURING SURFACE QUALITY OF SUBSTRATE | |
Zhao et al. | Perceptual spatial uniformity assessment of projection displays with a calibrated camera | |
CN116777910B (en) | Display screen sub-pixel brightness extraction precision evaluation method and system and electronic equipment | |
US20240331185A1 (en) | Method of obtaining brightness information of display panel and related panel detection system | |
CN117672161A (en) | Brightness and chrominance compensation device and method for display | |
CN116413008A (en) | Display screen full gray-scale optical information acquisition method and device and display control equipment | |
Saha et al. | Characterization of mobile display systems for use in medical imaging |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: RADIANT ZEMAX, LLC, WASHINGTON Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RYKOWSKI, RONALD F.;REEL/FRAME:030388/0853 Effective date: 20130321 |
|
AS | Assignment |
Owner name: FIFTH THIRD BANK, COLORADO Free format text: SECURITY AGREEMENT;ASSIGNOR:RADIANT ZEMAX, LLC;REEL/FRAME:032264/0632 Effective date: 20140122 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
AS | Assignment |
Owner name: RADIANT VISION SYSTEMS, LLC, F/K/A RADIANT ZEMAX, Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:FIFTH THIRD BANK;REEL/FRAME:036242/0107 Effective date: 20150803 |
|
FEPP | Fee payment procedure |
Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551) Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |