WO2016044124A2 - Method and apparatus for real-time color correction in a normal display mode - Google Patents

Method and apparatus for real-time color correction in a normal display mode

Info

Publication number
WO2016044124A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
video image
projected
output
color correction
Prior art date
Application number
PCT/US2015/049896
Other languages
French (fr)
Other versions
WO2016044124A3 (en)
Inventor
Michael S. Deiss
Mark Francis Rumreich
Evan CHORNEY
Original Assignee
Thomson Licensing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Thomson Licensing filed Critical Thomson Licensing
Publication of WO2016044124A2 publication Critical patent/WO2016044124A2/en
Publication of WO2016044124A3 publication Critical patent/WO2016044124A3/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191Testing thereof
    • H04N9/3194Testing thereof including sensor feedback
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • H04N9/3182Colour adjustment, e.g. white balance, shading or gamut

Definitions

  • the present principles of the embodiments generally relate to an apparatus and a method for projecting a signal, such as, e.g., a video signal, onto an external screen to display a projected video image.
  • the present invention receives an external video signal and, dynamically in real time, provides a color correction of the output video image while the projector is in a normal operating/display mode, without having to use a pre-calibration procedure (i.e., a calibration procedure before a user-provided signal is displayed) and/or a pre-determined calibration image.
  • a projector is typically connected to an external device such as, e.g., a PC, a video receiver, a tablet, a cellphone, etc., and projects an image produced by a display element onto a projection screen.
  • a projector produces a projected image using a light emitted from a light source or a light engine in combination with an optical or lens system/module.
  • there are various types of projecting systems, including, e.g., LCD (Liquid Crystal Display), DLP (Digital Light Processing), LCOS (Liquid Crystal on Silicon), LED (Light Emitting Diode), and Laser (i.e., projectors with a laser light source).
  • different projectors have different native resolutions (i.e., the number of physical pixels in the projector's display).
  • a common resolution is SVGA which is 800 by 600 pixels.
  • color compensation can be determined via a pre-calibration process using a test pattern; the determined color compensation settings are then stored for use by the projectors for that calibrated period of usage.
  • for handheld projectors, a user frequently changes the projection surface and/or the lighting environment due to their portability.
  • the present inventors recognize the need to use dynamic color compensation that continuously monitors the projected user signal in real time as the basis for color compensation.
  • Epson PowerLite 835P states in its advertisement that it has a sensor which measures the conditions of the projection surface and automatically adjusts the colors and shading of the image.
  • the present inventors recognize the need to improve the existing systems and methods for providing a color correction function for a projection device.
  • an apparatus comprising:
  • a processor for processing the video signal to provide an output video image; wherein the output video image is projected to produce a projected video image;
  • the processor in the normal operating mode compares the output video image with the sensed projected video image and provides a color correction of the output video image based on the comparison.
  • a method comprising: receiving a video signal from an external source in a normal operating mode;
  • a computer program product stored in a non-transitory computer-readable storage media comprising computer-executable instructions for:
  • FIG. 1 shows an example system according to the principles of the present invention
  • FIG. 2 shows an exemplary process according to the principles of the present invention.
  • FIG. 3 shows an exemplary image having a plurality of image pixels to which the present principles can be applied.
  • FIG. 1 is an exemplary apparatus according to the principles of the present invention.
  • FIG. 1 represents a functional block diagram of a projection display device according to an embodiment of the present invention.
  • the projection display device 100 of FIG. 1 comprises a signal I/O interface 120 for connecting to an input device such as, e.g., a PC, a tablet, a phone, etc., and for receiving a user-provided signal such as a video or a picture image.
  • connection interface 120 is, e.g., an HDMI, a USB, or a VGA connection.
  • although the projector 100 illustrated in FIG. 1 is a standalone, externally-provided projector, one skilled in the art can readily recognize that device 100 may also be built in and become part of an end user device, if the components chosen are small enough. Such an implementation would be envisioned and covered by the scope of the present invention.
  • Projector 100 also comprises an image/video processor 150 for processing a signal received by signal I/O (Input/Output) interface 120.
  • a controller/processor 160 interfaces with the various components of projector 100 and controls their respective functions.
  • video processor 150 and projector controller 160 may be integrated and combined into one processor unit 188.
  • Processor 188 also comprises a memory 155.
  • Memory 155 may represent different types of memories as needed according to the principles of the present invention.
  • memory 155 may represent random access memories (RAM) for processing computer instructions, video memories for storing one or more video frames for processing, and non-volatile, non-transitory memories such as flash memories and/or hard drives for storing files and/or data structures, etc.
  • a video image signal from the output of video processor 150 is input to a video projection module 170.
  • As described above, there are various types of projecting systems, each requiring its respective video projection module, including, e.g., LCD (Liquid Crystal Display), DLP (Digital Light Processing), LCOS (Liquid Crystal on Silicon), LED (Light Emitting Diode), and Laser.
  • Each type of video projection module would also include the appropriate type of light source/light engine and/or imager for the corresponding type of projection system.
  • the video image output from the video projection module 170 is passed through a preselected configuration of lenses or optical module 185 to output a projected image on a projection surface or screen 190.
  • an image sensor 180 is provided to detect and sense the projected video image on a projection screen 190 through lens/optical module 185.
  • the image sensor 180 may be, e.g., a high-resolution 2-dimensional color image sensor.
  • although the detail of the lens/optical module 185 is not shown in FIG. 1, one skilled in the art would readily recognize that the optical path of the image sensed by video image sensor 180 may be implemented with a separate and independent set of lenses, or the same set of lenses used for projection may also be used to capture the projected image via a beam splitter.
  • projector 100 comprises a user I/O interface 130 for receiving signals from a user control device 140 such as, e.g., a remote control or a keypad on projector 100.
  • a user of projector 100 may utilize the user control device 140 to select different modes and features of the projector, as described in detail below.
  • the projection display apparatus 100 may include other elements (e.g., a power supply, etc.) in addition to the above-described exemplary elements.
  • no detailed description will be given of these additional elements, for simplicity of description, and because the additional elements have no direct relation with the present invention, and/or the additional elements are known in the art.
  • each of the above-described elements may be combined with another element to form one element, or may be divided into two or more elements.
  • FIG. 2 is a flow chart of an exemplary process 200 according to principles of the present invention.
  • the exemplary process may be implemented as computer-executable instructions which may be executed by, e.g., a processor 160 or 188 in device 100 of FIG. 1.
  • a computer program product having the computer-executable instructions may be stored in non-transitory computer-readable storage media of the device 100.
  • the exemplary control program 200 shown in FIG. 2, when executed, facilitates processing of a projected signal, such as, e.g., a video signal, according to the principles of the present invention, as described in detail below.
  • FIG. 2 may also be implemented using a combination of hardware and software (e.g., a firmware implementation), and/or executed using logic arrays or ASIC.
  • at step 210, it is determined whether a user has selected a pre-calibration mode (i.e., a calibration mode before a user-provided signal is displayed).
  • a user may select different modes and/or features of an exemplary projector 100 using a user control device 140 of FIG. 1 such as, e.g., a remote control or a keypad on projector 100.
  • at step 220, if the determination at step 210 is "yes", then projector 100 enters the pre-calibration mode. In this mode, the projector is not in a normal operating or display mode.
  • projector 100 is not processing an input signal such as, e.g., a movie or a television video from an input device 110 of FIG. 1.
  • in a pre-calibration mode, projector 100 calibrates projection colors with a pre-determined and pre-stored calibration image (as described before in connection with existing projection systems).
  • this may be done in a so-called service mode, which is typically (but not necessarily) performed by a service technician, and/or in a factory or a service shop.
  • a user of projector 100 may select and enter a normal operating/display mode of projector 100.
  • projector 100 of FIG. 1 receives an external video signal (such as, e.g., a movie or a television program) in this mode and, dynamically in real time, provides a color correction of the output video image while the external video signal is being projected in a normal operating/display mode, without having to rely on a pre-calibration procedure and/or use a pre-determined calibration image.
  • processor 188 in FIG. 1 processes the external video signal received from an external device 110 of FIG. 1 at step 240 to provide an output video image.
  • this output video image is coupled to, e.g., video projection module 170 and to lens/optical module 185 of FIG. 1 to produce a projected video image to be shown, e.g., on a projection surface or screen 190 of FIG. 1.
  • the dynamic real time color correction feature is activated and applied in the normal operating/display mode of projector 100.
  • the projected video image on the projection surface or screen 190 is sensed using an image sensor 180, through a lens/optical module 185.
  • at step 290, processor 188 compares the output video image from step 250 with the sensed projected video image from step 280.
  • This comparison step 290 may include pixel and/or image alignment steps to be described in detail below.
  • a color correction of the output video image is provided based on the comparison at step 290.
  • the present invention provides a dynamic real-time color correction to a normal display signal without using a pre-determined calibration image and/or a pre-calibration mode. More details of color correction embodiments according to the principles of the present invention are described below.
  • the dynamic real-time color correction system and method utilizes an image processing system that can compare the video signal sent to an image transducer (e.g., video projection module 170 of FIG. 1) with the video signal sensed and created (e.g., from an image sensor 180 of FIG. 1).
  • This comparison step may be two-dimensional with a transform value determined for each pixel projected.
  • the transform need not be recomputed at the frame rate of the video signal, but at a rate consistent with the necessary rate of adaptation to a changing environment.
  • a projected image is composed of a 2-dimensional array of pixels whose values are updated at least once per frame.
  • a common resolution for a projector is SVGA which is 800 by 600 pixels.
  • an intensity value for each color primary is computed.
  • colors may be rendered at each pixel by a time sequential presentation of the color primaries at a fractional frame rate, or by co-located sub-pixels, presenting each primary at the same time.
  • each color primary can be expressed at each pixel location in a two dimensional array (x,y) of pixels, forming the complete picture P(x,y), which can be interpreted as R(x,y), G(x,y), B(x,y).
  • An additional image transform (T) is proposed that can alter the value of intensity of each of the primaries expressed at each pixel location (x,y) of the projected image.
  • the transform will produce a new output P'(x,y), composed of R'(x,y), G'(x,y), and B'(x,y).
  • R'(x,y) = R(x,y) * Mr,x,y + Kr,x,y for the RED primary
  • G'(x,y) = G(x,y) * Mg,x,y + Kg,x,y for the GREEN primary
  • B'(x,y) = B(x,y) * Mb,x,y + Kb,x,y for the BLUE primary
  • the transform T includes an array of gain factors M and offsets K for each pixel value (x,y) of the picture P.
  • the array of gains and offsets, although static in value until the next adaptation-system update, is applied in real time to the picture P to produce a transformed output P' for each frame as the frame becomes available for display.
  • the array of gains and offsets is designed to be alterable by the adaptation system to compensate for changing display conditions, such as a change in extraneous room lighting or a change in projection surface colors, as may exist when the projector is moved to a different location.
  • the rate of adaptation (of the gains and offsets) in the transform can be a product design option made with regard to cost/performance tradeoffs.
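As a concrete sketch of the per-pixel transform T described above, the following minimal NumPy fragment applies an array of gains M and offsets K to one primary of a picture P. The array names, the 8-bit value range, and the clipping behavior are assumptions of this sketch, not details taken from the patent:

```python
import numpy as np

def apply_transform(primary, gain, offset):
    """Apply the per-pixel transform P'(x,y) = P(x,y) * M(x,y) + K(x,y)
    for one color primary, clipping to an assumed 8-bit displayable range."""
    out = primary.astype(np.float64) * gain + offset
    return np.clip(out, 0, 255).astype(np.uint8)

# Example: an SVGA-sized (600 rows x 800 columns) red plane with
# identity gains and zero offsets (i.e., no correction yet).
R = np.full((600, 800), 128, dtype=np.uint8)
Mr = np.ones((600, 800))           # gain factors, one per pixel
Kr = np.zeros((600, 800))          # offsets, one per pixel
Rp = apply_transform(R, Mr, Kr)    # identity transform leaves R unchanged
```

The gains and offsets stay fixed between adaptation updates, so this per-frame step is a cheap elementwise multiply-add.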
  • the brightness value of each projected pixel is compared with its counterpart from the image sample captured by, e.g., a 2D image sensor 180 of FIG. 1, which provides an error value.
  • Successive measures can be made over a number of frames and these successive measures can, in turn, be used to compute a line equation with at least two sample points of intensity at each pixel position to form gain and offset values for the transform T.
  • the line equation can be used to interpolate or extrapolate to intensity values not yet measured.
  • Successive measurements of intensity can be used to adjust gain and offset values for a "best fit" for measured values.
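A "best fit" of gain and offset from successive intensity samples at one pixel position might look like the following sketch (the sample values are hypothetical, and the least-squares fit via `np.polyfit` is one possible implementation choice):

```python
import numpy as np

def fit_gain_offset(projected, measured):
    """Least-squares line fit: measured ~= gain * projected + offset,
    over successive frame samples at one pixel position."""
    gain, offset = np.polyfit(projected, measured, deg=1)
    return gain, offset

# Hypothetical samples at one pixel over several frames; here the surface
# reflects about 0.8x of the projected intensity with a small offset.
proj = np.array([40.0, 90.0, 160.0, 220.0])
meas = np.array([30.0, 70.0, 126.0, 174.0])
gain, offset = fit_gain_offset(proj, meas)

# The fitted line interpolates/extrapolates to intensities not yet measured:
predicted_at_128 = gain * 128.0 + offset
```

With more than two samples the fit averages out measurement noise, matching the "best fit for measured values" idea above.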
  • any picture controls, such as brightness or contrast, will have to be asserted before the comparison to the captured image; otherwise, the feedback system will merely negate the picture controls.
  • the line equations of the transform may not be satisfiable for all values, since a projector cannot remove light from a projection screen nor output light beyond its maximum capability.
  • brightness and contrast controls can be applied manually or automatically until the extreme conditions of too black or too bright for each primary are eliminated. Note that these conditions can also be affected by adjustments of color saturation and other picture controls.
  • the brightness value of each projected pixel is compared with its counterpart from the image sample captured by, e.g., 2D color image sensor 180 of FIG. 1.
  • Successive measures can be made over a number of frames and these successive measures can, in turn, be used to compute an "error value" ε for each intensity value for each primary color and at each pixel location.
  • the error is simply the projected intensity vs. the captured intensity:
  • R'(x,y) = R(x,y) + εr[R(x,y)][x][y] for the RED primary
  • G'(x,y) = G(x,y) + εg[G(x,y)][x][y] for the GREEN primary
  • B'(x,y) = B(x,y) + εb[B(x,y)][x][y] for the BLUE primary
  • extrapolations and interpolations from neighboring pixel values can be used to assign an initial value until an actual sample can be made. Before any samples are made, the tables can be initialized to values assuming no error.
  • a laser projector using a 2-D scanning spot beam for image projection presents a unique opportunity for greatly simplifying the overall complexity of color compensation.
  • as the beam illuminates each spot on a projection surface (e.g., 190 of FIG. 1), light is uniquely reflected from that spot, in the case of a perfect projection environment.
  • a single one-dimensional sensor, as represented, e.g., by image sensor 180 of FIG. 1, with a "look angle" of the entire projected image, can capture the intensity of the reflection and adjust beam power in real time to meet the needs of proper color rendering.
  • a measure of any extraneous background lighting may also be captured (e.g., using image sensor 180 of FIG. 1).
  • Exemplary correction methods 1 & 2 as outlined above may require alignment of the projected image (by, e.g., elements 150, 170 and 185 of FIG. 1) and the sensed and captured image (by, e.g., image sensor 180 of FIG. 1). Stated another way, a best correspondence is to be found between a given pixel of the projected image and that of the captured image for the purpose of determining the brightness error between these pixels. This correspondence can be found in a number of ways, as described below.
  • the 2D image sensor determines a maximum extent of the projected image, left to right and top to bottom. Over a span of time, the edges of the projected image will become detectable as enough pixels on the edges of the picture will eventually take on a brightness value that can be detected by the image sensor.
  • the edges of the projected image can be precisely determined by extrapolating virtual lines through detected pixels at the maximum measured extent of the picture. Intersecting lines denote the corners of the projected image. A correspondence among pixels of the projected image and captured image can now be determined through interpolation of position values.
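The corner-finding step just described, intersecting extrapolated edge lines, can be sketched as follows (plain Python; representing each edge line by two points on it is an assumption of this sketch):

```python
def intersect(p1, p2, p3, p4):
    """Intersection of the line through p1, p2 with the line through p3, p4.
    Edge lines are extrapolated through detected boundary pixels; their
    intersections give the corners of the projected image."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None                      # parallel lines: no corner
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    x = (a * (x3 - x4) - (x1 - x2) * b) / denom
    y = (a * (y3 - y4) - (y1 - y2) * b) / denom
    return (x, y)

# Hypothetical edges: top edge through (0, 10)-(100, 10), left edge
# through (5, 0)-(5, 80); their intersection is the top-left corner.
corner = intersect((0, 10), (100, 10), (5, 0), (5, 80))
```

Once all four corners are known, pixel correspondence inside the image follows by interpolation of position values, as stated above.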
  • a spatial convolution kernel can be created (based on maximum alignment errors, which is a fixed system parameter) that can relate multiple captured pixels to a given projected pixel. Once the alignment parameters are determined, they may not need further adjustment unless the optics change. This could, e.g., be done as a one-time factory adjustment.
  • This method utilizes tools found in motion estimation techniques for video encoders. Corresponding matching regions between the projected image and the captured image determine pixel correspondence.
  • a spatial convolution kernel is created in which a given region is scanned for correspondence between projected image and captured image.
  • this method sets a given pixel to black in the projected image and uses the spatial convolution kernel in the expected region of the captured image to search for that black pixel.
  • a sort of sensitivity analysis is performed which finds the best spatial fit for the kernel. Pixel locations selected for analysis may be pseudo-randomly determined to prevent detection by the human eye.
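The black-pixel probe search might be sketched as below (the region bounds, the darkness threshold, and the pseudo-random probe selection shown here are illustrative assumptions, not parameters from the patent):

```python
import random

def find_black_probe(captured, region, threshold=10):
    """Scan the expected region of the captured image for the darkest
    pixel; if it is dark enough, take it to be the probe pixel that was
    set to black in the projected image."""
    r0, r1, c0, c1 = region
    best_val, best_pos = None, None
    for r in range(r0, r1):
        for c in range(c0, c1):
            if best_val is None or captured[r][c] < best_val:
                best_val, best_pos = captured[r][c], (r, c)
    return best_pos if best_val is not None and best_val <= threshold else None

# The probe location may be chosen pseudo-randomly to avoid visible patterns:
rng = random.Random(42)
probe_row, probe_col = rng.randrange(2, 6), rng.randrange(2, 6)
captured = [[100] * 8 for _ in range(8)]
captured[probe_row][probe_col] = 0     # the pixel projected as black
found = find_black_probe(captured, (0, 8, 0, 8))
```

Comparing `found` with the known projected probe position yields the spatial offset between projected and captured coordinates, i.e., the best spatial fit for the kernel.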
  • the present principles also can be used to solve the registration error (i.e., image alignment problem) in a dynamic color-compensation system.
  • Registration errors may arise in a number of ways. One source is the fact that the projector lens is not co-located with the camera lens, and the distance from the projector to the wall is not fixed. Without registration errors, it would be possible to directly compare each projected pixel with its measured counterpart at the output of the image processor 150. This would be an ideal situation and would provide accurate and comprehensive data for the color compensation algorithm.
  • the present inventors recognize that one way to deal with registration errors would be to spatially translate the measured image to accurately line up with the projected image, but this is a costly and power-consuming process. Instead, the present invention uses a very simple alternative.
  • the present invention uses a weighted sum of color pixel values in a selected area of an output or projected image as the basis for colorimetric comparison.
  • a weighting algorithm according to the principles of the present invention would give, for example, unity weight for a central portion of the image, gradual diminishing weight near the periphery of the image (to reduce the adverse effects of registration errors), and zero weight outside the image area. Other weighting schemes could also be used according to the principles of the present invention.
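Such a weight mask, with unity weight in the central portion and gradually diminishing weight near the periphery, could be built as in the following sketch (the linear taper and the 20% border width are illustrative choices, not values from the patent; weight outside the image area is implicitly zero since the mask covers only the image):

```python
import numpy as np

def make_weight_mask(h, w, taper=0.2):
    """Weight mask for colorimetric comparison: unity weight centrally,
    linearly diminishing weight over the outer `taper` fraction of the
    image in each dimension."""
    def ramp(n):
        edge = max(int(n * taper), 1)
        r = np.ones(n)
        up = np.linspace(0.0, 1.0, edge, endpoint=False)
        r[:edge] = up          # fade in from the left/top border
        r[-edge:] = up[::-1]   # fade out toward the right/bottom border
        return r
    # Separable mask: product of a vertical and a horizontal ramp.
    return np.outer(ramp(h), ramp(w))

mask = make_weight_mask(600, 800)
# Center pixels carry full weight; border pixels fade toward zero,
# reducing the influence of registration errors near the periphery.
```

The separable ramp is one simple realization; any monotone taper would serve the same purpose of de-emphasizing misregistered border pixels.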
  • FIG. 3 shows an exemplary image 300 having a plurality of image pixels.
  • different projectors have different native resolutions.
  • a common resolution is SVGA which would have 800 by 600 pixels per image or frame.
  • Each image pixel in an image in a projection system is typically represented by a red component data value, a green component data value and a blue component data value.
  • a pixel 301 in image 300 of FIG. 3 would have its corresponding R (Red), G (Green), B (Blue) pixel values.
  • the determination of the weighted sum of an image will be done on a color component basis for each of the R, G, B components of each pixel, with more emphasis given to the pixels near the center of the image. For example, as shown in FIG. 3, when calculating a weighted sum of a color component of image 300, more weight may be given to pixels 301 to 304 located in center region 310 of image 300 (e.g., using a unity factor or weighting of 1).
  • for pixels in region 320, a factor of less than 1 may be allocated (e.g., 0.8).
  • for pixels in region 330, a factor smaller than the factor used for region 320 is used (e.g., 0.5). Similarly, pixels in area 340 would be allocated even less weight than pixels in area 330 (e.g., 0.3).
  • a weighted sum calculation formula for the red component of image 300 in FIG. 3 may be: Weighted Sum (R) = Σx Σy W(x,y) × R(x,y), where W(x,y) is the weight (e.g., 1, 0.8, 0.5, or 0.3) of the region containing pixel (x,y).
  • the weighted sum for the green and the blue components of exemplary image 300 may be calculated similarly.
  • Other weighting schemes could also be used according to the principles of the present invention. These values could be determined for each frame of video, then used by the color compensation algorithm.
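As an illustration of the weighted-sum computation on one color component, the following sketch uses the factors 1, 0.8, 0.5 and 0.3 mentioned above; the concentric region boundaries on the small 8x8 example are assumptions made for the sketch:

```python
import numpy as np

def weighted_sum(component, weights):
    """Weighted sum of one color component (R, G, or B) of an image,
    emphasizing pixels near the center as described above."""
    return float(np.sum(component.astype(np.float64) * weights))

# Illustrative 8x8 image with concentric region weights 0.3 / 0.5 / 0.8 / 1:
weights = np.full((8, 8), 0.3)   # outermost area
weights[1:7, 1:7] = 0.5          # next ring inward
weights[2:6, 2:6] = 0.8          # next ring inward
weights[3:5, 3:5] = 1.0          # central region, unity weight

R = np.full((8, 8), 100, dtype=np.uint8)   # a flat red component
ws_red = weighted_sum(R, weights)
# The same call, with the G and B planes, gives the other two sums.
```

The same three sums are computed for the sensed projected image, giving per-component scalars that the comparison step can use despite registration errors.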
  • an enhancement to the single-image approach would be to divide an image picture into multiple zones, for example an 8x8 grid. Each zone would produce a weighted sum, with gradually diminishing weight near the periphery of each zone. This would add some complexity but would offer the advantage of more useful data points.
  • processor 188 determines a weighted sum of the sensed projected image, e.g., using the same weighted-sum calculation as for the weighted sum of the output image, as described in detail above in connection with step 285.
  • processor 188 compares the determined weighted sum of the output image with the determined weighted sum of the sensed projected image.
  • a color correction of the output video image is provided based on the comparison at step 290. That is, projector 100 would adjust the output of image processor 150 so that the determined weighted sum of the output image at the output of the image processor is substantially the same as, or converges with, the determined weighted sum of the sensed projected image.
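One way to make the weighted sums converge is a simple proportional feedback on a correction gain, as in this sketch (the update rule, the rate constant, and the reflectance model in the example are illustrative assumptions, not the patent's specified method):

```python
def update_gain(gain, out_ws, sensed_ws, rate=0.1):
    """One adaptation step: nudge the correction gain so the weighted sum
    of the sensed projected image converges toward that of the output
    image. Proportional update; `rate` trades speed for stability."""
    if sensed_ws == 0:
        return gain
    return gain * (1.0 + rate * (out_ws - sensed_ws) / out_ws)

# If the sensed image is dimmer than the output (e.g., a dark wall),
# the gain rises; if brighter, it falls. Simulated wall reflecting 80%:
g = 1.0
for _ in range(200):
    sensed = 0.8 * 3200.0 * g       # sensed weighted sum under gain g
    g = update_gain(g, 3200.0, sensed)
# g converges near 1 / 0.8 = 1.25, cancelling the wall's attenuation.
```

The slow, iterated update matches the text's point that adaptation need not run at the frame rate, only fast enough to track changing conditions.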
  • the dynamic real-time color correction system and method utilizes an image processing system that can efficiently compare, e.g., the video signal sent to an image transducer (e.g., video projection module 170 of FIG. 1) with the video signal sensed and created (e.g., from an image sensor 180 of FIG. 1) based on weighted sums of the respective color components of these signals.
  • a reference black image may be provided and projected to provide an enhanced color correction function.
  • additional useful data points of a reference black level may be provided and detected by one or more of the following exemplary ways:
  • a) Expands the detection of light for a reference black level outside the area of the projected image (e.g., in area 380 of image 300 shown in FIG. 3).
  • b) Detects a reference black level within the area of projected frames, established within the time period between two successive frames of a sequence.
  • c) Projects an area of black, for reference, within a frame during a single projected frame.
  • the area of black may encompass a portion of a projected frame or the entire frame area. A short projected flash of black is unlikely to be noticed by viewers if it occurs randomly and infrequently.
  • d) Detects a reference black level before the projector begins rendering images for display; in this case, the black level is not adaptable to changing conditions after the projector begins rendering images for display.
  • e) Detects a reference black level, when a single black frame is projected opportunistically, at a time dependent upon the physical movement of the projector.
  • An accelerometer can detect movement of the projector and conditionally project a black frame that can be used for black level detection.
  • f) Detects a reference black level, when a single black frame is projected opportunistically, at a time dependent upon selecting a new video source, modifying some display parameter (e.g. as in a user video adjustment or auto aspect ratio adjustment), or during the rendering of an updated OSD or menu display.

Abstract

A video signal is projected onto an external screen to display a projected video image. In one exemplary embodiment, the present invention receives an external video signal and, dynamically in real time, provides a color correction of the output video image while the projector is in a normal operating/display mode, without having to use a pre-calibration procedure and/or a pre-determined calibration image. A color correction of an output image is based on a weighted sum of a plurality of selected image pixels. The projector performs the color correction dynamically in real time while a user signal is displayed in a normal operating/display mode, without having to use a pre-calibration procedure (i.e., a calibration procedure before a user-provided signal is displayed) and/or a pre-determined calibration image.

Description

METHOD AND APPARATUS FOR REAL-TIME COLOR CORRECTION IN A
NORMAL DISPLAY MODE
BACKGROUND OF THE INVENTION
Field of the Invention
The present principles of the embodiments generally relate to an apparatus and a method for projecting a signal, such as, e.g., a video signal, onto an external screen to display a projected video image. In one exemplary embodiment, the present invention receives an external video signal and, dynamically in real time, provides a color correction of the output video image while the projector is in a normal operating/display mode, without having to use a pre-calibration procedure (i.e., a calibration procedure before a user-provided signal is displayed) and/or a pre-determined calibration image.
Background Information
Various projectors or projection systems are well known in the art. Generally, a projector is connected to an external device such as, e.g., a PC, a video receiver, a tablet, a cellphone, etc., and projects an image produced by a display element onto a projection screen. A projector produces a projected image using light emitted from a light source or a light engine in combination with an optical or lens system/module. There are various types of projecting systems, including, e.g., LCD (Liquid Crystal Display), DLP (Digital Light Processing), LCOS (Liquid Crystal on Silicon), LED (Light Emitting Diode), and Laser (i.e., projectors with a laser light source). In addition, different projectors have different native resolutions (i.e., the number of physical pixels in the projector's display). A common resolution is SVGA, which is 800 by 600 pixels.
For traditional fixed-location projectors, color compensation can be determined via a pre-calibration process using a test pattern; the determined color compensation settings are then stored for use by the projectors for that calibrated period of usage. But for handheld projectors, a user frequently changes the projection surface and/or the lighting environment due to their portability. For such cases, the present inventors recognize the need for dynamic color compensation that continuously monitors the projected user signal in real time as the basis for color compensation.
As an example, a currently available projector, the Epson PowerLite 835P, states in its advertisement that it has a sensor which measures the conditions of the projection surface and automatically adjusts the colors and shading of the image. However, it is believed that the existing projectors (including the Epson PowerLite 835P) require such a color correction function to be performed in a pre-calibration mode (i.e., a calibration mode before a user-provided signal is displayed) and performed by having a pre-determined calibration image which is not a user-provided image. Therefore, the existing systems do not perform color correction dynamically in real time when a user-provided signal, such as, e.g., a movie, a television show, or a picture image, is being projected in the normal operating/display mode.
SUMMARY OF THE INVENTION
The present inventors recognize the need to improve the existing systems and methods for providing a color correction function for a projection device.
In accordance with an aspect of the present invention, an apparatus is presented, comprising:
an interface for receiving a video signal from an external source;
a processor for processing the video signal to provide an output video image; wherein the output video image is projected to produce a projected video image;
a sensor for sensing the projected video image in a normal operating mode; and
wherein the processor in the normal operating mode compares the output video image with the sensed projected video image and provides a color correction of the output video image based on the comparison.
In another exemplary embodiment, a method is presented comprising: receiving a video signal from an external source in a normal operating mode;
processing the video signal to provide an output video image;
projecting the output video image to produce a projected video image;
sensing the projected video image;
comparing the output video image with the sensed projected video image; and
providing a color correction of the output video image based on the comparison step in the normal operating mode.
In accordance with principles of the present invention, a computer program product stored in a non-transitory computer-readable storage media is presented, comprising computer-executable instructions for:
receiving a video signal from an external source in a normal operating mode;
processing the video signal to provide an output video image;
projecting the output video image to produce a projected video image;
sensing the projected video image;
comparing the output video image with the sensed projected video image; and
providing a color correction of the output video image based on the comparison step in the normal operating mode.
DETAILED DESCRIPTION OF THE DRAWINGS
The above-mentioned and other features and advantages of this invention, and the manner of attaining them, will become more apparent and the invention will be better understood by reference to the following description of embodiments of the invention taken in conjunction with the accompanying drawings, wherein:
FIG. 1 shows an example system according to the principles of the present invention;
FIG. 2 shows an exemplary process according to the principles of the present invention; and
FIG. 3 shows an exemplary image having a plurality of image pixels to which the present principles can be applied.
The examples set out herein illustrate exemplary embodiments of the invention. Such examples are not to be construed as limiting the scope of the invention in any manner.
DETAILED DESCRIPTION
FIG. 1 is an exemplary apparatus according to the principles of the present invention. FIG. 1 represents a functional block diagram of a projection display device according to an embodiment of the present invention.
The projection display device 100 of FIG. 1 according to the present invention comprises a signal I/O interface 120 for connecting to an input device such as, e.g., a PC, a tablet, or a phone, and for receiving a user-provided signal such as a video or a picture image. An example of such a connection interface 120 is an HDMI, a USB or a VGA connection. Although the projector 100 illustrated in FIG. 1 is a standalone, externally-provided projector, one skilled in the art can readily recognize that device 100 may also be built in and become part of an end user device, if the components chosen are small enough. Such an implementation is envisioned and covered by the scope of the present invention.
Projector 100 also comprises an image/video processor 150 for processing a signal received by signal I/O (Input/Output) interface 120. In addition, a controller/processor 160 interfaces with the various components of projector 100 and controls their respective functions. In one exemplary embodiment according to the principles of the present invention, video processor 150 and projector controller 160 may be integrated and combined into one processor unit 188. Processor 188 also comprises a memory 155. Memory 155 may represent different types of memories as needed according to the principles of the present invention. For example, memory 155 may represent random access memories (RAM) for processing computer instructions, video memories for storing one or more video frames for processing, and non-volatile, non-transitory memories such as flash memories and/or hard drives for storing files and/or data structures.
A video image signal from the output of video processor 150 is input to a video projection module 170. As described above, there are various types of projection systems, each requiring its respective video projection module, including, e.g., LCD (Liquid Crystal Display), DLP (Digital Light Processing), LCOS (Liquid Crystal on Silicon), LED (Light Emitting Diode), and laser. Each type of video projection module would also include the appropriate type of light source/light engine and/or imager for the corresponding type of projection system. Also, as is well known in the art, the video image output from the video projection module 170 is passed through a preselected configuration of lenses or optical module 185 to output a projected image on a projection surface or screen 190.
In one exemplary aspect according to the principles of the present invention, an image sensor 180 is provided to detect and sense the projected video image on a projection screen 190 through lens/optical module 185. In one exemplary embodiment, the image sensor 180 may be, e.g., a high resolution 2-dimensional color image sensor. In addition, although the detail of the lens/optical module 185 is not shown in FIG. 1, one skilled in the art would readily recognize that the optical path of the image sensed by video image sensor 180 may be implemented with a separate and independent set of lenses, or the same set of lenses used for projection may also be used to capture the projected image via a beam splitter.
Furthermore, projector 100 comprises a user I/O interface 130 for receiving signals from a user control device 140 such as, e.g., a remote control or a keypad on projector 100. A user of projector 100 may utilize the user control device 140 to select different modes and features of the projector, as described in detail below.
The projection display apparatus 100 according to the present invention may include other elements (e.g., a power supply) in addition to the above-described exemplary elements. However, no detailed description will be given of these additional elements, for simplicity of description, because the additional elements have no direct relation to the present invention, and/or because they are known in the art. Also, it should be noted that each of the above-described elements may be combined with another element to form one element, or may be divided into two or more elements.
FIG. 2 is a flow chart of an exemplary process 200 according to principles of the present invention. In one embodiment, the exemplary process may be implemented as computer-executable instructions which may be executed by, e.g., processor 160 or 188 in device 100 of FIG. 1. For example, a computer program product having the computer-executable instructions may be stored in non-transitory computer-readable storage media of device 100. The exemplary control program 200 shown in FIG. 2, when executed, facilitates processing of a projected signal such as, e.g., a video signal according to the principles of the present invention, to be described in detail below. One skilled in the art can readily recognize that the exemplary process shown in FIG. 2 may also be implemented using a combination of hardware and software (e.g., a firmware implementation), and/or executed using logic arrays or an ASIC.
At step 210 of FIG. 2, a determination is made as to whether a user of an exemplary projector 100 shown in FIG. 1 has selected a pre-calibration mode (i.e., a calibration mode before a user-provided signal is displayed). As described above, a user may select different modes and/or features of an exemplary projector 100 using a user control device 140 of FIG. 1 such as, e.g., a remote control or a keypad on projector 100. At step 220, if the determination at step 210 is "yes", then projector 100 enters the pre-calibration mode. In this mode, the projector is not in a normal operating or display mode. That is, projector 100 is not processing an input signal such as, e.g., a movie or a television video from an input device 110 of FIG. 1. Instead, in the pre-calibration mode, projector 100 calibrates projection colors with a pre-determined and pre-stored calibration image (as described before in connection with existing projection systems). One skilled in the art may also readily recognize that some projectors may incorporate this pre-calibration mode in a so-called service mode which is meant to be performed (but not necessarily) by a service technician, and/or in a factory or a service shop.
At step 230, a user of projector 100 may select and enter a normal operating/display mode of projector 100. At step 240, projector 100 of FIG. 1 receives an external video signal (such as, e.g., a movie or a television program) in this mode and dynamically, in real time, provides a color correction of the output video image while the external video signal is being projected in the normal operating/display mode, without having to rely on a pre-calibration procedure and/or use a pre-determined calibration image.
At step 250, processor 188 of FIG. 1 processes the external video signal received from an external device 110 of FIG. 1 at step 240 to provide an output video image. At step 260, this output video image is coupled to, e.g., video projection module 170 and to lens/optical module 185 of FIG. 1 to produce a projected video image to be shown, e.g., on a projection surface or screen 190 of FIG. 1.
At step 270, a determination is made as to whether a user of projector 100 has selected a dynamic real time color correction feature in the normal operating/display mode according to the principles of the present invention. Again, a user may select this feature using a user control device 140 of FIG. 1 such as, e.g., a remote control or a keypad on projector 100. If the determination is "no" at step 270, then the projector processes the external input video signal with no dynamic real time color correction applied, as shown at step 260.
If, on the other hand, the determination is "yes" at step 270, then the dynamic real time color correction feature is activated and applied in the normal operating/display mode of projector 100. At step 280, the projected video image on a projection surface or screen 190 is sensed using an image sensor 180, through a lens/optical module 185.
At step 290, processor 188 compares the output video image at step 250 with the sensed projected video image at step 280. This comparison step 290 may include pixel and/or image alignment steps to be described in detail below.
At step 295, a color correction of the output video image is provided based on the comparison at step 290. As noted above, the present invention provides a dynamic real time color correction to a normal display signal without using a predetermined calibration image and/or a pre-calibration mode. More details of color correction embodiments according to the principles of the present invention are described below.
The dynamic real time color correction system and method according to the principles of the present invention utilizes an image processing system that can compare the video signal sent to an image transducer (e.g., video projection module 170 of FIG. 1) with the video signal sensed and created (e.g., from an image sensor 180 of FIG. 1). One exemplary embodiment of this comparison step (e.g., 290 of FIG. 2) may be two-dimensional, with a transform value determined for each pixel projected. Although the projected image is processed in real time through the transform, the transform need not be recomputed at the frame rate of the video signal, but at a rate consistent with the necessary rate of adaptation to a changing environment.
According to the principles of the present invention, a projected image is composed of a 2-dimensional array of pixels which are updated in value at least once per frame. As noted previously, one example of a common resolution for a projector is SVGA, which is 800 by 600 pixels. For each pixel of an imager or video projection module 170 in FIG. 1, an intensity value for each color primary is computed. Depending upon the construction of the light engine, colors may be rendered at each pixel by a time-sequential presentation of the color primaries at a fractional frame rate, or by co-located sub-pixels presenting each primary at the same time. In either case, and for the purpose of this invention, each color primary can be expressed at each pixel location in a two-dimensional array (x,y) of pixels, forming the complete picture P(x,y), which can be interpreted as R(x,y), G(x,y), B(x,y). For example, for an SVGA (800 by 600 pixels) projector, x = 800 and y = 600.
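The per-primary picture representation above can be sketched in a few lines of Python. This is an illustrative sketch, not language from the patent; the reduced 4x3 size (instead of SVGA's 800x600) and all names are chosen for brevity.

```python
# Picture P(x, y) held as three per-primary 2-D intensity planes.
WIDTH, HEIGHT = 4, 3  # would be 800, 600 for an SVGA imager

def make_plane(value=0):
    """One primary's intensity plane, indexed as plane[y][x]."""
    return [[value for _ in range(WIDTH)] for _ in range(HEIGHT)]

R = make_plane()
G = make_plane()
B = make_plane()

# Each pixel location (x, y) carries one intensity per primary.
R[1][2] = 200  # red intensity at x=2, y=1
```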
An additional image transform (T) is proposed that can alter the value of intensity of each of the primaries expressed at each pixel location (x,y) of the projected image. The transform will produce a new output P'(x,y), composed of R'(x,y), G'(x,y), and B'(x,y).
CORRECTION METHOD #1
It is proposed that for each value of (x,y), gain factors (Mr,x,y, Mg,x,y, Mb,x,y) and offset factors (Kr,x,y, Kg,x,y, Kb,x,y) exist such that the transform
R'(x,y) = R(x,y)*Mr,x,y + Kr,x,y for the RED primary
G'(x,y) = G(x,y)*Mg,x,y + Kg,x,y for the GREEN primary
B'(x,y) = B(x,y)*Mb,x,y + Kb,x,y for the BLUE primary
is computed, and thus P'(x,y) = R'(x,y), G'(x,y), B'(x,y).
In shorthand notation, this will be represented as P' = T(P). The transform T includes an array of gain factors M and offsets K for each pixel value (x,y) of the picture P. The array of gains and offsets, although static in value until the next adaptation system update, is applied in real time to the picture P to produce a transformed output P' for each frame as the frame becomes available for display. The array of gains and offsets is designed to be alterable by the adaptation system to compensate for changing display conditions, such as a change in extraneous room lighting or a change in projection surface colors as may exist when the projector is moved to a different location. The rate of adaptation (of the gains and offsets) in the transform can be a product design option made with regard to cost/performance tradeoffs. In preparation for the adaptation system update, the brightness value of each projected pixel is compared with its counterpart from the image sample captured by, e.g., a 2D image sensor 180 of FIG. 1, which provides an error value. Successive measures can be made over a number of frames, and these successive measures can, in turn, be used to compute a line equation with at least two sample points of intensity at each pixel position to form gain and offset values for the transform T. The line equation can be used to interpolate or extrapolate to intensity values not yet measured. Successive measurements of intensity can be used to adjust gain and offset values for a "best fit" to measured values. In one exemplary embodiment, since a feedback system exists between the projected image and the captured image, any picture controls such as brightness or contrast will have to be asserted before comparison to the captured image; otherwise the feedback system will merely negate the picture controls.
Due to the possibility of extraneous room lighting, the line equations of the transform may not be satisfiable for all values, since a projector can neither remove light from a projection screen nor output light beyond its maximum capability. In such a case, brightness and contrast controls can be applied manually or automatically until the extreme conditions of too black or too bright for each primary are eliminated. Note that these conditions can also be affected by adjustments of color saturation and other picture controls.
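Correction method #1 can be sketched as follows. This is a hedged illustration, not the patent's implementation: the two-sample line fit, the tiny one-pixel example, and the synthetic "captured = 0.5*commanded + 20" environment model are all assumptions made for the example.

```python
def apply_transform(plane, gains, offsets):
    """Per-pixel linear transform: plane'[y][x] = plane[y][x]*M[y][x] + K[y][x]."""
    return [[plane[y][x] * gains[y][x] + offsets[y][x]
             for x in range(len(plane[0]))]
            for y in range(len(plane))]

def fit_gain_offset(p1, c1, p2, c2):
    """From two (commanded, captured) intensity samples, fit the line
    c = a*p + b describing the environment, then invert it so that
    commanding (target*gain + offset) captures the target intensity."""
    a = (c2 - c1) / (p2 - p1)   # captured change per commanded change
    b = c1 - a * p1             # captured intensity at zero command
    return 1.0 / a, -b / a      # compensating gain and offset

# Synthetic environment: surface/ambient light make captured = 0.5*commanded + 20.
gain, offset = fit_gain_offset(100, 70, 200, 120)

target = 150
out = apply_transform([[target]], [[gain]], [[offset]])  # one-pixel picture
captured = 0.5 * out[0][0] + 20  # what the sensor would now measure
```

With the fitted gain of 2 and offset of -40, the transformed command causes the sensor to capture exactly the target intensity, which is the goal of the adaptation update.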
CORRECTION METHOD #2
Similar to method #1 above, in preparation for the adaptation system update, the brightness value of each projected pixel is compared with its counterpart from the image sample captured by, e.g., 2D color image sensor 180 of FIG. 1. Successive measures can be made over a number of frames, and these successive measures can, in turn, be used to compute an "error value" ε for each intensity value for each primary color and at each pixel location. The error is simply the difference between projected intensity and captured intensity:
R'(x,y) = R(x,y) + εr[R(x,y)][x][y] for the RED primary
G'(x,y) = G(x,y) + εg[G(x,y)][x][y] for the GREEN primary
B'(x,y) = B(x,y) + εb[B(x,y)][x][y] for the BLUE primary
This can be reduced to:
R'(x,y) = Rt[R(x,y)][x][y] for the RED primary
G'(x,y) = Gt[G(x,y)][x][y] for the GREEN primary
B'(x,y) = Bt[B(x,y)][x][y] for the BLUE primary
These equations are effectively lookup tables (Rt, Gt, Bt) for each of the primaries. Potential issues and solutions of the image feedback system negating picture controls and of extraneous room lighting are similar to correction method #1 described above.
For sparsely populated tables, extrapolations and interpolations from neighbor pixel values can be used to assign an initial value until an actual sample can be made. Before any samples are made, the tables can be initialized to values assuming no error.
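The per-pixel lookup tables of method #2 can be sketched as below. This is an illustrative sketch under stated assumptions, not the patent's implementation: the 2x2 table size, 256 intensity levels, and the simple one-sample update rule are chosen for the example.

```python
LEVELS = 256  # intensity levels per primary

def make_table(width, height):
    """Rt[x][y] is a LEVELS-entry LUT, initialized assuming no error
    (each commanded level maps to itself)."""
    return [[list(range(LEVELS)) for _ in range(height)]
            for _ in range(width)]

def update_entry(table, x, y, commanded, captured):
    """Fold one measured error into one table entry: command more (or less)
    next time, by the amount the capture fell short (or overshot),
    clamped to the table's dynamic range."""
    error = commanded - captured
    corrected = table[x][y][commanded] + error
    table[x][y][commanded] = max(0, min(LEVELS - 1, corrected))

Rt = make_table(2, 2)
# One sample: pixel (0, 0) commanded to 100 but captured at 90 (too dark).
update_entry(Rt, 0, 0, commanded=100, captured=90)
```

After the update, the table at pixel (0, 0) commands 110 when the picture asks for 100, while unsampled entries and pixels retain their initial no-error values.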
Using independent equations for the primaries (as opposed to R' = f(R,G,B)) provides great simplicity, but introduces problems when any of the primaries runs out of dynamic range. This can result in abrupt tint shifts, which are very objectionable to the human eye. A simple solution is to make the error values approach zero at each end of their dynamic range.
CORRECTION METHOD #3
A laser projector using a 2-D scanning spot beam for image projection presents a unique opportunity, greatly simplifying the overall complexity of color compensation. At any instant in time that the beam is impinging upon a projection surface (e.g., 190 of FIG. 1), light is uniquely reflected from that spot, in the case of the perfect projection environment. A single one-dimensional sensor (for each primary) as represented, e.g., by image sensor 180 of FIG. 1, with a "look angle" covering the entire projected image, can capture the intensity of the reflection and adjust beam power in real time to meet the needs of proper color rendering. For the case of the less than perfect projection environment, a measure of any extraneous background lighting may be captured (e.g., using image sensor 180 of FIG. 1) without interference from the spot beam by sampling at the end of each line scan or of each frame scan. This measure can be stored and subtracted out of subsequent measurements when the spot beam is active, thus providing a better measure of the true error and therefore of the compensation needed in the beam intensity.
ALIGNMENT METHODS
Exemplary correction methods #1 and #2 as outlined above may require alignment of the projected image (by, e.g., elements 150, 170 and 185 of FIG. 1) and the sensed and captured image (by, e.g., image sensor 180 of FIG. 1). Stated another way, a best correspondence is to be found between a given pixel of the projected image and that of the captured image for the purpose of determining the brightness error between these pixels. It is possible to find this correspondence in a number of ways, as described below.
ALIGNMENT METHOD #1
The 2D image sensor (e.g., image sensor 180 of FIG. 1) determines the maximum extent of the projected image, left to right and top to bottom. Over a span of time, the edges of the projected image will become detectable, as enough pixels on the edges of the picture will eventually take on a brightness value that can be detected by the image sensor. The edges of the projected image can be precisely determined by extrapolating virtual lines through detected pixels at the maximum measured extent of the picture. Intersecting lines denote the corners of the projected image. A correspondence among pixels of the projected image and captured image can now be determined through interpolation of position values. The relationship between projected pixels and captured pixels need not be a 1:1 correspondence: a spatial convolution kernel can be created (based on maximum alignment errors, which is a fixed system parameter) that can relate multiple captured pixels to a given projected pixel. Once the alignment parameters are determined, they may not need further adjustment unless the optics change. This could be done, e.g., as a one-time factory adjustment.
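The interpolation of position values described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: it assumes the four corners have already been found by intersecting the extrapolated edge lines, and the corner coordinates below are made-up example values.

```python
def project_to_sensor(px, py, proj_w, proj_h, corners):
    """Bilinear interpolation of a projected pixel's sensor position from the
    four detected corners. corners = (top_left, top_right, bottom_left,
    bottom_right), each an (x, y) pair in sensor pixels."""
    u, v = px / (proj_w - 1), py / (proj_h - 1)  # normalized 0..1 position
    (tlx, tly), (trx, try_), (blx, bly), (brx, bry) = corners
    top = (tlx + u * (trx - tlx), tly + u * (try_ - tly))  # along top edge
    bot = (blx + u * (brx - blx), bly + u * (bry - bly))   # along bottom edge
    return (top[0] + v * (bot[0] - top[0]),
            top[1] + v * (bot[1] - top[1]))

# Example corners of an 800x600 projected image, slightly skewed on the sensor:
corners = ((10, 20), (810, 24), (14, 620), (818, 628))
sx, sy = project_to_sensor(400, 300, 800, 600, corners)  # near the quad center
```

The projected image's own corner pixels map exactly onto the detected sensor corners, and interior pixels land proportionally between them; a spatial convolution kernel around each mapped position can then relate multiple captured pixels to the projected pixel.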
ALIGNMENT METHOD #2
This method utilizes tools found in motion estimation techniques for video encoders. Corresponding matching regions between the projected image and the captured image determine pixel correspondence.
Again, once the alignment parameters are determined they may not need further adjustments unless the optics changes. Perhaps this could be done as a one-time factory adjustment.
ALIGNMENT METHOD #3
A spatial convolution kernel is created in which a given region is scanned for correspondence between the projected image and the captured image. Although similar to ALIGNMENT METHOD #2, this method sets a given pixel to black in the projected image and uses the spatial convolution kernel in the expected region of the captured image to search for that black pixel. A sort of sensitivity analysis is performed which finds the best spatial fit for the kernel. Pixel locations selected for analysis may be pseudo-randomly determined to prevent detection by the human eye.
Again, once the alignment parameters are determined, they may not need further adjustment unless the optics change; therefore, this could also be done as a one-time factory adjustment.
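The black-pixel search of alignment method #3 can be sketched as below. This is an illustrative sketch: the windowed darkest-pixel search stands in for the patent's kernel-based sensitivity analysis, and the synthetic 9x9 capture with a (+1, -2) displacement is a made-up example.

```python
def find_black_pixel(captured, cx, cy, radius):
    """Scan a (2*radius+1)^2 window around the expected location (cx, cy)
    for the darkest captured pixel; return its (dx, dy) offset from (cx, cy),
    i.e. the best spatial fit for this probe."""
    best, best_off = None, (0, 0)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            v = captured[cy + dy][cx + dx]
            if best is None or v < best:
                best, best_off = v, (dx, dy)
    return best_off

# Synthetic capture: uniform brightness 200, with the probe's black pixel
# landing displaced by (+1, -2) from where it was expected.
captured = [[200] * 9 for _ in range(9)]
captured[4 - 2][4 + 1] = 5
offset = find_black_pixel(captured, 4, 4, 3)
```

Repeating the probe at pseudo-randomly chosen pixel locations, as the text describes, accumulates offsets from which the alignment parameters can be fitted.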
The present principles can also be used to solve the registration error (i.e., image alignment) problem in a dynamic color-compensation system. Registration errors may arise in a number of ways. One source is the fact that the projector lens is not co-located with the camera lens, and the distance from the projector to the wall is not fixed. Without registration errors, it would be possible to directly compare each projected pixel with its measured counterpart at the output of the image processor 150. This would be an ideal situation, and would provide accurate and comprehensive data for the color compensation algorithm.
The present inventors recognize that one way to deal with registration errors would be to spatially translate the measured image to accurately line up with the projected image, but this is a costly and power-consuming process. Instead, the present invention uses a very simple alternative. In one exemplary embodiment, the present invention uses a weighted sum of color pixel values in a selected area of an output or projected image as the basis for colorimetric comparison. In one exemplary embodiment, a weighting algorithm according to the principles of the present invention would give, for example, unity weight to a central portion of the image, gradually diminishing weight near the periphery of the image (to reduce the adverse effects of registration errors), and zero weight outside the image area. Other weighting schemes could also be used according to the principles of the present invention. An exemplary weighting algorithm according to the principles of the present invention is illustrated in FIG. 3 and described herein. FIG. 3 shows an exemplary image 300 having a plurality of image pixels. As noted above, different projectors have different native resolutions. A common resolution is SVGA, which has 800 by 600 pixels per image or frame.
Each image pixel in an image in a projection system is typically represented by a red component data value, a green component data value and a blue component data value. For example, a pixel 301 in image 300 of FIG. 3 would have its corresponding R (Red), G (Green), B (Blue) pixel values. In one embodiment according to the principles of the present invention, the determination of the weighted sum of an image will be done on a color component basis for each of the R, G, B components of each pixel, with more emphasis given to the pixels near the center of the image. For example, as shown in FIG. 3, when calculating a weighted sum of a color component of an image 300, more weight may be given to pixels 301 to 304 located in center region 310 of image 300 (e.g., using a unity factor or weighting of 1). Moving away from the center region 310, for pixels located in region 320, a factor of less than 1 may be allocated (e.g., 0.8). For pixels in region 330, which is even further away from the center region 310 than region 320, a factor smaller than the factor used for region 320 is used (e.g., 0.5). Similarly, pixels in area 340 would be allocated even less weight than pixels in area 330 (e.g., 0.3). Therefore, the present invention would give, for example, more weight to a central portion of the image, gradually diminishing weight near the periphery of the image, and zero weight outside the image area (e.g., in area 380). Hence, for example, a weighted sum calculation formula for the red component of image 300 in FIG. 3 may be:
Weighted Sum for R Component of Image = Factor_for_Region310 × (R301 + R302 + R303 + R304) + Factor_for_Region320 × (R305 + ... + R316) + Factor_for_Region330 × (R317 + ... + R336) + Factor_for_Region340 × (R337 + ... + Rn), where Rn is the red component data value for image pixel n.
The weighted sum for the green and the blue components of exemplary image 300 may be calculated similarly. Other weighting schemes could also be used according to the principles of the present invention. These values could be determined for each frame of video, then used by the color compensation algorithm. In addition, an enhancement to the single-image approach would be to divide an image picture into multiple zones, for example an 8x8 grid. Each zone would produce a weighted sum, with gradually diminishing weight near the periphery of each zone. This would add some complexity, but would offer the advantage of more useful data points.
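The weighted-sum calculation above can be sketched for one color component. This is an illustrative sketch: the tiny 6x6 plane and the concentric weight bands (1.0, 0.8, 0.5, 0.3, echoing the example factors for regions 310-340) are assumptions made for the example, not the patent's fixed values.

```python
def region_weight(x, y, width, height):
    """Weight by concentric ring around the image center: unity in the
    middle, diminishing toward the periphery."""
    ring = max(abs(x - (width - 1) / 2), abs(y - (height - 1) / 2))
    bands = [1.0, 0.8, 0.5, 0.3]
    return bands[min(int(ring), len(bands) - 1)]

def weighted_sum(plane):
    """Weighted sum of one color component plane, plane[y][x]."""
    h, w = len(plane), len(plane[0])
    return sum(plane[y][x] * region_weight(x, y, w, h)
               for y in range(h) for x in range(w))

# A flat red-component plane of intensity 100 on a 6x6 grid:
plane = [[100] * 6 for _ in range(6)]
total = weighted_sum(plane)
```

Computing the same weighted sum on the output image and on the sensed image gives the pair of scalars (per component) that step 290 compares, without requiring pixel-exact registration.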
At step 287, processor 188 determines a weighted sum of the sensed projected image, e.g., using the same weighted sum calculation used for determining the weighted sum of the output image, as described in detail above in connection with step 285. At step 290, processor 188 compares the determined weighted sum of the output image with the determined weighted sum of the sensed projected image. At step 295, a color correction of the output video image is provided based on the comparison at step 290. That is, projector 100 would adjust the output of image processor 150 so that the determined weighted sum of the output image at the output of the image processor is substantially the same as, or converges with, the determined weighted sum of the sensed projected image. Therefore, the dynamic real time color correction system and method according to the principles of the present invention utilizes an image processing system that can efficiently compare, e.g., the video signal sent to an image transducer (e.g., video projection module 170 of FIG. 1) with the video signal sensed and created (e.g., from an image sensor 180 of FIG. 1) based on weighted sums of the respective color components of these signals.
In an exemplary embodiment, a reference black image may be provided and projected to provide an enhanced color correction function. For example, additional useful data points of a reference black level may be provided and detected by one or more of the following exemplary ways:
a) Expands the detection of light for a reference black level outside the area of the projected image (e.g., in area 380 of image 300 shown in FIG. 3).
b) Detects a reference black level within the area of projected frames, established within the time period between two successive frames of a sequence.
c) Projects an area of black, for reference, within a frame during a single projected frame. The area of black may encompass a portion of a projected frame or the entire frame area. A short projected flash of black is unlikely to be noticed by viewers if it occurs randomly and infrequently.
d) Detects a reference black level at startup, before a projected image is rendered. However, with this approach, the black level is not adaptable to changing conditions after the projector begins rendering images for display.
e) Detects a reference black level when a single black frame is projected opportunistically, at a time dependent upon the physical movement of the projector. An accelerometer can detect movement of the projector and conditionally project a black frame that can be used for black level detection.
f) Detects a reference black level when a single black frame is projected opportunistically, at a time dependent upon selecting a new video source, modifying some display parameter (e.g., as in a user video adjustment or auto aspect ratio adjustment), or during the rendering of an updated OSD or menu display.
While several embodiments have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the functions and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the present embodiments. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings herein is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific embodiments described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereof, the embodiments disclosed may be practiced otherwise than as specifically described and claimed. The present embodiments are directed to each individual feature, system, article, material and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials and/or methods, if such features, systems, articles, materials and/or methods are not mutually inconsistent, is included within the scope of the present embodiments.


CLAIMS:
1. A method comprising:
receiving (240) a video signal from an external source in a normal operating mode;
processing (250) the video signal to provide an output video image;
projecting (260) the output video image to produce a projected video image;
sensing (280) the projected video image;
comparing (290) the output video image with the sensed projected video image; and
providing (295) a color correction of the output video image based on the comparison step in the normal operating mode.
2. The method of claim 1 wherein the step of providing a color correction is performed for each frame of the output video image.
3. The method of claim 1 wherein the step of providing a color correction is selected in response to a user input (270).
4. The method of claim 1 further comprising a step of aligning (290) the output video image with the sensed projected video image.
5. The method of claim 4 wherein the aligning step is based on determining an edge of the sensed projected video image.
6. The method of claim 1 wherein the providing a color correction step is performed without a predetermined calibration image.
7. The method of claim 1 wherein the comparing step is performed using a transform of the output video image.
8. The method of claim 1 further comprising a step of determining (210) whether a pre-calibration mode has been selected and wherein the pre-calibration mode is different than the normal operating mode.
9. The method of claim 1 wherein the step of projecting is performed by one of the following projection types: 1) LCD, 2) LED, 3) LCOS, 4) DLP, and 5) laser light.
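Outside the claim language itself, the per-frame loop of claims 1 and 2 can be illustrated with a short sketch. Everything here — the function names, the per-channel mean statistics, and the multiplicative gain model — is an illustrative assumption, not the patented implementation:

```python
def channel_means(pixels):
    """Average each of the three color channels over an image,
    given as a list of (r, g, b) tuples."""
    n = len(pixels)
    sums = [0.0, 0.0, 0.0]
    for px in pixels:
        for c in range(3):
            sums[c] += px[c]
    return [s / n for s in sums]

def color_correction_gains(output_image, sensed_image):
    """Compare the output video image with the sensed projected image
    and derive per-channel gains that pull the projection back toward
    the intended output (one possible reading of the comparison step)."""
    out = channel_means(output_image)
    sensed = channel_means(sensed_image)
    return [o / s if s else 1.0 for o, s in zip(out, sensed)]

def apply_correction(image, gains):
    """Pre-distort the next output frame with the computed gains,
    clamping to the 8-bit range."""
    return [tuple(min(255.0, px[c] * gains[c]) for c in range(3))
            for px in image]
```

Run once per frame, a loop of this shape would operate in the normal display mode without any predetermined calibration image, since the program content itself serves as the reference.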
10. An apparatus comprising:
an interface (120) for receiving a video signal from an external source;
a processor (188) for processing the video signal to provide an output video image; wherein the output video image is projected to produce a projected video image;
a sensor (180) for sensing the projected video image in a normal operating mode; and
wherein the processor (188) in the normal operating mode compares the output video image with the sensed projected video image and provides a color correction of the output video image based on the comparison.
11. The apparatus of claim 10 wherein the color correction is performed for each frame of the output video image.
12. The apparatus of claim 10 wherein the color correction is provided in response to a user input.
13. The apparatus of claim 10 wherein the processor aligns the output video image with the sensed projected video image.
14. The apparatus of claim 13 wherein the processor aligns the output video image based on determining an edge of the sensed projected video image.
15. The apparatus of claim 10 wherein the comparison is performed using a transform of the output video image.
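Claims 7 and 15 leave the transform unspecified; a BT.601-style luma transform is one plausible choice, sketched below purely as an assumption:

```python
def rgb_to_luma(px):
    """BT.601 luma: a standard linear transform of an RGB pixel."""
    r, g, b = px
    return 0.299 * r + 0.587 * g + 0.114 * b

def luma_difference(output_image, sensed_image):
    """Compare the two images in the transform domain by their mean
    luma; a positive result means the projection is too dark."""
    out = sum(rgb_to_luma(p) for p in output_image) / len(output_image)
    sensed = sum(rgb_to_luma(p) for p in sensed_image) / len(sensed_image)
    return out - sensed
```

Comparing in a transform domain like this reduces the per-frame comparison to a few scalars, which keeps the correction cheap enough for real-time use.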
16. The apparatus of claim 10 wherein the color correction is performed without a predetermined calibration image.
17. The apparatus of claim 10 wherein the processor further determines whether a pre-calibration mode has been selected and wherein the pre-calibration mode is different than the normal operating mode.
18. The apparatus of claim 10 wherein the apparatus is one of the following projection types: 1) LCD, 2) LED, 3) LCOS, 4) DLP, and 5) laser light.
19. A computer program product stored in a non-transitory computer-readable storage medium comprising computer-executable instructions for:
receiving a video signal from an external source in a normal operating mode;
processing the video signal to provide an output video image;
projecting the output video image to produce a projected video image;
sensing the projected video image;
comparing the output video image with the sensed projected video image; and
providing a color correction of the output video image based on the comparison step in the normal operating mode.
20. A method comprising:
processing (250) a signal to provide an output image;
projecting (260) the output image to produce a projected image;
sensing (280) the projected image;
determining (285) a weighted sum of the output image;
determining (287) a weighted sum of the sensed projected image;
comparing (290) the determined weighted sum of the output image with the determined weighted sum of the sensed projected image; and
providing (295) a color correction of the output image based on the comparison step.
21. A method comprising:
processing (250) a signal to provide an output image;
projecting (260) the output image to produce a projected image;
sensing (280) the projected image;
determining (285) a weighted sum of the output image;
determining (287) a weighted sum of the sensed projected image;
comparing (290) the determined weighted sum of the output image with the determined weighted sum of the sensed projected image; and
providing (295) a color correction of the output image based on the comparison step.
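Claims 20 and 21 reduce each image to a weighted sum before comparing. A minimal sketch, assuming grayscale images flattened to lists and a per-pixel spatial weight (all names illustrative):

```python
def weighted_sum(image, weights):
    """Collapse an image to a single scalar: each pixel contributes in
    proportion to its weight (e.g. emphasizing the screen center)."""
    assert len(image) == len(weights)
    return sum(p * w for p, w in zip(image, weights))

def weighted_sum_gain(output_image, sensed_image, weights):
    """Compare the two weighted sums and return a scalar correction
    gain for the output image."""
    out_ws = weighted_sum(output_image, weights)
    sensed_ws = weighted_sum(sensed_image, weights)
    return out_ws / sensed_ws if sensed_ws else 1.0
```

Because both images pass through the same weighting, spatial nonuniformities that affect them equally cancel out of the comparison.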
PCT/US2015/049896 2014-09-18 2015-09-14 Method and apparatus for real-time color correction in a normal display mode WO2016044124A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462051994P 2014-09-18 2014-09-18
US201462051995P 2014-09-18 2014-09-18
US62/051,994 2014-09-18
US62/051,995 2014-09-18

Publications (2)

Publication Number Publication Date
WO2016044124A2 true WO2016044124A2 (en) 2016-03-24
WO2016044124A3 WO2016044124A3 (en) 2016-05-12

Family

ID=54238562

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2015/049896 WO2016044124A2 (en) 2014-09-18 2015-09-14 Method and apparatus for real-time color correction in a normal display mode

Country Status (1)

Country Link
WO (1) WO2016044124A2 (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI338515B (en) * 2006-02-20 2011-03-01 Mstar Semiconductor Inc Color calibration device and system and method for use therewith
US8023996B2 (en) * 2007-09-07 2011-09-20 Sony Ericsson Mobile Communications Ab Cellular terminals and other electronic devices and methods that adjust projected video images to compensate for color, brightness, and/or pattern of a display surface
JP5386894B2 (en) * 2008-09-09 2014-01-15 ソニー株式会社 Image position recognition device, image position recognition method, program, and correction data setting device for image display device

Non-Patent Citations (1)

Title
None

Also Published As

Publication number Publication date
WO2016044124A3 (en) 2016-05-12

Similar Documents

Publication Publication Date Title
EP1519575B1 (en) Image processing system, projector, information storage medium, and image processing method
US7524070B2 (en) Projection control system, projector and projection control method
US9843781B1 (en) Projector
US11375165B2 (en) Image calibration for projected images
US8011789B2 (en) Rear projection display
EP3136377B1 (en) Information processing device, information processing method, program
US10298894B2 (en) Projector
US20130222776A1 (en) Image projector, method of image projection, and computer-readable storage medium storing program for causing computer to execute image projection
US9357190B2 (en) Projection device and automatic projection calibration method thereof
CN107728410B (en) Image distortion correction method for laser projector and laser projector
JP7434487B2 (en) Image processing device, projection system, image processing method, and image processing program
KR101169276B1 (en) A projector and A method of image revision
JP2017129703A (en) Projector and control method thereof
JP2012248910A (en) Color correction method of projection display device
US20200077060A1 (en) Image Calibration Method and Projector System Capable of Adjusting a Distorted Image Automatically
CN113542709B (en) Projection image brightness adjusting method and device, storage medium and projection equipment
JP6191019B2 (en) Projection apparatus and projection method
JP2021101204A (en) Operation method for control unit, control method for projector, and projector
JP2011244044A (en) Image projector
CN112189337A (en) Image processing apparatus, image processing method, and program
JP5822575B2 (en) Image projection apparatus, control method for image projection apparatus, and program
US11146766B2 (en) Projection-type video display device
JP2011135445A (en) Image projection apparatus
WO2016044124A2 (en) Method and apparatus for real-time color correction in a normal display mode
JP6665543B2 (en) Projector and method of correcting captured image

Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 15772105; Country of ref document: EP; Kind code of ref document: A2

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 15772105; Country of ref document: EP; Kind code of ref document: A2