WO2011071488A1 - Method for compensating for cross-talk in a 3-D display - Google Patents

Method for compensating for cross-talk in a 3-D display

Info

Publication number
WO2011071488A1
PCT/US2009/067181 US2009067181W
Authority
WO
WIPO (PCT)
Prior art keywords
image
talk
cross
projector
images
Prior art date
Application number
PCT/US2009/067181
Other languages
English (en)
Inventor
Niranjan Damera-Venkata
Nelson Liang An Chang
Original Assignee
Hewlett-Packard Development Company, Lp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, Lp filed Critical Hewlett-Packard Development Company, Lp
Priority to JP2012543071A priority Critical patent/JP5503750B2/ja
Priority to US13/384,944 priority patent/US8982184B2/en
Priority to PCT/US2009/067181 priority patent/WO2011071488A1/fr
Priority to KR1020127006128A priority patent/KR101631973B1/ko
Priority to CN200980161271.2A priority patent/CN102484687B/zh
Priority to EP20090852127 priority patent/EP2510683A4/fr
Publication of WO2011071488A1 publication Critical patent/WO2011071488A1/fr

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/337Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/18Stereoscopic photography by simultaneous viewing
    • G03B35/26Stereoscopic photography by simultaneous viewing using polarised or coloured light separating different viewpoint images
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/327Calibration thereof

Definitions

  • 3-D polarized projection uses dual complementary polarizing elements that either emit, transmit or reflect (via a polarization-preserving screen) images composed of polarized light to each eye.
  • the light to each eye is selected via corresponding complementarily polarized lenses in the viewer's glasses, to produce distinct images to each eye, giving the effect of depth in the projected image.
  • FIG. 1A is a diagram illustrating a single projector polarization-based 3-D display system projecting a horizontally polarized sub-frame;
  • FIG. 1B is a diagram illustrating the single projector polarization-based 3-D display system of FIG. 1A, projecting a vertically polarized sub-frame;
  • FIG. 2 is a block diagram of one embodiment of a multi-projector 3-D display system;
  • FIG. 3 is a flow chart outlining the steps in one embodiment of a method for reducing cross-talk in a polarization-based 3-D projection system, in accordance with the present disclosure.
  • FIG. 4 is a flow chart outlining the steps in another embodiment of a method for reducing cross-talk in a polarization-based 3-D projection system, in accordance with the present disclosure.
  • directional terms such as “top,” “bottom,” “front,” “back,” “leading,” “trailing,” etc., are used with reference to the orientation of the figures being described. Because components of various embodiments disclosed herein can be positioned in a number of different orientations, the directional terminology is used for illustrative purposes only, and is not intended to be limiting.
  • the term "computer” refers to any type of computing device, including a personal computer, mainframe computer, portable computer, PDA, smart phone, or workstation computer that includes a processing unit, a system memory, and a system bus that couples the processing unit to the various components of the computer.
  • the processing unit can include one or more processors, each of which may be in the form of any one of various commercially available processors.
  • each processor receives instructions and data from a read-only memory (ROM) and/or a random access memory (RAM).
  • the system memory typically includes ROM that stores a basic input/output system (BIOS) that contains start-up routines for the computer, and RAM for storing computer program instructions and data.
  • a computer typically also includes input devices for user interaction (e.g., entering commands or data, receiving or viewing results), such as a keyboard, a pointing device (e.g. a computer mouse), microphone, camera, or any other means of input known to be used with a computing device.
  • the computer can also include output devices such as a monitor or display, projector, printer, audio speakers, or any other device known to be controllable by a computing device.
  • the computer can also include one or more graphics cards, each of which is capable of driving one or more display outputs that are synchronized to an internal or external clock source.
  • the term "computer program" is used herein to refer to machine-readable instructions, stored on tangible computer-readable storage media, for causing a computing device including a processor and system memory to perform a series of process steps that transform data and/or produce tangible results, such as a display indication or printed indicia.
  • computer-readable media includes any kind of memory or memory device, whether volatile or non-volatile, such as floppy disks, hard disks, CD-ROMs, flash memory, read-only memory, and random access memory, that is suitable to provide non-volatile or persistent storage for data, data structures and machine-executable instructions.
  • Storage devices suitable for tangibly embodying these instructions and data include all forms of non-volatile memory, including, for example, semiconductor memory devices, such as EPROM, EEPROM, and flash memory devices, magnetic disks such as internal hard disks and removable disks, magneto-optical disks, and optical disks, such as CD, CDROM, DVD-ROM, DVD-RAM, and DVD-RW.
  • Any of the above types of computer-readable media or related devices can be associated with or included as part of a computer, and connected to the system bus by respective interfaces.
  • Other computer-readable storage devices (e.g., magnetic tape drives, flash memory devices, and digital video disks) can also be used with the computer.
  • sub-frame refers to that portion of a display image that is produced for one eye by a single projector.
  • a complete display image produced by multiple sub-frames from one projector or from multiple projectors is referred to as a "composite image.” It is to be understood that a single projector can produce a composite image by time multiplexing, to sequentially project individual sub-frames, and that a composite image can also be produced by a sub-group of projectors (i.e. fewer than all of the projectors) in a multiple projector system.
  • Polarization-based 3-D projection uses complementary polarizing elements that emit, transmit or reflect images composed of polarized light to a polarization-preserving screen. The images are then viewed by a viewer wearing special glasses with polarized lenses.
  • A diagram of one embodiment of a polarization-based 3-D image display system 100 is shown in FIG. 1A.
  • the image display system 100 includes a computer processor 102 that processes image data and transmits the image data to a projector 112, which includes dual complementary polarizing elements and projects a polarized image, indicated generally at 114, to a polarization-preserving screen 106.
  • the computer controller 102 can include a dual-head graphics card, having separate channels for the left and right eye images.
  • the image can include any pictorial, graphical, or textural characters, symbols, illustrations, or other representations of information.
  • the display system can also include a calibration camera 122, which is used for calibration of the projection system, as discussed below.
  • FIG. 1A represents a linearly polarized system, though it is to be understood that circularly polarized systems can also be used for 3-D polarization-based projection.
  • With linear polarization, in order to present a stereoscopic motion picture using a single projector, two images are alternately projected onto the same screen through different polarizing filters.
  • the polarizing filters can be orthogonally polarized (i.e. being polarized at 90 degree angles relative to each other) or polarized at some other relatively large angle.
  • the screen is specially constructed to be non-depolarizing, in order to preserve the polarization.
  • the screen can also be configured to preserve brightness, in order to compensate for light loss (since one eye views one frame while the other eye sees nothing).
  • FIGs. 1A and 1B are illustrative only. In actual practice the respective angles of polarization can be different than horizontal and vertical. For example, in many polarization-based 3-D systems the polarization angles for each lens are set at 45° and 135°, respectively.
  • Since each filter only passes light which is similarly polarized and blocks other light, each eye only sees that part of the total image that is similarly polarized. That is, the light to each eye is selected to produce distinct images to each eye, creating stereoscopic vision and giving the illusion of depth in the projected image.
  • this approach can enable several people to simultaneously view the stereoscopic images, even without head tracking.
  • In FIG. 1A the projection system 100 is shown projecting a horizontally polarized sub-frame 114a. Because of its polarization, this image will pass through the right lens 110a of the polarizing glasses 108, and will be blocked by the left lens 110b of those glasses, since that lens is vertically polarized.
  • In FIG. 1B, the projector 112 is projecting a vertically polarized sub-frame 114b, which will be transmitted by the left lens 110b, which has corresponding polarization, but will be blocked by the right lens 110a, which is horizontally polarized.
  • These polarized sub-frames can be alternately projected at some multiple of the normal video refresh rate. For example, where the normal refresh rate for a video projection system is 60 frames per second, the single projector system 100 can be configured to project the individual polarized sub-frames at a rate of 120 sub-frames per second, providing the equivalent of 60 full frames per second to each eye. Those of skill in the art will recognize that other refresh rates and sub-frame rates can also be used.
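  • As a simple illustration of the time-multiplexed single-projector case just described, the following sketch selects which polarized sub-frame occupies each 1/120-second slot; the function and variable names are illustrative only and are not taken from the disclosure.

```python
# Illustrative sketch only: alternate right-eye (horizontal) and left-eye
# (vertical) sub-frames at 120 sub-frames per second, so that each eye
# receives the equivalent of 60 full frames per second.

SUB_FRAME_RATE = 120  # sub-frames per second (2 x 60 Hz refresh)

def sub_frame_for_slot(slot, right_sub_frames, left_sub_frames):
    """Return (polarization, sub_frame) for time slot `slot`."""
    frame_index = slot // 2          # both eyes share the same source frame
    if slot % 2 == 0:
        return "horizontal", right_sub_frames[frame_index]
    return "vertical", left_sub_frames[frame_index]
```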
  • 3-D polarized projection systems can also use multiple projectors.
  • a diagram of a multi-projector 3-D display system is shown in FIG. 2. While this system is shown with two projectors, it is to be understood that a multi- projector system using more than two projectors can also be used. The use of two projectors allows one projector to project the right eye image, and the other projector to project the left eye image in substantially overlapping positions.
  • the image display system 200 processes image data 202 and generates a corresponding polarized displayed image 214.
  • the image display system includes an image frame buffer 204, a sub-frame generator 208, projectors 212A-212B (collectively referred to as projectors 212), a camera 222, and a calibration unit 224.
  • the image frame buffer 204 receives and buffers image data 202 to create image frames 206.
  • the sub-frame generator 208 processes the image frames 206 to define corresponding image sub-frames 210A-210B (collectively referred to as sub-frames 210) that are differently polarized.
  • For each image frame 206, the sub-frame generator 208 generates one sub-frame 210A for projector 212A and one sub-frame 210B for projector 212B, these sub-frames corresponding to the right and left eye images, respectively.
  • the sub-frames 210A-210B are received by the projectors 212A-212B, respectively, and stored in the image frame buffers 113A-113B (collectively referred to as image frame buffers 113), respectively.
  • the projectors 212A-212B project polarized sub-frames 210A-210B, respectively, onto the screen 216 to produce the composite displayed image 214 for viewing by a user.
  • the image from projector 212A will be differently polarized than that projected by projector 212B, so that the respective lenses 230a, 230b of the viewer's glasses 232 will pass different images to each eye, giving the illusion of depth in the resulting image.
  • the image frame buffer 204 includes memory for storing image data 202 for one or more image frames 206.
  • the image frame buffer 204 constitutes a database of one or more image frames 206.
  • the image frame buffers 113 also include memory for storing sub-frames 210. Examples of image frame buffers 204 and 113 include non-volatile memory (e.g., a hard disk drive or other persistent storage device) and may include volatile memory (e.g., random access memory (RAM)).
  • the sub-frame generator 208 receives and processes image frames 206 to define a plurality of image sub-frames 210.
  • the sub-frame generator 208 generates sub-frames 210 based on the image data in image frames 206.
  • the projectors 212 receive image sub-frames 210 from the sub-frame generator 208 and can simultaneously project the image sub-frames 210 onto the target surface 216. Where two projectors are used, these sub-frames can be projected in substantially overlapping relation, to simultaneously provide two polarized sub-frames of the total image 214, rather than time multiplexing, as in the embodiment of FIGs. 1A and 1B.
  • selected sub-frames can also be projected to spatially offset positions, to produce a composite image 214 that is tiled or partially overlapped. This can allow the provision of a larger or wider image, or an image with the appearance of a higher resolution display by using overlapping lower-resolution sub-frames 210 from multiple projectors 212.
  • sub-frame generator 208 may be implemented in hardware, software, firmware, or any combination thereof.
  • the implementation may be via a microprocessor, programmable logic device, or state machine.
  • Components of the system may reside in software on one or more computer- readable media devices.
  • Also shown in FIG. 2 is a reference projector 218 with an image frame buffer 220.
  • the reference projector 218 is shown in hidden lines in FIG. 2 because, in one embodiment, the projector 218 is not an actual projector, but rather is a hypothetical high-resolution reference projector that is used in an image formation model for generating optimal sub-frames 210.
  • the location of one of the actual projectors 212 can be defined to be the location of the reference projector 218.
  • the display system 200 can also include a camera 222 and a calibration unit 224, which can be used to automatically determine geometric mapping between the projector 212 and the reference projector 218.
  • the image display system 200 can include hardware, software, firmware, or a combination of these.
  • One or more components of the image display system 200 (e.g. the frame buffer 204, sub-frame generator 208 and calibration unit 224) can be included in a computer, computer server, or other microprocessor-based system capable of performing a sequence of logic operations, and having system memory.
  • Such a system is generally referred to herein as a "controller" for the multi-projector system.
  • processing can be distributed throughout the system with individual portions being implemented in separate system components, such as in a networked or multiple computing unit environment (e.g. clustered computers).
  • the camera 222 is coupled to the calibration unit 224, and is used to determine compensation parameters for geometry, color, luminance, etc., to allow the multiple projector system to form a seamless image.
  • Cross-talk between image channels occurs when part of the image intended for one eye leaks or bleeds through to the other eye, thus diminishing image quality.
  • This sort of cross-talk can occur in single- or multiple-projector 3-D systems.
  • Cross-talk in 3-D polarized projection systems generally occurs due to imperfections in the polarization and selection process. For example, cross-talk can originate from the polarizing plate of a given projector, which may not completely polarize the projected light to the desired angle.
  • a projection screen that is configured to preserve polarization may not do so perfectly, and the polarizing filters in glasses that are worn by a viewer to select the desired signal may pick up other signals as well.
  • Cross-talk in a polarization-based 3-D projection system can originate from these and other sources.
  • Disclosed herein is a non-hardware method for reducing cross-talk in a polarization-based 3-D projection system. This method involves observing light leakage between channels via a camera or recording device during a calibration phase. The leakage is then compensated by digitally correcting the images displayed for the left and right eyes via a computational process. While the embodiments shown herein are polarization-based systems, the method disclosed herein can also apply to other cross-canceling techniques, such as complementary spectral approaches (e.g. using red-green color filters for each eye). Additionally, while the embodiments that are shown and described include two channels for the left and right eye, respectively, the method can also be expanded to more than two channels.
  • the method has three main phases: a calibration phase 302, a computation phase 304, and a correction/rendering phase 306.
  • the calibration phase provides data that is to be used by a computer (e.g. the computer controller 102 in FIG. 1) in the computation phase.
  • the computation and rendering phases can be performed by the computer controller, which provides a computer system having a processor and system memory.
  • the steps that are performed in each phase can be saved as program steps stored in memory in the computer.
  • In the calibration phase, the display element(s) (i.e. projector(s)) contributing to the left eye image are first used to project a calibration image (e.g. a solid field of a single primary color, as discussed below) for each eye. That is, the left eye projector first projects an image for the left eye, then projects an image for the right eye. Then the display element(s) contributing to the right eye image are also used to project the image for each eye. Where multiple projectors are involved, this process is repeated for each projector.
  • the image that is projected can be an impulse image that is used as part of a training process for a multi-projector system, such as is outlined in N. Damera-Venkata, N. Chang and J. DiCarlo, "A Unified Paradigm for Scalable Multi-Projector Displays", IEEE Trans. on Visualization and Computer Graphics, Nov.-Dec. 2007 (hereinafter "Damera-Venkata, Chang and DiCarlo").
  • each calibration screen can be a solid image of a single primary color of (presumably) uniform intensity.
  • Each calibration image is projected sequentially with polarizing filters on the projector(s), unlike typical geometric or color calibration for a camera-projector system.
  • a camera observes the projected image with dual-polarizing filters, in order to mimic the effect of polarized glasses (step 310). This may be done by first introducing one polarizing filter over the camera and taking an image, and then repeating the process with the other polarizing filter. It is assumed in this step that the filters are positioned in a similar orientation (with respect to the screen) as a viewer wearing polarized glasses would be. This step captures cross-talk because any leakage from one image to the other will show up. For example, assuming an RGB projection system, when a red calibration image is projected for the left eye, the right eye should theoretically see nothing.
  • In practice, however, the right eye image will capture some level of red light in some regions of the view and may also capture other light, such as green and blue. This reflects the fact that in addition to spatial leakage, the system can also exhibit spectral leakage. The converse is also true: when the red calibration image is projected for the right eye, any leakage will show up as some level of red and possibly also blue and green light captured by the left eye. As noted above, this process of taking these images is repeated for each combination of right/left eye image and right/left eye reception for each projector.
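  • A minimal sketch of this calibration-capture loop is shown below; the project() and capture() calls are hypothetical placeholders for the projector and camera interfaces, which the description above does not specify.

```python
import numpy as np

# Sketch of the calibration phase: for each projector and each eye channel,
# project a flat single-primary calibration image and capture it through
# both polarizing filters, so that leakage into the "wrong" eye is recorded.
PRIMARIES = {"red": (1.0, 0.0, 0.0), "green": (0.0, 1.0, 0.0), "blue": (0.0, 0.0, 1.0)}

def capture_calibration_images(projectors, camera, height=1080, width=1920):
    captures = {}
    for proj_id, projector in projectors.items():
        for channel in ("left", "right"):                 # eye the image is intended for
            for name, rgb in PRIMARIES.items():
                flat_field = np.full((height, width, 3), rgb, dtype=np.float32)
                projector.project(channel, flat_field)    # hypothetical projector API
                for eye_filter in ("left", "right"):      # polarizer placed over the camera
                    captures[(proj_id, channel, name, eye_filter)] = \
                        camera.capture(polarizer=eye_filter)  # hypothetical camera API
    return captures
```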
  • the calibration approach described above can be performed in conjunction with geometric, luminance and color calibration, especially where a multiple camera system is used.
  • In addition to projecting calibration patterns for detecting cross-talk, the sequential projection of red, green and blue color primaries can be used to allow color calibration as well.
  • the cross-talk calibration process disclosed herein is normally performed after geometric calibration is completed, so that images from each projector will be spatially oriented in the desired way. Other calibration operations can also be performed.
  • For a system using another cross-canceling technique (e.g. a complementary spectral approach), the step of capturing the projected calibration images (step 310) will involve the calibration camera viewing the calibration image with some other appropriate filter (e.g. color filters), rather than polarizing filters.
  • the first step in the computation phase 304 is to construct a simulated image or model of what the eye would see if a given image were projected from the projector(s) (step 312).
  • This model is a numerical model of the actual image that is being projected.
  • the computer system analyzes the image that is captured by the calibration camera, and constructs a numerical model of that image.
  • the numerical model represents a data stream that corresponds to the image that was actually captured. If y is used to represent the color value of a given pixel at coordinates (m,n), and l represents the luminance value of that pixel, then the construction of the simulated image can be represented by the following expressions:
  • C_l y_l(m,n) l_l(m,n) + C_lr y_r(m,n) l_r(m,n) = x_l(m,n)    [1]
  • C_rl y_l(m,n) l_l(m,n) + C_r y_r(m,n) l_r(m,n) = x_r(m,n)    [2]
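  • Expressed as code, the forward model of equations [1] and [2] might look like the following sketch; the subscripted factors are a reconstruction from the surrounding description, and treating them as per-pixel arrays is an assumption.

```python
import numpy as np

# Simulated-view model of equations [1] and [2]: each eye sees its intended
# image scaled by a direct factor plus a leakage term from the other channel.
def simulate_views(y_l, y_r, l_l, l_r, C_l, C_lr, C_rl, C_r):
    """y_* are pixel values, l_* luminance terms, C_* cross-talk factors.
    Returns the images (x_l, x_r) observed by the left and right eyes."""
    x_l = C_l  * y_l * l_l + C_lr * y_r * l_r   # equation [1]
    x_r = C_rl * y_l * l_l + C_r  * y_r * l_r   # equation [2]
    return x_l, x_r
```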
  • the "simulated" image represented by the terms on the left side of equations [1 ] and [2] is then compared with a desired theoretical cross-talk- free image (step 312). Specifically, mathematical expressions representing the desired image without cross-talk are substituted into the right side of equations [1 ] and [2], in place of x / and x r .
  • the desired cross-talk-free image is a theoretical image that is determined by the computer controller, and represents a numerical model of the image that the camera should capture if there were no cross-talk. In other words, the theoretical image is an image data stream that has been transformed to that which would produce a desired camera view of an ideal projection system, with no cross-talk.
  • this image represents a consistent feasible ideal image from each projector.
  • the controller system is already programmed to be "aware" of these various aspects of commonality. Other aspects of commonality between projectors can also be taken into account.
  • the next step is to solve for the cross-talk parameters or factors C and l in equations [1] and [2].
  • This step involves setting the expressions representing the simulated image and theoretical image equal to each other, and then solving for the variables C and l.
  • the values for C and l can be directly obtained using equations [1] and [2] since the pixel values y are known for the calibration image.
  • the values for C and l represent characteristics of a given projector and are not image dependent, while the values for y are image dependent. However, with a calibration image comprising a flat field of a single color, the y values will be known, allowing equations [1] and [2] to be directly solved for C and l for a given projector.
  • These values of C and l can be stored in computer memory.
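  • As a rough sketch of that direct solve: when only one channel is driven with a known flat field, equations [1] and [2] reduce to one term each, so the per-pixel products of a cross-talk factor and its luminance term can be read off by division. Separating C from l is not detailed above, so estimating the combined gains is a simplifying assumption here.

```python
import numpy as np

# With a single channel driven by a known flat field y_driven, the capture
# through the same-eye filter gives the intended-channel gain and the capture
# through the other-eye filter gives the leakage gain (each a product C*l).
def estimate_gains(capture_same_eye, capture_other_eye, y_driven, eps=1e-6):
    g_intended = capture_same_eye / np.maximum(y_driven, eps)
    g_leak = capture_other_eye / np.maximum(y_driven, eps)
    return g_intended, g_leak
```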
  • the next step is to receive new image data (step 316) and solve for the pixel values y (step 318) in equations [1] and [2].
  • the new image data can be data representing an impulse or training image, for example, or data representing some other image.
  • a mathematical expression representing the new image data is substituted into equations [1] and [2] in a manner similar to the way in which the theoretical image was substituted, as described above, and the equations are then solved for the pixel values y.
  • this process of receiving an impulse image for each projector, then computing the pixel values for correcting that image can be repeated multiple times for multiple impulse images in a training process, as indicated by repeat block 320.
  • This type of training process is known to those of skill in the art, and is presented in Damera-Venkata, Chang and DiCarlo, and in N. Damera-Venkata and N. L. Chang, "Realizing Super-resolution With Superimposed Projection", Proc. IEEE.
  • By performing this training process on a series of impulse images, several sets of corrected y values can be obtained. These corrected pixel values can be collectively used to produce a set of linear run-time filter coefficients (step 322) that can be used for any subsequent image in the rendering phase 306. That is, once the impulse images have been corrected in step 318, the system can determine a series of linear coefficients (step 322) by which the data stream for all subsequent images can be transformed in real time prior to projection (step 324). These coefficients allow the rapid correction of subsequent images to reduce cross-talk, so that each subsequent image frame is closer to a theoretical cross-talk-free image than it would otherwise be.
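  • The rendering step can then be sketched as below, assuming for illustration (the disclosure does not give the filter structure) that training has produced a per-pixel 2x2 matrix of linear coefficients mapping the desired left/right values to corrected sub-frame values.

```python
import numpy as np

# Run-time correction with precomputed linear filter coefficients A of shape
# (H, W, 2, 2): desired images x_l, x_r in, corrected sub-frames y_l, y_r out.
def apply_runtime_filters(x_l, x_r, A):
    y_l = A[..., 0, 0] * x_l + A[..., 0, 1] * x_r
    y_r = A[..., 1, 0] * x_l + A[..., 1, 1] * x_r
    # clip to the displayable range, since pixel values must lie in [0, 1]
    return np.clip(y_l, 0.0, 1.0), np.clip(y_r, 0.0, 1.0)
```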
  • The process described herein is image dependent. That is, the correction of a given image frame (whether a calibration pattern or other image) will be unique. Thus, the computational solution obtained in step 318 will be different for each image used in step 316. However, by repeating the training process multiple times, as indicated in step 320, the error terms of the various mathematical iterations can be encapsulated in a filter to obtain the run-time filter coefficients, so that the output for any input can be computed using the same set of filters, allowing faster real-time rendering. Consequently, the training method outlined above will provide run-time filter coefficients that will substantially reduce cross-talk for subsequent images, though a small quantity of cross-talk may still occur in some images. While this approach does not necessarily produce a perfect calibration for each image frame, it provides an approximate solution for each image frame, significantly reducing cross-talk in a moving video image.
  • the system can perform continuous cross-talk corrections on every image frame of a video stream in real time.
  • a flow chart outlining the steps in this embodiment is provided in FIG. 4. Like the process outlined in FIG. 3, this process includes a calibration phase 402, a computation phase 404, and a rendering phase 406.
  • a polarized calibration image is first projected by each projector for each eye (step 408).
  • In step 410, as in step 310, the image is viewed by a calibration camera with polarizing filters, first for one eye, then the other.
  • The processes of computing a simulated image (step 412), equating the simulated image to a theoretical image (step 414), and solving for the cross-talk factors "C" and "l" (step 416) are also the same as the comparable steps in the embodiment of FIG. 3.
  • the cross-talk factors "C" and "l" that are determined in step 416 are applied directly to new image data that is received in step 418.
  • data representing a new image frame is received in step 418.
  • the new image data is equated with the expressions of equations [1] and [2] with the values of "C" and "l" therein. This allows the system to solve for the pixel values y for that specific image (step 420).
  • the image data received in step 418 is directly corrected, and that specific image is projected with the corrected pixel values (step 422).
  • equations [1] and [2] are linear.
  • the value of y is constrained to be between zero and one, since a pixel cannot have a negative value or a value greater than one. With this constraint, these equations become non-linear, but can be solved using an iterative computational process such as a gradient descent process.
  • the computer system is programmed to make an initial estimate or guess of the y_l and y_r values, then find an error term, adjust the y_l and y_r values, and repeat the process until the simulated image and the new image data set are equal, with the error term applied.
  • This process ultimately provides a cross-talk correction for the new image data.
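  • A minimal sketch of such an iterative, constrained solve is shown below (projected gradient descent on the forward model, using the combined per-pixel gains g = C*l from the calibration step); the step size and iteration count are illustrative assumptions.

```python
import numpy as np

# Solve for sub-frames y_l, y_r in [0, 1] whose simulated views best match the
# desired cross-talk-free images x_l, x_r, using projected gradient descent.
def correct_frame(x_l, x_r, g_ll, g_lr, g_rl, g_rr, steps=50, lr=0.5):
    y_l = np.clip(x_l, 0.0, 1.0).copy()            # initial estimate: the desired images
    y_r = np.clip(x_r, 0.0, 1.0).copy()
    for _ in range(steps):
        e_l = (g_ll * y_l + g_lr * y_r) - x_l      # error of the simulated left view
        e_r = (g_rl * y_l + g_rr * y_r) - x_r      # error of the simulated right view
        grad_l = g_ll * e_l + g_rl * e_r           # gradient of 0.5 * (e_l**2 + e_r**2)
        grad_r = g_lr * e_l + g_rr * e_r
        y_l = np.clip(y_l - lr * grad_l, 0.0, 1.0) # enforce the 0 <= y <= 1 constraint
        y_r = np.clip(y_r - lr * grad_r, 0.0, 1.0)
        if max(np.abs(e_l).max(), np.abs(e_r).max()) < 1e-4:
            break
    return y_l, y_r
```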
  • the gradient descent process is non-linear.
  • the corrected image is projected (step 422), and this process can then repeat (step 424) for the next image frame, returning to step 418.
  • each frame of a video stream can receive a unique calibration.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Projection Apparatus (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

A method for compensating for cross-talk in a 3-D projector-camera system having a controller, which includes a processor and system memory, and at least two channels, comprises calibrating the projector-camera system, computing cross-talk factors applicable to the projector-camera system, and correcting new image data for cross-talk based upon the computed cross-talk factors. The system is calibrated by sequentially projecting and capturing, with a camera, a calibration image for each channel, so as to capture the cross-talk between the channels. The controller can compute the cross-talk factors based on the captured calibration images.
PCT/US2009/067181 2009-12-08 2009-12-08 Method for compensating for cross-talk in a 3-D display WO2011071488A1 (fr)

Priority Applications (6)

Application Number Priority Date Filing Date Title
JP2012543071A JP5503750B2 (ja) 2009-12-08 2009-12-08 Method for compensating for cross-talk in a 3-D display
US13/384,944 US8982184B2 (en) 2009-12-08 2009-12-08 Method for compensating for cross-talk in 3-D display
PCT/US2009/067181 WO2011071488A1 (fr) 2009-12-08 2009-12-08 Method for compensating for cross-talk in a 3-D display
KR1020127006128A KR101631973B1 (ko) 2009-12-08 2009-12-08 Method for compensating for cross-talk in a 3-D display
CN200980161271.2A CN102484687B (zh) 2009-12-08 2009-12-08 Method for compensating for cross-talk in a 3-D display
EP20090852127 EP2510683A4 (fr) 2009-12-08 2009-12-08 Method for compensating for cross-talk in a 3-D display

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2009/067181 WO2011071488A1 (fr) 2009-12-08 2009-12-08 Method for compensating for cross-talk in a 3-D display

Publications (1)

Publication Number Publication Date
WO2011071488A1 true WO2011071488A1 (fr) 2011-06-16

Family

ID=44145814

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2009/067181 WO2011071488A1 (fr) 2009-12-08 2009-12-08 Method for compensating for cross-talk in a 3-D display

Country Status (6)

Country Link
US (1) US8982184B2 (fr)
EP (1) EP2510683A4 (fr)
JP (1) JP5503750B2 (fr)
KR (1) KR101631973B1 (fr)
CN (1) CN102484687B (fr)
WO (1) WO2011071488A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013011491A1 * 2011-07-21 2013-01-24 Imax Corporation Generalized normalization for image display
EP2749033A1 * 2011-08-25 2014-07-02 Hewlett-Packard Development Company, L.P. Model-based stereoscopic and multiview cross-talk reduction
US9384535B2 (en) 2008-06-13 2016-07-05 Imax Corporation Methods and systems for reducing or eliminating perceived ghosting in displayed stereoscopic images
KR101839150B1 * 2011-10-06 2018-04-27 LG Display Co., Ltd. 3D image quality improvement method and stereoscopic image display device using the same

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140028801A1 (en) * 2012-07-30 2014-01-30 Canon Kabushiki Kaisha Multispectral Binary Coded Projection
US9560343B2 (en) 2012-11-23 2017-01-31 Samsung Electronics Co., Ltd. Apparatus and method for calibrating multi-layer three-dimensional (3D) display
KR101791518B1 (ko) 2014-01-23 2017-10-30 Samsung Electronics Co., Ltd. Method and apparatus for user authentication
KR102265109B1 (ko) 2014-01-24 2021-06-15 Samsung Electronics Co., Ltd. Method and apparatus for image processing
US11199446B2 (en) * 2016-05-30 2021-12-14 Silios Technologies Method for limiting crosstalk in an image sensor
US9961333B1 (en) * 2016-06-10 2018-05-01 X Development Llc System and method for light field projection
US9891516B1 (en) 2016-08-23 2018-02-13 X Development Llc Methods for calibrating a light field projection system
US10091496B2 (en) 2016-11-28 2018-10-02 X Development Llc Systems, devices, and methods for calibrating a light field projection system
JP6499226B2 (ja) * 2017-06-02 2019-04-10 Subaru Corporation Calibration device and calibration method for an in-vehicle camera
US10867538B1 (en) * 2019-03-05 2020-12-15 Facebook Technologies, Llc Systems and methods for transferring an image to an array of emissive sub pixels
CN111429540B (zh) * 2020-04-22 2020-12-08 Tsinghua University Device and method for synchronous measurement of temperature field and deformation field
US11676556B2 (en) 2021-01-06 2023-06-13 Apple Inc. Row crosstalk mitigation
KR20220157147A (ko) * 2021-05-20 2022-11-29 Samsung Electronics Co., Ltd. Method and apparatus for processing an image

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100236135B1 (ko) * 1990-02-27 1999-12-15 Nobuyuki Idei Video signal reproducing apparatus
US20060268104A1 (en) 2005-05-26 2006-11-30 Real D Ghost-compensation for improved stereoscopic projection
KR20080013815A (ko) * 2006-08-08 2008-02-13 Nvidia Corporation System, method and computer program product for compensating for cross-talk during the display of stereo content

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1389956B1 (fr) * 2001-04-25 2012-10-31 Amnis Corporation Method and apparatus for correcting cross-talk and spatial resolution in multichannel imaging
US6826304B2 (en) * 2002-03-13 2004-11-30 Hewlett-Packard Development Company, L.P. Reducing halos in spatially dependent gamut mapping
JP2004333561A (ja) * 2003-04-30 2004-11-25 Nippon Hoso Kyokai <Nhk> Stereoscopic image display device
US7387392B2 (en) * 2005-09-06 2008-06-17 Simon Widdowson System and method for projecting sub-frames onto a surface
US8542326B2 (en) * 2008-11-17 2013-09-24 X6D Limited 3D shutter glasses for use with LCD displays
US8587639B2 (en) * 2008-12-11 2013-11-19 Alcatel Lucent Method of improved three dimensional display technique
US8797340B2 (en) * 2012-10-02 2014-08-05 Nvidia Corporation System, method, and computer program product for modifying a pixel value as a function of a display duration estimate

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100236135B1 (ko) * 1990-02-27 1999-12-15 Nobuyuki Idei Video signal reproducing apparatus
US20060268104A1 (en) 2005-05-26 2006-11-30 Real D Ghost-compensation for improved stereoscopic projection
KR20080013815A (ko) * 2006-08-08 2008-02-13 Nvidia Corporation System, method and computer program product for compensating for cross-talk during the display of stereo content

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2510683A4 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9384535B2 (en) 2008-06-13 2016-07-05 Imax Corporation Methods and systems for reducing or eliminating perceived ghosting in displayed stereoscopic images
WO2013011491A1 (fr) * 2011-07-21 2013-01-24 Imax Corporation Generalized normalization for image display
US9106811B2 (en) 2011-07-21 2015-08-11 Imax Corporation Generalized normalization for image display
EP2749033A1 (fr) * 2011-08-25 2014-07-02 Hewlett-Packard Development Company, L.P. Model-based stereoscopic and multiview cross-talk reduction
JP2014529954A (ja) * 2011-08-25 2014-11-13 Hewlett-Packard Development Company, L.P. Model-based stereoscopic and multiview cross-talk reduction
EP2749033A4 (fr) * 2011-08-25 2015-02-25 Hewlett Packard Development Co Model-based stereoscopic and multiview cross-talk reduction
KR101574914B1 (ko) * 2011-08-25 2015-12-04 Hewlett-Packard Development Company, L.P. Model-based stereoscopic and multiview cross-talk reduction
KR101839150B1 (ko) * 2011-10-06 2018-04-27 LG Display Co., Ltd. 3D image quality improvement method and stereoscopic image display device using the same

Also Published As

Publication number Publication date
CN102484687A (zh) 2012-05-30
EP2510683A4 (fr) 2013-12-04
JP5503750B2 (ja) 2014-05-28
JP2013513332A (ja) 2013-04-18
EP2510683A1 (fr) 2012-10-17
KR20120119905A (ko) 2012-10-31
KR101631973B1 (ko) 2016-06-20
US20120262544A1 (en) 2012-10-18
US8982184B2 (en) 2015-03-17
CN102484687B (zh) 2016-03-23

Similar Documents

Publication Publication Date Title
US8982184B2 (en) Method for compensating for cross-talk in 3-D display
US8477241B2 (en) Multi-projector system and method
US8944612B2 (en) Multi-projector system and method
US9165536B2 (en) Systems and methods for projecting composite images
US9288465B2 (en) Ghost-compensation for improved stereoscopic images
US9817305B2 (en) Image correction system and method for multi-projection
Itoh et al. Semi-parametric color reproduction method for optical see-through head-mounted displays
Sajadi et al. Color seamlessness in multi-projector displays using constrained gamut morphing
CN103026381A (zh) Dual stack projection
US20140292825A1 (en) Multi-layer display apparatus and display method using it
US20140192170A1 (en) Model-Based Stereoscopic and Multiview Cross-Talk Reduction
US10244216B2 (en) Projector, video display device, and video display method
KR20140063536A (ko) Image correction system and method for multi-projection
CN102868902B (zh) Stereoscopic image display device and method thereof
JP4969684B2 (ja) Image display device

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200980161271.2

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09852127

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2009852127

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 13384944

Country of ref document: US

ENP Entry into the national phase

Ref document number: 20127006128

Country of ref document: KR

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2012543071

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE