WO2008156457A1 - Interactive display with camera feedback - Google Patents


Info

Publication number
WO2008156457A1
Authority
WO
WIPO (PCT)
Prior art keywords
devices
camera
projection
display system
screen
Prior art date
Application number
PCT/US2007/014575
Other languages
French (fr)
Inventor
Mark Alan Schultz
Original Assignee
Thomson Licensing
Priority date
Filing date
Publication date
Application filed by Thomson Licensing
Priority to PCT/US2007/014575
Publication of WO2008156457A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03542Light pens for emitting or receiving light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • G06F3/0386Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry for light pen
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191Testing thereof
    • H04N9/3194Testing thereof including sensor feedback

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Position Input By Displaying (AREA)

Abstract

An interactive projection display system is described. The interactive projection display system incorporates a feedback arrangement for monitoring user interactions with the display. The feedback arrangement may include one or more feedback devices that are used to obtain an image of a display screen which then may be interacted with by a user. The one or more feedback devices that are used to obtain the image of the screen may include, for example, a camera or either an infrared (IR) or radio frequency (RF) receiver positioned within the projection display system. Additionally, the user may interact with the screen of the display system using a hand held pointing device or by touching the screen.

Description

INTERACTIVE DISPLAY WITH CAMERA FEEDBACK
TECHNICAL FIELD
The invention relates generally to projection displays and, in particular, to interactive projection displays.
BACKGROUND OF THE INVENTION
Projection systems such as rear projection displays and segmented display systems typically include many of the components required to make an interactive display. In particular, their large screen sizes make them useful for applications such as interactive displays for presentations and graphics. However, the principal challenges in adapting existing displays for interactivity are the feedback system and the manner in which the user interacts with the display. The feedback system refers to the way in which the projection system monitors interactions with a user.
BRIEF SUMMARY OF THE INVENTION
The present invention is directed to an interactive projection display system. The interactive projection display system incorporates a feedback arrangement for monitoring user interactions with the display. The feedback arrangement may include one or more feedback devices that are used to obtain an image of a display screen which then may be interacted with by a user.
The one or more feedback devices that are used to obtain the image of the screen may include, for example, a camera or either an infrared (IR) or radio frequency (RF) receiver positioned within the projection display system. Additionally, the user may interact with the screen of the display system using a hand held pointing device or by touching the screen.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a top view of an interactive projection display system, in accordance with the present invention;
FIG. 2 illustrates a front view of the interactive projection display system shown in FIG. 1;
FIG. 3 illustrates user interaction with the projection display shown in FIG. 2;
FIG. 4 illustrates multiple users interacting with the projection display shown in FIG. 2;
FIG. 5 illustrates another embodiment showing multiple users interacting with the projection display shown in FIG. 2;
FIG. 6 depicts a side view of the interactive projection display system shown in FIG. 1;
FIG. 7 is a flow chart showing user interaction commands for interacting with the interactive projection display system of the present invention; and
FIGS. 8A-8B depict front and side views of a hand held pointing device for providing user interaction commands to the interactive projection display of the present invention.
DETAILED DESCRIPTION
The present invention is directed to an interactive projection display system. The interactive projection display system incorporates a feedback arrangement for monitoring user interactions with the display. The feedback arrangement may include one or more feedback devices that are used to obtain an image of a display screen which then may be interacted with by a user.
The interactive projection display system may be a segmented display system in which the display devices are arranged, for example, in an N x 1 array, as shown in FIG. 1. The present invention will be discussed primarily with reference to a segmented display system. However, the interactive projection display system of the present invention may also be incorporated into a rear projection display having a single display screen (not shown).
Referring to FIG. 1, there is shown a top view of a segmented interactive display system 100 of the present invention including a plurality of projection devices 110A, 110B, 110C, 110D that each project a portion of an image on a corresponding one of the display devices 111A, 111B, 111C, 111D in the N x 1 array. The display devices 111A, 111B, 111C, 111D may be screens on which the image is projected.
FIG. 6 is a side view of the segmented interactive display system of the present invention. Each projection device 110 interacts with a lamp 120 (FIG. 1), one or more feedback devices 130 and a mirror 125. The lamp 120 projects its portion of the image onto the mirror 125 and then onto the corresponding one of the display devices 111A, 111B, 111C, 111D in the N x 1 array. The mirror 125 is typically placed in a frame positioned at an angle of about 45 degrees relative to the lamp 120 and the display devices 111A, 111B, 111C, 111D. Positioning the mirror at about 45 degrees with respect to the lamp 120 permits the projection system to have a more compact cabinet with a narrower width. One or more feedback devices 130 may be positioned between the mirrors 125 (FIG. 1), within one or more of the projection devices 110A, 110B, 110C, 110D, or in a CPU 140 (FIG. 6) coupled to one or more of the projection devices 110A, 110B, 110C, 110D. Examples of suitable feedback devices include a camera, an infrared (IR) receiver and a radio frequency (RF) receiver, among others.
In one embodiment, a camera 130A is positioned adjacent to an exit pupil of DLP projection device 110C (FIG. 1). An IR receiver or RF receiver (FIGS. 2-6) may also be coupled to either of the projection devices 110A, 110B, 110C, 110D or the CPU 140, or both. The camera 130A is used to obtain an image of one or more of the display devices 111C, 111D. A camera 130B is positioned adjacent to an exit pupil of DLP projection device 110B (FIG. 1). The camera 130B is used to obtain an image of one or more of the display devices 111A, 111B. As such, the resolution of the cameras does not need to be high to be effective. For example, assume a display system with a screen size of about 305 cm x 152.5 cm (120 inches x 60 inches) and two cameras having a resolution of 720 pixels x 480 pixels. A laser pointer held by a user would be resolved at about 480 pixels / 152.5 cm = 3 pixels/cm. This is about 1/8 the display resolution, which provides enough accuracy for a user holding a laser pointer a few feet from the screen. The use of only one camera would still provide 720 pixels / 305 cm = 2 pixels/cm resolution on a 305 cm screen. Because the camera monitors the image formed on the screen rather than the laser pointer directly, the system tracks accurately with none of the parallax issues found in other systems.
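The resolution arithmetic above can be sketched as follows; the figures are taken from the text, and the variable names are our own:

```python
# Sketch of the pointing-resolution arithmetic described in the text.
screen_width_cm = 305.0     # ~120 inch wide screen
screen_height_cm = 152.5    # ~60 inch tall screen
cam_w_px, cam_h_px = 720, 480

# Vertical pointing resolution with the camera covering the screen height:
res_vertical = cam_h_px / screen_height_cm    # about 3 pixels/cm

# Horizontal resolution if a single camera covers the full 305 cm width:
res_horizontal = cam_w_px / screen_width_cm   # about 2 pixels/cm
```

Either figure comfortably exceeds what a hand-held pointer needs from a few feet away, which is why a standard-definition camera suffices.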
The camera in its simplest form can be a low-resolution camera with a filter matched to the wavelength of the laser pointer, such as a monochrome camera with a narrow-band filter paired with a single-frequency laser pointer. This has the advantage that the laser pointer intensity is high relative to the normal camera image, so the gain of the camera image can be turned up to respond only to an on/off state or to off/low/high states. Exceptions arise if multicolor laser pointers or other light sources are used. Multicolor laser pointers would require a much more versatile system: a wide-band filter covering all of the laser pointer frequencies together with a color camera. A more sensitive system could be built from multiple monochrome cameras, each with a very selective optical filter. The multicolor system could also process a single color camera image by examining its RGB components (FIG. 7, step 300) in the processing algorithm discussed below, to help determine which laser pointer color, and therefore which user or users, is active. The intensity of the laser pointers is also a factor in the sensitivity of the system: the brighter the laser pointer, the easier it is to distinguish from the video on the screen. The response to each multicolor laser pointer would also have to be characterized and optimized for the optics and processing of the specific application.
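Because the narrow-band filter makes the laser spot much brighter than the surrounding video, detection reduces to simple thresholding. A minimal sketch, assuming a monochrome frame as a 2D list of 8-bit intensities (the function name and threshold value are illustrative, not from the patent):

```python
def find_laser_spot(frame, threshold=200):
    """Return the centroid (row, col) of pixels above threshold, or None.

    With the narrow-band filter and the camera gain turned up, only the
    laser spot exceeds the threshold, giving the on/off discrimination
    described in the text.
    """
    hits = [(r, c) for r, row in enumerate(frame)
                   for c, v in enumerate(row) if v >= threshold]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

# Illustrative frame: dim background with a two-pixel bright spot.
frame = [[10] * 8 for _ in range(6)]
frame[2][3] = frame[2][4] = 255
spot = find_laser_spot(frame)   # centroid of the bright pixels
```

A real implementation would operate on live camera frames, but the thresholding logic is the same.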
The camera output can be provided directly to the display chassis for processing. Optionally, the camera output may be processed externally: a DLP display typically includes all the functionality required to process the video from the camera, but an external processor may be desired in some applications.
The display system 100 may use the images from the cameras 130A, 130B as feedback for convergence. For example, the camera is very useful on segmented displays for analyzing seams, alignment and video color brightness characteristics between screens.
Alternatively, the display system 100 may interact with these images. For example, the cameras 130A, 130B may provide feedback from a person 135 holding a laser pointer 145 (FIG. 2). The cameras 130A, 130B, fitted with an appropriate wavelength filter, may be used to detect the wavelength of the laser pointer 145 as well as its intensity, and to track its movement across the display devices 111A, 111B, 111C, 111D. The intensity of the laser may also be used to add graphics to the displayed images.
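Tracking across a segmented display also entails deciding which device in the N x 1 array the spot currently falls on. A minimal sketch, assuming the tracked position has been normalized to the full screen width (the function name is our own):

```python
def segment_for_x(x_norm, n_segments=4):
    """Map a normalized horizontal position in [0, 1) to a segment index.

    Index 0 corresponds to the leftmost display device (e.g. 111A) and
    n_segments - 1 to the rightmost (e.g. 111D) in the N x 1 array.
    """
    if not 0.0 <= x_norm < 1.0:
        raise ValueError("x_norm must lie in [0, 1)")
    return int(x_norm * n_segments)
```

This lets the system route overlay updates to the projection device responsible for the screen the pointer is on.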
Security of the viewer is not an issue since many display devices 111A, 111B, 111C, 111D include a Fresnel lens, lenticular lens or other screen components that are intended to diffuse the image at the screen rather than pass light through it. As such, the camera will not see the viewer as normal video.
Referring to FIGS. 2-3, in one illustrative embodiment, the laser light from the laser pointer 145 is at a first intensity in its normal state, i.e., when a button 170 (FIGS. 8A-8B) on the laser pointer is not depressed. When a button 170 on the laser pointer is depressed, the intensity may be increased to a second intensity that is higher than the first. An IR transmitter or an RF transmitter may be coupled to the buttons 170 on the laser pointer so as to transmit an IR or RF signal toward the display devices 111A, 111B, 111C, 111D. The IR or RF receiver in the display system receives these signals and the CPU 140 processes them. The difference in magnitude of the laser light between the first intensity and the second intensity may be used to indicate to a user that a laser pointer 145 is active.
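The two-intensity scheme amounts to classifying the measured spot brightness into a small number of states. A hedged sketch, where the numeric levels and the margin are illustrative assumptions rather than values from the patent:

```python
FIRST_INTENSITY = 120    # assumed normal level (button up)
SECOND_INTENSITY = 230   # assumed boosted level (button down)

def pointer_state(measured, margin=30):
    """Classify a measured spot intensity.

    'off'      - no laser spot detected
    'pointing' - pointer on at its first (normal) intensity
    'active'   - pointer boosted to its second intensity (button held)
    """
    if measured < FIRST_INTENSITY - margin:
        return "off"
    if measured < SECOND_INTENSITY - margin:
        return "pointing"
    return "active"
```

The margin absorbs noise between the levels, so the display only treats the pointer as issuing a command when the brighter second intensity is clearly present.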
An IR transmitter or an RF transmitter may be coupled to the buttons 170 on the laser pointer so as to transmit an IR or RF signal toward the display devices 111A, 111B, 111C, 111D. The IR or RF receiver in the display system receives these signals and the CPU 140 processes them to add or remove graphics and text on portions of the display devices 111A, 111B, 111C, 111D.
Referring to FIGS. 7-8, when a button 170 on the laser pointer 145 is depressed, the cameras 130A, 130B may apply appropriate light-selective filters to the image to detect the wavelength of the laser light, as described in step 200.
For example, a red filter may be applied to detect a ruby laser.
After the selective filter is applied, referring to step 205, the camera tracks the laser light movement across the display devices 111A, 111B, 111C, 111D based on movement made by a user 135 holding the laser pointer 145 (FIG. 3).
Depending upon which button 170 is depressed, a plurality of commands may be performed on portions of the display devices 111A, 111B, 111C, 111D.
The commands performed by buttons 170 may include, for example, color 179, width 178, select 180, zoom in 177, center 181, zoom out 176, erase 182 and power 175, among others.
Referring to step 210, the cameras 130A, 130B continually monitor whether any of the buttons 170 are depressed. If the cameras detect that a command button is depressed, the display system 100, in conjunction with the CPU 140 and the IR or RF inputs described below, applies a graphics overlay on the display devices 111A, 111B, 111C, 111D following the laser light movement thereon, as described in step 215. The graphics of a particular command button 170 are added to the image, or an image is modified based on the command, as in step 220.
Referring to step 225, once the camera detects that the button 170 is released, the graphics are updated to remove the command strokes or to execute an end-of-command algorithm, as described in step 230. Given the video capture and commands, the video processing can track the light over time, execute the commands, and generate the next sequence of graphics as required by the user. Special functions, such as the erase button 182, may define an area that moves on the screen about a center point and is used to clear the existing lines or video content within that area.
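The press/track/release flow of steps 210-230 can be sketched as a small state machine. This is a minimal illustration, assuming hypothetical event tuples of our own devising rather than any interface from the patent:

```python
def run_commands(events):
    """Collect one stroke per press...release pair of pointer events.

    Each event is a (kind, data) tuple: ("press", button_name),
    ("move", (x, y)) while the laser is tracked, or ("release", None).
    """
    strokes, current = [], None
    for kind, data in events:
        if kind == "press":                   # step 210: button depressed
            current = {"button": data, "points": []}
        elif kind == "move" and current is not None:
            current["points"].append(data)    # step 215: overlay follows laser
        elif kind == "release" and current is not None:
            strokes.append(current)           # steps 225-230: end of command
            current = None
    return strokes

strokes = run_commands([
    ("press", "width"), ("move", (1, 1)), ("move", (2, 2)), ("release", None),
])
```

A completed stroke carries the button that invoked it and the tracked path, which is what the CPU needs to render or finalize the overlay.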
Additionally, the laser pointer may include an infrared (IR) light emitting diode (LED) 170 (FIGS. 8A-8B). The IR LED 170 may be used in conjunction with the laser light output, the cameras 130A, 130B and the CPU 140 to obtain user identification information and to execute functions and attributes (FIG. 7). Referring to FIGS. 7-8, when a button 170 on the laser pointer 145 is depressed, the IR or RF receivers may apply appropriate light-selective filters to look for laser pointer identification information (ID), as described in steps 300-310.
Depending upon which button 170 is depressed, a plurality of functions and attributes may be performed on portions of the display devices 111A, 111B, 111C, 111D. Referring to step 315, if the IR or RF receivers detect that a command button is depressed, the CPU 140 and the IR or RF receivers determine whether a graphics overlay is needed to help a user select attributes on the display devices 111A, 111B, 111C, 111D. Referring to step 325, when a graphics overlay is needed, a pop-up menu is provided so a user can toggle through desired selections, as outlined in step 330. For example, attributes may include color selection, line width control and position indications, among others. These attributes may be applied to the graphics overlay as part of step 215 described above. While the above-described embodiment has been discussed with respect to one user, more than one user may interact with the display system 100. Additionally, teleconferencing can take advantage of this system by allowing a speaker to use a laser pointer and have the highlighted portions added directly to the distributed video without requiring a mouse. A mouse is far more limited in its abilities than the laser pointers described here. Although an exemplary interactive display system which incorporates the teachings of the present invention has been shown and described in detail herein, those skilled in the art can readily devise many other varied embodiments that still incorporate these teachings.

Claims

1. A projection system, comprising: one or more display screens; one or more projection devices which provide images to be displayed on the one or more display screens; one or more feedback devices for monitoring user interactions with the one or more display screens; and a remote device by which a user interacts with images displayed on the one or more display screens.
2. The projection system of claim 1 further including a mirror positioned at an angle.
3. The projection system of claim 1 wherein the one or more projection devices are digital light processor (DLP) devices.
4. The projection system of claim 1 wherein the one or more feedback devices are selected from the group consisting of a camera, an infrared (IR) receiver and a radio frequency (RF) receiver.
5. The projection system of claim 1 wherein the one or more feedback devices monitor user interactions by monitoring images displayed on the one or more display screens.
6. The projection system of claim 1 wherein the remote device is a laser pointer.
7. The projection system of claim 6 wherein the one or more feedback devices detect laser light from the laser pointer on the one or more display screens.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2007/014575 WO2008156457A1 (en) 2007-06-20 2007-06-20 Interactive display with camera feedback

Publications (1)

Publication Number Publication Date
WO2008156457A1 (en) 2008-12-24

Family

ID=39092309

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/014575 WO2008156457A1 (en) 2007-06-20 2007-06-20 Interactive display with camera feedback

Country Status (1)

Country Link
WO (1) WO2008156457A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0515015A2 (en) * 1991-05-10 1992-11-25 nVIEW CORPORATION Method and apparatus for interacting with a computer generated projected image
WO1997041502A1 (en) * 1996-04-27 1997-11-06 Philips Electronics N.V. Position determination of a laser pointer in a projection display system
US5793361A (en) * 1994-06-09 1998-08-11 Corporation For National Research Initiatives Unconstrained pointing interface for natural human interaction with a display-based computer system
US5914783A (en) * 1997-03-24 1999-06-22 Mitsubishi Electric Information Technology Center America, Inc. Method and apparatus for detecting the location of a light source
WO2000058933A1 (en) * 1999-03-17 2000-10-05 Tegrity, Inc. Method and apparatus for visual pointing and computer control
EP1406438A2 (en) * 2002-10-02 2004-04-07 Hewlett-Packard Development Company, L.P. Freezable projection display
EP1646232A2 (en) * 2004-10-11 2006-04-12 Barco NV Multiple image projection system and method for projecting selected images adjacent to each other
US20060203203A1 (en) * 1995-04-07 2006-09-14 Seiko Epson Corporation Image projection system and a method of controlling a projected pointer
US20060238493A1 (en) * 2005-04-22 2006-10-26 Dunton Randy R System and method to activate a graphical user interface (GUI) via a laser beam

Similar Documents

Publication Publication Date Title
EP0839347B1 (en) Position determination of a laser pointer in a projection display system
US8508472B1 (en) Wearable remote control with a single control button
US6050690A (en) Apparatus and method for focusing a projected image
US7006055B2 (en) Wireless multi-user multi-projector presentation system
US7034866B1 (en) Combined display-camera for an image processing system
WO2008156453A1 (en) Laser pointer for an interactive display
US7703926B2 (en) Projector capable of capturing images and briefing system having the same
JPH08161114A (en) Display indication device of spatial optical modulator
US5973672A (en) Multiple participant interactive interface
US7918566B2 (en) Image display apparatus, image display method, and program product therefor
US9398223B2 (en) Shared-field image projection and capture system
JP2007525699A (en) Color laser projection display
JPH1185395A (en) Liquid crystal projector device with pointing function
US10901548B2 (en) Touch screen rear projection display
US20110249019A1 (en) Projection system and method
CN106851234A (en) The control method of projecting apparatus and projecting apparatus
US20100118202A1 (en) Display control apparatus and method
JPH0980372A (en) Projection type display device
JP3278350B2 (en) Projection display device
KR102127863B1 (en) Method for adjusting image on cylindrical screen device
JP2010117495A (en) Image processing apparatus, image display device, and image display system
KR20110085493A (en) A display device equipped with a projector and a controlling method thereof
US20120268371A1 (en) Image Projection Device
US20170270700A1 (en) Display device, method of controlling display device, and program
WO2019198381A1 (en) Information processing device, information processing method, and program

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 07809805

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 07809805

Country of ref document: EP

Kind code of ref document: A1