US20020113950A1 - Interactive teleconferencing display system - Google Patents
- Publication number
- US20020113950A1 (application Ser. No. 10/038,229)
- Authority
- US
- United States
- Prior art keywords
- image
- presenter
- signal
- locations
- projection screen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/15—Conference systems
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/26—Projecting separately subsidiary matter simultaneously with main image
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/74—Projection arrangements for image reproduction, e.g. using eidophor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
Definitions
- RGB levels representing white (or colored) light may be added to those pixels defining the subject area.
- The illumination of the subject may therefore be increased above that produced by ambient light alone.
- Supplementary subject illumination augmenting ambient room light is likely to be somewhat annoying to the subject facing the projector.
- Image source 41, the video program source, may be a computer, videotape, or videodisc as selected by the user.
- The video projector 46 and projection surface 43 are commercial devices selected by the user.
- An infrared filter, if needed, removes any residual infrared in the video projection beam.
- The infrared sensitive camera 45 is a video camera whose photoreceptors extend into the near infrared beyond 700 nanometers. A filter is placed over the camera lens to remove visible wavelengths.
- At least one infrared source 47 is a projector using an incandescent lamp. A filter is placed over the infrared source to remove visible light.
- Inhibitor 42 is the detector/inhibitor; its function has been described earlier.
- FIG. 6 is a logic flow diagram showing the functions of subject detection and program signal inhibiting.
- IR camera 61 may be a 480 line VGA progressive scan low resolution camera, or any other low resolution camera sensitive to near infrared.
- Clear frame memory 62 is a stored infrared image of the infrared illuminated screen with the subject removed from the scene.
- The mask generator 63 compares the infrared sensitive camera image with the clear frame image in memory 62, and any difference identifies the area occupied by a subject, if present.
- Shaping function 64 shapes the subject detection signal from an on-off signal to a linear, or a nonlinear signal as shown in FIG. 5.
- Projector image source 65 is the program source to be projected onto the projection screen.
- The program video is generally an image of much higher resolution than an NTSC signal.
- Image size detect 66 determines the resolution of the program image and connects this size data to scale and filter 67, which acts as a standards converter to scale the infrared camera image to match the size of the projected image. Having matched image sizes, the program image is inhibited in inhibit projector image 68 in the area occupied by a subject, if a subject is present.
- Projector 99 projects the program image onto the screen, but does not project the program onto the subject.
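The scale-and-filter stage above can be sketched as a nearest-neighbor upscale of the low-resolution detection mask to the program resolution, followed by per-pixel inhibition. This is an illustrative sketch, not the patent's implementation; the function names and the 2×2/4×4 frame sizes are invented for the example.

```python
def scale_mask(mask, out_w, out_h):
    """Nearest-neighbor scale of a low-resolution subject mask
    (list of rows) up to the program image resolution, standing in
    for the scale-and-filter stage acting as a standards converter."""
    in_h, in_w = len(mask), len(mask[0])
    return [[mask[y * in_h // out_h][x * in_w // out_w]
             for x in range(out_w)]
            for y in range(out_h)]

def inhibit(program, mask):
    """Zero program pixels wherever the scaled mask marks a subject."""
    return [[0 if m else p for p, m in zip(prow, mrow)]
            for prow, mrow in zip(program, mask)]

# A 2x2 detection mask upscaled to a 4x4 program frame:
mask = [[0, 1],
        [0, 0]]
big = scale_mask(mask, 4, 4)
frame = [[9] * 4 for _ in range(4)]
out = inhibit(frame, big)
```

The masked quadrant of the program frame is blanked while the rest is projected unchanged.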
- Matte signal 21 is generated by one of the existing methods, from information provided by camera 20.
- The inhibit matte signal from generator 21 is inverted to form a second matte signal providing a 1.0 value for the subject area and a 0.0 value for the background surrounding the subject.
- This second matte and the video signal from camera 20 are connected to multiplier 23 .
- Their product is the Processed Foreground signal (PrFg) consisting of the subject against a 0.0 field of black.
- The processed foreground having its subject on a field of 0.0 black is intentional, since the blackest black in a video signal sits atop a pedestal of about 7% of white.
- The 0.0 of the processed foreground video is therefore a matte signal transmitted with the isolated subject.
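The processed-foreground step can be sketched as inverting the inhibit matte and multiplying it against the camera frame, so the subject rides over a true 0.0 black field. A minimal one-dimensional sketch (pixel values and the 7% pedestal figure are illustrative; function names are invented):

```python
def processed_foreground(camera, matte):
    """Multiply the camera frame by the inverted inhibit matte:
    subject pixels (matte = 1.0 after inversion) keep their camera
    value, background pixels go to a true 0.0 black field."""
    return [c * m for c, m in zip(camera, matte)]

# Inhibit matte: 0.0 over the subject, 1.0 over the background;
# invert it so the subject area carries 1.0.
inhibit_matte = [1.0, 1.0, 0.0, 0.0, 1.0]
subject_matte = [1.0 - m for m in inhibit_matte]

# Camera video sits on a ~7% pedestal, so even "black" subject
# pixels stay above the 0.0 of the transmitted background.
camera = [0.07, 0.08, 0.45, 0.52, 0.07]
prfg = processed_foreground(camera, subject_matte)
```

Because the background is an exact 0.0, the PrFg signal doubles as its own matte at the receiving location.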
- The processed foreground 23 from location A is connected to the matte extraction function 32 and compositing function 35 at location B.
- The matte extraction function 32 separates the processed foreground, whose lowest level is the 7% pedestal, from the 0.0 of the black field by setting a detection threshold at about 3%. All pixels above the threshold are in the foreground and are assigned a 1.0 value. All pixels below the selected threshold are in the background and are assigned a 0.0 value.
- The assignment of pixel values as 1.0 or 0.0 is arbitrary and may be inverted as required by the function it is intended to control.
- A threshold level above camera and system noise is necessary to prevent background-area noise peaks from incorrectly being accepted as subject pixels.
- The extracted matte is inverted to provide a 0.0 in the processed foreground area and a 1.0 in the graphics area surrounding the subject.
- Multiplying the graphics image from source 36 by 1.0 retains the full signal level of the graphics surrounding the subject, but the 0.0 in the subject area creates a 0.0 black hole in the projected graphics.
- Compositing function 35 adds the processed foreground, consisting only of the subject, into the hole created for it.
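The extraction and compositing steps above can be sketched as a threshold on the received PrFg followed by a hole-cut and add. A one-dimensional sketch with invented names; the ~3% threshold comes from the text, the pixel values are illustrative:

```python
THRESHOLD = 0.03  # ~3%: above the 0.0 background, below the 7% pedestal

def extract_matte(prfg):
    """Recover a 1.0/0.0 matte from a received PrFg signal."""
    return [1.0 if p > THRESHOLD else 0.0 for p in prfg]

def composite(graphics, prfg):
    """Cut a 0.0 hole in the local graphics where the matte marks
    the presenter, then add the PrFg into the hole."""
    matte = extract_matte(prfg)
    return [g * (1.0 - m) + p for g, m, p in zip(graphics, matte, prfg)]

prfg = [0.0, 0.45, 0.52, 0.0]      # presenter over 0.0 black
graphics = [0.8, 0.8, 0.8, 0.8]    # locally stored slide
out = composite(graphics, prfg)
```

The locally stored graphics stay at full level everywhere except the presenter's silhouette, which carries the remote camera pixels.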
- The composite image from 35 is routed through the inhibit function 34 to projector 39.
- The audience at location B sees the graphics from their own image source 36 being projected onto their own screen, with the video image of the presenter from location A composited over their graphics.
- The quality of the image is limited only by the resolution of the original image and the resolution of the projector. By pre-loading the graphics at each location, the remaining data to send to other locations is only the processed video signal, with sound.
- The matte signal assigned is a binary switch (i.e. 1.0 or 0.0), and therefore the composite image may be formed by a key function derived from the matte signal to switch between a stored image and the presenter. In either case the presenter pixel values replace those of the background image to form the composite image.
- A binary 1/0 matte signal generates a sharp-edged switch; however, the matte edge can be sized to better fit the subject outline, and it may be softened to improve the transition from the presenter to his background.
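Edge softening of this kind can be sketched as a small moving average over the binary matte, turning the hard switch into a short ramp. An illustrative sketch (the 3-tap kernel and function name are assumptions, not from the patent):

```python
def soften(matte, passes=1):
    """Soften a binary matte edge with a 3-tap moving average so the
    switch between presenter and background is a short ramp rather
    than a hard cut."""
    for _ in range(passes):
        padded = [matte[0]] + matte + [matte[-1]]
        matte = [(padded[i] + padded[i + 1] + padded[i + 2]) / 3.0
                 for i in range(len(matte))]
    return matte

hard = [0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0]
soft = soften(hard)
```

Interior matte values stay at 1.0 while the edges step through intermediate levels; more passes widen the ramp.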
- The inhibit function 34 awaits the presence of a presenter 38.
- When a person 38 at location B wishes to participate, he steps in front of his screen.
- Functions 30 , 31 and 34 inhibit pixels in projector 37 from projecting onto the person 38 .
- Functions 30 , 31 , and 33 generate a processed foreground, PrFg, which is routed back to location A to the matte extractor 22 and compositor 25 .
- The video of person 38 at location B, in front of his screen, will be composited over the graphics being projected at location A.
- The audience at location B will see participant 38 in person in front of the projected graphics, and presenter 28 will be seen composited over said graphics.
- Both participants will see the other person's video image composited with the graphics.
- The participants may see and face each other, point to elements in the graphics, and discuss them.
- The audience at locations C and D will see presenter A and participant B on their projection screens.
- A person at C or D may also become a participant by stepping in front of their screen.
- The audience at the location of a participant will see their presenter in person, and all other presenters will appear on the screen behind him, but in front of the projected graphics.
- FIG. 7 shows the signal flow through a signal processor after the inhibit function is removed or inactivated.
- FIG. 8 illustrates the interconnections required for four participating locations such as A, B, C and D.
- The output signal at each of these locations is a Processed Foreground (PrFg) and is connected to the compositing function at all other locations.
- The input needed by each location is the PrFg signal from all other locations.
- The PrFg 23 from location A is shown connected to the composite functions at B, C, and D to illustrate how the PrFg is connected to the input stages at other locations. The remaining connections are made as indicated in FIG. 8.
- FIG. 9 illustrates the compositing function needed when there are four participating locations.
- Functions 22 and 25 are all that are needed if only location B is sending a PrFg signal to location A.
- The addition of a third location, C, requires a separate compositing stage 22′ and 25′.
- The addition of a fourth location, D, requires a separate compositing stage 22′′ and 25′′.
- The number of compositing stages needed is one less than the number of participating locations.
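The cascade of compositing stages can be sketched by chaining one matte-extraction-plus-composite stage per remote PrFg stream onto the local graphics. An illustrative sketch (names, threshold, and pixel values are assumptions):

```python
def composite_stage(background, prfg, threshold=0.03):
    """One matte-extraction + compositing stage: PrFg pixels above
    the threshold replace the background, 0.0 pixels pass it through."""
    return [bg if p <= threshold else p
            for bg, p in zip(background, prfg)]

def build_display(local_graphics, remote_prfgs):
    """Chain one stage per remote location: with N participating
    locations, N-1 stages are cascaded onto the local graphics."""
    image = local_graphics
    for prfg in remote_prfgs:
        image = composite_stage(image, prfg)
    return image

graphics = [0.8] * 6
prfg_b = [0.0, 0.4, 0.0, 0.0, 0.0, 0.0]
prfg_c = [0.0, 0.0, 0.0, 0.5, 0.0, 0.0]
prfg_d = [0.0, 0.0, 0.0, 0.0, 0.6, 0.0]
# Three stages serve a four-location conference at location A.
out = build_display(graphics, [prfg_b, prfg_c, prfg_d])
```

Each remote participant lands in their own silhouette over the shared slide, which matches the stated stage count of one less than the number of locations.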
Description
- This is a continuation-in-part of U.S. application Ser. No. 09/788,026 filed Feb. 16, 2001.
- Teleconferencing, the use of video and sound to connect two or more locations, permits groups of people at a distant location to see and hear a presenter at another location. A presenter from a remote location will typically be combined with graphics using a split screen technique or dual monitors.
- Rear projection, and large liquid crystal display screens, have been used to combine the presenter with graphics. The audience in the same room with the presenter, in front of the rear projection or liquid crystal display, is seeing first generation graphics, but when photographed and transmitted to another location, it must be projected again which makes it second generation. Because of the loss of two generations, the graphic data at the distant location is degraded to a point where many graphs, charts and text cannot be clearly read.
- Combining the presenter with the graphics using front projection suffers from the additional problems of blinding the presenter, and distorting the graphics his body intercepts, which is disconcerting to the viewers.
- In teleconferencing, there are numerous variations in the techniques for combining a presenter and the selected graphics. None of these techniques can be considered ideal. Placing the presenter in front of the projected graphics improves personal contact between a presenter and his audience, as compared to a sleep-inducing graphics image with an off-screen presenter.
- The Interactive Teleconferencing Display System uses equipment performing identical functions at each location thus permitting any location to originate or participate in a conference. The equipment includes a front or rear projection screen, an electronic projector, and a signal processor. When the presenter is in front of a front projection screen, a matte signal is generated that selectively inhibits the projector to prevent the projected graphics from illuminating the presenter. The graphics are downloaded and stored at all locations. The presenter, having been extracted by a matte signal is transmitted to all locations where it is matted over the graphics before projection. By separately transmitting the graphics image and the presenter's image, and combining them at the remote location, each is an original and there is no loss of detail when displayed.
- An individual at another location may participate at any time by stepping in front of his screen. All locations will see and hear both the presenter and the additional participant. Both participants may look at each other, point to, and discuss the material being displayed. They may also look toward their local audience without being blinded by the projector. Participants from other locations may join in and also appear on all screens.
- FIG. 1 illustrates the position of the signal processor unit with respect to the projector when using front projection.
- FIG. 2 illustrates the functions of the signal processor.
- FIG. 3, together with FIG. 2, illustrates the interconnections between two locations.
- FIG. 4 shows a block diagram of the components comprising this invention.
- FIG. 5 is a curve showing the relationship between infrared deviation from that of the screen and the reduction of video signal.
- FIG. 6 is a logic diagram of the elements of an operational system.
- FIG. 7 illustrates the functions of the signal processor when using rear projection or liquid crystal display screens.
- FIG. 8 illustrates the interconnections required for four-location teleconferencing.
- FIG. 9 illustrates the additional compositing stages required when adding a third and fourth location.
- FIG. 1 represents a typical conference room 1. Each room contains a screen 2, a participating presenter 3, an electronic projector 4 that is often located above an audience 7, a computer 6 or other storage device (e.g. DVD, VCR, etc.) for storing and retrieving graphics, and a signal processor 5.
- The signal processor, contained in a single enclosure, is the key element of this invention in that it includes all elements of the system except the projector, projection screen, and the image storage device. This device is most likely to be a computer, and is placed in an area easily accessible to an operator.
- One of the signal processor components is a camera that must be located in close proximity directly below or above the projector, assuming one is using a front projection screen, or it may be integrated into the projector. Users having ample space behind the projection screen may use rear projection. In this event, the ideal camera location is a point over the audience, normal to the screen, and on a common axis through screen center and projector lens. While liquid crystal display screens are still relatively small, they are getting bigger and may become large enough for a large audience. Another possibility is the multiple cathode ray tube display. Its disadvantages are cost and the presence of a join line between tubes. These screens have some advantages over rear projection and front projection screens with few disadvantages other than cost or small size. Although it is expected that most users will use front projection screens, the following system explanations apply to all display methods except where noted.
- The camera provides an image of the presenter and anything he adds to the scene, such as material written on a white board. The participants may not always require stored background graphics, and on these occasions, memory 26 will contain a black slide, or will not be used.
- FIG. 2 and FIG. 3 represent the display components at locations A and B, distant from each other, but the diagrams of FIGS. 2 and 3 illustrate the interactions occurring between the components at each location. The numbers 20 through 29 represent the functions of a signal processor. Number series 30 through 39 are the same signal processor functions at a second location.
- Referring to FIG. 2 (location A), a selected graphics image from memory 26 is routed through compositing function 25, through inhibitor function 24, then to projector 27, which projects the selected graphics onto screen 29. The audience at location A will see the stored graphics image from a local memory projected onto the projection screen as an original without loss of detail.
- Referring to FIG. 3 (location B), the same graphics image will be retrieved from computer 36 and routed through compositing function 35, through inhibitor function 34, then to projector 37, which projects the selected graphics onto screen 39. If there are third and fourth participating locations, their audiences will also see the same graphics, obtained from their own computers, being projected onto their screens without loss of detail.
- As long as there is no presenter in front of any projection screen, the presenter matte extraction function (22, 32) has nothing to extract, compositor (25, 35) has no foreground image to composite, and the inhibitor (24, 34) has no presenter to protect. When a person or object enters in front of the screen, it becomes a foreground subject and activates the above subject-related functions.
- Camera 20 is located directly below projector 27 so as to see presenter 28 and to maintain the proper alignment of the inhibit matte. A beam splitter is provided in camera 20 to split off an infrared or other image for the generation of a matte signal in matte generator 21.
- There are several matte generation methods in use. One is described in U.S. application Ser. No. 09/788,026 filed Feb. 16, 2001. One such method is described with reference to FIG. 4 as follows.
- Projected image source 41 of FIG. 4 represents the source of the video image to be projected onto projection screen 43. Image source 41 may be a computer, videocassette, digital videodisc, another camera, or other source of video image.
- The video program signal from image source 41 is connected to inhibitor 42, where the video signal at selected pixels may be inhibited. The program signal is then connected from inhibitor 42 to video projector 46, which projects the program image on projection screen 43.
- In one embodiment, at least one infrared source 47 is used to uniformly illuminate projection screen 43. Being infrared, this illumination is not seen by the viewer. Camera 45 is an infrared sensitive video camera observing the uniformly illuminated projection screen. Camera 45 output is connected to video inhibitor 42. The infrared signal at inhibitor 42 from the projection screen is nulled to zero. In the event a subject 44 enters into the projection beam, the subject's infrared reflection is likely to be higher or lower than the uniform infrared luminance level of the projection screen. Any infrared deviation from the infrared signal level established for the projection screen represents the subject. The addresses of those detected pixels that identify the subject location are used to inhibit the video program signal at these same addresses.
- There is always a possibility that some small area on the subject's wardrobe will reflect exactly the same amount of infrared as the screen. In this area, the inhibitor is fooled and the video signal is not inhibited. Such areas are of little concern since there is little probability of the infrared reflection from the subject's face matching that of the screen.
- The probability of deceiving the inhibit logic is reduced by selecting the infrared camera's pass band least likely to match the reflection levels of the subject.
- The near infrared bandwidth is very wide, and the infrared provided by an incandescent source provides a flat wide illumination bandwidth. The infrared sensitive camera may therefore be equipped with filters of adjoining pass bands such as 700-800, 800-900, and 900-1000 nanometers. It takes only a small shift in the pass band to effect a large change in infrared reflection. A filter selection may be made during setup to prevent the subject's infrared reflection from matching that of the screen.
- An alternative to selecting external pass band camera filters is to incorporate two or more infrared image channels in the camera, each filtered to a different pass band, with a separate infrared reference frame stored for each pass band.
- It is highly unlikely the subject's infrared reflection would simultaneously match the infrared reflection of two or more infrared pass bands.
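The multi-band safeguard above can be sketched as a per-pixel check across several infrared pass bands against a stored screen reference: a pixel counts as subject if it deviates in any band, so a wardrobe match would have to occur in every band at once to fool the detector. A sketch with invented names and values; the band labels follow the 700-1000 nm filters mentioned earlier:

```python
def subject_detected(observed, reference,
                     bands=("700-800", "800-900", "900-1000"),
                     tolerance=0.02):
    """Flag a pixel as subject if its infrared reflection deviates
    from the stored screen reference in at least one pass band."""
    return any(abs(observed[b] - reference[b]) > tolerance for b in bands)

reference = {"700-800": 0.50, "800-900": 0.48, "900-1000": 0.47}
screen_pixel = {"700-800": 0.50, "800-900": 0.48, "900-1000": 0.47}
# A subject that happens to match the screen in one band still
# deviates in the others.
subject_pixel = {"700-800": 0.50, "800-900": 0.31, "900-1000": 0.60}
```

The bare screen matches the reference in all bands and is left uninhibited; the subject pixel trips the check through its other bands.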
- To inhibit the projected image from falling upon the subject when the subject enters into the projected image, it is necessary to separate the subject from the scene being projected upon it.
- There are several existing ways of detecting a subject's location. A standard difference key, or matte, relies on a reference frame of the blank screen to compare with each succeeding frame to detect the subject's location. Since an image within the visible spectrum is also being projected onto the screen, a standard difference key does not appear to function in this application.
- Another option is to flood the projection screen with one or more bands of ultra violet light outside visible wavelengths.
- One might also separate the subject from the projection screen by using a long wave infrared camera sensitive to the temperature of the human body. Since a camera of this type sees body temperature, there is no need to flood the screen with long wave infrared.
- Other methods identify the subject presence by radar or sonar techniques that detect a subject as being at a shorter distance than the screen.
- Stereoscopic devices, and maximizing image detail, have been used in automatic cameras to determine distance. Any scheme that provides a signal separating the subject from the projected image may be used in this invention to inhibit the projected image in the area occupied by the subject.
- A preferred option is the use of near infrared to illuminate the projection screen. The infrared luminance level of the projection screen may be monitored and the reference frame updated to compensate for line voltage changes to the infrared source. The updated reference frame permits improved subject detection when infrared differences are very small. By using the infrared portion of the radiation spectrum, the projected and detected infrared images are immune from projected image content changes.
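The reference-frame update described above can be sketched as follows. The function name, blend rate, and array shapes are illustrative assumptions, not part of the patent; the idea is simply to blend new observations into the stored clear frame only where no subject is detected, so slow illumination drift (e.g. line-voltage changes to the infrared source) does not erode the matte.

```python
import numpy as np

def update_reference(ref_frame, observed, subject_mask, rate=0.05):
    """Slowly track illumination drift by blending the observed IR frame
    into the stored reference, but only at pixels not occupied by a subject."""
    background = subject_mask == 0          # pixels free of the subject
    updated = ref_frame.copy()
    updated[background] = ((1.0 - rate) * ref_frame[background]
                           + rate * observed[background])
    return updated

ref = np.full((4, 4), 100.0)          # stored IR reference frame
obs = np.full((4, 4), 104.0)          # screen now slightly brighter
mask = np.zeros((4, 4), dtype=int)    # no subject present
new_ref = update_reference(ref, obs, mask)
```

With the subject absent, every pixel drifts a small step toward the observed level; pixels inside the subject mask keep their stored value.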
- Using infrared illumination to generate a difference or ratio matte provides a practical method of identifying those pixels occupied by a subject. Equations for generating suitable ratio and difference mattes for this purpose are as follows:
- For the ratio matte:
- If IRo ≦ IRm, M = IRo / IRm
- If IRo > IRm, M = IRm / IRo
- If IRm = IRo = 0, M = 0
- For the difference matte:
- M = 1 − max [(IRo − IRm), (IRm − IRo)]
- Where:
- IRo = observed IR pixel value
- IRm = stored IR pixel value (at the same location)
- M= calculated matte value
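A minimal sketch of these matte equations, assuming pixel values normalized to the 0..1 range (the function names and sample arrays are illustrative, not from the patent):

```python
import numpy as np

def ratio_matte(ir_obs, ir_mem):
    """Ratio matte: M = IRo/IRm when IRo <= IRm, else M = IRm/IRo.
    M is 1.0 where observed and stored IR match, dropping toward 0
    where a subject changes the reflection.  Pixels where both values
    are zero are assigned 0, per the special case in the text."""
    lo = np.minimum(ir_obs, ir_mem)
    hi = np.maximum(ir_obs, ir_mem)
    m = np.zeros_like(hi, dtype=float)
    nonzero = hi > 0
    m[nonzero] = lo[nonzero] / hi[nonzero]
    return m

def difference_matte(ir_obs, ir_mem):
    """Difference matte: M = 1 - max[(IRo-IRm), (IRm-IRo)] = 1 - |IRo-IRm|."""
    return 1.0 - np.abs(ir_obs - ir_mem)

screen = np.array([0.8, 0.8, 0.8])   # stored clear-frame IR values
seen   = np.array([0.8, 0.4, 0.2])   # subject now covers two pixels
```

Both mattes read 1.0 over the bare screen and fall off where the subject's infrared reflection differs from the stored frame.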
- Inhibiting of the projected image may be continuous, either linear or nonlinear, as opposed to a switch action. If nonlinear, the earliest and smallest detectable variance of the infrared signal is made to cause a small reduction of video signal level. As the deviation increases, the rate of inhibition increases. When the deviation nears a selected level, the inhibition rate is rapidly increased to cutoff, or to a selected low level near cutoff. The variable rate at which signal inhibition occurs prevents the on-off flicker effect of a switch action. FIG. 5 illustrates this relationship.
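The exact shaping curve of FIG. 5 is not reproduced here; a cubic roll-off is one plausible sketch of the described behavior (small reduction for small deviations, with the inhibition rate accelerating toward cutoff). The function name and the `cutoff` and `floor` parameters are assumptions:

```python
def shape_inhibit(deviation, cutoff=0.5, floor=0.0):
    """Map an IR deviation (0..1) to a projector gain multiplier.
    Small deviations reduce the gain only slightly; as the deviation
    nears `cutoff` the gain is driven rapidly toward `floor`,
    avoiding the on-off flicker of a hard switch."""
    x = min(max(deviation, 0.0) / cutoff, 1.0)   # normalize to the cutoff level
    return 1.0 - (1.0 - floor) * x ** 3          # gentle start, steep finish
```

A deviation of half the cutoff still passes 87.5% of the signal, while deviations at or beyond the cutoff drive the gain to the selected low level.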
- The term “inhibit” is defined as a reduction in the level of the projected image in that area occupied by the subject. In fact, if the level is reduced to about 5% of full level, the visibility of the subject is reduced to visual black. With little or no projector illumination onto the subject, the subject will receive no illumination other than ambient room light, which is typically attenuated to a very low level when using a projector.
- Since subject illumination from the video projector has been inhibited to near zero, RGB levels representing white (or colored) light may be added to those pixels defining the subject area. The illumination of the subject may therefore be increased above that produced by ambient light alone. Although at a lower level than full projection, this supplementary illumination augmenting ambient room light is still likely to be somewhat annoying to a subject facing the projector.
- The techniques described in U.S. Pat. No. 5,270,820 may be used to locate the speaker's head (or other extremity). With this additional information, the projected white (or colored) light onto the subject may be inhibited in the region of his head and eyes.
- The term “projection screen” or “screen” has been used above. This screen may be white, beaded, metallic, or metallic coated lenticular, or any surface suitable for viewing a projected image.
- In FIG. 4, image source 41, the video program source, may be a computer, videotape, or videodisc as selected by the user.
- The video projector 46 and projection surface 43 are commercial devices selected by the user. An infrared filter, if needed, removes any residual infrared in the video projection beam.
- The infrared sensitive camera 45 is a video camera whose photoreceptors extend into the near infrared beyond 700 nanometers. A filter is placed over the camera lens to remove visible wavelengths.
- At least one infrared source 47 is a projector using an incandescent lamp. A filter is placed over the infrared source to remove visible light. Inhibitor 42 is the detector/inhibitor; its function has been described earlier.
- FIG. 6 is a logic flow diagram showing the functions of subject detection and program signal inhibiting. Referring to FIG. 6, IR camera 61 may be a 480 line VGA progressive scan low resolution camera, or any other low resolution camera sensitive to near infrared. Clear frame memory 62 is a stored infrared image of the infrared illuminated screen with the subject removed from the scene. The mask generator 63 compares the infrared sensitive camera image with the clear frame image in memory 62, and any difference identifies the area occupied by a subject, if present. Shaping function 64 shapes the subject detection signal from an on-off signal to a linear or nonlinear signal as shown in FIG. 5.
- Projector image source 65 is the program source to be projected onto the projection screen. The program video is generally an image of much higher resolution than an NTSC signal. Image size detect 66 determines the resolution of the program image and connects this size data to scale and filter 67, which acts as a standards converter to scale the infrared camera image to match the size of the projected image. Having matched image sizes, the program image is inhibited in inhibit projector image 68 in the area occupied by a subject, if a subject is present. Projector 69 projects the program image onto the screen, but does not project the program onto the subject.
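The scale-and-filter stage described above can be approximated by a nearest-neighbour rescale of the low-resolution matte to the program image's resolution; a production standards converter would also low-pass filter the result. The function name and sample arrays are illustrative:

```python
import numpy as np

def scale_matte(matte, out_h, out_w):
    """Nearest-neighbour rescale of a low-resolution IR matte so its
    dimensions match the (higher-resolution) program image."""
    in_h, in_w = matte.shape
    rows = (np.arange(out_h) * in_h) // out_h   # source row for each output row
    cols = (np.arange(out_w) * in_w) // out_w   # source column for each output column
    return matte[rows[:, None], cols]

small = np.array([[1.0, 0.0],
                  [1.0, 1.0]])      # 2x2 matte from the IR camera
big = scale_matte(small, 4, 4)      # rescaled to the program image size
```

Each output pixel simply copies the nearest matte pixel, so the subject region scales with the program image before the inhibit multiplication.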
- Matte signal 21 is generated by one of such existing methods from information provided by camera 20.
- Matte signal generator 21 generates an inhibit matte signal and supplies it to inhibitor 24. The matte signal is assigned a 0.0 value for those pixels that constitute the foreground subject. Pixels in areas of the screen displaying the graphics surrounding the subject are assigned a 1.0. The graphics image 26 passes through compositor 25 to the inhibit multiplier 24. The graphics image is multiplied in 24 by the matte signal from 21, whose zeros in the subject area shut off (inhibit) the projector signal in the area of the subject. At this point the audience at location A (FIG. 2) sees the presenter, illuminated by room light, with the graphics appearing on the screen behind him. The presenter may look at his audience without being blinded by the glare of the projector. The use of a matte signal in generating an inhibit signal is described above. (While the matte signal will be required to isolate the subject, an inhibit signal is not required for a rear projected image or a liquid crystal display.)
- The inhibit matte signal from generator 21 is inverted to form a second matte signal providing a 1.0 value for the subject area and a 0.0 value for the background surrounding the subject. This second matte and the video signal from camera 20 are connected to multiplier 23. Their product is the Processed Foreground signal (PrFg), consisting of the subject against a 0.0 field of black. The processed foreground having a subject on a field of 0.0 black is intentional, since the blackest black in a video signal sits atop a pedestal of about 7% of white. The 0.0 of the processed foreground video is therefore a matte signal transmitted with the isolated subject. The processed foreground 23 from location A is connected to the matte extraction function 32 and compositing function 35 at location B.
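The matte multiplications above can be sketched with one-dimensional pixel rows; the values and variable names are illustrative, with video levels normalized so white is 1.0:

```python
import numpy as np

# Inhibit matte: 0.0 over the subject, 1.0 over the surrounding graphics.
inhibit_matte = np.array([1.0, 0.0, 0.0, 1.0])
camera = np.array([0.30, 0.55, 0.60, 0.25])    # camera video of the scene
graphics = np.array([0.80, 0.80, 0.80, 0.80])  # graphics to be projected

# Projector feed: the graphics are shut off (inhibited) where the subject stands.
projected = graphics * inhibit_matte

# The inverted matte isolates the subject; the Processed Foreground (PrFg)
# is the subject on a true 0.0 black field, below the ~7% video pedestal.
prfg = camera * (1.0 - inhibit_matte)
```

The projector thus leaves the subject dark, while the PrFg carries the isolated subject (and, implicitly, its matte) to the other locations.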
- The matte extraction function 32 separates the processed foreground, whose lowest level is the 7% pedestal, from the 0.0 of the black field by setting a detection threshold at about 3%. All pixels above the threshold are in the foreground and are assigned a 1.0 value. All pixels below the selected threshold are in the background and are assigned a 0.0 value. The assignment of pixel values as 1.0 or 0.0 is arbitrary and may be inverted as required by the function it is intended to control. A threshold level above camera and system noise is necessary to prevent background area noise peaks from incorrectly being accepted as a subject pixel.
- The extracted matte is inverted to provide a 0.0 in the processed foreground area and a 1.0 in the graphics area surrounding the subject. Multiplying the graphics image from source 36 by 1.0 (the matte signal) retains the full signal level of the graphics surrounding the subject, but the 0.0 in the subject area creates a 0.0 black hole in the projected graphics. Compositing function 35 adds the processed foreground, consisting only of the subject, into the hole created for it. The composite image from 35 is routed through the inhibit function 34 to projector 39. The audience at location B sees the graphics from their own image source 36 being projected onto their own screen with the video image of the presenter from location A composited over their graphics.
- The quality of the image is limited only by the resolution of the original image and the resolution of the projector. By pre-loading the graphics at each location, the remaining data to send to other locations is only the processed video signal, with sound.
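The receiver-side extraction and compositing can be sketched as follows, assuming video levels normalized so the ~7% pedestal is 0.07; the function name and the 3% threshold constant mirror the text but are otherwise illustrative:

```python
import numpy as np

PEDESTAL_THRESHOLD = 0.03   # ~3%: above system noise, below the ~7% pedestal

def composite(prfg, graphics):
    """Receiver side: re-extract the matte from the PrFg's 0.0 black field,
    punch a hole in the local graphics, and add the subject into it."""
    extracted = (prfg > PEDESTAL_THRESHOLD).astype(float)  # 1.0 = subject
    hole = graphics * (1.0 - extracted)   # graphics with a 0.0 hole
    return hole + prfg                    # the subject fills the hole

prfg = np.array([0.0, 0.55, 0.60, 0.0])    # subject pixels sit above the pedestal
graphics = np.array([0.8, 0.8, 0.8, 0.8])  # the receiving location's own graphics
out = composite(prfg, graphics)
```

Because the PrFg's background is true 0.0 black, no separate matte channel needs to be transmitted: the threshold recovers it at each receiving location.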
- The process of using the matte signal to multiply and add to composite an image over a background preserves subject edge transparency. However, when the matte signal assigned is a binary switch (i.e. 1.0 or 0.0), the composite image may instead be formed by a key function derived from the matte signal to switch between a stored image and the presenter. In either case the presenter pixel values replace those of the background image to form the composite image.
- A binary 1/0 matte signal generates a sharp-edged switch; however, the matte edge can be sized to better fit the subject outline, and it may be softened to improve the transition from the presenter to his background.
- The inhibit function 34 awaits the presence of a presenter 38. When a person 38 at location B wishes to participate, he steps in front of his screen. The detection and inhibit functions then prevent projector 37 from projecting onto the person 38, and the processed foreground of person 38 is connected to matte extractor 22 and compositor 25 at location A. The video of person 38 at location B, in front of his screen, will be composited over the graphics being projected at location A. The audience at location B will see participant 38 in person in front of the projected graphics, and presenter 28 will be seen composited over said graphics.
- By looking at the screen, both participants will see the other person's video image composited with the graphics. The participants may see and face each other, point to elements in the graphics, and discuss them. The audience at locations C and D will see the presenter A and participant B on their projection screens. A person at C and D may also become a participant by stepping in front of their screen. The audience at the location of a participant will see their presenter in person and all other presenters will appear on the screen behind him, but in front of projected graphics.
- There is an obvious limitation to the number of simultaneous participants that can be in the scene and still see the graphics behind them. If the presentation is in the form of a number of speeches, the graphics may be generated to occupy the upper part of the screen so the seated participants will not obscure material that needs to be seen by the audience. Each presenter in turn makes his presentation while the audience at all locations watch the speaker and the reaction of those seated.
- If a large white board is used as a projection screen, then the presenter and whatever he writes or draws becomes part of the subject matter and will be projected onto the white boards at the other locations. A participant from another location may draw on his own white board and his writing will be projected on all the other white boards. In this manner each location may contribute to a drawing, add to a list, mark locations on a map, etc.
- Rear projection and liquid crystal display systems do not require the inhibit function 24, which is therefore bypassed. FIG. 7 shows the signal flow through a signal processor after the inhibit function is removed or inactivated.
- FIG. 8 illustrates the interconnections required for four participating locations such as A, B, C and D. The output signal at each of these locations is a Processed Foreground (PrFg) and is connected to the compositing function at all other locations. The input needed by each location is the PrFg signal from all other locations. In FIG. 8, the PrFg 23 from location A is shown connected to the compositing functions at B, C, and D to illustrate how the PrFg is connected to the input stages at other locations. The remaining connections are made as indicated in FIG. 8.
- FIG. 9 illustrates the compositing function needed when there are four participating locations. The addition of a third location, C, requires a separate compositing stage 22′ and 25′. The addition of a fourth location, D, requires a separate compositing stage 22″ and 25″. The number of compositing stages needed is one less than the number of participating locations.
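The interconnection count implied by FIGS. 8 and 9 can be checked with a short sketch (the function name and return values are illustrative): each location fans its PrFg out to every other location and needs one compositing stage per remote site.

```python
def fanout(locations):
    """Each location sends its PrFg to every other location: N-1
    compositing stages at each site, N*(N-1) one-way PrFg links in all."""
    n = len(locations)
    links = [(src, dst) for src in locations for dst in locations if src != dst]
    stages_per_site = n - 1
    return links, stages_per_site

links, stages = fanout(["A", "B", "C", "D"])
```

For the four locations of FIG. 8 this yields twelve one-way PrFg connections and three compositing stages per location, matching the "one less than the number of participating locations" rule.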
Claims (18)
Priority Applications (11)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/038,229 US6454415B1 (en) | 2001-02-16 | 2002-01-02 | Interactive teleconferencing display system |
CA002438739A CA2438739C (en) | 2001-02-16 | 2002-02-14 | Interactive teleconferencing display system |
CNB028083237A CN100366079C (en) | 2001-02-16 | 2002-02-14 | Interactive teleconferencing display system |
AU2002255556A AU2002255556B2 (en) | 2001-02-16 | 2002-02-14 | Interactive teleconferencing display system |
EP02724957.2A EP1368705B1 (en) | 2001-02-16 | 2002-02-14 | Interactive teleconferencing display system |
JP2002566717A JP2004525560A (en) | 2001-02-16 | 2002-02-14 | Interactive teleconference display system |
PCT/US2002/004593 WO2002067050A1 (en) | 2001-02-16 | 2002-02-14 | Interactive teleconferencing display system |
KR1020037010773A KR100851612B1 (en) | 2001-02-16 | 2002-02-14 | Interactive teleconferencing display system |
TW091102646A TW571595B (en) | 2001-02-16 | 2002-02-18 | Interactive teleconferencing display system |
MYPI20020537A MY122949A (en) | 2001-02-16 | 2002-02-18 | Interactive teleconferencing display system |
HK04109747A HK1067716A1 (en) | 2001-02-16 | 2004-12-08 | Interactive teleconferencing display system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/788,026 US6361173B1 (en) | 2001-02-16 | 2001-02-16 | Method and apparatus for inhibiting projection of selected areas of a projected image |
US10/038,229 US6454415B1 (en) | 2001-02-16 | 2002-01-02 | Interactive teleconferencing display system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/788,026 Continuation-In-Part US6361173B1 (en) | 2001-02-16 | 2001-02-16 | Method and apparatus for inhibiting projection of selected areas of a projected image |
Publications (2)
Publication Number | Publication Date |
---|---|
US20020113950A1 true US20020113950A1 (en) | 2002-08-22 |
US6454415B1 US6454415B1 (en) | 2002-09-24 |
Family
ID=25143215
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/788,026 Expired - Lifetime US6361173B1 (en) | 2001-02-16 | 2001-02-16 | Method and apparatus for inhibiting projection of selected areas of a projected image |
US10/038,229 Expired - Lifetime US6454415B1 (en) | 2001-02-16 | 2002-01-02 | Interactive teleconferencing display system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/788,026 Expired - Lifetime US6361173B1 (en) | 2001-02-16 | 2001-02-16 | Method and apparatus for inhibiting projection of selected areas of a projected image |
Country Status (13)
Country | Link |
---|---|
US (2) | US6361173B1 (en) |
EP (1) | EP1379916B1 (en) |
JP (1) | JP2004533632A (en) |
KR (2) | KR100948572B1 (en) |
CN (1) | CN100476570C (en) |
AT (1) | ATE463760T1 (en) |
AU (1) | AU2002306508B2 (en) |
CA (1) | CA2438724C (en) |
DE (1) | DE60235880D1 (en) |
HK (2) | HK1067716A1 (en) |
MY (2) | MY126256A (en) |
TW (1) | TW522279B (en) |
WO (1) | WO2002067049A1 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6616284B2 (en) * | 2000-03-06 | 2003-09-09 | Si Diamond Technology, Inc. | Displaying an image based on proximity of observer |
US20060033824A1 (en) * | 2004-08-10 | 2006-02-16 | Nicholson Bruce A | Sodium screen digital traveling matte methods and apparatus |
US20070273842A1 (en) * | 2006-05-24 | 2007-11-29 | Gerald Morrison | Method And Apparatus For Inhibiting A Subject's Eyes From Being Exposed To Projected Light |
US20080018862A1 (en) * | 2006-07-18 | 2008-01-24 | Fuji Xerox Co., Ltd. | Image display apparatus, image display method, and program product therefor |
US20090091711A1 (en) * | 2004-08-18 | 2009-04-09 | Ricardo Rivera | Image Projection Kit and Method and System of Distributing Image Content For Use With The Same |
US20140043545A1 (en) * | 2011-05-23 | 2014-02-13 | Panasonic Corporation | Light projection device |
US20140063353A1 (en) * | 2012-09-05 | 2014-03-06 | 3M Innovative Properties Company | Variable brightness digital signage |
US20150378250A1 (en) * | 2014-06-26 | 2015-12-31 | Panasonic Intellectual Property Management Co., Ltd. | Light projection apparatus and illumination apparatus using same |
CN109508162A (en) * | 2018-10-12 | 2019-03-22 | 福建星网视易信息系统有限公司 | A kind of throwing screen display methods, system and storage medium |
Families Citing this family (79)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6553140B1 (en) | 1999-04-16 | 2003-04-22 | Avid Technology, Inc. | Method and system for spill correction |
JP3678111B2 (en) * | 2000-04-25 | 2005-08-03 | セイコーエプソン株式会社 | Luminous flux control when a person enters the path of the projection light emitted from the projector |
US6785402B2 (en) * | 2001-02-15 | 2004-08-31 | Hewlett-Packard Development Company, L.P. | Head tracking and color video acquisition via near infrared luminance keying |
US6742901B2 (en) * | 2001-05-16 | 2004-06-01 | Sony Corporation | Imaging prevention method and system |
JP2003043412A (en) * | 2001-08-01 | 2003-02-13 | Fuji Photo Optical Co Ltd | Presentation system using laser pointer |
US7468778B2 (en) * | 2002-03-15 | 2008-12-23 | British Broadcasting Corp | Virtual studio system |
WO2003104892A1 (en) * | 2002-06-10 | 2003-12-18 | ソニー株式会社 | Image projector and image projecting method |
JP3744002B2 (en) * | 2002-10-04 | 2006-02-08 | ソニー株式会社 | Display device, imaging device, and imaging / display system |
US6840627B2 (en) * | 2003-01-21 | 2005-01-11 | Hewlett-Packard Development Company, L.P. | Interactive display device |
US6789903B2 (en) * | 2003-02-18 | 2004-09-14 | Imatte, Inc. | Generating an inhibit signal by pattern displacement |
JP3849654B2 (en) * | 2003-02-21 | 2006-11-22 | 株式会社日立製作所 | Projection display |
US6811267B1 (en) * | 2003-06-09 | 2004-11-02 | Hewlett-Packard Development Company, L.P. | Display system with nonvisible data projection |
US6796656B1 (en) * | 2003-06-14 | 2004-09-28 | Imatte, Inc. | Generating a matte signal from a retro reflective component of a front projection screen |
JP3735624B2 (en) * | 2003-09-26 | 2006-01-18 | Necビューテクノロジー株式会社 | Projection display device |
US7324166B1 (en) * | 2003-11-14 | 2008-01-29 | Contour Entertainment Inc | Live actor integration in pre-recorded well known video |
US6984039B2 (en) * | 2003-12-01 | 2006-01-10 | Eastman Kodak Company | Laser projector having silhouette blanking for objects in the output light path |
US6860604B1 (en) * | 2004-01-09 | 2005-03-01 | Imatte, Inc. | Method and apparatus for inhibiting the projection of a shadow of a presenter onto a projection screen |
JP4747524B2 (en) * | 2004-07-13 | 2011-08-17 | セイコーエプソン株式会社 | projector |
US7325933B2 (en) * | 2004-08-09 | 2008-02-05 | Sanyo Electric Co., Ltd | Projection type video display apparatus |
EP1851588B1 (en) * | 2005-02-01 | 2019-08-07 | Laser Projection Technologies, Inc. | Laser projection with object feature detection |
US20060170871A1 (en) * | 2005-02-01 | 2006-08-03 | Dietz Paul H | Anti-blinding safety feature for projection systems |
US8386909B2 (en) * | 2005-04-07 | 2013-02-26 | Hewlett-Packard Development Company, L.P. | Capturing and presenting interactions with image-based media |
US20070018989A1 (en) * | 2005-07-20 | 2007-01-25 | Playmotion, Llc | Sensory integration therapy system and associated method of use |
EP2027720A2 (en) * | 2006-05-17 | 2009-02-25 | Eidgenössische Technische Hochschule | Displaying information interactively |
US8098330B2 (en) * | 2006-07-28 | 2012-01-17 | International Business Machines Corporation | Mapping of presentation material |
US8045060B2 (en) * | 2006-10-04 | 2011-10-25 | Hewlett-Packard Development Company, L.P. | Asynchronous camera/projector system for video segmentation |
US7690795B2 (en) * | 2006-10-06 | 2010-04-06 | Hewlett-Packard Development Company, L.P. | Projector/camera system |
EP2104930A2 (en) | 2006-12-12 | 2009-09-30 | Evans & Sutherland Computer Corporation | System and method for aligning rgb light in a single modulator projector |
TWI325998B (en) * | 2006-12-20 | 2010-06-11 | Delta Electronics Inc | Projection apparatus and system |
WO2008144749A1 (en) * | 2007-05-21 | 2008-11-27 | Evans & Sutherland Computer Corporation | Invisible scanning safety system |
US8251517B2 (en) * | 2007-12-05 | 2012-08-28 | Microvision, Inc. | Scanned proximity detection method and apparatus for a scanned image projection system |
US20090147272A1 (en) * | 2007-12-05 | 2009-06-11 | Microvision, Inc. | Proximity detection for control of an imaging device |
US8358317B2 (en) | 2008-05-23 | 2013-01-22 | Evans & Sutherland Computer Corporation | System and method for displaying a planar image on a curved surface |
US8702248B1 (en) | 2008-06-11 | 2014-04-22 | Evans & Sutherland Computer Corporation | Projection method for reducing interpixel gaps on a viewing surface |
US8215799B2 (en) | 2008-09-23 | 2012-07-10 | Lsi Industries, Inc. | Lighting apparatus with heat dissipation system |
USD631183S1 (en) | 2008-09-23 | 2011-01-18 | Lsi Industries, Inc. | Lighting fixture |
US8077378B1 (en) | 2008-11-12 | 2011-12-13 | Evans & Sutherland Computer Corporation | Calibration system and method for light modulation device |
DE102008060110A1 (en) * | 2008-12-03 | 2010-06-10 | Sennheiser Electronic Gmbh & Co. Kg | Projection apparatus for projecting video signal to projection surface, has projection unit for generating broadcasting light signal based on video input signal |
US8290208B2 (en) * | 2009-01-12 | 2012-10-16 | Eastman Kodak Company | Enhanced safety during laser projection |
US8628198B2 (en) * | 2009-04-20 | 2014-01-14 | Lsi Industries, Inc. | Lighting techniques for wirelessly controlling lighting elements |
US20100264314A1 (en) * | 2009-04-20 | 2010-10-21 | Lsi Industries, Inc. | Lighting Techniques for Wirelessly Controlling Lighting Elements |
WO2011013240A1 (en) * | 2009-07-31 | 2011-02-03 | Necディスプレイソリューションズ株式会社 | Projection display device and light quantity adjusting method |
US8515196B1 (en) * | 2009-07-31 | 2013-08-20 | Flir Systems, Inc. | Systems and methods for processing infrared images |
US8672427B2 (en) * | 2010-01-25 | 2014-03-18 | Pepsico, Inc. | Video display for product merchandisers |
KR20110116525A (en) * | 2010-04-19 | 2011-10-26 | 엘지전자 주식회사 | Image display device and operating method for the same |
CN102681312B (en) * | 2011-03-16 | 2015-06-24 | 宏瞻科技股份有限公司 | Human eye safety protection system of laser projection system |
WO2012122679A1 (en) * | 2011-03-16 | 2012-09-20 | Chen Chih-Hsiao | Human eyes safety protection system of a laser projection system |
KR20120129664A (en) * | 2011-05-20 | 2012-11-28 | 삼성전자주식회사 | Projector and method for controlling of projector |
US9641826B1 (en) | 2011-10-06 | 2017-05-02 | Evans & Sutherland Computer Corporation | System and method for displaying distant 3-D stereo on a dome surface |
TW201329508A (en) * | 2012-01-04 | 2013-07-16 | Walsin Lihwa Corp | Device and method for protecting eyes |
JP6281857B2 (en) * | 2012-01-13 | 2018-02-21 | 株式会社ドワンゴ | Video system and photographing method |
FR2991787B1 (en) * | 2012-06-11 | 2014-07-18 | Mireille Jacquesson | DEVICE AND METHOD FOR PROJECTING IMAGES ON MOBILE SCREENS |
CN103576428B (en) | 2012-08-02 | 2015-11-25 | 光宝科技股份有限公司 | There is the laser projection system of safety protecting mechanism |
GB2505708B (en) | 2012-09-11 | 2015-02-25 | Barco Nv | Projection system with safety detection |
US9407961B2 (en) * | 2012-09-14 | 2016-08-02 | Intel Corporation | Media stream selective decode based on window visibility state |
CN102914936B (en) * | 2012-09-20 | 2016-02-17 | 深圳雅图数字视频技术有限公司 | The control method of projector and device |
US9357165B2 (en) | 2012-11-16 | 2016-05-31 | At&T Intellectual Property I, Lp | Method and apparatus for providing video conferencing |
CN104981757B (en) | 2013-02-14 | 2017-08-22 | 苹果公司 | Flexible room controller |
US9514558B2 (en) * | 2013-09-06 | 2016-12-06 | Imatte, Inc. | Method for preventing selected pixels in a background image from showing through corresponding pixels in a transparency layer |
US9716861B1 (en) | 2014-03-07 | 2017-07-25 | Steelcase Inc. | Method and system for facilitating collaboration sessions |
US10664772B1 (en) | 2014-03-07 | 2020-05-26 | Steelcase Inc. | Method and system for facilitating collaboration sessions |
US9380682B2 (en) | 2014-06-05 | 2016-06-28 | Steelcase Inc. | Environment optimization for space based on presence and activities |
US9766079B1 (en) | 2014-10-03 | 2017-09-19 | Steelcase Inc. | Method and system for locating resources and communicating within an enterprise |
US9955318B1 (en) | 2014-06-05 | 2018-04-24 | Steelcase Inc. | Space guidance and management system and method |
US10433646B1 (en) | 2014-06-06 | 2019-10-08 | Steelcase Inc. | Microclimate control systems and methods |
US11744376B2 (en) | 2014-06-06 | 2023-09-05 | Steelcase Inc. | Microclimate control systems and methods |
US9852388B1 (en) | 2014-10-03 | 2017-12-26 | Steelcase, Inc. | Method and system for locating resources and communicating within an enterprise |
CN104598034B (en) * | 2015-02-09 | 2019-03-29 | 联想(北京)有限公司 | Information processing method and information processing equipment |
US10733371B1 (en) | 2015-06-02 | 2020-08-04 | Steelcase Inc. | Template based content preparation system for use with a plurality of space types |
CN105007439A (en) * | 2015-07-15 | 2015-10-28 | 合肥联宝信息技术有限公司 | Projector and method for avoiding projecting projected image on human body |
JP6464977B2 (en) * | 2015-09-30 | 2019-02-06 | ブラザー工業株式会社 | Projection control apparatus and program |
US9921726B1 (en) | 2016-06-03 | 2018-03-20 | Steelcase Inc. | Smart workstation method and system |
US10264213B1 (en) | 2016-12-15 | 2019-04-16 | Steelcase Inc. | Content amplification system and method |
CN106709954B (en) * | 2016-12-27 | 2020-05-15 | 上海唱风信息科技有限公司 | Method for masking human face in projection area |
US10027934B1 (en) | 2017-01-03 | 2018-07-17 | International Business Machines Corporation | Prohibiting facial exposure to projected light |
CN111856866A (en) * | 2019-04-30 | 2020-10-30 | 中强光电股份有限公司 | Projection device and operation method thereof |
CN114019756A (en) * | 2020-07-28 | 2022-02-08 | 青岛海信激光显示股份有限公司 | Laser projection equipment and human eye protection method |
US11984739B1 (en) | 2020-07-31 | 2024-05-14 | Steelcase Inc. | Remote power systems, apparatus and methods |
US11397071B1 (en) | 2021-09-14 | 2022-07-26 | Vladimir V. Maslinkovskiy | System and method for anti-blinding target game |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5367315A (en) * | 1990-11-15 | 1994-11-22 | Eyetech Corporation | Method and apparatus for controlling cursor movement |
US5270820A (en) * | 1992-06-25 | 1993-12-14 | Ultimatte Corporation | Method and apparatus for tracking a pointing device in a video field |
US5681223A (en) * | 1993-08-20 | 1997-10-28 | Inventures Inc | Training video method and display |
US5572248A (en) | 1994-09-19 | 1996-11-05 | Teleport Corporation | Teleconferencing method and system for providing face-to-face, non-animated teleconference environment |
US5912700A (en) * | 1996-01-10 | 1999-06-15 | Fox Sports Productions, Inc. | System for enhancing the television presentation of an object at a sporting event |
US5574511A (en) * | 1995-10-18 | 1996-11-12 | Polaroid Corporation | Background replacement for an image |
DE19640404A1 (en) * | 1996-09-30 | 1998-04-09 | Ldt Gmbh & Co | Device for displaying images |
JP3510746B2 (en) * | 1996-11-14 | 2004-03-29 | 理想科学工業株式会社 | Electric stapler |
US6252632B1 (en) * | 1997-01-17 | 2001-06-26 | Fox Sports Productions, Inc. | System for enhancing a video presentation |
US5971544A (en) * | 1997-07-24 | 1999-10-26 | Chris Perry | Color key surface and stage |
US6259470B1 (en) * | 1997-12-18 | 2001-07-10 | Intel Corporation | Image capture system having virtual camera |
US5913591A (en) * | 1998-01-20 | 1999-06-22 | University Of Washington | Augmented imaging using a silhouette to improve contrast |
JP3630015B2 (en) * | 1999-04-21 | 2005-03-16 | セイコーエプソン株式会社 | Projection display apparatus and information storage medium |
-
2001
- 2001-02-16 US US09/788,026 patent/US6361173B1/en not_active Expired - Lifetime
-
2002
- 2002-01-02 US US10/038,229 patent/US6454415B1/en not_active Expired - Lifetime
- 2002-02-14 WO PCT/US2002/004591 patent/WO2002067049A1/en active Application Filing
- 2002-02-14 KR KR1020037010774A patent/KR100948572B1/en not_active IP Right Cessation
- 2002-02-14 DE DE60235880T patent/DE60235880D1/en not_active Expired - Lifetime
- 2002-02-14 JP JP2002566716A patent/JP2004533632A/en active Pending
- 2002-02-14 AU AU2002306508A patent/AU2002306508B2/en not_active Ceased
- 2002-02-14 CA CA2438724A patent/CA2438724C/en not_active Expired - Fee Related
- 2002-02-14 KR KR1020037010773A patent/KR100851612B1/en not_active IP Right Cessation
- 2002-02-14 EP EP02742480A patent/EP1379916B1/en not_active Expired - Lifetime
- 2002-02-14 CN CNB028083210A patent/CN100476570C/en not_active Expired - Fee Related
- 2002-02-14 AT AT02742480T patent/ATE463760T1/en not_active IP Right Cessation
- 2002-02-15 TW TW091102580A patent/TW522279B/en not_active IP Right Cessation
- 2002-02-15 MY MYPI20020531A patent/MY126256A/en unknown
- 2002-02-18 MY MYPI20020537A patent/MY122949A/en unknown
-
2004
- 2004-12-08 HK HK04109747A patent/HK1067716A1/en not_active IP Right Cessation
- 2004-12-08 HK HK04109746.4A patent/HK1066871A1/en not_active IP Right Cessation
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6616284B2 (en) * | 2000-03-06 | 2003-09-09 | Si Diamond Technology, Inc. | Displaying an image based on proximity of observer |
US20060033824A1 (en) * | 2004-08-10 | 2006-02-16 | Nicholson Bruce A | Sodium screen digital traveling matte methods and apparatus |
US8066384B2 (en) | 2004-08-18 | 2011-11-29 | Klip Collective, Inc. | Image projection kit and method and system of distributing image content for use with the same |
US9560307B2 (en) | 2004-08-18 | 2017-01-31 | Klip Collective, Inc. | Image projection kit and method and system of distributing image content for use with the same |
US20090091711A1 (en) * | 2004-08-18 | 2009-04-09 | Ricardo Rivera | Image Projection Kit and Method and System of Distributing Image Content For Use With The Same |
US10986319B2 (en) | 2004-08-18 | 2021-04-20 | Klip Collective, Inc. | Method for projecting image content |
US8632192B2 (en) | 2004-08-18 | 2014-01-21 | Klip Collective, Inc. | Image projection kit and method and system of distributing image content for use with the same |
US10567718B2 (en) | 2004-08-18 | 2020-02-18 | Klip Collective, Inc. | Image projection kit and method and system of distributing image content for use with the same |
US10084998B2 (en) | 2004-08-18 | 2018-09-25 | Klip Collective, Inc. | Image projection kit and method and system of distributing image content for use with the same |
US9078029B2 (en) | 2004-08-18 | 2015-07-07 | Klip Collective, Inc. | Image projection kit and method and system of distributing image content for use with the same |
US20080106706A1 (en) * | 2006-05-24 | 2008-05-08 | Smart Technologies, Inc. | Method and apparatus for inhibiting a subject's eyes from being exposed to projected light |
US7686460B2 (en) * | 2006-05-24 | 2010-03-30 | Smart Technologies Ulc | Method and apparatus for inhibiting a subject's eyes from being exposed to projected light |
US20100182416A1 (en) * | 2006-05-24 | 2010-07-22 | Smart Technologies Ulc | Method and apparatus for inhibiting a subject's eyes from being exposed to projected light |
US7984995B2 (en) * | 2006-05-24 | 2011-07-26 | Smart Technologies Ulc | Method and apparatus for inhibiting a subject's eyes from being exposed to projected light |
US20070273842A1 (en) * | 2006-05-24 | 2007-11-29 | Gerald Morrison | Method And Apparatus For Inhibiting A Subject's Eyes From Being Exposed To Projected Light |
US7918566B2 (en) * | 2006-07-18 | 2011-04-05 | Fuji Xerox Co., Ltd. | Image display apparatus, image display method, and program product therefor |
US20080018862A1 (en) * | 2006-07-18 | 2008-01-24 | Fuji Xerox Co., Ltd. | Image display apparatus, image display method, and program product therefor |
US9217865B2 (en) * | 2011-05-23 | 2015-12-22 | Panasonic Intellectual Property Management Co., Ltd. | Light projection device |
US20140043545A1 (en) * | 2011-05-23 | 2014-02-13 | Panasonic Corporation | Light projection device |
US20140063353A1 (en) * | 2012-09-05 | 2014-03-06 | 3M Innovative Properties Company | Variable brightness digital signage |
US20150378250A1 (en) * | 2014-06-26 | 2015-12-31 | Panasonic Intellectual Property Management Co., Ltd. | Light projection apparatus and illumination apparatus using same |
US9465281B2 (en) * | 2014-06-26 | 2016-10-11 | Panasonic Intellectual Property Management Co., Ltd. | Light projection apparatus and illumination apparatus using same |
CN109508162A (en) * | 2018-10-12 | 2019-03-22 | 福建星网视易信息系统有限公司 | A kind of throwing screen display methods, system and storage medium |
Also Published As
Publication number | Publication date |
---|---|
KR100948572B1 (en) | 2010-03-18 |
CN1503924A (en) | 2004-06-09 |
EP1379916A4 (en) | 2007-08-29 |
TW522279B (en) | 2003-03-01 |
US6361173B1 (en) | 2002-03-26 |
KR100851612B1 (en) | 2008-08-12 |
AU2002306508B2 (en) | 2006-03-09 |
EP1379916B1 (en) | 2010-04-07 |
ATE463760T1 (en) | 2010-04-15 |
HK1067716A1 (en) | 2005-04-15 |
CN100476570C (en) | 2009-04-08 |
CA2438724C (en) | 2010-01-26 |
MY122949A (en) | 2006-05-31 |
KR20030093206A (en) | 2003-12-06 |
MY126256A (en) | 2006-09-29 |
KR20030083715A (en) | 2003-10-30 |
WO2002067049A1 (en) | 2002-08-29 |
US6454415B1 (en) | 2002-09-24 |
JP2004533632A (en) | 2004-11-04 |
DE60235880D1 (en) | 2010-05-20 |
EP1379916A1 (en) | 2004-01-14 |
HK1066871A1 (en) | 2005-04-01 |
CA2438724A1 (en) | 2002-08-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6454415B1 (en) | Interactive teleconferencing display system | |
AU2002306508A1 (en) | Method and apparatus for inhibiting projection of selected areas of a projected image | |
CA2438739C (en) | Interactive teleconferencing display system | |
US5940139A (en) | Background extraction in a video picture | |
US8045060B2 (en) | Asynchronous camera/projector system for video segmentation | |
US7690795B2 (en) | Projector/camera system | |
US7071897B2 (en) | Immersive augmentation for display systems | |
AU2002255556A1 (en) | Interactive teleconferencing display system | |
US9679369B2 (en) | Depth key compositing for video and holographic projection | |
US6616281B1 (en) | Visible-invisible background prompter | |
JP7387653B2 (en) | Presentation system and presentation method | |
KR102571677B1 (en) | AI Studio for Online Lectures | |
Vidal et al. | Chroma key without color restrictions based on asynchronous amplitude modulation of background illumination on retroreflective screens | |
CA2066163A1 (en) | Video conferencing system for courtroom and other applications | |
JPH07120837A (en) | Presentation device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: IMATTE, INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VLAHOS, PAUL;REEL/FRAME:012459/0877. Effective date: 20011031 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| FPAY | Fee payment | Year of fee payment: 4 |
| FEPP | Fee payment procedure | Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| FPAY | Fee payment | Year of fee payment: 8 |
| FPAY | Fee payment | Year of fee payment: 12 |