US20190320163A1 - Multi-screen perspective altering display system - Google Patents
Multi-screen perspective altering display system
- Publication number
- US20190320163A1 (U.S. application Ser. No. 16/452,901)
- Authority
- US
- United States
- Prior art keywords
- display
- viewer
- scene
- camera
- foreground
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/42206—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
- H04N21/42222—Additional components integrated in the remote control device, e.g. timer, speaker, sensors for detecting position, direction or movement of the remote control, microphone or battery charging device
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
- H04N13/279—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/368—Image reproducers using viewer tracking for two or more viewers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/373—Image reproducers using viewer tracking for tracking forward-backward translational head movements, i.e. longitudinal movements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42202—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/10—Mixing of images, i.e. displayed pixel being the result of an operation, e.g. adding, on the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/12—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels
- G09G2340/125—Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels wherein one of the images is motion video
-
- H04N2005/4428—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/006—Pseudo-stereoscopic systems, i.e. systems wherein a stereoscopic effect is obtained without sending different images to the viewer's eyes
Definitions
- This invention relates generally to display systems and, more particularly, to systems that alter perspective, synthesize depth perception and provide other capabilities, thereby enhancing the viewing experience.
- This invention resides in apparatus and methods providing a unique experience for the viewer of a display, particularly large wall-mounted panels.
- The perception of a displayed image is altered for viewers moving relative to the position of the display system screen, thereby imparting a sense of three-dimensional immersion in the scene being displayed.
- A perspective-altering display system comprises a display generator for generating a scene having foreground and background elements, and a display screen displaying the scene.
- A sensor detects the position of a viewer relative to the display screen, and a processor is operative to shift the relative position of the foreground and background elements in the displayed scene as a function of viewer position, such that the viewer's perspective of the scene changes as the viewer moves relative to the display screen.
- The foreground and background elements may be presented in the form of multiple superimposed graphics planes.
- At least one of the graphics planes may include prerecorded material or material received through a transmission medium or camera.
- A camera may be used to record the scene through panning at sequential angles, with a memory being used to store the images obtained at the sequential angles for later recall as a function of user movement.
- An interpolator may be used to "fill in" visual gaps in the scene.
- The display screen may be mounted on a wall having a backside, and a camera may be mounted on the backside of the wall which pans as a viewer moves, thereby imaging a scene as though the display were a virtual window through the wall.
- A camera with a zoom capability may be used for recording the scene.
- With the sensor operative to detect the viewer's distance from the display screen, the processor is further operative to zoom the camera in as the viewer moves closer to the display screen, and zoom out as the viewer moves away.
- The sensor may be operative to detect the viewer's distance from the display screen, with the processor being operative to increase the resolution of the scene as the viewer moves closer to the display screen and decrease the resolution as the viewer moves away.
- A camera with a tilt capability may be used for recording the scene, with the processor being further operative to tilt the camera in response to the viewer's up/down movement.
- The camera may have a field of view which includes a viewer of the display screen, enabling the display to function as a virtual mirror.
- A plurality of cameras may be used for capturing the scene, with the processor being further operative to construct a three-dimensional image for display on the screen.
- A user control may be provided enabling a viewer to select a specific camera or cameras to see how others would view the user from different perspectives.
- The sensor may include an infrared CCD (charge-coupled device) camera.
- The camera may have an image sensor, such that an image of a person in front of the display may be focused onto the image sensor as a spot or group of pixels, allowing the movements of the person to be tracked with no moving parts.
- The sensor may be a camera having at least a pan mount that tracks the movement of a viewer, with the processor being operative to shift the relative position of the foreground and background elements as a function of the tracking.
- The sensor may be a camera having at least a pan/tilt mount that tracks the movement of a viewer, with the processor being operative to shift the relative position of the foreground and background elements as a function of the tracking.
- The sensor may be a camera having an auto-focus capability.
- The sensor may be operative to sense a plurality of individuals in front of the display screen.
- The processor may be operative to shift the relative position of the foreground and background elements by favoring larger clusters of individuals over smaller clusters or single individuals. Alternatively, moving individuals may be favored over stationary individuals, or individuals actually looking at the display screen may be favored over those who are not.
- One or more transducers may be provided for producing sounds associated with the scene, with the processor operative to alter the reproduction of the sounds as a function of viewer movement.
- FIG. 1A shows a viewer having a field of view walking from the right towards the left relative to a display screen (shown as viewed from above);
- FIG. 1B is a simplified representation of what the person might see on the display screen according to the invention from the position shown in FIG. 1A ;
- FIG. 2A shows the person moving to the left, closer to the center of the display screen
- FIG. 2B shows how closer objects have shifted laterally to the greatest degree, as opposed to more distant objects, much as a train passenger would experience while looking out the window of the train as the train moves;
- FIG. 3A illustrates how the viewer has moved to the left-most portion of the display
- FIG. 3B shows how close objects have shifted to the extent that they now partially overlap with an object further away
- FIG. 4A depicts the image of a foreground object being gathered by a camera and recorded by a recorder
- FIG. 4B shows mid-range objects being recorded
- FIG. 4C shows distant or background objects being recorded
- FIG. 5 depicts an alternative technique for implementing perspective alteration according to the invention
- FIG. 6A shows how an infrared CCD (charge-coupled device) camera, preferably with a wide-angle lens, may be used as a sensor according to the invention
- FIG. 6B shows the use of a panning camera
- FIG. 6C depicts three persons generating a composite thermal field on an image sensor
- FIG. 7A illustrates a “virtual picture window” embodiment, wherein the movement of a viewer causes an outdoor camera to pan back/forth, thereby allowing the viewer to visualize the outdoor scene as if the display were a hole in the wall;
- FIG. 7B shows how the invention enables virtual windows on inside walls, which may be useful in homes and businesses such as restaurants, bars and nightclubs;
- FIG. 7C shows how the display can function as a “virtual mirror”
- FIG. 8 depicts how the virtual mirror embodiment of the invention may be used in bathrooms and dressing rooms.
- This invention employs a variety of techniques to provide a unique experience for the viewer of a display, particularly large wall-mounted panels.
- The perception of a displayed image is altered for viewers moving relative to the position of the display system screen, thereby imparting a sense of three-dimensional immersion in the scene being displayed.
- FIG. 1A shows a viewer 102 having a field of view 104 walking from the right towards the left relative to a display screen 106 .
- One or more sensors 108, which may use visible-light, infrared, ultrasonic, or other modalities described in further detail below, are used to track at least the lateral position of individual 102.
- FIG. 1B is a simplified representation of what the person 102 might see on the display screen 106 according to the invention from the position shown in FIG. 1A .
- Relatively close objects are shown at 115 , 116 .
- Less close objects are seen at 114 .
- Somewhat distant objects are shown at 112 , and distant objects are shown at 110 .
- Although four relative distances are mentioned, the invention is not limited in this regard and is applicable to more or fewer such relative distances.
- In FIG. 2A, the person 102 has moved to the left, closer to the center of the screen 106.
- The sensor(s) 108 have detected this movement and, in response, the perspective of the scene has been altered.
- As shown in FIG. 2B, closer objects have shifted laterally to the greatest degree, followed by objects 114 and 112 in order, much as a train passenger would experience while looking out the window of the train as the train moves.
- Object 110, being significantly distant, would shift little, if at all.
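The depth-dependent shift just described can be sketched as a simple parallax rule: an object's on-screen displacement is inversely proportional to its distance, so near objects sweep across the screen while distant object 110 barely moves. A minimal illustration (the `gain` factor and the sign convention are assumptions for illustration, not from the patent):

```python
def lateral_shift(viewer_x, depth, gain=1.0):
    """On-screen lateral displacement of an object at the given depth when the
    viewer stands at lateral offset viewer_x from the screen center.
    Smaller depth (nearer object) -> larger shift, mimicking train-window parallax."""
    return -gain * viewer_x / depth  # sign: the scene slides opposite the viewer

# A near object shifts ten times as far as one ten times more distant:
near = lateral_shift(1.5, depth=1.0)
far = lateral_shift(1.5, depth=10.0)
```

With a fixed viewer offset, the ratio of shifts is just the inverse ratio of the depths, which is exactly the train-window effect the figures depict.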
- This process continues in FIG. 3A, where the viewer has moved to the left-most portion of the display. Close object 116 has shifted to the extent that it now partially overlaps with an object further away, and object 115 has begun to move off the screen 106, as shown in FIG. 3B.
- The ways in which the invention makes this possible will now be described in further detail.
- The invention employs a technique similar to that utilized in animation films: multiple superimposed graphics planes.
- In FIG. 4A, the image of a foreground object 412 is gathered by camera 402 and recorded by recorder 410.
- A blue or other solid-color background 414 may be used for chroma-keying.
- In FIG. 4B, mid-range objects 420 are being recorded whereas, in FIG. 4C, distant or background objects are being recorded.
- Commercially available software packages, or customized software tuned to specific program content, can be utilized to derive the desired material from among multiple subjects representing different focal points and, once identified, can track the subjects as they change their position and even their orientation.
- Multiple cameras positioned to capture three-dimensional information may be utilized to derive a three-dimensional array for each frame of motion, thereby allowing the producer to select "slices" which can be captured as graphics-plane images for manipulation by the graphics processor of the instant invention.
- The camera(s) may record moving video images for the foreground graphics plane(s), the background graphics plane(s), or any combination thereof.
- For example, the background graphics plane may be based upon a still picture, while the foreground cameras record motion imagery.
- Position sensor(s) 108 detect the location of the viewer relative to the screen and reposition the foreground graphics plane(s) as the viewer moves, thereby conveying to the viewer the impression, for example, that he is looking through a window at an outdoor scene with, perhaps, a nearer image, such as a tree branch, that the viewer can see around by simply shifting his position relative to the display screen.
- The tree branch (or other object) may also be moving, as it would in a breeze, for example.
- The video source for these graphics planes may include prerecorded material supplied by playback from any recording device.
- Other sources include broadcast, satellite, cable, or other programming sources, material delivered over broadband or other telecommunication links, privately recorded material, live video from cameras (including security and monitoring cameras), computer-generated graphics and the like, or any other source of image material.
- Graphics planes displaying text information may be superimposed over, or under, other graphics planes.
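Because the graphics planes are kept separate, the processor can recomposite them back-to-front for each sensed viewer position, shifting each plane by an amount that shrinks with its depth. A toy character-grid sketch of that compositing order (the plane contents and the rounding rule are illustrative assumptions):

```python
def composite(planes, viewer_x, width=12):
    """Render superimposed graphics planes into one row of cells.
    planes: list of (depth, {x: glyph}); drawn far-to-near so nearer planes
    occlude farther ones, each shifted inversely with its depth."""
    row = [' '] * width
    for depth, sprites in sorted(planes, key=lambda p: -p[0]):  # farthest first
        shift = round(viewer_x / depth)  # near planes shift most
        for x, glyph in sprites.items():
            sx = x + shift
            if 0 <= sx < width:
                row[sx] = glyph  # nearer plane overwrites (occludes) farther one
    return ''.join(row)

planes = [(8.0, {5: 'M'}),  # distant "mountain" plane
          (1.0, {2: 'T'})]  # foreground "tree" plane
head_on = composite(planes, viewer_x=0)   # tree appears left of the mountain
stepped = composite(planes, viewer_x=3)   # tree slides over, occluding the mountain
```

The occlusion in `stepped` corresponds to FIG. 3B, where close object 116 overlaps a more distant one after the viewer moves.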
- FIG. 5 depicts an alternative technique for implementing perspective alteration according to the invention.
- A camera 502 having at least a pan mount 506 records an actual scene 512 at incremental angles suggested by arrow 508.
- The view at each angle is recorded by unit 510 for later replay.
- The number of increments depends upon the desired resolution, room dimensions, and other factors. For example, at a very high resolution, single-degree increments may be recorded through a full 180 degrees. At a lesser resolution, single-degree increments may be recorded across a smaller angle of view, or larger increments may be used across the angle of view, with or without interpolation to fill in any 'gaps' during replay.
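Replay then maps the sensed viewer position onto the recorded angles; when the requested angle falls between two recorded increments, a blend weight supports the gap-filling interpolation mentioned above. A sketch (the function name and the linear-blend choice are assumptions, not from the patent):

```python
def replay_weights(step_deg, requested_deg):
    """For a scene recorded at fixed pan increments of step_deg, return the two
    bracketing recorded angles and the linear weight for blending toward the
    upper one, so 'gaps' between recorded views can be interpolated on replay."""
    lo = (requested_deg // step_deg) * step_deg  # nearest recorded angle below
    hi = lo + step_deg                           # nearest recorded angle above
    weight = (requested_deg - lo) / step_deg     # 0.0 -> use lo, 1.0 -> use hi
    return lo, hi, weight
```

For example, with 5-degree increments a request for 12 degrees blends the frames recorded at 10 and 15 degrees with weight 0.4 toward the latter.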
- A variety of novel video applications are enabled by virtue of the invention, such as a simulated window with a view that would not be possible from an actual window in that position (e.g., viewing a sunset from an eastern exposure), or a view that is not possible at all (e.g., a winter scene during the summer, or a scene from a different country).
- FIG. 7A illustrates a "virtual picture window" embodiment, wherein the movement of viewer 102, detected by sensors 108, causes outdoor camera 704 on mount 706 to pan back and forth, thereby allowing the viewer 102 to "see" the outdoor scene 720 "through" the wall 702 on display 106 via graphics processing system 710.
- The camera 704 may zoom as the viewer comes closer, pan as the viewer moves laterally, and tilt if the viewer moves up or down, as might be the case on a staircase, for example.
- The display system can also provide a three-dimensional effect by applying modifications to an image as the viewer changes his position.
- An image of a painting, for example, might capture the artist's intentions when viewed at a distance, while a close-up examination of the video display could reveal the details of the brush strokes as the viewer changes his position relative to the screen.
- The invention is not limited to virtual windows through outside walls. As shown in FIG. 7B, the invention enables virtual windows on inside walls, which may be useful in homes and businesses such as restaurants, bars and nightclubs. Similar to the embodiment depicted in FIG. 7A, as user 102 moves relative to screen 106, camera 704 pans (or tilts or zooms), enabling the user 102 to see people 730 or other objects through the wall 702.
- The display system can function as a "virtual mirror." Such an embodiment is depicted in FIG. 7C, wherein graphics processing system 710 directs the camera 704 to pan at an angle "A" substantially equal to angle "B" formed by the location of the user and line 722 perpendicular to the plane of display 106. This allows the viewer to see a synthesized reflection, depicted by broken line 720, typically including the subject him/herself.
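The equal-angle geometry of FIG. 7C reduces to making the camera's pan angle A match the viewer's angle B about the display normal (line 722). A small sketch of that relation (the coordinate convention is an assumption for illustration):

```python
import math

def mirror_pan_angle(viewer_x, viewer_z):
    """Angle B (degrees) between the display normal and a viewer at lateral
    offset viewer_x, standing distance viewer_z from the screen; the camera is
    panned to an angle A of substantially equal magnitude so the viewer sees a
    synthesized reflection."""
    return math.degrees(math.atan2(viewer_x, viewer_z))
```

A viewer standing as far to the side as they are from the screen (equal x and z) yields a 45-degree pan; a viewer on the normal yields zero.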
- The mirror embodiment of the invention may be used in bathrooms and dressing rooms, as illustrated in FIG. 8.
- Multiple cameras 802 , 804 , 806 , 808 disposed around the display screen 106 allow the processor unit 812 to construct a three-dimensional image for display on the screen, and the proximity sensing devices may be utilized to create an on-screen image which is representative of what a person should see as he re-positions himself, in three dimensions, about the display screen. It is a simple matter to provide a left-to-right reversed-image, in keeping with the mirror aspect of the invention.
- A user control 810 allows the user to select a specific camera or cameras to see how others would view them from different perspectives.
- The camera(s) capturing the image of the viewer may be placed behind a semi-transparent screen, allowing better visualization, such as eye-level contact, to be maintained.
- One or more video cameras may be provided on a bendable tether—or wireless hookup—enabling a user to view hard-to-reach places such as ears, nose, mouth etc.
- Variable degrees of magnification may be provided, based upon detected distance from a surface being viewed, for example.
- The preferred embodiment uses an infrared CCD (charge-coupled device) camera, preferably with a wide-angle lens 206, as shown in FIG. 6A.
- The user 202 generates a thermal image 204, which is focused onto camera array 210 as a spot or group of pixels, allowing the system 220 to know where a person is with no moving parts.
- Processor 106 can then cause the perspective, depth perception, or other characteristics to change accordingly. If no tilt or zoom functions are provided, a linear sensor may be substituted for a 2D sensor.
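Locating the viewer from the thermal spot on array 210 amounts to finding the intensity-weighted centroid of the sensor pixels, with no moving parts. A sketch (the grid values below are made up for illustration):

```python
def spot_centroid(frame):
    """Intensity-weighted centroid (row, col) of the thermal spot that a warm
    body focuses onto the IR sensor array; None if the array sees nothing."""
    total = weighted_row = weighted_col = 0.0
    for r, row in enumerate(frame):
        for c, value in enumerate(row):
            total += value
            weighted_row += r * value
            weighted_col += c * value
    if total == 0:
        return None
    return weighted_row / total, weighted_col / total

# A warm spot slightly right of center on a tiny 3x3 array:
frame = [[0, 0, 0],
         [0, 4, 4],
         [0, 0, 0]]
```

For a linear (1-D) sensor, as suggested when no tilt or zoom is needed, the same computation runs over a single row.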
- Alternatively, a panning camera may be used, as shown in FIG. 6B.
- The camera 230 is trained on a subject and, as that subject moves, positional information is sensed by pan mount 234 and communicated through electronics unit 240 to alter screen 106.
- The camera 230 may use tilt and/or auto-focus to determine other positional aspects of the viewer.
- The invention may handle multiple viewers in different ways, including (1) favoring clusters of potential viewers over single viewers; (2) favoring moving viewers over stationary viewers; and (3) favoring viewers actually looking at the screen over those looking away.
- One advantage of the sensor system of FIG. 6A is that clustering is naturally accommodated.
- Three persons 242, 244, 246 generate a composite thermal field 240, which produces a relatively large imprint 251 on sensor 210.
- The narrower thermal field 250 of single person 248 results in a smaller spot 241 on array 210 such that, in this embodiment of the invention, the perspective of persons in the group would be favored.
- FIG. 6A also naturally addresses the favoring of moving viewers over stationary viewers.
- If the group consisting of persons 242, 244, 246 were stationary, it would be a straightforward processing task to detect that an individual is moving. In this embodiment of the invention, the moving individual 248 would be preferred over the stationary group.
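The three favoring rules can be combined as a simple scoring function over the sensed viewers; the particular weights below are illustrative assumptions, not values from the patent:

```python
def pick_controlling_viewer(viewers):
    """Select which sensed viewer (or cluster) drives the perspective.
    Each entry: {'cluster_size': int, 'moving': bool, 'looking': bool}.
    Here, looking at the screen outweighs motion, which outweighs cluster size."""
    def score(v):
        return 4 * v['looking'] + 2 * v['moving'] + v['cluster_size']
    return max(viewers, key=score)

viewers = [
    {'name': 'group', 'cluster_size': 3, 'moving': False, 'looking': False},
    {'name': 'walker', 'cluster_size': 1, 'moving': True, 'looking': True},
]
chosen = pick_controlling_viewer(viewers)  # the moving, attentive walker wins
```

A deployment could reorder the weights to favor clusters first, matching whichever of the three rules the installation prefers.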
- Sensing in the visible region of the spectrum may be used instead of, or in concert with, IR sensing.
- Visible-light sensing may allow a single camera (or cameras) to detect both the image and the position of the viewer (as in the mirror embodiments).
- Recognition techniques may be used to determine whether a particular person is actually looking at the display, in which case that person may be favored over individuals looking away. If multiple persons are looking at the display, other techniques such as clustering and motion favoring may also be used.
- Image recognition and other operations require additional processing power; however, that is easily accommodated with modern processors.
- Specialized graphics processing provides the management of the graphics planes and any audio material, while processing rules (for example, "take image modification instructions from the position of the closest viewer only") ensure that the system will not be misdirected by movement of viewers on the opposite side of the room.
- An overall system for management of the displays may be utilized, thus providing an integrated, coordinated system of imaging displays. For example, an overall image larger than the entire display system may be utilized, or alternative schemes in which image planes or other data "flow" from one display screen to an adjacent display screen.
- Audio may be included, representing material that may or may not be related to the video images presented on the screen.
- The system can serve the function of an enhanced video display terminal, a television viewing screen, a security monitoring system, a video entertainment system, or any other system for which display of graphics material is of value to the viewer.
- The sound reproduction may be altered as a function of user position, with or without a change in visual perspective. For example, as an individual walks past the display screen, the sound of elements in the scene (e.g., birds, vehicles, etc.) may be varied whether or not the individual is looking at the screen. If the user moves toward the screen, sounds may be enhanced or attenuated; for example, if a viewer moves toward a frog or a bird in the scene, the sounds of that creature may be enhanced, or diminished as the user moves away.
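This distance-dependent enhancement or attenuation can be modeled as a clamped inverse-distance gain applied per scene sound source (e.g., the frog or bird); the rolloff law and reference distance below are assumptions for illustration:

```python
def source_gain(viewer_pos, source_pos, ref_dist=1.0):
    """Volume multiplier for one scene sound source: full volume inside the
    reference distance, inverse-distance rolloff beyond it, so a creature's
    call grows louder as the viewer approaches its on-screen position."""
    dx = viewer_pos[0] - source_pos[0]
    dy = viewer_pos[1] - source_pos[1]
    distance = max(ref_dist, (dx * dx + dy * dy) ** 0.5)
    return ref_dist / distance

close = source_gain((0.0, 0.0), (0.0, 0.5))  # inside ref distance: full volume
far = source_gain((0.0, 0.0), (0.0, 2.0))    # twice ref distance: half volume
```

The same gain can be applied per transducer to pan sounds across the screen as the viewer walks past.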
Abstract
The perception of a displayed image is altered for viewers moving relative to a display screen, thereby imparting a sense of three-dimensional immersion in the scene being displayed. A display generator generates a scene having foreground and background elements, which a display screen displays. A sensor detects the position of a viewer relative to the display screen, and a processor is operative to shift the relative position of the foreground and background elements in the displayed scene as a function of viewer position, such that the viewer's perspective of the scene changes as the viewer moves relative to the display screen. The foreground and background elements may be presented in the form of multiple superimposed graphics planes, and/or a camera may be used to record the scene through panning at sequential angles. The system may be used to implement virtual windows, virtual mirrors and other effects.
Description
- This application is a Divisional of U.S. patent application Ser. No. 16/046,065, which claims priority to and the benefit of U.S. patent application Ser. No. 12/197,635, filed Aug. 25, 2008, which claims priority to and the benefit of U.S. Provisional Patent Application Ser. No. 60/957,845, filed Aug. 24, 2007, the entire content of all applications being incorporated herein by reference.
- This invention relates generally to display systems and, more particularly, to systems that alter perspective, synthesize depth perception and provide other capabilities, thereby enhancing the viewing experience.
- Flat panel displays are growing in size and falling in price. At this time, non-projection, true-HD (i.e., 1080p) 50″ displays are available in the $1,000 to $2,000 price range, and new models are introduced on a regular basis. When edge-lit liquid-crystal display (LCD) panels are replaced with back-lit white light-emitting diodes, another leap in technology will occur. When organic LED panels become viable, flexible, affordable wall-sized displays, including wrap-around configurations, should be possible.
- As large displays proliferate, users are finding uses for them beyond just "watching TV." In some cases, for example, users are displaying pictures or rotating sequences of pictures on these displays, thereby creating, in effect, large picture frames. However, existing systems for video display are restricted either in their utility or in their realism, due to the inherent limitations of a two-dimensional presentation unit. Of these, the most important impact on the perception of a scene may be its lack of visual perspective as the viewer changes his position relative to the display unit.
- The use of 3-D glasses or other paraphernalia is oppressive in those circumstances where a casual effect is desired, or where no user intervention is to be required. Thus, prior-art systems relying on still or even video images, or various photographic or video projection techniques, cannot achieve the level of perception of reality that is desired for many residential or commercial display applications.
- This invention resides in apparatus and methods providing a unique experience for the viewer of a display, particularly large wall-mounted panels. In the preferred embodiments, the perception of a displayed image is altered for viewers moving relative to the position of the display system screen, thereby imparting a sense of three-dimensional immersion in the scene being displayed.
- A perspective-altering display system according to the invention comprises a display generator for generating a scene having foreground and background elements, and a display screen displaying the scene. A sensor detects the position of a viewer relative to the display screen, and a processor is operative to shift the relative position of the foreground and background elements in the displayed scene as a function of viewer position, such that the viewer's perspective of the scene changes as the viewer moves relative to the display screen.
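By way of illustration, the dependence of a plane's displacement on viewer position can be sketched as below; the function name, units, and scaling constant are assumptions for illustration and are not taken from the specification:

```python
def plane_offset(viewer_x, plane_depth, gain=100.0):
    """Horizontal shift (pixels) applied to one graphics plane.

    viewer_x    -- viewer's lateral offset from screen center (m)
    plane_depth -- simulated distance of the plane behind the screen (m)
    gain        -- arbitrary meters-to-pixels scale factor

    The shift is inversely proportional to depth, so near planes
    sweep across the screen faster than distant ones (motion parallax).
    """
    return -gain * viewer_x / plane_depth

near = plane_offset(0.5, 1.0)   # nearby object, e.g. a tree branch
far = plane_offset(0.5, 10.0)   # distant object, e.g. hills
```

Because the offset varies inversely with simulated depth, the foreground shifts to a greater extent than the background for the same viewer movement, which is the behavior recited above.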
- The foreground and background elements may be presented in the form of multiple superimposed graphics planes. At least one of the graphics planes may include prerecorded material or material received through a transmission medium or camera. A camera may be used to record the scene through panning at sequential angles, with a memory being used to store the images obtained at the sequential angles for later recall as a function of user movement. An interpolator may be used to “fill in” visual gaps in the scene.
- The display screen may be mounted on a wall having a backside, and a camera may be mounted on the backside of the wall which pans as a viewer moves, thereby imaging a scene representative of the display being a virtual window through the wall. A camera with a zoom capability may be used for recording the scene. With the sensor being operative to detect the viewer's distance from the display screen, the processor is further operative to zoom in the camera as the viewer moves closer to the display screen, and zoom out the camera as the viewer moves away from the display screen.
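A minimal sketch of the distance-to-zoom mapping just described; the reference distance and zoom limits are illustrative assumptions:

```python
def zoom_for_distance(viewer_dist, ref_dist=2.0,
                      min_zoom=1.0, max_zoom=4.0):
    """Camera zoom factor derived from viewer distance (meters).

    Closer than ref_dist zooms in proportionally; moving away
    relaxes toward 1x. Clamped to the lens's assumed zoom range.
    """
    zoom = ref_dist / max(viewer_dist, 1e-6)
    return max(min_zoom, min(max_zoom, zoom))
```

A monotone mapping of this kind guarantees the recited behavior: zoom in as the viewer approaches, zoom out as the viewer retreats.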
- The sensor may be operative to detect the viewer's distance from the display screen, with the processor being operative to increase the resolution of the scene as the viewer moves closer to the display screen and decrease the resolution of the scene as the viewer moves away from the display screen. A camera with a tilt capability may be used for recording the scene, with the processor being further operative to tilt the camera in response to the viewer's up/down movement. The camera may have a field of view which includes a viewer of the display screen, enabling the display to function as a virtual mirror.
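The distance-dependent resolution behavior could be realized with a simple threshold scheme such as the following sketch, where the levels and distance thresholds are assumed values:

```python
def render_scale(viewer_dist, levels=(0.25, 0.5, 1.0),
                 thresholds=(3.0, 1.5)):
    """Choose a rendering-resolution scale from viewer distance.

    Distant viewers get a cheap quarter-resolution render; as the
    viewer approaches, full-resolution detail (e.g., brush strokes)
    is produced. Thresholds (meters) and levels are illustrative.
    """
    if viewer_dist > thresholds[0]:
        return levels[0]
    if viewer_dist > thresholds[1]:
        return levels[1]
    return levels[2]
```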
- A plurality of cameras may be used for capturing the scene, with the processor being further operative to construct a three-dimensional image for display on the screen. A user control may be provided enabling a viewer to select a specific camera or cameras to see how others would view the user from different perspectives.
- The sensor may include an infrared CCD (charge-coupled device) camera. Regardless, the camera may have an image sensor, such that an image of a person in front of the display may be focused onto the image sensor as a spot or group of pixels, allowing the movements of the person to be tracked with no moving parts.
- The sensor may be a camera having at least a pan mount that tracks the movement of a viewer, with the processor being operative to shift the relative position of the foreground and background elements as a function of the tracking. The sensor may be a camera having at least a pan/tilt mount that tracks the movement of a viewer, with the processor being operative to shift the relative position of the foreground and background elements as a function of the tracking. The sensor may be a camera having an auto-focus capability.
- The sensor may be operative to sense a plurality of individuals in front of the display screen. The processor may be operative to shift the relative position of the foreground and background elements by favoring larger clusters of individuals as opposed to smaller clusters or single individuals. Alternatively, moving individuals may be favored over stationary individuals, or individuals actually looking at the display screen may be favored over those who are not.
- One or more transducers may be provided for producing sounds associated with the scene, with the processor operative to alter the reproduction of the sounds as a function of viewer movement.
-
FIG. 1A shows a viewer having a field of view walking from the right towards the left relative to a display screen (shown as viewed from above); -
FIG. 1B is a simplified representation of what the person might see on the display screen according to the invention from the position shown inFIG. 1A ; -
FIG. 2A shows the person moving to the left, closer to the center of the display screen; -
FIG. 2B shows how closer objects have shifted laterally to the greatest degree, as opposed to more distant objects, much as a train passenger would experience while looking out the window of the train as the train moves; -
FIG. 3A illustrates how the viewer has moved to the left-most portion of the display; -
FIG. 3B shows how close objects have shifted to the extent that they now partially overlap with an object further away; -
FIG. 4A depicts the image of a foreground object being gathered by a camera and recorded by a recorder; -
FIG. 4B shows mid-range objects being recorded; -
FIG. 4C shows distant or background objects are being recorded; -
FIG. 5 depicts an alternative technique for implementing perspective alteration according to the invention; -
FIG. 6A shows how an infrared CCD (charge-coupled device) camera, preferably with a wide-angle lens, may be used as a sensor according to the invention; -
FIG. 6B shows the use of a panning camera; -
FIG. 6C depicts three persons generating a composite thermal field on an image sensor; -
FIG. 7A illustrates a “virtual picture window” embodiment, wherein the movement of a viewer causes an outdoor camera to pan back/forth, thereby allowing the viewer to visualize the outdoor scene as if the display were a hole in the wall; -
FIG. 7B shows how the invention enables virtual windows on inside walls, which may be useful in homes and businesses such as restaurants, bars and nightclubs; -
FIG. 7C shows how the display can function as a “virtual mirror”; and -
FIG. 8 depicts how the virtual mirror embodiment of the invention may be used in bathrooms and dressing rooms. - This invention employs a variety of techniques to provide a unique experience for the viewer of a display, particularly large wall-mounted panels. In the preferred embodiments, the perception of a displayed image is altered for viewers moving relative to the position of the display system screen, thereby imparting a sense of three-dimensional immersion in the scene being displayed.
-
FIG. 1A shows a viewer 102 having a field of view 104 walking from the right towards the left relative to a display screen 106. One or more sensors 108, which may use visible-light, infrared, ultrasonic, or other modalities described in further detail below, are used to track at least the lateral position of individual 102. -
FIG. 1B is a simplified representation of what the person 102 might see on the display screen 106 according to the invention from the position shown in FIG. 1A. Relatively close objects are shown at 115, 116. Less close objects are seen at 114. Somewhat distant objects are shown at 112, and distant objects are shown at 110. Although four relative distances are mentioned, the invention is not limited in this regard, and is applicable to more or fewer such relative distances.
- In FIG. 2A, the person 102 has moved to the left, closer to the center of the screen 106. The sensor(s) 108 have detected this movement and, in response, the perspective of the scene has been altered. As shown in FIG. 2B, closer objects 115, 116 have shifted laterally to the greatest degree, followed by objects 114 and 112. Object 110, being significantly distant, would shift little, if at all.
- This process continues in FIG. 3A, where the viewer has moved to the left-most portion of the display. Close object 116 has shifted to the extent that it now partially overlaps with an object further away, and object 115 has begun to move off the screen 106, as shown in FIG. 3B. The ways in which the invention makes this possible will now be described in further detail.
- In the preferred embodiment, the invention employs a technique similar to that utilized in animation films: multiple superimposed graphics planes. In FIG. 4A, the image of a foreground object 412 is gathered by camera 402 and recorded by recorder 410. Depending upon the circumstances, a blue or other solid-color background 414 may be used for chroma-keying.
- In FIG. 4B, mid-range objects 420 are being recorded whereas, in FIG. 4C, distant or background objects are being recorded. Commercially available software packages, or customized software tuned to specific program content, can be utilized to derive the desired material from among multiple subjects representing different focal points and, once identified, can track the subjects as they change their position and even their orientation. In an alternative approach, multiple cameras, positioned to capture three-dimensional information, may be utilized to derive a three-dimensional array for each frame of motion, thereby allowing the producer to select “slices” which can be captured as graphics plane images for manipulation by the graphics processor of the instant invention.
- The camera(s) may record moving video images for the foreground graphics plane(s), the background graphics plane(s), or any combination thereof. For example, the background graphics plane may be based upon a still picture, while the foreground cameras record motion imagery. In this embodiment, position sensor(s) 108 detect the location of the viewer relative to the screen and reposition the foreground graphics plane(s) as the viewer moves, thereby conveying to the viewer the impression, for example, that he is looking through a window at an outdoor scene with, perhaps, a nearer image, such as a tree branch, that the viewer can see around by simply shifting his position relative to the display screen. The tree branch (or other object) may also be moving, as it would in a breeze, for example.
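The occlusion behavior of superimposed graphics planes (compare FIGS. 3A and 3B) can be illustrated with a toy one-row compositor; the glyphs, dimensions, and parallax gain below are all invented for the example:

```python
# Toy one-row compositor for superimposed graphics planes.
# Each plane is (depth_m, start_col, glyph, width); the list is
# ordered back-to-front so nearer planes overwrite farther ones.

def composite(planes, viewer_x, width=12, gain=4.0):
    out = ["."] * width
    for depth, start, glyph, span in planes:          # far -> near
        shift = int(round(-gain * viewer_x / depth))  # parallax offset
        for col in range(start + shift, start + shift + span):
            if 0 <= col < width:
                out[col] = glyph                      # near occludes far
    return "".join(out)

planes = [
    (8.0, 2, "B", 3),  # distant background object
    (1.0, 6, "F", 2),  # nearby foreground object
]
```

With the viewer centered the objects render apart; moving the viewer half a meter slides the near plane far enough that it partially occludes the distant one, as described for FIG. 3B.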
- The video source for these graphics planes may include prerecorded material supplied by playback from any recording devices. Other sources include broadcast, satellite, cable, or other programming sources, material delivered over broadband or other telecommunication links, privately recorded material, live video from cameras (including security and monitoring cameras), computer-generated graphics and the like, or any other source of image material. Graphics planes displaying text information may be superimposed over, or under, other graphics planes.
-
FIG. 5 depicts an alternative technique for implementing perspective alteration according to the invention. Here a camera 502 having at least a pan mount 506 records an actual scene 512 at incremental angles suggested by arrow 508. The view at each angle is recorded by unit 510 for later replay. The number of increments depends upon the desired resolution, room dimensions, and other factors. For example, at a very high resolution, single-degree increments may be recorded through a full 180 degrees. At a lesser resolution, single-degree increments may be recorded across a smaller angle of view, or larger-degree increments may be used across the angle of view, with or without interpolation to fill in any ‘gaps’ during replay.
- Thus, a variety of novel video applications are enabled by virtue of the invention, such as a simulated window which has a view that is not possible from that position if there were an actual window in that position (i.e., viewing a sunset from an eastern exposure), or a view which is not possible at all (i.e., a winter scene during the summer, or a scene from a different country). These represent a clear improvement over the typical “light box” with photographic transparency or other attempts to simulate a window view by conventional means.
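One possible indexing scheme for replaying views recorded at sequential pan angles, with linear interpolation available to fill the gaps between adjacent recordings, is sketched below; the function name and parameters are illustrative assumptions:

```python
def frame_for_viewer(viewer_x, room_half_width, frames):
    """Map lateral viewer position to a pair of stored pan angles.

    frames is a list of images recorded at evenly spaced pan angles
    (leftmost view first). Returns (lo, hi, blend): the indices of
    the two neighboring recordings plus the interpolation weight
    toward the upper one, used to fill in any gaps during replay.
    """
    # Normalize viewer position to [0, 1] across the tracked range.
    t = (viewer_x + room_half_width) / (2.0 * room_half_width)
    t = min(max(t, 0.0), 1.0)
    pos = t * (len(frames) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(frames) - 1)
    return lo, hi, pos - lo
```

A blend of 0.0 means a recorded view is shown directly; intermediate values cross-fade (or warp) between the two nearest recordings.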
-
FIG. 7A illustrates a “virtual picture window” embodiment, wherein the movement of viewer 102, detected by sensors 108, causes outdoor camera 704 on mount 706 to pan back and forth, thereby allowing the viewer 102 to “see” the outdoor scene 720 “through” the wall 702 on display 106 using graphics processing system 710. Depending upon the movement of the viewer relative to the screen 106, in this and other applicable embodiments the camera 704 may zoom as the viewer moves toward or away from the screen, pan as the viewer moves laterally, and tilt if the viewer moves up or down, as might be the case on a staircase, for example.
- Through the use of distance detection, the display system can provide a three-dimensional effect by applying modifications to an image as the viewer changes his position. Thus, for example, an image of a painting might capture the artist's intentions when viewed at a distance, while a close-up examination of the video display could reveal the details of the brush strokes as the viewer changes his position relative to the screen.
-
- The invention is not limited to virtual windows through outside walls. As shown in FIG. 7B, the invention enables virtual windows on inside walls, which may be useful in homes and businesses such as restaurants, bars and nightclubs. Similar to the embodiment depicted in FIG. 7A, as user 102 moves relative to screen 106, camera 704 pans (or tilts or zooms), enabling the user 102 to see people 730 or other objects through the wall 702.
- In still another aspect, the display system can function as a “virtual mirror.” Such an embodiment is depicted in FIG. 7C, wherein graphics processing system 710 directs the camera 704 to at least pan at an angle “A” substantially equal to angle “B” formed by the location of the user and line 722 perpendicular to the plane of display 106. This allows the viewer to see a synthesized reflection, depicted by broken line 720, typically including the subject him/herself.
- The mirror embodiment of the invention may be used in bathrooms and dressing rooms, as illustrated in FIG. 8. Multiple cameras about the display screen 106 allow the processor unit 812 to construct a three-dimensional image for display on the screen, and the proximity sensing devices may be utilized to create an on-screen image which is representative of what a person should see as he re-positions himself, in three dimensions, about the display screen. It is a simple matter to provide a left-to-right reversed image, in keeping with the mirror aspect of the invention. A user control 810 allows the user to select a specific camera or cameras to see how others would view them from different perspectives.
- In the dressing room/bathroom embodiments of the invention, the camera(s) capturing the image of the viewer may be placed behind a semi-transparent screen, allowing better visualization, such as eye-level contact, to be maintained. One or more video cameras may be provided on a bendable tether (or wireless hookup), enabling a user to view hard-to-reach places such as the ears, nose and mouth. Variable degrees of magnification may be provided, based upon detected distance from a surface being viewed, for example.
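The pan-angle geometry of the virtual mirror (pan angle “A” tracking the viewer's angle “B” off the display normal) reduces to the viewer's bearing when the camera is assumed to sit at the center of the display; a sketch under that simplifying assumption:

```python
import math

def mirror_pan_angle(viewer_x, viewer_z):
    """Camera pan angle (degrees) for the virtual-mirror geometry.

    viewer_x -- viewer's lateral offset from the display center (m)
    viewer_z -- viewer's distance out from the display plane (m)

    With the camera at the display center, matching angle "A" to
    the viewer's angle "B" off the display normal is simply the
    viewer's bearing. Camera placement here is an assumption.
    """
    return math.degrees(math.atan2(viewer_x, viewer_z))
```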
-
- In terms of position sensing, the preferred embodiment uses an infrared CCD (charge-coupled device) camera, preferably with a wide-angle lens 206, shown in FIG. 6A. The user 202 generates a thermal image 204, which is focused onto camera array 210 as a spot or group of pixels, allowing the system 220 to know where a person is with no moving parts. The processor can then cause the perspective, depth perception, or other characteristics to change accordingly. If no tilt or zoom functions are provided, a linear sensor may be substituted for a 2D sensor.
- As an alternative to a fixed camera with a sufficiently wide-angle lens, a panning camera may be used, as shown in FIG. 6B. Here the camera 230 is trained on a subject and, as that subject moves, positional information is sensed by pan mount 234 and communicated to alter screen 106 through electronics unit 240. Again, if tilt/zoom functions are provided, the camera 230 may use tilt and/or auto-focus to determine other positional aspects of the viewer.
- The invention may handle multiple viewers in different ways. These solutions include (1) favoring clusters of potential viewers over singular viewers; (2) favoring moving viewers over stationary viewers; and (3) favoring viewers actually looking at the screen over those looking away. One advantage of the sensor system of FIG. 6A is that clustering is naturally accommodated. In FIG. 6C, three persons generate a composite thermal field 240, which produces a relatively large imprint 251 on sensor 210. In contrast, the narrower thermal field 250 of single person 248 results in a smaller spot 241 on array 210 such that, in this embodiment of the invention, the perspective of the persons in the group would be favored.
- The approach of FIG. 6A also naturally addresses the favoring of moving viewers over stationary viewers. Referring again to FIG. 6C, if the group of persons were to remain stationary while the single person moved, the perspective could instead be made to follow the moving viewer.
- In all embodiments of the invention, sensing in the visible region of the spectrum may be used instead of, or in concert with, IR sensing. This presents advantages and disadvantages. In terms of advantages, visible-light sensing may allow a single camera (or cameras) to detect both the image and the position of the viewer (as in the mirror embodiments). Another advantage is that recognition techniques may be used to determine whether a particular person is actually looking at the display, in which case that person may be favored over individuals looking away. If multiple persons are looking at the display, other techniques such as clustering and motion favoring may also be used. Perhaps the only disadvantage is that image recognition and other operations require additional processing power; however, that is easily accommodated with modern processors.
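A rough illustration of how a thermal image on the sensor array can yield a position estimate with no moving parts, while naturally weighting larger clusters more heavily; the sensor readings below are invented:

```python
def weighted_centroid(pixels):
    """Position estimate from one line of IR sensor intensities.

    Returns the intensity-weighted centroid index, or None if the
    field is cold. A wide warm cluster (several people) contributes
    more total weight than a single narrow spot, so the estimate
    naturally favors larger groups, as described above.
    """
    total = sum(pixels)
    if total == 0:
        return None
    return sum(i * v for i, v in enumerate(pixels)) / total

# Three adjacent warm pixels (a cluster) vs. one lone warm pixel.
reading = [0, 0, 3, 3, 3, 0, 0, 0, 1, 0]
```

Here the centroid lands inside the three-person cluster rather than at the single distant spot; dropping the second dimension corresponds to the linear-sensor variant mentioned earlier.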
-
- In each case, specialized graphics processing provides the management of the graphics planes and any audio material, while processing rules (for example, “take image modification instructions from the position of the closest viewer only”) ensure that the system will not be misdirected by movement of viewers on the opposite side of the room. Where multiple display systems are in use (as for simulating an array of “structural” windows), an overall system for management of the displays is utilized, thus providing an integrated, coordinated system of imaging displays. For example, an overall image larger than the entire display system may be utilized, or alternative schemes employed in which image planes or other data “flow” from one display screen to an adjacent display screen.
-
- While described herein with reference to flat-panel displays (LED, LCD, plasma, etc.), the principles disclosed may be applied with suitable results to any number of display technologies currently available or in development (CRT-type, front or rear projection, electroluminescence, OLEDs, etc.). Furthermore, adjustments may be applied to the image data to correct for any geometric distortions introduced due to the position of the camera(s) or display unit(s). In addition, alternative embodiments may utilize additional graphics planes to enhance the effect of the display.
- In all cases, audio may be included, representing material that may or may not be related to the video images presented on the screen. Thus, the system can serve the function of an enhanced video display terminal, a television viewing screen, a security monitoring system, a video entertainment system, or any other system for which display of graphics material is of value to the viewer.
-
- With further regard to audio, if the system is provided with stereo or surround sound, the sound reproduction may be altered as a function of user position, with or without a change in visual perspective. For example, as an individual walks past the display screen, the sound of elements in the scene (i.e., birds, vehicles, etc.) may be varied whether or not the individual is looking at the screen. Sounds may likewise be enhanced or attenuated as the user moves toward or away from the screen. For example, if a viewer moves toward a frog or a bird in the scene, the sounds of that creature may be enhanced, then diminished as the user moves away.
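The proximity-dependent audio behavior could be approximated with an inverse-distance gain, as in this sketch; the falloff law and reference distance are assumptions, not part of the disclosure:

```python
def source_gain(viewer_pos, source_pos, ref_dist=1.0):
    """Volume scale for one sound source in the displayed scene.

    viewer_pos, source_pos -- (x, y) positions in meters.

    Inverse-distance falloff clamped to unity: walking toward a
    frog or bird in the scene raises its sound, and walking away
    lowers it again.
    """
    dx = viewer_pos[0] - source_pos[0]
    dy = viewer_pos[1] - source_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    return min(1.0, ref_dist / max(dist, 1e-6))
```

A per-source gain of this kind can feed a stereo or surround mix, so that each scene element's level follows the viewer independently.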
Claims (20)
1. A perspective-altering display system, comprising:
a plurality of display screens displaying different portions of the same scene;
a sensor for detecting the movement of a viewer relative to the display screens;
at least one memory for separately storing foreground and background imagery associated with the scene;
at least one processor interconnected to the memory and the sensor, the processor being operative to receive a signal from the sensor and shift the relative position of the foreground and background imagery in the displayed scene as a function of viewer movement; and wherein the foreground imagery is shifted by way of a translation to a greater extent than the background imagery as a viewer moves from side to side relative to the display screens.
2. The display system of claim 1 , wherein:
the scene is larger than the plurality of display screens are capable of displaying such that the at least one memory stores non-visible foreground or background imagery; and
at least some of the non-visible foreground or background imagery becomes visible as a result of the shifting in the relative position of the foreground and background imagery in the displayed scene as a function of viewer movement.
3. The display system of claim 1 , wherein the shifting of the relative position of the foreground and background imagery in the displayed scene as a function of viewer movement causes at least some of the foreground or background imagery on one display screen to become visible on another one of the display screens.
4. The display system of claim 1 , including prerecorded foreground or background imagery.
5. The display system of claim 1 , wherein at least some of the foreground or background imagery includes imagery received through a transmission medium.
6. The display system of claim 1 , wherein at least some of the foreground or background imagery includes imagery received from a camera.
7. The display system of claim 1 , further including:
a camera that records the scene through panning at sequential angles; and
wherein the memory stores imagery obtained at the sequential angles for later recall as a function of user movement.
8. The display system of claim 7 , further including:
an interpolator for filling in visual gaps in the scene.
9. The display system of claim 1 , wherein:
the display screens are mounted on a wall having a backside; and
further including a camera mounted on the backside of the wall which pans as a viewer moves, thereby imaging the scene on the display screens as virtual windows through the wall.
10. The display system of claim 1 , further including:
a camera with a zoom capability for recording the scene;
wherein the sensor is operative to detect the viewer's distance from the display screens; and
wherein the processor is further operative to zoom in the camera as the viewer moves closer to the display screens and zoom out the camera as the viewer moves away from the display screens.
11. The display system of claim 1 , wherein:
the sensor is operative to detect the viewer's distance from the display screens; and
the processor is operative to increase the resolution of the scene as the viewer moves closer to the display screens and decrease the resolution of the scene as the viewer moves away from the display screens.
12. The display system of claim 1 , further including:
a camera with a tilt capability for recording the scene;
the sensor being operative to detect the viewer's up/down movement relative to the display screens; and
wherein the processor is further operative to tilt the camera in response to the viewer's up/down movement.
13. The display system of claim 1 , further including:
a camera for recording the scene, the camera having a field of view including a viewer of the display screen; and
wherein the processor is further operative to flip the scene horizontally such that the display functions as a virtual mirror.
14. The display system of claim 1 , further including:
a plurality of cameras for recording the scene; and
wherein the processor is further operative to construct a three-dimensional image for display on the screens.
15. The display system of claim 1 , further including:
a plurality of cameras for recording the scene;
the processor being further operative to construct a three-dimensional image for display on the screens; and
a user control enabling a viewer to select a specific camera or cameras to see how others would view the user from different perspectives.
16. The display system of claim 1 , wherein:
the sensor is a camera having an image sensor; and
an image of a viewer is focused onto the image sensor as a spot or group of pixels, allowing the movements of the viewer to be tracked with no moving parts.
17. The display system of claim 1 , wherein:
the sensor is a camera having at least a pan mount that tracks the movement of a viewer; and
the processor is operative to shift the relative position of the foreground and background imagery as a function of the tracking.
18. The display system of claim 1 , wherein:
the sensor is operative to sense whether an individual is looking toward the display screens; and
the processor is operative to shift the relative position of the foreground and background imagery by favoring individuals looking toward the display screens.
19. The display of claim 1 , wherein the display screens are contiguous.
20. The display of claim 1 , wherein the display screens are not contiguous.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/452,901 US20190320163A1 (en) | 2007-08-24 | 2019-06-26 | Multi-screen perspective altering display system |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US95784507P | 2007-08-24 | 2007-08-24 | |
US12/197,635 US10063848B2 (en) | 2007-08-24 | 2008-08-25 | Perspective altering display system |
US16/046,065 US10397556B2 (en) | 2007-08-24 | 2018-07-26 | Perspective altering display system |
US16/452,901 US20190320163A1 (en) | 2007-08-24 | 2019-06-26 | Multi-screen perspective altering display system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/046,065 Division US10397556B2 (en) | 2007-08-24 | 2018-07-26 | Perspective altering display system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190320163A1 true US20190320163A1 (en) | 2019-10-17 |
Family
ID=40381715
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/197,635 Active 2034-06-23 US10063848B2 (en) | 2007-08-24 | 2008-08-25 | Perspective altering display system |
US16/046,065 Active US10397556B2 (en) | 2007-08-24 | 2018-07-26 | Perspective altering display system |
US16/452,901 Abandoned US20190320163A1 (en) | 2007-08-24 | 2019-06-26 | Multi-screen perspective altering display system |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/197,635 Active 2034-06-23 US10063848B2 (en) | 2007-08-24 | 2008-08-25 | Perspective altering display system |
US16/046,065 Active US10397556B2 (en) | 2007-08-24 | 2018-07-26 | Perspective altering display system |
Country Status (1)
Country | Link |
---|---|
US (3) | US10063848B2 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190020498A1 (en) * | 2015-12-31 | 2019-01-17 | Robert Bosch Gmbh | Intelligent Smart Room Control System |
WO2022023142A1 (en) * | 2020-07-27 | 2022-02-03 | Roomality Limited | Virtual window |
US20230199161A1 (en) * | 2021-12-17 | 2023-06-22 | Samsung Electronics Co., Ltd. | 3d arts viewing on display devices |
US11900521B2 (en) | 2020-08-17 | 2024-02-13 | LiquidView Corp | Virtual window apparatus and system |
Families Citing this family (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8427535B2 (en) * | 2008-06-07 | 2013-04-23 | Rick Davis | Personal grooming visual display system |
US8259178B2 (en) * | 2008-12-23 | 2012-09-04 | At&T Intellectual Property I, L.P. | System and method for creating and manipulating synthetic environments |
US20110069158A1 (en) * | 2009-09-21 | 2011-03-24 | Dekel Shiloh | Virtual window system and method |
US8791949B1 (en) * | 2010-04-06 | 2014-07-29 | The Pnc Financial Services Group, Inc. | Investment management marketing tool |
US8780115B1 (en) * | 2010-04-06 | 2014-07-15 | The Pnc Financial Services Group, Inc. | Investment management marketing tool |
US9167289B2 (en) * | 2010-09-02 | 2015-10-20 | Verizon Patent And Licensing Inc. | Perspective display systems and methods |
US8896631B2 (en) * | 2010-10-25 | 2014-11-25 | Hewlett-Packard Development Company, L.P. | Hyper parallax transformation matrix based on user eye positions |
US20120320080A1 (en) * | 2011-06-14 | 2012-12-20 | Microsoft Corporation | Motion based virtual object navigation |
US9509922B2 (en) * | 2011-08-17 | 2016-11-29 | Microsoft Technology Licensing, Llc | Content normalization on digital displays |
US20130083252A1 (en) * | 2011-10-04 | 2013-04-04 | David John Boyes | Gesture recognition capable picture video frame |
US9274597B1 (en) * | 2011-12-20 | 2016-03-01 | Amazon Technologies, Inc. | Tracking head position for rendering content |
US9389682B2 (en) | 2012-07-02 | 2016-07-12 | Sony Interactive Entertainment Inc. | Methods and systems for interaction with an expanded information space |
EP2693363B1 (en) * | 2012-07-31 | 2015-07-22 | Sick Ag | Camera system and method for recording a flow of objects |
US10116911B2 (en) * | 2012-12-18 | 2018-10-30 | Qualcomm Incorporated | Realistic point of view video method and apparatus |
US9342467B1 (en) | 2013-07-17 | 2016-05-17 | Frederick Thomas McGrath | Virtual window system |
US20160205492A1 (en) * | 2013-08-21 | 2016-07-14 | Thomson Licensing | Video display having audio controlled by viewing direction |
WO2015025185A1 (en) * | 2013-08-21 | 2015-02-26 | Thomson Licensing | Video display with pan function controlled by viewing direction |
US20150221064A1 (en) * | 2014-02-03 | 2015-08-06 | Nvidia Corporation | User distance based modification of a resolution of a display unit interfaced with a data processing device and/or a display area size thereon |
US9990438B2 (en) | 2014-03-13 | 2018-06-05 | Ebay Inc. | Customized fitting room environment |
US10134187B2 (en) | 2014-08-07 | 2018-11-20 | Somo Innvoations Ltd. | Augmented reality with graphics rendering controlled by mobile device position |
EP3286596B1 (en) | 2015-04-21 | 2022-11-02 | University of Rochester | Cloaking systems and methods |
US10609438B2 (en) | 2015-08-13 | 2020-03-31 | International Business Machines Corporation | Immersive cognitive reality system with real time surrounding media |
KR20170035608A (en) * | 2015-09-23 | 2017-03-31 | 삼성전자주식회사 | Videotelephony System, Image Display Apparatus, Driving Method of Image Display Apparatus, Method for Generation Realistic Image and Computer Readable Recording Medium |
WO2018027110A1 (en) * | 2016-08-05 | 2018-02-08 | University Of Rochester | Virtual window |
US10496238B2 (en) | 2016-08-22 | 2019-12-03 | University Of Rochester | 3D display ray principles and methods, zooming, and real-time demonstration |
US20180063444A1 (en) * | 2016-08-29 | 2018-03-01 | Razmik Karabed | View friendly monitor systems |
US10654422B2 (en) | 2016-08-29 | 2020-05-19 | Razmik Karabed | View friendly monitor systems |
JP6320488B1 (en) * | 2016-11-07 | 2018-05-09 | ヤフー株式会社 | Virtual reality providing system, virtual reality providing method, virtual reality providing device, and program |
WO2018110821A1 (en) * | 2016-12-14 | 2018-06-21 | Samsung Electronics Co., Ltd. | Display apparatus and method for controlling the display apparatus |
US10264222B2 (en) * | 2017-02-08 | 2019-04-16 | Phazr, Inc. | Window-installed wireless communications device |
EP3607420B1 (en) * | 2017-05-10 | 2021-06-23 | Microsoft Technology Licensing, LLC | Presenting applications within virtual environments |
WO2018232050A1 (en) * | 2017-06-14 | 2018-12-20 | Dpa Ventures, Inc. | Artificial window system |
US10606347B1 (en) | 2017-07-31 | 2020-03-31 | Facebook, Inc. | Parallax viewer system calibration |
JP6767319B2 (en) * | 2017-07-31 | 2020-10-14 | 株式会社ソニー・インタラクティブエンタテインメント | Information processing device and file copy method |
CN113170231A (en) * | 2019-04-11 | 2021-07-23 | 华为技术有限公司 | Method and device for controlling playing of video content following user motion |
CN114144753A (en) * | 2019-07-30 | 2022-03-04 | 索尼集团公司 | Image processing apparatus, image processing method, and recording medium |
US11388354B2 (en) | 2019-12-06 | 2022-07-12 | Razmik Karabed | Backup-camera-system-based, on-demand video player |
CN111246116B (en) * | 2020-03-20 | 2022-03-11 | 谌春亮 | Method for intelligent framing display on screen and mobile terminal |
CN112053940B (en) * | 2020-07-07 | 2022-04-26 | 北京华卓精科科技股份有限公司 | Device and method for compensating wedge error in wafer bonding |
CN115695771A (en) * | 2021-07-28 | 2023-02-03 | 京东方科技集团股份有限公司 | Display device and display method thereof |
Family Cites Families (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2815310A (en) | 1952-03-01 | 1957-12-03 | Pictorial Prod Inc | Process of assembling in the art of changeable picture display devices |
US3463581A (en) | 1966-01-17 | 1969-08-26 | Intermountain Res & Eng | System for three-dimensional panoramic static-image motion pictures |
US4135502A (en) | 1976-09-07 | 1979-01-23 | Donald Peck | Stereoscopic patterns and method of making same |
US4896210A (en) | 1987-11-16 | 1990-01-23 | Brokenshire Daniel A | Stereoscopic graphics display terminal with image data processing |
US4987487A (en) | 1988-08-12 | 1991-01-22 | Nippon Telegraph And Telephone Corporation | Method of stereoscopic images display which compensates electronically for viewer head movement |
US5086354A (en) | 1989-02-27 | 1992-02-04 | Bass Robert E | Three dimensional optical viewing system |
US4956705A (en) | 1989-03-10 | 1990-09-11 | Dimensional Visions Group | Electronic method and apparatus for stereoscopic photography |
DE3921061A1 (en) | 1989-06-23 | 1991-01-03 | Hertz Inst Heinrich | DISPLAY DEVICE FOR THREE-DIMENSIONAL PERCEPTION OF IMAGES |
JP3104909B2 (en) | 1989-12-05 | 2000-10-30 | ソニー株式会社 | Image processing device |
US5253051A (en) | 1991-03-05 | 1993-10-12 | Mcmanigal Paul G | Video artificial window apparatus |
US5613048A (en) | 1993-08-03 | 1997-03-18 | Apple Computer, Inc. | Three-dimensional image synthesis using view interpolation |
US5543964A (en) | 1993-12-28 | 1996-08-06 | Eastman Kodak Company | Depth image apparatus and method with angularly changing display information |
US6005967A (en) | 1994-02-18 | 1999-12-21 | Matsushita Electric Industrial Co., Ltd. | Picture synthesizing apparatus and method |
AUPM701394A0 (en) * | 1994-07-22 | 1994-08-18 | Monash University | A graphical display system |
US5574836A (en) * | 1996-01-22 | 1996-11-12 | Broemmelsiek; Raymond M. | Interactive display apparatus and method with viewer position compensation |
US5956180A (en) | 1996-12-31 | 1999-09-21 | Bass; Robert | Optical viewing system for asynchronous overlaid images |
JP3103045B2 (en) | 1997-07-10 | 2000-10-23 | Mitsubishi Electric Corp. | Image capturing/reproducing system and method, and recording medium recording image reproduction program |
US5963303A (en) | 1997-09-04 | 1999-10-05 | Allen; Dann M. | Stereo pair and method of making stereo pairs |
AU4184399A (en) | 1998-05-13 | 1999-11-29 | Infinite Pictures Inc. | Panoramic movies which simulate movement through multidimensional space |
DE10005335C2 (en) * | 2000-02-08 | 2002-06-27 | Daimler Chrysler Ag | Method and device for multi-dimensional representation of an object |
US6760026B2 (en) * | 2001-01-02 | 2004-07-06 | Microsoft Corporation | Image-based virtual reality player with integrated 3D graphics objects |
US7079706B2 (en) | 2001-06-20 | 2006-07-18 | Paul Peterson | Methods and apparatus for generating a multiple composite image |
DE60237834D1 (en) * | 2001-08-15 | 2010-11-11 | Koninkl Philips Electronics Nv | 3D VIDEO CONFERENCE SYSTEM |
US20030080937A1 (en) * | 2001-10-30 | 2003-05-01 | Light John J. | Displaying a virtual three-dimensional (3D) scene |
US7883415B2 (en) * | 2003-09-15 | 2011-02-08 | Sony Computer Entertainment Inc. | Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion |
GB0329312D0 (en) | 2003-12-18 | 2004-01-21 | Univ Durham | Mapping perceived depth to regions of interest in stereoscopic images |
US7663689B2 (en) * | 2004-01-16 | 2010-02-16 | Sony Computer Entertainment Inc. | Method and apparatus for optimizing capture device settings through depth information |
US7626569B2 (en) * | 2004-10-25 | 2009-12-01 | Graphics Properties Holdings, Inc. | Movable audio/video communication interface system |
US20070291035A1 (en) * | 2004-11-30 | 2007-12-20 | Vesely Michael A | Horizontal Perspective Representation |
US20070060346A1 (en) | 2005-06-28 | 2007-03-15 | Samsung Electronics Co., Ltd. | Tool for video gaming system and method |
US8730156B2 (en) * | 2010-03-05 | 2014-05-20 | Sony Computer Entertainment America Llc | Maintaining multiple views on a shared stable virtual space |
JP2009533721A (en) * | 2006-04-17 | 2009-09-17 | サード ディメンション アイピー エルエルシー | System and method for real 3D display of angular slices |
JP2012501506A (en) | 2008-08-31 | 2012-01-19 | ミツビシ エレクトリック ビジュアル ソリューションズ アメリカ, インコーポレイテッド | Conversion of 3D video content that matches the viewer position |
JP2011205358A (en) * | 2010-03-25 | 2011-10-13 | Fujifilm Corp | Head-mounted display device |
US8704879B1 (en) * | 2010-08-31 | 2014-04-22 | Nintendo Co., Ltd. | Eye tracking enabling 3D viewing on conventional 2D display |
US9485487B2 (en) * | 2011-06-22 | 2016-11-01 | Koninklijke Philips N.V. | Method and apparatus for generating a signal for a display |
US9591295B2 (en) * | 2013-09-24 | 2017-03-07 | Amazon Technologies, Inc. | Approaches for simulating three-dimensional views |
US20150130799A1 (en) * | 2013-11-12 | 2015-05-14 | Fyusion, Inc. | Analysis and manipulation of images and video for generation of surround views |
- 2008
  - 2008-08-25 US US12/197,635 patent/US10063848B2/en active Active
- 2018
  - 2018-07-26 US US16/046,065 patent/US10397556B2/en active Active
- 2019
  - 2019-06-26 US US16/452,901 patent/US20190320163A1/en not_active Abandoned
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190020498A1 (en) * | 2015-12-31 | 2019-01-17 | Robert Bosch Gmbh | Intelligent Smart Room Control System |
WO2022023142A1 (en) * | 2020-07-27 | 2022-02-03 | Roomality Limited | Virtual window |
US11900521B2 (en) | 2020-08-17 | 2024-02-13 | LiquidView Corp | Virtual window apparatus and system |
US20230199161A1 (en) * | 2021-12-17 | 2023-06-22 | Samsung Electronics Co., Ltd. | 3d arts viewing on display devices |
Also Published As
Publication number | Publication date |
---|---|
US10397556B2 (en) | 2019-08-27 |
US10063848B2 (en) | 2018-08-28 |
US20180343441A1 (en) | 2018-11-29 |
US20090051699A1 (en) | 2009-02-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10397556B2 (en) | Perspective altering display system | |
JP4848339B2 (en) | Virtual window method, system, and computer program with simulated parallax and field-of-view change | |
US7980704B2 (en) | Audiovisual system including wall-integrated audiovisual capabilities | |
US20100013738A1 (en) | Image capture and display configuration | |
US7079173B2 (en) | Displaying a wide field of view video image | |
KR102491749B1 (en) | Panoramic 3D Imaging System | |
US20080246759A1 (en) | Automatic Scene Modeling for the 3D Camera and 3D Video | |
KR20150068299A (en) | Method and system of generating images for multi-surface display | |
US20070247518A1 (en) | System and method for video processing and display | |
US6836286B1 (en) | Method and apparatus for producing images in a virtual space, and image pickup system for use therein | |
CN112435558A (en) | Holographic 3D intelligent interactive digital virtual sand table and interactive method thereof | |
EP2408191A1 (en) | A staging system and a method for providing television viewers with a moving perspective effect | |
JP2009010915A (en) | Video display method and video system | |
CN110458953A (en) | Three-dimensional image reconstruction system and method | |
CN2667827Y (en) | Quasi-panoramic surround visual reproduction system | |
JP2007501950A (en) | 3D image display device | |
CN110730340B (en) | Virtual audience display method, system and storage medium based on lens transformation | |
JP2019512177A (en) | Device and related method | |
Naimark | Elements of real-space imaging: a proposed taxonomy | |
EP1892558A2 (en) | 3d image capture camera and non-stereoscopic 3d viewing device that does not require glasses | |
JP4019785B2 (en) | Image display system, image processing apparatus, and image display method | |
CN210605808U (en) | Three-dimensional image reconstruction system | |
JP2010266696A (en) | All view type panoramic stereoscopic viewer device | |
JP4505559B2 (en) | Distant panel for studio set and studio set using the same | |
JP5457668B2 (en) | Video display method and video system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | APPLICATION UNDERGOING PREEXAM PROCESSING |
| STCB | Information on status: application discontinuation | ABANDONED -- INCOMPLETE APPLICATION (PRE-EXAMINATION) |