WO2019178635A1 - Computer assisted virtual reality display system - Google Patents

Computer assisted virtual reality display system

Info

Publication number
WO2019178635A1
WO2019178635A1 (PCT/AU2018/051174; AU2018051174W)
Authority
WO
WIPO (PCT)
Prior art keywords
display
computer assisted
images
assisted virtual
display surface
Prior art date
Application number
PCT/AU2018/051174
Other languages
French (fr)
Inventor
Bruce Robert DELL
Original Assignee
Euclideon Holographics Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AU2018900926A external-priority patent/AU2018900926A0/en
Application filed by Euclideon Holographics Pty Ltd filed Critical Euclideon Holographics Pty Ltd
Publication of WO2019178635A1 publication Critical patent/WO2019178635A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/275 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
    • H04N13/279 Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals the virtual viewpoint locations being selected by the viewers or determined by tracking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/366 Image reproducers using viewer tracking
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/34 Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/366 Image reproducers using viewer tracking
    • H04N13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B2027/0178 Eyeglass type
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/22 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/366 Image reproducers using viewer tracking
    • H04N13/368 Image reproducers using viewer tracking for two or more viewers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N2013/40 Privacy aspects, i.e. devices showing different images to different viewers, the images not being viewpoints of the same scene
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N2013/40 Privacy aspects, i.e. devices showing different images to different viewers, the images not being viewpoints of the same scene
    • H04N2013/405 Privacy aspects, i.e. devices showing different images to different viewers, the images not being viewpoints of the same scene the images being stereoscopic or three dimensional

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A computer assisted virtual display system including at least two display surfaces angled away from each other for presenting images to one or more corresponding users of each display surface, two or more pairs of stereoscopic 3D glasses for wearing by each of the users, a tracking system configured to track each of the stereoscopic 3D glasses, an imaging system including two or more imaging devices for producing images on the two or more display surfaces, and a computational device in communication with the tracking system and programmed to respond to the tracking system. The computational device controls the imaging system to produce stereoscopic 3D images based on the orientation of each display surface and data from the tracking system for each of the corresponding users, so that each user perceives different views of the same virtual environment appropriate to their position and their display surface.

Description

COMPUTER ASSISTED VIRTUAL REALITY DISPLAY SYSTEM
FIELD OF THE INVENTION
[0001] The invention relates to a computer assisted virtual reality display system.
BACKGROUND
[0002] Reference to background art herein is not to be construed as an admission that such art constitutes common general knowledge.
[0003] In one type of computer assisted virtual reality display system there is provided:
1. At least one surface for presenting images of a virtual environment to a user;
2. An imaging system, including at least one projector for producing images on the at least one surface;
3. A tracking system that is configured to track the user’s viewpoint and viewing direction;
4. A computer that is programmed to respond to the tracking system and thereby control the imaging system to produce stereo 3D images of the virtual environment in response to the tracking system;
5. Stereo 3D glasses worn by the user in order for the user to perceive the stereo 3D images.
[0004] The present applicant operates a recreational virtual reality centre that includes the above components on a commercial basis at 59 Nerang St, Southport, Queensland, 4215, Australia (http://holoverse.com.au).
[0005] While the above system has been found to work very well it will be realized that each display system is adapted for generating a display specifically for the particular user that is being tracked. Consequently, if a second person views the display they will not see the display from their viewpoint but rather from the view point of the first person who is the user that is being tracked by the tracking system. Therefore, for the second person the 3D scene being presented will not change dynamically in accordance with the second person’s movements and so there will be no illusion of virtual reality for the second person.
OBJECT OF THE INVENTION
[0006] It is an aim of this invention to provide a computer assisted virtual reality display system which overcomes or ameliorates one or more of the disadvantages or problems described above, or which at least provides a useful commercial alternative.
[0007] Other preferred objects of the present invention will become apparent from the following description.
SUMMARY OF THE INVENTION
[0008] According to a first aspect of the present invention there is provided a computer assisted virtual display system comprising:
at least two display surfaces for presenting images to one or more corresponding users of each display surface, said display surfaces being angled away from each other to reduce visibility of other displays to the one or more corresponding users of one of the display surfaces;
two or more pairs of stereoscopic 3D glasses for wearing by each of the users;
a tracking system configured to track each of the stereoscopic 3D glasses;
an imaging system including two or more imaging devices for producing images on the two or more display surfaces; and
a computational device in communication with the tracking system and programmed to respond to the tracking system and thereby control the imaging system to produce stereoscopic 3D images based on orientations of each display surface and tracking data from the tracking system for each of the corresponding users for each user to perceive different views of the same virtual environment appropriate to their position and their display surface.
[0009] In one preferable embodiment, the system comprises two display surfaces. Preferably, the two display surfaces are arranged end to end.
[0010] Preferably, the display surfaces are inclined at an angle relative to a horizontal axis. Preferably, the display surfaces are arranged to form a substantially triangular structure. Suitably, in use, each display surface appears substantially flat or planar to a user. Preferably, the display surfaces are arranged opposingly.
[0011] Preferably, each display surface is orientated at an angle of between 0° and 90° to horizontal. More preferably, the angle is between 30° and 90°. Even more preferably, the angle is between 45° and 90° to horizontal.
[0012] Preferably, the imaging devices are located beneath the display surfaces and arranged to project images onto the display surfaces.
[0013] In some embodiments, the angle of each display surface is about 90°. In some further embodiments, a display surface extends perpendicularly from each display surface angled at 90°. Preferably, the imaging devices are located above the display surfaces and produce images on both the display surface angled at 90° and the perpendicular display surface.
[0014] Preferably, the system further comprises an input device for interfacing with the computational device to thereby affect the stereoscopic 3D images produced on the display surface.
[0015] In some embodiments, the imaging system comprises two sets of two imaging devices, wherein a first set of two imaging devices produce images on a first display surface and a second set of two imaging devices produce images on a second display surface. Preferably, the stereoscopic 3D glasses further comprise an image separation arrangement to present images from the imaging system to corresponding ones of the two or more users whereby each user sees images emanating from a corresponding one of said sets of projectors only. More preferably, the image separation arrangement comprises a frequency separation crystal film. Preferably, the imaging system comprises additional sets of two imaging devices for additional display surfaces.
[0016] Preferably the image separation arrangement includes two or more filters for separating images from the two or more sets of projectors wherein the different viewing filters have different visible light spectral transmission characteristics.
[0017] In a preferred embodiment of the invention the filters comprise interference filters. For example, the filters may comprise films of dichroic material.
[0018] Preferably the filters of the image separation arrangement comprise corresponding viewer and projector interference filters, wherein a viewer interference filter is mounted to the stereoscopic 3D glasses for the user and projector interference filters are mounted to projectors of the corresponding set of projectors.
[0019] In an alternative embodiment the stereoscopic 3D glasses comprise first and second visible light filter windows.
[0020] Preferably, the display surfaces are quadrilateral. More preferably, the display surfaces are rectangular.
[0021] Preferably, the system comprises three display surfaces.
[0022] Preferably, the system comprises three triangular display surfaces. More preferably, the three triangular display surfaces are arranged in a substantially triangular pyramid arrangement. In some alternative embodiments, the three display surfaces are quadrilateral wherein each display surface is arranged at substantially 90° relative to an adjacent display surface.
[0023] Preferably, the system comprises four triangular display surfaces. More preferably, the four triangular display surfaces are arranged in a substantially square pyramid arrangement. In some alternative embodiments, the four display surfaces are quadrilateral.
[0024] Preferably the system comprises four quadrilateral display surfaces, wherein each display surface is arranged at substantially 90° relative to an adjacent display surface.
[0025] According to a second aspect, the invention resides in a method for operating a computer assisted virtual reality display system, the method comprising the steps of:
detecting a position of each of two or more pairs of stereoscopic 3D glasses of two or more users relative to two or more differently orientated display surfaces;
processing with a computational device the position of each of the stereoscopic 3D glasses of the two or more users;
processing with the computational device the orientation of each display surface relative to an associated imaging device to calculate a projection transform;
operating two or more imaging devices in communication with the computational device to present an image to each user upon their respective display surfaces based upon the position of the glasses of the one or more users and the projection transform, wherein each display surface displays the same image.
BRIEF DESCRIPTION OF THE DRAWINGS
[0026] By way of example only, preferred embodiments of the invention will be described more fully hereinafter with reference to the accompanying figures, wherein:
[0027] Figure 1 illustrates a perspective view of a display system according to an embodiment of the present invention;
[0028] Figure 2 illustrates a side view of the display system shown in Figure 1;
[0029] Figure 2A illustrates the display system of Figure 1 and a computer system according to a preferred embodiment of the present invention;
[0030] Figure 3 illustrates one embodiment of the stereoscopic 3D glasses of the display system;
[0031] Figure 4 illustrates another embodiment of the stereoscopic 3D glasses of the display system;
[0032] Figure 5 illustrates a second embodiment of a display system according to the present invention; and
[0033] Figure 6 illustrates a third embodiment of a display system according to the present invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0034] Referring now to Figures 1 and 2 there is depicted a computer assisted virtual reality display system 10 for two or more users according to a first embodiment of the present invention.
[0035] The system 10 includes a table 100 that has a first display surface 105a and a second display surface 105b, made from, for example, a web or sheet of translucent material, for presenting images of a virtual environment to two or more users.
[0036] Attached to opposite ends of the table are input devices in the form of a set of hand-operated controls 107a, 107b to allow each user to interact with the display surfaces 105a, 105b. In the illustrated embodiment these controls take the form of buttons but can be any suitable form of input mechanism, such as a joystick or keyboard, for example.
[0037] System 10 also includes an imaging system comprising projectors 110a and 110b. One projector is located below each display surface 105a, 105b of the table 100 and is arranged to project onto the underside of that display surface. Due to the translucent nature of the display surfaces 105a, 105b, any projections on the underside of the display surfaces 105a, 105b are visible on the top of the display surfaces 105a, 105b.
[0038] A tracking system 115 is also provided that comprises sensors 120a...120d. The sensors 120a,...,120d are configured to sense the position and orientation of targets affixed to special glasses (which will be described below) that are worn by users 125a, 125b of the system.
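By way of illustration only: the specification states that the targets give the position and orientation of each pair of glasses, but does not prescribe how per-eye viewpoints are derived from them. A common approach is to offset the tracked pose laterally by half the interpupillary distance for each eye. The following Python sketch assumes that convention; the function name, axis convention, and the 0.064 m default are illustrative and not part of the disclosure.

```python
import numpy as np

def eye_positions(target_pos, target_rot, ipd=0.064):
    """Estimate left/right eye positions from a tracked glasses target.

    target_pos: 3-vector world position of a target such as 137a/137b.
    target_rot: 3x3 rotation of the target; column 0 is assumed to point
    along the wearer's right. ipd is the interpupillary distance in
    metres (0.064 m is a common adult average; an assumption here).
    """
    right = target_rot[:, 0]
    left_eye = target_pos - right * (ipd / 2.0)
    right_eye = target_pos + right * (ipd / 2.0)
    return left_eye, right_eye
```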
[0039] System 10 further includes a computer 130 that is coupled to the projectors 110a, 110b and to the tracking sensors 120a,...,120d, and is specifically programmed to communicate with the tracking sensors and issue instructions to the projectors.
[0040] As mentioned above, the computer 130 is programmed to respond to the tracking system 115 and thereby control the imaging system, i.e. projectors 110a and 110b, to produce stereo 3D images of a virtual environment corresponding to both a projection transform determined based on the angles of inclination θ1, θ2 of the display surfaces 105a, 105b and the tracking data from the tracking system for each of the two or more users 125a, 125b. The virtual environment may comprise a vertex and edge polygon model or an octree type solid model stored in computer 130 or accessible thereto.
[0041] From the figures it can be seen that the display surfaces 105a, 105b are arranged adjacent to one another and positioned end to end at predetermined angles θ1, θ2 above a horizontal member 111 to form a substantially triangular prism shape or tent-like shape. Surprisingly, the Inventor has found that, in use, with appropriate transformations applied to the projections the inclined display surfaces can be made to appear to each user to extend from one user to another in a planar fashion, i.e. each display surface appears to be horizontal to its user. As a result, it appears to the two users that they are using the same display surface and are interacting with the same image while, in fact, each user has their own display surface and own projected image thereon.
[0042] With the above in mind, the two display surfaces 105a, 105b are arranged opposingly so that multiple display surfaces are not viewable from any one of the display surfaces. For example, display surface 105b is not viewable to user 125a when user 125a is positioned at display surface 105a. This prevents the first user 125a, for example, from seeing both images generated for viewing by user 125a on display surface 105a and also an image generated on display surface 105b for viewing by user 125b, which would break the illusion of a single display table.
[0043] While in the illustrated embodiments the display surfaces are shown to be inclined, in some alternative embodiments, the display surfaces are substantially vertical or perpendicular to the floor (which will be described in more detail later).
[0044] Referring now to Figure 2A, there is shown the computer assisted virtual reality display system 10 including the flow of data to and from the computer 130 and computer system 130A to control the imaging devices 110a, 110b and stereoscopic 3D glasses 135a, 135b.
[0045] As indicated in the diagram, data collected by the tracking system 115 is communicated to the computer 130, which uses the tracking data and the orientation (including the angles of inclination θ1, θ2) of each of the display surfaces 105a, 105b to instruct the imaging devices 110a, 110b to produce the appropriate stereoscopic 3D images on each of the display surfaces 105a, 105b for the respective users 125a, 125b.
[0046] In addition, the computer 130 outputs instructions to the stereoscopic 3D glasses 135a, 135b to control the active shutter windows and receives inputs from the control interfaces 107a, 107b operated by the users 125a, 125b to manipulate and interact with the displayed images.
[0047] Referring to the computer system 130A, using the operating system 131, an administrator loads imaging software, including projection transform software 132. The projection transform software 132 uses the tracking data and the orientation of the display surfaces 105a, 105b to create a projection transform, such as a projection transform matrix or set of matrices, which are applied to an image model to create the appropriate images for projection onto the display surfaces 105a, 105b for each user 125a, 125b.
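The specification does not set out the form of the projection transform 132. A standard construction for a tracked viewer and a planar display surface is the generalized ("off-axis") perspective projection, in which the frustum is defined by the eye position and the world-space corners of the surface, so that the inclination angle θ1 or θ2 enters through the corner coordinates. The Python sketch below assumes OpenGL-style clip space; all names are illustrative, not taken from the disclosure.

```python
import numpy as np

def frustum(left, right, bottom, top, near, far):
    """Standard OpenGL off-axis frustum matrix."""
    return np.array([
        [2*near/(right-left), 0, (right+left)/(right-left), 0],
        [0, 2*near/(top-bottom), (top+bottom)/(top-bottom), 0],
        [0, 0, -(far+near)/(far-near), -2*far*near/(far-near)],
        [0, 0, -1, 0]])

def projection_for_surface(pa, pb, pc, eye, near=0.05, far=100.0):
    """Projection transform for one tracked eye and one planar surface.

    pa, pb, pc: world-space corners of the display surface (lower-left,
    lower-right, upper-left); these encode its inclination and
    orientation. eye: tracked world-space eye position.
    """
    vr = (pb - pa) / np.linalg.norm(pb - pa)      # surface right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)      # surface up axis
    vn = np.cross(vr, vu)
    vn /= np.linalg.norm(vn)                      # surface normal

    d = -np.dot(pa - eye, vn)                     # eye-to-plane distance
    left = np.dot(vr, pa - eye) * near / d
    right = np.dot(vr, pb - eye) * near / d
    bottom = np.dot(vu, pa - eye) * near / d
    top = np.dot(vu, pc - eye) * near / d

    m = np.eye(4)                                 # rotate world into the
    m[0, :3], m[1, :3], m[2, :3] = vr, vu, vn     # surface's basis
    t = np.eye(4)
    t[:3, 3] = -eye                               # move the eye to origin
    return frustum(left, right, bottom, top, near, far) @ m @ t
```

Applied per eye and per surface, this matrix is what makes an inclined surface appear planar and horizontal to its tracked user, as described in paragraph [0041].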
[0048] With reference to Figure 3, the system 10 also includes stereo 3D glasses 135a, 135b that are worn by each of the users 125a, 125b in order for the users to perceive the stereoscopic 3D images. Affixed to each of the glasses 135a, 135b are the aforementioned targets 137a, 137b which are tracked by the tracking system 115. Each of the glasses 135a, 135b also includes first and second active shutter windows 140a, 140a’ and 140b, 140b’ respectively. The action of the shutter windows 140a, 140a’ and 140b, 140b’ is synchronized to the stereoscopic images generated by the corresponding projectors 110a, 110b. Active shutter stereoscopic glasses are well known and used for perceiving 3D stereoscopic images, for example from suitably equipped LCD and plasma televisions. Other types of stereoscopic systems may also be used, such as differently polarized left and right windows (though these suffer from loss of 3D effect as the head is rotated) and anaglyphic windows, e.g. red/cyan windows (however anaglyphic windows typically cause a loss of colour realism).
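The frame-sequential operation of the shutter windows can be pictured as follows. The specification only states that the shutters are synchronized to the projected images; the 120 Hz rate and the driver objects in this sketch are assumptions standing in for the shutter instructions computer 130 issues in Figure 2A.

```python
import time

REFRESH_HZ = 120                   # assumed frame-sequential display rate
PERIOD_S = 1.0 / REFRESH_HZ

def shutter_loop(projector, shutter_glasses):
    """Alternate left/right images and open only the matching shutter.

    projector and shutter_glasses are hypothetical driver objects.
    """
    frame = 0
    while True:
        eye = "left" if frame % 2 == 0 else "right"
        projector.show_eye(eye)         # present this eye's image
        shutter_glasses.open_only(eye)  # occlude the other eye
        time.sleep(PERIOD_S)            # real systems lock to vsync instead
        frame += 1
```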
[0049] In some preferable embodiments, the system also includes an image separation arrangement to present images from the two or more projectors 110a, 110b to corresponding ones of the two or more users whereby each user sees images emanating from a corresponding one of said projectors only.
[0050] The image separation arrangement of the preferable embodiments mentioned above includes first and second interference filters 145a and 145b that fit over the shutter windows 140a, 140a’ and 140b, 140b’ respectively, and can be seen in Figure 4.
[0051] The first and second interference filters 145a and 145b are formed of dichroic material and have different, orthogonal visible light spectral transmission characteristics. Filter 145a is made of a filter material F1 having a specific first red (R1), blue (B1), green (G1) transmission characteristic. The second interference filter 145b is made of filter material F2 having a specific second red (R2), blue (B2), green (G2) transmission characteristic that is non-overlapping with the transmission characteristic of filter material F1. Consequently, light that passes through filter F1 will be entirely blocked by filter F2 and vice versa. Accordingly, the filters F1 and F2 are said to have “orthogonal” transmission characteristics.
[0052] Returning again to Figure 4, the image separation arrangement further includes projector filters 150a of material F1 and 150b of material F2, which fit over the output lenses of the projectors 110a and 110b respectively. Consequently, light from projector 110a is incident upon filter 150a of material F1. Only light with wavelengths falling within passbands B1, G1 and R1 of the filter material F1 passes through the filter. Accordingly, the light escaping from projector filter 150a can pass through glasses filter 145a, since filters 150a and 145a are made of the same material and have the same spectral bandpasses B1, G1, R1. Similarly, only light with wavelengths falling within passbands B2, G2 and R2 of the filter material F2 passes through filter 150b of the second projector 110b.
[0053] Similarly, the light escaping from projector filter 150b can pass through glasses filter 145b, since filters 150b and 145b are made of the same material and have the same spectral bandpasses B2, G2, R2. However, the first glasses filter 145a, which is made of material F1, will completely block light from the second projector filter 150b, which is made of material F2, because the transmission characteristics of the F1 and F2 filter materials are orthogonal and have no overlap. Therefore, the paired glasses and projector filters 145a, 150a and 145b, 150b serve to separate images from the projectors 110a and 110b so that only a wearer of glasses 135a can see images from projector 110a whereas only a wearer of glasses 135b can see images from projector 110b.
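The separation argument can be restated concretely: model each filter material as a set of visible-light passbands and check that no band of F1 overlaps a band of F2. The band edges below are invented for illustration, since the specification gives no numerical passbands.

```python
# Hypothetical passband edges in nanometres (the patent gives none).
F1_BANDS = [(430, 450), (510, 530), (600, 620)]   # B1, G1, R1
F2_BANDS = [(460, 480), (540, 560), (630, 650)]   # B2, G2, R2

def bands_overlap(band_a, band_b):
    """True if two wavelength intervals share any wavelength."""
    lo_a, hi_a = band_a
    lo_b, hi_b = band_b
    return lo_a <= hi_b and lo_b <= hi_a

def orthogonal(f1, f2):
    """Orthogonal in the patent's sense: light passed by one material
    is entirely blocked by the other."""
    return not any(bands_overlap(a, b) for a in f1 for b in f2)

assert orthogonal(F1_BANDS, F2_BANDS)   # F1 light never passes F2
```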
[0054] Whilst some preferred embodiments of the invention make use of interference filters, namely dichroic filters 145a and 145b and projector filters 150a and 150b, other arrangements are possible. For example, absorption filters may also be used, and US Patent No. 9,651,791 in the name of Infitec GmbH describes a set of absorption filters that are suitable for image separation purposes. In use the tracking system 115 detects the position of targets 137a and 137b of the glasses 135a and 135b. Computer system 130 then operates projectors 110a and 110b to project corresponding stereoscopic images on display surfaces 105a, 105b.
[0055] It will be realized that in other embodiments of the invention the stereo 3D glasses may comprise first and second filter windows.
[0056] The stereoscopic images produced by the projectors 110a, 110b comprise two different views of the same 3D object. As an example, an image displayed on display surface 105a may be a view of the head of a dinosaur from the view direction of the first user 125a whereas the corresponding image displayed on display surface 105b would correspond to a view of the tail of the dinosaur from the view direction of the second user 125b.
[0057] As the first and second users 125a, 125b change their viewpoints and viewing directions, for example by turning their heads or by moving about their respective display surfaces 105a, 105b, the computer system 130 is able to monitor their positions via the tracking system 115 and adjust the views of the virtual scene that are delivered by projectors 110a and 110b accordingly. Consequently, both users 125a, 125b see an appropriate view of the virtual scene from their viewpoint on their respective display surface.
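Putting the earlier sketches together, the per-frame behaviour described here amounts to: read fresh poses from the tracking system, rebuild each user's projection transform, and resubmit stereo images. The tracker, scene and projector objects below are placeholders; eye_positions and projection_for_surface are the illustrative sketches given earlier.

```python
def update_views(tracker, scene, surfaces, projectors, all_glasses):
    """One per-frame pass (loosely: tracker ~ 115, projectors ~ 110a/110b,
    surfaces ~ 105a/105b, glasses ~ 135a/135b)."""
    for surface, projector, glasses in zip(surfaces, projectors, all_glasses):
        pose = tracker.pose_of(glasses.target)          # targets 137a/137b
        left_eye, right_eye = eye_positions(pose.position, pose.rotation)
        for eye_pos, buffer in ((left_eye, "left"), (right_eye, "right")):
            view_proj = projection_for_surface(
                surface.pa, surface.pb, surface.pc, eye_pos)
            projector.submit(buffer, scene.render(view_proj))
```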
[0058] Furthermore, the views of the scene for both the first and second users 125a, 125b will change dynamically in accordance with each user’s movements, and so a virtual reality of the same scene will be perceived by both the first and the second user 125a, 125b, appropriate to their respective display surface 105a, 105b. Continuing the example above, suppose the first user 125a is looking at the head of the dinosaur but wishes to inspect the tail of the dinosaur instead. Using a control mechanism, such as controls 107a, the first user 125a can manipulate and rotate the dinosaur image so that the tail is facing the first user 125a.
[0059] Consequently, the computer 130 adjusts the image produced by projector 110b, which presents the image to the second user 125b on display surface 105b, so that the second user 125b watches the dinosaur rotate until the tail faces the first user 125a and thus the head of the dinosaur now faces the second user 125b.
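The dinosaur example works because there is a single shared model state: a rotation commanded through controls 107a updates one transform, and the next frame rendered for every display surface reflects it. A minimal sketch of that design choice, with invented names:

```python
import numpy as np

class SharedScene:
    """One virtual environment rendered for every display surface."""
    def __init__(self):
        self.model_xform = np.eye(4)   # shared pose of, e.g., the dinosaur

    def rotate_model(self, angle_rad):
        """Invoked when a user turns the model via controls 107a/107b;
        every user's next rendered view reflects the new pose."""
        c, s = np.cos(angle_rad), np.sin(angle_rad)
        spin = np.array([[ c, 0, s, 0],
                         [ 0, 1, 0, 0],
                         [-s, 0, c, 0],
                         [ 0, 0, 0, 1]])
        self.model_xform = spin @ self.model_xform
```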
[0060] Turning to Figure 5, there is illustrated another embodiment of a computer assisted virtual reality display system 20. System 20 includes a table 200 that has a first display 205 having display surfaces 205a, 205b and a second display 210 having display surfaces 210a, 210b.
[0061] Additionally, the system 20 includes control systems 207a, 207b attached to the displays 205, 210, an imaging system having two overhead projectors 215a, 215b, a tracking system 220 having tracking sensors 220a...220d and a computer 225 connected to the projectors 215a, 215b. Each of these components is substantially similar to the components described above in relation to system 10.
[0062] The users 230a and 230b of the system 20 are also wearing stereoscopic 3D glasses 135a, 135b, which are described above.
[0063] In the illustrated embodiment it can be seen that display surfaces 205a and 210a are vertical or angled at substantially 90° relative to the horizontal and the additional display surfaces 205b and 210b extend perpendicularly from the two vertical display surfaces 205a, 210a, respectively.
[0064] The illustration also shows the projectors 215a, 215b are mounted overhead, rather than underneath the table 200. This allows projector 215a to project onto both display surfaces 205a, 205b viewed by first user 230a and projector 215b to project onto both display surfaces 210a, 210b viewed by second user 230b.
[0065] In order to allow the projectors 215a, 215b to project the image onto the surfaces, the computer 225 calculates a projection transform which takes into account the angle and orientation of the display surfaces 205a, 205b, 210a, 210b. This projection transform is then applied by the computer to the images to be projected by the projectors 215a, 215b.
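Because one overhead projector here covers two differently oriented surfaces, a natural implementation renders each surface with its own transform and maps the results into separate regions of the projector frame. A sketch under those assumptions, reusing the earlier projection_for_surface helper; the viewport names are illustrative:

```python
def render_user_view(projector, vertical, horizontal, eye_pos, scene):
    """Render both surfaces served by one overhead projector (e.g. 215a
    covering 205a and 205b)."""
    for surface, viewport in ((vertical, "upper_half"),
                              (horizontal, "lower_half")):
        view_proj = projection_for_surface(
            surface.pa, surface.pb, surface.pc, eye_pos)
        projector.blit(viewport, scene.render(view_proj))
```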
[0066] Figure 6 illustrates yet another embodiment of a computer assisted virtual reality display system 30 having two vertically orientated display surfaces 305a, 305b, a tracking system having sensors 310a...310d and two projectors 315a, 315b connected to a computer 320.
[0067] Figure 6 also shows first and second users 325a and 325b wearing stereoscopic 3D glasses 135a, 135b, which are described above.
[0068] The illustrated embodiment is similar to system 20 but does not include horizontal display surfaces. Rather, display surfaces 305a, 305b extend from the floor.
[0069] The illustration shows the projectors 315a, 315b are mounted overhead to project downwardly onto the display surfaces 305a, 305b. While not shown, it will be appreciated that the projectors may also project images onto the floor in the illustrated embodiment.
[0070] Similar to the above described embodiments, the computer 320 calculates a projection transform which takes into account the angle and orientation of the display surfaces 305a, 305b. This projection transform is then applied by the computer to the images to be projected by the projectors 315a, 315b.
[0071] Additional display surfaces can also be added to the system. For example, a third rectangular display surface may be introduced and positioned so that the three display surfaces are in a substantially triangular arrangement from an overhead point of view. Alternatively, the three displays may be triangular in shape and positioned edge to edge so that the display surfaces form a triangular pyramid arrangement.
[0072] Implementations of the present disclosure and all of the functional operations provided herein can be realized in digital electronic circuitry, or in computer software, firmware, or hardware, including the structures disclosed in this specification and their structural equivalents, or in combinations of one or more of them. Implementations of the invention can be realized as one or more computer program products, i.e., one or more modules of computer program instructions encoded on a computer readable medium for execution by, or to control the operation of, data processing apparatus.
[0073] The computer readable medium can be a machine-readable storage device, a machine-readable storage substrate, a memory device, a composition of matter effecting a machine-readable propagated signal, or a combination of one or more of them. The term "data processing apparatus" encompasses all apparatus, devices, and machines for processing data, including by way of example a programmable processor, a computer, or multiple processors or computers, for example, computer system 130. An exemplary data processing apparatus can include, in addition to hardware, code that creates an execution environment for the computer program in question, e.g., code that constitutes processor firmware, a protocol stack, a database management system, an operating system, or a combination of one or more of them.
[0074] A computer program (also known as a program, software, software application, script, or code) can be written in any one of many forms of programming language and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file in a file system. A program can be stored in a portion of a file that holds other programs or data (e.g., one or more scripts stored in a mark-up language document), in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers that are located at one site or distributed across multiple sites and interconnected by a communication network.
[0075] Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random-access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical, or optical disks. However, a computer need not have such devices. Moreover, a computer can be embedded in another device, e.g., a mobile telephone, a personal digital assistant (PDA), a mobile audio player, or a Global Positioning System (GPS) receiver, to name just a few. Computer readable media suitable for storing computer program instructions and data include all forms of non-volatile memory, media and memory devices, including by way of example semiconductor memory devices, e.g., EPROM, EEPROM, and flash memory devices; magnetic disks, e.g., internal hard disks or removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks. The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
[0076] In this specification, adjectives such as first and second, left and right, top and bottom, and the like may be used solely to distinguish one element or action from another element or action without necessarily implying any actual such relationship or order. Where the context permits, reference to an integer or a component or step (or the like) is not to be interpreted as being limited to only one of that integer, component, or step, but rather could be one or more of that integer, component, or step, etc.
[0077] The above description of various embodiments of the present invention is provided for purposes of description to one of ordinary skill in the related art. It is not intended to be exhaustive or to limit the invention to a single disclosed embodiment. As mentioned above, numerous alternatives and variations to the present invention will be apparent to those skilled in the art in view of the above teaching. Accordingly, while some alternative embodiments have been discussed specifically, other embodiments will be apparent or relatively easily developed by those of ordinary skill in the art. The invention is intended to embrace all alternatives, modifications, and variations of the present invention that have been discussed herein, and other embodiments that fall within the spirit and scope of the above described invention.
[0078] In this specification, the terms ‘comprises’, ‘comprising’, ‘includes’, ‘including’, or similar terms are intended to mean a non-exclusive inclusion, such that a method, system or apparatus that comprises a list of elements does not include those elements solely, but may well include other elements not listed.

Claims
1. A computer assisted virtual display system comprising:
at least two display surfaces for presenting images to one or more corresponding users of each display surface, said display surfaces being angled away from each other to reduce visibility of other display surfaces to the one or more corresponding users of one of the display surfaces;
two or more pairs of stereoscopic 3D glasses for wearing by each of the users;
a tracking system configured to track each of the stereoscopic 3D glasses;
an imaging system including two or more imaging devices for producing images on the two or more display surfaces; and
a computational device in communication with the tracking system and programmed to respond to the tracking system and thereby control the imaging system to produce stereoscopic 3D images based on orientations of each display surface and tracking data from the tracking system for each of the corresponding users for each user to perceive different views of the same virtual environment appropriate to their position and their display surface.
2. A computer assisted virtual display system according to claim 1, wherein the system comprises two display surfaces arranged end to end.
3. A computer assisted virtual display system according to claim 2, wherein the display surfaces are arranged to form a substantially triangular structure.
4. A computer assisted virtual display system according to claim 3, wherein the display surfaces are arranged opposingly.
5. A computer assisted virtual display system according to any one of the preceding claims, wherein in use, each display surface appears substantially flat or planar to a user.
6. A computer assisted virtual display system according to any one of the preceding claims, wherein each display surface is orientated at an angle of between 45° and 90° to horizontal.
7. A computer assisted virtual display system according to any one of the preceding claims, wherein the imaging devices are located beneath the display surfaces and arranged to project images onto the display surfaces.
8. A computer assisted virtual display system according to claim 6, wherein the angle of each display surface is about 90°.
9. A computer assisted virtual display system according to claim 8, wherein a perpendicular display surface extends substantially perpendicularly from each display surface angled at about 90°.
10. A computer assisted virtual display system according to claim 9, wherein the imaging devices are located above the display surfaces and produce images on both the display surface angled at 90° and the perpendicular display surface.
11. A computer assisted virtual display system according to any one of the preceding claims, the system further comprising an input device for interfacing with the computational device to thereby affect the stereoscopic 3D images produced on the display surface.
12. A computer assisted virtual display system according to any one of the preceding claims, wherein the imaging system comprises two sets of two imaging devices, wherein a first set of two imaging devices produce images on a first display surface and a second set of two imaging devices produce images on a second display surface.
13. A computer assisted virtual display system according to any one of the preceding claims, wherein the stereoscopic 3D glasses further comprise an image separation arrangement to present images from the imaging system to corresponding ones of the one or more users of each display whereby each user sees images emanating from a corresponding one of said sets of projectors only.
14. A computer assisted virtual display system according to claim 13, wherein the image separation arrangement comprises a frequency separation crystal film.
15. A computer assisted virtual display system according to claim 13 or claim 14, wherein the imaging system comprises additional sets of two imaging devices for additional display surfaces.
16. A computer assisted virtual display system according to claim 13, wherein the image separation arrangement includes two or more filters for separating images from the two or more sets of projectors, wherein the filters have different visible light spectral transmission characteristics.
17. A computer assisted virtual display system according to claim 16, wherein the filters comprise interference filters.
18. A computer assisted virtual display system according to claim 16 or claim 17, wherein the filters of the image separation arrangement comprise corresponding viewer and projector interference filters, wherein a viewer interference filter is mounted to the stereoscopic 3D glasses for the user and projector interference filters are mounted to projectors of the corresponding set of projectors.
19. A computer assisted virtual display system according to any one of the preceding claims, wherein the stereoscopic 3D glasses comprise first and second visible light filter windows.
20. A computer assisted virtual display system according to any one of the preceding claims, wherein the display surfaces are rectangular.
21. A method for operating a computer assisted virtual reality display system, the method comprising the steps of:
detecting a position of each of two or more pairs of stereoscopic 3D glasses of two or more users relative to two or more differently orientated display surfaces;
processing with a computational device the position of each of the stereoscopic 3D glasses of the two or more users;
processing with the computational device the orientation of each display surface relative to an associated imaging device to calculate a projection transform;
operating two or more imaging devices in communication with the computational device to present an image to each user upon their respective display surfaces based upon the position of the glasses of the two or more users and the projection transform, wherein each display surface displays the same image.
PCT/AU2018/051174 2018-03-21 2018-10-31 Computer assisted virtual reality display system WO2019178635A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2018900926 2018-03-21
AU2018900926A AU2018900926A0 (en) 2018-03-21 Computer assisted virtual reality display system

Publications (1)

Publication Number Publication Date
WO2019178635A1 true WO2019178635A1 (en) 2019-09-26

Family

ID=67986731

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2018/051174 WO2019178635A1 (en) 2018-03-21 2018-10-31 Computer assisted virtual reality display system

Country Status (1)

Country Link
WO (1) WO2019178635A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5781229A (en) * 1997-02-18 1998-07-14 Mcdonnell Douglas Corporation Multi-viewer three dimensional (3-D) virtual display system and operating method therefor
US7832869B2 (en) * 2003-10-21 2010-11-16 Barco N.V. Method and device for performing stereoscopic image display based on color selective filters
US20160007035A1 (en) * 2012-10-04 2016-01-07 Dish Network, L.L.C. Frame block comparison
EP3242274A1 (en) * 2014-12-31 2017-11-08 Alt Limited Liability Company Method and device for displaying three-dimensional objects

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
AGRAWALA, M ET AL.: "The two-user Responsive Workbench: support for collaboration through individual views of a shared space", PROCEEDINGS OF THE 24TH ANNUAL CONFERENCE ON COMPUTER GRAPHICS AND INTERACTIVE TECHNIQUES, 1997, XP058375845 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113703582A (en) * 2021-09-06 2021-11-26 联想(北京)有限公司 Image display method and device
GB2610889A (en) * 2021-09-06 2023-03-22 Lenovo Beijing Ltd Image display method and apparatus

Similar Documents

Publication Publication Date Title
US8482549B2 (en) Mutiple image projection apparatus
TWI557491B (en) Combined visible and non-visible projection system
US8506085B2 (en) Methods and systems for projecting images
KR100616556B1 (en) Polarized stereoscopic display device and method without loss
US7690794B2 (en) Image-combining device and projection display apparatus having image-combining devices incorporated therein
KR101693082B1 (en) A 3d observation device with glassless mode
US11051006B2 (en) Superstereoscopic display with enhanced off-angle separation
US20070047043A1 (en) image projecting device and method
JP2009017207A (en) Stereoscopic television system and stereoscopic television receiver
JP5905585B2 (en) Video processing system based on stereo video
US11574389B2 (en) Reprojection and wobulation at head-mounted display device
WO2006113039A3 (en) Polarized projection display
US20080304013A1 (en) Projection Type Stereoscopic Display Apparatus
US10659772B1 (en) Augmented reality system for layering depth on head-mounted displays using external stereo screens
WO2019178635A1 (en) Computer assisted virtual reality display system
JP4472607B2 (en) 3D image presentation and imaging device
AU2018303842B2 (en) Multiple observer improvement for a virtual environment
WO2004099825A3 (en) Phase retardance autostereoscopic display
US20040179263A1 (en) Stereoscopic image display apparatus
US20150022646A1 (en) System and Method for Display of Image Streams
US10067352B2 (en) 3D image generating lens tool
WO2008076111A1 (en) 3d image projection system
JP4554391B2 (en) Display system
JP4928152B2 (en) Rear projection display device and screen
NZ585744A (en) Method of defining stereoscopic depth using manipulation of virtual 3D cameras

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18910334

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18910334

Country of ref document: EP

Kind code of ref document: A1