GB2470462A - Surround projection display system with synchronised screens - Google Patents

Surround projection display system with synchronised screens

Info

Publication number
GB2470462A
Authority
GB
United Kingdom
Prior art keywords
images
user
image projection
projection apparatus
screens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1007383A
Other versions
GB201007383D0 (en)
Inventor
Paul Smith
Tim Scott
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TV Sports Network Ltd
Original Assignee
TV Sports Network Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by TV Sports Network Ltd filed Critical TV Sports Network Ltd
Publication of GB201007383D0
Publication of GB2470462A
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/363 Image reproducers using image projection screens
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B21/54 Accessories
    • G03B21/56 Projection screens
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/327 Calibration thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A projection system 1 comprising at least three screens 2, 3, 4, each oriented in a different plane, with images projected onto them, wherein each screen seamlessly joins with other screens along at least two edges, characterised in that the projected images on one of the screens rotate when the images on the adjacent screens shift sideways. A controller includes a processor, graphics cards and software comprising system initialisation, a scenario manager, a physics engine and rendering control. A sound engine may cause sound to emanate from a location in synchronisation with the projected imagery. Projectors 7, 8 may display video at between 18 and 30 frames per second, and preferably at 25 frames per second. Rendering of image elements may be switched on or off. A user controlled device, such as a joystick, mouse or position sensor, may be used to determine the user's position and orientation, and to provide selectable options. The screens may comprise a floor 4, four walls 2 and a ceiling 3. The system provides true perspective imagery within a 3D scene.

Description

Apparatus and Method for Projecting 3D Images
Field of the Invention
The present invention relates to an apparatus and method for projecting images onto screens so that a person within those screens perceives himself as being within a three dimensional scene, and in particular to an apparatus and method for projecting images simultaneously onto at least three screens lying in at least two different planes, whilst maintaining a seamless environment and giving the illusion of depth.
Background of the Invention
Many attempts have been made to project images in a manner which simulates reality. Flight simulators, for example, are used successfully in the training of pilots. Whilst flight simulators provide true perspective in the field of view, the pilot must have no point of reference outside the simulated environment, otherwise the simulation fails. Furthermore, in order for such a simulator to simulate flight accurately the pilot must feel movement. In known flight simulators this is achieved by physically moving the whole simulator by means of powerful actuators.
Another type of flight simulator requires the pilot to wear goggles. Images are presented on two displays and are rendered in real time. For this system to function, the position of the head, including any changes in that position, must be measured. The pilot has no freedom to move away from the goggles.
A reality game known as the "Cave" has screens on its walls and floor. Players of this game enter the Cave wearing goggles and view images in stereoscopic vision. The players would not experience significant peripheral vision, and therefore do not experience a full sense of movement or appreciate visual cues from their surrounding environment.
A surround vision apparatus is known from the United Kingdom patent application published under number 2422500. This patent application describes a display comprising a number of screens, for example five or six screens, each individual screen forming a wall of an enclosure. A camera having a number of camera heads is also described in this application.
It is recognised in the afore-mentioned patent application that for an individual to feel as though he is immersed in a three dimensional environment it is important that the images presented by individual screens are not distorted at their edges and that images projected on one screen should be synchronised with images presented on other screens, and for sound to be controlled such that it emanates from the correct part of the image presented on the screens.
The present invention seeks to provide an apparatus and method for projecting images onto screens of an enclosure of the type described in United Kingdom patent application number 2422500.
In order for a user of a device of the type described in GB 2422500 to feel as though he is in a three dimensional environment, the projected images must occupy his peripheral vision, which extends approximately 140 degrees from the axis of the eye. Further, the feeling of being in a three-dimensional environment is enhanced by projecting images onto a screen forming the ceiling of the enclosure.
It is known in gaming to project images on to multiple screens where those screens either form part of or are mounted on a wall. However, it is not known to project images contemporaneously on to screens forming walls, ceilings or floors with the images being synchronised. In the scenario where images are projected on to three adjoining walls, for example one wall to the front of the viewer and one wall to each of the viewer's sides, what is required is for the images at the corners of the screens to remain substantially free from distortion, and for the images on adjoining screens to be synchronised so that the image appears to be continuous as it passes around the corner formed by adjoining screens. Where an image is also projected onto the ceiling, this must be synchronised with the images presented on the walls that the ceiling adjoins. Hence, taking the example of projecting images on screens formed by three adjoining walls and a ceiling, for the ceiling image to remain synchronised with the images presented on the walls, if the user moves with respect to the screens the image on the ceiling must rotate. Such an apparatus is not known.
Interactivity between the user and an apparatus for projecting three-dimensional images would be greatly enhanced if one were able to modify the images so presented to correspond to the position taken up by the user. True perspective would be provided.
It would therefore be desirable to provide an improved apparatus and method for projecting imagery in three dimensions.
Summary of the Invention
According to the invention there is provided an image projection apparatus as specified in Claim 1.
According to another aspect of the invention there is provided a method of presenting imagery in true perspective to an individual located within an image projection apparatus, as specified in Claim 13.
According to another aspect of the invention the software of the rendering control includes algorithms configured to switch on or off the rendering of elements of an image. This provides the advantage that the frame rate may be maintained. For example, where part of the scene is falling snow, rendering such imagery would take a great deal of processor power due to the reflections produced by the snowflakes. However, it is also known that when snow falls the environment becomes darker and detail is to some degree obscured, thereby reducing reflection. In such a case, it is possible to switch off or reduce the rendering of reflections, including those produced by the snowflakes.
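By way of illustration only, the following minimal Python sketch shows how such an on/off rendering switch might be driven; the names (RenderFlags, adjust_rendering) and the 40 ms budget (corresponding to the preferred 25 frames per second) are assumptions made for the example, not details taken from the specification.

```python
class RenderFlags:
    """Hypothetical per-element rendering switches (illustrative only)."""
    def __init__(self):
        self.reflections = True
        self.falling_snow = True

def adjust_rendering(flags, last_frame_ms, snowing, budget_ms=40.0):
    """Switch expensive image elements off when the frame budget is
    exceeded; 40 ms per frame corresponds to 25 frames/second."""
    if snowing:
        # Falling snow darkens the scene and obscures detail, so
        # reflections (including those of the snowflakes) can be
        # switched off with little visible loss.
        flags.reflections = False
    elif last_frame_ms > budget_ms:
        flags.reflections = False   # shed the costliest work first
    else:
        flags.reflections = True    # within budget: render fully
    return flags
```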
Brief Description of the Drawings
In the Drawings, which illustrate preferred embodiments of the invention by way of example:
Figure 1 is a schematic representation of an enclosure on the surfaces of which images are displayed;
Figure 2 is a block diagram of the components of a control system according to an aspect of the invention; and
Figure 3 is a schematic representation of the enclosure illustrated in Figure 1 depicting the movement of images responsive to the movement of a person within the enclosure.
Detailed Description of the Preferred Embodiment
Figure 1 illustrates in plan view an enclosure 1 comprising walls 2, a ceiling 3 and a floor 4. The walls 2 and the ceiling 3 each form a display on which images may be displayed. The floor 4 may also form a screen, but in most scenarios this is not necessary, as will become clear from the description that follows. The nature of the screens themselves is not material to the present invention and hence they will not be described in detail.
The apparatus illustrated in Figure 1 further includes video projectors 7 arranged outside the enclosure 1 and aligned to project images on to the displays 2. A fifth video projector 8 is located above the enclosure 1 and is aligned to project images on to the display 3. Each projector 7, 8 projects an image on to one of the displays 2, 3, with the respective projected images each filling one of the displays 2, 3.
The purpose of the apparatus illustrated in Figure 1 is to allow a person standing within the enclosure to view images which closely resemble reality. In order for such a person's experience to resemble reality the presentation of images on to the screens must be synchronised. Similarly, for the person's experience to resemble reality, sound must be distributed through the enclosure such that the person believes that a particular sound has emanated from a particular object. These aspects of the apparatus illustrated in Figure 1 are described in published patent application no GB2422500.
The present invention relates to a system and method for controlling the presentation of images onto the displays of an apparatus of the type illustrated in Figure 1. Of course, in certain applications not all of the walls 2, ceiling 3 or floor 4 need comprise displays.
Figure 2 illustrates the principal modules of the control system of the invention. The control system of the invention includes system hardware comprising one or more high-speed multi-core processor PCs (such as a dual or quad core PC), including multiple graphics boards together providing three or more outputs, that is, at least one output per screen (more than one output per screen may be used where, for instance, one screen is bigger than another, or simply where all the screens are large), and a surround or stereo sound board.
The system hardware operates system software which when run on the system hardware provides system initialisation 12, a scenario manager 13, a rendering controller 14, a physics engine 15, a rendering engine 16 and a sound engine 17. Other plugins 18 may be connected to the system hardware and run in conjunction with the system software. A user controlled device 19 may also be connected to the system hardware and run in conjunction with the system software.
The system hardware includes a data source 11, which may comprise images created in real time, or a computer hard drive or other media storage device capable of storing information from which a three dimensional representation of an environment may be rendered up, for example video or digital picture information. Rendering up typically occurs in real time, on the fly, either upon a command of the user demanding movement or upon the controller changing the scene. The data source is also the source of sound information. Data is retrieved from the data source 11 by the scenario manager 13, which is described in greater detail below.
The functions of the different parts of the software are described below:
System Initialisation
The system initialisation software 12 initialises the system hardware to match the specification of the display apparatus being used, configures the other software modules 13 to 18 to work with the display apparatus and the user controlled device where present, sets up the initial screen parameters and alignment, and initialises user interfaces. The system initialisation software also provides settings for the rendering control 14 and rendering engine 16 so that these parts of the system know how the displays are arranged (the displays need not be in the arrangement illustrated in Figure 1).
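As a purely illustrative sketch (the data structures below are assumptions, not the actual module interfaces), initialisation might capture the display arrangement in a configuration object that the other modules then consult:

```python
from dataclasses import dataclass, field

@dataclass
class DisplayConfig:
    name: str           # e.g. "front", "left", "right", "back", "ceiling"
    output_index: int   # which graphics-board output drives this screen
    width_px: int
    height_px: int

@dataclass
class SystemConfig:
    displays: list = field(default_factory=list)
    frame_rate: int = 25          # preferred rate from the claims
    floor_is_screen: bool = False

def initialise(config):
    """Hand the display arrangement to the rendering control and
    rendering engine (sketch only)."""
    for d in config.displays:
        print(f"output {d.output_index}: {d.name} ({d.width_px}x{d.height_px})")
    return config
```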
Scenario Manager
The function of the scenario manager is to manage the three dimensional scene presented in the enclosure 1. The software of the scenario manager retrieves data from the data source 11 and loads it into the rendering engine 16. Further, the scenario manager software provides for interactivity between the user and the apparatus, i.e. interactivity between the user and information presented to the user in the enclosure 1, within bounds defined in the scenario. For example, the user may wish to move the viewing point of the scene. When a user makes a valid input to change his view of the scene the input is transmitted to the rendering control 14.
Interactivity between the user and the apparatus requires additional control apparatus that allows the user to make inputs to the scenario manager, such additional control apparatus being referred to as a user controlled device. Such a user controlled device may comprise a joystick or a mouse or some other electronic device, such as a device situated on the user which allows the position and/or orientation of the user to be determined. In essence, the user interacts with the scenario manager through the user controlled device. Hence, the user may interact with the images presented on the displays through a virtual user represented by a position and/or orientation determined by a joystick or alternative device. The user controlled device determines position and/or orientation with respect to a fixed set of co-ordinates, typically the co-ordinates of the centre of the enclosure. Hence, where the user controlled device is a joystick, when the joystick is in its rest position the virtual user is stationary in the centre of the enclosure. If the user moves the joystick forward the software controls the imagery projected on to the displays such that the user feels as though he is moving forward. Similarly, if the user moves the joystick to the right, the software controls the imagery projected on to the displays such that the user feels as though he is moving to the right.
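A toy Python sketch of this mapping follows; the class name, step size and sign conventions are assumptions for illustration, not part of the specification.

```python
import math

class VirtualUser:
    """Position/orientation of the virtual user, with the centre of the
    enclosure as the origin of the fixed co-ordinate system."""
    def __init__(self):
        self.x = 0.0        # metres from the enclosure centre
        self.y = 0.0
        self.heading = 0.0  # radians; 0 = facing the front wall

    def apply_joystick(self, forward, sideways, dt, speed=1.5):
        """forward/sideways are joystick deflections in [-1, 1]; the
        rest position (0, 0) leaves the virtual user stationary."""
        self.x += speed * dt * (forward * math.sin(self.heading)
                                + sideways * math.cos(self.heading))
        self.y += speed * dt * (forward * math.cos(self.heading)
                                - sideways * math.sin(self.heading))

# Pushing the stick forward for one second moves the virtual user ahead.
user = VirtualUser()
user.apply_joystick(forward=1.0, sideways=0.0, dt=1.0)
print(user.x, user.y)   # -> 0.0 1.5
```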
For example, the scenario may represent a basement room containing furniture and stairs to an exit. The walls, floor and stairs would be defined as boundaries, as would furniture within the room. In a simple scenario, interaction between the user and the boundaries can be very straightforward in that the user is simply prevented from moving beyond the boundaries. Where a user is operating a joystick to traverse through a scene, parameters associated with the joystick would not allow the position determined by the joystick to pass through a boundary. However, more complex scenarios may also be provided for. For example, the user may be able to interact with the floor, walking over it, and stairs, walking up and down them. To explain further, when the position determined by the joystick comes within a certain proximity of a particular floor or staircase, the scenario manager controls the images presented on the display such that the user has the sensation of traversing over the floor or up or down the stairs. The user may also be able to interact with the walls. For example, the user and walls may be given properties so that upon colliding with a wall the images are projected such that the user would appear to bounce off. Similarly, items of furniture may be defined in a manner that allows them to move upon collision. These more complex interactions are provided for by the physics engine described in greater detail below.
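In the simple case, boundary handling reduces to clamping the virtual user's position to the interior of the room, as in this hedged sketch (the dimensions and margin are invented for the example):

```python
def clamp_to_room(x, y, half_width=2.5, half_depth=2.5, margin=0.2):
    """Prevent the virtual user from passing through a wall by clamping
    the position to the room interior; richer responses (bouncing off
    walls, furniture that moves on collision) would be delegated to
    the physics engine."""
    x = max(-half_width + margin, min(half_width - margin, x))
    y = max(-half_depth + margin, min(half_depth - margin, y))
    return x, y
```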
The scenario manager not only retrieves data from the scene and scenario data source, for transmission to the rendering engine 16, it also controls the occurrence of events within a scenario. For example, a scenario in the above-mentioned basement may consist of a waste paper bin catching fire.
The user may cover it with a tray or cloth, for example by using a joystick or mouse to pick up the tray or cloth and place it over the waste paper bin. The scenario manager would initiate the fire in the waste bin and the response to the user's actions. The events may be defined in the scenario data retrieved from the data source 11, but may need interpretation. For example, the fire may be set to commence two minutes after the user enters the basement. The scenario manager would recognise that the user had entered the basement and thereby start the two minute time period running.
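A minimal sketch of such delayed, condition-triggered events might look as follows; the TimedEvent class and the scene-state dictionary are assumptions made for the example.

```python
class TimedEvent:
    """Scenario event that fires a fixed delay after its trigger
    condition is first met (e.g. the user entering the basement)."""
    def __init__(self, delay_s, condition, action):
        self.delay_s = delay_s
        self.condition = condition
        self.action = action
        self._armed_at = None
        self.done = False

    def update(self, now_s, scene):
        if self.done:
            return
        if self._armed_at is None and self.condition(scene):
            self._armed_at = now_s                 # user entered: start the clock
        if self._armed_at is not None and now_s - self._armed_at >= self.delay_s:
            self.action(scene)                     # e.g. ignite the waste bin
            self.done = True

# The fire starts two minutes after the user enters the basement.
fire = TimedEvent(120.0,
                  condition=lambda s: s.get("user_in_basement", False),
                  action=lambda s: s.update(bin_on_fire=True))
```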
In multi-user interaction, i.e. two connected blue rooms or a blue room and a connected external PC, the scenario manager also handles the interaction between the users and works out how the scene should react.
The rendering control controls the parameters that are required by the rendering engine to allow it to render up images. The images must be rendered up from the point where the user is located.
The scenario manager reads the script for the scene. With this information, together with information regarding the position of the user and his direction, the rendering control can command the rendering engine to project the required images.
Rendering Control
The rendering control software 14 controls the way in which the rendering engine is used to create the environment within the enclosure, that is, the presentation of images on the displays and the emission of sound. The software also responds to changes of viewpoint so that the scene viewed by a user is correct for the position occupied by the user, and maintains the correct views on the displays.
Similarly, the rendering control matches sound to imagery, matching not only the correct sound to the imagery, but also the correct direction of the sound.
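As a rough illustration of matching the direction of sound to imagery, a constant-power stereo pan can be computed from where the sound source lies relative to the direction the user faces; the function below is a sketch under that assumption (a real installation would drive a surround sound board as described above).

```python
import math

def stereo_gains(source_azimuth, user_heading):
    """Return (left_gain, right_gain) for a sound source, given its
    azimuth and the user's heading, both in radians."""
    rel = source_azimuth - user_heading
    pan = max(-1.0, min(1.0, math.sin(rel)))  # -1 = hard left, +1 = hard right
    angle = (pan + 1.0) * math.pi / 4.0       # map pan to [0, pi/2]
    return math.cos(angle), math.sin(angle)   # constant power: L^2 + R^2 = 1
```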
Another function of the rendering control is to configure how each section of a scenario will be rendered in order to maintain the frame-rate at an acceptable level. This is achieved by adjusting parameters within the rendering engine and controlling the workload of each rendering process. For example, where part of an image does not require rendering the rendering control commands the rendering engine not to render that part of the image.
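One simple way to realise this, sketched below with invented names and thresholds, is a feedback loop that raises or lowers a coarse detail level according to the measured frame time:

```python
def update_detail_level(detail, last_frame_ms, target_ms=40.0):
    """Hold the frame rate near 25 fps (40 ms/frame) by adjusting a
    0..5 detail level that the rendering engine maps onto optional
    work such as reflections or particle effects."""
    if last_frame_ms > 1.1 * target_ms and detail > 0:
        detail -= 1    # over budget: shed rendering work
    elif last_frame_ms < 0.8 * target_ms and detail < 5:
        detail += 1    # comfortable headroom: restore detail
    return detail
```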
In this example, the imagery presented on the displays of the enclosure 1 is produced by a number of virtual cameras. When the viewpoint of the user changes, it must appear to the user as though all the cameras moved together. Hence, when the user moves location, the location of all the cameras is moved to a new point.
In the case where the user rotates the room to the left, i.e. the user turns to face to the right, the front, left, right and back cameras would pan right. However, the top and bottom virtual cameras would roll clockwise and anti-clockwise respectively, and the edges of the images of the top and bottom displays must remain synchronised with the edges of the images displayed on the walls of the enclosure.
Hence, the image data must be manipulated. This is illustrated in Figure 3, where it can be seen that at the points x and x1, where adjacent displays 2 and 3 meet, as the images on the displays 2 move in the direction indicated by arrows y, so the image presented on the ceiling 3 must rotate in the direction indicated by arrow z.
Rendering Engine
The function of the rendering engine is to render images as commanded by the rendering control software.
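The pan/roll behaviour described above can be made concrete with a small sketch; the display names, angle conventions and signs below are assumptions chosen for illustration.

```python
def camera_poses(user_yaw_deg):
    """Per-display virtual camera pose for a given user yaw (degrees).

    The four wall cameras pan with the user, while the top (ceiling)
    and bottom (floor) cameras roll clockwise and anti-clockwise
    respectively, keeping their image edges registered with the wall
    images at the shared corners (cf. points x and x1 in Figure 3).
    """
    walls = {"front": 0.0, "right": 90.0, "back": 180.0, "left": 270.0}
    poses = {name: {"yaw": (offset + user_yaw_deg) % 360.0, "roll": 0.0}
             for name, offset in walls.items()}
    poses["top"] = {"yaw": 0.0, "roll": -user_yaw_deg % 360.0}
    poses["bottom"] = {"yaw": 0.0, "roll": user_yaw_deg % 360.0}
    return poses
```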
Physics Engine
The software of the physics engine 15 is programmed to manipulate images so that they represent physical events. For example, if an image is of water, in reality where the water meets an edge a ripple would occur. When a ball hits a wall it should bounce off with a trajectory determined by its path to the wall. In such a case, the software of the physics engine would manipulate the images to show the correct physical response. Different physics engines may be used for different physical events.
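For the ball-and-wall example, the physical response is the classical reflection of the velocity about the wall normal; the following sketch (2-D, with an invented restitution coefficient) illustrates the calculation a physics engine would perform.

```python
def bounce(velocity, wall_normal, restitution=0.8):
    """Reflect a 2-D velocity off a wall with unit normal n:
    v' = v - (1 + e)(v . n)n, where e is the coefficient of
    restitution (e = 1 gives a perfectly elastic bounce)."""
    vx, vy = velocity
    nx, ny = wall_normal
    dot = vx * nx + vy * ny
    if dot >= 0:
        return velocity            # already moving away from the wall
    k = (1.0 + restitution) * dot
    return (vx - k * nx, vy - k * ny)

# A ball moving right and down strikes a floor with normal (0, 1):
print(bounce((3.0, -4.0), (0.0, 1.0)))   # -> (3.0, 3.2)
```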
The apparatus of the invention may be used for many different purposes. One example is in training individuals in operational procedures. For example, individuals could be trained in disaster response at an oil refinery or the like. In such a scenario, control gear may be simulated so that the individual can operate a valve depicted in the image to shut off the flow of oil, or press a button to raise an alarm. Where a button is to be pressed or a valve operated, the user controlled device may include a user input means which allows for option selection. For example, if the user controlled device included a mouse, the user could move the mouse over an image of a button, and click on the button. The software would detect that the button had been pressed and cause appropriate imagery to be displayed thereafter. Another example would be in military training for familiarisation purposes. In such a situation a reconnaissance flight may have been made capturing suitable imagery. The imagery is replayed through the apparatus of the invention to servicemen about to enter the area reconnoitred. Another example would be in flight simulation. Whereas in flight simulators of the prior art the whole structure of the flight simulator must be configured to move to provide a true sensation of flight, in the present invention the manner in which the images are presented on the displays provides such simulation.
The apparatus of the invention provides significant advantages over the prior art apparatus used in the "Cave" type game. The apparatus of the present invention does not use stereoscopic vision as is used in the "Cave" apparatus. This provides two advantages. First, the person situated within the bounds of the apparatus does not need to wear goggles, and hence the user's peripheral vision is not affected. The ability of the user to use peripheral vision allows imagery to be presented to the user in true perspective, which means that the user's experience is more realistic and also that the user does not need to turn his head to the same extent. Second, stereoscopic projection of images requires significantly more rendering than is required by the apparatus of the present invention. Hence, for the same processor power, more detailed images may be presented to a user in the apparatus of the present invention than in the case of the "Cave" type apparatus. Finally, it has been found that, in particular where one of the displays is the ceiling of the enclosure, many more people can occupy the apparatus of the invention than is possible in a "Cave" type apparatus of similar dimensions, where both the wearing of goggles and the projection of imagery on to the floor of the enclosure limit the number of occupants.

Claims (16)

  1. An image projection apparatus comprising a display adapted to provide true perspective imagery, including at least three display surfaces each oriented in a different plane, wherein the edges of adjacent display surfaces meet such that each of the at least three display surfaces meets another of the at least three display surfaces along two of its edges, means to show images on said display, a data source and a controller, the controller comprising hardware including a processor and multiple graphics cards, and software, wherein the software provides: i) system initialisation; ii) a scenario manager; and iii) rendering control; wherein the images on one of the displays rotate when the images on the adjacent displays shift sideways, and wherein the controller controls the transmission of images to the display surfaces such that images shown on adjacent display surfaces are synchronised such that the images at the edges of adjacent screens are substantially seamless.
  2. An image projection apparatus according to Claim 1, wherein the rendering control commands the rendering engine to render images onto the displays according to instructions from the rendering control.
  3. An image projection apparatus according to Claim 1 or 2, wherein the software further provides a sound engine and wherein the rendering control determines the location from which the sound should emanate and commands the sound engine to cause the emission of sound so that the sound appears to emanate from the desired location.
  4. An image projection apparatus according to any preceding claim, wherein the software further provides a physics engine.
  5. An image projection apparatus according to Claim 3 or 4, wherein the sound engine is controlled by the controller such that sound emission is synchronised spatially within the apparatus to correspond to events in the projected imagery.
  6. An image projection apparatus according to any preceding claim, wherein the frame rate is between 18 and 30 frames/second.
  7. An image projection apparatus according to Claim 6, wherein the frame rate is at least 25 frames/second.
  8. An image projection apparatus according to any preceding claim, wherein the software of the rendering control includes algorithms configured to switch on or off the rendering of elements of an image.
  9. An image projection apparatus according to any preceding claim, further comprising a user controlled device, wherein the position and/or orientation determined by the user controlled device represents the position and/or orientation of the user within the apparatus.
  10. An image projection apparatus according to Claim 9, wherein the user controlled device is one of a joystick, a mouse or a position sensor.
  11. An image projection apparatus according to Claim 9 or 10, wherein the user controlled device further includes means to select options presented to the user.
  12. An image projection apparatus according to any preceding claim, wherein at least one of the at least three display surfaces is a wall and at least one of the at least three display surfaces is a ceiling.
  13. A method of presenting imagery in true perspective to an individual located within an image projection apparatus according to any preceding claim, including controlling the transmission of images to the display surfaces such that images shown on adjacent display surfaces are synchronised.
  14. A method according to Claim 13, including the step of sensing the position of the user in the apparatus and, when the user changes position and/or orientation, modifying the images presented on the displays such that the images presented correspond to what the user should see from the new position and/or orientation.
  15. A method according to Claim 14, wherein the sensed position and/or orientation of the user is a position and/or orientation determined by a user controlled device.
  16. An image projection apparatus and/or method of presenting imagery in true perspective substantially as shown in, and as described with reference to, the drawings.
GB1007383A 2009-05-01 2010-05-04 Surround projection display system with synchronised screens Withdrawn GB2470462A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB0907459.2A GB0907459D0 (en) 2009-05-01 2009-05-01 Apparatus and method for projecting 3D images

Publications (2)

Publication Number Publication Date
GB201007383D0 GB201007383D0 (en) 2010-06-16
GB2470462A true GB2470462A (en) 2010-11-24

Family

ID=40792062

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB0907459.2A Ceased GB0907459D0 (en) 2009-05-01 2009-05-01 Apparatus and method for projecting 3D images
GB1007383A Withdrawn GB2470462A (en) 2009-05-01 2010-05-04 Surround projection display system with synchronised screens

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB0907459.2A Ceased GB0907459D0 (en) 2009-05-01 2009-05-01 Apparatus and method for projecting 3D images

Country Status (2)

Country Link
GB (2) GB0907459D0 (en)
WO (1) WO2010125406A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102654804B * 2011-05-18 2016-03-09 Shanghai Huabo Information Service Co Ltd A kind of human-computer interaction system based on irregular screen body multiple point touching
GB201307896D0 (en) * 2013-05-01 2013-06-12 Apparatus for use in the performance of cognitive behaviour therapy and method of performance

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5746599A (en) * 1994-10-31 1998-05-05 Mcdonnell Douglas Corporation Modular video display system
JPH08271979A (en) * 1995-01-30 1996-10-18 Hitachi Ltd Back projection type multi-screen display device and display system using it
US9188850B2 (en) * 2007-09-10 2015-11-17 L-3 Communications Corporation Display system for high-definition projectors

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5130794A (en) * 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
JPH09289656A (en) * 1996-08-22 1997-11-04 Toppan Printing Co Ltd Video display system
GB2422500A (en) * 2004-11-26 2006-07-26 Tv Sports Network Ltd Surround projection display system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PARADIGM AV, "Simulation and 3D Screens" [online], Available from: http://www.rearpro.com/products/section2.asp?S2ID=28 [Accessed 13 September 2010] *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130205237A1 (en) * 2012-02-06 2013-08-08 Anders Nancke-Krogh System and method for providing a circular computer desktop environment
US9229625B2 (en) * 2012-02-06 2016-01-05 Mosaiqq, Inc System and method for providing a circular computer desktop environment
GB2524114A (en) * 2013-03-28 2015-09-16 Third Eye Technologies Ltd Battlefield simulation apparatus and method

Also Published As

Publication number Publication date
GB0907459D0 (en) 2009-06-10
WO2010125406A1 (en) 2010-11-04
GB201007383D0 (en) 2010-06-16

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)