GB2477145A - 3D display with ultrasonic head tracking - Google Patents

3D display with ultrasonic head tracking

Info

Publication number
GB2477145A
Authority
GB
United Kingdom
Prior art keywords
user
screen
estimated
ultrasonic
locations
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1001146A
Other versions
GB201001146D0 (en)
Inventor
Naveen Chawla
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to GB1001146A
Publication of GB201001146D0
Publication of GB2477145A
Withdrawn

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking

Abstract

A 3D display system comprises three ultrasonic transmitters or receivers (6, 7, 8) mounted at the corners of a display screen (9), such as a television. Complementary ultrasonic receivers or transmitters (3, 4) are mounted on the user's head, so that the user's viewing position can be calculated. A virtual camera or cameras are placed at that viewing position, and the viewing frustum is set to match the shape that would be formed if straight lines were drawn from the camera(s) to the screen corners, so that a perspective-correct 3D viewing experience can be presented to the viewer. A further transmitter (16) may be included in a controller device with a trigger (17).

Description

Ultrasound-Based 3D Virtual Object Manipulation System With Virtual-Camera Frustum Manipulation Using Head Tracking and Screen Dimensions

Proposed here is a system which allows the user to touch, point at, squeeze, hold, turn, shoot at and throw objects viewed realistically from varying angles in 3D on a screen of any size.
Background
Several things are needed for correct 3D manipulation of virtual objects. Firstly, the viewing frustum or frustums of the virtual camera or cameras in the 3D software scene must be constantly updated to match the asymmetric pyramid that the viewer's left and/or right eyes (or a mean position of the two) form with the corners of the screen, denoted by 1 and 2 in Fig. 1. This also allows objects to be viewed correctly from different angles as the user moves their head, and it is necessary to ensure that objects are rendered in the correct perspective so that life-like manipulation can take place. To achieve this, at least two ultrasonic transmitter or receiver units, denoted by 3 and 4 in Fig. 1, are mounted on the user's 3D glasses, or directly on the user's head or via a headset if an auto-stereoscopic screen is being used, in order to track the user's head position and orientation and thereby enable a reasonable estimate of their left and right eye positions.
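As an illustrative aside (not part of the original disclosure), the asymmetric frustum described above corresponds to the well-known generalized off-axis perspective projection. The following minimal Python sketch, assuming numpy and OpenGL-style conventions, shows how such a frustum could be computed from an estimated eye position and three of the screen corners; the function name and parameters are illustrative only.

```python
import numpy as np

def off_axis_projection(eye, corner_bl, corner_br, corner_tl, near=0.1, far=100.0):
    """OpenGL-style asymmetric (off-axis) projection matrix for an eye at
    `eye` looking through a rectangular screen given by three of its corners
    (bottom-left, bottom-right, top-left), all in world coordinates."""
    # Orthonormal screen basis: right, up, and normal vectors.
    vr = corner_br - corner_bl
    vr = vr / np.linalg.norm(vr)
    vu = corner_tl - corner_bl
    vu = vu / np.linalg.norm(vu)
    vn = np.cross(vr, vu)
    vn = vn / np.linalg.norm(vn)

    # Vectors from the eye to the screen corners.
    va = corner_bl - eye
    vb = corner_br - eye
    vc = corner_tl - eye

    d = -np.dot(va, vn)  # perpendicular eye-to-screen distance
    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d

    # Standard glFrustum-style matrix built from the asymmetric extents.
    return np.array([
        [2*near/(right-left), 0.0, (right+left)/(right-left), 0.0],
        [0.0, 2*near/(top-bottom), (top+bottom)/(top-bottom), 0.0],
        [0.0, 0.0, -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0, 0.0, -1.0, 0.0],
    ])
```

A view transform (here simply a translation of the world by the negative eye position, since the screen defines the world axes) would be applied alongside this projection each frame.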
The head-mounted transmitters' or receivers' locations are tracked in 3D relative to the position of the screen. To do this, the proposed system introduces at least 3 ultrasonic transmitters or receivers (transmitters if head-mounted receivers are being used, receivers if head-mounted transmitters are being used), denoted by 6, 7 and 8 in Fig. 1, mounted on 3 of the corners of the display screen (denoted by 9 in Fig. 1) via self-adhesive-backed brackets, denoted by 10, 11 and 12 in Fig. 1. In addition, at least one ultrasonic receiver (if the other 3 are transmitters) or at least one ultrasonic transmitter (if the other 3 are receivers) is mounted alongside unit 6, 7 or 8, or on its own screen-corner bracket. Firstly, an automatic calibration step is performed by the screen-corner-mounted ultrasonic transmitter and receiver units to determine the width and height of the screen. Ultrasonic pulses are sent by each transmitter unit in turn, or by the one transmitter unit if there is only one. The time difference of arrival of the pulse at each receiver unit, or the time of flight from transmission to each receiver unit (or the one receiver unit if there is only one), is used to determine the distances between the screen-corner units, e.g. 6, 7 and 8, and hence the width and height of the screen, after accounting for the fixed offset between each screen-corner unit's position and the actual screen corner. The screen dimensions can alternatively be input by the user, in which case only one screen-corner bracket carrying only receivers or only transmitters (at least 3) is needed.
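For illustration, this calibration step reduces to converting measured times of flight into distances at the speed of sound. A minimal sketch, assuming known pulse timestamps and a fixed bracket-to-corner offset (both hypothetical parameters, not specified in the original text):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (temperature-dependent)

def distance_from_time_of_flight(tof_seconds):
    """Distance covered by an ultrasonic pulse given its time of flight."""
    return tof_seconds * SPEED_OF_SOUND

def screen_dimensions(tof_top, tof_side, corner_offset=0.0):
    """Screen width and height from the measured times of flight between
    the top-left and top-right corner units (tof_top) and between the
    top-left and bottom-left corner units (tof_side), minus a fixed
    offset between each bracket and the true screen corner (a
    hypothetical constant depending on bracket geometry)."""
    width = distance_from_time_of_flight(tof_top) - 2 * corner_offset
    height = distance_from_time_of_flight(tof_side) - 2 * corner_offset
    return width, height
```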
Given the width and height of the screen, the screen-corner units are in known positions, either in the single-bracket version or with respect to the top-left, top-right, bottom-left and bottom-right corners. (The corner assignment can be known either by clearly pre-labelling the screen-corner units and trusting the user to attach them correctly, or by building an orientation sensor into each unit to detect which of the 4 possible ways it is mounted, and hence which corner it occupies.) The 3D coordinate system in relation to the screen can then be established automatically in whichever way a developer prefers, for example with the centre of the screen as the origin, thereby placing the screen-corner units, in software, at known distance-unit positions in that coordinate system, such as on the x-y plane around the origin. The location of each head-mounted unit, such as 3 and 4 in Fig. 1, can then be calculated from pulses generated by the screen-mounted transmitters (or the other way around if the head-mounted units are transmitters) using time-of-flight or time-difference-of-arrival calculations. For electrical safety and to conserve battery power, it is recommended that the head-mounted units be receivers rather than transmitters. Trilateration, using one time-of-flight calculation per screen-mounted transmitter per time frame, is recommended for this; multilateration using time-difference-of-arrival calculations would also work.
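Since the three corner units lie in the screen plane (z = 0), the recommended trilateration has a convenient closed form: subtracting pairs of sphere equations eliminates the quadratic terms, leaving two linear equations in x and y, and the viewer-side root gives z. A minimal numpy sketch under those assumptions (function name and conventions are illustrative):

```python
import numpy as np

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Solve for a 3D point given its distances d1, d2, d3 to three anchor
    points p1, p2, p3 lying in the z = 0 (screen) plane; the viewer-side
    root (z > 0, in front of the screen) is returned."""
    # Subtracting pairs of sphere equations leaves two linear equations in x, y.
    A = 2.0 * np.array([[p2[0] - p1[0], p2[1] - p1[1]],
                        [p3[0] - p1[0], p3[1] - p1[1]]])
    b = np.array([d1**2 - d2**2 + (p2[0]**2 + p2[1]**2) - (p1[0]**2 + p1[1]**2),
                  d1**2 - d3**2 + (p3[0]**2 + p3[1]**2) - (p1[0]**2 + p1[1]**2)])
    x, y = np.linalg.solve(A, b)
    z_squared = d1**2 - (x - p1[0])**2 - (y - p1[1])**2
    z = np.sqrt(max(z_squared, 0.0))  # clamp small negatives from measurement noise
    return np.array([x, y, z])
```

For a screen of width w and height h with the origin at its centre, the anchors would be, say, p1 = (-w/2, h/2, 0), p2 = (w/2, h/2, 0) and p3 = (-w/2, -h/2, 0), with d1, d2, d3 obtained from the per-frame time-of-flight measurements as above.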
The same 3D tracking is also performed on ultrasonic receiver or transmitter units such as 16 in Fig. 1, either on handsets such as 13 and 14 proposed in Fig. 1 or mounted on the user's hand or hands; this is necessary for the manipulation of virtual objects, such as in Fig. 1, by determining the position and orientation of the user's hands. Three receiver or transmitter units per handset, as denoted by 16 in Fig. 1, allow tracking in all degrees of rotational and positional movement. More receiver or transmitter units can be placed on the user's hands or other parts of the body for fuller user tracking.
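As a sketch of how three tracked points on a rigid handset determine its full pose, an orthonormal frame can be built from the three tracked positions (assuming they are not collinear). The conventions below (centroid as the handset position, sensor-plane normal as one axis) are illustrative choices, not taken from the original text:

```python
import numpy as np

def handset_pose(q1, q2, q3):
    """Position and 3x3 orientation matrix of a rigid handset from the
    tracked 3D positions of its three ultrasonic units (non-collinear)."""
    origin = (q1 + q2 + q3) / 3.0          # centroid as the handset position
    x_axis = q2 - q1
    x_axis = x_axis / np.linalg.norm(x_axis)
    v = q3 - q1
    z_axis = np.cross(x_axis, v)           # normal to the plane of the three units
    z_axis = z_axis / np.linalg.norm(z_axis)
    y_axis = np.cross(z_axis, x_axis)      # completes a right-handed frame
    R = np.column_stack([x_axis, y_axis, z_axis])
    return origin, R
```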
Also proposed in conjunction with the system is at least one pressure-sensitive trigger for the user to control, for example on the handsets as denoted by 17 in Fig. 1, which may, for example, distinguish between different finger and thumb presses and pressures, to allow maximal hand control and manipulation in whatever way the software developer chooses for their application.
Example application modes of the system are shown in Fig. 2. The virtual object denoted by 15 is about to be manipulated by the user's left hand, while the right hand controls a "lightsaber" (or any pointing concept), denoted by 18. There is, of course, no limit to the number of ways the system can be used for virtual-domain control.

Claims (14)

1. A system which includes at least 3 ultrasonic transmitters mounted on at least one bracket which by its shape has an open inner right angle that can be aligned with a corner of a television or projection screen and is backed by adhesive, and into which system the user inputs the diagonal screen size and selects either 16:9 widescreen or 4:3, whereby the system is able to calculate the 3D location, relative to any given known point of the screen, of at least one ultrasonic sensor, also part of the system, mounted on a person's head, using timing calculations of ultrasonic signals passed from at least 3 of the transmitters in turn to the sensor or sensors, and, from this location or these locations, calculates an estimate of the person's viewing position in 3D and sets the position of a virtual camera in a generated 3D scene to match that position in the virtual 3D world, and, from that estimated location and the estimated locations of the corners of the screen deduced from the user-supplied diagonal screen size and aspect ratio information, calculates and sets the viewing frustum of the virtual camera in the 3D scene shown to the user to match the shape that would be formed if straight lines were drawn from the user's estimated viewing position to the estimated corners of the screen in a given time frame, whereby a 3D viewing experience is presented to the user with the described behavior.
2. A system which includes at least 3 ultrasonic transmitters and at least one ultrasonic receiver mounted on at least two brackets which by their shape each have an open inner right angle that can be aligned with a corner of a television or projection screen and are backed by adhesive, and which automatically calculates the dimensions of the screen using timing calculations of ultrasonic signals passed from at least one of the transmitters in turn to the receiver or receivers, whereby the system is able to calculate the 3D position, relative to any given known point of the screen, of at least one ultrasonic sensor, also part of the system, mounted on a person's head, using timing calculations of ultrasonic signals passed from at least 3 of the transmitters in turn to the sensor or sensors, and, from this location or these locations, calculates an estimate of the person's viewing position in 3D and sets the position of a virtual camera in a generated 3D scene to match that position in the virtual 3D world, and, from that estimated location and the estimated locations of the corners of the screen deduced from the automatically calculated screen dimensions, calculates and sets the viewing frustum of the virtual camera in the 3D scene shown to the user to match the shape that would be formed if straight lines were drawn from the user's estimated viewing position to the estimated corners of the screen in a given time frame, whereby a 3D viewing experience is presented to the user with the described behavior using a projection or television screen.
3. A system which includes at least 3 ultrasonic receivers mounted on at least one bracket which by its shape has an open inner right angle that can be aligned with a corner of a television or projection screen and is backed by adhesive, and into which system the user inputs the diagonal screen size and selects either 16:9 widescreen or 4:3, whereby the system is able to calculate the 3D location, relative to any given known point of the screen, of at least one ultrasonic transmitter, also part of the system, mounted on a person's head, using timing calculations of ultrasonic signals passed from the transmitter or transmitters to at least 3 of the receivers, and, from this location or these locations, calculates an estimate of the person's viewing position in 3D and sets the position of a virtual camera in a generated 3D scene to match that position in the virtual 3D world, and, from that estimated location and the estimated locations of the corners of the screen deduced from the user-supplied diagonal screen size and aspect ratio information, calculates and sets the viewing frustum of the virtual camera in the 3D scene shown to the user to match the shape that would be formed if straight lines were drawn from the user's estimated viewing position to the estimated corners of the screen in a given time frame, whereby a 3D viewing experience is presented to the user with the described behavior using a projection or television screen.
4. A system which includes at least 3 ultrasonic receivers and at least one ultrasonic transmitter mounted on at least two brackets which by their shape each have an open inner right angle that can be aligned with a corner of a television or projection screen and are backed by adhesive, and which automatically calculates the dimensions of the screen using timing calculations of ultrasonic signals passed from the transmitter or transmitters in turn to at least one of the receivers, whereby the system is able to calculate the 3D position, relative to any given known point of the screen, of at least one ultrasonic transmitter, also part of the system, mounted on a person's head, using timing calculations of ultrasonic signals passed from the transmitter or transmitters in turn to at least 3 of the receivers, and, from this location or these locations, calculates an estimate of the person's viewing position in 3D and sets the position of a virtual camera in a generated 3D scene to match that position in the virtual 3D world, and, from that estimated location and the estimated locations of the corners of the screen deduced from the automatically calculated screen dimensions, calculates and sets the viewing frustum of the virtual camera in the 3D scene shown to the user to match the shape that would be formed if straight lines were drawn from the user's estimated viewing position to the estimated corners of the screen in a given time frame, whereby a 3D viewing experience is presented to the user with the described behavior using a projection or television screen.
5. A system which includes at least 3 ultrasonic transmitters mounted on at least one bracket which by its shape has an open inner right angle that can be aligned with a corner of a television or projection screen and is backed by adhesive, and into which system the user inputs the diagonal screen size and selects either 16:9 widescreen or 4:3, whereby the system is able to calculate the 3D location, relative to any given known point of the screen, of at least one ultrasonic sensor, also part of the system, mounted on a person's head, using timing calculations of ultrasonic signals passed from at least 3 of the transmitters in turn to the sensor or sensors, and, from this location or these locations, calculates an estimate of the 3D positions of the person's left and right eyes and sets the positions of two virtual cameras intended for left-eye and right-eye stereoscopic viewing to match those positions in the virtual 3D world, and, from these estimated locations and the estimated locations of the corners of the screen deduced from the user-supplied diagonal screen size and aspect ratio information, calculates and sets the viewing frustums of the virtual cameras in the 3D scene shown to the user to match the shapes that would be formed if straight lines were drawn from the estimated locations of the user's eyes to the estimated corners of the screen in a given time frame, whereby a stereoscopic viewing experience is presented to the user with the described behavior, using stereoscopic viewing equipment.
6. A system which includes at least 3 ultrasonic transmitters and at least one ultrasonic receiver mounted on at least two brackets which by their shape each have an open inner right angle that can be aligned with a corner of a television or projection screen and are backed by adhesive, and which automatically calculates the dimensions of the screen using timing calculations of ultrasonic signals passed from at least one of the transmitters in turn to the receiver or receivers, whereby the system is able to calculate the 3D position, relative to any given known point of the screen, of at least one ultrasonic sensor, also part of the system, mounted on a person's head, using timing calculations of ultrasonic signals passed from at least 3 of the transmitters in turn to the sensor or sensors, and, from this location or these locations, calculates an estimate of the 3D positions of the person's left and right eyes and sets the positions of two virtual cameras intended for left-eye and right-eye stereoscopic viewing to match those positions in the virtual 3D world, and, from these estimated locations and the estimated locations of the corners of the screen deduced from the automatically calculated screen dimensions, calculates and sets the viewing frustums of the virtual cameras in the 3D scene shown to the user to match the shapes that would be formed if straight lines were drawn from the estimated locations of the user's eyes to the estimated corners of the screen in a given time frame, whereby a stereoscopic viewing experience is presented to the user with the described behavior, using stereoscopic viewing equipment.
7. A system which includes at least 3 ultrasonic receivers mounted on at least one bracket which by its shape has an open inner right angle that can be aligned with a corner of a television or projection screen and is backed by adhesive, and into which system the user inputs the diagonal screen size and selects either 16:9 widescreen or 4:3, whereby the system is able to calculate the 3D location, relative to any given known point of the screen, of at least one ultrasonic transmitter, also part of the system, mounted on a person's head, using timing calculations of ultrasonic signals passed from the transmitter or transmitters to at least 3 of the receivers, and, from this location or these locations, calculates an estimate of the 3D positions of the person's left and right eyes and sets the positions of two virtual cameras intended for left-eye and right-eye stereoscopic viewing to match those positions in the virtual 3D world, and, from these estimated locations and the estimated locations of the corners of the screen deduced from the user-supplied diagonal screen size and aspect ratio information, calculates and sets the viewing frustums of the virtual cameras in the 3D scene shown to the user to match the shapes that would be formed if straight lines were drawn from the estimated locations of the user's eyes to the estimated corners of the screen in a given time frame, whereby a stereoscopic viewing experience is presented to the user with the described behavior, using stereoscopic viewing equipment.
8. A system which includes at least 3 ultrasonic receivers and at least one ultrasonic transmitter mounted on at least two brackets which by their shape each have an open inner right angle that can be aligned with a corner of a television or projection screen and are backed by adhesive, and which automatically calculates the dimensions of the screen using timing calculations of ultrasonic signals passed from the transmitter or transmitters in turn to at least one of the receivers, whereby the system is able to calculate the 3D position, relative to any given known point of the screen, of at least one ultrasonic transmitter, also part of the system, mounted on a person's head, using timing calculations of ultrasonic signals passed from the transmitter or transmitters in turn to at least 3 of the receivers, and, from this location or these locations, calculates an estimate of the 3D positions of the person's left and right eyes and sets the positions of two virtual cameras intended for left-eye and right-eye stereoscopic viewing to match those positions in the virtual 3D world, and, from these estimated locations and the estimated locations of the corners of the screen deduced from the automatically calculated screen dimensions, calculates and sets the viewing frustums of the virtual cameras in the 3D scene shown to the user to match the shapes that would be formed if straight lines were drawn from the estimated locations of the user's eyes to the estimated corners of the screen in a given time frame, whereby a stereoscopic viewing experience is presented to the user with the described behavior, using stereoscopic viewing equipment.
9. A system as claimed in Claim 1, Claim 2, Claim 5 or Claim 6 in which there is at least one additional ultrasonic sensor attached to the user, whose 3D location or locations are tracked in the same way as the user's head-mounted sensor or sensors.
10. A system as claimed in Claim 1, Claim 2, Claim 5, Claim 6 or Claim 9 in which there is at least one additional ultrasonic sensor attached to a device controlled by the user, whose 3D location or locations are tracked in the same way as the user's head-mounted sensor or sensors.
11. A system as claimed in Claim 3, Claim 4, Claim 7 or Claim 8 in which there is at least one additional ultrasonic transmitter attached to the user, whose 3D location or locations are tracked in the same way as the user's head-mounted transmitter or transmitters.
12. A system as claimed in Claim 3, Claim 4, Claim 7, Claim 8 or Claim 11 in which there is at least one additional ultrasonic transmitter attached to a device controlled by the user, whose 3D location or locations are tracked in the same way as the user's head-mounted transmitter or transmitters.
13. A system as claimed in Claim 10 or Claim 12 in which the device controlled by the user has at least one trigger which responds to pressure.
14. A system as herein described and illustrated by the accompanying drawings.

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1001146A GB2477145A (en) 2010-01-25 2010-01-25 3D display with ultrasonic head tracking


Publications (2)

Publication Number Publication Date
GB201001146D0 (en) 2010-03-10
GB2477145A (en) 2011-07-27

Family

ID=42046002

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1001146A Withdrawn GB2477145A (en) 2010-01-25 2010-01-25 3D display with ultrasonic head tracking

Country Status (1)

Country Link
GB (1) GB2477145A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013088390A1 (en) 2011-12-14 2013-06-20 Universita' Degli Studi Di Genova Improved three-dimensional stereoscopic rendering of virtual objects for a moving observer

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4207284A1 (en) * 1992-03-07 1993-09-09 Stefan Reich Image processing for three=dimensional representation with measurement of head movements - employing stationary monitor with ultrasonic measurement of variations in direction of line of vision of observer, where observer wears liq. crystal shutter spectacles
US5287437A (en) * 1992-06-02 1994-02-15 Sun Microsystems, Inc. Method and apparatus for head tracked display of precomputed stereo images
JPH08160357A (en) * 1994-12-07 1996-06-21 Hitachi Medical Corp Stereoscopic three-dimensional image reproducing device


Also Published As

Publication number Publication date
GB201001146D0 (en) 2010-03-10

Similar Documents

Publication Publication Date Title
US10101807B2 (en) Distance adaptive holographic displaying method and device based on eyeball tracking
EP2979127B1 (en) Display method and system
JP2022530012A (en) Head-mounted display with pass-through image processing
US9380295B2 (en) Non-linear navigation of a three dimensional stereoscopic display
US20120162204A1 (en) Tightly Coupled Interactive Stereo Display
US11226406B1 (en) Devices, systems, and methods for radar-based artificial reality tracking
CN103732299B (en) Utilize three-dimensional devices and the 3d gaming device of virtual touch
WO2012153805A1 (en) Monitoring system and monitoring method
CN101986243B (en) Stereoscopic image interactive system and position offset compensation method
RU2751130C1 (en) Method for coordinate alignment of coordinate systems used by computer-generated reality apparatus and tactile sensation transmission apparatus
US20080297590A1 (en) 3-d robotic vision and vision control system
WO2010062117A3 (en) Immersive display system for interacting with three-dimensional content
JP2015149633A5 (en)
US10180614B2 (en) Pi-cell polarization switch for a three dimensional display system
CN202427156U (en) Multipurpose game controller and multipurpose game system capable of sensing postures
WO2017213974A1 (en) Tap event location with a selection apparatus
RU2018109612A (en) REAL-TIME VIDEO DISTRIBUTION SYSTEM
WO2013111146A4 (en) System and method of providing virtual human on human combat training operations
JP2021060627A (en) Information processing apparatus, information processing method, and program
US11944897B2 (en) Device including plurality of markers
CN108014491A (en) A kind of VR games systems
GB2477145A (en) 3D display with ultrasonic head tracking
CN110622106A (en) Audio processing
WO2019122901A1 (en) Data processing
US20200159339A1 (en) Desktop spatial stereoscopic interaction system

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)