WO2018204980A1 - Three dimensional tracking systems - Google Patents

Three dimensional tracking systems

Info

Publication number: WO2018204980A1
Application number: PCT/AU2018/050432
Authority: WO (WIPO (PCT))
Prior art keywords: tracking system, dimensional, user device, tracking, anyone
Other languages: French (fr)
Inventor: Chang-Yi Yao
Original assignee: Yao Chang Yi
Priority claimed from: AU2017901750A0 (external priority)
Application filed by Yao Chang Yi

Classifications

    • G06F 1/1626: Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • G02B 27/017: Head-up displays, head mounted
    • G06F 1/1632: External expansion units, e.g. docking stations
    • G06F 1/1694: Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/0325: Detection arrangements using opto-electronic means, using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
    • G06F 3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06T 7/521: Image analysis; depth or shape recovery from laser ranging, e.g. using interferometry, or from the projection of structured light
    • G06T 7/70: Image analysis; determining position or orientation of objects or cameras
    • G02B 2027/0134: Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
    • G02B 2027/0187: Display position adjusting means slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G06T 2207/30201: Indexing scheme for image analysis; subject of image: human face

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A virtual reality streaming system including a three-dimensional tracking system, the virtual reality streaming system comprising: a user device comprising a display screen; a tracking device associated with the user device and configured to capture tracking data relating to movement of the user device; and a computer or server associated with the user device, the server configured to receive the tracking data from the tracking device and provide virtual reality media to the user device for display on the display screen, wherein the media is provided according to the received tracking data.

Description

THREE DIMENSIONAL TRACKING SYSTEMS
TECHNICAL FIELD
[0001] The present invention relates to systems for three-dimensional tracking on a user communications device.
BACKGROUND ART
[0002] As portable communications devices become more advanced, interactive media has become popular. 360 degree photos, for example, are increasing in popularity, as they provide an interactive viewing experience. In short, as the user moves his or her device, the displayed image is updated, allowing the user to view the scene at multiple different angles.
[0003] A problem with 360 degree photos of the prior art is that the motion tracking (i.e. the processing of accurate movement data on the device) is computationally complex, and as a result, battery consumption is increased. Furthermore, 360 photos are captured from a single point, and thus only allow a viewer to change viewing angle (rather than move through a scene). As more complex movements are considered, the complexity increases considerably.
[0004] Additionally, such complex processing may not be possible on low end devices in a manner that achieves suitable results. Even on high end devices where such processing is possible, other computationally complex calculations may not be able to be performed simultaneously. As a result, complex applications that utilise this motion tracking may not be possible.
[0005] Dedicated virtual reality systems exist, in which a headset is tracked in three- dimensional space, and images are updated on the headset according to the movement of the user. One problem with these headsets is that they require dedicated hardware and large amounts of computational power, making them largely unusable to the general public without specialised equipment. A further problem with such headsets is that they provide a different experience to typical phone use, and generally isolate the user from their surroundings.
Furthermore, such virtual reality headsets are not well suited to many traditional mobile games, where a user interacts with a touch screen of the phone.
OBJECT OF THE INVENTION
[0006] As such, there is clearly a need for an improved system for three-dimensional tracking on a user communications device, or one which at least ameliorates the aforementioned problems.
[0007] Accordingly, the object of the present invention is to provide three-dimensional tracking systems which may at least partially overcome at least one of the above mentioned disadvantages or provide the consumer with a useful or commercial choice.
SUMMARY OF INVENTION
[0008] With the foregoing in view, the present invention in a first embodiment, resides broadly in a three-dimensional tracking system comprising: a user device comprising a display screen;
a tracking device coupled to the user device and configured to capture tracking data relating to movement of the user device; and
wherein the tracking device provides media to the user device for display on the display screen, wherein the media is provided according to the received tracking data.
In this way, the present invention allows three-dimensional tracking on a user device, such as a smartphone, rather than on a remote server, providing improved ease of use to the user by obviating the need to set up a remote tracking system. Importantly, the present invention also allows a number of users, each having their own user device, to share a virtual space with other devices.
[0009] In the present specification and claims (if any), the word 'comprising' and its derivatives including 'comprises' and 'comprise' comprises each of the stated integers but does not exclude the inclusion of one or more further integers.
[0010] In a second embodiment, the present invention resides broadly in a three-dimensional tracking system comprising:
a user device comprising a display screen;
a tracking device coupled to the user device and configured to capture tracking data relating to movement of the user device; and
a server coupled to the tracking device and the user device, the server configured to receive the tracking data from the tracking device and provide media to the user device for display on the display screen, wherein the media is provided according to the received tracking data.
[0011] In a further embodiment, the present invention resides broadly in a virtual reality streaming system including a three-dimensional tracking system, the virtual reality streaming system comprising: a user device comprising a display screen;
a tracking device associated with the user device and configured to capture tracking data relating to movement of the user device; and
a computer or server associated with the user device, the server configured to receive the tracking data from the tracking device and provide virtual reality media to the user device for display on the display screen, wherein the media is provided according to the received tracking data.
[0012] The system of the preferred embodiment will allow the streaming of virtual reality media from the computer or server associated with the user device, to the user device according to the movement of the user device based on the tracking data. The system of the preferred embodiment therefore preferably uses the tracking device to coordinate movement and/or orientation of the user device to locate and/or orient the streamed virtual reality media on the display screen of the user device, according to the movement and/or orientation of the user device.
[0013] In this form, the present invention includes a source of virtual reality media, preferably the computer or server which provides the virtual reality media from the computer or server to the user device.
[0014] Generally, a software application is provided on the user device and a software application is provided on the computer or server, allowing a link to be formed to transmit the virtual reality media from the computer or server to the user device.
[0015] Advantageously, tracking is performed by the tracking device and server, and as such, does not require such processing to be performed on the user device, or even that the user device be capable of such processing. Furthermore, as the media is provided to the user device according to the tracking data, bandwidth can be reduced. In particular, as the server is aware of the tracking data, and thus may be aware of the location and direction of the user device, only a relevant angle of the scene need be provided (rather than all angles of the scene).
[0016] Alternatively, the tracking device may be coupled to the user device through the use of one or more components of the user device as, or as a part of, a tracking device. For example, the camera and/or one or more sensors in a smartphone may be used as a part of a tracking device together with appropriate software to task these components for use as a tracking device.
[0017] Preferably, the tracking device includes a plurality of sensors, each configured to receive data that together defines at least a portion of the tracking data. The sensors may be directed at a multitude of different angles.
[0018] The sensors may comprise light sensors, configured to receive light from a light source. The light may comprise infrared light.
[0019] The tracking device may comprise one or more shielding members, configured to prevent light from a particular angle from reaching at least one sensor.
[0020] The tracking device may comprise a plurality of arms extending outwardly from the tracking device, with a sensor positioned on each arm. An upper surface of each arm may be inclined towards a centre of the sensor body. One or more of the upper surfaces may comprise a sensor positioned thereon.
[0021] Preferably, the system comprises a transmitter, configured to emit the light. The transmitter may be configured to transmit infrared light. The light may be transmitted in a predefined pattern that varies over time. The light may be swept in at least two different directions. The two different directions may be orthogonal.
[0022] The transmitter may be configured to provide a synchronisation signal, to enable the tracking device to synchronise with the transmitter.
[0023] The tracking device may be configured to send the tracking data in the form of received signals from the sensors to the server for processing. The received signals may be processed or filtered prior to being sent to the server.
[0024] The tracking device may further comprise an accelerometer and/or a gyroscope. The tracking data may further comprise accelerometer and/or gyroscope data from the accelerometer and/or the gyroscope.
[0025] The server may determine a location and direction of the user device with reference to the transmitter. The server may utilise one or more received times of one or more swept signals of the transmitter to determine a location and direction of the user device.
[0026] The media may be provided in a plurality of sequential image frames. The server may continuously determine a location and direction of the user device, and generate each image frame according to that location and direction.
[0027] The server may be configured to generate media by selecting a scene portion of a scene based upon the location and direction of the user device.
[0028] Scene portions may be selected according to movement of the smartphone to simulate movement in a three-dimensional environment. The three-dimensional environment may comprise a virtual reality environment. The three-dimensional environment may comprise scene portions overlaid over image data captured by the device to provide mixed reality.
[0029] The server and the user device may comprise a single device. Similarly, the server may comprise a plurality of server devices, processors or the like.
[0030] The tracking device may be releasably coupled to the user device. The tracking device may be releasably coupled to the user device in a fixed relationship, such that movement of the tracking device corresponds to movement of the user device.
[0031] The tracking device may comprise a clamp configured to clamp the user device. The clamp may be spring loaded.
[0032] The tracking device may be configured to be installed substantially behind a screen of the user device.
[0033] The tracking device may comprise a case, configured to receive and retain the user device.
[0034] The tracking device may comprise a housing. The housing may comprise a processor, a memory and a transmitter, for transmitting the tracking data to the server.
[0035] The housing may further comprise a power cable, for coupling to the user device. In such case, the user device may power the tracking device. Alternatively, the housing may further comprise a battery, configured to power the tracking device.
[0036] The user device may comprise a touch screen display, with which the user may interact. The user device may transmit details of such interaction to the server. The interaction may comprise interaction with an element of media that has been provided by the server.
[0037] The server may be configured to provide media according to the received tracking data and according to interaction with the touch screen display. The media may comprise media of a game with which the user interacts.
[0038] The user device may be configured to track a face and/or eyes of the user. The user device may transmit face and/or eye tracking data to the server.
[0039] The server may be configured to provide media according to the received tracking data and according to the face and/or eye tracking data. The media may be updated to compensate for movement of the user's face and/or eyes with reference to the user device. As such, the server may generate media (such as video) for the user based not only on the location and direction of the user device, but also on the relative angle at which the user is viewing the smartphone.
[0040] The server may be configured to generate media by selecting a scene portion based upon the user's relative position to the user device according to the face and/or eye tracking data. The server may be configured to generate media by selecting a scene portion based upon the location and direction of the user device according to the tracking data. Scene portions may be selected to simulate movement in a three-dimensional environment.
[0041] Preferably, the user device comprises a smartphone. In a particularly preferred embodiment, the present invention will utilise a user's mobile device such as a smartphone as an input/output device for the server. The user's mobile device such as a smartphone can then be utilised as a camera/portal into a virtual space, thus reducing the capabilities required of the device and the server, without resort to bespoke virtual reality technology.
[0042] Any of the features described herein can be combined in any combination with any one or more of the other features described herein within the scope of the invention.
[0043] The reference to any prior art in this specification is not, and should not be taken as an acknowledgement or any form of suggestion that the prior art forms part of the common general knowledge.
BRIEF DESCRIPTION OF DRAWINGS
[0044] Preferred features, embodiments and variations of the invention may be discerned from the following Detailed Description which provides sufficient information for those skilled in the art to perform the invention. The Detailed Description is not to be regarded as limiting the scope of the preceding Summary of Invention in any way. The Detailed Description will make reference to a number of drawings as follows:
[0045] Figure 1 illustrates a three-dimensional tracking system, according to an embodiment of the present invention;
[0046] Figure 2 illustrates a front view of a smartphone and tracking device of the three- dimensional tracking system of Figure 1, according to an embodiment of the present invention;
[0047] Figure 3 illustrates a top view of the tracking device of the three-dimensional tracking system of Figure 1, according to an embodiment of the present invention;
[0048] Figure 4 illustrates a rear view of the tracking device of the three-dimensional tracking system of Figure 1, according to an embodiment of the present invention;
[0049] Figure 5 illustrates a front view of a tracking device of a three-dimensional tracking system, according to an alternative embodiment of the present invention;
[0050] Figure 6 illustrates a side view of a part of a three-dimensional tracking system, according to an alternative embodiment of the present invention; and
[0051] Figure 7 illustrates a schematic of a three-dimensional tracking system, according to an embodiment of the present invention.
[0052] Preferred features, embodiments and variations of the invention may be discerned from the following Detailed Description which provides sufficient information for those skilled in the art to perform the invention. The Detailed Description is not to be regarded as limiting the scope of the preceding Summary of the Invention in any way.
DESCRIPTION OF EMBODIMENTS
[0053] Figure 1 illustrates a three-dimensional tracking system 100, according to an embodiment of the present invention. As described in further detail below, the system 100 provides low-complexity tracking of smart devices in 3D space, which in turn enables efficient display of virtual worlds, 3D models or animations on a smartphone or similar user device.
[0054] The system 100 comprises a user device in the form of a smartphone 105 coupled to a three-dimensional (3D) tracking device 110. The smartphone 105 is coupled to the 3D tracking device 110 in a fixed relationship, such that when a user 115 moves the smartphone 105, the smartphone 105 and the 3D tracking device 110 move together. As a result, the 3D tracking device 110 is able to track movement of the smartphone 105, comprising a location and direction thereof.
[0055] As best illustrated in Figure 3, the 3D tracking device 110 comprises first and second clamping members 120, which are spring loaded. In particular, the first and second clamping members 120 may be biased apart from each other to fit the smartphone 105 there between, and are configured to retain the smartphone 105 by clamping. Figure 2 illustrates the smartphone being clamped by the clamping members 120.
[0056] Turning back to Figure 1, an infrared transmitter 125 is configured to emit infrared light in a predefined pattern (over time). The tracking device 110 comprises a plurality of sensors 130, configured to receive signals from the infrared transmitter 125 and forward results of same to a server or computing device in the form of a laptop 135 for processing.
[0057] As the sensors 130 are directed in various angles, and the infrared light is transmitted in a particular pattern, the laptop 135 is able to determine a location and angle of the tracking device 110 (and thus the smartphone 105) based upon the sensor data and associated timing data.
[0058] Firstly, as the infrared light is transmitted in a pattern, i.e. the light is transmitted in different directions at different points of time, an arrival time (e.g. with reference to a reference signal) at a sensor may give an indication of a location of the tracking device 110.
[0059] Secondly, as some sensors 130 are on opposite sides of the tracking device 110, the presence of a signal in one sensor 130 and not another sensor 130 may give an indication of a direction in which the tracking device 110 is facing.
[0060] In one embodiment, the infrared light is swept periodically across an area horizontally, and periodically across the space vertically. By recording a reception time of such a signal at a sensor 130, a location of the sensor 130 can be determined, which can in turn be used to estimate a location of the tracking device 110 and thus the smartphone 105. As an illustrative example, if the infrared light is swept vertically at a known rate, the arrival of the infrared light at the sensor may be used to determine a height of the tracking device with reference to the transmitter 125.
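As a concrete, non-normative illustration of the timing arithmetic above, the following sketch assumes a transmitter that sweeps a vertical plane of light at a constant, known angular rate and marks the start of each sweep with a synchronisation pulse; the sweep rate, function names and geometry are assumptions for illustration only:

```python
import math

# Illustrative sketch only: a vertical sweep at a constant angular rate,
# with a synchronisation pulse marking the start of each sweep.

SWEEP_RATE_DEG_PER_S = 3600.0   # assumed: one 360-degree sweep every 100 ms

def elevation_angle(sync_time_s, hit_time_s):
    """Angle of the sweep plane at the moment it crossed the sensor."""
    return math.radians((hit_time_s - sync_time_s) * SWEEP_RATE_DEG_PER_S)

def sensor_height(sync_time_s, hit_time_s, horizontal_distance_m):
    """Height of the sensor relative to the transmitter, for a vertical sweep.

    Once the sweep angle is known, simple trigonometry gives the height,
    provided the horizontal distance is known (e.g. from a second,
    horizontal sweep).
    """
    return horizontal_distance_m * math.tan(
        elevation_angle(sync_time_s, hit_time_s))

# Example: a hit 2.5 ms after the sync pulse corresponds to a 9-degree sweep
# angle; at 2 m horizontal distance the sensor sits about 0.32 m high.
print(sensor_height(0.0, 0.0025, 2.0))
```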
[0061] In such case, the infrared transmitter 125 functions as a reference point, in relation to which the data is presented. The skilled addressee will readily appreciate that no strict absolute reference is required, but instead that movements relative to the reference point may be used to generate the data. Furthermore, the horizontal and vertical sweeping mentioned above is provided relative to the infrared transmitter 125. The skilled addressee will readily appreciate that the infrared transmitter 125 may be placed on an uneven surface, resulting in the horizontal and vertical sweeps not being truly horizontal or vertical.
[0062] The infrared transmitter 125 and tracking device 110 may synchronise with each other using a predefined synchronisation signal. This enables accurate timing information to be used between the infrared transmitter 125 and the tracking device 110. For example, a pulse of light may be used to reset a timer, against which the sweeping of signals is measured.
[0063] While only one infrared transmitter 125 is provided, the skilled addressee will readily appreciate that multiple infrared transmitters 125 may be used to provide more detailed tracking. In particular, two (or more) infrared transmitters 125 may be provided to enable triangulation there between.
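Where two transmitters are available, each measured sweep time yields a bearing from a known transmitter position, and the two bearing lines can be intersected. A minimal two-dimensional sketch of such triangulation, with illustrative positions and angles, might look as follows:

```python
import math

# Hypothetical 2D triangulation sketch: each transmitter's sweep yields a
# bearing (angle from the x-axis) to the sensor; intersecting the two
# bearing lines gives the sensor position.

def triangulate(p1, angle1, p2, angle2):
    """Intersect two bearing rays (position, angle in radians) in the plane."""
    x1, y1 = p1
    x2, y2 = p2
    t1 = math.tan(angle1)
    t2 = math.tan(angle2)
    if math.isclose(t1, t2):
        raise ValueError("bearings are parallel; no unique intersection")
    # Solve y - y1 = t1 * (x - x1) and y - y2 = t2 * (x - x2) simultaneously.
    x = (y2 - y1 + t1 * x1 - t2 * x2) / (t1 - t2)
    y = y1 + t1 * (x - x1)
    return x, y

# Example: transmitters 3 m apart; bearings of 45 and 135 degrees intersect
# midway between them, 1.5 m out.
print(triangulate((0.0, 0.0), math.radians(45), (3.0, 0.0), math.radians(135)))
```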
[0064] As the transmitted infrared signals are received at the sensors 130, data relating thereto is transmitted from the tracking device 110 to the laptop 135. The data may be pre-processed (e.g. filtered) to remove noise, or to perform preliminary analysis. For example, a filter may be provided to remove noise and outliers by comparing signals at multiple sensors.
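By way of example, one simple (assumed, not prescribed) filtering approach is to compare the pulse arrival times reported by the individual sensors 130 and discard readings, such as reflections, that deviate strongly from the consensus:

```python
import statistics

# Illustrative pre-processing sketch: reject outlier timestamps by comparing
# readings across sensors before sending them on. The median rule and the
# threshold are assumptions, not the method of this specification.

def filter_outliers(hit_times_s, max_deviation_s=0.002):
    """Keep only sensor hit times close to the median of all sensors."""
    median = statistics.median(hit_times_s)
    return [t for t in hit_times_s if abs(t - median) <= max_deviation_s]

print(filter_outliers([0.0101, 0.0102, 0.0099, 0.0420, 0.0100]))
# -> the 0.0420 reading (e.g. a reflection) is discarded
```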
[0065] The laptop 135 then determines a location and direction of the tracking device 110, and thus the smartphone 105, and provides media, such as video frames, to the smartphone 105 based thereon.
[0066] In one embodiment, the laptop 135 comprises a three dimensional scene, portions of which are selected for display at the smartphone 105. In particular, the computing device is configured to generate media for transmission to the smartphone 105 by selecting a scene portion of a scene based upon the location and direction of the user device. As such, when the user moves the smartphone 105, the view is updated to provide a "moving window" into a three dimensional world.
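A minimal sketch of this "moving window" selection follows, assuming for simplicity a cylindrical panorama addressed by yaw angle; the scene representation, field of view and names are illustrative assumptions:

```python
# Sketch of the "moving window" idea: the server holds a wide scene and
# crops the portion the device's tracked pose is pointing at. A cylindrical
# panorama is assumed purely for simplicity (wrap-around at the seam is
# ignored for brevity).

def select_scene_portion(scene_width_px, fov_deg, yaw_deg):
    """Return the pixel column range of the scene visible at the given yaw."""
    px_per_deg = scene_width_px / 360.0
    centre = (yaw_deg % 360.0) * px_per_deg
    half = (fov_deg / 2.0) * px_per_deg
    return int(centre - half), int(centre + half)

# A device facing yaw 90 degrees with a 60-degree field of view sees columns
# 640..1280 of a 3840-pixel panorama.
print(select_scene_portion(3840, 60, 90))
```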
[0067] Scene portions may be selected to simulate navigation in a three-dimensional environment. The three-dimensional environment may comprise a virtual reality environment, or may comprise scene portions to be overlaid over image data captured by the smartphone 105 to provide mixed reality.
[0068] The process is performed continuously so that the media is provided to the user based upon their actual movements. As such, a continuous and immersive experience is provided to the user.
[0069] The media may be provided sequentially, such as in the form of video, and updated such that each frame of the video is generated according to the current location and direction of the smartphone. This enables the laptop 135 and the smartphone 105 to utilise existing protocols and methods for transporting the media to the smartphone.
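As one possible (assumed) realisation of such frame-by-frame delivery, each generated frame can simply be length-prefixed and pushed over an ordinary TCP connection; an established video streaming protocol would be an equally valid substitute. The pose and rendering functions below are stubs whose names are assumptions:

```python
import socket
import struct

# Minimal sketch of sequential frame delivery over a plain TCP socket,
# standing in here for "existing protocols and methods".

def get_latest_pose():
    # Stub: in practice, return the most recent location and direction
    # estimate derived from the tracking device's sensor data.
    return (0.0, 0.0, 0.0, 0.0)

def render_frame(pose):
    # Stub: in practice, select/render the scene portion for this pose and
    # encode it as a compressed image (e.g. JPEG).
    return b"<encoded frame bytes>"

def stream_frames(host="0.0.0.0", port=9000):
    srv = socket.create_server((host, port))
    conn, _ = srv.accept()
    while True:
        frame = render_frame(get_latest_pose())
        # Length-prefix each frame so the smartphone can split the stream.
        conn.sendall(struct.pack(">I", len(frame)) + frame)
```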
[0070] Furthermore, as the media may be generated (or modified) at the laptop 135, lower bandwidth between the smartphone 105 and laptop 135 may be utilised, as only the view to be displayed to the user need be sent (rather than all views). As a result, latency may be reduced, and the overall user experience may be improved.
[0071] As best illustrated in Figures 3 and 4, the tracking device 110 comprises a housing 140 and a sensor body 145. The housing 140 may be used to house a battery, together with a processor, memory and transmitter for transmitting the received signal data to the laptop 135.
[0072] Alternatively, the housing 140 may further comprise a power cable, such as a USB Type-C cable, for coupling the tracking device 110 to the smartphone 105. As such, the housing 140 (and the components therein and relating thereto) may be powered by the smartphone 105 directly, rather than by a separate battery, which can decrease complexity and production costs of the tracking device 110.
[0073] The sensor body 145 comprises a plurality of arms 150 extending outwardly from a centre thereof and along a rear surface of the housing 140. A sensor 130 is positioned at the end of each arm 150 and is directed outwards along an axis of the arm 150. An upper surface 150a of each arm 150 is inclined towards a centre of the sensor body, and every second such surface 150a comprises a sensor 130 positioned thereon, angled outwards and perpendicular to the corresponding upper surface 150a.
[0074] Above the arms 150 is an annular groove portion 155, which extends around a periphery of the sensor body 145, and above the annular groove portion 155 is a truncated cone portion 160. The annular groove portion 155 and the truncated cone portion 160 comprise sensors 130 around their periphery.
[0075] Finally, the sensor body 145 comprises a plurality of fingers 165, extending upwardly and outwardly from a top of the truncated cone portion 160. An outer face of each of the fingers 165 comprises a sensor 130.
[0076] The arrangement of sensors 130 described above not only ensures that sensors 130 are positioned at a plurality of different angles, but also shields sensors from signals at certain angles. As such, the process of determining a location and angle of the tracking device 110 may be greatly simplified by such an arrangement, and made less prone to noise.
[0077] In some embodiments, the housing 140 also comprises an accelerometer, a gyroscope, or other motion sensing device to enable the tracking device 110 to provide additional motion detail to the laptop 135. This may be particularly useful in reducing the complexity of the motion tracking at the laptop 135; a sketch of one possible fusion of such data is given after the following paragraph.

[0078] Figure 5 illustrates a front view of a tracking device 510, according to an embodiment of the present invention. The tracking device 510 is similar to the tracking device 110, but encases the smartphone 105.
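With regard to the additional motion detail described in paragraph [0077] above, a minimal sketch of one way such accelerometer or gyroscope data might be combined with the optically derived measurements is given below. The complementary-filter form and the blend factor are assumptions, not taken from the specification.

# Illustrative sensor-fusion sketch: blends a gyroscope-integrated angle
# with an optically derived angle using a complementary filter, so the
# slower optical sweeps correct gyroscope drift. Blend factor is assumed.
def fuse_angle(prev_angle: float, gyro_rate: float, dt: float,
               optical_angle: float, alpha: float = 0.98) -> float:
    """Complementary filter: mostly trust the fast gyroscope prediction,
    while slowly pulling towards the drift-free optical measurement."""
    predicted = prev_angle + gyro_rate * dt
    return alpha * predicted + (1.0 - alpha) * optical_angle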
[0079] The tracking device 510 comprises a rectangular frame 515, and a transparent screen 520. The rectangular frame 515 is configured to receive and house the smartphone 105, such that it is positioned with the screen of the smartphone 105 directly under the transparent screen 520 of the frame 515.
[0080] The frame 515 further comprises a plurality of spring-loaded retainers 525, which are configured to retain the smartphone 105 thereon, while enabling different sized smartphones to be used with the tracking device 510. The retainers 525 are equally spaced along both sides of the frame 515, which ensures that the smartphone 105 is held in a central position.
[0081] The frame 515 further comprises a plurality of external buttons 530, which enable the user to interact with the smartphone 105 therein. The buttons 530 may be coupled electrically to the smartphone by internal wiring (not shown), wirelessly, or through physical interaction with buttons on the smartphone 105.
[0082] A sensor body 145 is coupled to a rear of the frame and communicates with the laptop 135 as outlined above.
[0083] According to certain embodiments, the smartphone 105 is further configured to track a face and/or eyes of the user 115, and to report same to the laptop 135. As such, the laptop 135 is able to generate media (such as video) for the user based not only on the location and direction of the smartphone 105, but also on the relative angle at which the user is viewing the smartphone (parallax).
[0084] As illustrated in Figure 6, when the user 115 moves relative to the smartphone 105, to maintain perspective within a scene, a view of the scene should also change to provide the perception of depth. This is similar to the concept of looking through a window in that a relative position of the user to the window will influence his or her view through the window.
[0085] In order to achieve this effect in embodiments of the present invention, a different scene portion is generated (or selected) based upon the user's position relative to the smartphone. This may in practice be achieved by first identifying a region of the scene based upon a position of the smartphone 105, and then refining that position according to the position of the user 115.

[0086] As illustrated in Figure 6, the user 115 at a first position 115-a is provided with a first portion 605a of a scene 605 based upon their first position 115-a. As the user moves relative to the smartphone 105, as illustrated by the second position 115-b, a second portion 605b of the scene is provided. Such configuration relates to parallax and the different viewing positions.
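A minimal sketch of the refinement step follows, assuming a planar scene lying a fixed depth behind the screen and a simple similar-triangles model; all parameters and units are illustrative assumptions rather than part of the disclosure.

# Illustrative parallax sketch: shifts the selected scene portion in the
# opposite direction to the user's head offset, so the screen behaves
# like a window onto a scene lying behind it. Geometry is simplified.
def parallax_offset(head_x: float, head_y: float, head_z: float,
                    scene_depth: float) -> tuple[float, float]:
    """Offset (in scene units) of the visible region for a head displaced
    (head_x, head_y) at distance head_z in front of the screen, with the
    scene plane `scene_depth` behind the screen."""
    # Similar triangles: a head displaced by head_x sees the window
    # contents displaced by head_x * scene_depth / head_z the other way.
    scale = scene_depth / max(head_z, 1e-6)  # guard against zero distance
    return -head_x * scale, -head_y * scale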
[0087] While the scene 605 is illustrated as being a planar surface in Figure 6, the skilled addressee will readily appreciate that the scene 605 may be of any suitable shape, and may vary in depth. Similarly, while the motion in Figure 6 is illustrated in a single dimension, the skilled addressee will readily appreciate that such motion may be in any suitable direction.
[0088] Figure 7 illustrates a schematic of a three-dimensional tracking system 700, according to an embodiment of the present invention. The three-dimensional tracking system 700 may be similar or identical to the three-dimensional tracking system 100.
[0089] The system comprises a tracking beacon 725, similar to the infrared transmitter 125, for transmitting signals to a tracking device 710, similar to the tracking device 110, to assist the tracking device 710 in determining its location and direction. The tracking device 710 is coupled to a smartphone 705, similar to the smartphone 105, and as such, tracking of the tracking device 710 can be used to track the smartphone 705.
[0090] The tracking device 710 and the smartphone 705 are both coupled to a server 735, which may be in the form of a local computer (e.g. a laptop), a remote server, or a combination of one or more remote and/or local computers, like the laptop 135.
[0091] The server 735 is coupled to a data store 740, which comprises content. The content may comprise three-dimensional models, from which video data is generated for transmission to the smartphone 705. The three-dimensional model may comprise part of a game, with which the user interacts through the smartphone 705.
[0092] In order to provide an immersive experience, the server 735 may generate a sequence of image frames by continuously determining a location and direction of the user device, and generating each image frame according to that location and direction.
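By way of illustration, such a per-frame loop might be sketched as follows; every function and object shown (read_tracking_data, solve_pose, render_view, send_frame, and the tracker, renderer and connection objects) is a hypothetical placeholder and not part of the specification.

# Illustrative server loop sketch: continuously determines the device
# pose and renders and sends one frame per iteration at a target rate.
import time

def streaming_loop(tracker, renderer, connection, fps: float = 60.0) -> None:
    """Continuously determine the device pose and send one frame per pose."""
    frame_interval = 1.0 / fps
    while connection.is_open():
        start = time.monotonic()
        samples = tracker.read_tracking_data()   # raw sweep timings etc.
        pose = tracker.solve_pose(samples)       # location and direction
        frame = renderer.render_view(pose)       # scene portion for this pose
        connection.send_frame(frame)             # e.g. as one video frame
        time.sleep(max(0.0, frame_interval - (time.monotonic() - start)))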
[0093] As an illustrative example, an application or game may provide a mixed reality experience, where images captured by the smartphone 705 are overlaid with three-dimensional data (such as game characters and game elements). The user may interact with the overlaid data using the touchscreen, for example, upon which such data is sent to the server 735, and the three-dimensional model is updated.

[0094] The smartphone may also send other data back to the server 735, such as sensor data, data corresponding to button presses, or any other suitable data. This data may provide further interaction to the game, may assist in motion tracking, or be used by the server 735 for any other suitable purpose.
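As an illustrative sketch of the overlaying described in paragraph [0093], assuming the rendered game elements carry an alpha (transparency) channel, such compositing might be performed as follows; the NumPy array shapes are assumptions.

# Illustrative mixed-reality compositing sketch: alpha-blends rendered
# game elements (RGBA) onto a camera frame (RGB). Shapes are assumed.
import numpy as np

def composite(camera_rgb: np.ndarray, overlay_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend an (H, W, 4) RGBA overlay onto an (H, W, 3) RGB frame."""
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = (overlay_rgba[..., :3].astype(np.float32) * alpha
               + camera_rgb.astype(np.float32) * (1.0 - alpha))
    return blended.astype(np.uint8)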
[0095] In another example, an application or game is provided where a virtual world is displayed such that a screen of the smartphone appears to be a "window", through which the user is able to see the virtual world using a three dimensional model which appears to be behind the window. A front facing camera of the smartphone is used to track the user's face, and update the image based thereon to give the perception of depth.
[0096] In yet another example, the system may incorporate a virtual reality headset (or other virtual reality device) to provide virtual reality to the user. In such a case, the smartphone may be tracked, as outlined above, and displayed in the virtual reality world. The user may then interact with the smartphone in the virtual world as a smartphone or as a controller, while holding it in their hand in the real world. The virtual reality headset and the tracking device of the smartphone may utilise the same transmitter(s)/tracking beacon(s) for independent motion tracking.
[0097] Advantageously, in some embodiments the system may enable virtual reality to be provided in a manner that is not hindered by use of a headset, and enables the user to use his or her phone, with which he or she is generally familiar and comfortable.
[0098] The system is of low complexity, as it utilises tracking from a device that is separate from (but attached to) the smartphone. Such a configuration not only increases battery life, in that processing on the smartphone is reduced, but also enables the smartphone to perform other computationally complex activities, which in turn enables more advanced applications and games.
CONCLUDING STATEMENTS
[0099] Therefore, the present invention has the advantage over the prior art of improved ease of use for a user of a user device that provides virtual reality three-dimensional scene rendering.
[0100] Reference throughout this specification to 'one embodiment' or 'an embodiment' means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases 'in one embodiment' or 'in an embodiment' in various places throughout this specification are not necessarily all referring to the same embodiment.
Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more combinations.
[0101] In compliance with the statute, the invention has been described in language more or less specific to structural or methodical features. It is to be understood that the invention is not limited to specific features shown or described since the means herein described comprises preferred forms of putting the invention into effect. The invention is, therefore, claimed in any of its forms or modifications within the proper scope of the appended claims (if any) appropriately interpreted by those skilled in the art.

Claims

1. A three-dimensional tracking system comprising:
a user device comprising a display screen;
a tracking device coupled to the user device and configured to capture tracking data relating to movement of the user device; and
wherein the tracking device provides media to the user device for display on the display screen, wherein the media is provided according to the received tracking data.
2. A three-dimensional tracking system comprising:
a user device comprising a display screen;
a tracking device coupled to the user device and configured to capture tracking data relating to movement of the user device; and
a server coupled to the tracking device and the user device, the server configured to receive the tracking data from the tracking device and provide media to the user device for display on the display screen, wherein the media is provided according to the received tracking data.
3. The three-dimensional tracking system of claim 1 or claim 2, wherein the tracking device includes a plurality of sensors, each configured to receive data that together defines at least a portion of the tracking data.
4. The three-dimensional tracking system of claim 3, wherein the sensors comprise light sensors, configured to receive light from a light source.
5. The three-dimensional tracking system of claim 4, wherein the light comprises infrared light.
6. The three-dimensional tracking system of any one of claims 1 to 5, wherein the tracking device also comprises one or more shielding members, configured to prevent light from a particular angle from reaching at least one sensor.
7. The three-dimensional tracking system of any one of claims 1 to 6, wherein the tracking device comprises a plurality of arms extending outwardly from the tracking device, with a sensor positioned on each arm.
8. The three-dimensional tracking system of claim 7, wherein an upper surface of each arm is inclined towards a centre of the sensor body.
9. The three-dimensional tracking system of claim 8, wherein one or more of the upper surfaces comprise a sensor positioned thereon.
10. The three-dimensional tracking system of any one of claims 1 to 9, wherein the three-dimensional tracking system also comprises a transmitter, configured to emit light.
11. The three-dimensional tracking system of claim 10, wherein the transmitter is configured to transmit infrared light in a predefined pattern that varies over time.
12. The three-dimensional tracking system of claim 11, wherein the light is swept in at least two different directions which are orthogonal to one another.
13. The three-dimensional tracking system of any one of claims 10 to 12, wherein the transmitter is configured to provide a synchronisation signal, to enable the tracking device to synchronise with the transmitter.
14. The three-dimensional tracking system of claim 3, wherein the tracking device is configured to send the tracking data in the form of received signals from the sensors to the server for processing.
15. The three-dimensional tracking system of claim 14, wherein the received signals are processed or filtered prior to being sent to the server.
16. The three-dimensional tracking system of any one of claims 1 to 15, wherein the tracking device further comprises an accelerometer and/or a gyroscope.
17. The three-dimensional tracking system of claim 16, wherein the tracking data further comprises accelerometer data and/or gyroscope data from the accelerometer and/or the gyroscope.
18. The three-dimensional tracking system of claim 10, wherein the server determines a location and direction of the user device with reference to the transmitter.
19. The three-dimensional tracking system of claim 18, wherein the server utilises one or more received times of one or more swept signals of the transmitter to determine a location and direction of the user device.
20. The three-dimensional tracking system of any one of claims 1 to 19, wherein the media is provided in a plurality of sequential image frames.
21. The three-dimensional tracking system of any one of claims 18 to 20, wherein the server continuously determines a location and direction of the user device, and generates each image frame according to the location and direction.
22. The three-dimensional tracking system of any one of claims 2 to 21, wherein the server is configured to generate media by selecting a scene portion of a scene based upon the location and direction of the user device.
23. The three-dimensional tracking system of claim 22, wherein the scene portions are selected according to movement of the user device to simulate movement in a three-dimensional environment.
24. The three-dimensional tracking system of claim 23, wherein the three-dimensional environment comprises a virtual reality environment.
25. The three-dimensional tracking system of claim 23, wherein the three-dimensional environment comprises scene portions overlaid over image data captured by the user device to provide mixed reality.
26. The three-dimensional tracking system of any one of claims 2 to 25, wherein the server and the user device comprise a single device.
27. The three-dimensional tracking system of any one of claims 1 to 26, wherein the tracking device is releasably coupled to the user device.
28. The three-dimensional tracking system of claim 27, wherein the tracking device is releasably coupled to the user device in a fixed relationship, such that movement of the tracking device corresponds to movement of the user device.
29. The three-dimensional tracking system of any one of claims 1 to 28, wherein the tracking device comprises a clamp configured to clamp the user device.
30. The three-dimensional tracking system of claim 29, wherein the clamp is spring-loaded.
31. The three-dimensional tracking system of any one of claims 1 to 30, wherein the tracking device is configured to be installed substantially behind a screen of the user device.
32. The three-dimensional tracking system of any one of claims 1 to 31, wherein the tracking device comprises a case, configured to receive and retain the user device.
33. The three-dimensional tracking system of any one of claims 1 to 32, wherein the tracking device comprises a housing comprising a processor, a memory and a transmitter for transmitting the tracking data to the server.
34. The three-dimensional tracking system of claim 33, wherein the housing further comprises a power cable, for coupling to the user device as a power source.
35. The three-dimensional tracking system of any one of claims 1 to 34, wherein the user device comprises a touch screen display, with which the user may interact.
36. The three-dimensional tracking system of claim 35, wherein the user device transmits details of such interaction to the server.
37. The three-dimensional tracking system of claim 36, wherein the interaction comprises interaction with an element of media that has been provided by the server.
38. The three-dimensional tracking system of any one of claims 35 to 37, wherein the server is configured to provide media according to the received tracking data and according to interaction with the touch screen display.
39. The three-dimensional tracking system of any one of claims 1 to 38, wherein the media comprises media of a game with which the user interacts.
40. The three-dimensional tracking system of any one of claims 1 to 39, wherein the user device is configured to track a face and/or eyes of the user.
41. The three-dimensional tracking system of claim 40, wherein the user device transmits face and/or eye tracking data to the server.
42. The three-dimensional tracking system of claim 40, wherein the server is configured to provide media according to the received tracking data and according to the face and/or eye tracking data.
43. The three-dimensional tracking system of any one of claims 40 to 42, wherein the media is updated to compensate for movement of the user's face and/or eyes with reference to the user device.
44. The three-dimensional tracking system of any one of claims 40 to 43, wherein the server is configured to generate media by selecting a scene portion based upon the user's relative position to the user device according to the face and/or eye tracking data.
45. The three-dimensional tracking system of any one of claims 40 to 44, wherein the server is configured to generate media by selecting a scene portion based upon the location and direction of the user device according to the tracking data.
46. The three-dimensional tracking system of claim 45, wherein the scene portions are selected to simulate movement in a three-dimensional environment.
47. The three-dimensional tracking system of any one of claims 1 to 46, wherein the user device comprises a smartphone.
48. A virtual reality streaming system including a three-dimensional tracking system, the virtual reality streaming system comprising:
a user device comprising a display screen;
a tracking device associated with the user device and configured to capture tracking data relating to movement of the user device; and
a computer or server associated with the user device, the server configured to receive the tracking data from the tracking device and provide virtual reality media to the user device for display on the display screen, wherein the media is provided according to the received tracking data.
PCT/AU2018/050432 2017-05-11 2018-05-10 Three dimensional tracking systems WO2018204980A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
AU2017901750 2017-05-11
AU2017901750A AU2017901750A0 (en) 2017-05-11 Three Dimensional Tracking System

Publications (1)

Publication Number Publication Date
WO2018204980A1 true WO2018204980A1 (en) 2018-11-15

Family

ID=64104375

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/AU2018/050432 WO2018204980A1 (en) 2017-05-11 2018-05-10 Three dimensional tracking systems

Country Status (1)

Country Link
WO (1) WO2018204980A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160209658A1 (en) * 2014-05-30 2016-07-21 Sony Computer Entertainment America Llc Head Mounted Device (HMD) System Having Interface With Mobile Computing Device for Rendering Virtual Reality Content
US20160041391A1 (en) * 2014-08-08 2016-02-11 Greg Van Curen Virtual reality system allowing immersion in virtual space to consist with actual movement in actual space
US20160349509A1 (en) * 2015-05-26 2016-12-01 Microsoft Technology Licensing, Llc Mixed-reality headset
US20170090564A1 (en) * 2015-09-24 2017-03-30 Tobii Ab Eye-tracking enabled wearable devices

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JOHN-ANYAEHIE, M.: "DualShock 4 Clip Lets Attach Your Smartphone To The Controller", TECHMYMONEY, 3 December 2014 (2014-12-03), XP055613263, Retrieved from the Internet <URL:https://web.archive.org/web/20141203175409/https://techmymoney.com/2014/07/09/dualshock-4-clip-lets-attach-smartphone-controller> [retrieved on 20180607] *
MALVENTANO, A.: "SteamVR HTC Vive In-depth", LIGHTHOUSE TRACKING SYSTEM DISSECTED AND EXPLORED, 10 April 2016 (2016-04-10), XP055613261, Retrieved from the Internet <URL:https://web.archive.org/web/20160410024056/https://www.pcper.com/reviews/General-Tech/SteamVR-HTC-Vive-depth-Lighthouse-Tracking-System-Dissected-and-Explored/SteamV> [retrieved on 20180606] *


Legal Events

Date Code Title Description

121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 18798495
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 18798495
Country of ref document: EP
Kind code of ref document: A1