WO2015123768A1 - Systems and methods for scanning a physical environment for augmented reality and virtual reality applications - Google Patents

Systems and methods for scanning a physical environment for augmented reality and virtual reality applications

Info

Publication number
WO2015123768A1
WO2015123768A1 (PCT/CA2015/050117, CA2015050117W)
Authority
WO
WIPO (PCT)
Prior art keywords
physical environment
mirror
disc
signal
scanning
Prior art date
Application number
PCT/CA2015/050117
Other languages
French (fr)
Inventor
Dhanushan Balachandreswaran
Kibaya Mungai Njenga
Original Assignee
Sulon Technologies Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sulon Technologies Inc. filed Critical Sulon Technologies Inc.
Publication of WO2015123768A1 publication Critical patent/WO2015123768A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25 Output arrangements for video game devices
    • A63F13/26 Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/65 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A63F13/655 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition by importing photos, e.g. of the player
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Definitions

  • the following generally relates to systems and methods for scanning a physical environment for augmented reality and virtual reality applications.
  • AR and VR visualisation applications are increasingly popular.
  • the range of applications for AR and VR visualisation has increased with the advent of wearable technologies and 3-dimensional (3D) rendering techniques.
  • AR and VR exist on a continuum of mixed reality visualisation.
  • a system for scanning a physical environment for augmented reality applications.
  • the system comprises: (a) a base; (b) a disc rotatably mounted to the base, the disc having a co-axial aperture disposed therethrough; (c) a mirror module mounted to the disc and comprising a mirror having a reflective surface facing the aperture at an angle of reflection; (d) a transceiver module mounted to the base, the transceiver module comprising a transmitter configured to emit a signal through the aperture to be reflected by the mirror towards a physical environment and a receiver configured to receive the signal through the aperture once reflected by the mirror from the physical environment; and (e) a drive module mounted to the base, the drive module communicatively connected via wires to a motor controller and mechanically coupled to the mirror module for rotatably driving the mirror without obstructing the aperture.
  • the mirror may be disposed at an angle of 45 degrees relative to the disc and the mirror module may comprise a housing permitting passage of the signal from the mirror to the physical environment.
  • the drive module may comprise: a motor mounted to the base, a drive shaft rotated by the motor, and a mechanical link mechanically coupled to the drive shaft and the disc to rotate the disc when the motor is driven.
  • the processor may process the sensor readings to determine distances between the scanning system and surrounding obstacles in the physical environment, and further generate a map of the physical environment using the distances.
  • the system may be mounted to a head mounted display worn by a user occupying the physical environment.
  • the system permits continuous rotational scanning of the physical environment while mitigating tangling of wiring.
  • FIG. 1 is a view of a head mounted display for use with a scanning system or method
  • FIG. 2 is a side view of a first embodiment of a system for scanning a physical environment
  • FIG. 3 is a top view of the first embodiment of a system for scanning a physical environment
  • FIG. 4 is a flowchart illustrating a further method of scanning a physical environment
  • FIG. 5 is a side view of a second embodiment of a system for scanning a physical environment, the system comprising a conduit;
  • FIG. 6 is a top view of the second embodiment of the system for scanning a physical environment
  • FIG. 7 is a side view of a third embodiment of a system for scanning a physical environment, the system comprising an adjustable mirror
  • FIG. 8 is a side view of a fourth embodiment of a system for scanning a physical environment.
  • Fig. 9 is a flowchart illustrating a still further method of scanning a physical environment.
  • any module, unit, component, server, computer, terminal or device exemplified herein that executes instructions may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic discs, optical discs, or tape.
  • Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
  • Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disc storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the device or accessible or connectable thereto.
  • any processor or controller set out herein may be implemented as a singular processor or as a plurality of processors. The plurality of processors may be arrayed or distributed, and any processing function referred to herein may be carried out by one or by a plurality of processors, even though a single processor may be exemplified.
  • AR augmented reality
  • AR includes: the interaction by a user with real physical objects and structures along with virtual objects and structures overlaid thereon; and the interaction by a user with a fully virtual set of objects and structures that are generated to include renderings of physical objects and structures and that may comply with scaled versions of physical environments to which virtual objects and structures are applied, which may alternatively be referred to as an "enhanced virtual reality".
  • Certain AR applications require mapping the physical environment in order to later model and render objects within the physical environment and/or render a virtual environment layered upon the physical environment. Achieving an accurate and robust mapping is, therefore, crucial to the accuracy and realism of the AR application.
  • a scanning system for mapping a physical environment for an AR application comprises a transmitter, a receiver and a processor.
  • the transmitter may comprise a laser diode, a light emitting diode or other signal emitter, and the receiver may comprise a photodiode or other optical receiver operable to capture the emitted signal from the transmitter.
  • the transmitter and receiver cooperate to generate sensor readings which are obtained by the processor.
  • the processor processes the sensor readings to determine distances between the scanning system (at a particular moment in time) and surrounding obstacles in the physical environment.
  • the processor can then generate a map of the physical environment using the distances.
  • processors and/or controllers may be communicatively connected to one another, and mounted throughout the scanning system or remotely therefrom.
  • processing tasks may be performed by a processor in network, wireless or wired communication with components of the scanning system described and contemplated herein.
  • the scanning system may be mounted to a head mounted display (HMD) worn by a user occupying a physical environment, such as, for example, a room.
  • HMD head mounted display
  • the scanning system may thereby enable inside-out mapping of the physical environment, i.e., mapping from the point of view of the user, rather than from a fixed location in the physical environment and scanning toward the user.
  • Some alternative scanning systems utilize rotating sensors to generate sensor readings.
  • the sensors are generally electrically coupled to the processor by wiring. It has been found that such an implementation is often problematic because the wires may become tangled during rotation of the sensors.
  • the scanning system provided herein also permits rotational scanning of the physical environment but avoids or mitigates tangling of wiring. In embodiments, the scanning system provided herein avoids the coupling of wiring to any rotating elements of the system.
  • the scanning system is configured to scan and map a physical environment in 2- and/or 3-dimensions.
  • the processor may provide the model of the physical environment to a graphics engine operable to generate a rendered image stream comprising computer generated imagery (CGI) for the modelled physical environment to augment user interaction with, and perception of, the physical environment.
  • CGI may be provided to the user via an HMD as a rendered image stream or layer.
  • the rendered image stream may be dynamic, i.e., it may vary from one instance to the next in accordance with changes in the physical environment and the user's interaction therewith.
  • the rendered image stream may comprise characters, obstacles and other graphics suitable for, for example, "gamifying" the physical environment by displaying the physical environment as an AR.
  • the scanning system may be mounted to an HMD for being removably worn by a user.
  • an HMD 12 configured as a helmet is shown; however, other configurations are contemplated.
  • the HMD 12 may comprise: a processor 130 in communication with one or more of the following components: (i) a scanning, local positioning and orientation module 128 comprising a scanning system for scanning the physical environment, a local positioning system (“LPS") for determining the HMD's 12 position within the physical environment, and an orientation detection system for detecting the orientation of the HMD 12 (such as an inertia measuring unit "IMU" 127); (ii) an imaging system, such as, for example, a camera system comprising one or more cameras 123, to capture image streams of the physical environment; (iii) a display system 121 for displaying to a user of the HMD 12 the AR and the image stream of the physical environment; (iv) a power management system (not shown) for distributing power to the components; and (v) an audio system 124 with audio input and output to provide audio interaction.
  • the processor 130 may further comprise a wireless communication system 126 having, for example, antennae, to communicate with other components in an AR system, such as, for example, other HMDs, a gaming console, a router, or at least one peripheral to enhance user engagement with the AR.
  • the processor 130 may carry out multiple functions, including rendering, imaging, mapping, positioning, and display.
  • the processor may obtain the outputs from the LPS, the IMU and the scanning system to model the physical environment in a map (i.e., to map the physical environment) and generate a rendered image stream comprising computer generated imagery ("CGI") with respect to the mapped physical environment.
  • the processor may then transmit the rendered image stream to the display system of the HMD for display to the user thereof.
  • CGI computer generated imagery
  • the scanning system is configured to scan and map the surrounding physical environment, whether in 2D or 3D.
  • the generated map may be stored locally in the HMD or remotely in a console or server.
  • the processor may continuously update the map as the user's location and orientation within the physical environment change.
  • the map serves as the basis for AR rendering of the physical environment, allowing, for example, the user to safely and accurately navigate and interact with the physical environment.
  • the scanning system comprises a transmitter and receiver.
  • the scanning system may comprise a scanning laser range finder (SLRF) which scans the physical environment by emitting a signal from the transmitter towards the physical environment and receiving at the receiver a corresponding reflected signal back from the physical environment.
  • SLRF scanning laser range finder
  • the receiver detects the signal and the processor of the scanning system determines the distance from the scanning system to the obstacle.
  • the processor may calculate the travelled distances using any suitable calculation techniques, such as, for example, phase shift modulation or time-of-flight (TOF), according to the configuration of the electronic components comprised by the scanning system.
  • TOF time-of-flight
  • a TOF integrated circuit records the elapsed time between emission and receipt of the signal 336, 336' and correlates the elapsed time to a travelled distance.
  • the scanning system is configured to emit the signal from the HMD at a plurality of angles along its rotational path, and is configured to track at which angle the signal has been emitted.
  • by collecting received signals along a plurality of angles and determining the distances to various obstacles corresponding to those angles it is possible to generate a map of the surrounding environment.
  • the orientation of the scanning system relative to the physical environment captured thereby may vary in accordance with the user's movements throughout the physical environment.
  • the processor may thus acquire orientation information for the HMD from the orientation detection system of the HMD to adjust the tracked angle of emission for the signal by the change in the orientation of the HMD.
  • the scanning system is preferably stabilised with a stabiliser unit, such as, for example a stabiliser unit comprising gimbals for mounting and stabilising the scanning system.
  • the scanning system 300 comprises: a base 350; a transceiver module 320 mounted to the base 350 for determining travel parameters for a signal 336 emitted from the scanning system 300 and reflected back toward the scanning system 300 from an obstacle in the physical environment; a mirror module 330 rotatably mounted to the base 350 for rotatably reflecting the signal 336, 336' between the transceiver module and the physical environment; and a drive module 310 mounted to the base 350, communicatively connected via wires 308 to a motor controller 301 and mechanically coupled to the mirror module 330 for rotatably driving a mirror 334 of the mirror module 330.
  • the base 350 may be comprised of a region of an HMD, or the base may be a discrete component for mounting to an HMD.
  • the transceiver module 320 comprises a transmitter 326 for emitting a signal 336, such as, for example, an infrared (IR) beam or a laser beam; a receiver 328 for receiving the reflected signal 336'; a controller 324; support electronics 322, such as, for example, resistors, capacitors, regulators, transimpedance amplifiers for increasing the gain of the signal 336' received by the receiver 328 and accordingly increasing the accuracy of any measurement relating to a measurement of the signal 336, 336' (such as a calculated distance), an encoder 211, such as, for example, a rotary encoder or shaft encoder, to determine the angle of emission and reception of the signal 336, 336', and a TOF IC for measuring the time of travel of the signal 336, 336'; and wires 308.
  • the controller 324 may be connected to the receiver 328, the transmitter 326, the support electronics 322 and the processor 130 by wires 308.
  • the controller 324 controls the speed and angle of the drive module 310.
  • the processor 130 may be configured to process characteristics of a reflected signal 336' measured by the receiver 328 in order to calculate the distance travelled by the signal 336 and reflected signal 336', given characteristics of the output signal 336, as determined by controller 324 and transmitter 326, and characteristics of the reflected signal 338, as measured by the receiver 328.
  • the processor 130 may comprise hardware to calculate the distance travelled by the signal, such as a micro-control unit (MCU) to perform some or all of the processing tasks required.
  • MCU micro-control unit
  • the mirror module 330 comprises a mirror 334 for reflecting signals, and a disc 342 for rotating the mirror 334 when driven by the drive module 310 to which the disc 342 is mechanically coupled.
  • the disc 342 comprises an aperture 340 disposed co-axially therethrough for permitting passage of signals between the mirror 334 and the transmitter 326 and receiver 328.
  • the mirror 334 is mounted to the disc 342 at an angle of reflection so that it has a downwardly facing reflective surface disposed along the signal path of the transmitter 326 and receiver 328 to reflect the signal 336 outwardly from the transmitter 326 to the physical environment and inwardly from the physical environment to the receiver 328.
  • the mirror may be angled by 45 degrees relative to the disc 342 to reflect the emitted signal 336 by a 90 degree angle relative to its trajectory between the transmitter 326 and the mirror 334.
  • the transmitter and/or receiver may themselves comprise mirrors and/or lenses that nevertheless provide for a signal to be reflected from the mirror 334.
  • the scanning system 300 may comprise a housing 332 for covering and protecting the components of at least the mirror module 330.
  • a region of the housing that lies in the signal path is translucent, transparent or defines an aperture configured to permit passage of signals outwardly through the wall of the housing 332 for passage into the physical environment.
  • the housing 332 comprises a top 331 and translucent or transparent sidewalls 333 for permitting passage of light signals therethrough.
  • the housing 332 comprises a plurality of thin, rigid members mounted to the base 350 and extending upwardly therefrom, the rigid members supporting a top disposed over the mirror module 330 to form a rigid housing surrounding the mirror module 330. The members permit the housing to provide openings at least in the signal path.
  • the drive module 310 comprises: a motor 306 mounted to the base 350 and comprising a drive shaft 304; a mechanical link 302 mechanically coupled to the drive shaft 304 and the disc 342 to rotate the disc 342 when the motor 306 is driven; and a motor controller 301 electrically connected via wires 308 to the motor 306 for controllably driving the motor 306.
  • the motor controller may be physically remote from the scanning system 300 or mounted to the base 350 as shown. If the scanning system 300 comprises a housing 332, as shown, the housing 332 may define an aperture or other suitable access passage to receive therethrough the mechanical link 302. In the illustrated embodiment, the housing 332 defines an aperture therethrough where the mechanical link 302 traverses the housing 332.
  • the illustrated wires 308 may communicatively link drive module 310 and transceiver module 320 to one another and to processor 130. Further, wires 308 may electrically connect the drive module 310 and the transceiver module 320 to a power management system (not shown) for powering their components.
  • the power management system may comprise or be electrically coupled to a power source, such as, for example a battery or mains power, and may further comprise a transformer to suitably transform the incoming current to an output current for the scanning system 300.
  • the mechanical link 302 comprises a belt, band or chain coupled to the drive shaft 304 for transmitting mechanical force from the motor 306 to the disc 342.
  • the mechanical link 302 comprises a gear train for transmitting mechanical force from the motor 306 to the disc 342.
  • the mechanical link 302 comprises a planetary gear surrounding the disc 342 for transmitting mechanical force from the motor 306 to the disc 342.
  • Other suitable embodiments are also contemplated.
  • the motor 306 is communicatively linked to a motor controller 301 for controlling operation of the motor.
  • the motor controller 301 is configured to control the rotational angle of the drive shaft 304 at any given time.
  • the motor controller 301 may be configured to provide sensor readings of the angle of the motor output shaft 304 to the processor 130.
  • the motor controller 301 may further be configured to process sensor readings of the angle of the drive shaft 304 to determine a corresponding angle of the disc 342 and mirror 330.
  • the motor controller 301 may further be configured to provide a signal indicating the angle of the mirror 330 to the processor 130.
  • the motor controller is configured to rotate the drive shaft 304 according to a predetermined angular displacement at predetermined intervals of time, thus enabling 360 degree rotation comprising a plurality of intervals at known angles from the HMD.
  • increasing or decreasing the diameter of the disc 342 corresponds to a respective increase or decrease in the mechanical resolution provided by the mechanical link 302, such that a given angular displacement of the drive shaft 304 displaces the disc 342 and the mirror 334 by a respectively lower or greater amount.
  • the disc 342 is rotatably mounted to the base 350 by a rotatable coupling 344.
  • the coupling 344 comprises an interface for retaining the disc 342 to the base 350 while permitting substantially free rotation of the disc 342 relative to the base 350.
  • the interface may comprise, for example, roller bearings or a lubricated region engageable with an adjacent region of the base 350 or disc 342.
  • the coupling 344 may retain the disc 342 distal or adjacent the base 350.
  • the coupling may comprise legs or walls to retain the disc 342 distal the base 350 while permitting substantially free passage of the signal 336, 336' through the coupling.
  • the coupling 344 may rotate with the disc 342 relative to the base 350, or vice versa.
  • Referring to FIG. 4, shown therein is a flowchart illustrating blocks 400 relating to a method of operating the scanning system 300, i.e., for a single scan at a single angle.
  • the transmitter 326 is turned on by the controller 324.
  • operation of the scanning system may be triggered by the processor 130 instructing the controller 324 to begin scanning the physical environment.
  • an output signal 336 is emitted from the transmitter towards the mirror 334.
  • the output signal 336 then reflects off the mirror 334 and is directed outwardly from the scanning system 300.
  • the output signal encounters an obstacle (not shown) in the physical environment and reflects back towards the mirror 334 as return signal 336'.
  • the return signal 336' is detected and evaluated by the receiver 328.
  • the transmitter 326 is turned off.
  • the processor 130 processes sensor readings provided by receiver 328 relating to the detected return signal 336', and the TOF IC of the supporting electronics calculates the distance travelled by the signal to and from the encountered obstacle within the physical environment.
  • the calculated distance may be provided to processor 130 for use in mapping the physical environment surrounding the scanning system 300 for use in AR applications, such as, for example, to map a plurality of calculated distances as a corresponding plurality of points in a point cloud representation of the physical environment.
  • the controller 301 actuates the motor 306, causing the disc 342 and mirror 334 to rotate by a predetermined angle.
  • block 414 could be called upon to rotate the disc 342 after a plurality of time-of-flight readings have been calculated, to ensure accuracy of such readings. As illustrated, the blocks may then be repeated to provide additional calculations of distance. The blocks 400 may be repeated so long as the processor 130 controls the controller 324 to continue scanning the physical environment. The processing of a reflected signal to determine distance travelled by the signal, and required hardware, was described in more detail above in relation to the operation of the SLRF.
  • the TOF IC of the support electronics 322 may calculate the distances based on the elapsed time between emissions and detection of the signal 336.
  • the processor 130 may be provided with calculated distances to obstacles located along 360 degrees about the system 300.
  • the system 300 may thus provide calculated distances to processor 130 for further processing and for use in mapping the physical environment surrounding the system 300.
  • the calculated distances provided by the system 300 may be used by the processor 130 for use in further AR applications.
  • the processor 130 may be configured to receive a signal from the controller 301 indicating an angle of the mirror 334 and a signal from the TOF IC indicating a calculated distance to an obstacle at that angle, and the processor may be able to process each signal in order to determine the distance from the HMD to an obstacle at a given angle from the HMD's field of view.
  • the processor 130 may further be configured to process signals from additional scans from the scanning system 300, i.e. repetition of the blocks 400, in order to provide a map of the physical environment comprising distances to obstacles at 360 degrees around the HMD.
  • instead of the transmitter being turned off and on at each repetition of the steps described in relation to blocks 400, the transmitter may constantly transmit outbound signals 336 and the processor 130 may continuously process received sensor readings provided by receiver 328.
  • the transmitter 326 may constantly transmit output signals 336 and the processor 130 may only process sensor readings periodically or intermittently.
  • the motor may operate continuously.
  • the functions of the transmitter 326, controller 324, receiver 328 and processor 130 may be performed at a predetermined frequency, when polled by the processor 130, or continually.
  • the transmitter 326 is always active to emit signals, but the receiver 328 and controller 324 are only intermittently activated by the processor to perform their functions, such as capturing a reflected signal.
  • the speed of the motor may be varied by controller 301 in order to vary the frequency with which repeated scans according to blocks 400 will map 360 degrees around the system 300.
  • a scanning system 500 for scanning a physical environment, the scanning system 500 being configured to avoid twisting and rotation of wires 308' during rotation of the mirror 334.
  • a slip ring 552 is provided for electrically connecting the wiring 308' to the disc 542.
  • the slip ring 552 is rotatably disposed within the aperture 540 of the disc 542.
  • the slip ring 552 is annular and comprises a ring aperture 553 coaxial with the aperture 540 of the disc 542 to permit passage therethrough of the signals 336, 336'.
  • the slip ring 552 is electrically coupled to, but rotationally independent of, the disc, enabling electrical coupling between the electronic components of the transceiver module 320 and any components (not shown) which may be mounted to the disc 542 while preventing twisting of the wires 308' during rotation of the disc 542.
  • a scanning system 600 configured to prevent twisting and rotation of wires during rotation of the scanning system 600.
  • the mirror module 630 comprises a mirror 634 which is tiltably mounted to the disc 342.
  • a tilt actuator 660 is mechanically coupled to the mirror 634 to tilt the mirror 634 toward and away from the disc 342. Tilting the mirror 634 may allow the mirror to redirect and capture the signal 636, 636' upwardly and downwardly toward the physical environment to provide potential 3D scanning and mapping.
  • Wiring 308' may electrically and communicably couple the tilt actuator 660 via the slip ring 552 to the transceiver module 320, the controller 301, the processor 130, or any other electronic components mounted to the base 350, in order to control and power the tilt actuator 660 to tilt the mirror 634.
  • the angular rotation of the tilt actuator 660 is selectable and determinable.
  • the tilt actuator 660 may be a stepper motor, or it may comprise an optical encoder to provide the angular rotation of the actuator.
  • Embodiments described in relation to Fig. 7 may be configured to provide a 3-dimensional map of the physical environment surrounding the system 600.
  • the processor 130 may subsequently correlate the tilt angle of the mirror to a given distance measurement.
  • the processor 130 may be configured to receive signals indicative of the tilt angle of the mirror, the rotational angle of the mirror with respect to the field of view of the HMD (as previously described), as well as transmitter 326 and receiver 328 readings, and the processor 130 may be configured to further process the received signals in order to generate a three dimensional map of the physical environment surrounding the scanning system 600.
  • a scanning system 700 for scanning a physical environment wherein the motor rotates non-continuously to prevent over-twisting of wires 708.
  • the shaft 704 of a brushless motor 706, more specifically an outrunner motor, is mounted to the base 350 by an armature 730.
  • the shaft 704 of the outrunner motor 706 remains stationary relative to the base 350, while the motor 706 rotates.
  • the mirror 734 is coupled to the rotating portion of the motor 706.
  • the wiring 708 attaches to the rotating portion of the motor 706 and electrically couples the motor 706 to the controller 701 in the base 350.
  • the motor 706 is controlled by the controller 701 to reciprocate the mirror 734 between predetermined first and second rotational angles.
  • the controller 701 may be located proximal the transceiver module 320.
  • the first and second rotational angles may be predetermined and stored in memory at the controller 701.
  • the transmitter 326 may transmit signals for scanning the physical environment surrounding the system 700.
  • the wires 708 may experience some twisting along with the rotation of the motor 706.
  • the wires 708 are prevented from twisting indefinitely in a single direction; instead, the wires 708 wind in the angular direction of the motor 706 as the motor rotates from the first rotational angle to the second, and then unwind (and, in some embodiments, resume winding in the same direction as the rotation of the motor 706) as the motor travels back from the second rotational angle to the first.
  • the scanning system 700 may thereby be configured to scan the physical environment without the wires becoming too tangled for the system to operate successfully. Thus, stressful twisting of the wires is avoided or mitigated.
  • Referring to FIG. 9, shown therein is a flowchart illustrating blocks 800 relating to steps of operation of the scanning system 700.
  • the transmitter 326 is turned on by the processor 130.
  • operation of the scanning system may be triggered by the processor 130 instructing the controller 324 to begin scanning the physical environment.
  • an output signal 336 is emitted from the transmitter towards the mirror 334.
  • the output signal 336 then reflects off the mirror 334 and is directed outwardly from the scanning system 700.
  • the output signal encounters an obstacle (not shown) and reflects back towards the mirror 334, the reflected signal being illustrated as reflected signal 336'.
  • the return signal 336' is detected and measured by the receiver 328.
  • the transmitter is turned off.
  • the processor 130 processes sensor readings provided by receiver 328 relating to the detected reflected signal 336', and the processor 130 calculates the distance travelled by the signal between the scanning system 700 and the encountered obstacle. The calculated distance may be provided to processor 130 for use in mapping the physical environment surrounding the system 700 for use in AR and VR applications, as previously described.
  • the motor controller 701 determines if the motor has reached either of the first or second rotational angles. If yes, at block 815 the motor controller 701 reverses the direction of rotation of the motor 706 away from whichever of the first and second rotational angles it has reached. If not, at block 816 the motor controller 701 causes the motor 706 to rotate along the same rotation as in the preceding cycle.
  • the blocks may then be repeated to provide additional calculations of distance.
  • the blocks 800 may be repeated so long as the processor 130 controls the controller 324 to scan the physical environment.
  • Each of the motions at blocks 815 and 816 may be incremental according to a predetermined interval stored in a memory of the scanning system 700. The processing of a reflected signal to determine distance travelled by the signal, and required hardware, was described in more detail above in relation to the operation of the SLRF. (A minimal code sketch of this reciprocating scan cycle follows this list.)
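
The reciprocating operation described above (blocks 800) can be sketched as a simple loop that takes one range sample per step and reverses the direction of rotation whenever one of the two predetermined rotational angles is reached, so the wiring winds and then unwinds rather than twisting indefinitely. This is a hedged illustration only: the rotational limits, step size, placeholder range function and all names are assumptions, not values from the patent.

```python
# Minimal sketch of the reciprocating scan cycle of blocks 800; hardware calls
# are replaced by a placeholder range function for illustration.
import random
from typing import List, Tuple

FIRST_ANGLE_DEG, SECOND_ANGLE_DEG = -90.0, 90.0   # assumed rotational limits
STEP_DEG = 5.0                                    # assumed angular increment

def measure_range_m(angle_deg: float) -> float:
    """Stand-in for one transmit / receive / distance calculation at the current angle."""
    return 2.0 + random.random()

def reciprocating_scan(samples_wanted: int) -> List[Tuple[float, float]]:
    samples = []
    angle, direction = FIRST_ANGLE_DEG, 1.0
    for _ in range(samples_wanted):
        samples.append((angle, measure_range_m(angle)))   # one scan, as in blocks 400/800
        if ((direction > 0 and angle >= SECOND_ANGLE_DEG)
                or (direction < 0 and angle <= FIRST_ANGLE_DEG)):
            # Block 815: a rotational limit has been reached, so reverse the
            # direction of rotation and let the wiring unwind.
            direction = -direction
        # Block 816 (when no limit was reached): keep stepping in the current direction.
        angle += direction * STEP_DEG
    return samples

print(reciprocating_scan(5))
```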

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices With Unspecified Measuring Means (AREA)

Abstract

Various embodiments of scanning systems configured to scan a physical environment for a multi-dynamic environment and location-based active augmented reality (AR) system are described. The scanning systems may be comprised within the head mounted display (HMD) of a user in a physical environment, and the scanning may be performed from the point of view of a user wearing the HMD. The scanning systems prevent over-twisting of wiring needed for control of the scanning systems.

Description

SYSTEMS AND METHODS FOR SCANNING A PHYSICAL ENVIRONMENT FOR AUGMENTED REALITY AND VIRTUAL REALITY APPLICATIONS
TECHNICAL FIELD
[0001] The following generally relates to systems and methods for scanning a physical environment for augmented reality and virtual reality applications.
BACKGROUND
[0002] Augmented reality (AR) and virtual reality (VR) visualisation applications are increasingly popular. The range of applications for AR and VR visualisation has increased with the advent of wearable technologies and 3-dimensional (3D) rendering techniques. AR and VR exist on a continuum of mixed reality visualisation.
SUMMARY
[0003] In embodiments, a system is provided for scanning a physical environment for augmented reality applications. The system comprises: (a) a base; (b) a disc rotatably mounted to the base, the disc having a co-axial aperture disposed therethrough; (c) a mirror module mounted to the disc and comprising a mirror having a reflective surface facing the aperture at an angle of reflection; (d) a transceiver module mounted to the base, the transceiver module comprising a transmitter configured to emit a signal through the aperture to be reflected by the mirror towards a physical environment and a receiver configured to receive the signal through the aperture once reflected by the mirror from the physical environment; and (e) a drive module mounted to the base, the drive module communicatively connected via wires to a motor controller and mechanically coupled to the mirror module for rotatably driving the mirror without obstructing the aperture.
[0004] The mirror may be disposed at an angle of 45 degrees relative to the disc and the mirror module may comprise a housing permitting passage of the signal from the mirror to the physical environment.
[0005] The drive module may comprise: a motor mounted to the base, a drive shaft rotated by the motor, and a mechanical link mechanically coupled to the drive shaft and the disc to rotate the disc when the motor is driven.
[0006] The processor may process the sensor readings to determine distances between the scanning system and surrounding obstacles in the physical environment, and further generate a map of the physical environment using the distances.
[0007] The system may be mounted to a head mounted display worn by a user occupying the physical environment.
[0008] In aspects, the system permits continuous rotational scanning of the physical environment while mitigating tangling of wiring.
[0009] These and other embodiments are contemplated and described herein in greater detail.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] A greater understanding of the embodiments will be had with reference to the Figures, in which:
[0011] Fig. 1 is a view of a head mounted display for use with a scanning system or method;
[0012] Fig. 2 is a side view of a first embodiment of a system for scanning a physical environment;
[0013] Fig. 3 is a top view of the first embodiment of a system for scanning a physical environment;
[0014] Fig. 4 is a flowchart illustrating a further method of scanning a physical environment;
[0015] Fig. 5 is a side view of a second embodiment of a system for scanning a physical environment, the system comprising a conduit;
[0016] Fig. 6 is a top view of the second embodiment of the system for scanning a physical environment;
[0017] Fig. 7 is a side view of a third embodiment of a system for scanning a physical environment, the system comprising an adjustable mirror;
[0018] Fig. 8 is a side view of a fourth embodiment of a system for scanning a physical environment; and
[0019] Fig. 9 is a flowchart illustrating a still further method of scanning a physical environment.
DETAILED DESCRIPTION
[0020] It will be appreciated that for simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the Figures to indicate corresponding or analogous elements. In addition, numerous specific details are set forth in order to provide a thorough understanding of the embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein may be practised without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the embodiments described herein. Also, the description is not to be considered as limiting the scope of the embodiments described herein.
[0021] It will be appreciated that various terms used throughout the present description may be read and understood as follows, unless the context indicates otherwise: "or" as used throughout is inclusive, as though written "and/or"; singular articles and pronouns as used throughout include their plural forms, and vice versa; similarly, gendered pronouns include their counterpart pronouns so that pronouns should not be understood as limiting anything described herein to use, implementation, performance, etc. by a single gender. Further definitions for terms may be set out herein; these may apply to prior and subsequent instances of those terms, as will be understood from a reading of the present description.
[0022] It will be appreciated that any module, unit, component, server, computer, terminal or device exemplified herein that executes instructions may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic discs, optical discs, or tape. Computer storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disc storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the device or accessible or connectable thereto. Further, unless the context clearly indicates otherwise, any processor or controller set out herein may be implemented as a singular processor or as a plurality of processors. The plurality of processors may be arrayed or distributed, and any processing function referred to herein may be carried out by one or by a plurality of processors, even though a single processor may be exemplified. Any method, application or module herein described may be implemented using computer readable/executable instructions that may be stored or otherwise held by such computer readable media and executed by the one or more processors.
[0023] The present disclosure is directed to systems and methods for augmented reality (AR). However, the term "AR" as used herein may encompass several meanings. In the present disclosure, AR includes: the interaction by a user with real physical objects and structures along with virtual objects and structures overlaid thereon; and the interaction by a user with a fully virtual set of objects and structures that are generated to include renderings of physical objects and structures and that may comply with scaled versions of physical environments to which virtual objects and structures are applied, which may alternatively be referred to as an
"enhanced virtual reality". Further, the virtual objects and structures could be dispensed with altogether, and the AR system may display to the user a version of the physical environment which solely comprises an image stream of the physical environment. Finally, a skilled reader will also appreciate that by discarding aspects of the physical environment, the systems and methods presented herein are also applicable to virtual reality (VR) applications, which may be understood as "pure" VR. For the reader's convenience, the following may refer to "AR" but is understood to include all of the foregoing and other variations recognized by the skilled reader.
[0024] Certain AR applications require mapping the physical environment in order to later model and render objects within the physical environment and/or render a virtual environment layered upon the physical environment. Achieving an accurate and robust mapping is, therefore, crucial to the accuracy and realism of the AR application.
[0025] One aspect involved in a mapping process is scanning the environment using a scanning system. A scanning system for mapping a physical environment for an AR application is provided herein. The scanning system comprises a transmitter, a receiver and a processor. The transmitter may comprise a laser diode, a light emitting diode or other signal emitter, and the receiver may comprise a photodiode or other optical receiver operable to capture the emitted signal from the transmitter. The transmitter and receiver cooperate to generate sensor readings which are obtained by the processor. The processor processes the sensor readings to determine distances between the scanning system (at a particular moment in time) and surrounding obstacles in the physical environment. The processor can then generate a map of the physical environment using the distances.
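As a simple illustration of the mapping step just described, the following Python sketch converts a handful of (emission angle, measured distance) readings into 2D points in the scanner's own frame. It is a minimal sketch only: the function name, data layout and sample values are assumptions for illustration and are not taken from the patent.

```python
import math
from typing import List, Tuple

def build_point_map(readings: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Convert (emission angle in degrees, measured distance in metres) readings
    into 2D points in the scanner's own frame of reference."""
    points = []
    for angle_deg, distance_m in readings:
        theta = math.radians(angle_deg)
        # Each obstacle lies along the emission angle at the measured range.
        points.append((distance_m * math.cos(theta), distance_m * math.sin(theta)))
    return points

# Four readings taken at 90 degree intervals around the scanner.
print(build_point_map([(0.0, 2.0), (90.0, 1.5), (180.0, 2.2), (270.0, 1.1)]))
```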
[0026] The singular "processor" is used herein, but processing tasks may be distributed amongst one or more processors and/or controllers. The processors and controllers may be communicatively connected to one another, and mounted throughout the scanning system or remotely therefrom. For example, one or more processing tasks may be performed by a processor in network, wireless or wired communication with components of the scanning system described and contemplated herein.
[0027] The scanning system may be mounted to a head mounted display (HMD) worn by a user occupying a physical environment, such as, for example, a room. In aspects, the scanning system may thereby enable inside-out mapping of the physical environment, i.e., mapping from the point of view of the user, rather than from a fixed location in the physical environment and scanning toward the user.
[0028] Some alternative scanning systems utilize rotating sensors to generate sensor readings. The sensors are generally electrically coupled to the processor by wiring. It has been found that such an implementation is often problematic because the wires may become tangled during rotation of the sensors.
[0029] The scanning system provided herein also permits rotational scanning of the physical environment but avoids or mitigates tangling of wiring. In embodiments, the scanning system provided herein avoids the coupling of wiring to any rotating elements of the system.
[0030] In embodiments, the scanning system is configured to scan and map a physical environment in 2- and/or 3-dimensions. The processor may provide the model of the physical environment to a graphics engine operable to generate a rendered image stream comprising computer generated imagery (CGI) for the modelled physical environment to augment user interaction with, and perception of, the physical environment. The CGI may be provided to the user via an HMD as a rendered image stream or layer. The rendered image stream may be dynamic, i.e., it may vary from one instance to the next in accordance with changes in the physical environment and the user's interaction therewith. The rendered image stream may comprise characters, obstacles and other graphics suitable for, for example, "gamifying" the physical environment by displaying the physical environment as an AR.
[0031] In particular embodiments, the scanning system may be mounted to an HMD for being removably worn by a user. Referring now to Fig. 1 , an exemplary HMD 12 configured as a helmet is shown; however, other configurations are contemplated. The HMD 12 may comprise: a processor 130 in communication with one or more of the following components: (i) a scanning, local positioning and orientation module 128 comprising a scanning system for scanning the physical environment, a local positioning system ("LPS") for determining the HMD's 12 position within the physical environment, and an orientation detection system for detecting the orientation of the HMD 12 (such as an inertia measuring unit "IMU" 127); (ii) an imaging system, such as, for example, a camera system comprising one or more cameras 123, to capture image streams of the physical environment; (iii) a display system 121 for displaying to a user of the HMD 12 the AR and the image stream of the physical environment; (iv) a power management system (not shown) for distributing power to the components; and (v) an audio system 124 with audio input and output to provide audio interaction. The processor 130 may further comprise a wireless communication system 126 having, for example, antennae, to communicate with other components in an AR system, such as, for example, other HMDs, a gaming console, a router, or at least one peripheral to enhance user engagement with the AR.
[0032] The processor 130 may carry out multiple functions, including rendering, imaging, mapping, positioning, and display. The processor may obtain the outputs from the LPS, the IMU and the scanning system to model the physical environment in a map (i.e., to map the physical environment) and generate a rendered image stream comprising computer generated imagery ("CGI") with respect to the mapped physical environment. The processor may then transmit the rendered image stream to the display system of the HMD for display to the user thereof.
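One way the LPS, IMU and scanning system outputs described above could be combined is to rotate each scanner-frame point by the HMD's yaw and translate it by the HMD's position, so that readings taken from different poses land in a single map frame. The sketch below is a planar (yaw-only) simplification; the function name and the 2D treatment are assumptions, not details from the patent.

```python
import math
from typing import List, Tuple

def to_map_frame(points: List[Tuple[float, float]],
                 hmd_position: Tuple[float, float],
                 hmd_yaw_deg: float) -> List[Tuple[float, float]]:
    """Rotate scanner-frame points by the HMD's yaw (orientation) and translate
    them by the HMD's position (local positioning) so that readings gathered
    while the user moves all land in one shared map frame."""
    yaw = math.radians(hmd_yaw_deg)
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    px, py = hmd_position
    return [(px + x * cos_y - y * sin_y,
             py + x * sin_y + y * cos_y) for x, y in points]

# A point 2 m straight ahead of an HMD standing at (1, 1) and facing 90 degrees
# ends up at roughly (1, 3) in the map frame.
print(to_map_frame([(2.0, 0.0)], (1.0, 1.0), 90.0))
```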
[0033] In conjunction with the processor 130, the scanning system is configured to scan and map the surrounding physical environment, whether in 2D or 3D. The generated map may be stored locally in the HMD or remotely in a console or server. The processor may continuously update the map as the user's location and orientation within the physical environment change. The map serves as the basis for AR rendering of the physical environment, allowing, for example, the user to safely and accurately navigate and interact with the physical environment.
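A continuously updated map of the kind described above might, for example, be maintained by folding each new batch of map-frame points into a coarse grid of occupied cells. This is a hedged sketch only: the set-based grid, the 10 cm cell size and the function name are illustrative assumptions rather than details specified in the patent.

```python
from typing import Iterable, Set, Tuple

def update_occupancy(grid: Set[Tuple[int, int]],
                     map_points: Iterable[Tuple[float, float]],
                     cell_size_m: float = 0.1) -> Set[Tuple[int, int]]:
    """Snap each map-frame point to the nearest grid cell and add it to the set
    of occupied cells, so the stored map can be refined scan after scan."""
    for x, y in map_points:
        grid.add((round(x / cell_size_m), round(y / cell_size_m)))
    return grid

occupied: Set[Tuple[int, int]] = set()
update_occupancy(occupied, [(2.0, 0.0), (2.04, 0.01)])
print(occupied)  # both readings snap to the same 10 cm cell
```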
[0034] The scanning system comprises a transmitter and receiver. The scanning system may comprise a scanning laser range finder (SLRF) which scans the physical environment by emitting a signal from the transmitter towards the physical environment and receiving at the receiver a corresponding reflected signal back from the physical environment. When the emitted signal encounters an obstacle in the physical environment, the emitted signal is reflected from the obstacle toward the scanning system. The receiver detects the signal and the processor of the scanning system determines the distance from the scanning system to the obstacle. The processor may calculate the travelled distances using any suitable calculation techniques, such as, for example, phase shift modulation or time-of-flight (TOF), according to the configuration of the electronic components comprised by the scanning system. In a TOF calculation, a TOF integrated circuit (IC) records the elapsed time between emission and receipt of the signal 336, 336' and correlates the elapsed time to a travelled distance. The scanning system is configured to emit the signal from the HMD at a plurality of angles along its rotational path, and is configured to track at which angle the signal has been emitted. Correspondingly, by collecting received signals along a plurality of angles and determining the distances to various obstacles corresponding to those angles, it is possible to generate a map of the surrounding environment.
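For the time-of-flight case mentioned above, the elapsed time recorded by the TOF IC covers the round trip to the obstacle and back, so the range follows from d = c * t / 2. The snippet below is a minimal illustration; the constant and function names are not from the patent.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def tof_distance_m(elapsed_time_s: float) -> float:
    """The recorded time covers the round trip to the obstacle and back, so the
    one-way range is half the distance light travels in that time."""
    return SPEED_OF_LIGHT_M_PER_S * elapsed_time_s / 2.0

# An elapsed time of 20 ns corresponds to a range of roughly 3 m.
print(f"{tof_distance_m(20e-9):.2f} m")
```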
[0035] When the scanning system is mounted to an HMD worn by a user, the orientation of the scanning system relative to the physical environment captured thereby may vary in accordance with the user's movements throughout the physical environment. The processor may thus acquire orientation information for the HMD from the orientation detection system of the HMD to adjust the tracked angle of emission for the signal by the change in the orientation of the HMD.
[0036] Further, a user moving throughout the physical environment is likely to move his head and/or body, thereby causing the HMD and, correspondingly, the scanning system to constantly move in 3 dimensions and about 3 axes. These movements may decrease scanning accuracy. Therefore, the scanning system is preferably stabilised with a stabiliser unit, such as, for example a stabiliser unit comprising gimbals for mounting and stabilising the scanning system.
[0037] Referring now to Figs. 2 and 3, shown therein is an embodiment of a system for scanning a physical environment. The scanning system 300 comprises: a base 350; a transceiver module 320 mounted to the base 350 for determining travel parameters for a signal 336 emitted from the scanning system 300 and reflected back toward the scanning system 300 from an obstacle in the physical environment; a mirror module 330 rotatably mounted to the base 350 for rotatably reflecting the signal 336, 336' between the transceiver module and the physical environment; and a drive module 310 mounted to the base 350, communicatively connected via wires 308 to a motor controller 301 and mechanically coupled to the mirror module 330 for rotatably driving a mirror 334 of the mirror module 330.
[0038] The base 350 may be comprised of a region of an HMD, or the base may be a discrete component for mounting to an HMD.
[0039] The transceiver module 320 comprises a transmitter 326 for emitting a signal 336, such as, for example, an infrared (IR) beam or a laser beam; a receiver 328 for receiving the reflected signal 336'; a controller 324; support electronics 322, such as, for example, resistors, capacitors, regulators, transimpedance amplifiers for increasing the gain of the signal 336' received by the receiver 328 and accordingly increasing the accuracy of any measurement relating to a measurement of the signal 336, 336' (such as a calculated distance), an encoder 211, such as, for example, a rotary encoder or shaft encoder, to determine the angle of emission and reception of the signal 336, 336', and a TOF IC for measuring the time of travel of the signal 336, 336'; and wires 308. The controller 324 may be connected to the receiver 328, the transmitter 326, the support electronics 322 and the processor 130 by wires 308. The controller 324 controls the speed and angle of the drive module 310. The processor 130 may be configured to process characteristics of a reflected signal 336' measured by the receiver 328 in order to calculate the distance travelled by the signal 336 and reflected signal 336', given characteristics of the output signal 336, as determined by controller 324 and transmitter 326, and characteristics of the reflected signal 338, as measured by the receiver 328. The processor 130 may comprise hardware to calculate the distance travelled by the signal, such as a micro-control unit (MCU) to perform some or all of the processing tasks required.
[0040] The mirror module 330 comprises a mirror 334 for reflecting signals, and a disc 342 for rotating the mirror 334 when driven by the drive module 310 to which the disc 342 is mechanically coupled. The disc 342 comprises an aperture 340 disposed co-axially therethrough for permitting passage of signals between the mirror 334 and the transmitter 326 and receiver 328. The mirror 334 is mounted to the disc 342 at an angle of reflection so that it has a downwardly facing reflective surface disposed along the signal path of the transmitter 326 and receiver 328 to reflect the signal 336 outwardly from the transmitter 326 to the physical environment and inwardly from the physical environment to the receiver 328. For example, the mirror may be angled at 45 degrees relative to the disc 342 to reflect the emitted signal 336 by a 90 degree angle relative to its trajectory between the transmitter 326 and the mirror 334. In further embodiments, the transmitter and/or receiver may themselves comprise mirrors and/or lenses that nevertheless provide for a signal to be reflected from the mirror 334.
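The 90 degree redirection produced by a 45 degree mirror may be verified with a short, purely illustrative calculation; the coordinate axes and vector conventions below are assumptions chosen for clarity:

```python
# A geometric sanity check (not from the patent text itself): reflecting the
# upward beam off a mirror tilted 45 degrees to the disc turns it through 90
# degrees, per the example in [0040]. Axes and vectors are illustrative.
import numpy as np

def reflect(direction: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Reflect a unit direction vector about a unit mirror normal."""
    return direction - 2.0 * np.dot(direction, normal) * normal

beam_up = np.array([0.0, 0.0, 1.0])                        # transmitter -> mirror
mirror_normal = np.array([-1.0, 0.0, -1.0]) / np.sqrt(2.0)  # 45-degree mirror
print(reflect(beam_up, mirror_normal))                      # approx. [-1, 0, 0]: horizontal
```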
[0041] The scanning system 300 may comprise a housing 332 for covering and protecting the components of at least the mirror module 330. In this case, at least a region of the housing that lies in the signal path is translucent, transparent or defines an aperture configured to permit passage of signals outwardly through the wall of the housing 332 for passage into the physical environment. In at least one embodiment, the housing 332 comprises a top 331 and translucent or transparent sidewalls 333 for permitting passage of light signals therethrough. In another embodiment, the housing 332 comprises a plurality of thin, rigid members mounted to the base 350 and extending upwardly therefrom, the rigid members supporting a top disposed over the mirror module 330 to form a rigid housing surrounding the mirror module 330. The members permit the housing to provide openings at least in the signal path.
[0042] The drive module 310 comprises: a motor 306 mounted to the base 350 and comprising a drive shaft 304; a mechanical link 302 mechanically coupled to the drive shaft 304 and the disc 342 to rotate the disc 342 when the motor 306 is driven; and a motor controller 301 electrically connected via wires 308 to the motor 306 for controllably driving the motor 306. The motor controller may be physically remote from the scanning system 300 or mounted to the base 350 as shown. If the scanning system 300 comprises a housing 332, as shown, the housing 332 may define an aperture or other suitable access passage to receive therethrough the mechanical link 302. In the illustrated embodiment, the housing 332 defines an aperture therethrough where the mechanical link 302 traverses the housing 332.
[0043] The illustrated wires 308 may communicatively link drive module 310 and transceiver module 320 to one another and to processor 130. Further, wires 308 may electrically connect the drive module 310 and the transceiver module 320 to a power management system (not shown) for powering their components. The power management system may comprise or be electrically coupled to a power source, such as, for example a battery or mains power, and may further comprise a transformer to suitably transform the incoming current to an output current for the scanning system 300.
[0044] Various embodiments of the mechanical link 302 are contemplated. In some
embodiments, the mechanical link 302 comprises a belt, band or chain coupled to the drive shaft 304 for transmitting mechanical force from the motor 306 to the disc 342. In further embodiments, the mechanical link 302 comprises a gear train for transmitting mechanical force from the motor 306 to the disc 342. In still further embodiments, the mechanical link 302 comprises a planetary gear surrounding the disc 342 for transmitting mechanical force from the motor 306 to the disc 342. Other suitable embodiments are also contemplated.
[0045] In various embodiments, the motor 306 is communicatively linked to a motor controller 301 for controlling operation of the motor. In various embodiments, the motor controller 301 is configured to control the rotational angle of the drive shaft 304 at any given time. The motor controller 301 may be configured to provide sensor readings of the angle of the drive shaft 304 to the processor 130. The motor controller 301 may further be configured to process sensor readings of the angle of the drive shaft 304 to determine a corresponding angle of the disc 342 and mirror 334. The motor controller 301 may further be configured to provide a signal indicating the angle of the mirror 334 to the processor 130.
[0046] In embodiments, the motor controller is configured to rotate the drive shaft 304 according to a predetermined angular displacement at predetermined intervals of time, thus enabling 360 degree rotation comprising a plurality of intervals at known angles from the HMD.
[0047] In embodiments, increasing or decreasing the diameter of the disc 342 corresponds to a respective increase or decrease in the mechanical resolution provided by the mechanical link 302, such that a given angular displacement of the drive shaft 304 displaces the disc 342 and the mirror 334 by a respectively lower or greater amount.
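The relationship between disc diameter and mechanical resolution may be illustrated with the following sketch; the numeric values and the assumption of a simple belt or gear ratio are illustrative only:

```python
# Sketch of the relationship suggested in [0046]-[0047]: for a belt or gear
# drive, the mirror's angular step per motor step scales with the ratio of the
# drive-pulley diameter to the disc diameter. All numbers are assumptions.
def mirror_step_deg(shaft_step_deg: float,
                    shaft_pulley_diameter_mm: float,
                    disc_diameter_mm: float) -> float:
    return shaft_step_deg * shaft_pulley_diameter_mm / disc_diameter_mm

# Doubling the disc diameter halves the angular displacement per motor step,
# i.e. the mechanical resolution of the scan is increased.
print(mirror_step_deg(1.8, 10.0, 40.0))  # -> 0.45 degrees per step
print(mirror_step_deg(1.8, 10.0, 80.0))  # -> 0.225 degrees per step
```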
[0048] The disc 342 is rotatably mounted to the base 350 by a rotatable coupling 344. The coupling 344 comprises an interface for retaining the disc 342 to the base 350 while permitting substantially free rotation of the disc 342 relative to the base 350. The interface may comprise, for example, roller bearings or a lubricated region engageable with an adjacent region of the base 350 or disc 342. The coupling 344 may retain the disc 342 distal or adjacent the base 350. The coupling may comprise legs or walls to retain the disc 342 distal the base 350 while permitting substantially free passage of the signal 336, 336' through the coupling. The coupling 344 may rotate with the disc 342 relative to the base 350, or vice versa.
[0049] Referring now to Fig. 4, shown therein is a flowchart illustrating blocks 400 relating to a method of operating the scanning system 300, i.e. for a single scan at a single angle. At block 402 the transmitter 326 is turned on by the controller 324. In some embodiments, operation of the scanning system may be triggered by the processor 130 instructing the controller 324 to begin scanning the physical environment. At block 404 an output signal 336 is emitted from the transmitter towards the mirror 334. The output signal 336 then reflects off the mirror 334 and is directed outwardly from the scanning system 300. At block 406 the output signal encounters an obstacle (not shown) in the physical environment and reflects back towards the mirror 334 as return signal 336'. At block 408 the return signal 336' is detected and evaluated by the receiver 328. At block 410 the transmitter 326 is turned off. At block 412 the processor 130 processes sensor readings provided by receiver 328 relating to the detected return signal 336', and the TOF IC of the supporting electronics calculates the distance travelled by the signal to and from the encountered obstacle within the physical environment. The calculated distance may be provided to processor 130 for use in mapping the physical environment surrounding the scanning system 300 for use in AR applications, such as, for example, to map a plurality of calculated distances as a corresponding plurality of points in a point cloud representation of the physical environment. At block 414 the controller 301 actuates the motor 306, causing the disc 342 and mirror 334 to rotate by a predetermined angle. Alternatively, block 414 could be called upon to rotate the disc 342 after a plurality of time-of-flight readings have been calculated, to ensure accuracy of such readings. As illustrated, the blocks may then be repeated to provide additional calculations of distance. The blocks 400 may be repeated so long as the processor 130 controls the controller 324 to continue scanning the physical environment. The processing of a reflected signal to determine distance travelled by the signal, and required hardware, was described in more detail above in relation to the operation of the SLRF.
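A simplified, purely illustrative rendering of blocks 400 is provided below; the hardware objects and their methods are hypothetical stand-ins for the components described above and are not an interface defined by this disclosure:

```python
# Hypothetical sketch of blocks 400 in [0049]. The objects (transmitter,
# receiver, tof_ic, motor) stand in for the components described above;
# their methods are assumptions for illustration only.
def scan_once(transmitter, receiver, tof_ic, motor, step_deg: float) -> float:
    transmitter.on()                     # block 402
    transmitter.emit()                   # block 404: signal toward the mirror
    echo = receiver.detect()             # blocks 406-408: reflected signal
    transmitter.off()                    # block 410
    distance_m = tof_ic.distance(echo)   # block 412: TOF -> distance
    motor.rotate(step_deg)               # block 414: advance the mirror angle
    return distance_m

def scan_environment(transmitter, receiver, tof_ic, motor, step_deg: float = 1.0):
    """Repeat single scans over a full rotation (repetition of blocks 400)."""
    angle = 0.0
    while angle < 360.0:
        yield angle, scan_once(transmitter, receiver, tof_ic, motor, step_deg)
        angle += step_deg
```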
[0050] In some embodiments, instead of the processor 130 carrying out calculations to determine the distance travelled by the signal, the TOF IC of the support electronics 322 may calculate the distances based on the elapsed time between emission of the signal 336 and detection of the reflected signal 336'.
[0051] As the mirror 334 rotates about the environment at predetermined angular increments in conjunction with repetitions of the steps described in relation to the blocks 400 (i.e. additional scans of the environment), the processor 130 may be provided with calculated distances to obstacles located along 360 degrees about the system 300. The system 300 may thus provide calculated distances to processor 130 for further processing and for use in mapping the physical environment surrounding the system 300. The calculated distances provided by the system 300 may be used by the processor 130 for use in further AR applications. Specifically, in some embodiments, the processor 130 may be configured to receive a signal from the controller 301 indicating an angle of the mirror 334 and a signal from the TOF IC indicating a calculated distance to an obstacle at that angle, and the processor may be able to process each signal in order to determine the distance from the HMD to an obstacle at a given angle from the HMD's field of view. The processor 130 may further be configured to process signals from additional scans from the scanning system 300, i.e. repetition of the blocks 400, in order to provide a map of the physical environment comprising distances to obstacles at 360 degrees around the HMD.
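The mapping of angle and distance pairs into a plan-view point cloud, as described above, may be sketched as follows; function names and the coordinate convention are illustrative assumptions:

```python
# Sketch of the 2D mapping step described in [0051]: each (mirror angle,
# distance) pair becomes a point in a plan-view point cloud centred on the
# HMD. Names and coordinate convention are assumptions for illustration.
import math

def to_point(angle_deg: float, distance_m: float) -> tuple[float, float]:
    theta = math.radians(angle_deg)
    return (distance_m * math.cos(theta), distance_m * math.sin(theta))

def build_map(samples):
    """samples: iterable of (angle_deg, distance_m) from repeated scans."""
    return [to_point(a, d) for a, d in samples]

print(build_map([(0.0, 2.0), (90.0, 3.5), (180.0, 1.2)]))
```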
[0052] In an alternate embodiment, instead of the transmitter being turned off and on at each repetition of the steps described in relation to blocks 400, the transmitter constantly transmits outbound signals 336 and the processor 130 continuously processes received sensor readings provided by receiver 328. Alternately, the transmitter 326 may constantly transmit output signals 336 and the processor 130 may only process sensor readings periodically or intermittently.
[0053] In further embodiments, instead of the controller merely being activated intermittently, the motor may operate continuously. In such embodiments, the functions of the transmitter 326, controller 324, receiver 328 and processor 130 may be performed at a predetermined frequency, when polled by the processor 130, or continually. In some embodiments, the transmitter 326 is always active to emit signals, but the receiver 328 and controller 324 are only intermittently activated by the processor to perform their functions, such as capturing a reflected signal.
[0054] In some embodiments, the speed of the motor may be varied by controller 301 in order to vary the frequency with which repeated scans according to blocks 400 will map 360 degrees around the system 300.
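As a rough, illustrative relation only, the rate at which a full 360 degree map is refreshed follows from the angular step and the interval between steps; the values below are assumptions:

```python
# Back-of-the-envelope relation implied by [0054]: how fast the mirror steps
# determines how often a full 360-degree map is refreshed. Values are
# illustrative assumptions only.
def full_scans_per_second(step_deg: float, step_interval_s: float) -> float:
    steps_per_revolution = 360.0 / step_deg
    return 1.0 / (steps_per_revolution * step_interval_s)

print(full_scans_per_second(step_deg=1.0, step_interval_s=0.0005))  # ~5.6 Hz
```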
[0055] Referring now to Figs. 5 and 6, shown therein is an alternate embodiment of a scanning system 500 for scanning a physical environment, the scanning system 500 being configured to avoid twisting and rotation of wires 308' during rotation of the mirror 334. In the illustrated embodiment a slip ring 552 is provided for electrically connecting the wiring 308' to the disc 542. The slip ring 552 is rotatably disposed within the aperture 540 of the disc 542. The slip ring 552 is annular and comprises a ring aperture 553 coaxial with the aperture 540 of the disc 542 to permit passage therethrough of the signals 336, 336'. The slip ring 552 is electrically coupled to, but rotationally independent of, the disc 542, enabling electrical coupling between the electronic components of the transceiver module 320 and any components (not shown) which may be mounted to the disc 542 while preventing twisting of the wires 308' during rotation of the disc 542.
[0056] Referring now to Fig. 7, shown therein is an alternate embodiment of a scanning system 600 configured to prevent twisting and rotation of wires during rotation of the scanning system 600. The mirror module 630 comprises a mirror 634 which is tiltably mounted to the disc 342. A tilt actuator 660 is mechanically coupled to the mirror 634 to tilt the mirror 634 toward and away from the disc 342. Tilting the mirror 634 may allow the mirror to redirect and capture the signal 636, 636' upwardly and downwardly toward the physical environment to provide potential 3D scanning and mapping. Wiring 308' may electrically and communicatively couple the tilt actuator 660 via the slip ring 552 to the transceiver module 320, the controller 301, the processor 130, or any other electronic components mounted to the base 350, in order to control and power the tilt actuator 660 to tilt the mirror 634. The angular rotation of the tilt actuator 660 is selectable and determinable. For example, the tilt actuator 660 may be a stepper motor, or it may comprise an optical encoder to provide the angular rotation of the actuator.
[0057] Embodiments described in relation to Fig. 7 may be configured to provide a 3-dimensional map of the physical environment surrounding the system 600. The processor 130 may subsequently correlate the tilt angle of the mirror to a given distance measurement.
Accordingly, the processor 130 may be configured to receive signals indicative of the tilt angle of the mirror, the rotational angle of the mirror with respect to the field of view of the HMD (as previously described), as well as transmitter 326 and receiver 328 readings, and the processor 130 may be configured to further process the received signals in order to generate a three dimensional map of the physical environment surrounding the scanning system 600.
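A purely illustrative sketch of the three-dimensional mapping step is provided below, treating the rotational angle as azimuth and the tilt angle as elevation; the coordinate convention is an assumption:

```python
# Hypothetical sketch of the 3D mapping described in [0057]: the mirror's
# rotational angle acts as azimuth, the tilt angle as elevation, and the
# measured distance as range, yielding one 3D point per reading.
import math

def to_point_3d(azimuth_deg: float, tilt_deg: float, distance_m: float):
    az, el = math.radians(azimuth_deg), math.radians(tilt_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return (x, y, z)

print(to_point_3d(azimuth_deg=45.0, tilt_deg=10.0, distance_m=2.0))
```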
[0058] Referring now to Fig. 8, shown therein is an embodiment of a scanning system 700 for scanning a physical environment wherein the motor rotates non-continuously to prevent over-twisting of wires 708. According to the illustrated system 700, the shaft 704 of a brushless motor 706, more specifically an outrunner motor, is mounted to the base 350 by an armature 730. During rotation, the shaft 704 of the outrunner motor 706 remains stationary relative to the base 350, while the motor 706 rotates. The mirror 734 is coupled to the rotating portion of the motor 706. The wiring 708 attaches to the rotating portion of the motor 706 and electrically couples the motor 706 to the controller 701 in the base 350. In operation, the motor 706 is controlled by the controller 701 to reciprocate the mirror 734 between predetermined first and second rotational angles. The controller 701 may be located proximal the transceiver module 320. The first and second rotational angles may be predetermined and stored in memory at the controller 701. During reciprocation of the mirror, the transmitter 326 may transmit signals for scanning the physical environment surrounding the system 700.
[0059] As the motor 706 rotates between the first and second rotational angles during operation of the system 700, the wires 708 may experience some twisting along with the rotation of the motor 706. By reciprocating between the first and second rotational angles, the wires 708 are prevented from twisting indefinitely in a single direction; instead, the wires 708 wind in the angular direction of the motor 706 as the motor rotates from the first rotational angle to the second, and then unwind (and, in some embodiments, resume winding in the same direction as the rotation of the motor 706) as the motor travels from the second rotational angle back to the first. The scanning system 700 may thereby scan the physical environment without the wires becoming too tangled for successful operation of the system. Thus, stressful twisting of the wires is avoided or mitigated.
[0060] Referring now to Fig. 9, shown therein is a flowchart illustrating blocks 800 relating to steps of operation of the scanning system 700. At block 802 the transmitter 326 is turned on by the processor 130. In some embodiments, operation of the scanning system may be triggered by the processor 130 instructing the controller 324 to begin scanning the physical environment. At block 804 an output signal 336 is emitted from the transmitter towards the mirror 334. The output signal 336 then reflects off the mirror 334 and is directed outwardly from the scanning system 700. At block 806 the output signal encounters an obstacle (not shown) and reflects back towards the mirror 334, the reflected signal being illustrated as reflected signal 336'. At block 808 the return signal 336' is detected and measured by the receiver 328. At block 810 the transmitter is turned off. At block 812 the processor 130 processes sensor readings provided by receiver 328 relating to the detected reflected signal 336', and the processor 130 calculates the distance travelled by the signal between the scanning system 700 and the encountered obstacle. The calculated distance may be provided to processor 130 for use in mapping the physical environment surrounding the system 700 for use in AR and VR applications, as previously described. At block 814 the motor controller 701 determines if the motor has reached either of the first or second rotational angles. If yes, at block 815 the motor controller 701 reverses the direction of rotation of the motor 706 away from whichever of the first and second rotational angles it has reached. If not, at block 816 the motor controller 701 causes the motor 706 to rotate in the same direction as in the preceding cycle. As illustrated, the blocks may then be repeated to provide additional calculations of distance. The blocks 800 may be repeated so long as the processor 130 controls the controller 324 to scan the physical environment. Each of the motions at blocks 815 and 816 may be incremental according to a predetermined interval stored in a memory of the scanning system 700. The processing of a reflected signal to determine distance travelled by the signal, and the required hardware, was described in more detail above in relation to the operation of the SLRF.
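A simplified, illustrative rendering of the reciprocating sweep of blocks 800 is provided below; the limit angles, step size and function names are assumptions, not values from the disclosure:

```python
# Hypothetical sketch of the direction-reversal logic of blocks 814-816 in
# [0060]: the mirror is stepped back and forth between two limit angles so
# the wires 708 never wind indefinitely in one direction. Values are assumed.
FIRST_ANGLE_DEG, SECOND_ANGLE_DEG, STEP_DEG = 0.0, 270.0, 1.0

def next_angle(angle_deg: float, direction: int) -> tuple[float, int]:
    """Advance one step; reverse direction at either limit angle."""
    if angle_deg >= SECOND_ANGLE_DEG and direction > 0:
        direction = -1                    # block 815: reverse at the second angle
    elif angle_deg <= FIRST_ANGLE_DEG and direction < 0:
        direction = +1                    # block 815: reverse at the first angle
    return angle_deg + direction * STEP_DEG, direction

angle, direction = 0.0, +1
for _ in range(5):                        # block 816: continue in the same direction
    angle, direction = next_angle(angle, direction)
    print(angle, direction)
```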
[0061] Although the foregoing has been described with reference to certain specific embodiments, various modifications thereto will be apparent to those skilled in the art without departing from the spirit and scope of the invention as outlined in the appended claims. The entire disclosures of all references recited above are incorporated herein by reference.

Claims

What is claimed is:
1. A system for scanning a physical environment for augmented reality applications, comprising:
a) a base;
b) a disc rotatably mounted to the base, the disc having a co-axial aperture disposed therethrough;
c) a mirror module mounted to the disc and comprising a mirror having a reflective surface facing the aperture at an angle of reflection;
d) a transceiver module mounted to the base, the transceiver module comprising a transmitter configured to emit a signal through the aperture to be reflected by the mirror towards a physical environment and a receiver configured to receive the signal through the aperture once reflected by the mirror from the physical environment; and
e) a drive module mounted to the base, the drive module communicatively connected via wires to a motor controller and mechanically coupled to the mirror module for rotatably driving the mirror without obstructing the aperture.
2. The system of claim 1, wherein the mirror is disposed at an angle of 45 degrees relative to the disc.
3. The system of claim 1, wherein the mirror module comprises a housing, the housing permitting passage of the signal from the mirror to the physical environment.
4. The system of claim 1, wherein the drive module comprises: a motor mounted to the base, a drive shaft rotated by the motor, and a mechanical link mechanically coupled to the drive shaft and the disc to rotate the disc when the motor is driven.
5. The system of claim 1, wherein the transmitter comprises one of: a laser diode and a light emitting diode, and wherein the receiver is configured to receive the signal.
6. The system of claim 1, wherein the transmitter and receiver cooperate to generate sensor readings which are obtained by a processor.
7. The system of claim 6, wherein the processor processes the sensor readings to determine distances between the scanning system and surrounding obstacles in the physical environment.
8. The system of claim 7, wherein the processor generates a map of the physical environment using the distances.
9. The system of claim 7, wherein the processor determines the distances using a time-of-flight determination.
10. The system of claim 7, wherein the processor determines the distances using a phase shift determination.
11. The system of claim 1, wherein the system is mounted to a head mounted display worn by a user occupying the physical environment.
12. The system of claim 1, wherein the system permits continuous rotational scanning of the physical environment while mitigating tangling of wiring.
13. The system of claim 1, wherein the system comprises wires coupled solely to elements that are stationary relative to the base.
14. The system of claim 1, wherein the system is configured to emit the signal at a plurality of angles along a rotational path, track at which angle the signal has been emitted, collect received signals along a plurality of angles, determine the distances to various obstacles corresponding to those angles, and generate a map of the physical environment.
15. The system of claim 1, wherein the system further comprises a stabilizer unit to stabilize the mirror module horizontally.
16. The system of claim 15, wherein the stabilizer unit comprises gimbals.
17. The system of claim 1, wherein the mirror is tiltably mounted to the disc.
18. The system of claim 17, wherein a tilt actuator mechanically couples the mirror to the disc to tilt the mirror toward and away from the disc.
PCT/CA2015/050117 2014-02-18 2015-02-18 Systems and methods for scanning a physical environment for augmented reality and virtual reality applications WO2015123768A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201461941074P 2014-02-18 2014-02-18
US61/941,074 2014-02-18

Publications (1)

Publication Number Publication Date
WO2015123768A1 true WO2015123768A1 (en) 2015-08-27

Family

ID=53877471

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2015/050117 WO2015123768A1 (en) 2014-02-18 2015-02-18 Systems and methods for scanning a physical environment for augmented reality and virtual reality applications

Country Status (1)

Country Link
WO (1) WO2015123768A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10019837B2 (en) 2016-05-31 2018-07-10 Microsoft Technology Licensing, Llc Visualization alignment for three-dimensional scanning
EP3372289A1 (en) * 2017-03-07 2018-09-12 HTC Corporation Method suitable for a head mounted display device and virtual reality system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010048898A (en) * 2008-08-19 2010-03-04 Seiko Epson Corp Optical scanner and image forming apparatus
US20100182667A1 (en) * 2009-01-21 2010-07-22 Seiko Epson Corporation Optical scanning apparatus and image forming apparatus

Similar Documents

Publication Publication Date Title
US11573562B2 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
AU2019208265B2 (en) Moving robot, method for controlling the same, and terminal
JP6796975B2 (en) UAV measuring device and UAV measuring system
CN107992052B (en) Target tracking method and device, mobile device and storage medium
US8958911B2 (en) Mobile robot
US9798322B2 (en) Virtual camera interface and other user interaction paradigms for a flying digital assistant
US20230366985A1 (en) Rotating Lidar
JP4228132B2 (en) Position measuring device
US20150097719A1 (en) System and method for active reference positioning in an augmented reality environment
US8909375B2 (en) Nodding mechanism for a single-scan sensor
JP6994879B2 (en) Surveying system
US10321065B2 (en) Remote communication method, remote communication system, and autonomous movement device
CN105874349B (en) Detection device, detection system, detection method and movable equipment
JP6823482B2 (en) 3D position measurement system, 3D position measurement method, and measurement module
CN109507686B (en) Control method, head-mounted display device, electronic device and storage medium
WO2016168722A1 (en) Magic wand interface and other user interaction paradigms for a flying digital assistant
WO2015048890A1 (en) System and method for augmented reality and virtual reality applications
KR20130015739A (en) Method for autonomous movement based on driver state and apparatus threof
JP2011180103A (en) Three-dimensional distance measuring device and mobile robot
WO2015123768A1 (en) Systems and methods for scanning a physical environment for augmented reality and virtual reality applications
WO2016004537A1 (en) Scanning system and methods therefor
JP2020525956A (en) Display device for computer-mediated reality
KR20140125538A (en) A fully or partially electronic-scanned high speed 3-dimensional laser scanner system using laser diode arrays
CN111583346A (en) Camera calibration system based on robot sweeping field
WO2022141049A1 (en) Laser ranging apparatus, laser ranging method and movable platform

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15752361

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15752361

Country of ref document: EP

Kind code of ref document: A1