WO2019148350A1 - Position determination - Google Patents

Position determination

Info

Publication number
WO2019148350A1
WO2019148350A1 · PCT/CN2018/074685 · CN2018074685W
Authority
WO
WIPO (PCT)
Prior art keywords
light
scatter pattern
scatter
received
optical
Prior art date
Application number
PCT/CN2018/074685
Other languages
English (en)
Inventor
Zhen Xiao
Original Assignee
Nokia Technologies Oy
Nokia Technologies (Beijing) Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nokia Technologies Oy and Nokia Technologies (Beijing) Co., Ltd.
Priority to PCT/CN2018/074685 priority Critical patent/WO2019148350A1/fr
Publication of WO2019148350A1 publication Critical patent/WO2019148350A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • Embodiments relate to position determination, particularly, though not exclusively, for use in virtual reality or related systems.
  • Virtual reality is a rapidly developing area of technology in which audio and/or video content is provided to a user device, such as a headset.
  • a virtual space or virtual world is any computer-generated version of a space, for example a captured real world space, in which a user can be immersed through the user device.
  • a virtual reality headset may be configured to provide virtual reality video and/or audio content to the user, e.g. through the use of a pair of video screens and/or headphones incorporated within the headset.
  • Position and/or movement of the user device can enhance the immersive experience.
  • position determination may be required to map a user’s position in a real space to a corresponding position in a virtual space.
  • the position may comprise spatial position and/or orientation.
  • Position determination may also be used to track movement of the user and provide appropriate mappings in real or near real-time.
  • Optical sensors may be used for position determination.
  • Optical sensors convert electromagnetic waves into electrical signals and are often used as part of a larger system, such as a virtual reality system.
  • Optical sensors can be mounted on the user device to track the position and determine the orientation of the user device in real space and map the position and orientation to a virtual space.
  • the specification describes an apparatus comprising: a means for scattering received light to produce a scatter pattern dependent on the angle at which the light is received and a means for determining a position of the apparatus by using the scatter pattern.
  • the scattering means may comprise a body the physical characteristics of which are arranged such as to scatter light dependent on one or both of (i) the angle at which light is received into the body and (ii) the distance of the body from a source of the light, to produce the scatter pattern, the apparatus further comprising one or more optical sensors for sensing the scatter pattern for provision to the position determining means.
  • Each of the one or more optical sensors may be divided into sub-portions configured to detect a respective part or position of the scatter pattern.
  • the body may be formed of a transparent or partially transparent material.
  • the body may comprise an outer surface at least part of which is polished to facilitate light transmission into the body.
  • the body may be shaped so as to cause internal reflection of light transmitted within the body towards the one or more optical sensors.
  • the body may be formed in any one of a cube shape, a cylindrical shape, a hemispherical shape, or other polyhedral shape.
  • the one or more optical sensors may be arranged at or near an outer surface of the body.
  • the body may be substantially formed of a molded material.
  • the body may be substantially formed of one or more cut materials.
  • the body may be substantially formed of a 3D printed material.
  • the body may be formed of or from one or more materials comprising glass, crystal glass, acrylic, polymethyl methacrylate (PMMA) , acrylate or other cured gel material.
  • the body may comprise one or more internal structures configured to scatter the received light.
  • the one or more internal structures may be formed in an M-sequence.
  • the one or more internal structures may be formed randomly.
  • the one or more internal structures may comprise one or more of engravings, bubbles, metallic flakes, optical waveguides, nanoparticles, and split mirrors.
  • the one or more internal structures may comprise one or more optical waveguides formed of a plurality of optical fibres.
  • the position determining means may be configured to compare the produced scatter pattern with at least one stored scatter pattern, wherein the at least one stored scatter pattern is associated with a particular angle or distance in order to determine the angle or distance at which the light is received and determine the position of the apparatus.
  • the one or more optical sensors may comprise a plurality of optical elements arranged as a row or grid, wherein the scatter pattern is determined according to which optical elements sense the received light.
  • the one or more optical sensors may comprise a plurality of optical elements arranged as a row or grid, wherein the scatter pattern is determined according to the detected light intensity at each optical element.
  • the apparatus may further comprise a means for guiding the scattered light from the scattering means to the one or more optical sensors.
  • the apparatus may be provided on a wearable device.
  • the wearable device may be any one of a head set, a head band, a pair of glasses, or a helmet.
  • the apparatus may be provided on a stalk that is mounted on the helmet.
  • the scattering means may be positioned on a first side of the wearable device and the one or more optical sensors may be positioned on a second side of the wearable device.
  • the position determining means may be configured to determine the position of the apparatus in a real space in order to map the apparatus to a position in a virtual space.
  • the specification describes a method comprising: receiving scattered light to determine a scatter pattern dependent on the angle at which the light is received at an apparatus; and determining a position of the apparatus by using the scatter pattern.
  • the scattered light may be received from a body, the physical characteristics of which are arranged such as to scatter light dependent on one or both of (i) the angle at which light is received into the body and (ii) the distance of the body from a source of the light, to produce a scatter pattern which is determined using one or more optical sensors.
  • the position may be determined by comparing the produced scatter pattern with at least one stored scatter pattern, wherein the at least one stored scatter pattern is associated with a particular angle or distance in order to determine the angle or distance at which the light is received from the light source.
  • the one or more optical sensors may comprise a plurality of optical elements arranged as a row or grid, wherein the scatter pattern is determined according to which optical elements sense the received light.
  • the one or more optical sensors may comprise a plurality of optical elements arranged as a row or grid, wherein the scatter pattern is determined according to the detected light intensity at each optical element.
  • the method may further comprise guiding the scattered light from the scattering means to the one or more optical sensors.
  • the method may be performed by a processing apparatus provided on a wearable device.
  • the wearable device may be any one of a head set, a head band, a pair of glasses, or a helmet.
  • the position may represent a position in a real space and may further comprise mapping the position to a position in a virtual space.
  • the specification describes a computer program comprising instructions that, when executed by a computer, control it to perform a method according to any preceding definition.
  • the specification describes an apparatus configured to perform a method according to any preceding definition.
  • the specification describes a system comprising: a means for issuing a beam of light within a real world space; and an apparatus according to any preceding definition, wherein the position determining means is arranged to determine the position of the apparatus within the real world space based on the scatter pattern produced by receiving the beam of light.
  • the light issuing means may issue a scanning beam of light within the space.
  • the light issuing means may issue a scanning beam in a first direction and in a second direction.
  • the specification describes a non-transitory computer-readable storage medium having stored thereon computer-readable code, which, when executed by at least one processor, causes the at least one processor to perform a method, comprising: receiving scattered light to determine a scatter pattern dependent on the angle at which the light is received at an apparatus; and determining a position of the apparatus by using the scatter pattern.
  • the specification describes an apparatus, the apparatus having at least one processor and at least one memory having computer-readable code stored thereon which when executed controls the at least one processor: to receive scattered light to determine a scatter pattern dependent on the angle at which the light is received at the apparatus; and to determine a position of the apparatus by using the scatter pattern.
  • Figure 1 is a perspective view of a virtual reality (VR) system;
  • Figure 2 is a schematic view of an apparatus receiving light from a light issuing means in different directions;
  • Figure 3 is a schematic view showing components of an apparatus used to scatter light according to example embodiments;
  • Figures 4a and 4b are schematic views respectively illustrating examples of light entering the apparatus at different angles according to example embodiments;
  • Figures 5a and 5b are schematic views respectively illustrating examples of light scattering and being sensed by multiple optical sensors at different angles according to example embodiments;
  • Figure 6 is a schematic view showing scattered light travelling through an optical waveguide according to example embodiments;
  • Figures 7a and 7b are schematic views respectively showing the apparatus receiving light at different distances according to example embodiments;
  • Figures 8a to 8d are schematic views respectively showing the apparatus receiving light at different times according to example embodiments;
  • Figure 9 is a schematic view showing the apparatus receiving light from two light sources according to example embodiments;
  • Figure 10 is a flow diagram of operations performed at the apparatus according to example embodiments;
  • Figure 11 is a schematic view of the apparatus or an external processing device;
  • Figure 12 is a perspective view of a virtual reality system including a mounted sensor according to example embodiments;
  • Figures 13a and 13b are, respectively, a portable memory device and a disc storage medium, being examples of portable and removable memory media upon which data used in the performance of one or more embodiments may be stored.
  • Embodiments herein relate to position determination based on optical sensing. Such embodiments can be used in any application where position determination is required, although particular embodiments relate to virtual reality systems.
  • position determination may be required to map a user’s position in a real space to a corresponding position in a virtual space. This may determine what the user sees and/or hears when consuming the virtual data which can be video and/or audio data.
  • the position may refer to spatial position and/or orientation.
  • Position determination may also be used to track movement of the user and provide appropriate mappings in real or near real-time.
  • Optical sensing may be used for position determination.
  • Optical sensors convert electromagnetic waves into electrical signals which are interpreted to determine a position for mapping purposes.
  • Optical sensors can be mounted on a user device, such as a virtual reality headset, to track the position and orientation of the user in real space and map one or both of the position and orientation to a virtual space.
  • Embodiments herein involve methods and systems for scattering received light to produce a scatter pattern dependent on the angle at which the light is received and/or on the distance from the light source, and determining a position of the apparatus by using the scatter pattern.
  • Scattering in this context refers to causing light to deviate from an original trajectory by one or more paths, typically due to a non-uniformity in the medium through which the light passes.
  • scattering usually involves causing a beam of light to deviate into a plurality of paths.
  • the resulting scatter pattern is a detectable pattern resulting from the scattering.
  • the scatter pattern may be detected by one or more light sensors or sub-portions of one or more light sensors.
  • the light may be visible or invisible light, e.g. infra-red light.
  • the light may be issued in the form of one or more beams or blades from one or more light sources.
  • the one or more light beams or blades may be moving, for example by using a scanning source which scans the one or more light beams or blades in a predetermined pattern.
  • Figure 1 is a schematic illustration of a virtual reality display system 1.
  • the virtual reality display system 1 includes a user device 20 and a virtual reality media player 10 for rendering visual content.
  • Figure 1 illustrates the user device 20 as a virtual reality headset but it should be appreciated that the user device could be any other device operable in a virtual reality system, for example, a handheld controller.
  • the user will wear or otherwise carry the user device 20 in a real world space, such as a room, and interact with a virtual space in which the user is immersed, the virtual space being conveyed to the user in the form of video and/or audio data transmitted to, and rendered by, the virtual reality display system 1.
  • a virtual space is any computer-generated version of a space, for example a captured real world space, in which a user can be immersed.
  • the position and/or orientation of the user device can be mapped from the real world space into the virtual space.
  • the user device 20 is a virtual reality headset of any suitable type.
  • the user device 20 may alternatively be a head band, a pair of glasses, or a helmet.
  • the virtual reality media player 10 may be part of a separate device which is connected to the user device 20 by a wired or wireless connection.
  • the virtual reality media player 10 may include a games console, or a Personal Computer (PC) configured to communicate visual data to the user device 20.
  • the virtual reality media player 10 may form part of the user device 20.
  • the virtual reality media player 10 may comprise a head mount display, TV, mobile phone, smartphone or tablet computer configured to play content through its display.
  • the virtual reality media player 10 may be a touchscreen device having a large display over a major surface of the device, through which video content can be displayed.
  • the virtual reality media player 10 may be inserted into a holder of the user device 20.
  • a smart phone or tablet computer may display the video data which is provided to a user’s eyes via respective lenses in the headset 20.
  • the virtual reality display system 1 may also include hardware configured to convert the device to operate as part of the virtual reality display system 1.
  • the virtual reality media player 10 may be integrated into the user device 20.
  • the virtual reality media player 10 may be implemented in software.
  • a device comprising virtual reality media player software is referred to as the virtual reality media player 10.
  • the user device 20 may be configured to output one or both of virtual reality video and audio content to a user.
  • the user device 20 may be a handheld device and the position and orientation of the handheld device can be mapped from the real world space into the virtual space.
  • the virtual reality display system 1 may include means for determining the spatial position and/or orientation of the user’s head. Over successive time frames, a measure of movement may therefore be calculated and stored. Such means may comprise part of the virtual reality media player 10. Alternatively, the means may comprise part of the user device 20.
  • the user device 20 may incorporate motion tracking sensors which may include one or more of gyroscopes, accelerometers and structured light systems. These sensors generate position data from which a current visual field-of-view (FOV) is determined and updated as the user changes position and/or orientation.
  • the current visual FOV may also be termed a “Viewport” as will be used hereafter.
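  • As a rough illustration of how such orientation data might drive the viewport, the following Python sketch wraps and clamps head angles; all names and the 90-degree field of view are assumptions for illustration and are not from the specification.

```python
from dataclasses import dataclass

@dataclass
class Viewport:
    yaw_deg: float    # horizontal look direction
    pitch_deg: float  # vertical look direction
    fov_h_deg: float  # horizontal field of view
    fov_v_deg: float  # vertical field of view

def update_viewport(yaw_deg: float, pitch_deg: float) -> Viewport:
    # Wrap yaw into [-180, 180) and clamp pitch so the user cannot
    # look beyond straight up or straight down.
    yaw = (yaw_deg + 180.0) % 360.0 - 180.0
    pitch = max(-90.0, min(90.0, pitch_deg))
    return Viewport(yaw, pitch, fov_h_deg=90.0, fov_v_deg=90.0)
```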
  • Figures 2a-2c, which are useful for understanding example embodiments, show a light issuing means 21 issuing a beam of light 22 into a real world space 15 within which a user 17 is located.
  • the user 17 will likely move in terms of head orientation and possibly in translation within the real world space.
  • the user 17 may for example be exploring a virtualised world, consuming an event such as a concert, or playing an interactive game.
  • the beam of light 22 may be received by one or more sensors 24 which may be carried by the user 17, for example by being mounted on the user device 20.
  • the one or more sensors 24 according to example embodiments may be arranged to scatter the received light to produce a scatter pattern, the produced pattern depending on the position of the sensor in the real world space 15 and which is detected and processed to determine a user position for mapping to the virtual space.
  • the light issuing means 21 may hereafter be termed a ‘lighthouse’ in that it is an apparatus which may be mounted within or adjacent the real world space 15.
  • the lighthouse 21 may issue one or more beams or blades of light 22, for example using one or more laser diodes or similar light emitters.
  • the emitted beam or blade of light 22 may be infra-red and therefore not visible to the human eye.
  • Figures 2a-2c show a lighthouse 21 issuing a moving beam or blade of light 22, which is received by the sensor 24 at different respective angles due to the user 17 changing orientation.
  • Embodiments herein provide a sensor 24 comprised of a body that may produce a different scatter pattern for the different orientation angles, thereby making it possible to determine at least the user’s orientation based on the produced scatter pattern.
  • Figure 2a shows the moving beam or blade of light 22 entering the sensor 24 at a first angle, with Figures 2b and 2c showing said beam entering the sensor at second and third angles respectively.
  • the spatial position of the sensor 24 in the real world space 15 may also be determined based on the scatter pattern produced.
  • the lighthouse 21 is arranged to provide a moving beam or blade of light 22 that may follow a predetermined two- or three-dimensional path within the real world space 15.
  • the lighthouse 21 may provide movable emitters, or it may comprise a means for redirecting emitted light, such as by means of a movable mirror positioned adjacent the emitters.
  • the lighthouse 21 may cause the beam or blade of light 22 to sweep back and forth in the manner of a raster pattern to cover substantially the volume of the real world space 15, and then repeat the pattern.
  • the lighthouse 21 may cause a two-dimensional blade of light to sweep across the real world space 15 horizontally, and then repeat vertically.
  • a plurality of lighthouses 21 may be used. Each lighthouse 21 may generate one or more light beams or blades. However, a single lighthouse 21 may be sufficient for tracking a user’s movement in three or six degrees of freedom (3DOF or 6DOF) for present embodiments.
  • each lighthouse 21 may comprise two rotors that sweep a linear beam or blade of light 22 across the volume of the real world space on orthogonal axes.
  • the linear beam or blade of light 22 may broaden in terms of its width or cross-sectional area as it travels away from the lighthouse 21.
  • the lighthouse 21 may emit a calibrating pulse or flash of light before each sweep or scan of the beam or blade of light 22.
  • the lighthouse 21 may cause a flash of omnidirectional light to be output.
  • the sensor 24 may receive the calibrating pulse or flash of light and, sometime later, receive the subsequently issued linear beam or blade of light 22.
  • a position determining means which may be part of the sensor 24 or part of an external system, may determine the time delay between receiving the calibrating pulse or flash of light and receiving the beam or blade of light 22. The time duration or time delay is used to calculate the spatial position of the sensor 24, and therefore the user 17, in the volume of the real world space 15.
  • This may be by means of, for example, issuing a vertical sweep of the real world space 15 to determine a first axis position or offset, and then a horizontal sweep of the real world space to determine a second axis position or offset.
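  • The time-delay calculation described in the preceding bullets can be illustrated with a short Python sketch; the sweep rate, timestamps and function name are invented for illustration and are not from the specification.

```python
def axis_angle_from_sweep(t_flash: float, t_hit: float,
                          sweep_rate_deg_per_s: float) -> float:
    # Angular offset on one axis: the sweeping beam or blade turns at a
    # known rate, so the delay after the calibrating flash encodes an angle.
    return sweep_rate_deg_per_s * (t_hit - t_flash)

# Two orthogonal sweeps give two angular offsets which, together with a
# known lighthouse pose, constrain the sensor's bearing in the room.
horizontal = axis_angle_from_sweep(0.0, 0.00417, 3600.0)   # ~15 degrees
vertical = axis_angle_from_sweep(0.010, 0.0125, 3600.0)    # 9 degrees
```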
  • the position of the sensor 24 can then be mapped to a corresponding virtual reality world space.
  • the position determining means can also be used to determine the orientation of the apparatus, as will be explained below, based on the received scatter pattern produced by the sensor 24.
  • a six degrees-of-freedom (6DoF) virtual reality system is one where the user is able to freely move in Euclidean space and rotate their head in the yaw, pitch and roll axes.
  • Six degrees-of-freedom virtual reality systems and methods enable the provision and consumption of volumetric virtual reality content.
  • the user device 20 of some embodiments incorporates one or more of said optical sensors 24 for capturing light from one or more beams or blades of light 22 issued from one or more lighthouses 21. We assume in some embodiments that only one lighthouse 21 is needed.
  • the user device 20 incorporates or carries a sensor 24 with a means for scattering the received light to produce a scatter pattern dependent on one or both of the angle at which the light is received and the distance of the sensor from the lighthouse 21, and a position determining means for determining the position of the apparatus using the scatter pattern, which may be its orientation and/or its spatial position.
  • the particular scatter pattern produced by the scattering means is dependent on the angle at which light enters the scattering means. In some embodiments, the particular scatter pattern produced by the scattering means is also dependent on the distance of the scattering means from the lighthouse 21. Therefore, one or both of the orientation and spatial position of the one or more sensors 24, and therefore the user device 20, can be determined based on the scatter pattern. The above-mentioned use of a calibrating flash may therefore not be necessary in some embodiments.
  • the position determining means may form part of the scattering means or may be part of an external system which receives signals from the scattering means, for example over a wired or wireless channel.
  • the position determining means may be a computer system, for example comprised of one or more processors and/or controllers, for performing a method to be described later on.
  • the processors and/or controllers may operate under the control of software.
  • Light scattering generally refers to the physical process where light is forced to deviate from a straight trajectory by one or more paths due to localized non-uniformities in the medium through which the light passes.
  • Figure 3 illustrates a sensor 100 according to an embodiment.
  • the sensor 100 includes a light scattering means 102 which has physical characteristics to scatter a received beam or blade of light 108, dependent at least on the angle at which the beam or blade is received into its body, to produce a scatter pattern.
  • the sensor 100 may further include one or more light sensors 106 for sensing the produced scatter pattern for provision to a position determining means 107.
  • the position determining means 107 may be part of, or separate from, the sensor 100 and has the role of decoding signals produced by the light sensors 106 to derive a position or an indication of position.
  • the sensed scatter pattern may for example be represented as a digital signature comprised of one or more bits per spatial position, whereby the digital signature is compared with a look-up-table or metadata indicating the respective position information.
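  • A minimal sketch of such a look-up, assuming the signature is a bit string with one bit per sensing position; the table contents and names are invented for illustration.

```python
# Hypothetical table mapping a scatter-pattern signature to position
# information (here an orientation angle in degrees and a distance in
# metres), prepared in advance.
SignatureTable = dict[str, tuple[float, float]]

reference_table: SignatureTable = {
    "010010000": (15.0, 2.0),
    "000100001": (30.0, 2.5),
}

def decode_position(signature: str, table: SignatureTable):
    # Exact match against the stored look-up-table; None means the
    # produced pattern matches no stored reference.
    return table.get(signature)
```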
  • the light scattering means 102 may include a body 103 and the one or more light sensors 106 may comprise any suitable type of optical sensor, such as infra-red sensors.
  • the body 103 is transparent to facilitate light transmission into the body and to allow at least some light to pass through the body and onto the one or more light sensors 106.
  • the position determining means 107 of the sensor 100 may have improved performance, for example in terms of positional accuracy.
  • the body 103 may be only partially transparent.
  • the body 103 may be translucent or partially translucent.
  • At least part of an outer surface 109 of the body 103 is polished to facilitate improved light transmission into the body. In some embodiments, the entire outer surface 109 of the body is polished.
  • the body 103 of the sensor 100 may be shaped so that internal reflection may cause the beam or blade of light 108 transmitted within the body to be reflected towards the one or more optical sensors 106.
  • the passage of the beam or blade of light 108 towards the light sensors may therefore be direct, indirect or a combination thereof.
  • the body 103 of the sensor 100 may be shaped in any suitable three-dimensional form, for example as a cube shape, a cylindrical shape, a hemispherical shape, or other polyhedral shape to facilitate multiple reflections and/or refractions of light within the body.
  • the one or more optical sensors 106 of the sensor 100 may be arranged at or near the outer surface 109 of the body 103, thus ensuring that at least an appropriate amount of the scattered light is received by the one or more optical sensors 106 to facilitate position determination, and to reduce or minimise lost light.
  • the one or more optical sensors 106 may be somewhat remote from the body 103, with the scattered light being redirected towards said sensors, for example by an optical channel, e.g. using optical fibres.
  • the body 103 of the sensor 100 may be formed of a molded material, such as glass, crystal glass, acrylic or acrylate, for example polymethyl methacrylate (PMMA) .
  • the body 103 of the sensor 100 may be formed of another cured gel material.
  • the body 103 of the sensor 100 is formed of one or more materials that have been cut into shape.
  • the body 103 of the sensor 100 is formed by three-dimensional (3D) printing and the body is formed of a 3D printed material.
  • the body 103 of the sensor 100 may also be formed of a mixture of different types of material and/or formed using a plurality of the above methods.
  • Embodiments may therefore also comprise methods and systems for producing or forming a body 103 for a sensor 100 described herein, for example in order to produce a scatter pattern from a received beam or blade of light 108 from which position can be determined.
  • the body 103 of the sensor 100 may comprise one or more internal features or structures 104, typically physical non-uniformities, configured to scatter the received light transmitted inside the body to facilitate the creation or formation of scatter patterns. For example, a single beam of light 108 may be scattered by a physical non-uniformity to produce two or more beams of light, deviating from the original trajectory, of lower intensity.
  • the term scattering structure 104 is used hereafter to refer to said one or more features or structures.
  • the scattering structure 104 is formed in a predetermined and repeatable way to improve the repeatability of forming structures with the same scattering characteristics, for example in other substantially similar sensors.
  • the scattering structure 104 may be formed in an M- sequence structure, for example.
  • the body 103 may comprise a scattering structure 104 that has been formed randomly, for example using randomly distributed physical non-uniformities.
  • the scattering structure 104 may comprise, for example, one or more of engravings, bubbles, metallic flakes, optical waveguides, nanoparticles and split mirrors.
  • an optical waveguide may comprise a plurality of optical fibres.
  • a plurality of physical non-uniformities, which may comprise any of the above, is shown arranged in a grid-like manner, whereby the received beam of light 108 is split and reflected within the body 103 by said scattering structure 104 towards the optical sensors 106, where the resulting scatter pattern is captured.
  • the scattering structure 104 is arranged such that the scatter pattern produced is at least dependent on the angle at which the beam of light 108 enters the body 103. Therefore, if the beam of light 108 were to enter the body 103 at a different angle to that shown, then a different splitting and/or reflection characteristic would result to produce a different scatter pattern than that shown. It will be appreciated, therefore, that the role of the position determining means 107 is to map the received and detected scatter pattern to a corresponding position, in this case orientation of the sensor 100 and therefore that of the user 17.
  • This may require a learning operation in advance of consumption of the virtual content, or may be programmed in advance.
  • the position determining means 107 in use may compare the produced scatter pattern with a set of pre-provided or pre-learned reference scatter patterns stored in memory. These may be derived in a prior calibration or learning stage.
  • the stored scatter patterns are associated with respective particular positions, e.g. orientation angles. For example, a set of metadata or a look-up-table of correspondences may be stored. If a received scatter pattern and a stored reference scatter pattern match, then the position and/or orientation of the device associated with the matching reference scatter pattern can be identified and used for mapping as part of a virtual reality application.
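  • Since a sensed pattern may be noisy, the comparison need not be exact; the sketch below matches a produced signature to the nearest pre-learned reference pattern by Hamming distance. The tolerance and data layout are assumptions for illustration.

```python
def hamming(a: str, b: str) -> int:
    # Number of differing bits between two equal-length signatures.
    return sum(x != y for x, y in zip(a, b))

def nearest_reference(signature: str, references: dict,
                      max_distance: int = 1):
    # references: signature -> position (e.g. an orientation angle),
    # derived in a prior calibration or learning stage.
    if not references:
        return None
    best = min(references, key=lambda ref: hamming(signature, ref))
    if hamming(signature, best) <= max_distance:
        return references[best]
    return None  # no sufficiently close stored pattern
```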
  • the one or more optical sensors 106 may be arranged in a specified formation, for example as a two-dimensional grid or matrix, although any suitable two-dimensional arrangement may be used. In some embodiments, a one-dimensional row of sensors may be sufficient. Each optical sensor 106 may be sub-divided into sub-portions, each for detecting a respective part or position of the produced scatter pattern. The scatter pattern may be determined according to which optical sensors 106, or sub-portions thereof in the specified formation, sense the scattered light. The determination may be a binary one, i.e. to produce a binary signature corresponding to the one or more optical sensors 106, or sub-portions thereof, which detect said light for a given received beam or blade of light 108.
  • a quantisation or filtering means may be associated with the one or more optical sensors 106 to avoid classifying a particular optical sensor 106 as receiving light unless the intensity of said light is above a certain threshold, for example to avoid mis-sensing due to ambient light.
  • each optical sensor 106, or sub-portion thereof may produce a signal or multi-bit digital signature representing the intensity of the light received at said sensor or sub-portion position.
  • Each sensing position may have a respective storage bin.
  • most if not all of the optical sensors 106, or sub-portions thereof may sense some amount of light of differing or variable intensity depending on the scatter pattern produced.
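  • The binary and multi-bit signatures described in the preceding bullets might be derived as in this sketch; the threshold and quantisation scheme are illustrative assumptions.

```python
def binary_signature(intensities: list[list[float]],
                     threshold: float) -> str:
    # One bit per optical element or sub-portion: 1 if the sensed
    # intensity is at or above the threshold (suppressing ambient
    # light), else 0.
    return "".join("1" if v >= threshold else "0"
                   for row in intensities for v in row)

def quantised_signature(intensities: list[list[float]],
                        levels: int = 4) -> list[int]:
    # Multi-bit variant: quantise each element's intensity into one of
    # `levels` bins, one bin value per sensing position.
    flat = [v for row in intensities for v in row]
    top = max(flat) or 1.0
    return [min(levels - 1, int(levels * v / top)) for v in flat]
```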
  • the user device 20 may comprise a means for guiding the scattered light from the scattering structure 104 to the one or more optical sensors 106 which may be remote from the sensor 100.
  • the light guiding means may be an optical channel or waveguide.
  • one or more optical fibres may be used.
  • the user device 20 may be, or may be provided on, a wearable device.
  • the wearable device may be, for example, a head set, a head band, a pair of glasses, or a helmet.
  • the user device is provided on an arm or stalk that is mounted on the helmet.
  • the sensor 100 is provided on a first side of the user device 20, e.g. a wearable device, and the one or more optical sensors 106 are positioned on a second side of the wearable device.
  • the first side may be the front of the wearable device 20 and the second side may be a reverse side of the wearable device, or vice versa.
  • the first and second side may respectively refer to the left and right sides of the wearable device 20.
  • a plurality of sensors 100 may be provided on the wearable device 20.
  • the position determining means 107 may be configured to determine the position or the orientation of the apparatus in a real space so that the position and orientation of the apparatus can be mapped to a position in a virtual space.
  • Figures 4a and 4b show a beam or blade of light 208, 308 entering another example sensor 200 at two different angles z and y, activating first and/or second different optical elements 206a, 206b of an associated optical sensor 206.
  • Figure 4a shows the beam or blade of light 208 entering the body of the apparatus at angle z.
  • the beam or blade of light 208 is then scattered by at least part of the scattering structure 204 and activates the second optical element 206b.
  • Figure 4b shows the beam or blade of light 208 entering the body of the apparatus at angle y, different from angle z.
  • the beam or blade of light 208 is then scattered by at least part of the scattering structure 204 in a different way to that shown in Figure 4a. This thereby activates the first optical element 206a of the optical sensor 206.
  • the first and second optical elements 206a, 206b are activated, i.e. generate a signal, when they sense received light (e.g. light scattered from the body) to be at or above a predetermined threshold, e.g. a light intensity threshold.
  • the activation of the first and/or the second optical elements 206a, 206b represents in digital form the scatter pattern that is unique to the angle at which light beam or blade 208 is incident to the sensor body.
  • the scatter pattern can be compared to one or more stored reference scatter patterns that are associated with a particular angle.
  • the optical element may be activated when it senses a characteristic of the received and scattered light to be below a predetermined threshold.
  • Figures 5a and 5b show a further example sensor 400 whereby a beam or blade of light 408, 508 enters the sensor body at different respective angles z and y, and the resulting scatter patterns are sensed by multiple optical sensors 410, 412, 414, each comprising multiple optical elements.
  • each optical sensor 410, 412, 414 comprises nine optical elements arranged in a 3 x 3 grid.
  • the scattered light produces multiple scatter patterns 410’, 412’, 414’ detectable by the respective optical sensors 410, 412, 414 as shown.
  • the detected scatter patterns 410’, 412’, 414’ may be combined to provide a more accurate encoding of the particular angle z, i.e. using more bits, which is therefore less likely to result in misinterpretation at the position determining means 107.
  • Figure 6 shows in another example a scattered beam or blade of light 608, travelling from a sensor 600, which may comprise any of the aforementioned sensors, to an optical waveguide 610.
  • the optical waveguide 610 may for example be one or more optical fibres.
  • the optical waveguide 610 is arranged to direct the scattered light to an optical sensor 606, which may comprise any of the aforementioned optical sensors, whereby the created scatter pattern is captured and decoded as before.
  • the optical waveguide 610 can be used to guide the scattered light from the sensor body 602, which may be positioned at a first location of the apparatus, to the optical sensor 606, which may be positioned at a second location, possibly remote from the sensor body 602. Therefore, they do not have to be in close proximity.
  • the optical waveguide 610 may be configured to transmit the scattered light at low loss, for example by using total internal reflection.
  • Figures 7a and 7b, relating to another example embodiment, respectively show a beam or blade of light 712, 712’, issued from a lighthouse 708, entering a sensor 700 at different distances 710, 710’ from said lighthouse, activating first and/or second different optical elements 706a, 706b of an associated optical sensor 706.
  • Figure 7a shows the beam or blade of light 712, whose width or cross-section may naturally increase with distance from the lighthouse 708, entering the body 702 of the sensor 700 at a first distance 710.
  • the beam or blade of light 712 is then scattered by at least part of the scattering structure 704 to produce an associated first scatter pattern.
  • the scattered light activates the first optical element 706a but it will be appreciated that a scatter pattern of higher resolution or complexity may be produced in other embodiments, for example a multi-bit representation of light intensity for each sensor position.
  • Figure 7b shows the beam or blade of light 712’ entering the body 702 of the sensor 700 at a second distance 710’, different from the first distance 710.
  • the width or cross-section of the beam or blade of light 712’ as it enters the body 702 of the sensor 700 is inevitably different by virtue of said sensor being closer to the lighthouse 708.
  • the beam or blade of light 712’ is then scattered by at least part of the scattering structure 704 in a different way to that shown in figure 7a to produce a different scatter pattern.
  • the scattered light activates the second optical element 706b of the optical element 706, but it will be appreciated that a scatter pattern of higher resolution or complexity may be produced in other embodiments, for example a multi-bit representation of light intensity for each sensor position.
  • the first and second optical elements 706a, 706b may be activated, i.e. generate an electrical signal, when they sense received light (e.g. light scattered from the body) at or above a predetermined threshold, e.g. a predetermined light intensity threshold.
  • the activation of the first and/or second optical elements 706a, 706b represents a simple form of digital scatter pattern that is unique to the particular distance between the lighthouse 708 and the body 702 of the sensor 700. As the distance between the lighthouse 708 and the body increases, the width or cross-sectional area of the light beam or blade 712, 712’, incident to the body 702, increases.
  • the width or cross-sectional area of the light beam or blade 712, 712’ incident to the body will vary with distance and, therefore, the scatter pattern will vary. This can be decoded by the position determining means 107 to determine the spatial distance from the lighthouse 708 and therefore derive an indication of where in the real world space the sensor is located.
  • the scatter pattern may be used to determine spatial position.
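  • As a concrete (assumed) model of this distance dependence, a beam with exit width w_0 and a roughly constant full divergence angle θ would arrive at distance d with width approximately as below; the symbols are illustrative and do not appear in the specification.

```latex
w(d) \approx w_0 + 2\,d\,\tan(\theta / 2)
```

  A wider incident beam or blade therefore illuminates more of the scattering structure and yields a different scatter pattern.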
  • Figures 8a to 8d relate to another embodiment, showing how a scanning beam or blade of light 812, 812’, 814, 814’ issued from a lighthouse 808 will produce respective scatter patterns at times t1, t2 and t3, t4 for different distances 810, 810’ to indicate spatial distance from the lighthouse 808.
  • a sensor 802 formed of a plurality of side-by-side sub-sensors 802a, 802b, 802c, 802d is provided.
  • the sensor 802 may be formed large enough to receive the beam or blade of light 812, 812’, 814, 814’ at spaced-apart times t1, t2.
  • each sub-sensor body includes a different scattering structure.
  • Figure 8a shows a scanning beam or blade of light 812 entering the sensor 802 at a first time t1, which produces a first scatter pattern 820a.
  • Figure 8b further shows the scanning beam or blade of light 812’ at a subsequent time t2, which produces a second scatter pattern 820b.
  • the time between times t1 and t2 is known and the resulting patterns 820a, 820b can be mapped by the position determining means 107 to the distance 810 using any method referred to above.
  • Figure 8c shows a scanning beam or blade of light 814, which may be at a subsequent scan time, entering the sensor 802 at a third time t3 (corresponding in scan period to the first scan time t1) to produce a third scatter pattern 822a.
  • Figure 8d further shows the scanning beam or blade of light 814’ at a subsequent time t4 (corresponding in scan period to the second scan time t2) which produces a fourth scatter pattern 822b.
  • the time between times t3 and t4 is known and the resulting patterns 822a, 822b can be mapped by the position determining means 107 to the distance 810’ using any method referred to above.
  • the width or cross-section of the scanning beam or blade of light 812, 812’ as it enters the sensor 802 at the first distance 810 is inevitably different from the width or cross-section of the scanning beam or blade of light 814, 814’ as it enters the sensor 802 at the second distance 810’. Therefore the scatter pattern 820a generated at the first time t1 will be different from the scatter pattern 822a generated at the third time t3. Similarly, the scatter pattern 820b generated at the second time t2 will be different from the scatter pattern 822b generated at the fourth time t4.
  • Scatter patterns 820a and 820b are unique to a first particular distance 810 between the lighthouse 808 and the sensor 802 and scatter patterns 822a and 822b are unique to a second particular distance between the lighthouse 808 and the sensor 802.
  • the width or cross-sectional area of the light beam or blade 812, 812’, 814, 814’ incident to the body will vary with distance and, therefore, the scatter pattern will vary, which can be decoded by the position determining means 107 to determine the spatial distance from the lighthouse 808.
  • the scatter patterns may be in the form of a multi-bit digital signature representing the intensity of the light received at the body, thereby further increasing the resolution of the scatter pattern digital representation.
  • Figure 9, relating to another example embodiment, shows a first lighthouse 908 and a second lighthouse 910 separated by a particular known distance L.
  • Each lighthouse issues a beam or blade of light 912, 914 at respective angles α, β, such that said beams or blades of light enter a sensor 900.
  • the first and second beams or blades of light 912, 914 activate a first optical element 906a of an associated optical sensor 906.
  • the activation of the first and/or second optical elements 906a, 906b represents a simple form of digital scatter pattern, unique to the particular receipt of the beam or blade of light 912, 914 at the first angle α and at the second angle β. Therefore, the first angle α and the second angle β can be determined by the position determining means 107 by comparing the generated scatter pattern with one or more stored scatter patterns.
  • the distance L as shown can be determined and known to the position determining means (for example, the position determining means is programmed with distance L) .
  • the distance R as shown can be determined using triangulation techniques, for example by combining the determined angles with the known baseline L (the formula itself is not reproduced in this text; a standard reconstruction is sketched below). As such, the position determining means can determine the position of the apparatus.
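  • A hedged reconstruction, assuming α and β are the beam angles measured from the baseline joining the two lighthouses and R is the distance from the first lighthouse 908 to the sensor 900 (the law of sines; this exact form is an assumption, not quoted from the specification):

```latex
R = \frac{L \sin \beta}{\sin(\alpha + \beta)}
```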
  • Figure 10 is a flow diagram illustrating, in accordance with an example embodiment, operations that may be performed by the position determining means 107 of any above embodiment.
  • the operations may be performed using software, hardware or a combination thereof. Certain operations may be omitted, added to, or changed in order. Numbering of operations is not necessarily indicative of processing order.
  • a first operation 1001, performed by the position determining means 107, comprises receiving scattered light to determine a scatter pattern dependent on the angle at which the light is received at an apparatus.
  • a second operation 1002 comprises determining a position of the apparatus by using the scatter pattern.
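  • Taken together, operations 1001 and 1002 might be outlined as below, reusing the illustrative helpers from the earlier sketches; this is a hedged outline under those same assumptions, not the specification's implementation.

```python
def determine_position(intensities, references, threshold=0.5):
    # Operation 1001: reduce the received, scattered light to a
    # scatter-pattern signature (binary_signature, defined earlier).
    signature = binary_signature(intensities, threshold)
    # Operation 1002: determine the position of the apparatus by
    # comparing the signature with stored reference patterns
    # (nearest_reference, defined earlier).
    return nearest_reference(signature, references)
```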
  • Figure 11 is a schematic view of an apparatus 1100 providing a position determining means 107 of any above embodiment.
  • the apparatus 1100 may have a processor 1101 and a memory 1104 coupled to the processor and comprising RAM 1102 and ROM 1103. It may further comprise a network interface 1110, a display 1108 and one or more hardware keys 1116.
  • the apparatus 1100 may comprise one or more such network interfaces 1110 for connection to a network, e.g. using Bluetooth or similar.
  • the one or more network interfaces 1110 may also be for connection to the internet, e.g. using WiFi or similar.
  • the processor 1101 is connected to each of the other components in order to control operation thereof.
  • the memory 1104 may comprise a non-volatile memory, a hard disk drive (HDD) or a solid state drive (SSD) .
  • the ROM 1103 of the memory stores, amongst other things, an operating system 1112 and may store one or more software applications 1114.
  • the RAM 1102 of the memory 1104 may be used by the processor 1101 for the temporary storage of data.
  • the operating system 1112 may contain code which, when executed by the processor, implements the operations as described above with reference to Figure 10.
  • the processor 1101 may take any suitable form. For instance, it may be a microcontroller, plural microcontrollers, a processor, or plural processors and it may comprise processor circuitry.
  • Figure 12 shows a user-end apparatus comprising the user device 20 and the virtual reality media player 10 shown in Figure 1.
  • the user device 20 has the Figure 3 sensor 100 mounted on a stalk 1200 which is attached to part of the user device.
  • the position determination means may be provided by the virtual reality media player 10, in addition to its other functions which may include retrieving video and/or audio data from a network and rendering said retrieved data for output via the user device 20.
  • Figures 13a and 13b show tangible non-volatile media, respectively a removable memory unit 137 and a compact disc (CD) 139, storing computer-readable code which, when run by a computer, performs methods according to embodiments described above.
  • the removable memory unit 137 may be a memory stick, e.g. a USB memory stick, having internal memory 138 storing the computer-readable code.
  • the memory 138 may be accessed by a computer system via a connector 140.
  • the CD 139 may be a CD-ROM or a DVD or similar. Other forms of tangible storage media may be used.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention relates to an apparatus and a method for position determination. The apparatus may comprise a means for scattering received light to produce a scatter pattern dependent on the angle at which the light is received. The apparatus may further comprise a means for determining a position of the apparatus using the scatter pattern. The apparatus and method may be used, in one example, to determine the position of a user device in a real world space for mapping to a virtual space. The apparatus and method may be useful in other applications.
PCT/CN2018/074685 2018-01-31 2018-01-31 Position determination WO2019148350A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/074685 WO2019148350A1 (fr) 2018-01-31 2018-01-31 Position determination

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/074685 WO2019148350A1 (fr) 2018-01-31 2018-01-31 Position determination

Publications (1)

Publication Number Publication Date
WO2019148350A1 (fr) 2019-08-08

Family

ID=67479117

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/074685 WO2019148350A1 (fr) 2018-01-31 2018-01-31 Position determination

Country Status (1)

Country Link
WO (1) WO2019148350A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1997024682A1 * 1995-12-27 1997-07-10 Romanik, Carl, J., Jr. Precision optical system for monitoring the position and orientation of an object
US6630915B1 (en) * 1999-01-26 2003-10-07 Lsa. Inc. Wireless transmission system for transmitting data to a simulation system user
US20150097775A1 (en) * 2013-10-03 2015-04-09 Samsung Display Co., Ltd. Method and apparatus for determining the pose of a light source using an optical sensing array
WO2015075720A1 (fr) * 2013-11-21 2015-05-28 Elbit Systems Ltd. Système médical de suivi optique
CN107040990A (zh) * 2017-03-31 2017-08-11 成都理想境界科技有限公司 Anti-occlusion dual-base-station positioning system, positioning network and positioning terminal


Similar Documents

Publication Publication Date Title
US11875012B2 (en) Throwable interface for augmented reality and virtual reality environments
US11954808B2 (en) Rerendering a position of a hand to decrease a size of a hand to create a realistic virtual/augmented reality environment
US11392212B2 (en) Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments
US10866632B2 (en) Interaction engine for creating a realistic experience in virtual reality/augmented reality environments
US11666825B2 (en) Mixed reality gaming system
US8933912B2 (en) Touch sensitive user interface with three dimensional input sensor
US10777016B2 (en) System and method of enhancing user's immersion in mixed reality mode of display apparatus
  • CN204537054U Embeddable motion sensing control device
US9551914B2 (en) Illuminator with refractive optical element
  • CN110546595B Navigating holographic images
US20160231821A1 (en) Display with built in 3d sensing capability and gesture control of tv
US20190094955A1 (en) Range finding and accessory tracking for head-mounted display systems
US10652525B2 (en) Quad view display system
US10013065B2 (en) Tangible three-dimensional light display
  • WO2019148350A1 (fr) Position determination
  • CN116740253B Ray tracing method and electronic device
US11830515B2 (en) Methods and systems for visualizing audio properties of objects
US20220256137A1 (en) Position calculation system
US12032746B2 (en) Systems and methods of creating a realistic displacement of a virtual object in virtual reality/augmented reality environments
  • KR20210054089A Motion learning apparatus using interactive content

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18904161

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18904161

Country of ref document: EP

Kind code of ref document: A1