WO2016155767A1 - Method and system for determining an orientation of a mobile device - Google Patents


Info

Publication number: WO2016155767A1
Authority: WIPO (PCT)
Application number: PCT/EP2015/056874
Inventors: Asa Macwilliams, Joseph Newman
Assignee: Siemens Aktiengesellschaft

Classifications

    • G01S5/163: Determination of attitude (position-fixing using electromagnetic waves other than radio waves)
    • G06K9/00664: Recognising scenes such as could be captured by a camera operated by a pedestrian or robot
    • G06T7/74: Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06K2009/3225: Special marks for positioning
    • G06T2207/30244: Camera pose

Abstract

In one embodiment, a method for determining an orientation of a mobile device within an environment is disclosed. The mobile device includes an optical sensor or a camera. A plurality of reference markers, each having a defined orientation with respect to the environment, is disposed within the environment. Beacons are supplied in a contactless manner to the mobile device within the environment. The beacons include at least one identity pattern of at least one reference marker, associated with orientation data of said reference marker. By capturing an image of at least one reference marker with the optical sensor, a user of the mobile device initiates the procedure of determining the orientation of the mobile device. The mobile device receives beacons which include an identity pattern of at least one reference marker associated with orientation data of said reference marker. These identity patterns and orientation data delivered by at least one beacon are either cached beforehand by the mobile device or processed simultaneously with the following operation steps. The mobile device retrieves one or more identity patterns matching the captured image. Eventually, the orientation data associated with the retrieved identity pattern are used for determining the orientation of the mobile device.

Description

Method and System for Determining an Orientation of a Mobile Device

TECHNICAL FIELD

The present disclosure relates generally to a method and system for determining an orientation, and more particularly to an orientation system using a mobile device and a reference marker.

BACKGROUND

Currently known mobile devices, such as smart phones, have several sensors (e.g. accelerometers, gyroscopes, and magnetometers) that can be utilized to calculate device motion and orientation. However, the accuracy of these sensors, and of the algorithms used to determine orientation, varies greatly. The variation can be quite large for these sensors in some environments. For example, large metal structures, magnetic anomalies, and the like can render magnetometer data useless in some situations.

It has been suggested to operate orientation sensors so that their variations compensate for each other. For example, a magnetometer may be adversely influenced by magnetic fields within the indoor environment, such as those from electric motors, electronic lighting, displays, or monitors. To compensate for these magnetic anomalies, sensor data from the gyroscope and accelerometer could be used to correct for bad magnetometer data. However, the orientation sensors used in smart phones are relatively low cost, subject to drift, and do not provide a high degree of orientation accuracy.

Alternatively, inertial navigation can be used. Inertial navigation includes first determining a precise orientation of the mobile device and then tracking the orientation of the device using any one or more of the orientation or motion sensors. However, these internal reference techniques still result in errors in calculating orientation and are still subject to drift.

The effectiveness of a system for determining an orientation and/or a position of a mobile device will depend on its ability to offer sufficient accuracy throughout the area of interest, but also on user acceptance, given that tracking people can be controversial because of privacy concerns. An ideal tracking system would be based on an architecture that provides high accuracy and scalability whilst preserving user privacy.

Accordingly, there is a need in the art for a technique to accurately determine and/or compensate an orientation of a mobile device in an indoor environment without modifying the mobile device hardware.

SUMMARY AND DESCRIPTION

Systems and methods in accordance with various embodiments provide means for determining an orientation of a mobile device.

In one embodiment, a method for determining an orientation of a mobile device within an environment is disclosed. The mobile device includes an optical sensor or a camera. A plurality of reference markers, each having a defined orientation with respect to the environment, is disposed within the environment. These reference markers are to be understood as predefined image-based markers which are preferably uniquely designed within the environment and provide unobtrusiveness and clear sight lines.

Within the environment, beacons are supplied in a contactless manner to the mobile device. The beacons include an identity pattern of at least one reference marker associated with orientation data of said reference marker. By capturing at least one image of at least one reference marker with the optical sensor, a user of the mobile device initiates the further procedure of determining an orientation of the mobile device.

The mobile device receives beacons which include an identity pattern of at least one reference marker associated with orientation data of said reference marker. These identity patterns and orientation data delivered by at least one beacon are either cached beforehand by the mobile device or processed simultaneously with the following operation steps.

The mobile device retrieves one or more identity patterns matching the captured image. In case a plurality of reference markers is visible within the captured image, a respective plurality of identity patterns is obtained from said image. A plurality of markers present in an image allows for an improved estimate of the orientation. Eventually, the orientation data associated with the retrieved identity pattern are used for determining the orientation of the mobile device.

According to an embodiment, the determination of the orientation of the mobile device includes estimating an orientation of the mobile device using at least one orientation sensor disposed within the mobile device and compensating the estimated orientation with the orientation data associated with the retrieved identity pattern.
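One way to realize such compensation is to keep a correction offset that is re-estimated at every marker sighting. The sketch below is illustrative Python, not from the patent; the `HeadingFilter` helper and its method names are hypothetical:

```python
def wrap_deg(a):
    """Wrap an angle to the half-open range [-180, 180) degrees."""
    return (a + 180.0) % 360.0 - 180.0

class HeadingFilter:
    """Keep a heading estimate from a drifting orientation sensor and
    re-anchor it whenever a reference marker of known orientation is
    sighted (hypothetical helper, not from the patent)."""

    def __init__(self):
        self.offset = 0.0  # correction added to the raw sensor heading

    def heading(self, raw_sensor_heading):
        """Drift-corrected heading in degrees."""
        return wrap_deg(raw_sensor_heading + self.offset)

    def correct(self, raw_sensor_heading, marker_heading):
        """Trust the marker-derived heading and recompute the offset."""
        self.offset = wrap_deg(marker_heading - raw_sensor_heading)

f = HeadingFilter()
f.correct(raw_sensor_heading=93.0, marker_heading=90.0)  # 3 deg of drift
print(f.heading(95.0))  # subsequent readings are corrected by -3 deg
```

Between sightings the sensor heading is simply shifted by the last offset; each new sighting discards the accumulated drift.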

According to an embodiment, the determination of the orientation of the mobile device includes determining an angle between an image plane of the optical sensor and a plane of the reference marker and considering this angle for determining the orientation of the mobile device. This embodiment provides for the detection of a skewed alignment between a main axis of the mobile device and a main axis of the reference marker, with the resulting skewed representation of the reference marker within the image plane of the mobile device's optical sensor. This skewed representation is detected and the skew angle is taken into account for determining the orientation of the mobile device.

According to an embodiment, the beacons, including an identity pattern of at least one reference marker, are further associated with position data. The position data associated with the retrieved identity pattern are used for determining the position of the mobile device. The availability of precise position data further enables finer-grained pedestrian navigation in complex indoor environments.

According to an embodiment, the orientation data include an orientation of a transmitter of the beacon and an orientation of the reference markers relative to the transmitter. According to a similar embodiment, the position data include a position of the transmitter of the beacon and a position of the reference markers relative to the transmitter. Such advantageous sub-structuring of the data provides for extendable assignment of newly added reference markers.

According to an embodiment, two or more reference markers are captured within one image. A first relative position and a second relative position are determined. The first relative position is the position of the first reference marker relative to the transmitter; the second relative position is the position of the second reference marker relative to the transmitter. A relative spatial relation between the first and the second relative position is determined. This relative spatial relation is taken into consideration for an optimized determination of the orientation of the mobile device.

According to an embodiment, beacons are transmitted by a known industry standard entitled Bluetooth, particularly by an implementation entitled Bluetooth Low Energy. Preferably, beacons are transmitted by a low-power Bluetooth Low Energy (BLE) device.
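The multi-marker orientation estimate described above can be sketched as follows. This is an illustrative Python sketch under stated assumptions: the device's approximate position is already known (e.g. from the beacon), and each marker's bearing relative to the camera axis has been derived beforehand from its pixel column and the focal length; the function name is hypothetical:

```python
import math

def yaw_from_markers(device_xy, observations):
    """Estimate device yaw from two or more markers with known world
    positions. Each observation is ((marker_x, marker_y), bearing),
    where `bearing` is the marker's angle relative to the camera axis.
    Returns the circular mean of the per-marker yaw estimates, radians."""
    sx = sy = 0.0
    for (mx, my), bearing in observations:
        # Absolute direction from the device to the marker in the world
        # frame, minus the direction within the camera frame, gives yaw.
        world_angle = math.atan2(my - device_xy[1], mx - device_xy[0])
        yaw = world_angle - bearing
        sx += math.cos(yaw)
        sy += math.sin(yaw)
    return math.atan2(sy, sx)

# Device at the origin with a true yaw of (pi/2 - 0.1) radians.
true_yaw = math.pi / 2 - 0.1
obs = [((0.0, 10.0), math.pi / 2 - true_yaw),  # marker due north
       ((10.0, 0.0), 0.0 - true_yaw)]          # marker due east
print(yaw_from_markers((0.0, 0.0), obs))
```

Averaging over several markers is what makes the two-marker case more robust than a single sighting.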

BRIEF DESCRIPTION OF THE DRAWING

The objects as well as further advantages of the present invention will become more apparent and readily appreciated from the following description of the preferred embodiments, taken in conjunction with the accompanying drawings, of which:

FIG. 1 shows a simplified perspective view of an orientation system in accordance with an embodiment of the invention;

FIG. 2 shows a block diagram of orientation and/or position data according to an embodiment;

FIG. 3 shows a block diagram of orientation and/or position data according to an alternative embodiment;

FIG. 4 shows a block diagram of orientation and/or position data according to an alternative embodiment; and

FIG. 5 shows a reference marker displayed by a mobile device along with overlaid graphical information.

DETAILED DESCRIPTION

According to some embodiments of the present invention, a technique is described to determine an orientation of a mobile device in an indoor environment without modifying the mobile device hardware and without requiring additional hardware changes within the environment. The present invention provides high orientation accuracy.

The present invention also uses an existing optical sensor or camera and an image processor or image recognition processing unit, which is available in nearly every smart phone that is manufactured today.

The mobile device to be located can include a wide variety of business and consumer electronic platforms such as cellular radio telephones, mobile stations, mobile units, mobile nodes, user equipment, subscriber equipment, subscriber stations, mobile computers, access terminals, remote terminals, terminal equipment, cordless handsets, gaming devices, smart phones, personal computers, personal digital assistants, and the like, all referred to herein as a mobile device.

Each device comprises a processor that can be further coupled to one or two cameras, a keypad, a speaker, a microphone, a display, signal processors, and other features, as are known in the art and therefore not shown or described in detail for the sake of brevity.

In general, components such as processors, memories, and optical interfaces are well known. For example, processing units are known to comprise basic components such as, but not limited to, microprocessors, microcontrollers, memory caches, application-specific integrated circuits, and/or logic circuitry. Such components are typically adapted to implement algorithms and/or protocols that have been expressed using high-level design languages or descriptions, expressed using computer instructions, or expressed using messaging logic flow diagrams. Thus, given an algorithm, a logic flow, a messaging or signaling flow, and/or a protocol specification, those skilled in the art are aware of the many design and development techniques available to implement one or more processors that perform the given logic.

Therefore, the entities shown represent a system that has been adapted, in accordance with the description herein, to implement various embodiments of the present invention. Furthermore, those skilled in the art will recognize that aspects of the present invention may be implemented in and across various physical components, and none are necessarily limited to single-platform implementations. For example, aspects of the present invention may be implemented in any of the devices listed above or distributed across such components. It is envisioned that some portions of the present invention can be implemented in an application previously downloaded to the mobile device.

FIG. 1 is a simplified perspective view of an orientation system in accordance with an embodiment of the present invention. A plurality of reference markers MR1, MR2 is provided within the environment. In the example described herein, the reference markers can be wall-mounted signs, plates, images or patterns in general.

The reference markers preferably are at least partially rectangular in shape, such that at least one edge of the reference marker is oriented to one of two orthogonal angles of the environment.

It should be recognized that any marking that is present within the environment could be used, as long as the reference markers have a defined orientation with respect to the environment. Other examples that can be used as reference markers are ceiling-mounted or wall-mounted light fixtures, clock faces, etc. Preferably, reference markers are uniquely designed within the environment and provide unobtrusiveness and clear sight lines.

In order to determine an orientation of a mobile device MD, a user USR of the mobile device MD aligns an image plane of an optical sensor (not shown) of the mobile device MD with a reference marker MR1. As shown in FIG. 1, the captured image of the reference marker MR1 within the image plane of the optical sensor, e.g. a camera, is simultaneously depicted on a display of the mobile device MD.

The mobile device MD receives beacons sent by a transmitter TRM. In FIG. 1 the contactless transmission of beacons between the transmitter TRM and the receiving mobile device MD is represented by a dotted line. A beacon includes an identity pattern of at least one reference marker MR1, MR2 associated with orientation data of said reference marker MR1, MR2. These identity patterns and orientation data delivered by at least one beacon are either cached beforehand by the mobile device MD or processed simultaneously with the following operation steps.

In a subsequent step, the mobile device MD retrieves an identity pattern matching the captured image of the aligned reference marker MR1. The identity pattern delivered by the beacon preferably includes characteristics of the referred identity pattern which allow device-internal image processing routines to determine a match between a captured image of the aligned reference marker MR1 and the identity pattern provided for this reference marker MR1.
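The patent leaves the concrete matching method open. One possibility is to compare compact pattern hashes by Hamming distance; the following Python sketch is purely illustrative, and the hash values and the `best_match` helper are hypothetical:

```python
def hamming(a, b):
    """Number of differing bits between two 64-bit pattern hashes."""
    return bin(a ^ b).count("1")

def best_match(captured_hash, beacon_patterns, max_distance=10):
    """Return the ID of the advertised pattern closest to the hash of
    the captured marker image, or None if nothing is close enough."""
    best_id, best_d = None, max_distance + 1
    for marker_id, pattern_hash in beacon_patterns.items():
        d = hamming(captured_hash, pattern_hash)
        if d < best_d:
            best_id, best_d = marker_id, d
    return best_id

# Hypothetical 64-bit hashes advertised by a beacon for markers 1 and 2.
patterns = {1: 0xF0F0F0F0F0F0F0F0, 2: 0x123456789ABCDEF0}
print(best_match(0xF0F0F0F0F0F0F0F1, patterns))  # → 1 (one bit differs)
```

A distance threshold avoids matching a marker the beacon never advertised.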

Eventually, the orientation data associated with the retrieved identity pattern are used for determining the orientation of the mobile device MD. Advantageously, the mobile device, aware of a reference marker MR1 identified by a beacon and equipped with a camera observing one or more of these reference markers MR1, MR2, is able to accurately determine its own globally referenced orientation.

According to an embodiment, beacons further comprise position data besides orientation data, both associated with the identity pattern. In this way a listening mobile device MD is enabled to deduce a geo-referenced pose, i.e. its combined position and orientation, without recourse to a large locally hosted database or online lookup.

When no reference markers MR1, MR2 are observed, the on-board inertial sensors and magnetometer will continue to calculate a pose estimate which will gradually be subject to drift. By strategically placing markers where it is likely they will be observed, the drift can be corrected each time a marker is seen.

A beacon sighting may give a user a rough idea of their location; however, without an accurate estimate of orientation, an apparently simple decision such as »left« or »right« cannot be made without recourse to trial and error. By supplementing the beacon with carefully sited markers wherever a user needs to make a significant change in direction, they can be guided accordingly. These locations will usually be where there is existing signage, and this approach has the additional advantage of supplementing existing signs and notices. It is not necessary to permanently provide high-precision tracking, so long as it is available at crucial moments in a user's journey.

According to an embodiment, beacons are transmitted by a known protocol entitled Bluetooth Low Energy, or BLE. The format of the advertising frame differs from known beacons in that the beacons according to an embodiment of the invention include orientation data of reference markers in a vicinity of the mobile device along with a respective identity pattern of such a reference marker. This data is preferably embedded into the advertising frame, thus allowing mobile devices to determine an orientation by aligning with a reference marker while simply listening for advertising messages.

Bluetooth Low Energy advertising packets can have a size of up to 47 bytes, of which 2-39 bytes can be used for the advertising channel protocol data unit, or PDU. The PDU consists of a header having a size of two bytes, a MAC address having a size of six bytes, and the actual payload data having a size of 31 bytes.

A payload data length of 31 bytes is sufficient to express the position and orientation of a beacon transmitter and additionally the positions and orientations of reference markers relative to the beacon.
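The byte budget described above can be summarized as a quick sanity check; this sketch merely restates the sizes given in the text:

```python
# Byte budget of one BLE advertising frame, restating the sizes above.
FRAME_MAX = 47       # whole advertising packet
PDU_HEADER = 2       # advertising-channel PDU header
MAC_ADDRESS = 6      # advertiser address
PAYLOAD = 31         # usable advertising data

# The PDU (header + address + payload) stays within its 2-39 byte range.
assert PDU_HEADER + MAC_ADDRESS + PAYLOAD == 39

TRANSMITTER_POSE = 6  # 3 coordinates at 13 bits + 9-bit azimuth = 48 bits
remaining = PAYLOAD - TRANSMITTER_POSE
print(remaining)  # → 25 bytes left for reference-marker records
```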

Generally, an orientation is a property having 3 degrees of freedom, indicating the direction an object faces and including, for example, pitch, roll and yaw. A position, in general, is a property having 3 degrees of freedom, e.g. x, y, z in a Cartesian coordinate system, defining an object's location in 3D space.

Without limiting the generality, the orientation is alternatively defined by only one rotational degree of freedom according to embodiments of the invention described below.

Most buildings can be described in units with dimensions of approximately 80 meters. Larger buildings, especially campus-like complexes, may exceed these dimensions in depth and width and can be handled as separate blocks. Assuming that an initial location of a mobile device can be narrowed down, using conventional positioning techniques, to cubes of space that are 81.92 meters on a side, the position of a beacon can be expressed to centimeter-level accuracy using 13 bits for each dimension x, y, z in a Cartesian coordinate system. Further, assuming that the beacon transmitter is mounted on a ceiling facing downwards, the orientation may be expressed by only one rotational degree of freedom describing an azimuth.

An azimuth is an angular measurement in a spherical coordinate system. The azimuth defines an angle Θ between a projected main axis of an image plane of the mobile device's optical sensor and a reference vector, e.g. the north vector, on the reference plane.

This angle Θ can be described to less than one degree of accuracy using 9 bits. This means that the position x, y, z and orientation Θ of the beacon transmitter fit neatly into 6 bytes (3 x 13 + 9 = 48 bits = 6 bytes), as shown in FIG. 2. This leaves 25 bytes to describe the orientation and/or position of reference markers.
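Under the stated bit widths, a transmitter pose packs into exactly 6 bytes. The Python sketch below is illustrative only; the field order and the unsigned, offset-free coordinate encoding are assumptions, not specified by the patent:

```python
def pack_pose(x_m, y_m, z_m, azimuth_deg):
    """Pack a transmitter pose into 6 bytes: 13 bits per coordinate
    (centimetre steps inside an 81.92 m cube) and a 9-bit azimuth
    (512 steps over 360 degrees, i.e. about 0.7 degree resolution)."""
    x = round(x_m * 100) & 0x1FFF
    y = round(y_m * 100) & 0x1FFF
    z = round(z_m * 100) & 0x1FFF
    a = round(azimuth_deg * 512 / 360) % 512
    bits = (x << 35) | (y << 22) | (z << 9) | a  # 13 + 13 + 13 + 9 = 48
    return bits.to_bytes(6, "big")

def unpack_pose(raw):
    """Reverse of pack_pose: recover metres and degrees."""
    bits = int.from_bytes(raw, "big")
    return ((bits >> 35) & 0x1FFF) / 100, ((bits >> 22) & 0x1FFF) / 100, \
           ((bits >> 9) & 0x1FFF) / 100, (bits & 0x1FF) * 360 / 512

raw = pack_pose(12.34, 56.78, 2.50, 270.0)
print(len(raw), unpack_pose(raw))  # 6 bytes, centimetre round-trip
```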

Turning now to FIG. 3, in which a block diagram of orientation and/or position data of a reference marker in relation to a beacon transmitter is shown.

Given that the practical range of Bluetooth using BLE is around 10 meters, and vision-based techniques are similarly constrained, the x and y values of the coordinate of a marker relative to a beacon can be expressed within a range of 20.48 meters, i.e. between -10.24 and +10.24 meters, using 11 bits each to achieve centimeter-level accuracy for x and y. In other words, a total of 22 bits is needed to encode a two-dimensional position.

The height z of a wall-mounted reference marker will likely not vary as much as the position in a horizontal plane, and so centimeter-level accuracy, in the range 0 to 10.24 meters, can be provided using 10 bits for z. Therefore a relative three-dimensional position of a marker can be encoded using 32 bits, i.e. 4 bytes, as shown in FIG. 3.

If markers are constrained to have their z-axis pointing upwards, then orientation can be encoded to degree-level accuracy using an additional 9 bits.

An identity pattern ID of 7 bits is used to encode up to 128 distinct reference markers.

This means that each marker requires 6 bytes (32 + 9 + 7 = 48 bits) to describe position, orientation and identity pattern. So a total of four markers can be described relative to the beacon according to FIG. 4.
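Checking the arithmetic of the stated widths, and sketching how a single coordinate could be mapped to its 11-bit field (the offset-binary encoding is an assumption, not specified by the patent):

```python
import math

# Bit widths for one reference-marker record, as stated above.
X_BITS, Y_BITS, Z_BITS = 11, 11, 10  # position relative to the beacon
AZIMUTH_BITS = 9                     # degree-level orientation
ID_BITS = 7                          # up to 128 distinct markers

record_bits = X_BITS + Y_BITS + Z_BITS + AZIMUTH_BITS + ID_BITS
record_bytes = math.ceil(record_bits / 8)
print(record_bits, record_bytes)     # 48 bits, i.e. 6 bytes per marker
print((25 * 8) // record_bits)       # markers fitting in 25 payload bytes

def encode_offset(v_m):
    """Map a coordinate in [-10.24, +10.24) m to an 11-bit value at
    centimetre resolution (offset-binary; an assumed encoding)."""
    return round((v_m + 10.24) * 100) & 0x7FF

print(encode_offset(-10.24), encode_offset(0.0), encode_offset(10.23))
```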

The scheme described above is one potential scheme that happens to fit neatly into the advertising frame and is suited to describing the locations of beacons and the relative poses of markers in buildings with typical dimensions.

Further schemes suiting alternative embodiments may be implemented, depending on different constraints.

According to an embodiment, a number of location hints is encoded using different constraints. For example, arrays of reference markers are readily described relative to one another, differing only in their identity pattern ID and displacement along a wall.

According to a further embodiment, the ideal size of a reference marker will depend on the size of the room or environment. A small meeting room, for example, would need much smaller markers than an atrium or railway station concourse.

According to a further embodiment, an encoding for location hints is based on:

1) Absolute position

2) Absolute orientation

3) Relative position

a. Displacement of the current marker relative to the previous one along each of the x, y and z axes.

b. Displacement along a fixed axis (e.g. the x-axis).

c. Displacement at fixed intervals along a fixed axis (e.g. every meter along the x-axis).

4) Relative scale, describing the scale of a marker relative to the previous one.

5) Relative identity pattern ID; for example, the identity pattern ID of a marker could be 1 greater than the previous marker's identity pattern ID.

6) Relative orientation

a. Orientation of the current marker relative to the previous one around an arbitrary axis.

b. Rotation around a fixed axis (e.g. the z-axis, or whichever is vertically upwards if we consider wall-mounted markers).

c. Rotation by a fixed increment around a fixed axis relative to the previous marker. Given that most walls meet at right angles, 90-degree rotations around a vertical axis would be a good choice.

The precise details of such an encoding scheme depend on the required range, accuracy and reference markers that would be considered necessary for a given application.
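Hints of this kind could be expanded on the device roughly as follows. This Python sketch is illustrative only, combining hint types 3c, 5 and 6c; the field names and the `expand_markers` helper are hypothetical:

```python
def expand_markers(base, count, dx=1.0, d_id=1, yaw_step_deg=0.0):
    """Expand an array of markers from compact relative hints: each new
    marker is displaced `dx` metres along the x-axis from the previous
    one (hint 3c), its identity pattern ID is incremented by `d_id`
    (hint 5), and it is rotated by `yaw_step_deg` around the vertical
    axis (hint 6c)."""
    markers = [dict(base)]
    for _ in range(count - 1):
        prev = markers[-1]
        markers.append({
            "id": prev["id"] + d_id,
            "x": prev["x"] + dx,
            "y": prev["y"],
            "z": prev["z"],
            "yaw": (prev["yaw"] + yaw_step_deg) % 360,
        })
    return markers

# A row of four wall-mounted markers at one-metre spacing.
row = expand_markers({"id": 10, "x": 0.0, "y": 2.0, "z": 1.8, "yaw": 90.0},
                     count=4)
print([m["id"] for m in row], [m["x"] for m in row])
```

A beacon would then only need to advertise the base marker plus the increments, rather than one full record per marker.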

A specific strength of the embodiments disclosed herein becomes evident when considering navigation in complex and potentially stressful settings, such as navigating public transport. Transport planners and architects already place signs in strategic places where travelers have to make navigation decisions. However, these are not always accurate, complete or up-to-date, and must cater to all travelers, whose numerous destinations necessitate very different routes. A given user's destination gives the context which should inform a navigation decision. Therefore a navigation assistance application, such as that shown in FIG. 5, only needs to provide the information that the particular traveler needs.

FIG. 5 shows a reference marker displayed by a mobile device along with overlaid graphical information in a prototype of a pedestrian navigation assistant. A user regarding a road sign captioned Marienplatz, which is simultaneously captured by a camera and displayed on a screen of the user's mobile device, is presented with a number of route suggestions which are displayed by arrows, signs and textual information. The option towards the right shows that the metro train »S7« will leave at 2:51 p.m. and arrive at 3:07 p.m. Alternatively, the user is directed to walk to an underground terminal where the underground line »U5« leaves at 2:52 p.m. and will also arrive at 3:07 p.m.

Such usage of the orientation system according to the disclosed embodiments, including displaying the captured image of the reference marker by the mobile device and overlaying the captured image of the reference marker with graphical information, is a main motivation for a user to align the mobile device with a reference marker in order to obtain an orientation. The enriching graphical information advantageously includes routing information, timing information, and/or direction guidance information.

Advantageously, the present invention provides a technique to determine or to compensate an orientation and/or a position of a mobile device without requiring any modification of hardware within the mobile device or of existing distinct markers within an environment used as reference markers. The present invention uses an existing camera and optical pattern recognition processing, which is available in nearly every smart phone that is manufactured today. The present invention can provide high orientation accuracy using only an internal algorithm along with simple building orientation data from beacon transmitters.

The present invention is particularly suitable for environments such as retail spaces, including shopping centers and malls; mass transit, namely railway, bus and metro stations; and campus-like environments such as universities, technology parks and hospitals. In addition to extending the use of existing location-consuming applications, the availability of precise tracking enables finer-grained pedestrian navigation in complex indoor environments.

It is to be understood that the elements and features recited in the appended claims may be combined in different ways to produce new claims that likewise fall within the scope of the present invention. Thus, whereas the dependent claims appended below depend from only a single independent or dependent claim, it is to be understood that these dependent claims can, alternatively, be made to depend in the alternative from any preceding or following claim, whether independent or dependent, and that such new combinations are to be understood as forming a part of the present specification.

While the present invention has been described above by reference to various embodiments, it should be understood that many changes and modifications can be made to the described embodiments. It is therefore intended that the foregoing description be regarded as illustrative rather than limiting, and that it be understood that all equivalents and/or combinations of embodiments are intended to be included in this description.

CLAIMS:
1. A method for determining an orientation of a mobile device within an environment, the mobile device including an optical sensor, the method comprising the steps of:
- providing a plurality of reference markers being disposed within the environment, the reference markers having a defined orientation with respect to the environment;
- supplying beacons to the mobile device operating within the environment, the beacons including an identity pattern of at least one reference marker associated with orientation data of said reference marker;
- capturing at least one image of at least one reference marker by the optical sensor;
- retrieving, by the mobile device, at least one identity pattern matching at least one captured image;
- using the orientation data associated with the retrieved at least one identity pattern for determining the orientation of the mobile device.
2. The method of claim 1, wherein determining the orientation of the mobile device includes estimating an orientation of the mobile device using at least one orientation sensor disposed within the mobile device and compensating the estimated orientation by the orientation data associated with the retrieved identity pattern.
3. The method of claim 1, wherein determining the orientation of the mobile device includes determining an angle between an image plane of the optical sensor and a plane of the reference marker and considering this angle for determining the orientation of the mobile device.
4. The method of claim 1, wherein orientation data include an orientation of a transmitter of the beacon and an orientation of the reference markers relative to the transmitter.
5. The method of claim 1, wherein the beacons including an identity pattern of at least one reference marker are further associated with position data of said reference marker and wherein the position data associated with the retrieved identity pattern are used for determining the position of the mobile device.
6. The method of claim 5, wherein position data include a position of a transmitter of the beacon and a position of the reference markers relative to the transmitter.
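Claim 6 composes the marker position from two broadcast pieces: the transmitter's own position and the marker's offset relative to it. A trivial vector-addition sketch, with a shared coordinate frame assumed:

```python
def marker_world_position(transmitter_pos, marker_rel_pos):
    """Absolute marker position = transmitter position + marker offset,
    both taken from the beacon payload (shared coordinate frame assumed)."""
    return tuple(t + r for t, r in zip(transmitter_pos, marker_rel_pos))

print(marker_world_position((10.0, 4.0), (-2.0, 1.5)))  # (8.0, 5.5)
```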
7. The method of claim 6, wherein at least two reference markers are captured within one of said at least one image, the method comprising the steps of:
- determining a first relative position of a first reference marker relative to the transmitter and a second relative position of a second reference marker relative to the transmitter;
- determining a relative spatial relation between the first and the second relative position;
- using the relative spatial relation for determining the orientation of the mobile device.
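The claim-7 idea can be sketched as follows: the two broadcast marker positions (relative to the transmitter) define a baseline whose bearing is known in the environment frame, and the observed left-right order of the markers in the image constrains which side of that baseline the device views them from. The coordinate convention (x east, y north, bearings clockwise from north) and the perpendicular-viewing simplification are assumptions for illustration.

```python
import math

def baseline_bearing_deg(m1_rel, m2_rel):
    """Bearing (clockwise from north) of the line from marker 1 to
    marker 2, both positions given relative to the transmitter."""
    dx = m2_rel[0] - m1_rel[0]
    dy = m2_rel[1] - m1_rel[1]
    return math.degrees(math.atan2(dx, dy)) % 360.0

def device_heading_deg(m1_rel, m2_rel, m1_left_in_image=True):
    """Simplified: a camera that sees marker 1 to the left of marker 2
    faces perpendicular to the baseline from one side; the swapped
    order in the image means it faces from the opposite side."""
    bearing = baseline_bearing_deg(m1_rel, m2_rel)
    return (bearing + (-90.0 if m1_left_in_image else 90.0)) % 360.0

print(baseline_bearing_deg((0.0, 0.0), (3.0, 0.0)))  # 90.0 (due east)
print(device_heading_deg((0.0, 0.0), (3.0, 0.0)))    # 0.0 (facing north)
```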
8. The method of claim 1, wherein the beacons are transmitted according to the Bluetooth industry standard, in particular its Bluetooth Low Energy implementation.
9. The method of claim 1, including the step of displaying the captured image of the reference marker by the mobile device, and overlaying the captured image of the reference marker with graphical information.
10. The method of claim 9, wherein the graphical information includes routing information, timing information, and/or direction guidance information.
11. A system for determining an orientation of a mobile device within an environment, the system comprising:
- a plurality of reference markers being disposed within the environment, the reference markers having a defined orientation with respect to the environment;
- a transmitter operable to supply beacons to the mobile device operating within the environment, the beacons including an identity pattern of at least one reference marker associated with orientation data of said reference marker;
- an optical sensor and an image processor disposed within the mobile device and operable to obtain images from the optical sensor, the optical sensor operable to capture an image of at least one reference marker, and the image processor operable to retrieve an identity pattern matching at least one captured image; and
- an orientation engine operable to use the orientation data associated with the retrieved identity pattern for determining the orientation of the mobile device.
PCT/EP2015/056874 2015-03-30 2015-03-30 Method and system for determining an orientation of a mobile device WO2016155767A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2015/056874 WO2016155767A1 (en) 2015-03-30 2015-03-30 Method and system for determining an orientation of a mobile device

Publications (1)

Publication Number Publication Date
WO2016155767A1 (en) 2016-10-06

Family

ID=53724262

Country Status (1)

Country Link
WO (1) WO2016155767A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120062702A1 (en) * 2010-09-09 2012-03-15 Qualcomm Incorporated Online reference generation and tracking for multi-user augmented reality
US20130148851A1 (en) * 2011-12-12 2013-06-13 Canon Kabushiki Kaisha Key-frame selection for parallel tracking and mapping
US20130230214A1 (en) * 2012-03-02 2013-09-05 Qualcomm Incorporated Scene structure-based self-pose estimation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15741863

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15741863

Country of ref document: EP

Kind code of ref document: A1