WO2014128507A2 - A mobile indoor navigation system - Google Patents

A mobile indoor navigation system

Info

Publication number
WO2014128507A2
Authority
WO
WIPO (PCT)
Prior art keywords
navigation
user
communication device
markers
steps
Prior art date
Application number
PCT/GB2014/050554
Other languages
French (fr)
Other versions
WO2014128507A3 (en)
Inventor
Benjamin Daniel STURGESS
James Lee BURROWS
Original Assignee
Fox Murphy Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fox Murphy Limited filed Critical Fox Murphy Limited
Priority to GB1513966.0A priority Critical patent/GB2525531A/en
Publication of WO2014128507A2 publication Critical patent/WO2014128507A2/en
Publication of WO2014128507A3 publication Critical patent/WO2014128507A3/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves


Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

A mobile indoor navigation system comprises: • a mobile communication device incorporating a user interface; a display for navigating a user from a first location to a second location; and a camera; • a number of passive navigation markers disposed at a number of locations; at least one of which in use is viewed by the camera as the marker is in the field of view of the camera; and • a processor which compares signals representative of the viewed passive navigation markers with signals stored on a database and displays navigation information dependent upon the comparison.

Description

The invention relates to a mobile indoor navigation system.
Background to the Invention

Prior art embodiments known to the applicants are wireless local area networks (Wi-Fi) and Bluetooth triangulation systems. However, both of these systems require indoor electronic hardware to function.
Also known to the applicant is satellite navigation technology which uses global positioning systems in order to navigate roads or the like. The disadvantage of this technology is that it has not been adapted to work in an indoor environment in order to guide a user to a target destination within premises.
The following prior art documents are acknowledged: US2013/0045751, US2010/0092034, US2008/0137912, JP2009222682 and KR 10-2012-0116202. None of the prior art documents identified addresses the problems which the invention tackles, let alone envisages the solutions proposed by the applicant.
US 2013/0045751 A1 relies on the detection of ordinary, irregularly disposed logos, such as the brands on a store's façade. It provides a very rough estimate of location, but specific real time guidance cannot be achieved in this prior art configuration.
US 2010/0092034 concerns tags disposed on the wall of a building and the determination of an individual's position through the analysis of deformation of a tag. It does not provide real time guidance or navigation to an individual.
US 2008/0137912 concerns a barcode based system for determining a user's position. The configuration relies on barcodes mounted on a wall and a camera configured to simulate a user's view. It does not provide readable real time navigation to a user.
JP 2009-222682 provides visual indicators which may be detected by a camera for the identification of objects, in very much the same way as Braille provides a means to communicate an object's purpose to the visually impaired.
In a first broad independent aspect, the invention provides a mobile indoor navigation system comprising:
• a mobile communication device incorporating a user interface; a display for navigating a user from a first location to a second location; and a camera;
• a number of passive navigation markers disposed at a number of locations; at least one of which in use is viewed by said camera as said marker is in the field of view of said camera; and
• a processor which compares signals representative of said viewed passive navigation markers with signals stored on a database and displays navigation information dependent upon said comparison.
This configuration provides a system by which a user may navigate premises such as a supermarket in order to efficiently locate desired products. It particularly allows a user to hold a mobile communication device in an ordinary interfacing position, i.e. with the user holding the device in the palm of his/her hand, whilst the camera captures, in preferred embodiments, ceiling or floor images where passive markers are disposed. Preferably the passive navigation markers are disposed at regular intervals, which provides real time updates of a map displayed for the user's guidance. In other words, indoor navigation has been simplified when compared to the prior art embodiments discussed above, where the user would be required to identify the logos or barcodes and in effect point to them, in a manner rendering the simultaneous display of navigation information impractical. Whilst simplifying the user's interaction with the navigation system, the only electronic device required may be the communication device. This may be a phone, tablet or an electronic trolley. No other hardware may be required within the premises to allow the system to function. In a preferred embodiment, the system incorporates no additional hardware.
Preferably, said mobile communication device may be configured to scan for prominent features and/or one or more particular highlights in the front or rear facing cameras' field of view and may be configured to track their movement across the field of view, thus identifying lateral movement of the communication device in relation to the environment.
In a preferred embodiment, the device will be configured to scan to identify right-angles.

In a further subsidiary aspect, the device will be configured to scan to identify marks which may, for example, comprise geometric shapes.
In a further subsidiary aspect, the device comprises means for identifying lights and/or light fittings. This may be achieved by identifying an area with a characteristic corresponding to a light. In a further subsidiary aspect, the device is configured to identify a static line, pipe, or cabling in order to determine a direction of travel.
Preferably, said processor determines and assesses the distance between a passive navigation marker in the field of view and the mobile communication device; and displays navigation information dependent upon said assessment. This configuration allows accurate positioning of a user within the premises and therefore accurate navigation towards a destination. Preferably, the mobile communication device incorporates a screen and a camera in substantially the same plane as the screen; and the device's screen remains in a plane facilitating the user's viewing whilst capturing images of a passive marker located on a substantially horizontal plane such as a ceiling or a floor.
Preferably, said processor determines and assesses one or more angles between a passive navigation marker in the field of view and the mobile communication device; and displays navigation information dependent upon said assessment. This configuration allows accurate positioning, and preferably orientation, of a user within the premises and therefore accurate navigation towards a destination. Preferably, said mobile communication device incorporates an accelerometer and the selection of the navigation information is at least partly dependent on values derived from said accelerometer. This provides the advantage of more accurately determining the position of a user in comparison to the use of the processor alone. Preferably, said mobile communication device locks the exposure of said camera in order to most effectively read said markers. This configuration allows the processor to effectively compare signals representative of the viewed passive navigation markers with signals stored on a database in variable light conditions. Preferably, said mobile communication device will automatically lock the exposure of the camera to best suit the reading of the markers in variable lighting conditions.
Preferably, said communication device utilises both front and rear facing cameras. This allows markers positioned on the ceiling, floor or wall to be identified by the mobile communication device.
Preferably, said front and rear facing cameras are utilised simultaneously. This configuration provides the advantage of allowing the mobile communication device to identify navigational markers on the ceiling, floor or wall of premises at the same time.
Preferably, said mobile communication device incorporates a compass and the selection of the navigation information is at least partly dependent on values derived from said compass. This provides the advantage of more accurately determining the position of a user in comparison to the use of the processor alone.
Preferably, said mobile communication device incorporates an accelerometer and/or a gyroscope which are configured to identify displacement characteristics and compare displacement characteristics against pre-determined displacement characteristics; whereby a mode of displacement may be determined. Embodiments of this configuration are particularly advantageous since the system avoids any reliance on a compass, which may be prone to errors due to electro-magnetic interference.
Preferably, said mobile communication device incorporates an accelerometer and/or a gyroscope which are configured to determine the orientation of the device and display navigation information in accordance with said determined orientation.
Preferably, said mobile communication device determines its orientation dependent upon whether a passive navigation marker is in the field of view of said camera; and dependent solely upon data obtained from said gyroscope and/or accelerometer if no passive navigation marker is present in the field of view.
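As an illustrative sketch of this orientation fallback (the function name, units and integration step are assumptions rather than anything specified in the patent), the heading used for the display could be selected as follows:

    def current_heading(marker_heading, last_known_heading, gyro_yaw_rate, dt):
        # If a passive navigation marker is in the field of view, trust the
        # marker-derived orientation; otherwise integrate the gyroscope yaw
        # rate from the last known heading (radians and seconds assumed).
        if marker_heading is not None:
            return marker_heading
        return last_known_heading + gyro_yaw_rate * dt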
Preferably, said mobile communication device incorporates a pedometer. This configuration, coupled with the orientation detection, can allow map details to be accurately updated in real time to provide improved indoor navigation.
Preferably, said mobile communication device records a user's walking movement in order to determine the user's walking profile and compares his profile against pre-determined profiles.
Preferably, at least one of said passive navigation markers is asymmetric; whereby the position of the user relative to the marker is determined. This configuration provides the advantage of determining in which direction a user is travelling. As a navigation marker is asymmetric, at any point when it is in the field of view of said camera the processor may determine whether the user is travelling towards or away from the marker.
Preferably, said asymmetric navigation markers are located on the ceiling, floor or wall of a building. This enables the front or back facing camera of the mobile communication device to position the user accurately within the premises. Preferably, said navigation markers are held apart from said ceiling or floor by one or more fasteners. Preferably, said navigation markers can be visible solely in the infra-red spectrum. This configuration allows accurate detection by the camera of the mobile communication device, where detection in other parts of the spectrum may be difficult.
Preferably, the initial location of said user can be selected from a menu in said user interface, in order to determine the location of said user. This provides the advantage of allowing the user to select their initial location if this cannot be otherwise determined using the navigation system.
Preferably, global position satellite (GPS) data is used for positioning said user in an outdoor environment.
Preferably, a predetermined list of product destinations may be stored within said navigation system. This allows a user to compile a list of products, such as a shopping list, that upon entering the premises can be mapped using said navigation system in order to formulate the optimum path by which a user can travel to efficiently locate all sought after products.
Preferably, said processor launches said navigation system dependent on the detection of a Quick Response (QR) code. This configuration allows the launching of the processor without the requirement for additional hardware within the building. The QR code can also direct the user online, if they have not already done so, to download and install the navigation system. Additionally, said QR codes may generate the latest special offers to be shown on a user's mobile device each time they launch the system. Optionally, said user interface overlays said navigation information display over the on-screen camera view of said mobile communication device. This configuration allows the user to see the optimum path generated laid over the image seen by the mobile device, so that they may more easily be able to locate a product. Preferably, the navigation information may be overlaid on a floor map rather than the camera view. This will take the shape of a line to follow, which updates as the user moves to new detectable locations.
Figure 1 shows the interaction between a mobile device and said asymmetric navigation markers. Figure 2 shows the measurement of the pitch, roll and yaw angles of the communication device and the marker position in the field of view, used to calculate the lateral distance of the device from the marker as a polar co-ordinate.
Figure 3 shows a user holding a communication device.
Figure 1 shows a mobile communication device, indicated by 1, that interacts with asymmetric navigation markers placed below 2 or above 3, horizontally spaced between 3 to 10 meters from each other, within premises. Each marker incorporates a number of shapes spatially arranged so that the marker may not be superimposed on its mirror image. Thus from any direction the view of the marker is different, therefore allowing the navigation system to determine in which direction the user is travelling. In a preferred embodiment, the marker incorporates two or more rectangles. In a further preferred embodiment, the marker incorporates a first rectangle at 90° from a second rectangle. In a further preferred embodiment the markers are infrared markers and the camera is adapted to operate in the infrared spectrum. The markers are located on either the floor, ceiling or other high level fixing point such as structural beams or lighting rigs in said premises. The view frustum (field of view) of the camera is shown at 4.
Figure 2 shows the process of calculating the lateral distance and angle as a polar coordinate 5 from the marker 3, by comparing the pitch, roll, and yaw angle of the communication device 1 with the height 6 and the position of the marker 3 in the communication device camera view frustum (field of view) 4. Figure 3 shows the user holding their communication device 1, the camera field of view 4 and the live map interface 7. This position allows the user to comfortably view the screen in order to readily follow navigation instructions whilst at the same time allowing the camera to capture the requisite images. In that sense, the operation of navigation is simplified to the extent that the user needn't be aware of the position of the markers.
As a user enters the premises, the mobile indoor navigation system is launched on the mobile communication device 1. Initially, the navigation system can be installed on the mobile communication device 1 by scanning a QR code upon entry to the premises.
Upon launching of the navigation system, the mobile device 1 will determine the location of the user based on GPS positioning. This may be used to determine which store/building they are in, and which map and database to use. If this fails, however, the user can select their location through a menu contained in the navigation system. Optionally, the user may sign into an account associated with the premises, such as an online shopping account, or loyalty card account.
Once the navigation system is launched, the user may select a desired item that is located within the premises. The navigation system will subsequently search for the location of the chosen item within a database of item locations for the premises. The navigation system will plot the location of the chosen item on an on-screen map 7 of the premises, generated on the mobile device 1. The navigation system will provide a path for the user to follow from their current position to the location of the desired item. The navigation system will track the user's location as they travel towards the item destination, providing feedback related to the user's position relative to the target location of the item, using the asymmetric navigation markers 2, 3 and the field of view 4 of the front or rear camera of the mobile device 1. Alternatively, the navigation system may indicate useful information, such as special offer information, to the user as they travel through the premises. This useful information may be based on known purchasing patterns of the user, obtained through the account associated with the premises. Alternatively, the user may pre-load a list of desired items located within the premises. The navigation system will determine an optimised route, by locating the next nearest item on the user's list, for the user to follow in order to efficiently locate all desired items. Alternatively, the user may scan a QR code as they enter the premises which is associated with a special offer or other point of interest. The navigation system will then guide the user to the location of the special offer or point of interest in the same manner as described above. If the navigation system is not yet installed on the mobile device, scanning of the QR code will direct them to an online location where they will be able to download and install the navigation system.
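The "next nearest item" routing described above amounts to a greedy nearest-neighbour ordering of the user's list. A minimal sketch, assuming item coordinates in metres drawn from the premises' item-location database (the names and sample data are illustrative, not the patent's):

    import math

    def greedy_route(start_xy, item_locations):
        # Repeatedly visit the nearest remaining item on the user's list.
        remaining = dict(item_locations)   # item name -> (x, y) in metres
        position, route = start_xy, []
        while remaining:
            nearest = min(remaining,
                          key=lambda name: math.dist(position, remaining[name]))
            route.append(nearest)
            position = remaining.pop(nearest)
        return route

    # Example: a three-item list starting from the store entrance at (0, 0).
    print(greedy_route((0.0, 0.0),
                       {"milk": (30.0, 5.0), "bread": (8.0, 2.0), "tea": (12.0, 9.0)}))
    # -> ['bread', 'tea', 'milk']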
In a preferred embodiment, detection of the user's location is determined through the detection of visual asymmetric navigation markers 2, 3 that are mounted at ceiling level, on the floor, or on another high level fixing point of the premises. Each asymmetric navigation marker represents a unique location within the premises and may be detected by the front or rear camera of the mobile device. When an asymmetric navigation marker passes into the field of view of the front or rear camera of the mobile device 1, its unique location is determined and compared against a map and an associated database of each unique location held internally within the navigation system. From this, the user's location can be determined and plotted on a schematic map of the premises. As the asymmetric marker moves through the field of view 4 of the front or rear camera, the user's speed and direction can be determined. The accuracy of this may be increased by employing the accelerometer and/or magnetic compass of the mobile device in conjunction with the navigation system.
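The speed and direction estimate mentioned above can be sketched by differencing successive device positions derived from the marker's movement through the field of view (an illustration only; the function name and the metre/second units are assumptions):

    import math

    def speed_and_heading(prev_xy, prev_time, curr_xy, curr_time):
        # Difference two successive device positions obtained from the
        # marker's movement through the camera's field of view.
        dx = curr_xy[0] - prev_xy[0]
        dy = curr_xy[1] - prev_xy[1]
        dt = curr_time - prev_time
        speed = math.hypot(dx, dy) / dt   # metres per second
        heading = math.atan2(dy, dx)      # radians in the premises' map frame
        return speed, heading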
In a further embodiment, a pedometer might be employed to measure the number of steps taken by the user in order to better predict the location of a user.
In a preferred embodiment, the user's position relative to the target item destination would be presented as a schematic map of the premises with the optimum route plotted on the map. Alternatively, navigation information can be presented as a compass-style arrow which points in the direction of the target item destination. Alternatively, navigation information can be presented over the on-screen view of the camera, in a 3-dimensional presentation, known as Augmented Reality (AR). Additionally, when the user is travelling, navigation information, in the form of a map or otherwise, would rotate and move in relation to the direction in which the user is facing. In a preferred embodiment, the following processing steps are followed:
Stage 1: Marker Detection
The front or rear facing cameras of the communication device are initialised and may preferably be set to constantly scan for markers using the ArUco augmented reality library.
The optical processing via the ArUco system is as follows:
• Marker detection
• Apply an adaptive thresholding so as to obtain the borders of markers.
• Find contours. After this step, not only the real markers are detected but also a lot of undesired borders. The process then aims to filter out the unwanted borders.
• Remove borders with a small number of points.
• Polygonal approximation of the contour, preferably keeping the convex contours with exactly 4 corners (i.e., rectangles).
• Sort corners, preferably in an anti-clockwise direction.
• Remove rectangles that are too close to each other. This is required because the adaptive thresholding normally detects both the internal and external parts of the marker's border. At this stage, the system preferably keeps the most external border.
Stage 2: Marker Identification
• Remove the projection perspective so as to preferably obtain a frontal view of the rectangle area using a homography.
• Threshold the area, preferably using Otsu's method. Otsu's algorithm assumes a bimodal distribution and finds the threshold that maximises the extra-class variance while keeping a low intra-class variance.
• Identification of the internal code: if it is a marker, then in preferred embodiments it has an internal code. The marker is preferably divided into a 6x6 grid, of which the internal 5x5 cells contain ID information. The rest corresponds to the external black border. Here, we first check that the external black border is present. Afterwards, we read the internal 5x5 cells and check whether they provide a valid code (it might be required to rotate the code to obtain the valid one).
The marker identification value is preferably stored in memory and the centre of the marker is preferably identified in relation to the pixel dimensions of the camera frame as a percentage in X and Y, referred to as MarkerPos_X and MarkerPos_Y, with 0%,0% being the top left of the frame.
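As a small illustration of expressing the marker centre as the percentages described above (the function and variable names are assumptions, not the patent's):

    def marker_centre_percent(corners, frame_width, frame_height):
        # corners: four (x, y) pixel coordinates of the detected marker.
        centre_x = sum(pt[0] for pt in corners) / 4.0
        centre_y = sum(pt[1] for pt in corners) / 4.0
        marker_pos_x = 100.0 * centre_x / frame_width   # MarkerPos_X, 0% at the left edge
        marker_pos_y = 100.0 * centre_y / frame_height  # MarkerPos_Y, 0% at the top edge
        return marker_pos_x, marker_pos_y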
Stage 3: Relative Position Calculation

If the communication device is flat and level, with the camera facing straight up or down, then the lateral position of the communication device relative to the marker can be identified as follows.
The frustum of the camera field of view is calculated for both X and Y by:

FOV_X = 2 * tan(AngleOfView_X / 2) * (MarkerHeightFromGround - AveragePhoneHeightFromGround)
FOV_Y = 2 * tan(AngleOfView_Y / 2) * (MarkerHeightFromGround - AveragePhoneHeightFromGround)
Once the frustum is obtained, the total cross-section area of the viewable ceiling or floor is determined.
The percentage positioning of the marker in the camera frame is combined with the FOV values to calculate the lateral position:

LateralPosition_Y = (FOV_Y * (MarkerPos_Y / 100)) / 2
LateralPosition_X = (FOV_X * (MarkerPos_X / 100)) / 2
As the communication device will likely not be completely flat, the accelerometer values are used to calculate the angle of the phone to the ground in Pitch (Y) and Roll (X). These Pitch (Ay) and Roll (Ax) values are used to compensate for the angle at which the communication device is being held, via a gain factor calibrated during installation:

LateralPosition_X_corrected = LateralPosition_X - (Ax * gainfactor)
LateralPosition_Y_corrected = LateralPosition_Y - (Ay * gainfactor)
• This helps in differentiating the preferred embodiment from prior art embodiments, which tend to use the deformation and extrinsics of the marker to calculate lateral position, rather than the accelerometer data from the mobile communication device.
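Putting the Stage 3 formulas together, a minimal sketch; angles are in radians, and the field-of-view angles, heights and gain factor are installation-specific assumptions:

    import math

    def lateral_position(marker_pos_x, marker_pos_y,        # marker centre, % of frame
                         angle_of_view_x, angle_of_view_y,  # camera field of view, radians
                         marker_height, phone_height,       # heights above ground, metres
                         pitch, roll, gain_factor):         # device tilt and calibration gain
        # Width/height of the ceiling (or floor) patch visible to the camera.
        fov_x = 2 * math.tan(angle_of_view_x / 2) * (marker_height - phone_height)
        fov_y = 2 * math.tan(angle_of_view_y / 2) * (marker_height - phone_height)

        # Percentage position of the marker in the frame combined with the FOV values.
        lateral_x = (fov_x * (marker_pos_x / 100.0)) / 2
        lateral_y = (fov_y * (marker_pos_y / 100.0)) / 2

        # Compensate for the device not being held perfectly flat (Ax = roll, Ay = pitch).
        lateral_x_corrected = lateral_x - (roll * gain_factor)
        lateral_y_corrected = lateral_y - (pitch * gain_factor)
        return lateral_x_corrected, lateral_y_corrected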
Stage 4: Real World Position Calculation

• With the marker ID identified, the system preferably looks up its real world position via a lookup table and then uses the LateralPosition_X_corrected and LateralPosition_Y_corrected values to calculate the offset, and thus the communication device's real world position in X, Y and Z (floor).
• MarkerRealWorldPosition_X + LateralPosition_X_corrected
• MarkerRealWorldPosition_Y - LateralPosition_Y_corrected
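Stage 4 then reduces to a table lookup plus the corrected offsets; a sketch with an illustrative (made-up) marker table:

    MARKER_TABLE = {
        17: {"x": 12.0, "y": 48.5, "floor": 1},   # marker ID -> surveyed real world position
        18: {"x": 18.0, "y": 48.5, "floor": 1},
    }

    def real_world_position(marker_id, lateral_x_corrected, lateral_y_corrected):
        entry = MARKER_TABLE[marker_id]
        # Apply the corrected lateral offsets, following the signs in the bullets above.
        x = entry["x"] + lateral_x_corrected
        y = entry["y"] - lateral_y_corrected
        return x, y, entry["floor"]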

Claims

1. A mobile indoor navigation system comprising:
• a mobile communication device incorporating a user interface; a display for navigating a user from a first location to a second location; and a camera;
• a number of passive navigation markers disposed at a number of locations; at least one of which in use is viewed by said camera as said marker is in the field of view of said camera; and
• a processor which compares signals representative of said viewed passive navigation markers with signals stored on a database and displays navigation information dependent upon said comparison.
2. A mobile indoor navigation system according to claim 1, wherein said device is configured to scan for distinctive features in addition to said passive navigation markers.
3. A mobile indoor navigation system according to claim 2, wherein said device is configured to track the movement of said additional distinctive features.
4. A system according to either claim 2 or claim 3, wherein said distinctive features comprise one or more of the following: right-angles, light fittings, pipes, cabling.
5. A system according to any of the preceding claims, wherein said processor determines and assesses the distance between a passive navigation marker in the field of view and the mobile communication device; and displays navigation information dependent upon said assessment.
6. A system according to any of the preceding claims, wherein said processor determines and assesses one or more angles between a passive navigation marker in the field of view and the mobile communication device; and displays navigation information dependent upon said assessment.

7. A system according to any of the preceding claims, wherein said mobile communication device incorporates an accelerometer and/or a gyroscope and the selection of the navigation information is at least partly dependent on values derived from said accelerometer and/or gyroscope.

8. A system according to any of the preceding claims, wherein said mobile communication device incorporates a compass and the selection of the navigation information is at least partly dependent on values derived from said compass.

9. A system according to any of the preceding claims, wherein said mobile communication device incorporates an accelerometer and/or a gyroscope which are configured to identify displacement characteristics and compare displacement characteristics against pre-determined displacement characteristics; whereby a mode of displacement may be determined.

10. A system according to any of the preceding claims, wherein said mobile communication device locks the exposure of said camera in order to most effectively read said markers.

11. A system according to any of the preceding claims, wherein said communication device utilises both front and rear facing cameras.

12. A system according to claim 11, wherein said front and rear facing cameras are utilised simultaneously.

13. A system according to any of the preceding claims, wherein said mobile communication device incorporates an accelerometer and/or a gyroscope which are configured to determine the orientation of the device and display navigation information in accordance with said determined orientation.

14. A system according to any of the preceding claims, wherein said mobile communication device determines its orientation dependent upon whether a passive navigation marker is in the field of view of said camera; and dependent solely upon data obtained from said gyroscope and/or accelerometer if no passive navigation marker is present in the field of view.
15. A system according to any of the preceding claims, wherein said mobile communication device incorporates a pedometer.
16. A system according to any of the preceding claims, wherein said mobile communication device records a user's walking movement in order to determine the user's walking profile and compares this profile against pre-determined profiles.
17. A system according to any of the preceding claims, wherein at least one of said passive navigation markers is asymmetric; whereby the position of the user relative to the marker is determined.

18. A system according to claim 17, wherein said visible asymmetric navigation markers are located on the ceiling or floor of a building.
19. A system according to any of the preceding claims, wherein said navigation markers are held apart from said ceiling or floor by one or more fasteners.
20. A system according to any of the preceding claims, wherein said navigation markers are visible solely in the infra-red spectrum.
21. A system according to any of the preceding claims, wherein the initial location of said user is selected from a menu in said user interface, in order to determine the location of said user.
22. A system according to any of the preceding claims, wherein global position satellite (GPS) data is used for positioning said user in an outdoor environment.
23. A system according to any of the preceding claims, wherein a predetermined list of product destinations may be stored within said navigation system.
24. A system according to any of the preceding claims, wherein said processor launches said navigation system dependent on the detection of a QR code.
25. A system according to any of the preceding claims, wherein said user interface overlays said navigation information display over the on-screen floor plan map.
26. A mobile indoor navigation system as hereinbefore described and/or illustrated in any appropriate combination of the accompanying text and/or figures.

27. A method of indoor navigation comprising the steps of:
• providing a mobile communication device incorporating a user interface; a display for navigating a user from a first location to a second location; and a camera;
• providing a number of passive navigation markers disposed at a number of locations; at least one of which in use is viewed by said camera as said marker is in the field of view of said camera; and
• comparing signals representative of said viewed passive navigation markers with signals stored on a database and displaying navigation information dependent upon said comparison.
28. A method according to claim 27, comprising the step of scanning for distinctive features in addition to said passive navigation markers.

29. A method according to claim 28, further comprising the step of tracking the movement of said additional distinctive features.
30. A method according to either claim 28 or claim 29, wherein said distinctive features comprise one or more of the following: right-angles, light fittings, pipes, cabling.
31. A method according to any of claims 27 to 30, further comprising the steps of determining and assessing the distance between a passive navigation marker in the field of view and the mobile communication device; and displaying navigation information dependent upon said assessment.
32. A method according to any of claims 27 to 31, further comprising the steps of determining and assessing one or more angles between a passive navigation marker in the field of view and the mobile communication device; and displaying navigation information dependent upon said assessment.
33. A method according to any of claims 27 to 32, further comprising the steps of selecting navigation information at least partly dependent on values derived from said accelerometer.
34. A method according to any of claims 27 to 33, further comprising the steps of selecting navigation information at least partly dependent on values derived from said compass.
35. A method according to any of claims 27 to 34, further comprising the steps of providing an accelerometer and/or a gyroscope; identifying displacement characteristics; and comparing displacement characteristics against pre-determined displacement characteristics; whereby a mode of displacement may be determined.
36. A method according to any of claims 27 to 35, further comprising the steps of providing an accelerometer and/or a gyroscope; determining the orientation of the device; and displaying navigation information in accordance with said determined orientation.
37. A method according to any of claims 27 to 36, further comprising the steps of providing an accelerometer and/or a gyroscope; determining the orientation dependent upon whether a passive navigation marker is in the field of view of said camera; and dependent solely upon data obtained from said gyroscope and/or accelerometer if no passive navigation marker is present in the field of view.
38. A method according to any of claims 27 to 37, further comprising the step of providing a pedometer.
39. A method according to any of claims 27 to 38, further comprising the steps of recording a user's walking movement in order to determine the user's walking profile and comparing the profile against pre-determined profiles.
40. A method according to any of claims 27 to 39, further comprising the steps of providing asymmetric passive navigation markers; and determining the position of the user relative to said markers.

41. A method according to claim 40, further comprising the steps of providing asymmetric navigation markers located on the ceiling or floor of said building.
42. A method according to claim 40 or claim 41, further comprising the steps of providing asymmetric markers visible solely in the infra-red spectrum.
43. A method according to any of claims 27 to 42, further comprising the steps of selection of an initial location of said user from a menu in said user interface; and determining the location of said user.

44. A method according to any of claims 27 to 43, further comprising the steps of predetermining a list of product destinations; and storing said destinations within a navigation system.
45. A method according to any of claims 27 to 44, further comprising the steps of detecting a QR code; and launching a navigation system dependent on said detection.
46. A method according to any of claims 27 to 45, further comprising the steps of overlaying said navigation information display over an on-screen floor plan map.
47. Application software configured to operate the method of any of claims 27 to 46.
PCT/GB2014/050554 2013-02-22 2014-02-24 A mobile indoor navigation system WO2014128507A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1513966.0A GB2525531A (en) 2013-02-22 2014-02-24 A mobile indoor navigation system

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GB1303209.9A GB2511096A (en) 2013-02-22 2013-02-22 A Mobile Indoor Navigation System
GB1303209.9 2013-02-22
GB201315701A GB201315701D0 (en) 2013-02-22 2013-09-04 A mobile indoor navigation system
GB1315701.1 2013-09-04

Publications (2)

Publication Number Publication Date
WO2014128507A2 true WO2014128507A2 (en) 2014-08-28
WO2014128507A3 WO2014128507A3 (en) 2014-10-16

Family

ID=48091972

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2014/050554 WO2014128507A2 (en) 2013-02-22 2014-02-24 A mobile indoor navigation system

Country Status (2)

Country Link
GB (3) GB2511096A (en)
WO (1) WO2014128507A2 (en)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017201329A1 (en) * 2016-05-20 2017-11-23 Magic Leap, Inc. Contextual awareness of user interface menus
CN108332752A (en) * 2018-01-09 2018-07-27 深圳市沃特沃德股份有限公司 The method and device of robot indoor positioning
US10333619B2 (en) 2014-12-12 2019-06-25 Nokia Technologies Oy Optical positioning
US10500373B2 (en) 2015-12-04 2019-12-10 Project Moray, Inc. Lateral articulation anchors for catheters and other uses
US10512757B2 (en) 2016-03-25 2019-12-24 Project Moray, Inc. Fluid-actuated sheath displacement and articulation behavior improving systems, devices, and methods for catheters, continuum manipulators, and other uses
US10525233B2 (en) 2015-12-04 2020-01-07 Project Moray, Inc. Input and articulation system for catheters and other uses
US10600112B2 (en) 2016-10-11 2020-03-24 Walmart Apollo, Llc Systems and methods for directing a user to a location of interest
US10646696B2 (en) 2015-03-27 2020-05-12 Project Moray, Inc. Articulation systems, devices, and methods for catheters and other uses
US10806899B2 (en) 2016-02-17 2020-10-20 Project Moray, Inc. Local contraction of flexible bodies using balloon expansion for extension-contraction catheter articulation and other uses
US10814102B2 (en) 2016-09-28 2020-10-27 Project Moray, Inc. Base station, charging station, and/or server for robotic catheter systems and other uses, and improved articulated devices and systems
US10849205B2 (en) 2015-10-14 2020-11-24 Current Lighting Solutions, Llc Luminaire having a beacon and a directional antenna
US10905861B2 (en) 2017-04-25 2021-02-02 Project Moray, Inc. Matrix supported balloon articulation systems, devices, and methods for catheters and other uses
US20210372798A1 (en) * 2020-05-29 2021-12-02 Peking University Visual navigation method and system for mobile devices based on qr code signposts
US11369432B2 (en) 2016-09-28 2022-06-28 Project Moray, Inc. Arrhythmia diagnostic and/or therapy delivery methods and devices, and robotic systems for other uses
US11420021B2 (en) 2016-03-25 2022-08-23 Project Moray, Inc. Fluid-actuated displacement for catheters, continuum manipulators, and other uses
US11619827B2 (en) 2016-02-24 2023-04-04 Magic Leap, Inc. Polarizing beam splitter with low light leakage

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106441266B (en) * 2015-08-05 2021-07-09 腾讯科技(深圳)有限公司 Navigation method and navigation system based on two-dimensional code
TWI632346B (en) * 2016-02-15 2018-08-11 張季倫 A mobile navigation system and method applying natural feature markers and artificial reality.
CN106840149A (en) * 2017-01-22 2017-06-13 北京铅笔视界科技有限公司 A three-dimensional space tracking and positioning system and method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009222682A (en) * 2008-03-18 2009-10-01 Saitama Univ Navigation system
KR101208245B1 (en) * 2011-04-12 2012-12-04 서울시립대학교 산학협력단 Method of location recognition using a mobile terminal and apparatus therefor

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1365358A2 (en) * 2002-05-24 2003-11-26 Olympus Optical Co., Ltd. Information presentation system of visual field agreement type, and portable information terminal and server for use in the system
US20110121068A1 (en) * 2004-12-14 2011-05-26 Sky-Trax, Inc. Method and apparatus for determining position and rotational orientation of an object
US20080137912A1 (en) * 2006-12-08 2008-06-12 Electronics And Telecommunications Research Institute Apparatus and method for recognizing position using camera
US20100092034A1 (en) * 2008-10-13 2010-04-15 International Business Machines Corporation Method and system for position determination using image deformation
US20120176491A1 (en) * 2011-01-11 2012-07-12 Qualcomm Incorporated Camera-based position location and navigation based on image processing
US20130002857A1 (en) * 2011-06-30 2013-01-03 Qualcomm Incorporated Navigation in buildings with rectangular floor plan
US20130045751A1 (en) * 2011-08-19 2013-02-21 Qualcomm Incorporated Logo detection for indoor positioning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ADACHI T ET AL: "Self-Location Estimation of a Moving Camera Using the Map of Feature Points and Edges of Environment", WORLD AUTOMATION CONGRESS, 2006. WAC '06, IEEE, PI, 1 July 2006 (2006-07-01), pages 1-6, XP031183126, DOI: 10.1109/WAC.2006.375761 ISBN: 978-1-889335-33-9 *
TETSUYA MANABE ET AL: "On the M-CubITS Pedestrian WYSIWYAS Navigation Using Tile Carpets", INTELLIGENT TRANSPORTATION SYSTEMS CONFERENCE, 2007. ITSC 2007. IEEE, IEEE, PI, 1 September 2007 (2007-09-01), pages 879-884, XP031151550, ISBN: 978-1-4244-1395-9 *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10333619B2 (en) 2014-12-12 2019-06-25 Nokia Technologies Oy Optical positioning
US10737073B2 (en) 2015-03-27 2020-08-11 Project Moray, Inc. Fluid-expandable body articulation of catheters and other flexible structures
US10758714B2 (en) 2015-03-27 2020-09-01 Project Moray, Inc. Fluid drive system for catheter articulation and other uses
US10646696B2 (en) 2015-03-27 2020-05-12 Project Moray, Inc. Articulation systems, devices, and methods for catheters and other uses
US10849205B2 (en) 2015-10-14 2020-11-24 Current Lighting Solutions, Llc Luminaire having a beacon and a directional antenna
US11642494B2 (en) 2015-12-04 2023-05-09 Project Moray, Inc. Input and articulation system for catheters and other uses
US10500373B2 (en) 2015-12-04 2019-12-10 Project Moray, Inc. Lateral articulation anchors for catheters and other uses
US10525233B2 (en) 2015-12-04 2020-01-07 Project Moray, Inc. Input and articulation system for catheters and other uses
US10806899B2 (en) 2016-02-17 2020-10-20 Project Moray, Inc. Local contraction of flexible bodies using balloon expansion for extension-contraction catheter articulation and other uses
US11619827B2 (en) 2016-02-24 2023-04-04 Magic Leap, Inc. Polarizing beam splitter with low light leakage
US10512757B2 (en) 2016-03-25 2019-12-24 Project Moray, Inc. Fluid-actuated sheath displacement and articulation behavior improving systems, devices, and methods for catheters, continuum manipulators, and other uses
US11420021B2 (en) 2016-03-25 2022-08-23 Project Moray, Inc. Fluid-actuated displacement for catheters, continuum manipulators, and other uses
US11328484B2 (en) 2016-05-20 2022-05-10 Magic Leap, Inc. Contextual awareness of user interface menus
WO2017201329A1 (en) * 2016-05-20 2017-11-23 Magic Leap, Inc. Contextual awareness of user interface menus
US12014464B2 (en) 2016-05-20 2024-06-18 Magic Leap, Inc. Contextual awareness of user interface menus
US10814102B2 (en) 2016-09-28 2020-10-27 Project Moray, Inc. Base station, charging station, and/or server for robotic catheter systems and other uses, and improved articulated devices and systems
US11369432B2 (en) 2016-09-28 2022-06-28 Project Moray, Inc. Arrhythmia diagnostic and/or therapy delivery methods and devices, and robotic systems for other uses
US11730927B2 (en) 2016-09-28 2023-08-22 Project Moray, Inc. Base station, charging station, and/or server for robotic catheter systems and other uses, and improved articulated devices and systems
US10600112B2 (en) 2016-10-11 2020-03-24 Walmart Apollo, Llc Systems and methods for directing a user to a location of interest
US10905861B2 (en) 2017-04-25 2021-02-02 Project Moray, Inc. Matrix supported balloon articulation systems, devices, and methods for catheters and other uses
CN108332752A (en) * 2018-01-09 2018-07-27 深圳市沃特沃德股份有限公司 Method and device for robot indoor positioning
US20210372798A1 * 2020-05-29 2021-12-02 Peking University Visual navigation method and system for mobile devices based on QR code signposts

Also Published As

Publication number Publication date
GB201315701D0 (en) 2013-10-16
GB2525531A (en) 2015-10-28
GB201303209D0 (en) 2013-04-10
GB2511096A (en) 2014-08-27
WO2014128507A3 (en) 2014-10-16
GB201513966D0 (en) 2015-09-23

Similar Documents

Publication Publication Date Title
WO2014128507A2 (en) A mobile indoor navigation system
US10275945B2 (en) Measuring dimension of object through visual odometry
US9250073B2 (en) Method and system for position rail trolley using RFID devices
Bleser et al. Advanced tracking through efficient image processing and visual–inertial sensor fusion
US9317921B2 (en) Speed-up template matching using peripheral information
CN105659304B (en) Vehicle, navigation system and method for generating and delivering navigation information
US20170116783A1 (en) Navigation System Applying Augmented Reality
CN102576064B (en) Method and apparatus for identification of points of interest within a predefined area
TWI485421B (en) Map matching device, system and method
KR101591471B1 (en) apparatus and method for extracting feature information of object and apparatus and method for generating feature map
US20110103651A1 (en) Computer arrangement and method for displaying navigation data in 3d
Huey et al. Augmented reality based indoor positioning navigation tool
US20020010694A1 (en) Method and system for computer assisted localization and navigation in industrial environments
JP2017520063A5 (en)
TW200944830A (en) System and method for map matching with sensor detected objects
JP5956248B2 (en) Image monitoring device
US10997785B2 (en) System and method for collecting geospatial object data with mediated reality
US10242281B2 (en) Hybrid orientation system
US11740083B2 (en) Methods and apparatus for curbside surveying
CN106537409A (en) Determining compass orientation of imagery
US11600024B2 (en) System and method for recalibrating an augmented reality experience using physical markers
Hasler et al. Implementation and first evaluation of an indoor mapping application using smartphones and AR frameworks
KR102699465B1 (en) Method and system for providing mixed reality based on real-space
US11598631B2 (en) Determining depth of buried assets
Rajpurohit et al. A Review on Visual Positioning System

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 14706697

Country of ref document: EP

Kind code of ref document: A2

ENP Entry into the national phase in:

Ref document number: 1513966

Country of ref document: GB

Kind code of ref document: A

Free format text: PCT FILING DATE = 20140224

WWE WIPO information: entry into national phase

Ref document number: 1513966.0

Country of ref document: GB

122 Ep: PCT application non-entry in European phase

Ref document number: 14706697

Country of ref document: EP

Kind code of ref document: A2