WO2014172792A1 - Navigation computer system including landmark identifier scanning - Google Patents

Navigation computer system including landmark identifier scanning

Info

Publication number
WO2014172792A1
Authority
WO
WIPO (PCT)
Prior art keywords: location, user, way-finding, information
Application number
PCT/CA2014/050395
Other languages
French (fr)
Inventor
Janahan Mathuran RAMANAN
Edmund BENTIL
Alexander Nana Kwame OSEI-GYUA
Satyam Hasmukray MERJA
Tharmathai Thammi RAMANAN
Original Assignee
E-Twenty Development Incorporated
Application filed by E-Twenty Development Incorporated filed Critical E-Twenty Development Incorporated
Publication of WO2014172792A1 publication Critical patent/WO2014172792A1/en


Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10 Map spot or coordinate position indicators; Map reading aids
    • G09B29/106 Map spot or coordinate position indicators; Map reading aids using electronic means
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/183 Compensation of inertial measurements, e.g. for temperature effects
    • G01C21/188 Compensation of inertial measurements, e.g. for temperature effects for accumulated errors, e.g. by coupling inertial systems with absolute positioning systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/206 Instruments for performing navigational calculations specially adapted for indoor navigation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3602 Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/16 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using electromagnetic waves other than radio waves

Definitions

  • the following relates generally to mobile technologies and further relates to methods and systems for way-finding.
  • a computer system and computer implemented method that enable a user to access way-finding information relevant to his/her current physical location using a mobile device, and including in indoor environments, comprising: (a) obtaining information for a starting location or approximate starting location ("starting location") in a current physical facility, (b) determining a target location or approximate target location ("target location") in the facility, based on a destination, activity, resource, or service of interest to the user, (c) obtaining way-finding information, such as a map, based on the starting location and target location, (d) initiating one or more inertial functions of the mobile device, and dynamically and in real time analyzing information generated by the inertial functions, using one or more analytical methods, so as to track the movement, or estimated movement, of the user between the starting location and target location (“movement tracking data”), and (e) presenting way-finding information to the user based on the movement tracking data.
  • the user's movements are tracked using one or more of device camera image data, electromagnetic signals such as cell phone waves, AM/FM waves, Wi-Fi signal, RFID tags, GPS signal and magnetic field, and/or inertial sensors such as an accelerometer and a gyroscope.
  • At least one location aware application uses the movement tracking data so as to provide location based services to the user.
  • dead reckoning is used to generate movement tracking data.
  • step detection is used to track movement of the user, using mobile device functions.
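One common way to realize step detection with only on-device sensors is thresholded peak detection on the accelerometer magnitude. The sketch below is illustrative only: the threshold, minimum inter-step gap, and gravity constant are assumptions, not values taken from the patent.

```python
import math

def detect_steps(samples, gravity=9.81, threshold=1.2, min_gap=8):
    """Count steps from raw accelerometer samples (x, y, z in m/s^2).

    A step is registered when the acceleration magnitude, with gravity
    removed, rises above `threshold` m/s^2, with at least `min_gap`
    samples between successive steps to suppress double counting.
    All parameter values here are illustrative, not from the patent.
    """
    steps = 0
    last_step = -min_gap
    for i, (x, y, z) in enumerate(samples):
        magnitude = math.sqrt(x * x + y * y + z * z) - gravity
        if magnitude > threshold and (i - last_step) >= min_gap:
            steps += 1
            last_step = i
    return steps

# Synthetic trace: mostly quiet, with two well-separated spikes.
quiet = [(0.1, 0.2, 9.8)] * 10
spike = [(0.0, 0.0, 12.0)]
trace = quiet + spike + quiet + spike + quiet
print(detect_steps(trace))  # -> 2
```

A production implementation would filter the signal and adapt the threshold per user, which is what logging stepping information to a profile (as described above) could support.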
  • the computer system logs stepping information of a user to a profile and this is accessed upon initiation of a way-finding session in order to improve accuracy of tracking of a user.
  • novel and innovative phone orientation techniques are used in combination with dead reckoning.
  • the computer system and method use statistical methods for removing errors and thereby improve iterative estimation of location using analytical methods.
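As one minimal sketch of such statistical error removal, outlier readings (for example, a compass heading corrupted by a local magnetic disturbance) can be rejected before they enter the iterative location estimate. The window contents, the use of the median, and the cutoff factor are all assumptions for illustration.

```python
import statistics

def reject_outliers(readings, k=2.0):
    """Discard readings more than k standard deviations from the median.

    Illustrative sketch of the kind of statistical filtering alluded to
    above; the choice of median and of k is an assumption.
    """
    med = statistics.median(readings)
    sd = statistics.pstdev(readings)
    if sd == 0:
        return list(readings)
    return [r for r in readings if abs(r - med) <= k * sd]

headings = [90.0, 91.0, 89.5, 260.0, 90.5, 90.2]  # one magnetic glitch
cleaned = reject_outliers(headings)
print(cleaned)  # -> [90.0, 91.0, 89.5, 90.5, 90.2]
```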
  • a computer implemented method provides to a user access to way-finding information relevant to his/her current physical location using a mobile device, the method comprising: (a) taking one or more photographs, or a continuous stream of images in the form of a camera pan, of one or more identifiers of landmarks in the physical location that are indicative of the current physical location, (b) extracting identifier information from the one or more photographs, (c) comparing the identifier information to a map information database that includes identifier information, to determine the current physical location or approximate current physical location of the user, (d) selecting the current physical location from a list of searchable or non-searchable possible locations on the device, (e) selecting the current physical location through clicking on an icon representing a physical location, on the map shown on the device, (f) using voice-activated features to set the current physical location, (g) integrating any of the mentioned methods with 'smart search' to set the current physical location and/or the destination explained below, (h) based on the current physical location
  • Smart search provides a humanistic way of searching which has the ability to link natural language tags with locations in the environment.
  • a non-transitory computer program product tangibly embodying code that, when executed by a processor, causes the processor to carry out at least one of the methods described herein.
  • FIG. 1 is a diagram illustrating an example of a room number identifier.
  • FIG. 2 is an example of a direction sign identifier.
  • FIG. 3 is a further example of an identifier, in this case a washroom symbol.
  • FIG. 4 illustrates the axis representation of a phone.
  • FIG. 5 is a diagram illustrating one orientation of the phone held by users during walking.
  • FIG. 6 is an example of how key lines and features can be extracted from a camera image.
  • FIG. 7a depicts the application during start-up, where the user can choose a specific facility to start navigating inside.
  • FIG. 7b is an example of a venue floor plan with markers of points of interest, and a favorites bar with popular destinations.
  • FIG. 7c depicts a process to initialize location and orientation estimation via text/image recognition of a room sign.
  • FIG. 7d depicts a user's position initialized onto the map, and a plotted route to a destination.
  • FIG. 7e depicts one variation of a location aware pop-up delivered to the user during navigation.
  • FIG. 7f is an example of the user being notified they have reached their destination by changing the destination icon to a green checkmark.
  • FIG. 7g depicts the web portal dashboard giving snapshots of departmental information.
  • FIG. 7h is an example of the web portal analytics page depicting infection hot-spots on a floor plan.
  • FIG. 7i is an example of the web portal user/staff administration page allowing management to grant specific access to various users.
  • FIG. 7j is an example of the web portal maps page, where administration is able to edit map routes, markers, and location info.
  • FIG. 8 is an example of a floor map with estimated user location represented with a circle.
  • FIG. 9 is a computer system architecture for implementing embodiments of the computer system.
  • FIG. 10 illustrates a representative generic implementation of embodiments of the computer system.
  • FIG. 11 is a flowchart showing methods for providing way-finding information.
  • FIG. 12 is a flowchart showing methods for determining a location based on at least one photograph of at least one identifier.
  • the disclosure provides a computer system or technology platform (may be referred to as "platform") that enables the way-finding solutions described in this disclosure.
  • the platform delivers way-finding information to one or more mobile devices.
  • a new computer system and method for localizing and tracking the movement of a user along a path indoors, using any smart phone equipped with a camera, an accelerometer, a gyroscope, and/or a compass, are provided. Further embodiments provide a new way-finding solution that does not require access to an external system or to an Internet connection, which is important as in some indoor environments network access may not be available in some areas; in any case, the performance of such networked solutions may be slow, and they may require users to pay for expensive network access, which may exclude certain users from using the solution.
  • a series of techniques is provided that enable localization and location tracking using only a mobile application and one or more features or resources already available on many mobile phones. These techniques are described below, and fall under two main categories: (A) motion tracking using mobile device functions only, and (B) further improving performance of localization and/or motion tracking by capturing images of one or more landmarks of a facility, and analyzing these images to support localization / motion tracking. Embodiments describe way-finding in the context of routing between a starting location and a target (or destination) location.
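The motion-tracking category (A) can be reduced to a dead-reckoning position update: each detected step advances the estimate by an assumed stride length along the current heading. The stride length and degree convention below are illustrative assumptions.

```python
import math

def dead_reckon(start, step_headings, step_length=0.7):
    """Advance a 2-D position by one heading (degrees) per detected step.

    Heading is measured clockwise from north (the +y axis); step_length
    in metres is a per-user assumption that a logged stepping profile,
    as described above, could refine.
    """
    x, y = start
    for heading in step_headings:
        rad = math.radians(heading)
        x += step_length * math.sin(rad)
        y += step_length * math.cos(rad)
    return x, y

# Ten steps due north, then ten due east, from the origin.
pos = dead_reckon((0.0, 0.0), [0.0] * 10 + [90.0] * 10)
print(round(pos[0], 2), round(pos[1], 2))  # -> 7.0 7.0
```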
  • a novel and innovative inertial navigation solution is provided that is suitable for indoor environments, and that utilizes one or more novel and innovative localization and location tracking methods described herein.
  • way-finding solutions in relation to a wide range of environments, both indoors and outdoors, and also in relation to a variety of different types of mobile devices in regards to available hardware, software or network resources are provided.
  • the set up or the use of any external hardware may not be required.
  • manual sweeps of a facility may not be required.
  • the solution is capable of tracking the user, using only for example an accelerometer and gyroscope signals. The accuracy and performance of the solution can be increased for example by integrating external radio signals (which would require sweeps).
  • the mobile device may be, for example, a smart phone or tablet computer.
  • the functionality of the client computer program described in this disclosure may also be implemented, in whole or in part, using one or more smartphone features.
  • the computer system may include a server application, implemented to one or more server computers.
  • the computer system may also include an application repository or may be implemented using a cloud computing service.
  • the computer system does not have to be hosted on an external server or repository; it can be integrated on the device itself making the solution self-contained and fully functional on the device, without any communication with an external server or repository.
  • FIG. 9 illustrates one possible implementation of the computer system of the embodiments.
  • the computer system may include an electronic mapping solution that may be implemented so as to: (A) determine the location or approximate location of a mobile device, (B) receive one or more requests for way-finding information from the mobile device, (C) use the location or approximate location information to access one or more relevant maps or map information from a database, (D) use the maps or map information to generate way-finding information; (E) communicate the map information to the mobile device; (F) optionally the client computer program uses the map information to generate way-finding content responsive to the request, which may consist of way-finding information or content in a variety of formats that may include map information, text information, way-finding instructions, route information and so on.
  • the displayed way-finding information may also depend on one or more user preferences that the user may define by configuring settings using the client computer program.
  • any manner of way-finding content may be generated such as: a map showing the current location of a user, the orientation of the user, a desired location of the user, a suggested path between the user's current location and the desired (target/destination) location, destinations of possible interest between the current location and the desired location and so on.
  • a target or destination can be provided by a user by execution of a target location determination routine, as enabled by the routines described herein.
  • the computer system may connect to various third party systems.
  • the computer system may connect to or integrate with a facility management system to obtain up to date information regarding the facility for example. This information may be used for example to route users away from an area that may be busy or closed.
  • the computer system may be implemented at a hospital and may connect to or integrate with a health information system that may contain for example appointment information for a user. Based on this appointment information the computer system may know the desired location of the user already.
  • the computer system may initiate the determination of the user's current location automatically, and based on this may generate way-finding information for accessing the location of the appointment.
  • the computer system may also connect for example to an enterprise resource management system (or other similar system) that may track for example the current location of an individual or asset that the user wants to find.
  • the computer system may be used to generate way-finding information to find the individual for a meeting for example or to locate the asset.
  • mapping solutions may be implemented to the mobile device including by implementing part of the mapping functionality to the client computer program on the mobile device and optionally storing to a data store on the mobile device some or part of the mapping information. This may enable the computer system to provide way-finding information to the mobile device even in off-line mode and further may be used to improve the performance of the way-finding solution or utilize network resources efficiently.
  • the computer system may include one or more technologies or processes, described hereinafter in greater detail, for acquiring the maps or map information.
  • a floor plan may be used in a map generator to create the maps or map information.
  • Floor plans are generally required due to building code regulations in case of emergency.
  • the map requirement is common to all indoor localization solutions, as is the process of extracting the useful information from the blueprints.
  • novel and innovative mechanisms for determining the location or approximate location of the mobile device are provided.
  • the initial localization of the user's position may be performed through several mechanisms provided by a starting location determination routine, including (a) taking one or more photographs (or panning) of an identifier or landmark in the user's surroundings (as further described below), (b) selecting an initial location (or starting point of way-finding) from a list of possible locations suggested to the user using a display associated with the mobile device, (c) setting an initial location within a more general location, area, venue, or facility (referred to collectively as a "facility") by displaying a map of the more general location, area, or facility, for example by using an input means to select a location or sub-area on a map, such as tapping on such location or sub-area or an associated icon on the map, (d) initiating a voice-activated search for an initial location, or (e) accessing one or more "smart search" processes for establishing an initial location, as described below.
  • a device location determination routine (other embodiments may comprise user interaction).
  • One method is to track the location of the user's mobile device while the application is in the background by first processing GPS information while the device is outdoors to identify the building and entrance the user entered and then transitioning to a dead reckoning system, described below, to track the device indoors.
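The outdoor-to-indoor handoff can be modelled as a simple mode switch driven by GPS fix quality. The threshold and the use of reported accuracy as an indoor proxy are assumptions made for this sketch, not details taken from the patent.

```python
def tracking_mode(gps_accuracy_m, indoors_threshold=30.0):
    """Choose the tracking backend from GPS fix quality.

    Sketch of the outdoor-to-indoor transition described above: when
    reported GPS accuracy degrades past a threshold (an assumed proxy
    for having entered the building), switch to dead reckoning.
    """
    return "dead_reckoning" if gps_accuracy_m > indoors_threshold else "gps"

print(tracking_mode(5.0))   # -> gps
print(tracking_mode(80.0))  # -> dead_reckoning
```

A real system would also use the last good GPS fix to identify the building and entrance, seeding the dead-reckoning start point.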
  • the initial location functions as a starting point for a way-finding session, involving for example guiding a user from the starting point, being Location A, to one or more end points in the facility, being Location B.
  • a starting location is determined.
  • at least one target location is determined based on at least one of a destination, activity, resource or service of interest to the user.
  • the location of the device is determined.
  • way-finding information is determined based on the starting location, the at least one target location and/or the location of the device.
  • display of the way-finding information on the device is initiated.
  • a mobile application is provided; however, certain features described for the mobile application may also be implemented to a server application. These and other methods are described in greater detail herein.
  • Physical locations generally have landmarks that are indicative of their location or approximate location.
  • Various locations within a facility or other physical environment such as a room, an office, a department entrance, a hallway, a corridor, stairs, an escalator and other physical features of a facility may be associated with visually perceptible information, such as an identifier (an "identifier" being for example a number, letter, symbol or a combination thereof) or tag for any physical feature of a facility relevant to way-finding such as a room, area, office suite, elevator, stairs etc., or a department name, and so on.
  • FIGS. 1, 2, and 3 show examples of possible identifiers, as photos of such identifiers captured by a mobile device.
  • the computer system is configured to extract information from such photos in order to infer the location or approximate location of the user.
  • these identifiers are stored to the database of the computer system, and in one implementation are associated with specific locations in the map database, for example using the location coordinates associated with the map information.
  • other properties and tags can also be associated with location entries in the database. For example, other properties may include color (certain departments have color codes), images (certain zones in the facility have logos or symbols that are used as identifiers), and environment (certain zones in the facility have unique designs on the walls or floors which can be used to distinguish them from other zones). All of these information elements may be added to the locations database.
  • the identifier may merely be a color of a hallway that, in a particular building for example, is used to designate a particular area such as a department, a wing of a mall, a ward in a hospital, and so on.
  • a user may take at least one photograph of, or a camera pan across, one or more identifiers, as shown in FIG. 12 at step 1201. This may occur by accessing one or more related functions of the client computer program.
  • the client computer program may present a "PAN CAMERA AT A LANDMARK" screen or equivalent.
  • the user may take pictures of successive identifiers.
  • the client computer program may use resources and information available on the mobile device in an off-line mode. In either case, the computer system cross-references the captured information against the contents of the map information database so as to generate way-finding information as previously described.
  • a user may take pictures for example of any sign containing location information such as a door number, directional sign (signs which point into the direction of various destinations), elevator or staircase signs, logos, unique images, on a smartphone.
  • the computer system may utilize any mechanism for extracting relevant information from the image, including but not limited to: (A) OCR (text filtering), to extract text from the images that may be relevant to determining the location of the mobile device, such as in the case of a picture of a room number; (B) color filtering; (C) symbol or logo identification; and so on.
  • This extraction may occur on the mobile device using one or more client computer program features for example, or on the server computer using the server application, or as part of a cloud service.
  • the system can potentially work while offline (i.e. no active data connection to a network).
  • the computer system may utilize one or more solutions (including the identifier based location method described), for example to narrow down facility locations.
  • the computer system may use GPS sensor data or Wi-Fi IP address information to improve the accuracy of location determination.
  • the use of additional localization methods may be particularly useful when a user is taking their first picture at a given physical location.
  • At step 1202, information regarding the identifier is extracted from the photograph obtained in step 1201.
  • At step 1203, the information is compared to a map database to determine a match between the information and a location in the map database.
  • At step 1204, when a match is determined, the location in the map database is returned to provide a device location or starting location.
  • the photograph is also referred to herein as an “image” or “picture”.
  • all text is extracted/filtered using OCR techniques.
  • the text may be analyzed to interpret in part the type of information represented by the text.
  • the computer system may differentiate for example between text representing a room number and text representing a direction.
  • the computer system may implement one or more logic rules for interpreting the information extracted from the photos and also inferring an accurate location from one or more extracted information elements. For example, if the filtered text contains a room number (as shown in Fig. 1) or some other unique code is found in the map information database, an exact position coordinate may be returned by the computer system.
  • if the image contains text which does not correspond to a unique identifier, such as a directional sign (as shown in FIG. 2) pointing to various destinations, then further logical steps must be taken.
  • Directional signs contain names of various sites, and from their text alone cannot reveal an exact location of the sign. Instead, an area or region boundary could be returned as the approximate location of the user. If the sign mentions “food court” and “atrium” (each having an entry in the locations database), there is a high possibility the user is occupying a region between or around those locations. Also, by recognizing the directions of the arrows on the signs pointing to the destinations, the application can narrow the area estimate considerably. From this example it will be appreciated that filtered text alone can narrow down the user's location to an exact position coordinate or approximate area coordinate based on the nature of the text.
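The logic rules distinguishing a unique identifier from a directional sign can be sketched as follows. The location names, coordinates, and matching rules are hypothetical, chosen only to mirror the examples above.

```python
# Hypothetical excerpt of a locations database: name -> (x, y) in metres.
LOCATIONS = {
    "meeting room 5": (12.0, 4.0),
    "food court": (40.0, 10.0),
    "atrium": (60.0, 10.0),
}

def locate_from_text(filtered_text):
    """Apply simple logic rules to OCR output.

    Returns ("exact", (x, y)) when the text names exactly one known
    location (a unique identifier), or ("region", [(x, y), ...]) for a
    directional sign naming several destinations, in which case the
    user likely occupies an area between or around those sites.
    """
    text = filtered_text.lower()
    matches = [name for name in LOCATIONS if name in text]
    if len(matches) == 1:
        return "exact", LOCATIONS[matches[0]]
    if len(matches) >= 2:
        return "region", [LOCATIONS[name] for name in matches]
    return "unknown", None

print(locate_from_text("Meeting Room 5"))           # exact position fix
print(locate_from_text("Food Court ->  Atrium <-")) # approximate region
```

Recognizing arrow directions, as the text notes, would let the region be shrunk further; that step is omitted here.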
  • the computer system may also extract from the pictures additional context, which is also transferred for the purposes of localization and generation of way-finding information.
  • the computer system may extract information from a direction sign and also may recognize that the direction sign is a direction sign rather than an office number sign, and provide this information for localization purposes in order to improve localization accuracy.
  • mapping utilities will have a locations database which closely resembles this format:
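The referenced format is not reproduced in this excerpt. One plausible record shape, inferred only from the properties named elsewhere in this document (text identifiers, colors, symbols, entrance orientation), might look like the following; every field name here is an assumption.

```python
# Hypothetical locations-database records; all field names are assumed.
locations_db = [
    {
        "id": "meeting_room_5",
        "text_identifiers": ["Meeting Room 5", "MR5"],
        "coords": (12.0, 4.0),          # map coordinates
        "color": None,                  # wing/department color code
        "symbols": [],                  # logos or pictograms on signage
        "entrance_facing": "west",      # used to infer device orientation
    },
    {
        "id": "west_wing_elevators",
        "text_identifiers": ["Elevator"],
        "coords": (3.0, 20.0),
        "color": "blue",
        "symbols": ["elevator_pictogram"],
        "entrance_facing": "east",
    },
]

def find_by_text(db, text):
    """Keyword search over the text identifiers of each record."""
    text = text.lower()
    return [rec for rec in db
            if any(t.lower() in text for t in rec["text_identifiers"])]

print([r["id"] for r in find_by_text(locations_db, "meeting room 5 ->")])
```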
  • the computer system may use a key word search based on the recognized text in the map information database. This may yield one or more results. The computer system may then apply one or more logical rules to search results in order to deduce the user's position or approximate position.
  • Using a text-only solution also has the advantages of low computational cost to go from the initial image to the user's position, and it can easily be performed offline. This approach also does not require an extensive database of location-tagged venue images against which to match pictures captured by the computer system.
  • the text extraction is performed on the mobile device using one or more features of the client computer program.
  • the map information database may be implemented as a customized map information or locations database that contains the extra tags and identifiers consisting for example of colors, symbols, logos, and shapes.
  • different wings of a facility may use signage having different color themes.
  • the color captured from the color filter can be searched for in the database to reveal which wing the user is in. In this manner multiple properties may be detected on one image to give a better position estimate of the user's location.
  • the filtered text, color, or symbol extracted from the image may not be found in the map information database, or may be found in too many records or entries of the map information database. If too many records are found, the next step would be to use the most recent GPS data and/or, if possible, the most recent radio-based position estimate to narrow down the possible user locations within an "active area" defined in the next sections. For example, if it is known through GPS that the user has just entered the building through the west wing entrance, and the image taken by the user has captured an elevator image or related text, the user position will be narrowed down to the west-wing elevators rather than any other elevator locations.
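The west-wing-elevator example reduces to filtering ambiguous matches by distance from the last known fix. The coordinates and radius below are illustrative assumptions.

```python
import math

# Hypothetical: an "Elevator" sign matched two database entries.
ELEVATORS = {"west_wing": (5.0, 10.0), "east_wing": (95.0, 10.0)}

def narrow_candidates(candidates, last_fix, radius):
    """Keep only candidates within `radius` metres of the last fix.

    Sketch of the narrowing step above: if GPS last placed the user at
    the west entrance, an ambiguous elevator match resolves to the
    west-wing elevators. All values are illustrative.
    """
    cx, cy = last_fix
    return {name: (x, y) for name, (x, y) in candidates.items()
            if math.hypot(x - cx, y - cy) <= radius}

resolved = narrow_candidates(ELEVATORS, last_fix=(0.0, 8.0), radius=20.0)
print(resolved)  # -> {'west_wing': (5.0, 10.0)}
```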
  • the client computer program may notify the user of the mobile device, and request additional information. For example, the client computer program may display a message on the mobile device display requesting that the user take another picture, and this picture is processed as described above in order to narrow down the possible positions of the user. If all the above filters are inadequate in finding a specific location, the user may be presented with a shortlist of most likely marker locations for them to pick from. Wherever possible, associated screenshots (general wide field view) of these locations will be displayed to better allow the user to make a selection.
  • the computer system can deduce orientation of the user from the pictures taken and/or filtered text.
  • a user submits an image of a unique door number.
  • the map information database may also include properties such as the orientation of entrances. This can be deduced, for example, from the blueprint when populating map information for a facility into the database. For example, the "Meeting Room 5" entrance may be known to be facing west; from this, the mobile device must have been oriented east to take the picture. Moreover, from the orientation of the text on the image (whether it is in portrait or landscape), the exact orientation of the device can be locked. Even from directional signs, by cross-referencing the destinations as well as the relative directions to the destinations, a good estimate of the orientation of the user's mobile phone can be inferred.
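The entrance-facing inference is a single bearing reversal: the camera must have pointed opposite to the direction the sign faces. The compass convention below is an assumption for this sketch.

```python
# Compass bearings in degrees, clockwise from north (assumed convention).
FACING_DEG = {"north": 0, "east": 90, "south": 180, "west": 270}

def device_heading_from_sign(entrance_facing):
    """Infer the device's compass heading from a photographed sign.

    If the "Meeting Room 5" entrance faces west, the camera must have
    pointed east (the opposite bearing) to capture it. A minimal sketch
    of the inference described above.
    """
    return (FACING_DEG[entrance_facing] + 180) % 360

print(device_heading_from_sign("west"))  # -> 90 (device pointed east)
```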
  • the filtered text, color, or symbol extracted from the image may not be in any entries of the locations database, or may be in too many entries. Either way, an appropriate position estimate cannot be given. In this case the user is asked to take another picture, for which the processed information will be appended to the information from previous pictures to narrow down the possible positions of the user. If all the above filters are inadequate in finding a specific location, the user is presented with a shortlist of most likely marker locations for them to pick from. Wherever possible, associated screenshots (general wide field view) of these locations will be displayed to better allow the user to make a selection. In other words the location determination solution may be iterative. Various modifications to the workflow are possible.
  • the computer system may log location history for each user during each session, for example by storing data regarding location history to the user's profile using the profile manager. This way, by keeping track of the time elapsed since localization, a virtual circle can be created ("active area") and centered on their previous location with radius proportional to the elapsed time. This circle represents an area of high probability where the user is located. Based on this information the mobile device memory can be intelligently populated with location entries and route information from the cloud or server computer (or other information that the computer system determines may be relevant or of interest to the user) via data connection so that if the user goes back offline, they will have some entries around their location to cross reference.
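The "active area" described above can be sketched as a circle whose radius grows with time since the last localization. The walking-speed bound used as the proportionality constant is an assumption, not a value from the patent.

```python
import math

def active_area_radius(last_fix_time, now, walking_speed=1.4):
    """Radius (m) of the circle of likely positions since the last fix.

    Grows proportionally with elapsed time; walking_speed in m/s is an
    assumed upper bound on user movement.
    """
    return walking_speed * max(0.0, now - last_fix_time)

def in_active_area(entry_coords, center, radius):
    """True if a database entry lies inside the active area."""
    dx = entry_coords[0] - center[0]
    dy = entry_coords[1] - center[1]
    return math.hypot(dx, dy) <= radius

# 30 s since the last fix at (10, 10): the radius is 42 m.
r = active_area_radius(last_fix_time=0.0, now=30.0)
print(r, in_active_area((20.0, 20.0), (10.0, 10.0), r))  # -> 42.0 True
```

Entries inside this circle are the ones worth prefetching to the device before it goes offline, as the text describes.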
  • Smart search allows the user to use keywords to identify specific locations within the venue.
  • Each location within the venue may be associated in a database with a list of words that describe, or are mapped to, that location.
  • the radiology clinic within a hospital may be associated with the following keywords: “ultrasound”, “x-ray”, “CT”, “CT scan”, “radiation”, and “radiologist”.
  • the keywords may represent the activities and/or the personnel associated with the location. Keywords may also be sentences, synonyms of the location or abbreviation of the location.
  • keywords for an emergency department within a hospital may include “broke my leg”, “emerg”, “ER”, “urgent care”, “ambulance drop-off”, and “bleeding”.
  • Various other keywords are possible.
  • Each of these keywords may be stored to a data store, where a data store may be associated for example with a particular facility or part of a facility.
  • the computer system may incorporate various search functions including, for example, semantic search tools that allow a user to provide a semantically related term, which is then analyzed and associated with a keyword in a defined list of keywords associated with a particular location.
  • Smart search may be combined with other methods, including the features discussed herein, to improve accuracy of localization of the user.
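A minimal sketch of the keyword lookup behind smart search, using the example keyword lists above; the `KEYWORDS` index structure and the substring matching rule are illustrative assumptions:

```python
# Hypothetical keyword index mapping venue locations to descriptive terms.
KEYWORDS = {
    "radiology": ["ultrasound", "x-ray", "ct", "ct scan",
                  "radiation", "radiologist"],
    "emergency": ["broke my leg", "emerg", "er", "urgent care",
                  "ambulance drop-off", "bleeding"],
}

def smart_search(query, index=KEYWORDS):
    """Return locations whose keyword lists match the query, using a
    case-insensitive substring match in either direction."""
    q = query.lower()
    hits = []
    for location, words in index.items():
        if any(w in q or q in w for w in words):
            hits.append(location)
    return hits
```

A production version would likely add stemming or the semantic matching described above; plain substring matching over-matches on very short keywords such as "er".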
  • the data stores may be implemented to a server computer, and also may be included in a data store linked to the mobile application to enable operation in an off-line mode.
  • the mobile application may allow the mobile device to switch between on-line and off-line modes.
  • the mobile application may use geo-location to sense that the mobile device is approaching a facility associated with the computer system, and may download one or more data sets or application updates so as to support the way-finding features of the system once the mobile device is off-line, for example inside the facility, where network services may not be available or where the user may be required to shut down the sending/receiving functions of his/her mobile device.
  • the mobile application may acquire updates to a data store associated with the facility such as for example up to date maps for a facility, up to date key word data stores, and so on.
  • dead reckoning or disambiguation processes are used for the first time in connection with way-finding.
  • dead reckoning can be used to keep track of the user's position.
  • Dead reckoning is used to determine a change in the user's position, which can be used to compute the user's current position relative to an initial starting position.
  • Dead reckoning can be applied for example using output from the mobile device's inertial sensors such as data from an accelerometer or a gyroscope, as well as non-inertial sensors and devices such as magnetic field sensors and the camera.
  • Step detection is used to identify whether the user is taking any steps, which corresponds to movement. Step detection is performed by analysis of real time data from sensors such as, for example, an accelerometer, a gravity sensor, a linear acceleration sensor, a magnetometer, and a gyroscope. All data is given in tri-axial representation with respect to the phone.
  • the accelerometer measures the total phone acceleration, for example in m/s².
  • gravity data gives the vector direction of normalized gravity with respect to the phone axis.
  • linear acceleration data gives the extracted component of acceleration caused by user movement, for example in m/s².
  • the magnetometer measures the ambient magnetic field, for example in μT.
  • the gyroscope measures the angular velocity of the device, for example in rad/s.
  • Normal walking will correspond to a sinusoidal and/or periodic signal from inertial sensors, due to the constant swinging and swaying of the body.
  • the period of the signal is proportional to the number of steps taken by the user.
  • the signals can be filtered using FIR filters with standard windowing techniques; the windowed filtered and unfiltered signals are then passed through feature extraction layers to identify key attributes of the signal versus time.
  • Examples of features that can be extracted from filtered and unfiltered signals are variance, energy (sum of squares), magnitude maxima or minima position, mean, moving mean, peak-to-peak amplitude, first and second derivatives, and first and second integrals for all sensor signals.
  • the values and characteristics of the features can be used to characterize and describe a step profile for ideal walking.
  • the step profile may consist of using a combination of the above-mentioned features and ranges for a subset of specific inertial sensors mentioned above which represent conditions of true steps.
  • the values of the features at any time can be compared against this step profile to determine whether the user is walking or standing still.
  • the step profile can also be used to distinguish between an actual step and noise (either user generated or due to sensor inaccuracies). These profiles are not limited to steps, but can also be used to identify when the user is on the elevator, stairs, and escalator. This is useful to detect floor switching within a facility.
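The feature extraction and step-profile matching described above might be sketched as follows; the feature set follows the list above, but the profile ranges are placeholder assumptions rather than values from the patent:

```python
def window_features(signal):
    """Extract several of the features named above from one window of
    accelerometer-magnitude samples."""
    n = len(signal)
    mean = sum(signal) / n
    variance = sum((s - mean) ** 2 for s in signal) / n
    energy = sum(s * s for s in signal)
    peak_to_peak = max(signal) - min(signal)
    return {"mean": mean, "variance": variance,
            "energy": energy, "peak_to_peak": peak_to_peak}

# Illustrative step profile: feature ranges a window must satisfy to
# count as a true step. The numeric bounds here are placeholders.
STEP_PROFILE = {"variance": (0.5, 50.0), "peak_to_peak": (1.0, 25.0)}

def is_step(signal, profile=STEP_PROFILE):
    """Compare the window's features against the step profile to decide
    whether the user is walking or standing still."""
    feats = window_features(signal)
    return all(lo <= feats[k] <= hi for k, (lo, hi) in profile.items())
```

A near-periodic window (swinging body motion) satisfies the profile, while a flat window (device at rest, or pure sensor noise) does not.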
  • Noise sources: there are many noise components which corrupt the sensor measurements and data. One is white/temporal noise, which exists even at high frequencies; the other is DC noise, which produces an artificial bias or base-line.
  • the temporal noise is made up of random fluctuations averaging zero and lasting a short time on the order of milliseconds. This type of noise can be eliminated effectively by applying a low pass filter to sensor data.
  • the DC noise is a slow moving offset caused by poor sensor calibration. For example, when the phone is lying still on a table, the accelerometer and gyroscope sensor data should ideally read zero, but due to the bias noise it may read some non-zero value. In addition, this non-zero value may change slowly over time, for example on the order of seconds, varying with changes in the orientation of the phone with respect to gravity, and may differ from device to device.
  • DC elimination firstly involves the ability to separate/identify the bias component from the true data with minimal ambiguity.
  • the cases in which the DC noise can theoretically be identified are: when the true data is known to be zero, in which case the remaining signal is the bias component; or when the true data is known a priori, in which case the true signal can simply be subtracted from the total to identify the bias, as is done in calibration.
  • if the true signal is known to have a mean value of zero, the bias can be calculated from the mean of the signal.
  • Variance-based triggers: one of the extracted features mentioned above is signal variance, which is independent of the signal mean or bias.
  • the variance measures the distribution of the signal with respect to the mean, but is not affected by it.
  • while the device is in State A or B, the variance stays quite low; during movement, the variance spikes up to higher values. Appropriate trigger values can be determined such that when the variance is lower than a certain trigger value, the program can assume with reasonable confidence that the user is in State A or B.
  • the program will update the known value for the DC noise by calculating the signal mean, running mean, moving average or similar value.
  • Another approach is based on the curvature of the signal, directly proportional to the second derivative. This feature is also not correlated to the mean value, and is a good indicator of the two states of interest. Similar to the variance the second derivative of the signal is also lower when the user device is in State A and B. Appropriate trigger values can be determined such that when the second derivative is lower than a certain trigger value, the program can assume with reasonable confidence that the user is in State A or B. During this time the program will update the known value for the DC noise by calculating the signal mean, running mean, moving average or similar value.
  • Another approach is to identify either state from the signal peak-to-peak value. Again, the peak-to-peak value is lower when the user device is in either State A or B. Appropriate trigger values can be determined such that when the peak-to-peak value is lower than a certain trigger value, the program can assume with reasonable confidence that the user is in State A or B. During this time the program will update the known value for the DC noise by calculating the signal mean, running mean, moving average or similar value.
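The trigger-then-update scheme in the preceding bullets could be sketched as follows; the variance trigger value is an illustrative assumption:

```python
def update_bias(samples, bias, var_trigger=0.05):
    """Track the DC (bias) component of a sensor signal: when the window
    variance falls below the trigger, the device is assumed to be in
    State A or B and the bias is re-estimated as the window mean. The
    trigger threshold here is a placeholder value."""
    n = len(samples)
    mean = sum(samples) / n
    variance = sum((s - mean) ** 2 for s in samples) / n
    if variance < var_trigger:
        bias = mean          # signal is (nearly) pure bias: update it
    return bias

def remove_bias(samples, bias):
    """Subtract the tracked bias to recover the uncorrupted signal."""
    return [s - bias for s in samples]
```

The same structure applies when the trigger feature is the second derivative or the peak-to-peak value instead of the variance.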
  • All profiles may be generated for common phone orientations, and stored using the profile manager to a suitable database.
  • the phone orientation can be classified into one of the three categories: the phone is held such that the z-axis (Fig. 4) is facing up (phone parallel to ground, Fig. 5), the y-axis (Fig. 4) is facing up (phone perpendicular to ground), or the x-axis (Fig. 4) is facing up or down (phone held at side).
  • Each orientation of the phone corresponds to a specific step profile, which the step-detection algorithm uses to identify steps.
  • the combined knowledge of the orientation of the phone and the features is used to determine whether a step is taken or not.
  • unlike approaches in which step detection can only be performed effectively by attaching sensors on or near the feet, the step detection method described here is capable of functioning with the phone held in any orientation at or near any bodily region. That is, the user can have the phone in their hands (around the hip area) in front of them, on their side, or even tucked away in their pockets.
  • the optical flow method uses visual information obtained, for example, using a camera of the mobile device.
  • camera pixel information is analyzed for example using the Lucas Kanade algorithm (or another suitable process) to find the movement of certain key features in a picture window.
  • orientation of the phone can be calculated using the combination of output from an accelerometer (which yields the gravity vector), a magnetometer (which yields the direction of the magnetic field) and a gyroscope (which yields the change in orientation of the device). Orientation is used to assume the direction of user's movement. Orientation of the device is identified as the yaw angle of the device and is independent of the roll and the pitch angles.
  • Gyroscopes do not provide an absolute orientation, but only the relative orientation change (hereby referred to as a relative orientation estimate). For example, one can detect a 90 degree turn to the right, but not the direction that the phone is facing (North, East, West, or South) at the end of the turn. So, to get the absolute orientation one can use the magnetometer (compass) and accelerometer (gravity).
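Integrating gyroscope angular velocity from a known absolute starting yaw, as described above, might look like this (a minimal one-axis sketch):

```python
def integrate_yaw(yaw0, gyro_z, dt):
    """Dead-reckon absolute yaw by integrating gyroscope angular
    velocity samples (rad/s) from a known starting orientation yaw0
    (rad). The starting yaw must come from an absolute reference such
    as the magnetometer/accelerometer pair or a photographed landmark,
    since the gyroscope alone yields only relative orientation change."""
    yaw = yaw0
    track = [yaw]
    for w in gyro_z:
        yaw += w * dt        # relative orientation change per sample
        track.append(yaw)
    return track
```
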
  • each sensor has its own uncertainties and is prone to corruption from noisy data.
  • a common approach taken is the use of a Kalman filter, which aggregates data of various sensors (for example gyroscope, magnetic sensor, and accelerometer) to come up with a better orientation estimate. Using a Kalman filter, uncertainty profiles are assigned for each sensor which allows the program to determine which sensor should be trusted and given priority.
  • Kalman filters rely on the magnetic fields (measured by the magnetometer) and assume that the magnetic field points constantly in a northerly direction. However, in indoor locations, this is not always the case. Building materials as well as electronic devices can cause interference, making the magnetic field vacillate across a range of directions. Large continuous regions of a facility can have an intrinsic magnetic field which deviates from North. To overcome this limitation, one method relies heavily on other sensors (gyroscope, accelerometer, camera) to track the orientation change of the device. Limiting the use of the magnetic field means that the starting absolute orientation of the device is not known when navigation starts.
  • the user can be asked to take a photo of a nearby sign or landmark (note that this same image can be used to perform localization as described above); then by using an automated algorithm to identify the angle of the device when photo was taken (mentioned above [0044]) as well as the data from accelerometer, the initial orientation of the phone can be extracted. The data from the gyroscope can then be used to track the change in orientation of the device as compared with the starting orientation.
  • bias in the inertial data may be reduced by aggregating the inertial data received from each of two or more sensors and calibrating the inertial data received from one sensor against the data received from another sensor.
  • gyroscopes in portable devices are generally not of a high quality, and have their own intrinsic errors. As a result they drift from the true orientation over time. Gyroscopes relay angular velocity information of the device, and are prone to bias (constant) additive noise described earlier.
  • the bias in angular velocity causes drift over time in the relative angular direction of the integrated gyroscope signal. This can make the device appear as if it is rotating (as interpreted from the gyroscope signal), even when it is lying still.
  • the bias noise also varies with time and is dependent on the orientation of the phone, making it difficult to compensate for.
  • To overcome the bias, a recursive algorithm may be used, which in one aspect can be a modification of the non-recursive Whittaker method.
  • This is a non-linear least squares method using the average of the signal as well as its curvature based on the triggering methods described above to identify States A and B and track the bias noise.
  • This non-linear algorithm uses threshold triggers based on gyro variance and average to dynamically identify and track the bias of the gyro signal. Once the bias is identified, it is subtracted from the total signal to recover the uncorrupted signal.
  • an accelerometer is used to fix the orientation of the device about the roll, pitch angles (rotation about x and y axis as shown in Fig 4).
  • the accelerometer data can be passed through a low-pass filter which is used to track the gravity vector in the phone coordinates system (Fig. 4). This allows for an independent estimate of the roll and pitch angles (locking in 2 degrees of freedom of the total orientation which needs 3) and correction for any errors accumulated from the roll and pitch angles determined from the gyroscope readings.
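A minimal sketch of the low-pass gravity tracking and roll/pitch extraction described above; the filter coefficient is an illustrative assumption:

```python
import math

def lowpass(prev, sample, alpha=0.9):
    """First-order low-pass filter tracking the slowly varying gravity
    vector in phone coordinates. alpha is a placeholder coefficient."""
    return tuple(alpha * p + (1 - alpha) * s for p, s in zip(prev, sample))

def roll_pitch(gravity):
    """Roll and pitch angles (rad) from the gravity vector, fixing 2 of
    the 3 orientation degrees of freedom independently of the gyro."""
    gx, gy, gz = gravity
    roll = math.atan2(gy, gz)
    pitch = math.atan2(-gx, math.hypot(gy, gz))
    return roll, pitch
```

With the phone flat on a table (gravity along z), both angles come out as zero, which is the reference used to correct accumulated gyroscope error.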
  • This gyroscope-accelerometer method provides one example mechanism for estimating orientation change.
  • the computer system utilizes the combined magnetic field (providing yaw info) and accelerometer vectors (providing roll and pitch info) to track relative orientation change.
  • even where it deviates from North, the magnetic field can still be constant over a local area. Therefore, it can be used as a local reference direction in orientation estimation. In this way, the computer system provides another independent relative orientation estimate, coming from the magnetometer-accelerometer pair, that makes use of external fields.
  • Magnetometer-accelerometer relative orientation estimation can be used when the magnetic field is locally constant, and discarded when the magnetic field is in transition to a new local reference direction.
  • the system and method identifies when the magnetic field is in transition and thereby cannot be used.
  • the change in roll, pitch, and yaw can each be calculated from the gyroscope-accelerometer and magnetometer-accelerometer estimates, and the two results compared. When the difference between the two is greater than a threshold determined heuristically, it can be attributed to a transition of the local magnetic field rather than turning of the mobile device.
  • a dynamic Kalman filter combines the two estimates, which continuously changes the uncertainty weights based on the identification of a constant or transient magnetic field. When the magnetic field is identified as being in transition, higher uncertainty is assigned to the magnetometer-accelerometer estimate, thereby trusting gyroscope-accelerometer estimate more. When the magnetic field is identified as stable, the magnetometer-accelerometer estimate is used in addition to come up with final relative orientation estimate. In this manner, the dynamic Kalman filter is able to obtain highly accurate orientation information using low quality sensors.
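The dynamic weighting described above can be illustrated with a one-dimensional yaw fusion sketch; the variance values and inflation factor are assumptions for illustration, not the patent's parameters:

```python
def fuse(est_gyro, var_gyro, est_mag, var_mag):
    """One Kalman-style update combining the gyroscope-accelerometer
    and magnetometer-accelerometer yaw estimates, each weighted by the
    inverse of its uncertainty (variance)."""
    k = var_gyro / (var_gyro + var_mag)       # Kalman gain
    fused = est_gyro + k * (est_mag - est_gyro)
    fused_var = (1 - k) * var_gyro
    return fused, fused_var

def dynamic_fuse(est_gyro, est_mag, field_in_transition,
                 var_gyro=0.01, var_mag=0.02, inflate=100.0):
    """When the magnetic field is identified as being in transition,
    inflate the magnetometer uncertainty so the gyro path dominates."""
    if field_in_transition:
        var_mag *= inflate
    return fuse(est_gyro, var_gyro, est_mag, var_mag)
```

With a stable field the fused yaw sits between the two estimates; during a field transition it stays close to the gyroscope-accelerometer estimate.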
  • map details can be used in correcting for position/orientation estimation error.
  • the map itself imposes certain restrictions on the position of the user (and thus the device). For instance, the user can only be in valid public hallways and the user can only walk in the directions parallel to hallways (they cannot walk through walls).
  • the system can use this knowledge to improve the estimation of the orientation of the mobile device as well as the estimation of the position of the mobile device.
  • if the orientation of the mobile device does not change significantly for about 5 seconds, and steps are being detected while the user is in a hallway, it can be inferred that the user is walking in a direction parallel to that hallway. As such, it is possible to snap the orientation of the mobile device to the closest direction parallel to that hallway, calibrating for orientation error.
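The orientation-snapping step might be sketched as follows, choosing whichever of the two directions parallel to the hallway is closest to the current yaw:

```python
import math

def snap_to_hallway(yaw, hallway_yaw):
    """Snap the device yaw to the closest direction parallel to the
    hallway (hallway_yaw or hallway_yaw + pi), correcting accumulated
    orientation drift. Angles are in radians."""
    candidates = (hallway_yaw, hallway_yaw + math.pi)

    def ang_diff(a, b):
        # smallest absolute angular difference, wrapped to [0, pi]
        return abs((a - b + math.pi) % (2 * math.pi) - math.pi)

    return min(candidates, key=lambda c: ang_diff(yaw, c))
```
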
  • the position of the user can be improved when the user takes turns. For instance, once identified that the user has taken a turn (for example, a right turn), and from the map it is evident that there is a corridor extending to the right near the current estimated location of the user, the system waits to see if the user takes additional steps in this new orientation. Once steps are detected in the new orientation, the system determines that the user is on the new corridor, and his/her position is updated accordingly.
  • This aspect of the system involving the dynamic update of the user's current position as part of tracking movement of the user, may be implemented using a Bayesian probability based approach. Every time the user passes or approaches a corridor, there is a slight probability that he/she may have entered that corridor. The distance to that corridor from the user's current position, as well as, the angle between the mobile device's current orientation and the closest direction parallel to that corridor can be used as qualifying factors to determine if the corridor is added to the heuristic history probability table. As per these criteria, all the corridors are considered for inclusion into the heuristic history probability table. The heuristic history probability table keeps track of probabilities associated with the user being in any corridor.
  • the position of the user in that corridor is set as the intersection between the corridor in question and the user's current corridor. If a corridor being considered for inclusion is already in the probability table, then the possible user position in that corridor is updated with respect to the user's current position on that corridor. If use of dead reckoning by the system determines that the user is going beyond the ranges of the corridor and outside the allowed map areas, then the user position for that corridor is not updated and the probability associated with that corridor is reduced significantly. In effect, the user's position is restricted to a valid location on the map (that is, the user has to be on one of the defined corridor ranges of the map). The probability table can be updated every time a step is detected and the location with the highest probability is considered as the most accurate estimate of the user's current position, and is the position displayed on the map.
  • Use of a heuristic algorithm is especially useful in identifying and correcting the user's current position in venues that involve multiple turns at close proximity and asymmetric geometry. For example, consider that the user is walking along a corridor and takes a right turn. The floor plan is such that there are two possible corridors where the user could have turned: corridor A or corridor B (Fig. 8). The user might actually turn right on corridor B; however, the heuristic algorithm incorrectly presumes that the user turned right on corridor A. At this time, the heuristic probability table contains both corridors as possible positions but A has a higher probability due to the qualifying factors mentioned above. With initial steps taken after the right turn, the algorithm estimates that the user is proceeding on corridor A.
  • the algorithm will eventually estimate that the user has reached the end of corridor A, and upon further steps the user's position goes beyond the range of corridor A. At this moment, the algorithm will place penalties on corridor A reducing its overall probability in the heuristic table, eventually lowering its probability below the probability given to corridor B, having the next highest probability on the heuristic table. The user will then be repositioned at an appropriate location on corridor B.
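A minimal sketch of the heuristic history probability table described in the preceding bullets; the scoring formula and penalty constant are illustrative assumptions:

```python
class CorridorTracker:
    """Heuristic history probability table: tracks the probability that
    the user is in each candidate corridor, penalizing corridors whose
    range the dead-reckoned position has exceeded."""

    def __init__(self):
        self.table = {}                  # corridor id -> probability

    def consider(self, corridor, distance, angle_off):
        """Add/refresh a corridor, weighted by how close it is and how
        well the device heading aligns with it (qualifying factors)."""
        score = 1.0 / (1.0 + distance + angle_off)
        self.table[corridor] = max(self.table.get(corridor, 0.0), score)

    def penalize(self, corridor, factor=0.2):
        """The user walked beyond this corridor's range: reduce its
        probability significantly."""
        if corridor in self.table:
            self.table[corridor] *= factor

    def best(self):
        """Most probable corridor, i.e. the displayed position."""
        return max(self.table, key=self.table.get)
```

In the corridor A/B example above, A initially scores higher, but after repeated penalties for exceeding A's range, B takes over as the best estimate and the user is repositioned there.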
  • the performance of the system can also be enhanced by using external radio signals, such as Wi-Fi, BLUETOOTH™, RFID tags, and cellular signals, which act as external references to confirm estimated position and orientation.
  • Radio localization requires a manual sweep of the indoor environment for radio signals (e.g. cell tower, AM/FM signal, Wi-Fi, BLUETOOTH™). Any floor area that the user can traverse must be previously swept to build a signal profile library.
  • the signal collection program and device can continuously record the radio signals along the route and label the signals with the current time.
  • time stamps are used to estimate the location of the recorded signals. This is done in a simple way where signals with earlier time stamps are associated with positions near the start of the particular route, and signals with later time stamps are associated with positions near the middle or end of the route.
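The time-stamp interpolation described above amounts to a linear mapping along the swept route, e.g.:

```python
def position_of_sample(t, t_start, t_end, route_start, route_end):
    """Linearly interpolate a recorded signal's position along a swept
    route from its time stamp: earlier stamps map near the start of the
    route, later stamps near the end. Assumes a constant sweep speed."""
    frac = (t - t_start) / (t_end - t_start)
    x = route_start[0] + frac * (route_end[0] - route_start[0])
    y = route_start[1] + frac * (route_end[1] - route_start[1])
    return x, y
```
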
  • the collected radio data is then pre-processed before being used for localization. Pre-processing must be done since different devices have different antenna specifications and firmware which interprets the radio electro-magnetic fields uniquely. So inherently different devices may have different radio maps having different profile values.
  • a method is provided to transform one radio map obtained from one device to radio maps that can be used by other devices. All that is needed is to find the minimum and maximum radio strengths detectable from the new device firmware and normalize the existing map to fit the new range of signals read by this device.
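The device-to-device radio map transformation described above is a linear rescaling, which might be sketched as:

```python
def normalize_map(radio_map, old_min, old_max, new_min, new_max):
    """Rescale an existing radio map (location -> signal strength) so
    its profile values fit the min/max signal-strength range detectable
    by a new device's firmware."""
    scale = (new_max - new_min) / (old_max - old_min)
    return {loc: new_min + (rssi - old_min) * scale
            for loc, rssi in radio_map.items()}
```
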
  • a Bayesian network is used to generate, with heuristic probabilities, the signal profiles for various positions on the floor. Then, during run-time the system passes the current signals being read from the antenna through the Bayesian network which then infers which location has the highest probability.
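As one illustration of the inference step, a naive-Bayes variant (an assumption for simplicity; the patent describes a Bayesian network more generally) that scores each location's stored signal profile against the current antenna reading:

```python
import math

def most_likely_location(reading, profiles, sigma=4.0):
    """Infer the location whose stored profile best explains the current
    reading (access point -> RSSI), using a log-likelihood with an
    assumed Gaussian RSSI noise of sigma dB per access point."""
    best, best_ll = None, -math.inf
    for loc, profile in profiles.items():
        ll = 0.0
        for ap, rssi in reading.items():
            mu = profile.get(ap, -100.0)   # unseen AP: very weak prior
            ll += -((rssi - mu) ** 2) / (2 * sigma ** 2)
        if ll > best_ll:
            best, best_ll = loc, ll
    return best
```
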
  • the radio sweeping process may be crowd sourced using the inertial navigation (dead reckoning) system mentioned above.
  • This crowd sourcing approach has the advantage of building radio signal maps at multiple locations very quickly from collected user data.
  • Radio prediction raises two issues.
  • one issue is coarse grain localization, with an uncertainty of usually 5-10 meters that can even go as high as 15 meters. This uncertainty radius depends on the environment and is not predictable or controllable. This is the main disadvantage of radio localization.
  • to obtain fine grain position estimation, the radio localization is augmented with the inertial navigation methods.
  • an initial position once the computer system is first turned on is estimated from radio localization.
  • a pattern and line of movement of the user are identified. This information is used to set the user heading/orientation. Feeding the position and orientation information to the inertial navigation system gives ability to distinguish fine grain position movement on the order of steps.
  • a dynamic Kalman filter may be used to amalgamate the position estimates coming from inertial navigation and Wi-Fi estimation. In general, a Kalman filter is used in many position estimation problems.
  • The Kalman filter is a proven recursive filter able to combine two independent sets of noisy inputs into an output signal with minimized noise. It takes into account the known uncertainty distributions of each sensor and identifies the situations where one sensor can give more reliable information than the other. So in one scenario, two different inputs are provided, one coming from the inertial navigation estimate and the other from the radio estimate. The uncertainty distributions of each are dynamically varied based on the context of the map. For example, if it is known a priori that Wi-Fi performs poorly at certain hallway intersections, then the dynamic Kalman filter will weight the inertial estimate higher. The dynamic Kalman filter can produce position estimates with 2-3 meter uncertainty.
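A one-dimensional sketch of fusing inertial and radio position estimates with a recursive Kalman-style filter; the class name, variance values, and step model are illustrative assumptions:

```python
class PositionKalman1D:
    """Minimal recursive filter fusing dead-reckoned steps (prediction)
    with coarse radio fixes (measurement). The radio variance can be
    inflated a priori in areas, such as certain hallway intersections,
    where Wi-Fi is known to perform poorly."""

    def __init__(self, x0, var0=25.0):
        self.x, self.var = x0, var0

    def predict(self, step_dx, step_var=0.04):
        """Inertial update: advance position by one detected step and
        grow the uncertainty by the step-length variance."""
        self.x += step_dx
        self.var += step_var

    def update(self, radio_x, radio_var=49.0):
        """Radio update: blend in a coarse radio fix, weighted by the
        relative uncertainties of the two estimates."""
        k = self.var / (self.var + radio_var)   # Kalman gain
        self.x += k * (radio_x - self.x)
        self.var *= (1 - k)
```

Each radio update pulls the dead-reckoned position toward the fix and shrinks the overall uncertainty.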
  • a web portal is included, implemented for example to a server computer including a web based utility, that when executed presents certain map information and views based on access granted to users. Based on user authentication/authorization, it is possible to change, update, and edit the map and content information. Facility administrators who are responsible for way-finding can sign into the web portal to update allowed routes, mark off construction zones, highlight detour corridors, and mark private and public routes (e.g., staff would have access to private corridors during navigation). Once these updates are communicated to the main server, all mobile devices running the specific way-finding computer program will be notified of the update and asked to download the changes to the new map database, file, and images.
  • Marketing administrators can also login to the web portal for managing content related to location based alerts, department information, and any ongoing promotions. Once these updates are uploaded to the main platform server, the next time the user's mobile device contacts the server the content database will be updated. When a user clicks on a department icon, the new information will be displayed. Or if the user is within some radius of the department a pop-up alert with the new information can be displayed.
  • the Web portal can also be made available to members of the public who would like navigation and routing information. In this manner an individual can access the web portal and request routing information from the main server. They will be required to set a starting point and a destination point through the web portal interface and a route will be returned and plotted on the facility map.
  • the web map can also have an interactive component where users see an icon (representing position) jump from the initial starting point, to key turn locations all the way to their destination by clicking next and previous arrows on the map.
  • the URL of the map page can be copied and emailed to anyone using any device with a web browser (out-of-band communication), which when opened will display the map with the requested route pre-plotted. Once the URL is opened on a mobile device or computer system the navigation information as well as the interactive component is cached on local memory so the user can view and interact with the map offline.
  • the user may generate a URL link through the application to send to another user, as described above.
  • the URL may be sent to an email address or shared in a chat room.
  • the URL contains all embedded data for route configurations to be re-created on the web or mobile application on another device, preferably without the need for data connectivity if the application is a mobile application.
  • the user may create "tours" which combine routes with embedded location based alerts with customized triggers on the web or mobile application.
  • the user may generate a URL link of the tour to send to another device.
  • a user creating a tour for a grocery store may include destination points for items such as eggs, milk and bread.
  • the user may further include proximity based alerts.
  • the user may create alerts such as "one 2% bottle, and one bag of skim milk" that appear when the device approaches the dairy shelf.
  • the location based alert contents are stored on the remote server and associated with IDs; the IDs are stored in the URL link.
  • the device receiving the URL link may then download the location based alert contents associated with those IDs to recreate the tour.
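One way the route and alert IDs might be embedded in a shareable URL; the parameter names and encoding scheme are illustrative assumptions, not the patent's format:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def route_url(base, start, dest, alert_ids):
    """Encode a route, plus the IDs of server-stored location based
    alerts, into a shareable URL."""
    q = {"start": start, "dest": dest, "alerts": ",".join(alert_ids)}
    return base + "?" + urlencode(q)

def decode_route(url):
    """Recreate the route configuration on the receiving device; the
    alert IDs can then be used to download the alert contents."""
    q = parse_qs(urlparse(url).query)
    return {"start": q["start"][0], "dest": q["dest"][0],
            "alerts": q["alerts"][0].split(",")}
```
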
  • the user interface enables presentation of way-finding information.
  • the application is first initiated by the user on the mobile device. On initiation, GPS geo-fencing can be used to determine all the indoor venues that the user is near (FIG 7a). If the map is stored on the mobile device, then the floor plan becomes viewable on the device screen (FIG 7b). If the map is not stored on the device's local locations database, then the map and locations database can be downloaded from the main server through active data/Wi-Fi connection. With this, the user can browse locations and map features. To initiate navigation, in one implementation, the application prompts the user to pan the camera and capture signage information (FIG 7c) to infer initial position and orientation of the device.
  • the user can then search for a destination through the smart search feature, and once destination is chosen a route will be plotted from their initial location (FIG 7d). Audio cues, alerting users when to turn, as well as automated map re-orientation (orienting the map based on device and route orientation) will be activated. While on route the user can also receive location based content (FIG 7e). Once the user arrives at the destination they will be notified by audio cue and new marker (green checkmark icon) at their destination (FIG 7f).
  • the dashboard page (FIG 7g) will be the main page giving snapshot updates on various components. Administration can view recent chat messages, analytics on quarterly usage, etc.
  • the analytics page gives the administration in-depth coverage of useful statistics mined from the main server (FIG 7h). These could be foot traffic hot spots based on time of day, or infection hotspots based on routing and medical record information.
  • the users/staff page allows management to grant specific access to users based on their authorization (FIG 7i).
  • the maps page allows for administration to make updates to map routes, markers, and location based content (FIG 7j). Administration can update route weights to account for renovation/construction, or change location based alert messages.
  • Map information or map data may incorporate venue or building specific data used by the computer system.
  • building maps may be rendered as vectorised map tiles used for visual representation of buildings on the screen of a user's mobile device.
  • the vectorization information comprises a collection of boundaries represented by polygon coordinate vertices, used to segment the venue map into departments, rooms, corridors, and open areas.
  • Vectorization information may also comprise singular coordinates called markers, used for representing individual item locations such as vending machines, ATMs, doors, and entrances.
  • the map data may further comprise another set of boundaries and polygon perimeters defined to segment map areas to assign various attributes and properties. For example, those areas which are handicap accessible may be segmented and assigned accessibility attributes.
  • Corridors, hallways, and passageways may be segmented and assigned attributes according to security clearances dictating which classes of users may access these routes. This may allow building staff, or other users meeting thresholds of security clearance, to access more detailed and sensitive routing information while hiding these features from the general public. A skilled reader will appreciate that this and other vectorization information may enable and/or improve the generation of way-finding information as described herein.
  • Map data may further comprise a directory list which associates occupant information with map segmented areas. For example, in a hospital all department segments may have a corresponding directory of staff who work in that department, as well as the scheduled hours for each staff member. Additionally, patient room segments may have a directory list which includes the doctor and/or patient who is scheduled to be in that room. The directory list for each segmented map area may also store directory information for different times based on a known schedule of events.
  • map data may comprise a database storing location based alerts and/or services and their corresponding locations.
  • the location based alerts may be customizable for various users based on their authorization and profile information.
  • the location based alerts may also be provided with associative distance thresholds to determine when the alerts will be pushed onto the device screen: A user's position is continuously monitored and compared against the database to see if the user's position is within an alert radius.
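The alert-radius check described above, in which the user's position is continuously compared against the alert database, can be sketched as follows. This is a minimal illustration under assumed data structures: each alert is a dict with hypothetical `pos` and `radius` fields in planar coordinates; none of these names come from the disclosure.

```python
import math

def alerts_in_range(user_pos, alerts):
    """Return the alerts whose radius contains the user's (x, y) position.

    `alerts` is a hypothetical list of dicts with 'pos' (x, y) and
    'radius' keys; the field names are illustrative only.
    """
    triggered = []
    for alert in alerts:
        dx = user_pos[0] - alert["pos"][0]
        dy = user_pos[1] - alert["pos"][1]
        # Push the alert if the user is inside its radius.
        if math.hypot(dx, dy) <= alert["radius"]:
            triggered.append(alert)
    return triggered
```

In practice this comparison would run on each position update produced by the tracking subsystem.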
  • tracking and routing functionalities may work when the system is offline, i.e., not in communication with any external server. One requirement for offline navigation is that certain map data must be pre-downloaded from the remote server. Methods are therefore provided for downloading map data.
  • map data are downloaded from the server to device memory on occurrence of various triggering events. For instance, a trigger may occur when a user selects a venue within which to navigate.
  • the application checks whether the venue map and location database are already stored on the device memory and whether their timestamps are up-to-date with respect to the timestamps of the venue map and location database stored on the server. If this check returns false, and an active data or Wi-Fi connection exists, the application automatically contacts the map server to begin downloading the relevant venue map data; otherwise, the user is alerted to first establish an active data connection to begin downloading.
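The staleness check above can be sketched as follows; the timestamp dictionaries are assumed stand-ins for the device's local locations database and the server's records.

```python
def needs_download(venue_id, local_maps, server_timestamps):
    """True if the venue map is absent locally or stale relative to the server.

    `local_maps` maps venue_id -> timestamp of the cached copy;
    `server_timestamps` maps venue_id -> timestamp on the map server.
    Both structures are illustrative assumptions.
    """
    local_ts = local_maps.get(venue_id)
    if local_ts is None:
        return True  # no cached copy at all
    # If the server has no record, treat the cached copy as up to date.
    return local_ts < server_timestamps.get(venue_id, local_ts)
```

When this check returns true, the application would then verify that a data or Wi-Fi connection exists before contacting the map server.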
  • Other triggers may be established by geo-fencing the GPS location of the user's device.
  • the device location may be monitored while the application runs in the background. Periodically the GPS coordinates are compared against a list of other known venue coordinates stored in device memory and, occasionally, the updated venue list stored on the cloud server if accessible. All venues located within a distance threshold of the GPS location are returned. If a data connection is established, the maps for the returned venues may be downloaded in the background in anticipation that the user is more likely to visit these venues.
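A sketch of the geo-fencing comparison, assuming venue coordinates are stored as (latitude, longitude) pairs; the equirectangular distance approximation is an implementation choice, not something specified in the disclosure.

```python
import math

def nearby_venues(gps, venues, threshold_m):
    """Return names of venues within threshold_m of the current GPS fix.

    Uses an equirectangular approximation (adequate for the short
    distances involved); venue coordinates are (lat, lon) in degrees.
    """
    lat0, lon0 = gps
    results = []
    for name, (lat, lon) in venues.items():
        # Convert the degree offsets to metres on the Earth's surface.
        dx = math.radians(lon - lon0) * math.cos(math.radians(lat0)) * 6371000
        dy = math.radians(lat - lat0) * 6371000
        if math.hypot(dx, dy) <= threshold_m:
            results.append(name)
    return results
```

Maps for the returned venues could then be queued for background download whenever a data connection is available.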
  • An extension of this method is to use GPS location history to identify popular geo-zones which the user frequents. Map data for corresponding venues in proximity to these zones may be downloaded while the application runs in the background.
  • Still further triggers involve identifying radio signatures the device is in contact with.
  • the base tower ID located within the vicinity corresponds to a specific geo-location.
  • these alternative signals may be used to infer the macro-level position of the user in the order of hundreds of meters to a few kilometers. This is generally sufficient granularity to analyze the movement patterns of the user in order to identify which venue maps the application should download in the background.
  • the priority download queue system prioritizes, in a download queue, the order in which venue map data are downloaded.
  • Map data for each venue may be segmented by department, floor, and resolution to create a priority queue in which certain components of venue map data are downloaded first.
  • the download queue for the map data can be arranged as follows: The map data pertaining to the nearest entrance or most common entrances will be downloaded first, followed by map information relating to check-in and security areas; following that, map information relating to gate and boarding areas will be downloaded. This represents the order in which the user traverses the airport, and therefore the time at which specific data will be needed.
  • the priority download queue may be extended to venues with multiple storeys, such as office buildings and multi-level retail complexes. Map data for entrances will be prioritized in the download queue, followed by data related to entrance level floors, and then followed by data pertaining to adjacent floors.
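The priority download queue can be sketched as below for the airport example: map-data segments tagged with area types are popped in the order the user would traverse them. The area names and numeric ranks are illustrative assumptions.

```python
import heapq

# Illustrative ranks: lower numbers download first, mirroring the
# traversal order described above (entrances, then check-in/security,
# then gate and boarding areas). Unknown areas sort last.
PRIORITY = {"entrance": 0, "check-in": 1, "security": 1,
            "gate": 2, "boarding": 2}

def build_download_queue(segments):
    """Order map-data segments for download by assumed area priority."""
    # The index i breaks ties so dicts are never compared directly.
    heap = [(PRIORITY.get(seg["area"], 99), i, seg)
            for i, seg in enumerate(segments)]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]
```

For a multi-storey venue, the same structure would rank entrance-level floors ahead of adjacent floors.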
  • priority is assigned on the basis of which areas are known to be more confusing to navigate. Map data related to those areas which are expected to be more confusing to navigate are downloaded first, thereby reducing the chances of navigation interruption occurring at those locations.
  • the priority download queue system may enhance the user experience by mitigating the effects of interruptions to downloads of venue map data caused by poor data connections.
  • the user is shown on the user's device an interactive map of venues or buildings.
  • the user may be able to perform pan and zoom functions to explore the entirety of the map.
  • the device resources needed in loading and displaying all aspects of the map data such as marker items and department names during these user interactions may be costly. Often there may be thousands of markers for a specific floor in the map database. Updating the positions and orientations for many marker items and department names during gesture commands may overload the device processor, resulting in a significant lag time apparent to the user.
  • Measures are therefore provided to ensure that the content displayed on the device runs with minimal lag during navigation mode or user interaction.
  • the map area is divided into smaller map sectors.
  • the size of each sector is dictated by the current zoom level of the application. For example, at the lowest zoom level the sector size could be as big as the entire venue floor, while at the highest level the sector size could be as small as a room.
  • Zoom attributes are assigned to items, such as markers or room boundaries, contained in the map data. The zoom attributes dictate the zoom levels at which items are displayed on screen. This allows marker items such as vending machines to be displayed only at higher zoom levels, while, for example, department names may be displayed at lower zoom levels. This also may allow for nested map displays or "map within a map" situations.
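The zoom-attribute filtering described above might be implemented as a simple per-frame filter; the `min_zoom`/`max_zoom` attribute names are assumptions for illustration.

```python
def visible_items(items, zoom):
    """Return map items whose zoom attributes admit the current zoom level.

    Each item carries hypothetical 'min_zoom'/'max_zoom' attributes:
    e.g. a vending-machine marker tagged min_zoom=4 appears only when
    zoomed in, while a department name might use min_zoom=0, max_zoom=2.
    """
    return [it for it in items
            if it.get("min_zoom", 0) <= zoom <= it.get("max_zoom", 99)]
```

Filtering items this way reduces the number of markers whose positions must be updated during pan and zoom gestures, mitigating the lag described earlier.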
  • a department store inside a mall may be displayed as a general non-detailed polygon on the map at low zoom and may generally include one or two markers describing the main store properties.
  • the map may be populated with department store specific details, such as product aisles, cash register locations, etc.
  • an estimated time of arrival may be provided to the user once a destination and start position are determined.
  • Estimated arrival time can be determined using an average step length and average step frequency gathered from collected user motion data as previously described. For example, if the distance from the starting point to the destination is 100 feet, the average step length is found to be 2 feet and the average step frequency is estimated to be 2 steps per second, then the estimated arrival time may be determined as 100 ft ÷ (2 ft/step × 2 steps/s) = 25 seconds.
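The worked example above reduces to a one-line computation, since walking speed is step length multiplied by step frequency:

```python
def estimated_arrival_seconds(distance_ft, step_length_ft, steps_per_second):
    """ETA in seconds: distance divided by walking speed."""
    return distance_ft / (step_length_ft * steps_per_second)
```

With the figures in the example, `estimated_arrival_seconds(100, 2, 2)` yields 25.0 seconds.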
  • Average step length may be regularly adjusted based on data averaged from past navigation completion times for the current user.
  • the incorporation of such corrections may provide a more precise routing model to determine the optimal path a user should take to reach a destination, and to estimate the time of arrival. Further, the estimated time of arrival may be relayed to the user and updated in real time during navigation based on the application configuration.
  • the user may select within the application a multipoint or multi- destination route.
  • Multi-destination routes allow the user to select several destinations from the origin point.
  • an icon is displayed on the map, prompting the user to, for example, continue along the route, cancel the list of destinations, or modify the remaining destinations.
  • the user may have a shopping list comprising, for example, eggs, milk, bread, etc. Each of these items may be located in a different location within the grocery store.
  • the list could be translated into a multi-destination route with destinations provided at respective locations for each item on the user's shopping list.
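One way to translate a shopping list into a multi-destination route is a greedy nearest-neighbour ordering of the item locations. The disclosure does not specify an ordering strategy, so this heuristic, and the data structures, are assumptions for illustration.

```python
import math

def order_stops(origin, item_locations):
    """Greedy nearest-neighbour ordering of shopping-list stops.

    `item_locations` maps item name -> (x, y) planar coordinates.
    Visits the closest remaining item from the current position.
    """
    route, pos = [], origin
    remaining = dict(item_locations)
    while remaining:
        item = min(remaining, key=lambda k: math.dist(pos, remaining[k]))
        route.append(item)
        pos = remaining.pop(item)
    return route
```

The resulting ordered list of destinations would then be fed to the routing engine as a multi-destination route.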
  • the application may also be used for managing appointments at multiple locations.
  • One major concern of creating an appointment is whether the parties attending the appointment will be on time, or will not make it.
  • doctors and patients are frequently scheduled consecutively. When one patient or doctor is late to an appointment the entire schedule for the day becomes shifted. Therefore, the application may be configured to send updates on user location and predicted arrival times to appointment location staff to better manage appointment schedules or queues. These updates may be periodic or based on certain triggers such as when the user enters or starts the route to the appointment location, or when the estimated time of arrival exceeds the appointment start time.
  • the above automated notifications may be sent to both the user and the staff at the appointment location.
  • Notifications to the staff may be sent to the server which relays them to the client application used by the location staff.
  • the communications may be used for the purposes of cancelling or rescheduling the appointment.
  • the user may also be notified or asked to confirm certain event changes if need be.
  • users may authorize appointment location staff to see the user's current position while en route to the appointment. The user's position may be updated periodically for the location staff to monitor.
  • a communication protocol for enabling an application running on one device to communicate directly with applications running on other devices using available BLUETOOTHTM architecture found in many mobile devices.
  • Devices may scan, connect, or broadcast data to other devices running the application within typical BLUETOOTHTM ranges, i.e., typically less than 50 metres.
  • Information such as device location and position, specific map data, and user generated alerts may be shared. Although the bandwidth of data thus shared may be on the order of megabits per second, this can be adequate in certain applications.
  • One embodiment of the communication protocol relates to the emergency notification feature of the application.
  • users may broadcast via the application an emergency message to venue staff or security via the server. If the user's device is unable to establish a connection to the server in order to broadcast the emergency message, the application additionally broadcasts the message via BLUETOOTHTM to other devices within range running the application. Once the other devices receive the message, they push the message to the server if they are able to connect to the server; otherwise, the other devices re-broadcast the message via BLUETOOTHTM to other devices within their proximity. The other devices continuously re-broadcast the message until they are notified that the server has successfully received the message.
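The store-and-forward relay described above can be sketched as follows. The device and peer interfaces (`can_reach_server`, `push_to_server`, `receive`) are hypothetical; a real implementation would also de-duplicate messages and keep re-broadcasting until a server acknowledgement arrives, as described.

```python
def relay_emergency(message, device, peers):
    """Sketch of the emergency-message relay; returns True once the
    server has the message (directly or via a peer)."""
    if device.can_reach_server():
        device.push_to_server(message)
        return True
    # No server link: re-broadcast to in-range peers, each of which
    # repeats the process until one of them reaches the server.
    for peer in peers:
        if peer.receive(message):
            return True
    return False
```

Because each peer applies the same logic on receipt, the message propagates hop by hop until any device in the chain has server connectivity.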
  • contact may be initiated through a front/info desk or off-site customer service representative via the mobile device, through the messaging service, transferring data through secure/unsecure wireless server connection.
  • This provides a convenient alternative, rather than having to physically walk to the info desk.
  • the user's position information can automatically be relayed to the support staff, enabling better responses to various inquiries. For example, if the user is contacting the support staff to report an emergency, the location of the emergency can be crucial for support staff to notify emergency personnel.
  • the customer service staff can access any received messages by logging on to the web portal; once they enter their access code, they can see the history of all incoming/outgoing messages, join any threads, and continue or initiate conversations.
  • the messages can also be stored on the central server for further review at a later period.
  • Data mining of the stored messages can reveal certain trends connecting location and the questions that users ask. Even the choice of words can be mined from the messages to learn whether users feel more frustrated, angry, or confused in certain locations as opposed to others.
  • navigation and position history can be stored for later analysis.
  • it is proposed that the mobile device store attributes such as the start and end points of each user's route, the time taken, and points where the user is idle, in the device's internal memory.
  • the stored information can then be sent to the main server via secure/unsecure wireless transfer.
  • One can draw many conclusions from the aggregated data from many users and many routes stored on the server. Areas of high traffic, areas where a majority of users pause during navigation, and preferred routes taken to get to a destination can all be used for marketing and logistics. This information can be accessed from the web portal for a convenient dashboard to view all the analytics.
  • Another application is checking in to a facility. This could be a restaurant, clinic, hospital, or store where users are required to wait in a queue to be served. Usually the facility, vendor, or store requires the person to be present to check in or be added to the queue. Using the location aware ability of the application, specifically when radio localization is used, the application knows if the user is indeed at the facility, vendor area, or store and can then give them the permissions to automatically check in and add themselves to the queue. The application can also be used to alert users when their queue position nears the top, and can remind them to head back to the service location if they are away.
  • Certain services provided in facilities could have satellite locations dispersed around the venue.
  • pharmacies, banks, kiosks may have multiple locations in one building. If the user has requested a service from one of these vendors it would be useful for the vendor to know the location information of the user, so as to manage which satellite branch would service the user and accordingly allocate resources. If the user has a route plotted or appointment booked at a location near the satellites, management can coordinate that the nearest satellite location will service the user.
  • Locating other individuals within a facility may be desired. It is proposed to take advantage of the location aware capabilities of the application to request the location of other users of the application.
  • the app includes an option to set the user's status to online or offline mode. When online, users can view profiles of other online users and message them and even request their location. If the consent of the user is given to relay the location information a route can be plotted to said location. This can be vital in a hospital or other facilities where staff using the computer system or their mobile device would like to communicate and find other staff.
  • the application can be used to broadcast and notify other users and facility management of emergencies. Users running the application have access to an emergency button that, if pressed, sends a message to the server which includes the user, their location, and, if applicable, the state of emergency. As described in [00148], the application can also broadcast messages via BLUETOOTHTM to other devices running the application that are within proximity.
  • different modes of navigation can also be activated, such as leisure or time-sensitive modes.
  • the application can read appointment entries made through the application, and native or linked calendars, to identify whether the user is in a rush or has time to browse leisurely.
  • In time-sensitive mode, the application plots the most efficient route to the user's appointment or destination.
  • the computer program can also identify what the user's interests are (clothing, gadgets, etc.), and in leisure mode the user can be routed past points of interest that relate to that profile.
  • the technology described may be integrated with or may connect to various other platforms, technologies, or solutions for which way-finding is complementary.
  • the computer system may incorporate systems and processes for delivering various location-based services such as location-based advertising, offers from local businesses (including businesses along a route shown by the computer system), news/info/entertainment services, social networking services, and so on.
  • the computer system may also integrate voice activated features for initiating way-finding and other functions.
  • FIG. 5 shows a generic computer device 100 that may include a central processing unit (“CPU") 102 connected to a storage unit 104 and to a random access memory 106.
  • the CPU 102 may process an operating system 101, application program 103, and data 123.
  • the operating system 101, application program 103, and data 123 may be stored in storage unit 104 and loaded into memory 106, as may be required.
  • Computer device 100 may further include a graphics processing unit (GPU) 122 which is operatively connected to CPU 102 and to memory 106 to offload intensive image processing calculations from CPU 102 and run these calculations in parallel with CPU 102.
  • An operator 107 may interact with the computer device 100 using a video display 108 connected by a video interface 105, and various input/output devices such as a keyboard 110, mouse 112, and disk drive or solid state drive 114 connected by an I/O interface 109.
  • the mouse 112 may be configured to control movement of a cursor in the video display 108, and to operate various graphical user interface (GUI) controls appearing in the video display 108 with a mouse button.
  • the disk drive or solid state drive 114 may be configured to accept computer readable media 116.
  • the computer device 100 may form part of a network via a network interface 111, allowing the computer device 100 to communicate with other suitably configured data processing systems (not shown).
  • One or more different types of sensors 130 may be connected to the computer device 100.
  • the present system and method may be practiced on virtually any manner of computer device including a desktop computer, laptop computer, tablet computer or wireless handheld.
  • the present system and method may also be implemented as a computer-readable/useable medium that includes computer program code to enable one or more computer devices to implement each of the various process steps.
  • the computer devices are networked to distribute the various steps of the operation.
  • the terms computer-readable medium or computer useable medium comprise one or more of any type of physical embodiment of the program code.
  • the computer-readable/useable medium can comprise program code embodied on one or more portable storage articles of manufacture (e.g. an optical disc, a magnetic disk, a tape, etc.), or on one or more data storage portions of a computing device, such as memory associated with a computer and/or a storage system.
  • the mobile application may be implemented as a web service, where the mobile device includes a link for accessing the web service, rather than a native application.
  • the functionality described may be implemented on any mobile platform, including the iOS platform, ANDROIDTM, WINDOWSTM or BLACKBERRYTM.
  • the computer systems and methods disclosed may provide cost effective and accurate way-finding solutions for indoor environments, and possibly additional outdoor environments.
  • the disclosure provides systems, devices, methods, and computer programming products, including non-transient machine-readable instruction sets, for use in implementing such methods and enabling the functionality described previously.
  • While the disclosure has been described and illustrated in exemplary forms with a certain degree of particularity, it is noted that the description and illustrations have been made by way of example only. Numerous changes in the details of construction and combination and arrangement of parts and steps may be made. Accordingly, such changes are intended to be included in the invention, the scope of which is defined by the claims.


Abstract

A mobile computing implemented platform is provided that enables the generation of way-finding information by extracting current location information through analysis of one or more photographs including information relevant to a physical landmark in a facility. Related systems and methods are provided for generating the way-finding information. Other systems and methods utilize dead reckoning for way-finding.

Description

NAVIGATION COMPUTER SYSTEM INCLUDING LANDMARK IDENTIFIER SCANNING FIELD OF THE INVENTION
[0001] The following relates generally to mobile technologies and further relates to methods and systems for way-finding.
BACKGROUND
[0002] Modern lifestyles see individuals spending more time indoors. People are used to working in large offices, shopping in giant complexes, and attending classes at large institutional campuses. Yet there is generally no comprehensive solution to help users locate and navigate themselves indoors, in a manner comparable to the GPS positioning available outdoors. Even outdoors, GPS-based solutions sometimes do not provide sufficient information. There may be gaps in the availability of GPS services, and data service costs associated with using GPS services can be significant.
[0003] Particularly in the case of individuals inside malls, museums, airports, hospitals, institutions, stores, and office buildings, etc. there is a need to know what their current position or orientation is in relation to their surroundings, and to access route information to one or more destinations of interest ("way-finding information"). With this capability individuals can determine the location on a map of, or way to access, various points of interest including particular stores, rooms or areas in a complex, washrooms, ATMs, vending machines, telephones, etc. They can also determine what route they should take to reach their destination.
[0004] In response, larger facilities inevitably display large maps with a "You are here" icon or equivalent to help orient users. Other way-finding solutions are known, and may include printed maps, way-finding kiosks, or smart displays, for example. These are generally dispersed sparsely around the venue. With the advent of GPS technology, consumers expect that the same comforts and ease of navigation experienced on the road could somehow be transferred indoors, but this has not been the case: for indoor environments especially, way-finding solutions that extend beyond fixed way-finding points (given the lack of access to GPS) are generally unavailable, costly, and often complicated to implement.
[0005] Today's modern lifestyle has also brought on a widespread adoption of smartphones among individuals. As an example, 85% of the US population owns smartphones, equipped with Wi-Fi/cell antennas and modems, a high resolution camera, and a combination of inertial sensors. This has caused a wave of companies to exploit these onboard sensors as a means to localize users. The most common solution has been to use external radio signals for localization. This involves mapping the radio signals of the venue/facility and defining zones on the map, each having a particular signal profile. The signal profile contains unique radio source tags (identifiers) and their measured average signal strengths. When a user's mobile device polls the radio information and finds the data match one of the zone profiles, the user's position is pinned to the location coordinates of the zone. Common radio sources mapped are those of Wi-Fi access points, BLUETOOTH™ beacons, and cell tower signals. In all cases this requires manual sweeps of zones to create the radio map, and installing additional infrastructure to increase the accuracy of positioning. To adopt this solution in large institutions is a daunting task which requires great amounts of time, money, and maintenance for all added infrastructure. Also, if the facility is renovated or existing radio networks are upgraded, repeated sweeps may need to be conducted to ensure accurate localization, demanding more time and money. Another solution involves installing unique codes around the facility, such as QR codes, which contain location information. When the user scans a QR code on their phone, the application reads the location information and displays the user's position on the map. Again, this requires installing these codes at various points in the facility; the codes also have to be maintained, disrupt the look and feel of the environment, and might not fit the interior design theme.
[0006] There is therefore a need for a mobile way-finding solution that is efficient, effective, and economical that does not require a great amount of resources or time to set up.
SUMMARY
[0007] In embodiments, a computer system and computer implemented method are provided that enable a user to access way-finding information relevant to his/her current physical location using a mobile device, including in indoor environments, comprising: (a) obtaining information for a starting location or approximate starting location ("starting location") in a current physical facility, (b) determining a target location or approximate target location ("target location") in the facility, based on a destination, activity, resource, or service of interest to the user, (c) obtaining way-finding information, such as a map, based on the starting location and target location, (d) initiating one or more inertial functions of the mobile device, and dynamically and in real time analyzing information generated by the inertial functions, using one or more analytical methods, so as to track the movement, or estimated movement, of the user between the starting location and target location ("movement tracking data"), and (e) presenting way-finding information to the user based on the movement tracking data.
[0008] In one aspect, the user's movements are tracked using one or more of device camera image data, electromagnetic signals such as cell phone waves, AM/FM waves, Wi-Fi signal, RFID tags, GPS signal and magnetic field, and/or inertial sensors such as an accelerometer and a gyroscope.
[0009] In embodiments, at least one location aware application uses the movement tracking data so as to provide location based services to the user.
[0010] In further embodiments, dead reckoning is used to generate movement tracking data.
[0011] In still further embodiments, step detection is used to track movement of the user, using mobile device functions. In a further related aspect, the computer system logs stepping information of a user to a profile and this is accessed upon initiation of a way-finding session in order to improve accuracy of tracking of a user.
[0012] In another aspect, novel and innovative phone orientation techniques are used in combination with dead reckoning.
[0013] In a further aspect, the computer system and method use statistical methods for removing errors and thereby improve iterative estimation of location using analytical methods.
[0014] In embodiments, a computer implemented method provides to a user access to way-finding information relevant to his/her current physical location using a mobile device, the method comprising: (a) taking one or more photographs, or a continuous stream of images in the form of a camera pan, of one or more identifiers of landmarks in the physical location that are indicative of the current physical location, (b) extracting identifier information from the one or more photographs, (c) comparing the identifier information to a map information database that includes identifier information, to determine the current physical location or approximate the current physical location of the user, (d) selecting the current physical location from a list of searchable or non-searchable possible locations on the device, (e) selecting the current physical location through clicking on an icon representing a physical location, on the map shown on the device, (f) using voice-activated features to set the current physical location, (g) integrating any of the mentioned methods with 'smart search' to set the current physical location and/or the destination explained below, (h) based on the current physical location or the approximate current physical location, generating way-finding information and presenting this way-finding information, or initiating its presentation, to the user at the mobile device, (i) tracking the current location of the user as the user moves, through the use of one or more of camera, electromagnetic signals such as cell phone waves, AM/FM waves, Wi-Fi signal, RFID tags, GPS signal and magnetic field, and/or inertial sensors such as accelerometer and gyroscope, (j) tracking the current location of the user as the user proceeds towards a destination through use of the above mentioned signals and/or sensors, (k) estimating the distance between the user's current position and points of interest or the destination as well as the time required to reach the points of interest or the destination, (l) communicating with external systems or servers to aid in calculation of any of the above and/or to make use of any of the above information for context-aware applications, (m) providing a self-contained solution for indoor navigation that does not require any communication with any external server or device, acting as an offline solution to indoor navigation.
[0015] "Smart search" provides a humanistic way of searching which has the ability to link natural language tags with locations in the environment.
[0016] For example, by searching for "shoes" at a mall, the user is returned store names that specialize in or sell shoes. In another example, searching for "eye test" at a hospital venue returns destinations such as "ophthalmology department".
[0017] In embodiments, there is provided a non-transitory computer program product tangibly embodying code that, when executed by a processor, causes the processor to carry out at least one of the methods described herein.
[0018] In this respect, before explaining embodiments in detail, it is to be understood that the present methods and systems are not limited in their application to the details of construction and to the arrangements of the components set forth in the following description or the examples provided therein, or illustrated in the drawings. The present methods and systems are capable of other embodiments and of being practiced and carried out in various ways. It is further to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting.
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] In the drawings, embodiments are illustrated by way of example. It is to be expressly understood that the description and drawings are only for the purpose of illustration and as an aid to understanding, and are not intended as a definition of the limits of the claims.
[0020] FIG. 1 is a diagram illustrating an example of a room number identifier.
[0021] FIG. 2 is an example of a direction sign identifier.
[0022] FIG. 3 is a further example of an identifier, in this case a washroom symbol.

[0023] FIG. 4 illustrates the axis representation of a phone.
[0024] FIG. 5 is a diagram illustrating one orientation of the phone held by users during walking.
[0025] FIG. 6 is an example of how key lines and features can be extracted from a camera image.
[0026] FIG. 7a depicts the application during start-up where user can choose specific facility to start navigating inside.
[0027] FIG. 7b is an example of a venue floor plan with markers of points of interest, and a favorites bar with popular destinations.
[0028] FIG. 7c depicts a process to initialize location and orientation estimation via text/image recognition of a room sign.
[0029] FIG. 7d depicts a user's position initialized onto the map, and a plotted route to a destination.
[0030] FIG. 7e depicts one variation of a location aware pop-up delivered to the user during navigation.
[0031] FIG. 7f is an example of the user being notified they have reached their destination by changing the destination icon to a green checkmark.
[0032] FIG. 7g depicts the web portal dashboard giving snapshots of departmental information.
[0033] FIG. 7h is an example of the web portal analytics page depicting infection hot-spots on a floor plan.
[0034] FIG. 7i is an example of the web portal user/staff administration page allowing management to grant specific access to various users.
[0035] FIG. 7j is an example of the web portal maps page, where administration is able to edit map routes, markers, and location info.
[0036] FIG. 8 is an example of a floor map with estimated user location represented with a circle.
[0037] FIG. 9 is a computer system architecture for implementing embodiments of the computer system.
[0038] FIG. 10 illustrates a representative generic implementation of embodiments of the computer system.

[0039] FIG. 11 is a flowchart showing methods for providing way-finding information.
[0040] FIG. 12 is a flowchart showing methods for determining a location based on at least one photograph of at least one identifier.
DETAILED DESCRIPTION
[0041] In various aspects, the disclosure provides a computer system or technology platform (may be referred to as "platform") that enables the way-finding solutions described in this disclosure. In embodiments, the platform delivers way-finding information to one or more mobile devices.
[0042] In embodiments, a new computer system and method for localizing and tracking the movement of a user along a path indoors, using any smart phone equipped with a camera, an accelerometer, a gyroscope, and/or a compass, are provided. Further embodiments provide a new way-finding solution that does not require access to an external system or to an Internet connection. This is important because in some indoor environments network access may not be available in certain areas, performance of networked solutions may in any case be slow, and such solutions may require users to pay for expensive network access, which may exclude certain users from using the solution.
[0043] In embodiments, a series of techniques is provided that enable localization and location tracking using only a mobile application and one or more features or resources already available on many mobile phones. These techniques are described below, and fall under two main categories: (A) motion tracking using mobile device functions only, and (B) further improving performance of localization and/or motion tracking by capturing images of one or more landmarks of a facility, and analyzing these images to support localization / motion tracking. Embodiments describe way-finding in the context of routing between a starting location and a target (or destination) location.
[0044] More specifically, in one aspect a novel and innovative inertial navigation solution is provided that is suitable for indoor environments, and that utilizes one or more novel and innovative localization and location tracking methods described herein.
[0045] In embodiments, way-finding solutions are provided in relation to a wide range of environments, both indoors and outdoors, and also in relation to a variety of different types of mobile devices with regard to available hardware, software or network resources.

[0046] The setup or use of any external hardware may not be required. Furthermore, manual sweeps of a facility may not be required. In embodiments, the solution is capable of tracking the user using only, for example, accelerometer and gyroscope signals. The accuracy and performance of the solution can be increased, for example, by integrating external radio signals (which would require sweeps).
[0047] A mobile device (such as a smart phone or tablet computer) in one implementation includes a client computer program that may be installed on the mobile device. The functionality of the client computer program described in this disclosure may also be implemented, in whole or in part, using one or more smartphone features.
[0048] The computer system may include a server application, implemented to one or more server computers. The computer system may also include an application repository or may be implemented using a cloud computing service. The computer system does not have to be hosted on an external server or repository; it can be integrated on the device itself making the solution self-contained and fully functional on the device, without any communication with an external server or repository. FIG. 9 illustrates one possible implementation of the computer system of the embodiments.
[0049] The computer system may include an electronic mapping solution that may be implemented so as to: (A) determine the location or approximate location of a mobile device; (B) receive one or more requests for way-finding information from the mobile device; (C) use the location or approximate location information to access one or more relevant maps or map information from a database; (D) use the maps or map information to generate way-finding information; (E) communicate the map information to the mobile device; and (F) optionally, at the client computer program, use the map information to generate way-finding content responsive to the request, which may consist of way-finding information or content in a variety of formats that may include map information, text information, way-finding instructions, route information and so on. The displayed way-finding information may also depend on one or more user preferences that the user may define by configuring settings using the client computer program.
[0050] A skilled reader will understand that any manner of way-finding content may be generated such as: a map showing the current location of a user, the orientation of the user, a desired location of the user, a suggested path between the user's current location and the desired (target/destination) location, destinations of possible interest between the current location and the desired location and so on. A target or destination can be provided by a user by execution of a target location determination routine, as enabled by the routines described herein.
[0051] In embodiments, the computer system may connect to various third party systems. For example the computer system may connect to or integrate with a facility management system to obtain up to date information regarding the facility for example. This information may be used for example to route users away from an area that may be busy or closed. In another example, the computer system may be implemented at a hospital and may connect to or integrate with a health information system that may contain for example appointment information for a user. Based on this appointment information the computer system may know the desired location of the user already. Once the user initiates the computer system, for example by opening or signing into the application, the computer system may initiate the determination of the user's current location automatically, and based on this may generate way-finding information for accessing the location of the appointment. The computer system may also connect for example to an enterprise resource management system (or other similar system) that may track for example the current location of an individual or asset that the user wants to find. The computer system may be used to generate way-finding information to find the individual for a meeting for example or to locate the asset. A skilled reader will understand that various other applications of the functionality described are possible.
[0052] A skilled reader will understand that the electronic mapping solutions, or aspects thereof, may be implemented to the mobile device, including by implementing part of the mapping functionality to the client computer program on the mobile device and optionally storing some or all of the mapping information to a data store on the mobile device. This may enable the computer system to provide way-finding information to the mobile device even in off-line mode, and further may be used to improve the performance of the way-finding solution or to utilize network resources efficiently.
[0053] The computer system may include one or more technologies or processes, described hereinafter in greater detail, for acquiring the maps or map information. For example, a floor plan may be used in a map generator to create the maps or map information. Floor plans are generally required due to building code regulations in case of emergency. The map requirement is common to all indoor localization solutions, as is the process of extracting the useful information from the blueprints.
[0054] In embodiments, novel and innovative mechanisms for determining the location or approximate location of the mobile device are provided. The initial localization of the user's position may be performed through several mechanisms provided by a starting location determination routine, including (a) taking one or more photographs (or panning) of an identifier or landmark in the user's surroundings (as further described below), (b) selecting an initial location (or starting point of way-finding) from a list of possible locations suggested to the user using a display associated with mobile device, (c) setting an initial location within a more general location, area, venue, or facility (referred to collectively as a "facility") by displaying a map of the more general location, area, or facility, by for example using an input means to select a location or sub-area on a map, for example by tapping on such location or sub-area or an associated icon on the map, (d) initiating a voice-activated search for an initial location, or (e) accessing one or more "smart search" processes for establishing an initial location, as described below.
[0055] There exists also the ability to automatically localize users without additional user interaction, using embodiments of a device location determination routine (other embodiments may comprise user interaction). One method is to track the location of the user's mobile device while the application is in the background, by first processing GPS information while the device is outdoors to identify the building and entrance the user entered, and then transitioning to a dead reckoning system, described below, to track the device indoors. When the user brings the application to the foreground, the user is already localized to their starting position. This hybrid method is efficient in its use of sensors since it relies only on GPS outdoors, where it is prevalent and provides reasonable precision, then transitions to dead reckoning indoors, where GPS precision suffers.
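As a rough sketch of this hybrid outdoor/indoor handoff, the following illustrates how GPS fixes might be matched against known entrances and the system switched to dead reckoning. The entrance coordinates, distance threshold, and class API are all invented for illustration, not part of the disclosed system:

```python
import math

# Hypothetical entrance database; coordinates are illustrative assumptions.
ENTRANCES = {
    "west wing": (43.6532, -79.3832),
    "main": (43.6535, -79.3825),
}

def nearest_entrance(gps_fix, threshold_m=15.0):
    """Return the entrance the user most plausibly walked through, or None."""
    lat, lon = gps_fix
    best, best_d = None, threshold_m
    for name, (elat, elon) in ENTRANCES.items():
        # Equirectangular approximation; adequate at building scale.
        dx = (lon - elon) * 111_320 * math.cos(math.radians(elat))
        dy = (lat - elat) * 110_540
        d = math.hypot(dx, dy)
        if d < best_d:
            best, best_d = name, d
    return best

class HybridLocalizer:
    """Runs on GPS outdoors; hands off to dead reckoning at a recognized entrance."""
    def __init__(self):
        self.mode = "gps"
        self.entrance = None

    def on_gps_fix(self, fix):
        if self.mode == "gps":
            hit = nearest_entrance(fix)
            if hit is not None:
                self.entrance = hit
                self.mode = "dead_reckoning"  # indoor tracking takes over
```

A fix far from any entrance leaves the localizer in GPS mode; a fix at a known entrance triggers the handoff, so that by the time the application is foregrounded the starting position is already set.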
[0056] The initial location functions as a starting point for a way-finding session, involving for example guiding a user from the starting point, being Location A, to one or more end points in the facility, being Location B.
[0057] As shown in Figure 11, methods are provided for way-finding. At step 1101, a starting location is determined. At step 1102, at least one target location is determined based on at least one of a destination, activity, resource or service of interest to the user. At step 1103, the location of the device is determined. At step 1104, way-finding information is determined based on the starting location, the at least one target location and/or the location of the device. At step 1105, display of the way-finding information on the device is initiated. In embodiments, a mobile application is provided; however, certain features described for the mobile application may also be implemented to a server application. These and other methods are described in greater detail herein.

Use of Landmarks
[0058] Physical locations generally have landmarks that are indicative of their location or approximate location. Various locations within a facility or other physical environment, such as a room, an office, a department entrance, a hallway, a corridor, stairs, an escalator and other physical features of a facility, may be associated with visually perceptible information, such as an identifier (an "identifier" being for example a number, letter, symbol or a combination thereof) or tag for any physical feature of a facility relevant to way-finding, such as a room, area, office suite, elevator, stairs, department name, and so on. FIGS. 1, 2, and 3 show examples of possible identifiers, as examples of photos of such identifiers captured by a mobile device. In embodiments, the computer system is configured to extract information from such photos in order to infer the location or approximate location of the user.
[0059] In embodiments, these identifiers are stored to the database of the computer system, and in one implementation are associated with specific locations in the map database, for example using the location coordinates associated with the map information. In addition, other properties and tags can also be associated with location entries in the database. For example, other properties may include color (certain departments have color codes), images (certain zones in the facility have logos or symbols that are used as identifiers), and environment (certain zones in the facility have unique designs on the walls or floors which can be used to distinguish them from other zones). All of these information elements may be added to the locations database.
[0060] In embodiments, the identifier may merely be a colour of a hallway that in a particular building, for example, is used to designate a particular area such as a department, a wing of a mall, a ward in a hospital, and so on.
[0061] In embodiments, a user may take at least one photograph, or perform a camera pan, of one or more identifiers, as shown in Figure 12 at step 1201. This may occur by accessing one or more related functions of the client computer program. For example, the client computer program may present a "PAN CAMERA AT A LANDMARK" screen or equivalent. The user may take pictures of successive identifiers. The client computer program may use resources and information available on the mobile device in an off-line mode. In either case, the computer system cross-references the captured information against the contents of the map information database so as to generate way-finding information as previously described.

[0062] In this way, a user may take pictures, for example, of any sign containing location information, such as a door number, a directional sign (a sign which points in the direction of various destinations), elevator or staircase signs, logos, or unique images, on a smartphone.
[0063] As shown in Figure 12 at step 1202, the computer system may utilize any mechanism for extracting relevant information from the image, including but not limited to: (A) OCR (text filtering), to extract text from the images that may be relevant to determining the location of the mobile device, such as in the case of a picture of a room number; (B) color filtering; (C) symbol or logo identification; and so on. This extraction may occur on the mobile device using one or more client computer program features for example, or on the server computer using the server application, or as part of a cloud service.
[0064] If the locations database or map information database, and the image detection and recognition processes are performed on the mobile phone, the system can potentially work while offline (i.e. no active data connection to a network).
[0065] The computer system may utilize one or more solutions (including the identifier based location method described), for example to narrow down facility locations. For example, the computer system may use GPS sensor data or Wi-Fi IP address information to improve the accuracy of location determination. The use of additional localization methods may be particularly useful when a user is taking their first picture at a given physical location.
[0066] In embodiments, what follows is a possible implementation of the computer system and computer implemented method. As shown in Figure 12, first, at step 1202, information regarding the identifier is extracted from the photograph obtained in step 1201. At step 1203, the information is compared to a map database to determine a match between the information and a location in the map database. Finally, at step 1204, when a match is determined, the location in the map database is returned to provide a device location or starting location.
[0067] If the photograph (also referred to herein as "image" or "picture") contains text, all text is extracted/filtered using OCR techniques. Second, the text may be analyzed to interpret in part the type of information represented by the text. The computer system may differentiate, for example, between text representing a room number and text representing a direction. The computer system may implement one or more logic rules for interpreting the information extracted from the photos, and also for inferring an accurate location from one or more extracted information elements. For example, if the filtered text contains a room number (as shown in Fig. 1) or some other unique code found in the map information database, an exact position coordinate may be returned by the computer system. If, for example, the image contains text which does not correspond to a unique identifier, such as a directional sign (as shown in Fig. 2) pointing to various destinations, then further logical steps must be taken. Directional signs contain names of various sites, and their text alone cannot reveal an exact location of the sign. Instead, an area or region boundary could be returned as the approximate location of the user. If the sign mentions "food court" and "atrium" (each having an entry in the locations database), there is a high probability that the user is occupying a region between or around those locations. Also, by recognizing the directions of the arrows on the signs pointing to the destinations, the application can narrow the area estimate considerably. From this example it will be appreciated that filtered text alone can narrow down the user's location to an exact position coordinate or approximate area coordinate, based on the nature of the text.
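These logic rules might be sketched as follows. The regular expression, database entries, and coordinates are invented for illustration; a real implementation would query the venue's map information database:

```python
import re

# Illustrative locations database; entries and coordinates are invented.
LOCATIONS = {
    "1-118": {"name": "Meeting Room 5", "x": 112, "y": 145},
    "food court": {"name": "Food Court", "x": 40, "y": 60},
    "atrium": {"name": "Atrium", "x": 80, "y": 20},
}

# Assumed room-number pattern (e.g. "1-118"); real venues vary.
ROOM_NUMBER = re.compile(r"\b\d-\d{3}\b")

def locate_from_text(filtered_text):
    """Map OCR-filtered sign text to an exact coordinate or an approximate region."""
    text = filtered_text.lower()
    # Rule 1: a unique room number yields an exact position coordinate.
    m = ROOM_NUMBER.search(text)
    if m and m.group() in LOCATIONS:
        loc = LOCATIONS[m.group()]
        return ("exact", (loc["x"], loc["y"]))
    # Rule 2: a directional sign naming several destinations yields the
    # bounding region between those destinations.
    hits = [v for k, v in LOCATIONS.items() if k in text]
    if len(hits) >= 2:
        xs = [h["x"] for h in hits]
        ys = [h["y"] for h in hits]
        return ("region", (min(xs), min(ys), max(xs), max(ys)))
    return None  # no match: ask the user for another picture
```

A room-number sign returns an exact coordinate, while a directional sign mentioning "food court" and "atrium" returns only the bounding region between them.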
[0068] In another aspect, the computer system may also extract from the pictures additional context, which is also used for the purposes of localization and generation of way-finding information. For example, the computer system may extract information from a direction sign and may also recognize that the direction sign is a direction sign rather than an office number sign, and provide this information for localization purposes in order to improve localization accuracy.
[0069] The computer system may be integrated with various standard third party mapping utilities. All mapping utilities will have a locations database which closely resembles this format:
Name | Type | Room# | Location
Meeting Room 5 | Room entrance | 1-118 | x=112, y=145, z=1st floor
The computer system may use a key word search based on the recognized text in the map information database. This may yield one or more results. The computer system may then apply one or more logical rules to the search results in order to deduce the user's position or approximate position. Using a text-only solution also has the advantages of requiring little computation/processing to go from the initial image to the user position, and can easily be performed offline. This approach also does not require an extensive database of location-tagged venue images against which to match pictures captured by the computer system. In this respect, in one embodiment the text extraction is performed on the mobile device using one or more features of the client computer program.

[0070] In one aspect, the map information database may be implemented as a customized map information or locations database that contains extra tags and identifiers consisting, for example, of colors, symbols, logos, and shapes. For example, different wings of a facility may use signage having different color themes. The color captured from the color filter can be searched for in the database to reveal which wing the user is in. In this manner, multiple properties may be detected on one image to give a better estimate of the user's location.
[0071] In some cases, the filtered text, color, or symbol extracted from the image may not be found in the map information database, or may be found in too many records or entries of the map information database. If too many records are found, the next step would be to use the most recent GPS data and/or, if possible, the most recent radio based position estimate to narrow down the possible user locations within an "active area" defined in the next sections. For example, if it is known through GPS that the user has just entered the building through the west wing entrance, and the image taken by the user has captured an elevator image or related text, the user position will be narrowed down to the west-wing elevators rather than any other elevator locations. If the last known GPS or radio based position estimate is too old, or the information extracted from the image does not result in a location estimate, the client computer program may notify the user of the mobile device and request additional information. For example, the client computer program may display a message on the mobile device display requesting that the user take another picture, and this picture is processed as described above in order to narrow down the possible positions of the user. If all the above filters are inadequate in finding a specific location, the user may be presented with a shortlist of the most likely marker locations to pick from. Wherever possible, associated screenshots (general wide field view) of these locations will be displayed to better allow the user to make a selection.
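This narrowing step can be sketched as keeping only candidates inside an area of high probability around the last known position (a circle whose radius grows with elapsed time). The walking speed, coordinates, and candidate entries below are all assumed values for illustration:

```python
import math

# Assumed average indoor walking speed; an illustrative constant only.
ASSUMED_WALKING_SPEED_M_S = 1.4

def active_area_radius(elapsed_s):
    """Radius (m) of a circle the user could plausibly have reached since the last fix."""
    return ASSUMED_WALKING_SPEED_M_S * elapsed_s

def narrow_candidates(last_pos, elapsed_s, candidates):
    """Keep only candidate locations inside the circle around the last known position."""
    cx, cy = last_pos
    r = active_area_radius(elapsed_s)
    return [
        (name, (x, y))
        for name, (x, y) in candidates
        if math.hypot(x - cx, y - cy) <= r
    ]

# Two elevator banks match the photographed sign; only one is plausible
# 30 seconds after a fix near the west entrance at (0, 0):
elevators = [("west-wing elevators", (10, 10)), ("east-wing elevators", (10, 400))]
```

Here `narrow_candidates((0, 0), 30, elevators)` keeps only the west-wing entry, mirroring the elevator example above.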
[0072] In most cases, applications that rely on a phone compass or magnetic field and gravity sensor to infer the phone's absolute orientation do not work well indoors. This is due to the numerous sources of magnetic interference that may be presented by construction materials or electronic equipment housed and running in the facility. In embodiments, an economical and efficient alternative where a way-finding solution accessible from mobile devices of users might not otherwise be available is provided.
[0073] In embodiments, the computer system can deduce the orientation of the user from the pictures taken and/or the filtered text. For example, a user submits an image of a unique door number. The map information database may also include properties such as the orientation of entrances. This can be deduced, for example, from the blueprint when populating map information for a facility into the database. For example, the "Meeting Room 5" entrance may be known to be facing west; from this, the mobile device must have been oriented east to take the picture. Moreover, from the orientation of the text on the image (whether it is in portrait or landscape), the exact orientation of the device can be locked. Even from directional signs, cross-referencing the destinations as well as the relative directions to the destinations can give a good estimate of the orientation of the user's mobile phone.
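The orientation deduction reduces to a simple relation: the camera must point at the sign, i.e. opposite to the entrance's facing direction. The entrance-facing value below is an assumed database property, expressed as degrees clockwise from north:

```python
# Assumed database of entrance facings (degrees clockwise from north).
ENTRANCE_FACING_DEG = {
    "Meeting Room 5": 270,  # entrance known to face west
}

def device_heading_from_sign(room_name):
    """The camera points at the sign, i.e. opposite to the entrance's facing."""
    return (ENTRANCE_FACING_DEG[room_name] + 180) % 360
```

Photographing the "Meeting Room 5" sign thus implies a device heading of 90° (east), consistent with the example above.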
[0074] In some cases, the filtered text, color, or symbol extracted from the image may not be in any entries of the locations database, or may be in too many entries. Either way, an appropriate position estimate cannot be given. In this case the user is asked to take another picture, for which the processed information will be appended to the information from previous pictures to narrow down the possible positions of the user. If all the above filters are inadequate in finding a specific location, the user is presented with a shortlist of most likely marker locations for them to pick from. Wherever possible, associated screenshots (general wide field view) of these locations will be displayed to better allow the user to make a selection. In other words the location determination solution may be iterative. Various modifications to the workflow are possible.
[0075] In another aspect, the computer system may log location history for each user during each session, for example by storing data regarding location history to the user's profile using the profile manager. This way, by keeping track of the time elapsed since localization, a virtual circle can be created (the "active area") and centered on the user's previous location, with radius proportional to the elapsed time. This circle represents an area of high probability where the user is located. Based on this information, the mobile device memory can be intelligently populated with location entries and route information from the cloud or server computer (or other information that the computer system determines may be relevant or of interest to the user) via a data connection, so that if the user goes back offline, they will have some entries around their location to cross-reference. Also, by keeping track of past position estimates, when the user takes pictures of signs that are not unique, further filtering can be conducted so as to narrow their exact position. For example, if the user takes a picture of a washroom sign (as shown in Fig. 3), and from image recognition (done locally on the phone or on the server computer side) the computer system knows it is a washroom, it can cross-reference any washrooms in the map information database with location coordinates inside the active area. This can narrow down the possible options significantly.

Smart Search
[0076] Another aspect of embodiments involves "smart search". Smart search allows the user to use keywords to identify specific locations within the venue. Each location within the venue may be associated in a database with a list of words that describe, or are mapped to, that location. For instance, the radiology clinic within a hospital may be associated with the following keywords: "ultrasound", "x-ray", "CT", "CT scan", "radiation", and "radiologist". The keywords may represent the activities and/or the personnel associated with the location. Keywords may also be sentences, synonyms of the location or abbreviation of the location. For example, the following are all possible keywords for an emergency department within a hospital: "broke my leg", "emerg", "ER", "urgent care", "ambulance drop-off", and "bleeding". Various other keywords are possible. Each of these keywords may be stored to a data store, where a data store may be associated for example with a particular facility or part of a facility.
[0077] The computer system may incorporate various search functions, including for example semantic search tools that allow a user to provide a semantically related term, which is then analyzed and associated with a keyword in a defined list of keywords associated with a particular location.
[0078] It is to be understood that when the user searches for a location, either through a drop-down or searchable menu, or through voice activated features, the smart search will be conducted not just over all the locations, but also over all the keywords. If a keyword satisfies the search criteria, then the location associated with the keyword is returned.
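The keyword-to-location lookup can be sketched as follows. The keyword table is illustrative (drawn from the radiology and emergency-department examples above); a deployed system would load it from the venue's locations data store:

```python
# Illustrative keyword table mapping venue locations to natural-language tags.
KEYWORDS = {
    "radiology clinic": [
        "ultrasound", "x-ray", "ct", "ct scan", "radiation", "radiologist",
    ],
    "emergency department": [
        "broke my leg", "emerg", "er", "urgent care", "ambulance drop-off", "bleeding",
    ],
}

def smart_search(query):
    """Return every location whose name or associated keywords match the query."""
    q = query.strip().lower()
    return [
        location
        for location, words in KEYWORDS.items()
        if q in location or any(q in w for w in words)
    ]
```

Searching "x-ray" returns the radiology clinic, and a natural-language query like "broke my leg" resolves to the emergency department, as described above.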
[0079] Smart search may be combined with other methods, including the features discussed herein, to improve accuracy of localization of the user.
[0080] The data stores may be implemented to a server computer, and also may be included in a data store linked to the mobile application to enable operation in an off-line mode. The mobile application may allow the mobile device to switch between on-line and off-line modes. For example, the mobile application may use geo-location to sense that the mobile device is approaching a facility associated with the computer system, and may download one or more data sets or application updates so as to support the way-finding features of the system, once the mobile device is off-line, for example inside the facility where network services may not be available, or the user may be required to shut down the sending/receiving functions of his/her mobile device. During the on-line mode, for example, the mobile application may acquire updates to a data store associated with the facility such as for example up to date maps for a facility, up to date key word data stores, and so on.
Location Tracking Features
Dead Reckoning
[0081] In embodiments, dead reckoning or disambiguation processes are used for the first time in connection with way-finding. In one aspect, after the primary localization is performed and the user's position is known, dead reckoning can be used to keep track of the user's position. Dead reckoning is used to determine a change in the user's position, which can be used to compute the user's current position relative to an initial starting position. Dead reckoning can be applied for example using output from the mobile device's inertial sensors such as data from an accelerometer or a gyroscope, as well as non-inertial sensors and devices such as magnetic field sensors and the camera.
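A minimal dead-reckoning sketch: each detected step advances the position estimate by an assumed stride length along the current heading. The stride length and headings here are illustrative assumptions, not parameters of the disclosed system:

```python
import math

# Assumed average stride length; an illustrative constant only.
ASSUMED_STRIDE_M = 0.7

def advance(position, heading_deg, steps=1, stride=ASSUMED_STRIDE_M):
    """Advance an (x, y) position; heading is degrees clockwise from +y (north)."""
    x, y = position
    rad = math.radians(heading_deg)
    return (x + steps * stride * math.sin(rad),
            y + steps * stride * math.cos(rad))

# Ten steps north, then ten steps east, from a known starting position:
pos = advance((0.0, 0.0), 0, steps=10)   # approximately (0, 7)
pos = advance(pos, 90, steps=10)         # approximately (7, 7)
```

In practice the step count would come from the step detection described below, and the heading from the gyroscope-integrated orientation estimate.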
Step Detection
[0082] Step detection is used to identify whether the user is taking any steps, which corresponds to movement. Step detection is performed by analysis of real time data from inertial sensors such as, for example, an accelerometer, a gravity sensor, a linear acceleration sensor, a magnetometer, and a gyroscope. All data is given in tri-axial representation with respect to the phone. The accelerometer measures the total phone acceleration, for example in m/s2; gravity data gives the vector direction of normalized gravity with respect to the phone axes; linear acceleration gives the extracted component of acceleration caused by user movement, for example in m/s2; the magnetometer measures the ambient magnetic field, for example in μT; and the gyroscope measures the angular velocity of the computer system, for example in rad/s.
[0083] Normal walking will correspond to a sinusoidal and/or periodic signal from the inertial sensors, due to the constant swinging and swaying of the body. The number of periods in the signal corresponds to the number of steps taken by the user.
[0084] To minimize the number of false positives and false negatives in the step detection algorithm, the signals can be filtered using FIR filters with standard windowing techniques; the windowed filtered and unfiltered signals are then passed through feature extraction layers to identify key attributes of the signal versus time. Examples of features that can be extracted from filtered and unfiltered signals are variance, energy (sum of squares), magnitude maxima or minima positions, mean, moving mean, peak-to-peak amplitude, first and second derivatives, and first and second integrals for all sensor signals. The values and characteristics of the features can be used to characterize and describe a step profile for ideal walking. The step profile may consist of a combination of the above-mentioned features and ranges for a subset of the specific inertial sensors mentioned above, which represent conditions of true steps. The values of the features at any time can be compared against this step profile to determine whether the user is walking or standing still. The step profile can also be used to distinguish between an actual step and noise (either user generated or due to sensor inaccuracies). These profiles are not limited to steps, but can also be used to identify when the user is on an elevator, stairs, or escalator. This is useful to detect floor switching within a facility.
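By way of non-limiting illustration, the comparison of windowed signal features against a step profile may be sketched as follows. The feature ranges in the profile are hypothetical values, not calibrated thresholds.

```python
import statistics

def window_features(signal):
    """Extract a few of the features mentioned above from one window
    of sensor samples."""
    return {
        "mean": statistics.fmean(signal),
        "variance": statistics.pvariance(signal),
        "peak_to_peak": max(signal) - min(signal),
    }

# Hypothetical step profile: the ranges the features of a true step
# should fall within for one sensor and one phone orientation.
STEP_PROFILE = {"variance": (0.5, 10.0), "peak_to_peak": (1.5, 8.0)}

def is_step(window):
    """Compare the window's features against the step profile."""
    f = window_features(window)
    return all(lo <= f[k] <= hi for k, (lo, hi) in STEP_PROFILE.items())
```

A sinusoidal window (walking) falls inside the profile ranges, while a near-constant window (standing still) falls below them and is rejected as noise.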
[0085] One of the main problems in using these features and associated ranges for generating and detecting profiles is the effect of noise sources. There are many noise components which corrupt the sensor measurements and data: one is white/temporal noise, which exists even at high frequencies; the other is DC noise, which produces an artificial bias or base-line. The temporal noise is made up of random fluctuations averaging zero and lasting a short time on the order of milliseconds. This type of noise can be eliminated effectively by applying a low pass filter to the sensor data. The DC noise is a slow moving offset caused by poor sensor calibration. For example, when the phone is lying still on a table, the accelerometer and gyroscope sensor data should ideally read zero, but due to the bias noise it may read some non-zero value. In addition, this non-zero value may change slowly over time, for example on the order of seconds, varying with changes in the orientation of the phone with respect to gravity, and may differ from device to device.
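The low pass filtering of temporal noise mentioned above may be sketched, by way of non-limiting illustration, as a simple moving-average FIR filter; the window width is an illustrative choice.

```python
def moving_average(signal, width=5):
    """Simple FIR low pass filter: each output sample is the mean of
    the last `width` input samples, suppressing short random
    fluctuations that average to zero."""
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - width + 1): i + 1]
        out.append(sum(window) / len(window))
    return out
```

Applied to a constant signal corrupted by alternating temporal noise, the filter recovers the underlying constant (the DC bias, by contrast, passes straight through and must be handled separately as described below).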
[0086] Without applying a robust method to filter out DC noise, analysis and position estimation using sensor data may be non-optimal. DC elimination firstly involves the ability to separate/identify the bias component from the true data with minimal ambiguity. The cases when the DC noise can theoretically be identified are: when the true data is known to be zero, in which case the remaining signal is the bias component; or when the true data is known a priori, in which case the true signal can simply be subtracted from the total to identify the bias, as done in calibration. In general, when the true signal is known to have a mean value of zero, the bias can be calculated from the mean of the signal. In practice, when the mobile device is lying still (State A), or the user is walking with the phone without sudden jerks or turns (State B), the zero mean condition can be assumed. There are many ways to identify these two states, some of which are described in greater detail in the following paragraphs.
Variance Based Triggers
[0087] One of the extracted features mentioned above is the signal variance, which is independent of the signal mean or bias. The variance measures the distribution of the signal with respect to the mean, but is not affected by it. When the phone is lying still, State A, or the user is walking without sudden jerks or turns, State B, the variance stays quite low. During moments of turns or jerks in user movement the variance spikes up to higher values. Appropriate trigger values can be determined such that when the variance is lower than a certain trigger value, the program can assume with reasonable confidence that the user is in State A or B. During this time the program will update the known value for the DC noise by calculating the signal mean, running mean, moving average or similar value.
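By way of non-limiting illustration, the variance based trigger may be sketched as follows; the trigger value shown is hypothetical and would in practice be determined empirically per sensor and device.

```python
import statistics

VARIANCE_TRIGGER = 0.05  # hypothetical trigger value

def update_bias(window, current_bias):
    """If the window variance is below the trigger, the user is assumed
    to be in State A or B (zero-mean true signal), so the DC bias is
    re-estimated as the window mean; otherwise the old bias is kept."""
    if statistics.pvariance(window) < VARIANCE_TRIGGER:
        return statistics.fmean(window)
    return current_bias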
Signal Curvature
[0088] Another approach is based on the curvature of the signal, directly proportional to the second derivative. This feature is also not correlated to the mean value, and is a good indicator of the two states of interest. Similar to the variance the second derivative of the signal is also lower when the user device is in State A and B. Appropriate trigger values can be determined such that when the second derivative is lower than a certain trigger value, the program can assume with reasonable confidence that the user is in State A or B. During this time the program will update the known value for the DC noise by calculating the signal mean, running mean, moving average or similar value.
Signal Peak-to-Peak Value
[0089] Another approach is to identify either state from the signal peak-to-peak value. Again, the peak-to-peak value is lower when the user device is in either State A or B. Appropriate trigger values can be determined such that when the peak-to-peak value is lower than a certain trigger value, the program can assume with reasonable confidence that the user is in State A or B. During this time the program will update the known value for the DC noise by calculating the signal mean, running mean, moving average or similar value.
[0090] These methods can be used in conjunction to identify States A and B with increased certainty. As the DC bias is continuously tracked, it is directly subtracted from the sensor data to arrive at a value closer to the true signal value. This technique can be used for all accelerometer and gyroscope derived sensor data, for accurate profile generation and detection that can work across many devices.
[0091] Once a profile is generated for walking, standing, turning, stairs, elevators, escalators, or other movement, the corresponding feature ranges and trigger values may be stored using a profile manager on the mobile device for use by the application.
[0092] All profiles may be generated for common phone orientations, and stored using the profile manager to a suitable database.
[0093] The phone orientation can be classified into one of the three categories: the phone is held such that the z-axis (Fig. 4) is facing up (phone parallel to ground, Fig. 5), the y-axis (Fig. 4) is facing up (phone perpendicular to ground), or the x-axis (Fig. 4) is facing up or down (phone held at side). Each orientation of the phone corresponds to a specific step profile, which the step-detection algorithm uses to identify steps. The combined knowledge of the orientation of the phone and the features is used to determine whether a step is taken or not.
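By way of non-limiting illustration, the three-way orientation classification described above may be sketched by finding the dominant axis of the gravity vector; the example gravity readings are hypothetical.

```python
def classify_orientation(gravity):
    """Classify the phone into one of the three orientation categories
    (axes as in Fig. 4) by the axis on which the gravity vector is
    largest in magnitude."""
    axis = max(range(3), key=lambda i: abs(gravity[i]))
    return ("x-up-or-down", "y-up", "z-up")[axis]
```

A phone lying parallel to the ground reports gravity almost entirely on the z-axis and is classified "z-up"; the classification then selects which step profile the step-detection algorithm uses.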
[0094] Usually step detection can only be performed effectively by attaching sensors on or near the feet. Using the step profile, the step detection method is capable of functioning with the phone held in any orientation at or near any bodily region. That is, the user can have the phone in their hands (around the hip area) in front of them, on their side, or even tucked away in their pockets.
Optical Flow
[0095] Another technique which can be used in combination with the step detection method previously described, as a mechanism to enhance its performance, is an optical flow method. The optical flow method uses visual information obtained using a camera of a mobile device for example. In this aspect, camera pixel information is analyzed for example using the Lucas Kanade algorithm (or another suitable process) to find the movement of certain key features in a picture window. By comparing the positions of the key features from frame to frame it is possible to infer the movement of the user. In operation, if the user holds the phone with the z-axis pointed up, the camera is most likely to face the floor. In this orientation the optical flow algorithm picks up lines or spots on the floor as key features (techniques similar to edge detection, SURF, or FAST feature extraction are used; Fig. 6). Thereafter, by analyzing the history and movement of the key features from frame to frame, it is possible to infer information regarding the movement(s) of the user. This allows the use of visual cues from the user's environment to improve the performance of localization and tracking of the user's movement for the purposes of providing way-finding solutions that perform well.
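By way of non-limiting illustration, the frame-to-frame inference step may be sketched as follows. In practice the point correspondences would be produced by a Lucas Kanade tracker over extracted key features; here they are supplied directly as hypothetical pixel coordinates.

```python
def infer_motion(prev_points, next_points):
    """Infer apparent user motion as the average displacement of
    tracked key features between two consecutive camera frames."""
    dxs = [n[0] - p[0] for p, n in zip(prev_points, next_points)]
    dys = [n[1] - p[1] for p, n in zip(prev_points, next_points)]
    return (sum(dxs) / len(dxs), sum(dys) / len(dys))
```

A consistent average displacement across features indicates camera (and hence user) movement, which can corroborate or refine the inertial step detection.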
Phone Orientation - Heading
[0096] In embodiments, further details are provided on determining more precise mobile device orientation, for example for use in the described dead reckoning operations.
[0097] In one aspect, orientation of the phone can be calculated using the combination of output from an accelerometer (which yields the gravity vector), a magnetometer (which yields the direction of the magnetic field) and a gyroscope (which yields the change in orientation of the device). Orientation is used to infer the direction of the user's movement. Orientation of the device is identified as the yaw angle of the device and is independent of the roll and the pitch angles.
[0098] Gyroscopes do not provide an absolute orientation, but only the relative orientation change (hereby referred to as a relative orientation estimate). For example, one can detect a 90 degree turn to the right, but not the direction that the phone is facing (North, East, West, or South) at the end of the turn. So, to get the absolute orientation one can use the magnetometer (compass) and accelerometer (gravity). Unfortunately due to the poor quality of these sensors on mobile devices, each sensor has its own uncertainties and is prone to corruption from noisy data. To overcome the errors from the individual sensors a common approach taken is the use of a Kalman filter, which aggregates data of various sensors (for example gyroscope, magnetic sensor, and accelerometer) to come up with a better orientation estimate. Using a Kalman filter, uncertainty profiles are assigned for each sensor which allows the program to determine which sensor should be trusted and given priority.
[0099] One issue with the use of Kalman filters is that they rely on the magnetic fields (measured by the magnetometer) and assume that the magnetic fields point constantly in a northerly direction. However, in indoor locations, this is not always the case. Building materials as well as electronic devices can cause interference, making the magnetic field vacillate across a range of directions. Large continuous regions of a facility can have an intrinsic magnetic field which deviates from North. To overcome this limitation, one method relies heavily on other sensors (gyroscope, accelerometer, camera) to track the orientation change of the device. Limiting the use of the magnetic field means that the starting absolute orientation of the device is not known when navigation starts.
[00100] To derive the initial orientation the user can be asked to take a photo of a nearby sign or landmark (note that this same image can be used to perform localization as described above); then by using an automated algorithm to identify the angle of the device when photo was taken (mentioned above [0044]) as well as the data from accelerometer, the initial orientation of the phone can be extracted. The data from the gyroscope can then be used to track the change in orientation of the device as compared with the starting orientation.
[00101] In embodiments, bias in the inertial data may be reduced by aggregating the inertial data received from each of two or more sensors and calibrating the inertial data received from one sensor against the data received from another sensor.
[00102] As previously mentioned, gyroscopes in portable devices are generally not of a high quality, and have their own intrinsic errors. As a result they drift from the true orientation over time. Gyroscopes relay angular velocity information of the device, and are prone to the bias (constant) additive noise described earlier. The bias in angular velocity causes drift over time in the relative angular direction of the integrated gyroscope signal. This can make the device appear as if it is rotating (as interpreted from the gyroscope signal), even when it is lying still. To make matters worse, the bias noise also varies with time and is dependent on the orientation of the phone, making it difficult to compensate for. To overcome the bias, i.e. to separate the bias noise from the relevant signal data, in one aspect a recursive algorithm is used, which can be a modification of the non-recursive Whittaker method. This is a non-linear least squares method using the average of the signal as well as its curvature, based on the triggering methods described above, to identify States A and B and track the bias noise. This non-linear algorithm uses threshold triggers based on gyro variance and average to dynamically identify and track the bias of the gyro signal. Once the bias is identified, it is subtracted from the total signal to recover the uncorrupted signal.
[00103] The removal of the gyro bias allows for longer orientation tracking than previously possible by mitigating the accumulation of gyro error.
[00104] To further address the issue of angular drift, in one aspect, an accelerometer is used to fix the orientation of the device about the roll, pitch angles (rotation about x and y axis as shown in Fig 4). The accelerometer data can be passed through a low-pass filter which is used to track the gravity vector in the phone coordinates system (Fig. 4). This allows for an independent estimate of the roll and pitch angles (locking in 2 degrees of freedom of the total orientation which needs 3) and correction for any errors accumulated from the roll and pitch angles determined from the gyroscope readings. This gyroscope-accelerometer method provides one example mechanism for estimating orientation change.
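By way of non-limiting illustration, the roll/pitch estimate from the low-pass-filtered gravity vector may be sketched as follows; the axis conventions follow Fig. 4 and the specific formulas are one standard choice of parameterization, not the only one.

```python
import math

def roll_pitch_from_gravity(gx, gy, gz):
    """Estimate roll and pitch (radians) from the low-pass-filtered
    accelerometer output, i.e. the gravity vector expressed in the
    phone coordinate system. This locks 2 of the 3 orientation
    degrees of freedom independently of the gyroscope."""
    roll = math.atan2(gy, gz)
    pitch = math.atan2(-gx, math.hypot(gy, gz))
    return roll, pitch
```

A phone flat on a table (gravity entirely on z) yields zero roll and pitch; these independent angles can then correct accumulated gyroscope error about the x and y axes.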
[00105] Although it is established that magnetic field sensors do not permit tracking of absolute orientation indoors, in embodiments, the computer system utilizes the combined magnetic field (providing yaw info) and accelerometer vectors (providing roll and pitch info) to track relative orientation change. In regions where interference pulls the magnetic field vector away from the northerly direction, the magnetic field can still be constant over that local area. Therefore, it can be used as a local reference direction used in orientation estimation. In this way, the computer system provides another independent relative orientation estimate coming from magnetometer-accelerometer, making use of external fields.
[00106] Magnetometer-accelerometer relative orientation estimation can be used when the magnetic field is locally constant, and discarded when the magnetic field is in transition to a new local reference direction. In embodiments, the system and method identifies when the magnetic field is in transition and thereby cannot be used. For each time interval, while the computer system is running, the change in roll, pitch, and yaw can be calculated each from the gyroscope-accelerometer, and magnetometer-accelerometer estimates, and are compared. When the difference between the two is greater than a threshold determined heuristically, it can be attributed to a transition of the local magnetic field rather than turning of the mobile device.
[00107] In embodiments relative orientation through aggregation of the two independent relative orientation estimates is tracked, namely the gyroscope-accelerometer estimate and the magnetometer-accelerometer estimate. In one aspect, a dynamic Kalman filter combines the two estimates, which continuously changes the uncertainty weights based on the identification of a constant or transient magnetic field. When the magnetic field is identified as being in transition, higher uncertainty is assigned to the magnetometer-accelerometer estimate, thereby trusting gyroscope-accelerometer estimate more. When the magnetic field is identified as stable, the magnetometer-accelerometer estimate is used in addition to come up with final relative orientation estimate. In this manner, the dynamic Kalman filter is able to obtain highly accurate orientation information using low quality sensors.
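By way of non-limiting illustration, the dynamic re-weighting of the two relative orientation estimates may be sketched as a single variance-weighted fusion step; the inflation factor applied during magnetic field transition is a hypothetical value standing in for the dynamically assigned Kalman uncertainty.

```python
def fuse_estimates(gyro_est, gyro_var, mag_est, mag_var, field_in_transition):
    """One-step fusion of the gyroscope-accelerometer estimate and the
    magnetometer-accelerometer estimate of relative orientation change.
    When the local magnetic field is identified as in transition, the
    magnetometer uncertainty is inflated so the gyro estimate dominates."""
    if field_in_transition:
        mag_var *= 100.0  # hypothetical uncertainty inflation
    w = mag_var / (gyro_var + mag_var)  # weight given to the gyro estimate
    return w * gyro_est + (1 - w) * mag_est
```

With equal variances and a stable field the result is the midpoint of the two estimates; during a field transition the output stays close to the gyroscope-accelerometer estimate, as described above.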
Removing Error
[00108] Any system or method that relies on dead reckoning alone will suffer from build-up of error, due to inaccuracies within the system and noise from the data. Over time the error can build up at an exponential rate, which may make information inferred from dead reckoning obsolete over relatively long times and distances. It is advantageous therefore to minimize such errors or even eliminate them in order to continue using dead reckoning over long times and distances. This can be accomplished using a variety of techniques, for example by introducing corrections based on the context of the map in the form of a heuristic probability table.
Bayesian Based Heuristic History Probability Table
[00109] In one aspect of the system and method, map details can be used in correcting for position/orientation estimation error. In one aspect, the map itself imposes certain restrictions on the position of the user (and thus the device). For instance, the user can only be in valid public hallways and the user can only walk in the directions parallel to hallways (they cannot walk through walls). The system can use this knowledge to improve the estimation of the orientation of the mobile device as well as the estimation of the position of the mobile device.
[00110] For instance, if the orientation of the mobile device does not change significantly for about 5 seconds, and steps are being detected while the user is in a hallway, it can be detected that the user is walking in a direction parallel to that hallway. As such, it is possible to snap the orientation of the mobile device to the closest direction parallel to that hallway, calibrating for orientation error.
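By way of non-limiting illustration, the orientation-snapping correction may be sketched as follows, with headings expressed in degrees; the two candidate directions are the hallway direction and its opposite.

```python
def snap_heading(heading_deg, hallway_deg):
    """Snap the estimated device heading to the closest direction
    parallel to the hallway the user is walking along, calibrating
    away accumulated orientation error."""
    candidates = [hallway_deg % 360, (hallway_deg + 180) % 360]

    def ang_dist(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)

    return min(candidates, key=lambda c: ang_dist(heading_deg, c))
```

A drifted heading of 85 degrees in a hallway running at 90 degrees snaps to 90; a heading of 265 degrees in the same hallway snaps to the opposite direction, 270.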
[00111] The position of the user can be improved when the user takes turns. For instance, once identified that the user has taken a turn (for example, a right turn), and from the map it is evident that there is a corridor extending to the right near the current estimated location of the user, the system waits to see if the user takes additional steps in this new orientation. Once steps are detected in the new orientation, the system determines that the user is on the new corridor, and his/her position is updated accordingly.
[00112] This aspect of the system, involving the dynamic update of the user's current position as part of tracking movement of the user, may be implemented using a Bayesian probability based approach. Every time the user passes or approaches a corridor, there is a slight probability that he/she may have entered that corridor. The distance to that corridor from the user's current position, as well as the angle between the mobile device's current orientation and the closest direction parallel to that corridor, can be used as qualifying factors to determine if the corridor is added to the heuristic history probability table. As per these criteria, all the corridors are considered for inclusion into the heuristic history probability table. The heuristic history probability table keeps track of probabilities associated with the user being in any corridor. If a new corridor is being added to the probability table, then the position of the user in that corridor is set as the intersection between the corridor in question and the user's current corridor. If a corridor being considered for inclusion is already in the probability table, then the possible user position in that corridor is updated with respect to the user's current position on that corridor.
[00113] If use of dead reckoning by the system determines that the user is going beyond the ranges of the corridor and outside the allowed map areas, then the user position for that corridor is not updated and the probability associated with that corridor is reduced significantly. In effect, the user's position is restricted to a valid location on the map (that is, the user has to be on one of the defined corridor ranges of the map). The probability table can be updated every time a step is detected and the location with the highest probability is considered as the most accurate estimate of the user's current position, and is the position displayed on the map.
[00114] Use of a heuristic algorithm is especially useful in identifying and correcting the user's current position in venues that involve multiple turns at close proximity and asymmetric geometry. For example, consider that the user is walking along a corridor and takes a right turn. The floor plan is such that there are two possible corridors where the user could have turned: corridor A or corridor B (Fig. 8). The user might actually turn right on corridor B; however, the heuristic algorithm incorrectly presumes that the user turned right on corridor A. At this time, the heuristic probability table contains both corridors as possible positions, but A has a higher probability due to the qualifying factors mentioned above. With initial steps taken after the right turn, the algorithm estimates that the user is proceeding on corridor A. As the user continues to walk, the algorithm will eventually estimate that the user has reached the end of corridor A, and upon further steps the user's position goes beyond the range of corridor A. At this moment, the algorithm will place penalties on corridor A, reducing its overall probability in the heuristic table, eventually lowering it below that of corridor B, which has the next highest probability in the table. The user will then be repositioned at an appropriate location on corridor B.
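By way of non-limiting illustration, the penalty mechanism of the heuristic history probability table may be sketched as follows; the probabilities and the penalty factor are hypothetical values.

```python
class CorridorTable:
    """Minimal sketch of the heuristic history probability table:
    one probability per candidate corridor, with a penalty applied
    when dead reckoning runs past a corridor's valid range."""

    def __init__(self):
        self.prob = {}

    def add(self, corridor, probability):
        self.prob[corridor] = probability

    def penalize(self, corridor, factor=0.2):
        """Reduce a corridor's probability significantly, as when the
        estimated position goes beyond that corridor's range."""
        self.prob[corridor] *= factor

    def best(self):
        """The corridor with the highest probability is taken as the
        best estimate of the user's current position."""
        return max(self.prob, key=self.prob.get)
```

In the corridor A/B scenario above, A initially wins; once A is penalized for running out of range, B becomes the highest-probability corridor and the user is repositioned there.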
Radio based Localization
[00115] In embodiments, the performance of the system can also be enhanced by using external radio signals, such as Wi-Fi, BLUETOOTH™, RFID tags, and cellular signals, which act as external references to confirm estimated position and orientation.
[00116] Radio localization requires a manual sweep of the indoor environment for radio signals (e.g. cell tower, AM/FM signal, Wi-Fi, BLUETOOTH™). Any floor area that the user can traverse must be previously swept to build a signal profile library. For example, to sweep a particular route, the signal collection program and device can continuously record the radio signals along the route and label the signals with the current time. For each of these routes, time stamps are used to estimate the location of the recorded signals. This is done in a simple way, where signals with earlier time stamps are associated with positions near the start of the particular route, and signals with later time stamps are associated with positions near the middle or end of the route.
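By way of non-limiting illustration, the time-stamp-based assignment of positions along a swept route may be sketched as a linear interpolation between the route's start and end points; the coordinates and timestamps shown are hypothetical.

```python
def positions_from_timestamps(timestamps, route_start, route_end):
    """Assign each recorded signal a position on the route by linear
    interpolation on its timestamp: earlier stamps map near the start
    of the route, later stamps near the middle or end."""
    t0, t1 = timestamps[0], timestamps[-1]
    span = (t1 - t0) or 1  # guard against a zero-length sweep

    def lerp(a, b, f):
        return a + (b - a) * f

    return [(lerp(route_start[0], route_end[0], (t - t0) / span),
             lerp(route_start[1], route_end[1], (t - t0) / span))
            for t in timestamps]
```

A signal stamped halfway through the sweep is placed at the midpoint of the route, building up the signal profile library position by position.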
[00117] The collected radio data is then pre-processed before being used for localization. Pre-processing must be done since different devices have different antenna specifications and firmware which interprets the radio electro-magnetic fields uniquely. So inherently different devices may have different radio maps having different profile values. In embodiments, a method is provided to transform one radio map obtained from one device to radio maps that can be used by other devices. All that is needed is to find the minimum and maximum radio strengths detectable from the new device firmware and normalize the existing map to fit the new range of signals read by this device.
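By way of non-limiting illustration, the radio map transformation described above may be sketched as a linear rescaling from one device's detectable signal range to another's; the RSSI values shown are hypothetical.

```python
def normalize_radio_map(radio_map, old_min, old_max, new_min, new_max):
    """Transform a radio map built with one device so it can be used
    by another device, by rescaling each recorded signal strength from
    the old device's detectable range onto the new device's range."""
    scale = (new_max - new_min) / (old_max - old_min)
    return {pos: new_min + (rssi - old_min) * scale
            for pos, rssi in radio_map.items()}
```

For example, a map recorded on a device reading -100 to 0 can be fitted to a device whose firmware reports -90 to -30, without re-sweeping the facility.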
[00118] From the pre-processed data, a Bayesian network is used to generate, with heuristic probabilities, the signal profiles for various positions on the floor. Then, during run-time the system passes the current signals being read from the antenna through the Bayesian network which then infers which location has the highest probability.
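By way of non-limiting illustration, the run-time inference step may be sketched as a naive-Bayes comparison of the current antenna reading against per-location Gaussian signal profiles; the profile means/deviations and access point names are hypothetical.

```python
import math

def most_likely_location(reading, profiles):
    """Return the floor position whose signal profile gives the current
    antenna reading the highest (log-)likelihood. Each profile maps an
    access point to a (mean, stddev) pair of expected signal strength."""
    def log_likelihood(profile):
        total = 0.0
        for ap, rssi in reading.items():
            mean, std = profile[ap]
            # Gaussian log-density up to a constant term
            total += -((rssi - mean) ** 2) / (2 * std ** 2) - math.log(std)
        return total

    return max(profiles, key=lambda loc: log_likelihood(profiles[loc]))
```

A reading close to the "lobby" profile mean is assigned to the lobby; in the full system, this coarse estimate would then be refined by the inertial methods described below.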
[00119] In a further aspect, the radio sweeping process may be crowd sourced using the inertial navigation (dead reckoning) system mentioned above. This way, while users use the inertial based navigation system they can also be taking a sweep, and collecting the radio signals passively. That is, en route the inertial navigation system can be used to tag the location of incoming radio signals with the current estimated position. This crowd sourcing approach has the advantage in building radio signal maps at multiple locations very fast from collected user data.
Radio-Inertial Based Sensor Fusion
[00120] Both the inertial location estimation and radio location estimation methods described above can be used in combination to provide more accurate estimation of the device's orientation and position. Radio prediction raises certain issues. One is coarse grain localization, which usually has an uncertainty of 5-10 meters and can go as high as 15 meters. This uncertainty radius depends on the environment and is not predictable or controllable. This is the main disadvantage of radio localization.
[00121] In embodiments, the radio localization is augmented with the inertial navigation methods to obtain fine grain position estimation. In this manner, an initial position once the computer system is first turned on is estimated from radio localization. With successive radio estimates, a pattern and line of movement of the user are identified. This information is used to set the user heading/orientation. Feeding the position and orientation information to the inertial navigation system gives the ability to distinguish fine grain position movement on the order of steps. Also, a dynamic Kalman filter may be used to amalgamate the position estimates coming from inertial navigation and Wi-Fi estimation. In general, a Kalman filter is used in many position estimation problems, the fundamental reason being that the Kalman filter is proven to be an advantageous recursive filter able to combine two independent sets of noisy inputs into an output signal with minimized noise. It takes into account the known uncertainty distributions of each sensor and understands the situations where one sensor can give more reliable information than the other. So in one scenario, two different inputs, one coming from the inertial navigation estimate and the other from the radio estimate, are provided. The uncertainty distributions of each, based on the context of the map, are dynamically varied. For example, if it is known a priori that Wi-Fi performs poorly at certain hallway intersections, then the dynamic Kalman filter will weight the inertial estimate higher. The dynamic Kalman filter can produce position estimates with 2-3 meter uncertainty.
Web Portal
[00122] In embodiments, a web portal is included, implemented for example to a server computer including a web based utility, that when executed presents certain map information and views based on access granted to users. Based on user authentication/authorization, it is possible to change, update, and edit the map and content information. Facility administrators who are responsible for way-finding can sign into the web portal to update allowed routes, mark off construction zones, highlight detour corridors, and mark private and public routes (e.g., staff would have access to private corridors during navigation). Once these updates are communicated to the main server, all mobile devices running the specific way-finding computer program will be notified of the update and asked to download the changes to the new map database, file, and images. Marketing administrators can also log in to the web portal for managing content related to location based alerts, department information, and any ongoing promotions. Once these updates are uploaded to the main platform server, the next time the user's mobile device contacts the server the content database will be updated. When a user clicks on a department icon, the new information will be displayed. Or, if the user is within some radius of the department, a pop-up alert with the new information can be displayed.
[00123] The Web portal can also be made available to members of the public who would like navigation and routing information. In this manner an individual can access the web portal and request routing information from the main server. They will be required to set a starting point and a destination point through the web portal interface and a route will be returned and plotted on the facility map. The web map can also have an interactive component where users see an icon (representing position) jump from the initial starting point, to key turn locations all the way to their destination by clicking next and previous arrows on the map. The URL of the map page can be copied and emailed to anyone using any device with a web browser (out-of-band communication), which when opened will display the map with the requested route pre-plotted. Once the URL is opened on a mobile device or computer system the navigation information as well as the interactive component is cached on local memory so the user can view and interact with the map offline.
[00124] In embodiments, once a user has set a route in the web interface or mobile interface, the user may generate a URL link through the application to send to another user, as described above. The URL may be sent to an email address or shared in a chat room. In embodiments, the URL contains all embedded data for route configurations to be re-created on the web or mobile application on another device, preferably without the need for data connectivity if the application is a mobile application.
[00125] In further aspects, the user may create "tours" which combine routes with embedded location based alerts with customized triggers on the web or mobile application. The user may generate a URL link of the tour to send to another device. For example, a user creating a tour for a grocery store may include destination points for items such as eggs, milk and bread. The user may further include proximity based alerts. In the above example, the user may create alerts such as "one 2% bottle, and one bag of skim milk" that appear when the device approaches the dairy shelf. The location based alert contents are stored on the remote server and associated with IDs; the IDs are stored in the URL link. The device receiving the URL link may then download the location based alert contents associated with those IDs to recreate the tour.
User Interface
[00126] The user interface enables presentation of way-finding information. The application is first initiated by the user on the mobile device. On initiation, GPS geo-fencing can be used to determine all the indoor venues that the user is near (FIG 7a). If the map is stored on the mobile device, then the floor plan becomes viewable on the device screen (FIG 7b). If the map is not stored in the device's local locations database, then the map and locations database can be downloaded from the main server through an active data/Wi-Fi connection. With this, the user can browse locations and map features. To initiate navigation, in one implementation, the application prompts the user to pan the camera and capture signage information (FIG 7c) to infer initial position and orientation of the device. The user can then search for a destination through the smart search feature, and once a destination is chosen a route will be plotted from their initial location (FIG 7d). Audio cues, alerting users when to turn, as well as automated map re-orientation (orienting the map based on device and route orientation) will be activated. While on route the user can also receive location based content (FIG 7e). Once the user arrives at the destination they will be notified by an audio cue and a new marker (green checkmark icon) at their destination (FIG 7f).
[00127] Facility management and administration will also be able to check the current status of the application through the web portal. The dashboard page (FIG 7g) will be the main page, giving snapshot updates on various components. Administration can view recent chat messages, analytics on quarterly usage, etc. The analytics page gives the administration in-depth coverage of useful statistics mined from the main server (FIG 7h). These could be foot traffic hot spots based on time of day, or infection hotspots based on routing and medical record information. The users/staff page allows management to grant specific access to users based on their authorization (FIG 7i). The maps page allows administration to make updates to map routes, markers, and location based content (FIG 7j). Administration can update route weights to account for renovation/construction, or change location based alert messages.
Definition of Map Data
[00128] Map information or map data may incorporate venue or building specific data used by the computer system. In at least one mapping solution, building maps may be rendered as vectorised map tiles used for visual representation of buildings on the screen of a user's mobile device. The vectorization information comprises a collection of boundaries represented by polygon coordinate vertices, used to segment the venue map into departments, rooms, corridors, and open areas. Vectorization information may also comprise singular coordinates called markers, used for representing individual item locations such as vending machines, ATMs, doors, and entrances. The map data may further comprise another set of boundaries and polygon perimeters defined to segment map areas to assign various attributes and properties. For example, those areas which are handicap accessible may be segmented and assigned accessibility attributes. Corridors, hallways, and passageways may be segmented and assigned attributes according to security clearances dictating which classes of users may access these routes. This may allow building staff, or other users meeting thresholds of security clearance, to access more detailed and sensitive routing information while hiding these features from the general public. A skilled reader will appreciate that this and other vectorization information may enable and/or improve the generation of way-finding information as described herein.
[00129] Map data may further comprise a directory list which associates occupant information with map segmented areas. For example, in a hospital all department segments may have a corresponding directory of staff who work in that department, as well as the scheduled hours for each staff member. Additionally, patient room segments may have a directory list which includes the doctor and/or patient who is scheduled to be in that room. The directory list for each segmented map area may also store directory information for different times based on a known schedule of events.
[00130] Still further, map data may comprise a database storing location based alerts and/or services and their corresponding locations. The location based alerts may be customizable for various users based on their authorization and profile information. The location based alerts may also be provided with associative distance thresholds to determine when the alerts will be pushed onto the device screen: A user's position is continuously monitored and compared against the database to see if the user's position is within an alert radius. A skilled reader will appreciate that these and other aspects may enable and/or improve user selection of start and destination points, and location based services and alerts.
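The distance-threshold comparison described above can be sketched as follows; the positions, radii, and field names are illustrative assumptions.

```python
import math

def within_alert_radius(user_pos, alert_pos, radius_m):
    # True when the user's position falls inside the alert's distance threshold.
    return math.hypot(user_pos[0] - alert_pos[0], user_pos[1] - alert_pos[1]) <= radius_m

# Hypothetical alerts with associative distance thresholds (floor-plan metres).
alerts = [
    {"id": "dairy", "pos": (10.0, 5.0), "radius": 3.0},
    {"id": "bakery", "pos": (40.0, 20.0), "radius": 5.0},
]

def alerts_to_push(user_pos, alerts):
    # Run as the continuously monitored position is compared against the database.
    return [a["id"] for a in alerts if within_alert_radius(user_pos, a["pos"], a["radius"])]
```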
Downloading Map Data to Devices
[00131] Although the tracking and routing functionalities may work when the system is offline, i.e., not in communication with any external server, one requirement for offline navigation is that certain map data must be pre-downloaded from the remote server. Methods are therefore provided for downloading map data.
[00132] In embodiments, map data are downloaded from the server to device memory on occurrence of various triggering events. For instance, a trigger may occur when a user selects a venue within which to navigate. The application checks whether the venue map and location database are already stored on the device memory and whether their timestamps are up-to-date with respect to the timestamps of the venue map and location database stored on the server. If this check returns false, and an active data or Wi-Fi connection exists, the application automatically contacts the map server to begin downloading the relevant venue map data; otherwise, the user is alerted to first establish an active data connection to begin downloading.
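A minimal sketch of this triggering logic, assuming hypothetical local and server database layouts keyed by venue ID and carrying a `timestamp` field:

```python
def needs_download(venue_id, local_db, server_db):
    # True when the venue map is missing locally, or its timestamp is stale
    # relative to the copy held on the server.
    local = local_db.get(venue_id)
    if local is None:
        return True
    return local["timestamp"] < server_db[venue_id]["timestamp"]

def on_venue_selected(venue_id, local_db, server_db, has_connection):
    # Trigger fired when the user selects a venue within which to navigate.
    if not needs_download(venue_id, local_db, server_db):
        return "use cached map"
    if has_connection:
        return "download map data"
    return "alert user: establish a data connection"

server_db = {"v1": {"timestamp": 10}}
decision = on_venue_selected("v1", {}, server_db, has_connection=True)
```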
[00133] Other triggers may be established by geo-fencing the GPS location of the user's device. The device location may be monitored while the application runs in the background. Periodically the GPS coordinates are compared against a list of other known venue coordinates stored in device memory and, occasionally, the updated venue list stored on the cloud server if accessible. All venues located within a distance threshold of the GPS location are returned. If a data connection is established, the maps for the returned venues may be downloaded in the background in anticipation that the user is more likely to visit these venues. An extension of this method is to use GPS location history to identify popular geo-zones which the user frequents. Map data for corresponding venues in proximity to these zones may be downloaded while the application runs in the background.
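The geo-fence comparison above can be sketched with a standard haversine distance; venue names, coordinates, and the 500 m threshold are illustrative assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres between two GPS coordinates.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def venues_within(gps, venues, threshold_m):
    # Return venues located within the distance threshold of the device's GPS fix.
    return [v["name"] for v in venues
            if haversine_m(gps[0], gps[1], v["lat"], v["lon"]) <= threshold_m]

venues = [
    {"name": "City Hospital", "lat": 43.6540, "lon": -79.3830},
    {"name": "Uptown Mall", "lat": 43.7000, "lon": -79.4000},
]
nearby = venues_within((43.6532, -79.3832), venues, threshold_m=500)
```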
[00134] Still further triggers involve identifying radio signatures the device is in contact with. These include Wi-Fi connection properties such as SSID, MAC ID, or IP address, which usually correspond to a set of access points within a specific location range. Additionally, if the device has an established cellular network connection, the base tower ID, located within the vicinity, corresponds to a specific geo-location. When a GPS signal is not available, these alternative signals may be used to infer the macro-level position of the user on the order of hundreds of meters to a few kilometers. This is generally sufficient granularity to analyze the movement patterns of the user in order to identify which venue maps the application should download in the background.

Download priority
[00135] A skilled reader will appreciate that the quality of the device data connection cannot always be trusted, and a data connection can often be interrupted mid-download due to poor reception. This could disrupt or pause the navigation process because map data are not available when needed. Therefore, a priority download queue system is provided.
[00136] In embodiments, the priority download queue system prioritizes, in a download queue, the order in which venue map data are downloaded. Map data for each venue may be segmented by department, floor, and resolution to create a priority queue in which certain components of venue map data are downloaded first. For example, in the case where a user's GPS position shows the user entering or nearing a venue such as an airport, the download queue for the map data can be arranged as follows: The map data pertaining to the nearest entrance or most common entrances will be downloaded first, followed by map information relating to check-in and security areas; following that, map information relating to gate and boarding areas will be downloaded. This represents the order in which the user traverses the airport, and therefore the time at which specific data will be needed.

[00137] In further embodiments, the priority download queue may be extended to venues with multiple storeys, such as office buildings and multi-level retail complexes. Map data for entrances will be prioritized in the download queue, followed by data related to entrance level floors, and then followed by data pertaining to adjacent floors.
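The airport ordering described above can be sketched with a standard heap-based priority queue; the segment names and priority ranks are illustrative assumptions.

```python
import heapq

# Hypothetical priority ranking for an airport: lower numbers download first,
# mirroring the order in which a user typically traverses the venue.
AIRPORT_PRIORITY = {
    "entrances": 0,
    "check-in and security": 1,
    "gates and boarding": 2,
    "retail and lounges": 3,
}

def build_download_queue(segments, priority):
    # Unknown segments sink to the back of the queue with a large default rank.
    heap = [(priority.get(s, 99), s) for s in segments]
    heapq.heapify(heap)
    order = []
    while heap:
        order.append(heapq.heappop(heap)[1])
    return order
```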
[00138] In still further embodiments, priority is assigned on the basis of which areas are known to be more confusing to navigate. Map data related to those areas which are expected to be more confusing to navigate are downloaded first, thereby reducing the chances of navigation interruption occurring at those locations.
[00139] It will be appreciated that the priority download queue system may enhance the user experience by mitigating the effects of interruptions to downloads of venue map data caused by poor data connections.
Map Data Display Optimization
[00140] In embodiments, the user is shown on the user's device an interactive map of venues or buildings. By using common finger gestures, the user may be able to perform pan and zoom functions to explore the entirety of the map. Loading and displaying all aspects of the map data, such as marker items and department names, during these user interactions may be costly in device resources. Often there may be thousands of markers for a specific floor in the map database. Updating the positions and orientations of many marker items and department names during gesture commands may overload the device processor, resulting in a significant lag time apparent to the user.
[00141] Measures are therefore provided to ensure that the content displayed on the device runs with minimal lag during navigation mode or user interaction. The map area is divided into smaller map sectors. The size of each sector is dictated by the current zoom level of the application. For example, at the lowest zoom level the sector size could be as big as the entire venue floor, while at the highest level the sector size could be as small as a room. Zoom attributes are assigned to items, such as markers or room boundaries, contained in the map data. The zoom attributes dictate the zoom levels at which items are displayed on screen. This allows marker items such as vending machines to be displayed only at higher zoom levels, while department names, for example, may be displayed at lower zoom levels. This also may allow for nested map displays or "map within a map" situations. For example, a department store inside a mall may be displayed as a general non-detailed polygon on the map at low zoom and may include one or two markers describing the main store properties. When the map is zoomed in on this location, the map may be populated with department store specific details, such as product aisles, cash register locations, etc.
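The zoom-attribute filtering described above can be sketched as follows; the sector representation, `min_zoom`/`max_zoom` fields, and example items are assumptions for illustration.

```python
def visible_items(items, sector, zoom):
    # An item is displayed only if the current zoom level falls within the
    # item's zoom attribute range and the item lies inside the current sector.
    shown = []
    for item in items:
        in_zoom = item["min_zoom"] <= zoom <= item["max_zoom"]
        x, y = item["pos"]
        in_sector = sector["x0"] <= x <= sector["x1"] and sector["y0"] <= y <= sector["y1"]
        if in_zoom and in_sector:
            shown.append(item["name"])
    return shown

# Department names appear at low zoom; fine-grained markers only at high zoom.
items = [
    {"name": "Department A", "pos": (5, 5), "min_zoom": 0, "max_zoom": 2},
    {"name": "vending machine", "pos": (5, 6), "min_zoom": 3, "max_zoom": 5},
]
sector = {"x0": 0, "y0": 0, "x1": 10, "y1": 10}
```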
[00142] During map traversal, only the items in each sector which have zoom attributes in accordance with the current zoom level will be pushed onto a display stack managed by the application. Items on the stack will be updated frequently for accurate map display by calculating new pixel positions on the screen during user interaction and navigation mode, orientation during map rotation, and item resizing when zooming. Items outside the sector will not be included, or will be pushed off the stack during sector transitioning. By maintaining only sector-level data, the device does not need to process the entire floor's data, but only subsets thereof, reducing the overall computations processed. This may further enhance the user experience by de-cluttering the application window.
Estimated Time of Arrival
[00143] In embodiments, an estimated time of arrival may be provided to the user once a destination and start position are determined. Estimated arrival time can be determined using an average step length and average step frequency gathered from collected user motion data as previously described. For example, if the distance from the starting point to the destination is 100 feet, the average step length is found to be 2 feet and the average step frequency is estimated to be 2 steps per second, then the estimated arrival time may be determined:
Estimated arrival time [s] = Distance [ft] / (Average step length [ft/step] × Average step frequency [steps/s]) = 100 / (2 × 2) = 25 seconds
Average step length may be regularly adjusted based on data averaged from past navigation completion times for the current user.
[00144] The estimated time of arrival determination may incorporate corrections based on motion analytics gathered from past data using the application or from other sources. Based on past data, certain routes or passageways may be identified as having unusual delay patterns caused by external factors that tend to occur based on a time of day, week, month, or season. For example, in a retail store, the external factors may include foot traffic during peak periods, seasonal shopping trends, and renovations. The external factors may be incorporated into the application routing algorithm to produce a new effective distance for each route or passageway:

Distance_effective [ft] = Distance_actual [ft] × α_delay

Above, α_delay is a penalty factor applied to a route passageway to account for external delay factors, which may vary with time. The incorporation of such corrections may provide a more precise routing model to determine the optimal path a user should take to reach a destination, and to estimate the time of arrival. Further, the estimated time of arrival may be relayed to the user and updated in real time during navigation based on the application configuration.
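Combining the two relations above, the ETA computation can be sketched as follows (function and parameter names are illustrative, not from the specification):

```python
def estimated_arrival_s(distance_ft, step_length_ft, step_freq_hz, delay_factor=1.0):
    # ETA = effective distance / walking speed, where walking speed is
    # step length x step frequency, and delay_factor (alpha_delay >= 1)
    # inflates the actual distance to account for external delay factors.
    effective_ft = distance_ft * delay_factor
    speed_ft_s = step_length_ft * step_freq_hz
    return effective_ft / speed_ft_s

# Worked example from the text: 100 ft at 2 ft/step and 2 steps/s.
eta = estimated_arrival_s(100, 2, 2)
```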
Multi-Destination Routes
[00145] In embodiments, the user may select within the application a multipoint or multi-destination route. Multi-destination routes allow the user to select several destinations from the origin point. When the user arrives at each destination, an icon is displayed on the map, prompting the user to, for example, continue along the route, cancel the list of destinations, or modify the remaining destinations. In a grocery store, for example, the user may have a shopping list comprising eggs, milk, bread, etc. Each of these items may be located in a different location within the grocery store. In embodiments, the list could be translated into a multi-destination route with destinations provided at the respective locations of each item on the user's shopping list.
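The specification does not say how the stops are ordered; one simple way to sketch it is a greedy nearest-neighbour heuristic, shown below with hypothetical floor-plan coordinates.

```python
import math

def order_stops(origin, stops):
    # Greedy nearest-neighbour ordering: repeatedly walk to the closest
    # remaining destination. A heuristic sketch only, not the app's algorithm.
    remaining = dict(stops)
    position, route = origin, []
    while remaining:
        nearest = min(remaining, key=lambda name: math.dist(position, remaining[name]))
        route.append(nearest)
        position = remaining.pop(nearest)
    return route

# Hypothetical shopping-list items at floor-plan coordinates.
stops = {"eggs": (2.0, 1.0), "milk": (9.0, 9.0), "bread": (5.0, 4.0)}
```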
Real-Time Appointment Scheduling
[00146] The application may also be used for managing appointments at multiple locations. One major concern when creating an appointment is whether the parties attending will be on time, or will not make it at all. For example, in a hospital context, doctors and patients are frequently scheduled consecutively. When one patient or doctor is late to an appointment, the entire schedule for the day becomes shifted. Therefore, the application may be configured to send updates on user location and predicted arrival times to appointment location staff to better manage appointment schedules or queues. These updates may be periodic or based on certain triggers, such as when the user enters or starts the route to the appointment location, or when the estimated time of arrival exceeds the appointment start time. The above automated notifications may be sent to both the user and the staff at the appointment location. Notifications to the staff may be sent to the server, which relays them to the client application used by the location staff. The communications may be used for the purposes of cancelling or rescheduling the appointment. The user may also be notified or asked to confirm certain event changes if need be. Additionally, users may authorize appointment location staff to see the user's current position while en route to the appointment. The user's position may be updated periodically for the location staff to monitor.

Device-to-Device Communication
[00147] In embodiments, a communication protocol is provided for enabling an application running on one device to communicate directly with applications running on other devices using the BLUETOOTH™ architecture found in many mobile devices. Devices may scan for, connect to, or broadcast data to other devices running the application within typical BLUETOOTH™ range, i.e., less than about 50 metres. Information such as device location and position, specific map data, and user generated alerts may be shared. Although the bandwidth of data thus shared may be on the order of megabits per second, this can be adequate in certain applications.
[00148] One embodiment of the communication protocol relates to the emergency notification feature of the application. In aspects, users may broadcast via the application an emergency message to venue staff or security via the server. If the user's device is unable to establish a connection to the server in order to broadcast the emergency message, the application additionally broadcasts the message via BLUETOOTH™ to other devices within range running the application. Once the other devices receive the message, they push the message to the server if they are able to connect to the server; otherwise, the other devices re-broadcast the message via BLUETOOTH™ to other devices within their proximity. The other devices continuously re-broadcast the message until they are notified that the server has successfully received the message. A skilled reader will appreciate that communications gaps resulting from data dead spots within facilities may thereby be overcome.
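The re-broadcast behaviour above amounts to flooding the message across in-range devices until one with server connectivity relays it. A simulation sketch, with the device graph and reachability flags as illustrative assumptions:

```python
def propagate(start, devices, server_reachable):
    # Flood an emergency message over simulated BLUETOOTH links (breadth-first)
    # until some device with server connectivity can push it to the server.
    # `devices` maps each device to its in-range neighbours.
    seen, frontier = {start}, [start]
    while frontier:
        next_frontier = []
        for device in frontier:
            if server_reachable.get(device):
                return device  # this device relays the message to the server
            for neighbour in devices.get(device, []):
                if neighbour not in seen:
                    seen.add(neighbour)
                    next_frontier.append(neighbour)
        frontier = next_frontier
    return None  # no path to the server yet; devices keep re-broadcasting

devices = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
server_reachable = {"A": False, "B": False, "C": True}
```

In the real protocol the hops happen asynchronously and repeat until an acknowledgement arrives; the breadth-first walk here just shows that a multi-hop relay path can bridge a data dead spot.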
Different Applications
[00149] There are many possible applications into which this technology can be integrated. Some possible examples of implementations and uses are described below.
[00150] During user navigation of indoor facilities, users might have questions regarding hours of operation of various stores and departments, or administrative information such as which doctors are available at a hospital during the day.
[00151] In one implementation, contact may be initiated with a front/info desk or off-site customer service representative via the mobile device, through the messaging service, transferring data through a secure/unsecure wireless server connection. This provides a convenient alternative to physically walking to the info desk. In addition, if the user is localized, their position information can automatically be relayed to the support staff, who can then give better responses to various inquiries. For example, if the user is contacting the support staff to report an emergency, the location of the emergency can be crucial for support staff to notify emergency personnel. The customer service staff can access any received messages by logging on to the web portal; once they have entered their access code, they can see the history of all incoming/outgoing messages, join any threads, and continue or initiate conversations. The messages can also be stored on the central server for further review at a later time. Data mining of the stored messages can reveal certain trends connecting location and the questions that users ask. Even the choice of words can be mined from the messages to learn whether users feel more frustrated, angry, or confused in certain locations as opposed to others.
[00152] In medical offices, buildings, or institutions, integration of the way-finding application with the electronic medical records system is proposed. This way, key information can be scraped from the records that could help the user. For example, if the user is known to have a baby in postnatal care, and the records show that the baby requires express feeding spontaneously, this information can be used to populate the way-finding application's emergency destinations list with a pumping room or feeding room. This emergency destination will appear in a convenient, easy to see location on the mobile device. When the button is clicked, the application will route the user to the nearest pumping room. If the user is not initially localized, the option to localize using camera pan will pop up, or, if radio localization is turned on, automatic localization will commence.
[00153] During active localization of the user, navigation and position history can be stored for later analysis. It is proposed that the mobile device store attributes such as the start and end points of each user's route, the time taken, and points where the user is idle on the device's internal memory. The stored information can then be sent to the main server via secure/unsecure wireless transfer. One can draw many conclusions from the aggregated data from many users and many routes stored on the server. Areas of high traffic, areas where a majority of users pause during navigation, and preferred routes taken to get to a destination can all be used for marketing and logistics. This information can be accessed from the web portal, which provides a convenient dashboard for viewing all the analytics.
[00154] One interesting possibility arising from the analytics is infection tracking in hospitals. From the aggregated route information and the integration with the electronic medical records, potential high-risk areas can be predicted. For example, if it comes to the attention of the hospital that one of its patients has acquired a potentially high-risk contagious disease, the facility can go through the analytics and see the routes that the individual has taken. Comparing this with other users who have taken routes in the same vicinity and time frame can lead to conclusions as to where the disease could spread.
[00155] Another application is checking in to a facility. This could be a restaurant, clinic, hospital, or store where users are required to wait in a queue to be served. Usually the facility, vendor, or store requires the person to be present to check in or be added to the queue. Using the location aware ability of the application, specifically when radio localization is used, the application knows whether the user is indeed at the facility, vendor area, or store, and can then grant the permissions to automatically check in and join the queue. The application can also be used to alert users when their queue position nears the top, and can remind them to head back to the service location if they are away.
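The check-in and queue-alert behaviour can be sketched as follows, assuming the application has already verified on-site presence via radio localization; the class, method names, and alert threshold are illustrative.

```python
class CheckInQueue:
    # Minimal sketch of automatic check-in and queue position alerts.
    def __init__(self, notify_at=3):
        self.queue = []
        self.notify_at = notify_at

    def check_in(self, user, on_site):
        # Permission to self check-in is granted only when the user is localized
        # at the facility; remote users cannot join the queue.
        if not on_site:
            return False
        self.queue.append(user)
        return True

    def serve_next(self):
        return self.queue.pop(0) if self.queue else None

    def should_alert(self, user):
        # Alert the user when their queue position nears the top.
        return user in self.queue and self.queue.index(user) < self.notify_at
```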
[00156] Certain services provided in facilities could have satellite locations dispersed around the venue. For example, pharmacies, banks, and kiosks may have multiple locations in one building. If the user has requested a service from one of these vendors, it would be useful for the vendor to know the location information of the user, so as to manage which satellite branch will service the user and allocate resources accordingly. If the user has a route plotted or an appointment booked at a location near the satellites, management can arrange for the nearest satellite location to service the user.
[00157] Locating other individuals within a facility may also be desired. It is proposed to take advantage of the location aware capabilities of the application to request the location of other users of the application. The application includes an option to set the user's status to online or offline mode. When online, users can view profiles of other online users, message them, and even request their location. If the user consents to relay the location information, a route can be plotted to that location. This can be vital in a hospital or other facility where staff using the computer system or their mobile devices would like to communicate with and find other staff.
[00158] The application can be used to broadcast and notify other users and facility management of emergencies. Users running the application have access to an emergency button that, if pressed, sends a message to the server which includes the user, their location, and, if applicable, the state of emergency. As described in [00148], the application can also broadcast messages via BLUETOOTH™ to other devices running the application that are within proximity.
[00159] Combining some of the features/applications, different modes of navigation can also be activated, such as leisure or time-sensitive modes. The application can read appointment entries made through the application, and native or linked calendars, to identify whether the user is in a rush or has time to browse leisurely. In time-sensitive mode, the application plots the most efficient route to the user's appointment or destination. Based on the user's profile, the computer program can also identify what the user's interests are (clothing, gadgets, etc.), and in leisure mode the user can be routed past points of interest that relate to that profile.
Possible Implementations
[00160] In embodiments and depending on the particular implementation and various associated factors such as the resources of the mobile device, wireless network parameters, and requirements of the content distribution of social media platforms, different implementation architectures may be used.
[00161] The technology described may be integrated with or may connect to various other platforms, technologies, or solutions for which way-finding is complementary. For example, the computer system may incorporate systems and processes for delivering various location-based services such as location-based advertising, offers from local businesses (including businesses along a route shown by the computer system), news/info/entertainment services, social networking services, and so on.
[00162] The computer system may also integrate voice activated features for initiating way-finding and other functions.
[00163] The present system and method may be practiced in various embodiments. A suitably configured computer device, and associated communications networks, devices, software and firmware may provide a platform for enabling one or more embodiments as described above. By way of example, FIG. 5 shows a generic computer device 100 that may include a central processing unit ("CPU") 102 connected to a storage unit 104 and to a random access memory 106. The CPU 102 may process an operating system 101, application program 103, and data 123. The operating system 101, application program 103, and data 123 may be stored in storage unit 104 and loaded into memory 106, as may be required. Computer device 100 may further include a graphics processing unit (GPU) 122 which is operatively connected to CPU 102 and to memory 106 to offload intensive image processing calculations from CPU 102 and run these calculations in parallel with CPU 102. An operator 107 may interact with the computer device 100 using a video display 108 connected by a video interface 105, and various input/output devices such as a keyboard 110, mouse 112, and disk drive or solid state drive 114 connected by an I/O interface 109. The mouse 112 may be configured to control movement of a cursor in the video display 108, and to operate various graphical user interface (GUI) controls appearing in the video display 108 with a mouse button. The disk drive or solid state drive 114 may be configured to accept computer readable media 116. The computer device 100 may form part of a network via a network interface 111, allowing the computer device 100 to communicate with other suitably configured data processing systems (not shown). One or more different types of sensors 130 may be used to receive input from various sources.
[00164] The present system and method may be practiced on virtually any manner of computer device including a desktop computer, laptop computer, tablet computer, or wireless handheld. In embodiments, the present system and method may also be implemented as a computer-readable/useable medium that includes computer program code to enable one or more computer devices to implement each of the various process steps. Where more than one computer device performs the entire operation, the computer devices are networked to distribute the various steps of the operation. It is understood that the terms computer-readable medium or computer useable medium comprise one or more of any type of physical embodiment of the program code. In particular, the computer-readable/useable medium can comprise program code embodied on one or more portable storage articles of manufacture (e.g. an optical disc, a magnetic disk, a tape, etc.), or on one or more data storage portions of a computing device, such as memory associated with a computer and/or a storage system.
[00165] In embodiments, the mobile application may be implemented as a web service, where the mobile device includes a link for accessing the web service, rather than a native application.
[00166] The functionality described may be implemented on any mobile platform, including the iOS platform, ANDROID™, WINDOWS™, or BLACKBERRY™.
[00167] It will be appreciated by those skilled in the art that other variations of the embodiments described herein may also be practiced without departing from the scope of the embodiments. Other modifications are therefore possible.
[00168] The computer systems and methods disclosed may provide cost effective and accurate way-finding solutions for indoor environments, and possibly additional outdoor environments.
[00169] In further aspects, the disclosure provides systems, devices, methods, and computer programming products, including non-transient machine-readable instruction sets, for use in implementing such methods and enabling the functionality described previously.

[00170] Although the disclosure has been described and illustrated in exemplary forms with a certain degree of particularity, it is noted that the description and illustrations have been made by way of example only. Numerous changes in the details of construction and combination and arrangement of parts and steps may be made. Accordingly, such changes are intended to be included in the invention, the scope of which is defined by the claims.
[00171] Except to the extent explicitly stated or inherent within the processes described, including any optional steps or components thereof, no required order, sequence, or combination is intended or implied. As will be understood by those skilled in the relevant arts, with respect to both processes and any systems, devices, etc., described herein, a wide range of variations is possible, and even advantageous, in various circumstances, without departing from the scope of the invention, which is to be limited only by the claims.

Claims

What is claimed is:
1. A way-finding computer system for presenting way-finding information to a target location to a user, the way-finding system comprising a mobile device configured to:
acquire an image of an identifier of a location in a physical environment;
extract from the image information regarding the identifier;
compare the information to a map database to determine a location of the mobile device,
the location defining a starting location;
determine way-finding information based on the starting location and the target location; and
present the way-finding information on a display of the mobile device.
2. A way-finding computer system for presenting way-finding information to a target location to a user, the way-finding system comprising a mobile device configured to:
initiate a starting location determination routine to determine a starting location;
initiate at least one device location determination routine comprising dead reckoning to determine a location of the device;
determine way-finding information based on the starting location, the at least one target location, and the location of the device; and
present the way-finding information on a display.
3. The way-finding computer system of claim 2, wherein the dead reckoning comprises:
initiating one or more inertial functions in the device to provide inertial data;
receiving the inertial data from at least one sensor; and
analyzing the inertial data to track movement of the device.
4. The way-finding computer system of claim 3, wherein each of the at least one sensor providing inertial data comprises a gyroscope, accelerometer or magnetometer.
5. The way-finding computer system of claim 3, wherein the dead reckoning further comprises reducing bias in the inertial data by applying at least one recursive algorithm to the inertial data.
6. The way-finding computer system of claim 5, wherein the dead reckoning further comprises reducing bias in the inertial data by aggregating the inertial data received from each of two or more sensors and calibrating the inertial data received from one sensor against the data received from another sensor.
7. The way-finding computer system of claim 6, wherein the calibrating comprises applying a Kalman filter to the inertial data.
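Claim 7's use of a Kalman filter to calibrate one sensor against another can be illustrated in its simplest scalar form: blend two estimates of the same quantity, weighted by their variances. This is a generic textbook update, not the filter design disclosed in the application, and the heading/variance figures are invented for the example:

```python
def kalman_update(est, var, meas, meas_var):
    """One scalar Kalman update: combine the current estimate with a new
    measurement, each weighted inversely by its variance."""
    k = var / (var + meas_var)          # Kalman gain
    new_est = est + k * (meas - est)
    new_var = (1 - k) * var
    return new_est, new_var

# Calibrate a gyroscope-integrated heading against a magnetometer reading
# of equal variance: the result splits the difference and halves the variance.
est, var = kalman_update(10.0, 4.0, 12.0, 4.0)
print(est, var)  # 11.0 2.0
```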
8. The way-finding computer system of claim 7, wherein the device location determination routine further comprises reducing bias by aggregating the inertial data with map data in a heuristic probability table to provide corrected inertial data.
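Claim 8 aggregates inertial data with map data to correct bias. A common concrete form of that idea is map matching: pull a drifted position estimate back onto the nearest location the map says is walkable. The sketch below is one simple stand-in for that step, assuming the application's heuristic probability table is replaced here by a plain nearest-cell rule; all names are hypothetical:

```python
def map_match(position, walkable_cells):
    """Snap an inertial (x, y) estimate to the nearest walkable map cell,
    correcting drift that would otherwise place the user inside a wall."""
    px, py = position
    return min(walkable_cells,
               key=lambda c: (c[0] - px) ** 2 + (c[1] - py) ** 2)

# A drifted estimate of (1.8, 0.6) gets pulled back onto the corridor.
corridor = [(0, 0), (1, 0), (2, 0), (3, 0)]
print(map_match((1.8, 0.6), corridor))  # (2, 0)
```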
9. The way-finding computer system of claim 2, wherein the at least one device location determination routine further comprises radio localisation.
10. A computer-implemented method for presenting way-finding information to a target location to a user of a mobile device, the method comprising:
acquiring an image of an identifier of a location in a physical environment;
extracting from the image information regarding the identifier;
comparing the information to a map database to determine a location of the mobile device, the location defining a starting location;
determining way-finding information based on the starting location and the target location; and
initiating display on the mobile device of the way-finding information.
11. A computer-implemented method for presenting way-finding information to a target location to a user of a mobile device, the method comprising:
determining a starting location;
determining, by dead reckoning, a location of the device;
determining way-finding information based on the starting location, the at least one target location, and the location of the device; and
initiating display on the mobile device of the way-finding information.
12. The computer-implemented method of claim 11, wherein the dead reckoning comprises:
initiating one or more inertial functions in the device to provide inertial data;
receiving the inertial data from at least one sensor; and
analyzing the inertial data to track movement of the device.
13. The computer-implemented method of claim 12, wherein each of the at least one sensor providing inertial data comprises a gyroscope, magnetometer, or accelerometer.
14. The computer-implemented method of claim 12, wherein the step of determining the location of the device further comprises reducing bias in the inertial data by applying at least one recursive algorithm to the inertial data.
15. The computer-implemented method of claim 14, wherein the step of determining the location of the device further comprises reducing bias in the inertial data by aggregating the inertial data received from each of two or more sensors and calibrating the inertial data received from one sensor against the data received from another sensor.
16. The computer-implemented method of claim 15, wherein the calibrating is accomplished by applying a Kalman filter to the inertial data.
17. The computer-implemented method of claim 16, wherein the step of determining the location of the device further comprises reducing bias by aggregating the inertial data with map data in a heuristic probability table to provide corrected inertial data.
18. The computer-implemented method of claim 11, wherein the step of determining the location of the device further comprises radio localisation.
PCT/CA2014/050395 2013-04-26 2014-04-25 Navigation computer system including landmark identifier scanning WO2014172792A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361816481P 2013-04-26 2013-04-26
US61/816,481 2013-04-26
US201361829562P 2013-05-31 2013-05-31
US61/829,562 2013-05-31

Publications (1)

Publication Number Publication Date
WO2014172792A1 (en) 2014-10-30

Family

ID=51790946

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2014/050395 WO2014172792A1 (en) 2013-04-26 2014-04-25 Navigation computer system including landmark identifier scanning

Country Status (1)

Country Link
WO (1) WO2014172792A1 (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120130762A1 (en) * 2010-11-18 2012-05-24 Navteq North America, Llc Building directory aided navigation


Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11193773B2 (en) 2014-12-18 2021-12-07 Innerspace Technology Inc. Wayfinding system for interior spaces using an auto-generated navigational map
US10670408B2 (en) 2014-12-18 2020-06-02 Innerspace Technology Inc. System for sensing interior spaces to auto-generate a navigational map
WO2016095050A1 (en) * 2014-12-18 2016-06-23 Innerspace Technology Inc. Method and system for sensing interior spaces to auto-generate a navigational map
US10458798B2 (en) 2014-12-18 2019-10-29 Innerspace Technology Inc. Method for sensing interior spaces to auto-generate a navigational map
US10584970B2 (en) 2015-01-15 2020-03-10 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Localization apparatus and localization method
DE102015205097A1 (en) * 2015-01-15 2016-07-21 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Location device and method for localization
US10415978B2 (en) 2015-11-20 2019-09-17 Samsung Electronics Co., Ltd. Landmark location determination
WO2017086561A1 (en) * 2015-11-20 2017-05-26 Samsung Electronics Co., Ltd. Landmark location determination
WO2017149440A1 (en) * 2016-03-01 2017-09-08 Nokia Technologies Oy Method, apparatus and computer program product for navigation in an indoor space
CN110780325A (en) * 2019-08-23 2020-02-11 腾讯科技(深圳)有限公司 Method and device for positioning moving object and electronic equipment
US11466989B2 (en) * 2019-11-08 2022-10-11 Industry Academy Cooperation Foundation Of Sejong University Techniques for indoor positioning
WO2021110335A1 (en) * 2019-12-02 2021-06-10 Audi Ag Method for determining the position of a user of a vehicle after the user has exited the vehicle, and computer program product
CN112197768A (en) * 2020-10-21 2021-01-08 中国人民解放军海军航空大学 Aircraft inversion interference observation turning control method for measuring lateral overload
CN112197768B (en) * 2020-10-21 2022-10-11 中国人民解放军海军航空大学 Aircraft inversion interference observation turning control method for measuring lateral overload
US11519750B2 (en) 2020-11-20 2022-12-06 Here Global B.V. Estimating a device location based on direction signs and camera output
US20220381577A1 (en) * 2021-06-01 2022-12-01 Harman International Industries, Incorporated Directional guidance for a space

Similar Documents

Publication Publication Date Title
WO2014172792A1 (en) Navigation computer system including landmark identifier scanning
US9448085B2 (en) Live branded dynamic mapping
US11582576B2 (en) Feature-based slam
EP2671373B1 (en) Method and apparatus for mobile location determination
US9294873B1 (en) Enhanced guidance for electronic devices using objects within in a particular area
US20170323478A1 (en) Method and apparatus for evaluating environmental structures for in-situ content augmentation
US8676623B2 (en) Building directory aided navigation
US8884742B2 (en) Identifying locations within a building using a mobile device
US8588809B2 (en) Managing public resources
US20150138230A1 (en) Method and apparatus for annotating point of interest information
US10623897B1 (en) Augmented reality for data curation
US9191782B2 (en) 2D to 3D map conversion for improved navigation
US10264404B2 (en) Information processing apparatus, system, and method
US10832489B2 (en) Presenting location based icons on a device display
EP3909267B1 (en) A controller, system and method for providing a location-based service to an area
US10338768B1 (en) Graphical user interface for finding and depicting individuals
US8874108B2 (en) Integrating mobile devices into a fixed communication infrastructure
US10830593B2 (en) Cognitive fingerprinting for indoor location sensor networks
Bellini et al. Maintenance and emergency management with an integrated indoor/outdoor navigation support
AU2013362168B2 (en) Integrating mobile devices into a fixed communication infrastructure
US9692867B2 (en) Event accessory item location
US11997578B2 (en) Method and apparatus for indoor mapping and location services
Noreikis Enabling Ubiquitous Augmented Reality with Crowdsourced Indoor Mapping and Localization
EP2747461B1 (en) Integrating mobile devices into a fixed communication infrastructure
BR112013019835B1 (en) METHOD OF GENERATING A USER LOCATION USING A MOBILE DEVICE AND MOBILE DEVICE

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14788213

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC - FORM 1205A (24.02.2016)

122 Ep: pct application non-entry in european phase

Ref document number: 14788213

Country of ref document: EP

Kind code of ref document: A1